When I run the delta InfoPackage, no data is loaded

Dear all,
I created a DataSource based on a view on the R/3 side (transaction RSO2), and the delta field is one of the view's key fields. On the BW side I created two InfoPackages, one init and one delta.
Case one:
1. Run the init InfoPackage without data transfer: no problem.
2. Create the delta InfoPackage (timestamp; upper limit 1800, lower limit 0).
3. Run the delta InfoPackage, but no data is loaded.
Case two:
1. Run the init InfoPackage with data transfer: no problem (32 records were loaded).
2. Create the delta InfoPackage (timestamp; upper limit 1800, lower limit 0).
3. Run the delta InfoPackage, but no data is loaded.
4. I checked RSA7 and found that the timestamp is 0. Why?
While testing I also inserted and deleted records in the table on the R/3 side, but I still cannot load data into the PSA with a delta update.
Please help.

Hi,
    What delta-relevant field are you using?
Have you specified the safety upper and lower limits?
Follow this document; it will show you where you have missed a step:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
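To illustrate what the safety interval does, here is a rough sketch only -- the real windowing is done by the Service API, not by customer code, and the variable names are made up:

    * With safety upper limit 1800 and lower limit 0, a delta request
    * built "now" selects roughly this timestamp window:
    DATA: l_last_ts TYPE timestamp,  " pointer saved by the previous delta
          l_now     TYPE timestamp,
          l_low     TYPE timestamp,
          l_high    TYPE timestamp.

    GET TIME STAMP FIELD l_now.
    l_low  = l_last_ts.              " lower limit 0: start at the old pointer
    l_high = cl_abap_tstmp=>subtractsecs( tstmp = l_now secs = 1800 ).
    * Records stamped within the last 1800 seconds (30 minutes) are left
    * for the next delta run, so late postings are not skipped.

If the pointer in RSA7 stays at 0, the init apparently never registered a starting timestamp, which would explain why the deltas come back empty.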
Regards,
Balaji V

Similar Messages

  • Records are missing when we run the delta load (FI-GL account items)

    Hi,
      Some records are missing when we run the delta load.
    We have a generic DataSource on the FAGLFLEXA table, with the field TIMESTAMP selected as the delta-relevant field; the timestamp is local.
    The safety upper limit is blank and the lower limit is 1000.
    We run this process chain every day, and the delta setting is "New status for changed records".
    Please give me an idea why records are missing when we run the deltas.
    Thanks
    Naik

    Hi Anju,
    Please ensure that you follow the steps below when initializing application 13:
    1. All users in the source system must be locked.
    2. Ensure that SMQ1 and RSA7 contain no data for application 13. Delete the application 13 DataSource entries from RSA7.
    3. Delete and refill the setup tables for application 13.
    4. Run an init-with-data-transfer load for the required DataSources of application 13.
    5. The deltas can follow from the next day.
    This will ensure that your deltas fetch correct data. Note that the delta picks up the same selection conditions with which the init load was run.
    Please let me know if you need any more information.
    Regards,
    Pankaj

  • Why am I seeing many of the same records in my PSA when I run my InfoPackage?

    I am running into an issue where, on occasion, I get the same records several times over when I run my InfoPackage. It selects by a date that is on each record, so it should never pull a record more than once. This doesn't happen all the time, but it seems to occur when I use a bigger date range.

    Hi,
    Please delete the requests from the PSA if you do not need them to be loaded.
    This should sort out your problem.
    -Vikram

  • Explorer/Polestar: Do I need to re-index every time new data is loaded?

    I have a question concerning the indexing functionality of BO Explorer/Polestar. It's clear that I need to re-index my InfoSpace every time the structure of the InfoSpace has changed (e.g. I added a new object from my universe). What I'm not sure about is whether I also need to re-index my InfoSpace as soon as new data that is supposed to be part of my InfoSpace is loaded into the warehouse. Example: I create an InfoSpace consisting of countries (UK, USA, Germany and Japan) and revenue, and I index it. The next day a new country (France) is loaded into the data warehouse. Do I need to re-index the InfoSpace so that users can see "France" in BO Explorer?
    Thanks for your help!
    Agnes

    Hi Agnes,
    according to the Explorer documentation, new data is available only AFTER re-indexing:
    "Indexing refreshes the data and metadata in Information Spaces. After indexing, any new data on the corporate data providers upon which those Information Spaces are based becomes available for search and exploration."
    Regards,
    Stratos

  • Error when performing the init delta process (without data transfer)

    Hello all!
    When I perform the init delta process (without data transfer), I receive the following message:
    SUBRC= 2 The current application triggered a termination wi
    Recently we upgraded to SAP NetWeaver 7.31 (it was 7.0 before).
    Is there anything I need to do to execute an init delta (without data transfer)?
    SPK 731, level 10.
    thanks.

    Hi,
    Which DataSource is this, and on which server are you doing it?
    Can you check whether your extract structure (on the source system side) is active?
    Replicate your DataSource into BW and activate it,
    then try the init without data transfer again.
    Thanks

  • BW - Delta for Master data not loading

    Hello All,
    I am not able to load deltas for the vendor and material master data.
    The initial load works fine, but when I load the delta it gives the errors:
    "The extraction program does not support object 0VENDOR"
    "ALE change pointers are not set up correctly"
    Do I need to activate the change pointers in BD61? I was not sure, so I am checking with the experts.
    Regards
    Vanya

    Hi Ravi,
                We faced the same problem as you for 0MATERIAL_ATTR, and we solved it with the following steps:
    1. Delete the previous delta init for the InfoObject. If you have access to the R/3 side, go to RSA7, then select and delete the init entry for the InfoObject. From the BW side: InfoPackage -> Init for Source System -> select and delete.
    2. Execute the init InfoPackage again.
    3. After successful completion, execute the delta InfoPackage.
    These steps resolved the issue for us, and they may help with your problem as well.
    Thanks & Regards,
    Chandran Ganesan
    SAP Business Intelligence

  • Can we have one extractor that runs delta on 3 date fields in 3 different tables?

    Hi guys
    I need an extractor that tracks changes to three different tables based on three different date fields, one in each table. Can this be done in one extractor? I understand you can have only one delta pointer per extractor, but is there a way to do this using a function module?
    The other option, I know, is to create three extractors; but since the fields I need come from an inner join between all three tables, I was wondering whether a single extractor is possible.
    Thanks

    Hi,
    The dummy date you define as the generic delta field will be passed to the function module (in the case of a delta request) in the table i_t_select. Grab the value there and design your SELECT statement(s) as you need, e.g.:
    select * from your_view
      where date1 >= delta_date
         or date2 >= delta_date
         or date3 >= delta_date.
    Hope this helps!
    regards
    Siggi
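    For a fuller picture, here is a minimal sketch of such a function-module extractor, modeled on the standard template RSAX_BIW_GET_DATA_SIMPLE. The view ZV_JOINED, the dummy delta field ZDELTADAT, and the date fields DATE1/DATE2/DATE3 are hypothetical names:

        FUNCTION z_biw_get_data_3dates.
        * Interface as generated from the RSAX_BIW_GET_DATA_SIMPLE template:
        * IMPORTING i_requnr, i_dsource, i_maxsize, i_initflag, i_read_only
        * TABLES    i_t_select, i_t_fields, e_t_data (extract structure)
        * EXCEPTIONS no_more_data, error_passed_to_mess_handler
          STATICS: s_s_if    TYPE srsc_s_if_simple,
                   s_counter TYPE sy-tabix,
                   s_cursor  TYPE cursor.
          DATA:    l_s_select   TYPE srsc_s_select,
                   l_delta_date TYPE d.

          IF i_initflag = sbiwa_c_flag_on.     " init call: no data transfer
            s_s_if-maxsize = i_maxsize.        " remember the package size
          ELSE.
            IF s_counter = 0.                  " first data package: open cursor
              READ TABLE i_t_select INTO l_s_select
                   WITH KEY fieldnm = 'ZDELTADAT'.   " dummy generic delta field
              IF sy-subrc = 0.
                l_delta_date = l_s_select-low.
              ENDIF.
              OPEN CURSOR WITH HOLD s_cursor FOR
                SELECT * FROM zv_joined        " view joining the three tables
                  WHERE date1 >= l_delta_date
                     OR date2 >= l_delta_date
                     OR date3 >= l_delta_date.
            ENDIF.
            FETCH NEXT CURSOR s_cursor
                  APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
                  PACKAGE SIZE s_s_if-maxsize.
            IF sy-subrc <> 0.
              CLOSE CURSOR s_cursor.
              RAISE no_more_data.              " no further packages
            ENDIF.
            s_counter = s_counter + 1.
          ENDIF.
        ENDFUNCTION.

    On a full or init load, ZDELTADAT is simply absent from i_t_select, so l_delta_date stays initial and the WHERE clause selects everything.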

  • The battery died and the machine shut down; when I powered up, all my data was gone

    When the battery died on my MacBook Pro, it shut down. When I powered back up, the machine went back to the original startup configuration and I could not access my data. I know the data is still there, as I ran Disk Utility and it shows the drive is almost full. Can anyone help me recover my data?
    Thanks

    Hi johndiiullo: Before the MacBook Pro shut down, did you happen to change the "Short Name" of the computer? If so check here: http://docs.info.apple.com/article.html?artnum=107854
    Stedman

  • Problem running an InfoPackage

    I have created a generic extractor for the Snnn planning structure, but I have a problem when I run the InfoPackage to upload the data into BW.
    The field SPBUP in the R/3 system is NUMC 6, and the corresponding InfoObject (0FISCPER) is NUMC 7, so I created a routine in the transfer rules to convert from NUMC 6 to NUMC 7, and that solved that problem.
    But when I create an InfoPackage and use the field SPBUP in the data selection to limit the selection period, the result is always zero records. I think the problem is that in BW the field is NUMC 7 while in R/3 it is NUMC 6.
    Is there a way to make a routine that converts the NUMC 7 value back to NUMC 6 when I run the InfoPackage?
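    (For reference, the NUMC 6 to NUMC 7 transfer routine described above might look like the sketch below. TRAN_STRUCTURE and RESULT are the usual BW 3.x transfer-routine parameters; the generated frame may differ by release.)

        * SPBUP arrives as YYYYMM; 0FISCPER expects YYYYPPP,
        * so pad the two-digit period to three digits.
        DATA: l_in(6)  TYPE c,
              l_out(7) TYPE c.

        l_in = TRAN_STRUCTURE-spbup.
        CONCATENATE l_in(4) '0' l_in+4(2) INTO l_out.   " 200601 -> 2006001
        RESULT = l_out.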

    Hello,
    I have tested the debugging in RSA3, and what I see confirms my suspicion: my selection has the format 'YYYYMMM' while the source system table (Snnn) has 'YYYYMM', so it cannot select anything.
    I tested the conversion routine as Siggi suggested, but it only converts the output field: it goes from 'YYYYMMM' to 'YYYY.MMM'.
    Any more tips?
    thanks
    António Oliveira
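    (The reverse conversion can be done in an ABAP routine on the InfoPackage's data selection. Here is a minimal sketch assuming the BW 3.x generated frame; the FORM name and the structure RSSDLRANGE come from the generated code and may differ by release.)

        form compute_spbup
          tables   l_t_range structure rssdlrange
          changing p_subrc   like sy-subrc.

        * Convert the BW selection value YYYYPPP (e.g. 2006001) back to
        * the R/3 format YYYYMM (e.g. 200601) by dropping the leading
        * digit of the three-digit period.
          data: l_idx like sy-tabix.

          read table l_t_range with key fieldname = 'SPBUP'.
          l_idx = sy-tabix.
          if sy-subrc = 0.
            concatenate l_t_range-low(4) l_t_range-low+5(2)
                   into l_t_range-low.
            if not l_t_range-high is initial.
              concatenate l_t_range-high(4) l_t_range-high+5(2)
                     into l_t_range-high.
            endif.
            modify l_t_range index l_idx.
          endif.
          p_subrc = 0.
        endform.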

  • I have process chains as well as some jobs which run; most of the data is loaded through custom ABAP extracts

    Hi Experts,
    I have process chains as well as some jobs which run. Most of the data is loaded through custom ABAP extracts. Where do I check for these jobs, and how do I make sense of them? I know we can check for jobs in SM37, but how do you load data using these jobs? Please tell me in detail how and where you execute these jobs.

    To create a job, use transaction SM36 and define the start condition, periodicity, etc.
    To monitor it, use SM37 (job monitoring). SM37 offers various options once you check (tick) a particular job; then click on the job log.
    To view the variants of any job, choose Step > Goto (top menu bar) > Variant.
    RSMO (the process monitor) shows data loads day by day. To view a particular data load, click on the InfoPackage/DTP/DataSource in RSA1 and click the monitor icon at the top; the system shows the relevant loads in the process monitor.
    To monitor process chains, go to RSPC, double-click on the chain, click the log button (yellow) and select a date range; you can then view the status of the chain: red if it failed, green if it succeeded, yellow if it is still running.
    Also, in SM37 you can find the status of a process chain by searching for the job name BI_PROCESS_TRIGGER.
    To execute a chain, go to RSPC, double-click on the chain, and schedule it (F8).
    If you know the data target of the load, go to RSA1, right-click on the data target, and select Manage; you can then view the requests being loaded.

  • Reports fail when run against a different data source

    Hello,
    We have a VB.NET 2008 WinForms application running on Microsoft .NET 3.5. We are using Crystal Reports 2008 runtime, service pack 3 -- using the CrystalDecisions.Windows.Forms.CrystalReportViewer in the app to view reports. In the GAC on all our client computers, we have versions 12.0.1100.0 and 12.0.2000.0 of CrystalDecisions.CrystalReports.Engine, CrystalDecisions.Shared, and CrystalDecisions.Windows.Forms.
    Please refer to another one of our posted forum issues, "Critical issue since upgrading from CR9 to CR2008", as these issues seem to be related:
    Critical issue since upgrading from CR9 to CR2008
    We were concerned with report display slowdown, and we seem to have solved that by using the Oracle Server driver (instead of either Microsoft's or Oracle's OLEDB driver).  But now we must find a resolution to another piece of the puzzle, which is:  why does a report break if the data source embedded in the .rpt file is different from the one you are trying to run the report against in the .NET viewer?
    Problem:
    If you have a production database name (e.g. "ProdDB") embedded in the .rpt file that you built your report from, and you try to run that report against a development database (e.g. "DevDB") -- or vice versa; it is the switch that is the important concept here -- the report fails with a list of messages such as this:
        Failed to retrieve data from the database
        Details:  [Database vendor code: 6550 ]
    This only seems to happen if the source of the report data (i.e. the underlying query) is an Oracle stored procedure or a Crystal Reports SQL Command; the reports run fine against all data sources if the source is a table or a view.  In trying different things to troubleshoot this, including adding a ReportDocument.VerifyDatabase() call after setting the connection information, the Crystal Reports viewer will spit out other nonsensical errors about being unable to find certain fields (e.g. "The field name is not known") or the table (even though the source data should be coming from an Oracle stored procedure, not a table).
    When the reports are run in the Crystal Reports Designer, they run fine no matter what database is being used; but the problem only happens while being run in the .NET viewer.  It's almost as if something internally isn't getting fully "set" to the new data source, or something -- we're really grasping at straws here.
    For the sake of completeness, here is how we're setting the connection information:
            '-- Set database connection info for the main report
            For Each oConnectionInfo In oCrystalReport.DataSourceConnections
                oConnectionInfo.SetConnection(gsDBDataSource, "", gsDBUserID, gsDBPassword)
            Next oConnectionInfo
            '-- Set database connection info for each subreport
            For Each oSubreport In oCrystalReport.Subreports
                For Each oConnectionInfo In oSubreport.DataSourceConnections
                    oConnectionInfo.SetConnection(gsDBDataSource, "", gsDBUserID, gsDBPassword)
                Next oConnectionInfo
            Next oSubreport
    ... but in troubleshooting, we've even tried an "overkill" approach and added this code as well:
            '-- Set database connection info for each table in the main report
            For Each oTable In oCrystalReport.Database.Tables
                With oTable.LogOnInfo.ConnectionInfo
                    .ServerName = gsDBDataSource
                    .UserID = gsDBUserID
                    .Password = gsDBPassword
                    For Each oPair In .LogonProperties
                        If UCase(CStr(oPair.Name)) = "DATA SOURCE" Then
                            oPair.Value = gsDBDataSource
                            Exit For
                        End If
                    Next oPair
                End With
                oTable.ApplyLogOnInfo(oTable.LogOnInfo)
            Next oTable
            '-- Set database connection info for each table in each subreport
            For Each oSubreport In oCrystalReport.Subreports
                For Each oTable In oSubreport.Database.Tables
                    With oTable.LogOnInfo.ConnectionInfo
                        .ServerName = gsDBDataSource
                        .UserID = gsDBUserID
                        .Password = gsDBPassword
                        For Each oPair In .LogonProperties
                            If UCase(CStr(oPair.Name)) = "DATA SOURCE" Then
                                oPair.Value = gsDBDataSource
                                Exit For
                            End If
                        Next oPair
                    End With
                    oTable.ApplyLogOnInfo(oTable.LogOnInfo)
                Next oTable
            Next oSubreport
    ... alas, it makes no difference.  If we run the report against a database different from the one specified with "Set Datasource Location" in Crystal, it fails with nonsense errors.

    Thanks for the reply, Ludek.  We have made some breakthroughs, uncovered some Crystal bugs and workarounds, and we're probably 90% there I hope.
    For your first point, unfortunately the information on the Oracle 6550 error was generic, and not much help in our case.  And for your second point, the errors didn't have anything to do with subreports at that time -- the error would manifest itself even in a simple, one-level report.
    However, your third point (pointing us to KB 1553921) helped move us forward quite a bit more.  For the benefit of all, here is a link to that KB article:
    Link: [KB 1553921|http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes_boj/sdn_oss_boj_bi/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/scn_bosap/notes%7B6163636573733d36393736354636443646363436353344333933393338323636393736354637333631373036453646373436353733354636453735364436323635373233443330333033303331333533353333333933323331%7D.do]
    We downloaded the tool referenced there and pointed it at a couple of our reports.  The bottom line is that the code it generated uses a completely new area of the Crystal Reports .NET API which we had not used before -- the CrystalDecisions.ReportAppServer namespace.  Using code based on what that RasConnectionInfo tool generated, we were able to gain greater visibility into some of the objects in the API and to uncover what I think qualifies as a genuine bug in Crystal Reports.
    The CrystalDecisions.ReportAppServer.DataDefModel.ISCRTable class exposes a property called QualifiedName, something that isn't exposed by the more commonly-used CrystalDecisions.CrystalReports.Engine.Table class.  When changing the data source with our old code referenced above (CrystalDecisions.Shared.ConnectionInfo.SetConnection), I saw that Crystal would actually change the Table.QualifiedName from something like "SCHEMAOWNER.PACKAGENAME.PROCNAME" to just "PROCNAME" (essentially stripping off the schema and package name).  Bad, Crystal...  VERY BAD!  IMHO, Crystal potentially deserves to be swatted on the a** with the proverbial rolled-up newspaper.
    I believe this explains why we were also able to generate errors indicating that field names or tables were not found -- because Crystal had gone and changed the QualifiedName to remove some key info identifying the database object!  So, knowing this and using the code generated by the RasConnectionInfo tool, we were able to work around this bug with code that worked for most of our reports ("most" is the key word here -- more on that in a bit).
    So, first of all, I'll post our new code.  Here is the main area where we loop through all of the tables in the report and subreports:
    '-- Replace each table in the main report with new connection info
    For Each oTable In oCrystalReport.ReportClientDocument.DatabaseController.Database.Tables
        oNewTable = oTable.Clone()
        oNewTable.ConnectionInfo = GetNewConnectionInfo(oTable)
        oCrystalReport.ReportClientDocument.DatabaseController.SetTableLocation(oTable, oNewTable)
    Next oTable
    '-- Replace each table in any subreports with new connection info
    For iLoop = 0 To oCrystalReport.Subreports.Count - 1
        sSubreportName = oCrystalReport.Subreports(iLoop).Name
        For Each oTable In oCrystalReport.ReportClientDocument.SubreportController.GetSubreportDatabase(sSubreportName).Tables
            oNewTable = oTable.Clone()
            oNewTable.ConnectionInfo = GetNewConnectionInfo(oTable)
            oCrystalReport.ReportClientDocument.SubreportController.SetTableLocation(sSubreportName, oTable, oNewTable)
        Next oTable
    Next iLoop
    '-- Call VerifyDatabase() to ensure that the tables update properly
    oCrystalReport.VerifyDatabase()
    (Thanks to Colin Stynes for his post in the following thread, which describes how to handle the subreports):
    Setting subreport connection info at runtime
    There seems to be a limitation on the number of characters in a post on this forum (before all formatting gets lost), so please see my next post for the rest....

  • Running delta but need to add one field to the InfoPackage data selection

    Hi,
    I am using 2LIS_11_VAHDR and I am running deltas now.
    My requirement is to add one field to the data selection tab of the InfoPackage.
    Please let me know what steps I need to follow.

    Hi Sunil,
    You need to enhance the DataSource in R/3 itself to add an extra field. Go to LBWE, select the required DataSource, and in the extract structure maintenance move the field you need from the right-hand list (the pool) to the left-hand list. If you need a custom field, you have to extend the extraction structure and fill it via a CMOD/SMOD user exit (create a project), as sketched below.
    Thanks,
    Bhavani Prasad
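    (If the field has to be filled in the user exit, a minimal sketch of the CMOD enhancement RSAP0001 -- function exit EXIT_SAPLRSAP_001, include ZXRSAU01 -- might look like the following. The append field ZZNEWFLD and its lookup from VBAK are hypothetical; parameter names follow the standard exit.)

        * Include ZXRSAU01: fill appended fields for transaction data.
        DATA: l_s_vahdr TYPE mc11va0hdr.    " extract structure of 2LIS_11_VAHDR

        CASE i_datasource.
          WHEN '2LIS_11_VAHDR'.
            LOOP AT c_t_data INTO l_s_vahdr.
        *     Hypothetical append field; a SELECT SINGLE per row is kept
        *     for brevity -- in practice, buffer with FOR ALL ENTRIES.
              SELECT SINGLE zznewfld FROM vbak
                INTO l_s_vahdr-zznewfld
                WHERE vbeln = l_s_vahdr-vbeln.
              MODIFY c_t_data FROM l_s_vahdr.
            ENDLOOP.
        ENDCASE.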

  • My performance is very slow when I run graphs. How do I increase the speed at which I can do other things while the data is being updated and displayed on the graphs?

    I am doing an acquisition and displaying the data on graphs. When I run the program it is slow. I think it is because I have tied the number of scans to read to my scan rate: I take the number of seconds I want to display on the chart times the scan rate and feed that into the number of samples to read at a time in the AI Read. The problem is that it stalls until the data points are acquired and displayed, so I cannot click or change values on the front panel until the graph updates. What can I do to help this?

    On Fri, 15 Aug 2003 11:55:03 -0500 (CDT), HAL wrote:
    >My performance is very slow when I run graphs. How do I increase the
    >speed at which I can do other things while the data is being updated
    >and displayed on the graphs?
    >
    >I am doing an an aquisition and displaying the data on graphs. When I
    >run the program it is slow. I think because I have the number of
    >scans to read associated with my scan rate. It takes the number of
    >seconds I want to display on the chart times the scan rate and feeds
    >that into the number of samples to read at a time from the AI read.
    >The problem is that it stalls until the data points are aquired and
    >displayed so I cannot click or change values on the front panel until
    >the updates occur on the graph. What can I do to be able to help
    >this?
    It may also be your graphics card. LabVIEW can max out the CPU, and your
    screen may not be refreshing very fast.
    --Ray
    "There are very few problems that cannot be solved by
    orders ending with 'or die.' " -Alistair J.R Young

  • I am trying to save data with a Write To Measurement File VI on an NI PXI-1042Q in real-time mode, but it is not working; when I run it without deploying to the PXI, it saves to the file

    Hi, I am trying to save data with a Write To Measurement File VI using an NI PXI-1042Q and an NI PXI-6229 DAQ in real-time mode, but it is not working; when I run it without deploying it to the PXI, it saves to the file. Please find my VI attached.
    Attachments:
    PWMs.vi ‏130 KB

     The other problem is that the DAQmx channel only works in real-time mode, not in a stand-alone VI, using LabVIEW 8.2 and Real-Time 8.2.

  • How can I save my data plus the date and time into the same file when I run this VI at different times?

    I use a translation stage for the experiment. For each user in the lab, the stage position at which an experiment starts is different. I defined one end of the stage as zero, and I want to save the position of the stage with respect to zero, together with the date and time. I want all of these in one file, and every time I run the VI it should append to the same file, like this:
    2/12/03 16:04 13567
    2/13/03 10:15 35678
    That way I can track the position from day to day. If anybody can help, I'd appreciate it. Thanks.

    evolution wrote in message news:<[email protected]>...
    > How can I save my data and the date,the time into the same file when
    > I run this VI at different times?
    >
    > I use a translation stage for the experiment.For each user in the lab
    > the stage position (to start an experiment) is different.I defined one
    > end of the stage as zero. I want to save the position , date and time
    > of the stage with respect to zero.I want all these in one file, nd
    > everytime I run it it should save to the same file, like this:
    > 2/12/03 16:04 13567
    > 2/13/03 10:15 35678
    >
    > So I will track the position from day to day.If naybody helps, I
    > appreciate it.Thanks.
    Hi,
    I know that the function "Write To Spreadsheet File.vi" can append data
    to a file, and you can use "Concatenate Strings" to build the date and
    time as well as the data into one line. Hope this helps.
    Regards,
    celery
