Data loading issue between BI and R/3

Hi Everybody,
One of our team members made some changes on the BI server, and we have lost the LBWQ data for 21-July and 22-July. The data did not arrive in the PSA for the standard extractors.
When the connection between R/3 and BI was corrected, data started coming in again from 23-July onwards. Now we want to get the data for 21-July and 22-July back. Please let me know how to do it.
Thanks
A. Jenifer

Hi Goutham,
Can you please tell me where I can find the 'Repeat delta update' option?
Regards,
Amar.

Similar Messages

  • Data Load Issue "Request is in obsolete version of DataSource"

    Hello,
    I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object), I get the message below:
    Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
    The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
    I have taken the following actions:
    1. Replicated the DataSource
    2. Deleted all requests from the PSA
    3. Activated the DataSource using RSDS_DATASOURCE_ACTIVATE_ALL
    4. Re-transported the DataSource, transformation and DTP
    Still getting the same issue
    If you have any idea please reply asap.
    Samit

    Hi,
    Generate your DataSource in R/3, then replicate it and activate the transfer rules.
    Regards,
    Chandu.
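    A minimal sketch, assuming the activation is to be scripted from the BI system rather than run manually: submit the standard report Samit already mentioned. VIA SELECTION-SCREEN shows its own selection screen, so no parameters are hard-coded here.
        REPORT zreactivate_datasource.
        " Run the standard activation report mentioned above; its selection screen
        " is displayed, so the DataSource and source system are chosen interactively.
        SUBMIT rsds_datasource_activate_all VIA SELECTION-SCREEN AND RETURN.
    Afterwards, reload the PSA and run the DTP against the fresh request.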

  • TileList data load issue

    I am having an issue where the data that drives a TileList works correctly when the TileList is not on the first page of the application. When it is put on a second page in a ViewStack, the TileList displays correctly when you navigate to it. When the TileList is placed on the first page of the application, I get the correct number of items in the TileList, but the information the item renderer is supposed to display (a picture, caption and title) does not appear. The strange thing is that a Tree populates correctly in the same situation. Here is the sequence of events:
    <!-- get_tree is the data for the tree and get_groups is the data for the TileList -->
    creationComplete="get_tree.send();get_groups.send();"
    <mx:HTTPService showBusyCursor="true" id="get_groups"
        url="[some xml doc]" resultFormat="e4x"/>
    <mx:XMLListCollection id="myXMlist"
        source="{get_groups.lastResult.groups}"/>
    <mx:HTTPService showBusyCursor="true" id="get_tree"
        url="[some xml doc]" resultFormat="e4x"/>
    <mx:XMLListCollection id="myTreeXMlist"
        source="{get_tree.lastResult.groups}"/>
    The data providers of the TileList and Tree are then set accordingly. I tried moving the data calls from creationComplete to the initialize event, thinking they would fire earlier in the process and be done by the time creation completed, but that didn't help either. I'm at a loss as to why the Tree works fine no matter where I put it but the TileList does not. It's almost as if the Tree and the TileList will sit and wait for the data, but the item renderer in the TileList will not. That would explain why clicking on the TileList still produces the correct sequence of events, but the visual part of the TileList is just not rendering right. Anyone have any ideas?


  • Demantra Data Load Issue

    I am new to Demantra. I have installed a standalone Demantra system on our server. In order to load data, I created a new model, defined item and location levels, then clicked on 'Build Model'. The data is loaded into 3 custom tables created by me. After creating the model, I cannot log in to 'Collaborator Workbench'; it gives the message 'There are system errors. Please contact your System Administrator'. Can anyone please tell me what I am doing wrong and how to resolve the issue?
    Thanks

    OK, so if the ASO value is wrong, then it's a data load issue and there is no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.

  • Data Loading issues in PSA and Infocube

    Hi team,
    I am loading data into the PSA and from there into an InfoCube via a DTP.
    When I load data into the PSA, all 6000 records are transferred to the PSA and the process completes successfully.
    When I execute the DTP to load data into the InfoCube, the load also completes successfully, but when I look at the "Manage" tab I see
    "Transferred 6000 || Added Records 50"
    I am not able to understand why only 50 records were added to the InfoCube, and if some records were rejected, where I can find them and the reason for that.
    Kindly assist me in understanding this issue.
    Regards
    bs

    Hi,
    The records would have been aggregated based on common values.
    If in the source you had:
    custname   xnumber   kf1
    R          56        100
    R          53        200
    S          54        200
    then after aggregation you will have:
    custname   kf1
    R          300
    S          200
    if you do not have xnumber in your cube.
    Hope it is clear now.
    Regards,
    Rathy
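    To make the aggregation concrete, here is a minimal sketch (an illustration only; the names are not from any actual transformation) of what effectively happens once xnumber is dropped from the record layout: rows with identical remaining characteristics are collapsed and the key figure is summed.
        REPORT zaggregation_demo.
        TYPES: BEGIN OF ty_rec,
                 custname TYPE c LENGTH 10,            " characteristic kept in the cube
                 kf1      TYPE p LENGTH 8 DECIMALS 2,  " key figure
               END OF ty_rec.
        DATA: lt_source TYPE STANDARD TABLE OF ty_rec,
              lt_cube   TYPE STANDARD TABLE OF ty_rec,
              ls_rec    TYPE ty_rec.
        " Source rows once xnumber is no longer part of the structure
        ls_rec-custname = 'R'. ls_rec-kf1 = 100. APPEND ls_rec TO lt_source.
        ls_rec-custname = 'R'. ls_rec-kf1 = 200. APPEND ls_rec TO lt_source.
        ls_rec-custname = 'S'. ls_rec-kf1 = 200. APPEND ls_rec TO lt_source.
        " COLLECT sums the numeric fields of rows whose non-numeric fields match
        LOOP AT lt_source INTO ls_rec.
          COLLECT ls_rec INTO lt_cube.
        ENDLOOP.
        " lt_cube now holds R = 300 and S = 200: 3 records transferred, 2 added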

  • Flash and XML data loading issue

    I have this SWF that loads 2 XML files that control 2 different areas of the SWF. I am using this code because I have stacked layers, and when this was 2 separate SWFs I couldn't use SWFObject because of the wmode quirk in FF. So my issue is that when you first load this, it seems to only load the slideshow (left side) and part of the right menu, and then on refresh it loads the rest of the menu. How do I get it to load everything at once? I have attached all of my files.
    <div id="LargeFlashContainerFlashMenu">     
                            <script language="javascript">
                                  if (AC_FL_RunContent == 0) {
                                       alert("This page requires AC_RunActiveContent.js.");
                                  } else {
                                       AC_FL_RunContent(
                                            'codebase', 'http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=9,0,0,0',
                                            'width', '940',
                                            'height', '265',
                                            'src', 'JGSB_HeaderRightNav_v05',
                                            'FlashVars', 'strNavContentXML=BICMenu.xml&strShowContentXML=settings.xml',
                                            'quality', 'high',
                                            'pluginspage', 'http://www.macromedia.com/go/getflashplayer',
                                            'align', 'middle',
                                            'play', 'true',
                                            'loop', 'true',
                                            'scale', 'showall',
                                            'wmode', 'transparent',
                                            'devicefont', 'false',
                                            'id', 'JGSB_HeaderRightNav_v05',
                                            'bgcolor', '#000000',
                                            'name', 'JGSB_HeaderRightNav_v05',
                                            'menu', 'true',
                                            'allowFullScreen', 'false',
                                            'allowScriptAccess','sameDomain',
                                            'movie', 'JGSB_HeaderRightNav_v05',
                                            'salign', ''
                                             ); //end AC code
                                   }
                              </script>
                             <noscript>
                                       <object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=9,0,0,0" width="940" height="265" id="JGSB_HeaderRightNav_v05" align="middle">
                                            <param name="allowScriptAccess" value="sameDomain" />
                                            <param name="allowFullScreen" value="false" />
                                            <param name="movie" value="JGSB_HeaderRightNav_v05.swf" />
                                            <param name="FlashVars" value="strNavContentXML=BICMenu.xml&strShowContentXML=settings.xml" />
                                            <param name="quality" value="high" />
                                            <param name="wmode" value="transparent" />
                                            <param name="bgcolor" value="#000000" />
                                            <embed wmode="transparent" src="JGSB_HeaderRightNav_v05.swf" quality="high" bgcolor="#0099cc" width="940" height="265" name="JGSB_HeaderRightNav_v05" align="middle" allowScriptAccess="sameDomain" allowFullScreen="false" type="application/x-shockwave-flash" pluginspage="http://www.macromedia.com/go/getflashplayer" />
                                       </object>
                             </noscript>
                   </div>

    OK, I couldn't upload the files, but I can send the zip with all the files in it via email to anyone who can help.

  • Data loading Issue in BW

    Hi,
    We are facing an issue while loading language text data using DataSource 0HAP_V_LIST_TEXT into BW.
    It shows the error "Language is not valid".
    To resolve this issue, the language would need to be installed in the source system, but the client does not want to install it, as it is not required.
    How can I resolve this issue without installing it?
    Could you please throw some light on this?
    Thanks in advance,
    Ajay

    Hi Ajay
    Please check with the user whether the value entered in the field is really correct and whether this value is expected for future records as well.
    If you find that the data is incorrect, please try to convince them to correct the data in the source itself, so that the data is consistent between the source and BW.
    In case they insist that the value is correct and is expected in the future, then please check which languages are maintained in transaction SMLT, as the system will only allow those which are maintained there. (The only exception is a Unicode-migrated system, which allows all languages to be processed.)
    Regards
    Ashish
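    One option, if the text records for the unsupported language are genuinely not needed (this is an assumption on top of the thread, not something suggested above), is to drop those rows before the update. A minimal standalone sketch, with the field name LANGU and the allowed languages as placeholders to adapt:
        REPORT zdrop_unsupported_langu.
        TYPES: BEGIN OF ty_text,
                 langu TYPE spras,          " language key from the text DataSource (assumed name)
                 txt   TYPE c LENGTH 60,    " text field (illustrative)
               END OF ty_text.
        DATA lt_text TYPE STANDARD TABLE OF ty_text.
        " ... lt_text filled from the extracted data package ...
        " Keep only the languages that are actually maintained in SMLT (EN/DE assumed here)
        DELETE lt_text WHERE langu <> 'E' AND langu <> 'D'.
    The same filter could equally sit in the transformation start routine on the source package.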

  • Data Inconsistency for 0EMPLOYEE between ECC and BI

    Hi,
    We do a full load to 0EMPLOYEE using 0EMPLOYEE_ATTR from ECC. Records were deleted for a lot of employees (some action types) in ECC. This has caused a data inconsistency for the 0EMPLOYEE master data (time-dependent) between ECC and BI. In BI we have more records for these employees (the additional records have time-dependent ranges that were deleted from ECC but still exist in BI). These employee records are already being used in a lot of InfoProviders. Is there an efficient way to fix this issue? One solution is to delete the data from all InfoProviders and then delete the 0EMPLOYEE master data, but the deletion of employee records can happen quite often, so we don't want to take this route. Also, I tried to re-organize the master data attributes for 0EMPLOYEE through a process chain, but that didn't work either.
    Message was edited by:
            Ripel Desai

    Hi Ripel,
    I share your pain. This is one of the real pains of time-dependent master data in BW. I have been in your exact position, and the only way around the issue for me was to clear out all the cubes that used 0EMPLOYEE and then delete and reload the 0EMPLOYEE data.
    I know this response doesn't help you much, but at least you are not alone.
    Regards,
    Pete
    http://www.teklink.co.uk

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic DataSource on R/3 and replicated it to our BI system, created an InfoSource, the transformation from the DataSource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, in the Extraction tab, we pick 'Full' in the Extraction Mode field; in the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) sounds like it should run the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R/3 table).
    Then we tried to create an InfoPackage. In the Processing tab, the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! In the Data Target tab, the ODS cannot be selected as a target! There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(s) are active and load to this target', there is an inactive icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since the 'Only PSA' radio button is checked with all the others dimmed, no data goes to this ODS! Why, in BI 7.0, can the 'Only PSA' radio button be checked with all the others dimmed?
    Many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

    You don't have to select anything.
    Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: Full loads all the data from the PSA, and Delta loads only the latest request in the PSA.
    Go through the links below for lucid explanations:
    InfoPackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    Prerequisite:
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Which LKM and IKM to use for Fast data loading b/w MSSQL 2005 and Oracle 11

    Hi,
    Can anybody help us decide which LKMs and IKMs are best for data loading between MSSQL and Oracle?
    The staging area is Oracle. We have to load around 400 million rows from MSSQL to Oracle 11g.
    Best regards,
    Muhammad

    Thanks Ayush,
    You are right, and it has dumped the file very quickly; but it is giving an error on the sqlldr call through Jython. I have raised an SR with Oracle to look into it further.
    Thanks again and have a very nice time.
    Regards,
    Muhammad

  • Inventory data load issue

    Hi all,
    We have 2 source systems, SAP 4.7 and ECC 6.0. I am using 3 DataSources: BX, BF and UM.
    We require one year of data from SAP 4.7 and all data from ECC 6.0.
    In SAP 4.7 a total of 7 years of data is available, but we require only the last year. My doubt is: if I extract only the last year of data, will OPENING STOCK and CLOSING STOCK show correctly in my report?
    - The SAP 4.7 closing stock will be the opening stock in ECC 6.0, in the manner in which it was uploaded in the source system.
    Since we are getting data from two source systems, what are the extraction steps I have to follow?
    Kindly give me your suggestions.
    Thanks
    sara

    Hi,
    First you need to make sure the closing stock of R/3 4.7 matches the ECC 6.0 opening stock. It should be the same; usually this is addressed when the data cut-over happens. You can cross-check it using transactions MB5B / MB52 etc.
    When you load the data using the BX DataSource, it is a full load and it pulls the data as of the current date. So it will bring the data only from the connected ECC 6.0 system, and there won't be any issue with that data.
    BF will bring the material movements, which are needed if you want to see historic data. So do the loading in the normal manner, and split the load depending upon the data volume. While doing the setup table activity in 4.7, select only the needed period. Do the same for the UM DataSource as well.
    While setting up the delta, do it only for 6.0.
    Regards
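    As a rough way to do that cross-check programmatically (MB5B/MB52 above remain the authoritative reports; this sketch assumes unrestricted-use storage-location stock is enough for the comparison), the same report can be run in both systems and the two lists compared:
        REPORT zstock_crosscheck.
        TYPES: BEGIN OF ty_stock,
                 matnr TYPE mard-matnr,   " material
                 labst TYPE mard-labst,   " unrestricted-use stock
               END OF ty_stock.
        DATA lt_stock TYPE STANDARD TABLE OF ty_stock.
        " Sum current unrestricted-use stock per material over all plants/storage locations
        SELECT matnr SUM( labst )
          FROM mard
          INTO TABLE lt_stock
          GROUP BY matnr.
        " Run this in SAP 4.7 and in ECC 6.0 and compare the two result lists.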

  • Data load issue in BI7.0 Enviornment

    Hi,
    We recently did a technical upgrade from BW 3.x to BI 7.0. We have a new requirement for which I need to create a copy cube from the base cube (A) and load the data from the base cube (A) to the new cube (B).
    I need to enhance the base cube (A) with a few key figures, and I need historical data; that is why I am not using the remodeling method. These key figures are attributes of master data.
    For loading data from the base cube (A) to the new cube (B), do I need to follow the steps below:
    1>Export data source
    2>create update rules
    3>create info pack
    or
    do I need to create:
    1>transformation
    2>DTP
    The base cube (A) is loaded from two ODSs, and I am going to write an update routine between the ODSs and cube (A) to populate the new key figures in the cube; the routine will pick the data from the master data, since these new key figures are attributes of master data.
    Can someone please advise me on this?

    Thanks a lot, Khaja, for the quick response.
    I will store the key figures in cube A, and they require historical data.
    These key figures are attributes of the material master. I will populate them in the cube from master data using an update routine. If I go for remodeling, I would not get historical data.
    If I get the key figures from master data, then it would load historical data, right?
    I want to move cube A to cube B and load the data from cube A to cube B.
    Once the enhancement is done, I will bring the data back from cube B to cube A.
    Please advise me:
    Once the data is loaded into cube B and the whole data in cube A is deleted, instead of bringing the cube B data back to cube A, I would like to schedule data from the ODS to cube A, which is in the flow.
    Do I need to create a transformation and DTP, or update rules and an InfoPackage?
    Please advise me in this scenario.

  • Purchasing Data Not Tying Up between BW and R/3.

    Hi,
    As part of our project, we reloaded all the data related to Purchasing through the extractors:
    2LIS_02_SCL
    2LIS_02_ITM
    2LIS_02_HDR
    As these were corrupted due to patch application.
    We took all possible care during the reloading of the data from R/3, like locking users from making any transactions, deleting the setup tables, etc., in a step-by-step process.
    However, after the extraction, the data does not tie up between R/3 and the BW Purchasing ODS.
    Did we do something wrong, and how can we get the missing records into BW again?
    Appreciate your help.
    Thanks.

    Hi,
    Did you fill the setup table with any selection? Was the number of records picked previously larger than this time? Have you checked RSA3 in the source system, and is it showing the correct number of records? If yes, then check the selection tab of the InfoPackage to see whether you have maintained any selection there.
    If not, then first check the selection in the setup table, and then the function module which is extracting the data. You will find the function module in RSA2: just double-click on the extractor and you will get the function module name.
    In any case, if you know which records are missing, just fill the setup table for those values only and then do a full load.
    I think you know the steps in LO extraction, but still, for your reference:
    1. Transfer the logistics DataSource for transaction data from Business Content
    With the transfer of the DataSource, the associated extraction structure is
    also delivered, but the extraction structure is based on LIS communication
    structures. Furthermore, based on the extraction structure for the DataSource,
    a restructuring table that is used for the initialization and full update to BI
    is generated.
    Naming convention:
    DataSource 2LIS_<Application>_<Event><Suffix>; where <Event> is
    optional
    Examples:
    2LIS_02_HDR: 02 = MM purchasing, HDR (→ HEADER) = document header
    Extraction structure MC<Application><Event/group of
    events>0<Suffix>; where MC is derived from the associated communication
    structures and <Suffix> is optional
    Examples:
    MC02M_0HDR: Extraction structure for the DataSource 2LIS_02_HDR,
    where M_ indicates the group for the events MA (order), MD (delivery
    schedule), ME (contract) and MF (request).
    Restructuring table (= setup table) <Extraction structure>SETUP
    Example:
    Extraction structure: MC11VA0IT ⇒ Restructuring table:
    MC11VA0ITSETUP
    2. Maintain extraction structure (transaction LBWE)
    This means that fields can be added to the extraction structure that is
    delivered with the DataSource without modifying anything. On the one
    hand, fields from the LIS communication structures that are assigned to the
    extraction structure can be used, that means standard fields that SAP has
    not selected, and on the other hand, customer fields that were attached to
    the LIS communication structures with the append technique can be used.
    After the extraction structure is created, it is generated automatically and the
    associated restructuring table is adapted.
    3. Maintain/generate DataSource
    In the DataSource maintenance, you can assign the properties Selection,
    Hide, Inversion (= Cancellation) and Field Only Known in Customer Exit to
    the fields of the extraction structure. After enhancing the extraction structure,
    the DataSource always has to be generated again!
    4. Replicate and activate DataSource in SAP BI (=metadata upload)
    5. Maintain Data Target (Data Store Object, InfoCube)
    6. Maintain Transformations between DataSource and Data Target
    7. Create a Data Transfer Process
    the Data Transfer Process will be used later to update the data from the PSA
    table into the Data Target.
    8. Set extraction structure for updating to active (transaction LBWE)
    In this way, data can be written to the restructuring table or the delta queue
    from then on using the extraction structure (see following steps).
    9. Filling the restructuring table/restructure (OLI*BW)
    During this process, no documents should be created or changed in the
    system! In some applications, it is possible to fill the restructuring table
    beforehand in simulation mode. These results are listed in a log (transaction
    LBWF). Before filling the restructuring table, you must ensure that the
    content of the tables is deleted (transaction LBWG), preventing the table
    from being filled multiple times. Once the restructuring tables are filled,
    document editing can resume as long as Unserialized V3 Update or Queued
    Delta is selected in the next step. Be absolutely sure that no V3 collection
    run is started until the next successful posting of an InfoPackage for delta
    initialization (see step 11).
    10. Select update method
    • Unserialized V3 update
    • Queued delta
    • Direct delta
    11. Create an InfoPackage for the DataSource and schedule the Delta
    Initialization in the Scheduler
    This updates the BI-relevant data from the restructuring table to the
    PSA table. Since the restructuring table is no longer needed after delta
    initialization, the content can be deleted (transaction LBWG).
    Use the Data Transfer Process created in step 7 to update the data from
    the PSA table into the Data Targets. After successful delta initialization,
    document editing can resume, as long as the direct delta update method
    was selected in step 10. This means that BI-relevant delta data is written
    directly to the delta queue.
    Note:
    If the DataSource supports early-delta initialization, the delta data can
    be written to the delta queue during delta initialization. This feature is
    controlled with an indicator in the Scheduler.
    12. Start V3 collection run (transaction LBWE)
    This step is only necessary when the update method unserialized V3 Update
    or Queued Delta was selected in step 10. By starting a corresponding job for
    an application, the BI-relevant delta data is read from the update tables or
    extraction queue and written to the delta queue.
    13. Create an InfoPackage for the DataSource in BI and schedule the Delta
    Update in the Scheduler
    The BI-relevant delta data from the delta queue for the DataSource is updated
    to the PSA table. Use the Data Transfer Process created in step 7 to update
    the data from the PSA table into the Data Targets.
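    Following the naming convention from step 1, a quick sanity check after step 9 is to count the rows in the restructuring table in the source system before scheduling the init InfoPackage. For 2LIS_02_HDR the table name derived above would be MC02M_0HDRSETUP (this is derived from the convention <extraction structure>SETUP rather than quoted from your system, so verify it in SE11 first):
        REPORT zcheck_setup_table.
        DATA lv_count TYPE i.
        " Count the rows written to the purchasing header restructuring (setup) table
        SELECT COUNT( * ) FROM mc02m_0hdrsetup INTO lv_count.
        WRITE: / 'Records in MC02M_0HDRSETUP:', lv_count.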
    Hope this helps.......
    Regards,
    Debjani...

  • Error in 0EMPLOYEE Master Data Load Issue

    Hi,
    We have 0EMPLOYEE master data. Due to new development changes related to 0EMPLOYEE, we scheduled 2 new InfoPackages with a personnel number range. While creating the InfoPackages, we forgot to maintain the time interval from 01.01.1900 to 31.12.9999. Instead, the default range 24.04.2009 to 31.12.9999 was selected. Because of this selection in the InfoPackage, the Valid From date in the employee master data was changed to 24.04.2009 for all employees after the data load.
    Even after I corrected this selection and reloaded the data, the Valid From dates are not being corrected.
    Can you please advise how we can fix this issue ASAP, as it is a production issue?
    Thanks!
    Best regards,
    Venkata

    > Even after I corrected this selection and reloaded the data, the Valid From dates are not being corrected.
    Maybe for this your only option is to delete the 0EMPLOYEE master data and reload it again. For this you need to delete the dependent transaction data as well.
    Cheers,
    Sree

  • Master data loading issue

    Hi gurus,
    Presently I am working on BI 7.0. I have a small issue regarding master data loading.
    I have a generic DataSource for master data loading, and I have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object; no issues, the data got loaded successfully. Whenever I run the InfoPackage a second time and run the DTP, I get an error saying "duplicate records".
    How can I handle this?
    Best Regards
    Prasad

    Hi Prasad,
    Following is happening in your case:
    Loading the 1st time:
    1. Data is loaded to the PSA through the InfoPackage. It is a full load.
    2. Data is loaded to the InfoObject through the DTP.
    Loading the 2nd time:
    1. Data is again loaded to the PSA. It is a full load.
    2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate record error.
    Please clear the PSA after the data is loaded to the InfoObject.
    Assign points if helpful.
    Regards,
    Tej Trivedi
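    To illustrate what the duplicate-record check is complaining about, here is a minimal sketch (the field names are illustrative, not the actual InfoObject structure): once the same full load sits in the PSA twice, every key arrives twice, and only one record per key may reach the master data table.
        REPORT zdedup_master_data.
        TYPES: BEGIN OF ty_md,
                 objkey TYPE c LENGTH 18,   " master data key (illustrative)
                 descr  TYPE c LENGTH 40,   " attribute (illustrative)
               END OF ty_md.
        DATA lt_md TYPE STANDARD TABLE OF ty_md.
        " ... lt_md filled from both PSA requests, so every key appears twice ...
        SORT lt_md BY objkey.
        DELETE ADJACENT DUPLICATES FROM lt_md COMPARING objkey.
        " One record per key remains; clearing the PSA as suggested above
        " avoids the duplicates in the first place.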
