Load Identifier for Input Records

Hi
OWB Version 9.0.3 (9.0.4 available)
Having created an initial source-file load mapping (loading a STG1_GOOD table), I also plan to load the 'bad' file generated by SQL*Loader for rejected records (into a STG1_ERR table) in a second mapping.
Do you have any suggestions on how I can generate a record id that spans both these tables and represents the position of the record in the original source file?
Many thanks.

Not sure about your configuration, but you should use a sequence. 9.0.4 also has multi-table insert, which might be useful in cases like this.
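A rough sketch of both suggestions, assuming a 9.0.4-style external table over the source file (EXT_STG1 is hypothetical, as are the columns ITEM and QTY, and the WHEN condition stands in for whatever causes a record to be rejected):

    -- One sequence shared by both mappings, so the ids never collide
    -- across STG1_GOOD and STG1_ERR.
    CREATE SEQUENCE stg1_rec_seq START WITH 1 INCREMENT BY 1;

    -- 9.0.4-style conditional multi-table insert: route each row to the
    -- good or the error table in a single pass, stamping both targets
    -- from the same sequence. (Sequences are allowed in the VALUES
    -- clauses of a multi-table insert, but not in its subquery.)
    INSERT ALL
      WHEN qty IS NOT NULL THEN
        INTO stg1_good (rec_id, item, qty) VALUES (stg1_rec_seq.NEXTVAL, item, qty)
      ELSE
        INTO stg1_err  (rec_id, item, qty) VALUES (stg1_rec_seq.NEXTVAL, item, qty)
    SELECT item, qty FROM ext_stg1;

One caveat: a sequence numbers rows in processing order, which only matches the position in the source file if the rows are read in file order. If the first mapping stays on SQL*Loader, the RECNUM keyword in the control file captures the physical record number directly; note, though, that the bad file contains only the rejected records themselves, not their original positions.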
Regards:
Igor

Similar Messages

  • Replicating data once again to CRM after the initial load fails for a few records

    My question (to put it simply):
    We performed an initial load for customers and some records error out in CRM due to invalid data in R/3. How do we get the data into CRM after fixing the errors in R/3?
    Detailed information:
    This is a follow up question to the one posted here.
    Can we turn off email validation during BP replication?
    We are doing an initial load of customers from R/3 to CRM, and those customers with invalid email address in R/3 error out and show up in SMW01 as having an invalid email address.
    If we decide to fix the email address errors in R/3, these customers should then be replicated to CRM automatically, right? (since the deltas for customers are already active) The delta replication takes place, but then we get this error message: "Business Partner with GUID 'XXXX...' does not exist".
    We ran the program ZREPAIR_CRMKUNNR provided by SAP to clear out any inconsistent data in the intermediate tables CRMKUNNR and CRM_BUT_CUSTNO, and then tried the delta load again. It still didn't seem to go through.
    Any ideas how to resolve this issue?
    Thanks in advance.
    Max

    Subramaniyan/Frederic,
    We already performed an initial load of customers from R/3 to CRM. We had 30,330 records in R/3 and 30,300 of them have come over to CRM in the initial load. The remaining 30 show BDOC errors due to invalid email address.
    I checked the delta load (R3AC4) and it is active for customers. Any changes I make to customers already in CRM come through successfully. When I make changes to customers with an invalid email address, the delta gets triggered and the data comes through to CRM, but I get the BDOC error "BP with GUID XXX... does not exist".
    When I do a request load for that specific customer, it stays in the "Wait" state forever in "Monitor Requests".
    No, the DIMA did not help, Frederic. I did follow the same steps you mentioned in the other thread, but it just doesn't seem to run. I am going to open an OSS message with SAP for it. I'll update the other thread.
    Thanks,
    Max

  • FDMEE Import error "No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'"

    Hi,
    We are having trouble importing one ledger, 'GERMANY EUR GGAAP'. It works for Dec 2014, but when trying to import data for 2015 it gives an error.
    The import error shows "RuntimeError: No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'."
    I tried all the Knowledge docs from Oracle Support, but no luck. Please help us resolve this issue, as it's occurring in our Production system.
    I also checked all the period settings under Data Management > Setup > Integration Setup > Global Mapping and Source Mapping, and they all look correct.
    Also, it's only happening to this one ledger; all the other ledgers are working fine without any issues.
    Thanks

    Hi,
    There are some Support documents related to this issue.
    I would suggest you have a look at them.
    Regards

  • How to identify the largest data loads done for a particular day?

    Hi all,
    A huge volume of data was loaded last Monday, and the data was then deleted from the cubes on the same day. Hence I need to see which cubes were loaded with a lot of records on that particular day.
    I happened to look at RSMO, but I am unable to see the ODS or the cube data loads.
    Where do I see it?
    Thanks

    See if the table RSSELDONE helps. This will give you the recent data load details. Based on those loads, you can search the targets.
    Also check table TBTCO, which will give the latest job details. You will have to analyze those jobs to know what loads were done. Give a selection for the date.
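    For reference, here are the equivalent selections as direct SQL — a sketch only, since in practice you would browse these tables via SE16 with the same criteria, and the column names here are assumptions from memory (verify them in SE11 first):

        -- Load requests selected on the day in question
        -- (DATUM/UZEIT assumed to be RSSELDONE's date/time columns).
        SELECT rnr, logdpid, datum, uzeit
        FROM   rsseldone
        WHERE  datum = '20100503';  -- placeholder date, format YYYYMMDD

        -- Background jobs that started that day
        -- (STRTDATE/STATUS assumed; status 'F' = finished).
        SELECT jobname, jobcount, strtdate, status
        FROM   tbtco
        WHERE  strtdate = '20100503';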

  • LSMW Batch Input Recording for Create BOM

    Dear All,
    I want to do an LSMW for creating BOMs using batch input recording.
    I know that I have to make two recordings: first for the BOM header and second for the BOM items.
    I have done the BOM header, but for the BOM items I ran into trouble with the item number (POSNR):
    when I run the LSMW, the first item (item 0010) is successful, but item 0020 fails and I get the error
    message "NO BATCH INPUT FOR SAPLCSDI 0150".
    How should the recording for BOM items be done, step by step, so that the item number increments correctly?
    Very need your help,
    Regards,
    Marufat

    Hello Santosh,
    Thanks for the reply,
    I already checked in SM35, where I also thought that the main problem is adding the new item number,
    but when I tried to repeat the recording, I could not find any entry for adding the line item, so the line items after 0010 cannot be input.
    Is there any solution?
    Regards,
    Marufat

  • Customized LSMW (Batch Input Recording) to upload data for Vendor

    Hello Friends,
    Can you help me with this object and with how to upload in XK01?
    It is a customized LSMW (batch input recording) to upload Vendor Master data using transaction code XK01.
    With best wishes,
    Chandu.
    Points will be rewarded...

    Hi,
    Go through the following link and you will find your answer:
    http://www.sapbrain.com
    Regards,
    Bhaskar

  • LSMW for Listings via Batch Input recording - Help?

    Hi
    Has anyone had any luck trying to upload listings via LSMW?
    I have created one via batch input recording.
    I have lots of records with between 1 and 16 entries each,
    and I have recorded enough entries to cope with 16.
    However, when I try to execute the batch session, if there is only 1 material that requires an entry, I cannot get past the screen: the LSMW wants entries for the next line, and so on.
    I presume I must have to put some sort of rule in per expected material number, so that if it is not populated the session moves on but saves the entries that have already been made.
    I do not have much experience in writing rules in LSMW, so
    any ideas / help would be much appreciated.
    Thanks
    Tony

    Hello Tony,
    In the past I have suggested this link to a few people and they were successful with it. Please try it; the only thing you need to take care of is to use VB02 instead of XK01 when following what the link shows.
    http://youtu.be/fz94PcvtdZw
    Regards,
    Sridhar.

  • Data Load for 20M records from PSA

    Hi Team,
    We need to reload a huge volume of data (around 20 million records) of billing data from the 2LIS_13_VDITM PSA to the first-level DSO and then to the higher-level targets.
    If we run the entire load as one full request from PSA to DSO for 20M records, will it have any performance issue?
    Would it be a good approach to split the load based on Billing Document Number?
    In case we do split the load by Billing Document Number, will it create any performance issue from the reporting perspective (if we receive the data in multiple requests)? Most of the reports would be run based on Date and not on Billing Document Number.
    Thanks
    San

    Hi,
    A better solution: put a filter based on the calendar year or fiscal year.
    Check how many years of data there are, and based on that you can set the filters.
    Thanks,
    Phani.

  • Data Entry Layouts - Rows not ready for inputs - Multiple records

    Hi,
    I have created a simple layout using a predefined simple structure for rows (FS Items) and a data driven one for columns (for the period value in LC).
    Some cells show some data in them and are not available for input because there are multiple transaction records behind the numbers.
    The transaction records share the same breakdowns except for the Currency Translation Indicator (it is empty for original records and shows '1' for records created during currency translation) and the currency key for transaction currency (sometimes it is empty).
    It would be much appreciated if you let me know a way to introduce new data with layouts for FS Items (accounts) which already have transaction records.

    Hi Roberto,
    Data may not be ready for input for different reasons. The ones I have met most often are: you didn't expand the row structure down to the leaves (not nodes), or you marked the column as display-only.

  • LSMW for material master using Batch input recording method

    Dears,
    I am using the batch input recording method to upload material master data. But while selecting views, I need to scroll to select the views, let's say the Plant/Storage Location view. When I scroll and select, the views are not recorded properly. That is, when I run in foreground, the system selects only the Basic Data and Purchasing views, but the plant data views are not selected (the plant data views were selected by scrolling while recording).
    How do I resolve this issue?
    Please help.
    Regards
    Kamesh

    Hi,
    I need to scroll to select the views, let's say the Plant/Storage Location view. When I scroll and select, the views are not recorded properly.
    Don't scroll; use the Page Down button on the keyboard.
    For example, select the Basic Data view and then press Page Down; you will get the next view, and you can then select it.
    Regards
    Kailas Ugale

  • How many inputs can be used for audio recording?

    Just like the subject says: how many inputs can be used for audio recording at one time? I have four audio tracks but I could only arm 2, because I only have the choice of inputs 1 and 2. I've tried several things but can't seem to get any more inputs. Is it possible to get more? If so, can anyone help me? I've been told about audio input objects, but it didn't work.

    Greetings! You don't say what audio hardware you have. If you have nothing other than your standard inputs on the MacBook, then you can't have more than two inputs. If you have a multi-input audio interface then it sounds as if it's not connected properly. Check:
    Audio > Audio Hardware and Drivers
    Make sure your audio interface is selected.
    Also, click-hold on the I/O button for each track and make sure that you have selected different inputs - there should be one input listed for each physical input port on your audio interface.
    If you're still having problems, you may need to delete your preferences and run Logic Set Assistant to rebuild your system - only takes a couple of minutes and worth doing.
    Clangwork is also right about Audio MIDI Setup since Logic can only see what AMS tells it about.
    Pete

  • New status for changed records or Additive Delta

    Hi Guys,
    I have a design question for ODS and Extractor.
    The requirement is as follows...
    There is a table in CRM with service point as the primary key. The fields in the table, for example, are Bill Account and Meter ID; the key figure is an XYZ value. In CRM, for a given Bill Account / Meter ID combination, the XYZ value changes every year, and the same record is updated with the changed value. But in BW, when the delta is sent, I do not want it to update the existing record; I want it added as a new record. This way the history of previous years' XYZ values can be tracked.
    What is required to make this happen?
    When creating the extractor, should I set the delta to 'new status for changed records' or 'additive delta'?
    What should I indicate as the delta-specified field?
    For the ODS (I am on BI 7.0), should I create a normal ODS, a write-optimized one, or something else for this requirement (I don't remember right now the 3 types of ODS we can create).
    Your inputs will be greatly appreciated.
    Thanks
    Kumar.

    Thanks Puneet!
    Good Answer.
    New and changed records do help in getting the records as desired.
    But the problem is that while loading into BW I cannot create a new record. The primary key is only Service Point, so even if the key figures and dates associated with the record change, BW will still update my existing record.
    Is there a possibility of using the dates as a pointer that a new record should be created?
    Let me explain it better. Following is the structure of table.
    Service Point (primary key)
    Product GUID (key in CRM)
    Circuit ID
    Installation date
    Effective date
    Termination date
    CRC (some key figure)
    The scenario: this year, Service Point A has CRC value 50 with effective and termination dates of, for example, May 1 2009 - Apr 31 2010.
    Service Point   CRC   EffectiveDate   TerminationDate
    A                     50      5/1/2009         4/31/2010
    Next year the same service point will have different values, and the CRM system will update the record in the CRM table.
    Service Point   CRC   EffectiveDate   TerminationDate
    A                     60      5/1/2010         4/31/2011
    1) So I was wondering if there is a way, or some code that can be written, to check whether the effective and termination dates are not the same, and if so add the row to the ODS as a new record.
    2) Another way is to introduce some sort of time dependency: use Valid From and Valid To dates to track the changes to the record and display all the different CRC values together with the dates between which they were valid (see the sketch after this list).
    3) Or some other way you can suggest.
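    To illustrate option 2 in plain relational terms — a hypothetical sketch only, with made-up table and column names (in BW this corresponds to making Effective Date part of the DSO key rather than writing DDL): once the effective date is part of the key, each year's CRC arrives as a new row instead of overwriting the old one.

        -- Hypothetical history table: the composite key makes the 2010
        -- record insert alongside the 2009 record instead of updating it.
        CREATE TABLE service_point_hist (
          service_point    VARCHAR2(10) NOT NULL,
          effective_date   DATE         NOT NULL,
          termination_date DATE,
          crc              NUMBER,
          CONSTRAINT pk_sp_hist PRIMARY KEY (service_point, effective_date)
        );

        -- Year 1 and year 2 both survive (using Apr 30, since Apr 31
        -- does not exist).
        INSERT INTO service_point_hist VALUES ('A', DATE '2009-05-01', DATE '2010-04-30', 50);
        INSERT INTO service_point_hist VALUES ('A', DATE '2010-05-01', DATE '2011-04-30', 60);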

  • Cross Day work issue in Time Evaluation for positive recording.

    Dear Experts,
    We would like to know whether anyone else has faced the below described issue before or has any suggestions for the same.
    Issue Background: We are uploading time events from the clock recording system into SAP using an interface program. The clocking events are loaded without origin indicator 'M'; the day assignment is '=' when they belong to the current day and '<' when they belong to the previous day.
    Issue Description: Some operational staff have planned daily shifts from 00:00 - 08:30. Hence they can clock in before 00:00 on the previous day, and since there is no 'Next Day Indicator' in SAP, we split the time events so that if the clock-in is at 23:00 on the previous day, the time pairs are formed as
    23:00 - 24:00 Current Day with Day assignment '='
    00:00 - 08:30 Next Day with Day assignment '='
    When we run the time evaluation, the day assignment for Next day pair 00:00 - 08:30 is automatically changed to '<'. This makes the Next day without any events and the time evaluation throws error in the next day.
    Now instead of automatic upload, if we manually upload the same time events, the records are as follows:
    23:00 - 24:00 Current Day with Day Assignment '+' and Origin Indicator 'M'
    00:00 - 08:30 Next Day with Day Assignment '+' and Origin Indicator 'M'
    Now if we execute time evaluation, the behaviour is as per expected and the time evaluation doesn't change the day assignment for any records.
    Workaround Proposed: Keep at least a 1-second difference between the time events; say the above records are uploaded as follows:
    23:00 - 23:59 Current Day with Day assignment '='
    00:00 - 08:30 Next Day with Day assignment '='
    But we would like to avoid this if possible.
    Summary: In brief, we would like to know why time evaluation processes origin indicator 'M' and SPACE differently.
    Many thanks in advance for your inputs.
    Regards,
    Roshan.

    Hi,
    See the thread:
    Time evaluation - early clock-in for shifts starting at midnight?
    It may be of interest.
    bg

  • Post mapping for Reject records!!

    hi
    I would like to capture all the rejected records when I am loading the fact table. I know I can get this information from WB_RT_ERRORS and WB_RT_ERROR_SOURCES, but what I would like is a post-mapping process, with the action set to "on error", that captures all the rejected records into a flat file. Is there a way I can identify the rejects in the post-mapping process other than referring to WB_RT_ERRORS and WB_RT_ERROR_SOURCES at runtime? This is what the client is requesting, so any help would be greatly appreciated.
    Please also reply to me directly, since I sometimes do not get emails through the distribution list.
    Thanks in Anticipation,
    Balaji

    Reposting the response from M. Van Der Wiel:
    2 comments:
    - Ideally, you would explicitly capture the errors and insert them into a separate table. This would enable you to run the mapping in set-based mode (traditionally this means: no error logging) for optimal performance, and you still get the errors. It does mean you have to explicitly design for what may go wrong, so you should know what your data looks like. Your flat file could then be created from the explicit error records, which is probably a bit easier (and faster) than going from WB_RT_ERRORS.
    - The mapping errors out once the maximum number of errors is reached (as passed at runtime; it defaults to the configuration setting). Anything between 0 and the maximum number of errors will result in a warning status.
    To do what you want, you could indeed use the post-mapping process, but perhaps you want to design a separate mapping that writes the errors to a file, and use a process flow (with conditional sequencing) to run the second mapping only if the first one fails or results in a warning. This may be a nicer solution than writing the code manually and implementing it as a standalone post-mapping process.
    Finally, note that WB_RT_ERRORS and the like are not tables you should query directly (they will change in the future). Rather, you should use the ALL_RT_<something> views to access the same information. Going forward, if the customer migrates to a later release, it is then more likely that their customizations will still work.
    Thanks,
    Mark.
    PS.
    Another possibility - if the errors violate a key constraint - would be to configure the mapping target with the constraints parameter set to false and redirect the error records to an error table (this can be done in the mapping configuration, sources and targets section). This configuration disables the constraints during the load and re-enables them afterwards, putting the offending records in the error table. You can then download the records from the error table into a flat file in a separate mapping or in a post-mapping process.
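    A minimal sketch of that last step, assuming a DIRECTORY object the database can write to and placeholder names (FACT_ERR and its columns stand in for whatever error table the load produces); UTL_FILE is standard Oracle, the rest is illustrative:

        -- One-off setup: a directory object (path is a placeholder).
        CREATE OR REPLACE DIRECTORY err_dir AS '/data/owb/errors';

        -- Dump the offending rows to a flat file.
        DECLARE
          f UTL_FILE.FILE_TYPE;
        BEGIN
          f := UTL_FILE.FOPEN('ERR_DIR', 'fact_rejects.txt', 'w');
          FOR r IN (SELECT rec_id, err_text FROM fact_err) LOOP
            UTL_FILE.PUT_LINE(f, r.rec_id || '|' || r.err_text);
          END LOOP;
          UTL_FILE.FCLOSE(f);
        END;
        /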
    Regards:
    Igor

  • How to load AFS Material Master records.

    Hello everyone,
    I am new to loading AFS Material Master records. In the past I have simply used the Material Master direct input programs, or batch input, to load the data. Because of the extra AFS fields, I was guessing there must be a new direct input program, or batch input, for it. Maybe there is an IDoc that must be used?
    Any help will be appreciated.  Thank you all.

    Hey there, I just wanted to bring this subject back up.
    I always took it for granted that we loaded materials in batches. We would load basic data for some materials and then extend the plant data, then sales, storage-location, and warehouse data: a total of 5 files, one after the other. I never really thought about why we did this, but now that I think about it, maybe it was because it only takes a little bit of data to create the basic data. When you extend the plant data, you do not need to re-enter the data already created in the initial basic data load, and sales data only requires so much, so you can skip all the unnecessary basic and plant data.
    Is that why you loaded in batches, because it is easier to extend?
    If you wanted to, you could have all the basic, plant, sales, and purchasing data on one record. Why didn't you do it that way?
    Thanks for the input.
