Data load unsuccessful for LIS structure 2LIS_01_S005

Hi Guys,
I have installed the SD InfoCube for LIS structure 2LIS_01_S005 (Shipping Point) from Business Content. Everything went smoothly, but when I finally arrived at the content display, I found zero records.
I am unable to trace where things went wrong.
Please write in with your valuable suggestions/solutions;
it would be greatly appreciated.
Regards,
Nasir.

Those seem to be old DataSources.
Check the link below:
question about 2LIS_01_S001
Hope this helps.

Similar Messages

  • BPC NW 7.0: Data Load: rejected entries for ENTITY member

    Hi,
    when trying to load data from a BW InfoProvider into BPC (using UJD_TEST_PACKAGE and process chain scheduling), a number of records are being rejected due to missing member entries for the ENTITY dimension in the application.
    However, the ENTITY members actually do exist in the application. Also, the dimension is processed with no errors. The dimension members are also visible using the Excel Client navigation pane for selecting members.
    The error also appears when kicking off the data load from the Excel Client for BPC. Any ideas how to analyze this further or resolve this?
    Thanks,
    Claudia Elsner

    Jeffrey,
    this question is closely related to the issue, because there is also a short dump when trying to load the data into BPC. I am not sure whether both problems are directly related, though:
    Short dump with UJD_TEST_PACKAGE
    Problem description of the post:
    When running UJD_TEST_PACKAGE, I get a short dump:
    TSV_TNEW_PAGE_ALLOC_FAILED
    No more storage space available for extending an internal table.
    Other keywords are CL_SHM_AREA and ATTACHUPDATE70.
    When I looked at Notes, I found Note 928044 ("BI lock server"). Looking at the note and debugging UJD_TEST_PACKAGE leaves me with some questions:
    1. Do I need a BI lock server?
    2. Should the enque/table_size setting on the central instance be increased from 10000 to 25000 or larger?
    Claudia

  • Hierarchical Data Loading: XSD design for Native data

    We are working on native data received in the form of a flat file (attaching a few lines below):
    FWDREF VXA04_X001_GC
    FWDREF VXA04_X010_GC
    FWDREF VXA04_X050_GC
    FWDREF VXA04_X051_GC
    FWDREF VXA04_X075_GC
    FWDREF VXA04_X100_GC
    FWDREF VXA04_X101_GC
    FWDREF VXA04_X150_GC
    SECTIDAVXBOSY SHELL AND PANELS
    SECTIDAGBBOSY SHELL AND PANELS
    SECTIDABKКОРПУС НА КАРОСЕРИЯА И ПАНЕЛИ
    SECTIDACZDKELET KAROSERIE A PANELY
    ILLREFBA1 A05_A1_B
    ILLREFBA1-1 A05_A1-1_B
    ILLREFBA1-2 A05_A1-2_B
    FWDREF VXB04_X101_GC
    FWDREF VXB04_X150_GC
    SECTIDBVXBOSY SHELL AND PANELS
    SECTIDBGBBOSY SHELL AND PANELS
    SECTIDACZDKELET KAROSERIE A PANELY
    ILLREFBA1 B05_A1_B
    ILLREFBA1-1 B05_A1-1_B
    This data is hierarchical.
    -FWDREF
    --SECTID
    ---ILLREF
    The challenge is that the number of occurrences of parent and child are not fixed, and they might not occur at all.
    For example, there might be a set of rows like this (in the example below, there is no SECTID):
    FWDREF VXB04_X150_GC
    ILLREFBA1 B05_A1_B
    How can the schema be designed in this case?
    Thanks in advance

    @rp0428
    Thanks for taking out the time to reply to this. If we talk in terms of a tree structure, in the normal scenario we would have the hierarchy as described before.
    -FWDREF
    --SECTID
    ---ILLREF
    If we don't talk in terms of XML and XSD and just talk in terms of databases and keys, FWDREF would be the parent, SECTID the child and ILLREF the grandchild. Now, in case SECTID does not appear, we would still want a key to be generated, but with a default value corresponding to it, so that the parent, child and grandchild relationship is maintained.
    The whole purpose of this XSD design is to use it in ODI, where this feed file will be automatically loaded into tables from the generated XML with the parent, child and grandchild relationships maintained.
    Also, I have taken a sample data set. In the actual case, the hierarchy goes up to a maximum of 20 levels.
    Thanks to everyone and anyone who takes out time for this!
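    The key-generation idea above (a default SECTID key when that level is absent, so parent/child/grandchild relationships survive) can be sketched outside of XSD/ODI terms. This is only a minimal illustration; the prefix handling and the `SECTID_DEFAULT` key are assumptions, not the poster's actual design:

    ```python
    def parse_hierarchy(lines):
        """Turn the flat rows into (fwdref_key, sectid_key, illref_value) tuples,
        inserting a default SECTID key when that level is missing."""
        rows = []
        fwdref = None
        sectid = None
        for line in lines:
            if line.startswith("FWDREF"):
                fwdref = line[len("FWDREF"):].strip()
                sectid = None                      # a new parent resets the child level
            elif line.startswith("SECTID"):
                sectid = line[len("SECTID"):].strip()
            elif line.startswith("ILLREF"):
                value = line[len("ILLREF"):].strip()
                if sectid is None:
                    sectid = "SECTID_DEFAULT"      # synthesised key for the missing level
                rows.append((fwdref, sectid, value))
        return rows

    # The problematic pair from the question: FWDREF directly followed by ILLREF.
    sample = [
        "FWDREF VXB04_X150_GC",
        "ILLREFBA1 B05_A1_B",
    ]
    result = parse_hierarchy(sample)   # ILLREF is attached via the default SECTID key
    ```

    In an XSD this corresponds to declaring SECTID with minOccurs="0" and letting the loader supply a surrogate key for the skipped level.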

  • Data load in BI7 for a single InfoObject

    Hi
      I had added a new InfoObject (PO number) in the ODS and mapped it to the corresponding R/3 field (BSTNK). Now I want to load data only into the newly added field (PO number) in BW from R/3 without deleting the existing data. I came to know it is possible in BI7; can someone help me with the steps please?
    Thanks a lot
    Sheetal

    <u>It is possible even in earlier versions, provided that no key figure is mapped to the Addition update method</u>.
    If any key figure has update method "Addition", we have these 2 options, even for BI 7.0:
    1. Drop the data and reload the entire data.
    2. Create a custom extractor to load only this field.
    If no key figure is set to Addition, just reload the entire data with the existing DataSource.
    As for the Remodeling workbench in BI 7.0: we can't work with a DSO there. It's not supported in the current release we have; it is planned for a future release, so I have no idea whether we can achieve this or not.
    Nagesh Ganisetti.
    Assign points if it helps.

  • Second BW data load job waits for first to complete.

    Hi all,
    I am manually loading data into my BW system. I have observed that if I try to load two DataSources together, the system waits for one to complete and only then triggers the second load, e.g. if I run 2LIS_11_VAHDR and 2LIS_13_VDHDR together.
    RSMO shows data records for 11_VAHDR, but for 13_VDHDR it is zero until 11_VAHDR finishes.
    I have 8 background processes available in both BW and R/3.
    Please help.
    Regards,

    Hi,
    the case may not be that one starts only after the other finishes...
    When you start those 2 loads, observe the background processes in R/3 that start with job name BI_REQ*.
    My guess is that when you start 2 loads at a time, there will be 2 jobs triggering in R/3 with the job name given above.
    One job may be going into RELEASED state in R/3 and waiting for the first job to finish; after that finishes, the waiting one comes into ACTIVE state and starts generating IDocs.
    From this we can say that it may be due to resource availability in R/3.
    Closely monitor the jobs in R/3.
    rgds,
    nkr.

  • HFM Data Load Issue

    Hello All,
    We had an EPMA-type HFM application whose dimensions were all local. The application validated and deployed successfully.
    We tried loading data into the HFM application, and the data load was successful.
    Then we decided to convert all of the local dimensions of the above-mentioned HFM application into shared dimensions. After converting all the dimensions to shared dimensions successfully, we are getting errors while loading data into the same HFM application (the app does validate and can be deployed after the changes).
    The Error log is below:
    Load data started: 11/29/2014 10:53:15.
    Line: 216, Error: Invalid cell for Period Oct.
    ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;11979
    >>>>>>
    Line: 217, Error: Invalid cell for Period Nov.
    ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;23544
    >>>>>>
    Line: 218, Error: Invalid cell for Period Dec.
    ACTUAL;2014; Dec; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;58709
    >>>>>>
    Line: 219, Error: Invalid cell for Period Oct.
    ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11979
    >>>>>>
    Line: 220, Error: Invalid cell for Period Nov.
    ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11565
    >>>>>>
    I wanted to know whether there is something I might have missed while converting local dimensions into shared ones (if there is any sequence to do so, or any constraint that I may not be aware of; the conversion looks good, as the application is validated and deployed after the changes).
    What can be the reason for the failed data load? Can anyone help?
    Thanks
    Arpan

    Hi,
    I would look at the account properties for that account (89920000) and check the TopCustom1...4Member settings. There you will find the reason behind the invalid cells.
    When you converted the local dimensions to shared, did you check the 'Dimension Association' for Accounts and Entities?
    The application does seem to lose the dimension associations if a proper sequence is not followed.
    Regards,
    S

  • Master data load gives dump with exception CX_RSR_X_MESSAGE

    Hi all,
    I scheduled a master data load for 0MATERIAL in our BI 7.0 Quality server. The load ended in a dump with the exception CX_RSR_X_MESSAGE. The data has loaded into the PSA (37257 of 37257 records), but the technical status is red and the master data object has not been updated.
    I have seen Note 615389, which talks about number ranges, but that check is successful. The data load was successful in the development server; this error has occurred only in Quality.
    We are on SP 12 in BI 7.0.
    We are moving to production soon, so this is an urgent issue. I would appreciate a quick response. Points will be rewarded.
    with regards,
    Ashish

    Hi,
    The log of the dump is given below.
    Error analysis
    An exception occurred which is explained in detail below.
    The exception, which is assigned to class 'CX_RSR_X_MESSAGE', was not caught and therefore caused a runtime error.
    The reason for the exception is:
    No text available for this exception
    Information on where terminated
    Termination occurred in the ABAP program "SAPLRRMS" - in "RRMS_X_MESSAGE".
    The main program was "SAPMSSY1".
    The termination point is in line 78 of the (Include) program "LRRMSU13".
    Source Code Extract
    Line  SourceCde
      48  *... Send the message to the message handler
      49
      50  *... S message if no handler is initialized
      51      DATA: l_type  TYPE smesg-msgty,
      52            l_zeile TYPE rrx_mesg-zeile.
      53      CALL FUNCTION 'MESSAGES_ACTIVE'
      54        IMPORTING
      55          zeile      = l_zeile
      56        EXCEPTIONS
      57          not_active = 1.
      58      IF sy-subrc NE 0.
      59        l_type = 'S'.
      60      ELSE.
      61        l_type = 'W'.
      62      ENDIF.
      63
      64      CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
      65        EXPORTING
      66          i_class  = 'BRAIN'
      67          i_type   = l_type
      68          i_number = '299'
      69          i_msgv1  = i_program
      70          i_msgv2  = i_text.
      71    ELSE.
      72      DATA: l_text  TYPE string,
      73            l_repid TYPE syrepid.
      74
      75      l_repid = i_program.
      76      l_text  = i_text.
      77
>>>>> 78      RAISE EXCEPTION TYPE cx_rsr_x_message
      79        EXPORTING text    = l_text
      80                  program = l_repid.
      81    ENDIF.
      82
      83  ENDFUNCTION.
    I went to SM21 and went through the system log. There was a log entry with the following text: ORA-20000: Insufficient privileges#ORA-06512: at
    "SYS.DBMS_STATS", line 2150#ORA-06512: at "SYS.DBMS_STATS.
    I found Note 963760 for this error. After the changes described in this note were made, I reloaded the data and it worked fine. A data load for another DataSource, which previously dumped, now worked as well.
    I am not sure if the note did the trick, so I am keeping this post open in case someone finds another solution or can confirm the one that worked for me.
    Thanks Oliver and Oscar, for replying.
    regards,
    Ashish

  • Data load failed at infocube level

    Dear Experts,
    I have data loads from the ECC source system for DataSource 2LIS_11_VAITM to 3 different data targets in the BI system. The data load is successful up to the PSA, but when it comes to loading the data targets, the load succeeds for 2 of them and fails at one data target, i.e. the InfoCube. I got the following error message:
    Error 18 in the update
    Diagnosis
        The update delivered the error code 18.
    Procedure
        You can find further information on this error in the error message of
        the update.
    Here I tried to activate the update rules once again by executing the program, and tried to reload using the reconstruction facility, but I get the same error message.
    Kindly, please help me to analyze the issue.
    Thanks&Regards,
    Mannu

    Hi,
    Here I tried to trigger a repeat delta under the impression that the error would not repeat, but then I encountered issues like:
    1. The data load status in RSMO is red, whereas in the data target the status is showing green.
    2. When I try to analyze the PSA from RSMO, transaction PSA gives me a dump.
    The following analysis is from transaction ST22:
    Runtime Errors         GETWA_NOT_ASSIGNED
    Short text
         Field symbol has not yet been assigned.
    What happened?
         Error in the ABAP Application Program
         The current ABAP program "SAPLSLVC" had to be terminated because it has
         come across a statement that unfortunately cannot be executed.
    What can you do?
         Note down which actions and inputs caused the error.
         To process the problem further, contact you SAP system
         administrator.
         Using Transaction ST22 for ABAP Dump Analysis, you can look
         at and manage termination messages, and you can also
         keep them for a long time.
    Error analysis
        You attempted to access an unassigned field symbol
        (data segment 32821).
        This error may occur if
        - You address a typed field symbol before it has been set with
          ASSIGN
        - You address a field symbol that pointed to the line of an
          internal table that was deleted
        - You address a field symbol that was previously reset using
          UNASSIGN or that pointed to a local field that no
          longer exists
        - You address a global function interface, although the
          respective function module is not active - that is, is
          not in the list of active calls. The list of active calls
          can be taken from this short dump.
    How to correct the error
        If the error occures in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "GETWA_NOT_ASSIGNED" " "
        "SAPLSLVC" or "LSLVCF36"
        "FILL_DATA_TABLE"
    Here I have activated the include LSLVCF36, reactivated the transfer rules and update rules, and retriggered the data load.
    But I am still getting the same error...
    Could anyone please help me to resolve this issue?
    Thanks a lot,
    Mannu

  • Data load status stays yellow

    Hi,
    My BW platform is BW 7.01 with SP 5. I am loading ECCS data hourly with the 3.5 method, which includes a transfer rule and an update rule. Most of the time the data loads complete successfully. The total number of records is about 180,000, extracted in 4 packets. Once in a while, randomly, one of the data packets stays yellow and the load cannot complete. But in the next hour's data load, the data loads successfully. We know it is not a data issue. Does anyone know why the data load is not consistent? Any suggestions are much appreciated.
    Thanks for your suggestions,
    Frank

    Hi Frank,
    This might be because some of the tRFCs or IDocs got hung.
    Check whether the source system job finished or not.
    If the source system job completed, check the tRFCs and IDocs for any that are hung.
    In the monitor screen --> menu bar: Environment --> select Job Overview --> Source System (enter id and pwd) and check the job status.
    Once that's done, to check the tRFCs:
    InfoPackage Monitor --> Environment --> Transact. RFCs --> In Source System --> it will display all the tRFCs; check for your hung tRFCs and try to flush them manually (F6).
    If IDocs are hung, process the IDocs manually.
    Regards
    KP

  • GL Data Load - INIT and FULL

    Hi,
    I am getting different record counts for Full and Init GL data loads.
    The count for Init is about 300 records less than the count for the Full load.
    What could be the reason?
    Thanks,

    While posting a question, be clear about which cube/DataSource and which GL (old or new) you are working with, and what the background is... else it's just speculating and beating around the bush guessing.
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/45/757140723d990ae10000000a155106/content.htm
    New GL data flow:
    Re: New GL cubes 0FIGL_V10, 0FIGL_V11
    Hope it Helps
    Chetan
    @CP..

  • How to design data load process chain?

    Hello,
    I am designing data load process chains for the first time and would like to get some general information on best practices in that area.
    My situation is as follows:
    I have 3 source systems (R/3 and two for which I use flat files).
    What do you suggest: should I define one big chain for my whole loading process (I have about 20 InfoSources), or define a few shorter ones, e.g.
    1. Master data R3
    2. Master data flat file system 1
    3. Master data flat file system 2
    4. Transaction data R3
    5. Transaction data file sys 1
    ... and execute each one after the successful end of the previous one?
    Could you also suggest me any links or manuals on that topic?
    Thank you
    Andrzej

    Andrzej,
    My advice is to make separate chains for master & transaction data (always load in this order!) and afterwards make a 'master chain' where you insert these 2 chains one after the other (so: Start process -> Master data chain -> Transaction data chain).
    Regarding the separate chains: parallelize as much as possible (if functionally allowed). Normally, the number of parallel ('vertical') chains equals the number of CPUs available (check with your Basis person).
    Hope this provides you with enough info to start off with!
    Regards,
    Marco
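    The ordering rule above (finish all master data in parallel, then start transaction data in parallel) can be sketched in plain code. The chain names are placeholders and the `load` function stands in for a real process chain run:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Placeholder load steps; in BW these would be process chains.
    def load(name):
        return f"{name} ok"

    MASTER_CHAINS = ["Master data R3", "Master data flat file 1", "Master data flat file 2"]
    TRANSACTION_CHAINS = ["Transaction data R3", "Transaction data file 1"]

    def run_master_chain(workers=3):
        """Run each group's chains in parallel, but finish all master data
        before any transaction data starts (the ordering rule above)."""
        results = []
        for group in (MASTER_CHAINS, TRANSACTION_CHAINS):
            with ThreadPoolExecutor(max_workers=workers) as pool:
                results.extend(pool.map(load, group))   # barrier: pool closes per group
        return results
    ```

    The `with` block acts as the barrier between the two groups, mirroring the Start process -> Master data chain -> Transaction data chain sequence.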

  • SPM Data Loads : Less number of records getting loaded in the Invoice Inbound DSO

    Dear Experts,
    We are working on a project where data from different non-SAP source systems is being loaded into SPM via flat file loads. We came across a very weird situation.
    For the other master and transaction data objects it worked fine, but when we loaded the invoice file, fewer records got loaded into the inbound DSO. The invoice file contained 80000 records, but the inbound DSO has only 78500 records. We are losing 1500 records.
    We are unable to figure out which 1500 records we are missing. We couldn't find any logs in the inbound invoice DSO, and we cannot tell whether the records are erroneous or there is an issue with something else. Is there a way to analyze the situation / the inbound invoice DSO?
    For the outbound DSO or cube, we know it is possible to check the issue via the data load request, but for the inbound DSO we are not aware of the way to analyze the issue and why the inbound DSO is taking fewer records.
    Regards
    Pankaj

    Hi,
    Yes, this can happen in a DSO: the data records have semantic keys, so after aggregation on the key fields you may end up with fewer records.
    If you have any routines, check the code (for any condition that filters records).
    Regards.
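    A quick way to test this theory offline is to count how many file rows collapse onto the same DSO key. The field names below are hypothetical; the point is that a DSO with overwrite semantics keeps one record per key combination:

    ```python
    import csv
    from io import StringIO

    # Hypothetical invoice rows: assume the DSO key is (doc_number, line_item).
    flat_file = StringIO(
        "doc_number,line_item,amount\n"
        "4711,10,100\n"
        "4711,10,150\n"    # same key as the first row: overwritten in the DSO
        "4711,20,200\n"
        "4712,10,300\n"
    )

    rows = list(csv.DictReader(flat_file))
    keys = {(r["doc_number"], r["line_item"]) for r in rows}

    # len(rows) is what the file delivers; len(keys) is what the DSO would keep.
    ```

    If the file's distinct-key count matches the 78500 in the inbound DSO, the "missing" 1500 records are key duplicates, not load errors.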

  • Announcing 3 new Data Loader resources

    There are three new Data Loader resources available to customers and partners.
    •     Command Line Basics for Oracle Data Loader On Demand (for Windows) - This two-page guide (PDF) shows command line functions specific to Data Loader.
    •     Writing a Properties File to Import Accounts - This 6-minute Webinar shows you how to write a properties file to import accounts using the Data Loader client. You'll also learn how to use the properties file to store parameters, and to use the command line to reference the properties file, thereby creating a reusable library of files to import or overwrite numerous record types.
    •     Writing a Batch File to Schedule a Contact Import - This 7-minute Webinar shows you how to write a batch file to schedule a contact import using the Data Loader client. You'll also learn how to reference the properties file.
    You can find these on the Data Import Resources page, on the Training and Support Center.
    •     Click the Learn More tab> Popular Resources> What's New> Data Import Resources
    or
    •     Simply search for "data import resources".
    You can also find the Data Import Resources page on My Oracle Support (ID 1085694.1).

    Unfortunately, I don't believe that approach will work.
    We use a similar mechanism for some loads (using the bulk loader instead of web services) for the objects that have a large quantity of daily records.
    There is a technique (though messy) that works fine. Since Oracle does not allow the "queueing up" of objects of the same type (you have to wait for "account" to finish before you load the next "account" file), you can monitor the .LOG file for the SBL 0363 error (which means you cannot submit another file yet, typically because one is already running).
    By monitoring for this error code in the log, you can sleep your process, then try again after a preset amount of time.
    We use this to allow for an UPDATE, followed by an INSERT on the account... and then a similar technique so "dependent" objects have to wait for the prime object to finish processing.
    PS: normal Windows .BAT scripts aren't sophisticated enough to handle this. I would recommend either Windows PowerShell or C/Korn/Bourne shell scripts on Unix.
    I hope that helps some.
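    The sleep-and-retry loop described above can be sketched like this (the retry interval and the way the log is read are assumptions; only the SBL 0363 code comes from the post):

    ```python
    import time

    BUSY_ERROR = "SBL 0363"   # log code: a load of this object type is still running

    def is_busy(log_text):
        """True if the loader log reports another load of the same object type."""
        return BUSY_ERROR in log_text

    def wait_until_free(read_log, sleep=time.sleep, retry_seconds=60, max_attempts=30):
        """Poll the log until the busy error disappears, sleeping between attempts.
        read_log is a callable returning the current log text, so the polling
        logic stays testable without a real loader run."""
        for _ in range(max_attempts):
            if not is_busy(read_log()):
                return True           # free: safe to submit the next file
            sleep(retry_seconds)      # busy: back off, then check again
        return False                  # gave up after max_attempts polls
    ```

    The same structure works in PowerShell or a Korn/Bourne shell, which is what the reply recommends over plain .BAT files.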

  • Data Package 000001 : sent, not arrived. No data load in BW 7

    Hi gurus... I am working with BW 7, and I already had the complete data flow from SRM to BW: some Business Content extractors and some other generic extractors. We have created transformations, InfoPackages and DTPs; in fact, we have loaded master data and transactional data. But since last Monday the data load is not working any more; now, every time we manually run the InfoPackage, the following warning message appears:
    Data Package 000001 : sent, not arrived
    and the data load just waits for an answer from SRM (which is the source system), or SRM waits for a request from BW.
    We have reactivated the extractors in SBIW, then replicated and reactivated them in BW (we also reactivated the transformations), and the data load is still not working. We then also regenerated the DataSource, but it does not work.
    Do you have any idea of what is happening, or what the problem could be?
    Thanks

    Check in transaction SMQA in the source system. The communication might not be working. If there are any entries in RED, select 'Execute LUWs' from the menu to process them manually.
    Read the error and try to resolve it first, though.

  • Data loaded but request is in red

    Hi Friends,
    I have an ODS and a cube; the data gets loaded into the ODS initially and then into the cube through that ODS. Everything happens through a process chain. Yesterday the ODS data loaded and was available for reporting, but the request is in red; it failed to load the data further into the cube. My doubt is: if the data is loaded and available for reporting, will the red request cause any problem? I have tried to change the QM status, but it did not allow the change. Please guide me on this. Max points will be awarded.
    Thanks,
    Prasad

    Hi...
    I think the request in your ODS is partially activated, because QM status red but still available for reporting only happens in that case. You cannot change the status manually, since it is partially activated.
    If the load is a full upload, then delete the request from the target and reload it.
    But if it is a delta load, then change the QM status of the InfoPackage to red in RSMO; at the target level (I mean in the Manage screen) it will not allow you to change it. For a full upload it is also better to change the QM status to red, then delete the request and reload it.
    Is there an ODS activation step in your process chain? If not, check the settings of the ODS in RSA1: in the Settings tab there is a checkbox 'Activate Data Automatically'. If it is checked, the request is activated automatically after data gets loaded into the ODS. But when you are using a process chain, this is not good practice; it is better to keep a separate ODS activation step in the chain.
    Hope this helps.
    Regards,
    Debjani
