How to sort out the data load error?

Hi,
I have a data model which has the following flow:
First from IS to ODS1.
Second from ODS1 to ODS2.
Third from ODS2 to the InfoCube.
Now, while loading from ODS2 to the InfoCube, some date fields were incorrect, which caused a data load error. Since the first two data loads were successful, I am not sure where I should change the date values.
The incorrect date values were in the format: "10.0..06.2"
Any ideas?
Raj

Hi
Check these links:
http://sap.ittoolbox.com/groups/technical-functional/sap-bw/loading-fails-from-an-ods-into-a-cube-error-603460
Re: error while loading data into cube
Hope it helps
Regards
Ashwin

Similar Messages

  • How to automate the data load process using data load file & task Scheduler

    Hi,
    I am automating the process of loading data into a Hyperion Planning application with the help of a Data_Load.bat file and Task Scheduler.
    I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_load.bat file and Task Scheduler, or tell me what other files are required to achieve this?
    Thanks

    To follow up on your question: are you using MaxL scripts for the data load?
    If so, I have seen an issue where, if the batch file (e.g. load_data.bat) does not use the full path to the MaxL script when run through Task Scheduler, the task will report success but the log and/or error file will not be created. In other words, the batch claims it ran from Task Scheduler although it didn't do what you needed it to.
    If you are using MaxL, use this as the batch command:
    "essmsh C:\data\DataLoad.mxl"
    You can also use the full path to essmsh; either way works. The only reason the MaxL might then not work is if the batch does not pick up the PATH changes MaxL needs, or if you need to update your environment variables so that the essmsh command works in a command prompt. A minimal sketch of such a wrapper batch file follows below.

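    A minimal sketch of such a wrapper batch file, assuming essmsh is available on the PATH; the MaxL script path follows the example above, and the log path is a placeholder, not a value from the original posts:

      @echo off
      rem Hypothetical Task Scheduler wrapper - the log path is a placeholder.
      rem Using full paths avoids the missing log/error file issue described above.
      cd /d C:\data
      essmsh C:\data\DataLoad.mxl > C:\data\DataLoad_run.log 2>&1
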
  • How to find the data loaded from R/3 to BW

    Hi,
    How can I verify that the data loaded from R/3 to BW is correct? I am not able to find which field in the query is connected to which field in R/3, i.e. where I am getting the data from in R/3. Is there any way to find which field and table the data is coming from? Please help.
    Thanks in advance to you all.

    Hi Veda, the mapping between R/3 fields and BW InfoObjects takes place in the transfer rules. Other transformations can take place in the update rules.
    So you could proceed this way: look at the InfoProvider data model and see whether the query performs any calculations (even with virtual key figures/characteristics). Then go back to the update rules and search for other calculations/transformations. Finally, there are the transfer rules and possibly DataSource/extraction enhancements.
    As you can see, there are many points you have to look at; it's quite complex work, but very useful.
    Once you have identified all mappings/transformations, check whether the BW data matches R/3 (taking the calculations into account).
    Good job
    GFV

  • How to delete the data loaded into MySQL target table using Scripts

    Hi Experts
    I created a job with a Validation transform. The data that passes validation is loaded into a Pass table and the data that fails is loaded into a Failed table.
    My requirement is: if data was loaded into the Failed table, then I have to delete the data loaded into the Pass table using a script.
    In the script I wrote the code as
    sql('database','delete from <tablename>');
    but the SQL query execution is raising an exception.
    How can I delete the data loaded into the MySQL target table using scripts?
    Please guide me on this error.
    Thanks in Advance
    PrasannaKumar

    Hi Dirk Venken,
    I got the solution; my mistake was that the query syntax was not correct for MySQL.
    Working query:
    sql('MySQL', 'truncate world.customer_salesfact_details')
    Query that raised the error:
    sql('MySQL', 'delete table world.customer_salesfact_details')
    Thanks for your concern
    PrasannaKumar
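    For reference, a plain MySQL DELETE also works as long as it includes the FROM keyword, so either of the following Data Services script calls should remove the rows (the datastore and table names are taken from the post above):

      # Row-by-row delete (logged by MySQL, slower on large tables):
      sql('MySQL', 'delete from world.customer_salesfact_details');
      # Remove all rows at once, as in the working query above:
      sql('MySQL', 'truncate table world.customer_salesfact_details');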

  • How to stop the data loads through process chains

    hi,
    I want to stop all the periodic data loads to BI that run through process chains.
    Kindly suggest how I can proceed.

    Hi,
    Go to RSPC, find your process chain, and double-click on the START variant, then change the timing, i.e. set the start date to something like 01.01.9999. Save and activate the process chain; it won't start until 01.01.9999.
    Thanks
    Reddy

  • Data load error - please respond ASAP

    Hi ,
    When I try to load data for year '07 using rule files, it shows an error and aborts the data load.
    Error message: "Data Value [2007] Encountered before all dimensions Selected, [5] Records Completed"
    Any help greatly appreciated.
    Thanks

    The key here is that 5 records were loaded before failing on the 6th, indicating it's not a header issue in the load rule.
    Check the first 5 records and see how the column containing 2007 differs between them and the failing 6th record. Perhaps the 6th record has a null (as indicated earlier), or perhaps 2007 is not the name of a member in the outline.
    I have had both issues give similar error results.
    J

  • Automate the data load process using task scheduler

    Hi,
    I am automating the process of loading data into a Hyperion Planning application with the help of Task Scheduler.
    I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_load.bat file and Task Scheduler?
    Thanks

    Thanks for your help.
    I have done the data loading using the Data_load.bat file; the .bat file calls the .msh file. Following your steps, automatic data load is possible.
    If you need this then please let me know.
    Thanks again.

  • Need to skip the data load for one of the sub-chains

    Hi All,
    In our project we have one meta chain with many sub-chains inside it. Today we don't want to run one of the sub-chains. As we are on BW 7.0, the skip option is not available. Can you please help with how to stop the data load for that particular sub-chain? Please suggest.
    Thanks.

    Hi Jalina,
    If this is a frequent request, you can create a custom ABAP program and use the simple logic below to skip or run the sub-chain, but you will also have to change the link between the sub-chains to "Always" instead of "Successful" so that, whether the dependent chain is successful or not, the next chain will proceed. If you are comfortable with events, you can also use an event to achieve this.
    Program logic: create a new table in which you maintain meaningful values (indicator/description); the program reads the data from that table and, based on the values, either succeeds or fails. A rough sketch of such a program, using assumed names, follows this reply.
    Thanks
    Abhishek Shanbhogue
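    For illustration only, a rough ABAP sketch of the control-table check Abhishek describes; the program name, table ZBW_CHAIN_CTRL, its fields, and the chain ID are all assumptions, not standard objects:

      REPORT z_skip_subchain.

      DATA lv_skip TYPE c LENGTH 1.

      " Read the skip indicator maintained in the (hypothetical) control table.
      SELECT SINGLE skip_flag FROM zbw_chain_ctrl
        INTO lv_skip
        WHERE chain_id = 'SUBCHAIN_01'.

      IF lv_skip = 'X'.
        " Fail this process step; with the chain links adjusted as described
        " above, the rest of the meta chain can still proceed.
        MESSAGE 'Sub-chain skipped via control table' TYPE 'E'.
      ENDIF.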

  • Optimize the data load process into BPC Cubes on BW

    Hello Gurus,
    We would like to know how to optimize the data load process. Our scenario is that we have the ECC classic ledger, and we are looking for the best way to load data into the BW InfoCubes from an ECC source.
    To complement the question above: from which tables must the data be extracted and then passed to BW so that the consolidation is done? Also, are there any other modules, such as FI or EC-CS, that have to be considered for this?
    Best Regards,
    Rodrigo

    Hi Rodrigo,
    Have you looked at the BW Business Content extractors available for the classic GL? If not, I suggest you take a look. BW business content provides all the business logic you will normally need to get data out of ECC and into BW for pretty much every ECC application component in existence: [http://help.sap.com/saphelp_nw70/helpdata/en/17/cdfb637ca5436fa07f1fdc0123aaf8/frameset.htm]
    Ethan

  • Most common BW data load errors in production and how to solve them

    Hi All,
    What are the most common BW data load errors in production and how do you solve them?
    Is there any documentation on this? If so, please send it to this ID: [email protected]
    Thanks in advance.
    Regards
    shoba

    Hi,
    1) RFC connection lost.
    2) Invalid characters while loading.
    3) ALEREMOTE user is locked.
    4) Lowercase letters not allowed.
    5) While loading the data, a message that the field mentioned in the error message is not mapped to any InfoObject in the transfer rules.
    6) Object locked.
    7) "Non-updated IDocs found in source system".
    8) While loading master data, one of the data packages has a red-light error message: master data/text of characteristic 'so and so' already deleted.
    9) Extraction job aborted in R/3.
    10) Request could not be activated because there is another request in the PSA with a smaller SID.
    11) Repeat of last delta not possible.
    12) DataSource not replicated.
    13) DataSource/transfer structure not active.
    14) IDoc or tRFC error.
    15) ODS activation error.

  • How to load the data from Informatica into BW & how to report the data

    Hi friends,
    How do I load data from Informatica into BW, and how do I report on the data
    using Cognos (i.e. how do I access the data in SAP BW using the Cognos 8 BI suite)?
    Thanks,
    madhu.

    In order to report BW data in Cognos, you can extract data with Open Hub into a database table from which Cognos reads.
    For BW-Informatica integration, refer to the following docs:
    http://www.aman.co.il/aman/pfd/DataInteg_BR.q103cd.pdf.pdf
    http://h71028.www7.hp.com/enterprise/cache/3889-0-0-225-121.html
    http://devnet.informatica.com/learning/ePresentations.asp
    http://72.14.203.104/search?q=cache:C741L86Q19oJ:devnet.informatica.com/showcase/resources/Essbase_DataSheet.pdfinformaticapowerconnect(BI)&hl=en&gl=in&ct=clnk&cd=3
    http://www.informatica.com/customers/utilities_energy/fpl_group.htm
    http://www.informatica.com/solutions/resource_center/technote_sapbw_65241004.pdf#search=%22Informatica%20to%20Bw%22

  • How we can automate the data loading from BI-BPC

    Dear  Guru's
    Thanks for watching this thread. My question is:
    How can we load data from BI 7.0 to BPC? My environment is SAP BI 7.0 and BPC 7.5 Microsoft version on SQL Server 2008.
    How can we automate the data loading from BI to BPC in the Microsoft version? Is a manual flat file load mandatory in the MS version?
    Thanks in Advance,
    Srinivasan.

    Here are some options:
    1) Use standard packages and schedule them:
        A) Open Hub the master data into a flat file / the BPC application server, and schedule the package "Import Master Data from a Data File" and other relevant packages.
    2) Use custom tasks in custom packages (SSIS).
    Procedure:
    From Microsoft SQL Server Business Intelligence Development Studio, open the Microsoft SSIS folder.
    Create a new package, or select an existing package to modify.
    Choose Task > Register Custom Task.
    In the Task Location field, browse for the target .dll file.
    (Note: by default, the .dll files are stored in BPC/Websrvr/bin.)
    Enter a task description, select an appropriate icon, then click OK.
    Drag the icon to the designer window. Enter data as required.
    Save the package.

  • What are the frequent data load errors in production support?

    what are the frequent data load errors in production support?

    It is a long list. Here are some of them:
    1. IDoc not arriving from the source system.
    2. Previous processes failed.
    3. Change run unsuccessful or did not run properly, leaving master data inconsistent.
    4. Invalid characters in the data.
    5. Duplicate records found.
    And so on.
    Ravi Thotahdri

  • How to identify the status of the data load

    Hi All,
    Here is my requirement:
    I have a process called etl_pf2 which loads data into staging tables. I have another process that does a partition exchange and moves the data from the staging tables to the online tables.
    In one procedure, I need to verify whether any data load into the staging tables is happening. If yes, the partition exchange should not occur; if no, the movement should happen.
    Any idea of a parameter that can be used in the procedure to check the status of the data load into the staging tables?
    Please help me.

    Thanks for the reply.
    But I think the problem is with NQSServer, which crashes but does not disappear from the process list.
    I tried searching for obiee+USER_MEM_ARGS and nqsserver+USER_MEM_ARGS but found only this thread.
    How can JVM settings help nqsserver.exe work properly?
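    Going back to the staging-table question above, a minimal sketch of one way to gate the partition exchange, assuming a hypothetical control table STG_LOAD_CONTROL that the etl_pf2 process would maintain; every object name here is illustrative only:

      DECLARE
        v_running NUMBER;
      BEGIN
        -- Count in-flight loads recorded in the hypothetical control table.
        SELECT COUNT(*) INTO v_running
          FROM stg_load_control
         WHERE status = 'RUNNING';

        IF v_running = 0 THEN
          -- No load in progress: safe to move staging data to the online table.
          EXECUTE IMMEDIATE
            'ALTER TABLE online_sales EXCHANGE PARTITION p_current WITH TABLE stg_sales';
        ELSE
          -- A load is still writing to staging: skip the exchange this cycle.
          DBMS_OUTPUT.PUT_LINE('Staging load in progress - partition exchange skipped.');
        END IF;
      END;
      /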

  • Data load error in PSA

    Hello gurus,
    I have run the data load for 2LIS_02_SCL at PSA level, but when I went to RSMO I am seeing:
    Transfer (IDoc and tRFC): error occurred
    Request IDoc: application document posted
    Info IDoc 1: sent, not arrived; data passed to port OK
    How do I resolve this error?
    Thanks; points will be assigned as my gesture.
    Regards
    Rahul

    Hi,
    Check SM58 and BD87 for pending tRFCs/IDocs and execute them manually.
    Transactional RFC (tRFC) error:
    tRFC error - the status stays yellow ("running") for a long time (the Transactional RFC view is reached from the Status tab in RSMO).
    Step 1: Go to Details > Status, get the IDoc number, then go to BD87 in R/3. Place the cursor on the red IDoc entries in the tRFC queue under outbound processing and click Display IDoc on the menu bar.
    Step 2: In the next screen, click Display tRFC Calls (this takes you to the particular tRFC call in SM58), place the cursor on the particular transaction ID, go to Edit in the menu bar and choose Execute LUW (Display tRFC Calls > select the transaction ID > Edit > Execute LUW).
    Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87 with the IDoc number, as it takes you to the particular tRFC request for that IDoc.
    OR
    Go into the job overview of the load; there you should be able to find the data package's TID. (In the RSMO screen, choose Environment > Job Overview.) This data package TID is the transaction ID in SM58.
    OR
    In SM58, enter * or your user name or the background (ALEREMOTE) user name and execute. It will show you all the pending tRFCs with their transaction IDs.
    In the Status Text column you can see two statuses: Transaction Recorded and Transaction Executing. Don't disturb anything if the status is the second one (Transaction Executing). If the status is the first one (Transaction Recorded), manually choose Execute LUWs.
    OR
    Directly go to SM58, enter * or your user name or the background (ALEREMOTE) user name, and execute. It will show the tRFCs to be executed for that user. Find the particular tRFC (SM37 > request name > TID from the data packet with sysfail), select the transaction ID in SM58, then Edit > Execute LUW.
    Processing IDocs manually:
    http://help.sap.com/saphelp_nw04s/helpdata/en/0b/2a6620507d11d18ee90000e8366fc2/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/dc/6b815e43d711d1893e0000e8323c4f/content.htm
    Thanks,
    JituK
