Process flow to load data into data warehouse: FAILURE

People,
I have the following problem.
I would like to fill the data warehouse with a process flow. I can see it runs fine for a while, but after about two hours it fails with:
"a non-numeric character was found where a numeric was expected"
for the following two tables: ODS_rules_map and FCT_rules_map.
What can I do here, guys? I don't know what to do because I'm no expert (just a stand-in for a colleague).
I have the feeling I have to change something in Oracle Warehouse Builder (the mapping for both tables, or something), but how?
Hopefully someone can explain how to solve this.
Kind regards

Folks,
I found the problem.
This is the part of the mapping where it goes wrong:
CASE
  WHEN "RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_ATTRIBUTE4" = '-'
    THEN NULL
  WHEN "RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_CONTEXT" = 'Externe Vordering'
    THEN to_date("RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_ATTRIBUTE4", 'yyyymmdd')
  WHEN "RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_CONTEXT" = 'OKS CONTRACTS'
    THEN to_date("RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_ATTRIBUTE4", 'yyyy/mm/dd')
  ELSE NULL
END /* EXPRESSIONS.OUTGRP1.VORD_INGANGSDATUM */ "VORD_INGANGSDATUM",
CASE
  WHEN "RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_ATTRIBUTE4" = '-'
    THEN NULL
  WHEN "RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_CONTEXT" = 'Externe Vordering'
    THEN to_date("RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_ATTRIBUTE5", 'yyyymmdd')
  WHEN "RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_CONTEXT" = 'OKS CONTRACTS'
    THEN to_date("RA_CUSTOMER_TRX_LINES_ALL"."INTERFACE_LINE_ATTRIBUTE5", 'yyyy/mm/dd')
  ELSE NULL
END
I used the following queries.
Current version:
select to_date(ra.INTERFACE_LINE_ATTRIBUTE4,'yyyymmdd')
, to_date(ra.INTERFACE_LINE_ATTRIBUTE5,'yyyymmdd')
from RA_CUSTOMER_TRX_LINES_ALL ra
Needs to become:
select to_date(replace(replace(ra.INTERFACE_LINE_ATTRIBUTE4,'/',''),'-',''),'yyyymmdd')
, to_date(replace(replace(ra.INTERFACE_LINE_ATTRIBUTE5,'/',''),'-',''),'yyyymmdd')
from RA_CUSTOMER_TRX_LINES_ALL ra
My question is: what is the next step? How can I get this into my mapping?
Do I just change the lines above in the query of the mapping in Oracle Warehouse Builder?
Thanks for helping me.
Greetz
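For illustration only (the real fix belongs in the OWB mapping expression, not in Python), here is the logic of the REPLACE-based fix from the queries above sketched in Python; the function name and sample values are made up:

```python
from datetime import datetime

def parse_attr_date(value):
    """Normalize a date string by stripping '/' and '-' separators,
    then parse it strictly as yyyymmdd (mirrors the nested REPLACE +
    to_date fix). The '-' placeholder maps to None, like the CASE branch."""
    if value is None or value == '-':
        return None
    cleaned = value.replace('/', '').replace('-', '')
    return datetime.strptime(cleaned, '%Y%m%d').date()

print(parse_attr_date('2009/07/14'))  # OKS CONTRACTS style
print(parse_attr_date('20090714'))    # Externe Vordering style
print(parse_attr_date('-'))           # placeholder -> None
```

With the separators stripped first, one format mask handles both contexts, which is exactly why the single `'yyyymmdd'` mask in the "needs to become" query no longer raises the non-numeric-character error.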

Similar Messages

  • How to load data to dimensions and fact tables: is it similar to SQL Server?

    How do you load data into dimension and fact tables in a data warehouse environment? Is it similar in SQL Server and Oracle?
    I have never migrated or loaded data into a data warehouse server.
    I have an interview and am really confused about anything they might ask on the data warehouse side.
    Could you please, if you don't mind, provide me some steps and dimension and fact table info, just an example?
    At least for my knowledge.
    Thank you very much for the helpful info.

    Some discussions in previous forums should help you
    http://forums.sdn.sap.com/thread.jspa?threadID=2019448
    http://forums.sdn.sap.com/thread.jspa?threadID=1908902
    In the SAP tutorial, you can see a sample example of making fact tables.
    http://help.sap.com/businessobject/product_guides/boexir32SP1/en/xi321_ds_tutorial_en.pdf
    Arun

  • Failure while loading data from R/3 to ODS : URGENT

    Hi !
    While I am loading data from R/3 to ODS, it fails with the message "Error occurred in data selection".
    But in RSA3 it extracts data fine.
    And in another scenario, besides the above error, it also gives an IDoc error: "Request IDoc: Application document not posted".
    Please suggest.

    Please search SDN.
    Error in Data loading
    Re: Problem with Info Idoc -- please suggest

  • Data load failed while loading data from one DSO to another DSO..

    Hi,
    On SID generation, the data load failed while loading data from the source DSO to the target DSO.
    The following errors occur:
    Value "External Ref # 2421-0625511EXP  " (HEX 450078007400650072006E0061006C0020005200650066
    Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
    So I don't understand why it succeeded in one DSO (the source) but failed in the other DSO (the target)?
    While analyzing, I checked that "SIDs Generation upon Activation" is checked in the source DSO but not in the target DSO. Is that the reason it failed?
    Please explain.
    Thanks,
    Sneha

    Hi,
    I hope your data flow has been designed so that the first DSO acts as a staging device, all transformation rules and routines are maintained between the first and second DSO, and "SID generation upon activation" is maintained in the second DSO. That way the data in the first DSO is the same as the source system data, since you are not applying any transformation rules or routines there, which helps avoid data load failures.
    Please analyze the following:
    Have you loaded master data before transaction data? If not, please do that first.
    Go to the properties of the first DSO and check whether "SID generation upon activation" is maintained there (I guess it may not be).
    Go to the properties of the second DSO and check whether "SID generation upon activation" is maintained there (I expect it is).
    This may be the reason.
    Also check whether any special characters are involved in your transaction data (even lowercase letters).
    Regards
    BVR

  • Error While loading data for LIS InfoSources.

    Hi All,
    I repeatedly receive load failure errors while loading data using 2lis_01_s001 (this is the case with all the InfoSources).
    The error message is:
    An error occurred in the source system.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    In our quality system, we disabled LIS updating to "No Update (R3)", loaded the data, and then changed the update mode back from "No Updating" to "Asynchronous Update (R3)". But now we are loading data in production. How should we proceed? Do we have to disable LIS updating every time we load from R/3 to BW?
    Regards
    Jay

    Hi Jayanthy,
    Please check the order of the fields in the two setup tables for the S001 structure. The order of the fields in both tables should be the same.
    You can see the structure in transaction SE11.
    If the order is different, you need to ask the BASIS person to change it so that the order of the fields in both setup tables is the same. This should fix the issue.
    Thanks,
    Raj

  • "Error While loading data from ODS to Target "

    Hi All,
    I am loading master data from an ODS to an InfoObject as the data target, and I get the following error message:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Service API .
    Refer to the error message.
    DataSource 80BP_ID does not have the same status as the source system in the Business Information Warehouse.
    The time stamp in the source system is 09.08.2006 10:17:24.
    The time stamp in the BW system is 15.03.2006 16:44:34.
    But we get data up to the ODS; this error only comes up when we manually load from the ODS to the InfoObject.
    Please help in resolving this,
    Thanks,
    Sairam.

    You are welcome Sairam
    The generated objects are export DataSources based on the ODS objects. These are generated automatically because ODS objects are a key component of a data model or data warehouse strategy, where their role is to stage the data for consolidation or harmonization purposes. The generated DataSource helps you load data into further data targets. Using a cube as an export DataSource is not as common, so the DataSource is not generated automatically for cubes.
    Hope this helps...

  • How to edit data while loading data from W/O to Standard DSO?

    Hello,
    I am loading data from a write-optimized DSO to a standard DSO. During activation it errored out with an SID failure for one InfoObject (the error is due to a lowercase letter). But I can't change the InfoObject settings or the transformation.
    Is there any way to edit the data (either in the W/O DSO or in the new data of the standard DSO)?
    Thanks and regards,
    Himanshu.

    HI,
    Please check the setting in transaction RSKC. If it is set to ALL_CAPITAL, then you must at least change the character setting, write a command in the transformation, load to PSA and modify there (not applicable for BI7), or remove the setting in RSKC (not suggested).
    Cheers
    Vikram

  • Loading data into multiple tables - Bulk collect or regular Fetch

    I have a procedure to load data from one source table into eight different destination tables. The eight tables each hold some of the source table's columns, with a common key.
    I have run into a couple of problems and have a few questions where I would like to seek advice:
    1.) Procedure with and without the BULK COLLECT clause took the same time for 100,000 records. I thought I would see improvement in performance when I include BULK COLLECT with LIMIT.
    2.) Updating the Load_Flag in source_table happens only for few records and not all. I had expected all records to be updated
    3.) Are there other suggestions to improve the performance? or could you provide links to other posts or articles on the web that will help me improve the code?
    Notes:
    1.) 8 Destination tables have at least 2 Million records each, have multiple indexes and are accessed by application in Production
    2.) There is an initial load of 1 Million rows with a subsequent daily load of 10,000 rows. Daily load will have updates for existing rows (not shown in code structure below)
    The structure of the procedure is as follows
    DECLARE
        TYPE dest_type IS TABLE OF source_table%ROWTYPE;
        dest_tab   dest_type;
        iCount     NUMBER := 0;
        CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
    BEGIN
        OPEN source_cur;
        LOOP
            FETCH source_cur          -- BULK COLLECT
            INTO dest_tab;            -- LIMIT 1000
            EXIT WHEN source_cur%NOTFOUND;
            FOR i IN dest_tab.FIRST .. dest_tab.LAST LOOP
                <Insert into app_tab1 values key, col12, col23, col34 ;>
                <Insert into app_tab2 values key, col15, col29, col31 ;>
                <Insert into app_tab3 values key, col52, col93, col56 ;>
                UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur;
                iCount := iCount + 1;
                IF iCount = 1000 THEN
                    COMMIT;
                    iCount := 0;
                END IF;
            END LOOP;
        END LOOP;
        COMMIT;
    END;
    Edited by: user11368240 on Jul 14, 2009 11:08 AM

    Assuming you are on 10g or later, the PL/SQL compiler generates the bulk fetch for you automatically, so your code is the same as (untested):
    DECLARE
        iCount NUMBER := 0;
        CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
    BEGIN
        FOR r IN source_cur    -- the FOR loop opens and closes the cursor itself
        LOOP
            <Insert into app_tab1 values key, col12, col23, col34 ;>
            <Insert into app_tab2 values key, col15, col29, col31 ;>
            <Insert into app_tab3 values key, col52, col93, col56 ;>
            UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur;
            iCount := iCount + 1;
            IF iCount = 1000 THEN
                COMMIT;
                iCount := 0;
            END IF;
        END LOOP;
        COMMIT;
    END;
    However, most of the benefit of bulk fetching would come from using the array with a FORALL expression, which the PL/SQL compiler can't automate for you.
    If you are fetching 1000 rows at a time, purely from a code simplification point of view you could lose iCount and the IF...COMMIT...END IF and just commit each time after looping through the 1000-row array.
    However I'm not sure how committing every 1000 rows helps restartability, even if your real code has a WHERE clause in the cursor so that it only selects rows with load_flag = 'N' or whatever. If you are worried that it will roll back all your hard work on failure, why not just commit in your exception handler?
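    To illustrate the batching idea from the reply (bulk fetch plus one round trip per batch, the role FORALL plays in PL/SQL, instead of one insert per row), here is a sketch in Python with sqlite3; the table names mirror the post, but the schema and row counts are made up:

```python
import sqlite3

BATCH = 1000  # commit interval, mirroring the iCount logic above

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_table (id INTEGER, payload TEXT, load_flag TEXT)")
conn.execute("CREATE TABLE app_tab1 (id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO source_table VALUES (?, ?, 'N')",
                 [(i, "row%d" % i) for i in range(2500)])
conn.commit()

# Fetch the pending rows, then write them out in batches: one
# executemany() call per batch instead of one statement per row.
rows = conn.execute(
    "SELECT id, payload FROM source_table WHERE load_flag = 'N'").fetchall()
for start in range(0, len(rows), BATCH):
    batch = rows[start:start + BATCH]
    conn.executemany("INSERT INTO app_tab1 VALUES (?, ?)", batch)
    conn.executemany("UPDATE source_table SET load_flag = 'Y' WHERE id = ?",
                     [(r[0],) for r in batch])
    conn.commit()  # commit once per batch, not once per row

print(conn.execute("SELECT COUNT(*) FROM app_tab1").fetchone()[0])  # 2500
```

The point is the shape of the loop, not sqlite3 itself: the per-batch write is what FORALL gives you in PL/SQL, and it is where the row-by-row version in the question loses its time.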

  • Load data from a staging table into an actual table by using flag value

    I need a quick-and-dirty PL/SQL snippet that reads the staging table STG_TABLE row by row and loads the data into PROD_TABLE. The load should fail entirely, that is, a rollback should occur when an error occurs while inserting a record into the production table, using a flag value of 'Y' for successful completion and 'N' for failure.
    Any suggestions?

    Hi,
    It sounds as if you want something like:
    BEGIN
        :ok_flag := 'N';
        INSERT INTO prod_table (col1, col2, ...)
        SELECT col1, col2, ...
        FROM stg_table;
        :ok_flag := 'Y';
    END;
    ROLLBACK happens automatically in the event of an error.
    Instead of a bind variable for ok_flag, you could use some other kind of variable defined outside the scope of this PL/SQL code.
    Edited by: Frank Kulash on Jun 5, 2012 10:47 PM
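    The same all-or-nothing pattern can be sketched outside PL/SQL. Here is an illustrative Python/sqlite3 version (table names follow the post; the duplicate-key failure is invented just to show the rollback):

```python
import sqlite3

def load_staging(conn):
    """All-or-nothing load: copy stg_table into prod_table in one transaction.
    Returns 'Y' on success, 'N' after an automatic rollback on failure."""
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.execute("INSERT INTO prod_table SELECT * FROM stg_table")
        return 'Y'
    except sqlite3.Error:
        return 'N'

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_table (id INTEGER, val TEXT)")
conn.execute("CREATE TABLE prod_table (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO stg_table VALUES (?, ?)", [(1, 'a'), (1, 'b')])
conn.commit()

print(load_staging(conn))  # duplicate key -> 'N'
print(conn.execute("SELECT COUNT(*) FROM prod_table").fetchone()[0])  # 0: nothing half-loaded

conn.execute("DELETE FROM stg_table WHERE val = 'b'")
conn.commit()
print(load_staging(conn))  # 'Y' now that the duplicate is gone
```

As in Frank's PL/SQL version, the flag only flips to success after the whole insert has gone through; any error leaves the production table untouched.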

  • Data Plan Activation Failure after I updating to 8.0.2

    After I updated my iPad 2 to 8.0.2, I have not been able to use my cellular data. It keeps "searching" for a connection and then pops up "Data Plan Activation Failure". What do I do?

    I turned Gmail off and turned on iCloud email. It won't load either.

  • Examples of errors while loading data and how do you resolve them?

    Can you give examples of errors while loading data and how you resolve them?

    Hi Ram Reddy, this is Ramprasad Reddy. You can get these types of common errors, like the ones below.
    As for frequent failures and errors, there is no fixed reason for a load to fail; from an interview perspective I would answer it this way:
    a) Loads can fail due to invalid characters
    b) Because of a deadlock in the system
    c) Because of a previous load failure, if the load is dependent on other loads
    d) Because of erroneous records
    e) Because of RFC connections (sol: raise a ticket to the BASIS team to provide the connection)
    f) Because of missing master data
    g) Because no data was found in the source system
    h) Time stamp error (sol: activate the DataSource, replicate the DataSource, and load)
    i) Short dump error (sol: delete the request and load once again)
    j) Data locking when loading in parallel
    k) ODS activation failures
    l) No SID found (sol: load master data before transaction data)

  • Executing host through a concurrent program to load data

    hello all,
    Dear friends, I want to use a host command in a concurrent request to load data into a table from a flat file.
    For that I wrote one flat file (test_data_host.txt), one control file (test_data_host.ctl), and one script file (test_data_host_prog.prog).
    Finally I made one concurrent program, BPIL_TEST_DATA_HOST, and saved it as 'HOST'.
    I added the four compulsory parameters to my test_data_host_prog.prog file, but during registration in Sysadmin -> Concurrent -> Program -> Define, in the Parameters window, I left everything blank. (Previously the same was running when I made a link from the '$FND_TOP/bin/fndcpesr' file using the command ln -s $FND_TOP/bin/fndcpesr test_data_host_prog, through telnet.)
    When running the concurrent program, it gives the error: The executable file /dev02/CPS/apps/apps_st/appl/ja/12.0.0/bin/test_data_host_prog for this concurrent program cannot be executed.
    my files data is attached here with:
    1) test_data_host.txt file:
    Tamojit,1,history
    Vishnu,2,maths
    Krishna,3,HRMS
    2) test_data_host.ctl file:
    LOAD DATA
    APPEND
    INTO TABLE BPIL_TEST_CONTROL_FILE
    FIELDS TERMINATED BY ","
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( name1,
    class1,
    subject,
    record_status CONSTANT 'NEW')
    3) test_data_host_prog.prog file:
    # Parameters passed into program
    ORA_USER_PASS=$1
    USERID=$2
    USERNAME=$3
    REQUESTID=$4
    #LOGON_STRING='apps/appscps@cps'
    #FILENAME=$5
    sqlload userid=apps/appscps@cps control=/usr/tmp/test_data_host.ctl data=/usr/tmp/test_data_host.txt log=/usr/tmp/test_data_host_log.log bad=/usr/tmp/test_data_host_bad.bad ERRORS=100000 silent=FEEDBACK <<!
    RC=$?
    echo "The sql loader exit code for loading Header table is :"$RC
    if [ $RC -eq 0 -o $RC -eq 3 ]
    then
    echo 'Loading of file table successful.'
    else
    echo 'Error: Loading of file table has errors. Sqlload return code is '$RC
    exit 1
    fi
    case "$RC" in
    0) echo "SQL*Loader execution successful" ;;
    1) echo "SQL*Loader execution exited with failure, see logfile" ;;
    2) echo "SQL*Loader execution exited with warning, see logfile" ;;
    3) echo "SQL*Loader execution encountered a fatal error" ;;
    *) echo "unknown return code $RC" ;;
    esac
    kindly help me with the same to run the concurrent request successfully to load the data .
    Thanks & Regards
    Vishnu Pratap Patel
    ([email protected])
    Edited by: user649889 on Sep 2, 2008 2:01 PM

    Hi Gareth,
    I tried the code you gave me, but it's not working.
    Since I am not very competent in shell scripting, I want to know whether there is any other way, or whether spaces in that code (when writing it into my bpil_test_data_prog.prog file) could also prevent my concurrent request from running successfully.
    My .prog file is attached for your notice.
    # Parameters passed into program
    #ORA_USER_PASS=$1
    #USERID=$2
    #USERNAME=$3
    #REQUESTID=$4
    REQUEST_ID=`echo $*| cut -f2 -d" "|cut -c11-`
    USER_ID=`echo $*| cut -f3 -d" "|cut -c11- | sed 's/\"//g'`
    ORA_ID=`echo $*| cut -f4 -d" "|cut -c12-`
    APPS_USERNAME=`echo $*| cut -f5 -d" "|sed 's/\"//g'|cut -c14`
    #LOGON_STRING='apps/appscps@cps'
    #FILENAME=$5
    sqlload userid=apps/appscps@cps control=/usr/tmp/test_data_host.ctl data=/usr/tmp/test_data_host.txt log=/usr/tmp/test_data_host_log.log bad=/usr/tmp/test_data_host_bad.bad ERRORS=100000 silent=FEEDBACK <<!
    RC=$?
    echo "The sql loader exit code for loading Header table is :"$RC
    if [ $RC -eq 0 -o $RC -eq 3 ]
    then
         echo 'Loading of file table successful.'
    else
         echo 'Error: Loading of file table has errors. Sqlload return code is '$RC
         exit 1
    fi
    case "$RC" in
    0) echo "SQL*Loader execution successful" ;;
    1) echo "SQL*Loader execution exited with failure, see logfile" ;;
    2) echo "SQL*Loader execution exited with warning, see logfile" ;;
    3) echo "SQL*Loader execution encountered a fatal error" ;;
    *) echo "unknown return code $RC" ;;
    esac
    kindly have a look over it & help me for the same.
    Thanks & Regards
    Vishnu
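    As a side note, the exit-code dispatch at the end of the script can be checked in isolation. Here is a small Python sketch of the same mapping (the helper name is hypothetical; the messages simply mirror the shell case statement):

```python
# Map SQL*Loader-style exit codes to messages, mirroring the shell `case "$RC"`.
MESSAGES = {
    0: "SQL*Loader execution successful",
    1: "SQL*Loader execution exited with failure, see logfile",
    2: "SQL*Loader execution exited with warning, see logfile",
    3: "SQL*Loader execution encountered a fatal error",
}

def describe_exit(rc):
    """Return the message for a return code, with a fallback for unknown codes."""
    return MESSAGES.get(rc, "unknown return code %d" % rc)

print(describe_exit(0))   # SQL*Loader execution successful
print(describe_exit(42))  # unknown return code 42
```

Note that the shell script treats both 0 and 3 as success in its `if` test but then prints "fatal error" for 3 in the `case`; whichever set of codes is actually intended, keeping the mapping in one place avoids that kind of contradiction.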

  • Loading data one year at a time

    Hi,
    We have a situation where we need to load data one year at a time. I saw this done a few years ago but do not remember the details.
    What I am thinking is that we could initially run a full load with the following parameters:
    $$ANALYSIS_START: 1/1/2006
    $$ANALYSIS_START_WID: 1/1/2006
    $$INITIAL_EXTRACT_DATE: 1/1/2006
    $$ANALYSIS_END_WID: 1/1/2006
    And this should give us one year. What I am not sure about is how to load each subsequent year.
    Regards

    Is the issue a performance issue (ETLs running for too long)? The problem is that if you go year by year and want to do an "incremental" load for each year, that would be even more of a load, since you are not allowing for a BULK load (where the tables get truncated). You can either truncate and do BULK, or do incremental, which may be an even heavier load. I think you are assuming that this approach will somehow help you from a hardware-limitation standpoint; do you know for sure that it will?
    If you really do want to do it, as I mentioned, you can edit the INITIAL and END parameters. It would help if you clarified the hardware limitation; I think there are better ways to handle this than what you are doing.

  • Input ready query is not showing loaded data in the cube

    Dear Experts,
    With an input-ready query, we have the problem that it does not show values that were not entered through that query. Is there any setting in the input-ready query that we can use to display the data loaded into the cube, as well as the data entered through the input-ready query itself?
    Thanks,
    Gopi R

    Hi,
    input ready queries always should display most recent data (i.e. all green and the yellow request). So you can check the status of the requests in the real-time InfoCube. There should exist only green requests and maybe at most one yellow request.
    In addition you can try to delete the OLAP cache for the plan buffer query: Use RSRCACHE to do this. The technical names of the plan buffer query can be found as follows:
    1. InfoCube/!!1InfoCube, e.g. ZTSC0T003/!!1ZTSC0T003 if ZTSC0T003 is the technical name of the InfoCube
    2. MPRO/!!1MPRO, e.g. ZTSC0M002/!!1ZTSC0M002 if ZTSC0M002 is the technical name of the MultiProvider
    If the input ready query is defined on an aggregation level using a real-time InfoCube, the first case is relevant; if the aggregation level is defined on a multiprovider the second case is relevant. If the input-ready query is defined on a multiprovider containing aggregation levels again the first case is relevant (find the real-time InfoCubes used in the aggregation level).
    Regards,
    Gregor

  • Can we load data using .xls in user define format(without using default template)

    Hi All,
    I'm a newbie to FDM. As part of HFM support I use FDM to load flat-file data; I have just a bit more knowledge than an end user.
    The requirement is that I need to load data from MS Excel to a Planning application via FDM.
    Previously the application was in Excel (macro driven), and the upstream data is also in Excel (multi-tab).
    As far as I know, data can be loaded from a .csv file (Excel Save As CSV) with a single tab.
    Could you please let me know the possibilities for loading data from .xls (.xlsx) into FDM?
    Thanks in advance.

    If you want to load data using Excel, utilising FDM's out-of-the-box functionality you will have to use one of the templates supplied i.e. Excel Trial Balance or Excel Multi-load template.
