Loading data from InfoSet to DSO

Hi,
I am loading data from an InfoSet to a DSO. The transformations were fine; no errors were found.
But I could not activate the target DSO.
I am getting the error "Error while activating transformation".
How do I overcome this?
Thanks
Ravi

Hi Ravi Kanth,
Please check the following links. Hope this helps.
/message/1410802#1410802 (original link is broken)
/message/2115621#2115621 (original link is broken)
Thanks,
Sruthi

Similar Messages

  • Long time to load data from PSA to DSO -Sequential read RSBKDATA_V

    Hi,
    It is taking a long time to load data from PSA to DSO. It is doing a sequential read on RSBKDATA_V, and the table contains no data.
    We are on SAPKW70105. This started yesterday; there have been no changes to the system parameters.
    Please advise.
    Thanks
    Nilesh

    Hi Nilesh,
    I guess the following SAP Note will help you in this situation.
    1476842 - Performance: read RSBKDATA only when needed (https://websmp107.sap-ag.de/sap/support/notes/1476842)
    Just note that the reference to Support Packages is wrong. It is also included in SAP_BW 701 06.

  • Urgent : Error while loading data from PSA to DSO

    Hi,
    I am working on 7.0.
    When I look at the data in the PSA, I have one date field that contains data in the proper date format. This date field is mapped to 0CALMONTH and 0CALYEAR, for which I have written routines, and it is also assigned directly to 0DOC_DATE.
    I have written a start routine to make some modifications to SOURCE_PACKAGE.
    But when I load data from PSA to DSO, it gives the error below:
    " Exception wrong_date ; see long text RSTRAN 303
    Diagnosis
    An exception wrong_date was raised while executing function module
    RST_TOBJ_TO_DERIVED_TOBJ .
    System Response
    Processing the corresponding record has been terminated.
    Procedure
    To analyze the cause, set a break point in the program of the
    transformation at the call point of function module
    RST_TOBJ_TO_DERIVED_TOBJ . Simulate the data transfer process to
    investigate the cause. "
    I am not sure why this error is coming up.
    Please guide me!

    You can map this code to 0DOC_DATE. The code reads any date and converts it into the fiscal year/period. Try this code:
    * Derive the fiscal year from the document date (fiscal year variant V9)
    data: l_fiscyear type t009b-bdatj.

    call function 'FISCPER_FROM_CALMONTH_CALC'
      exporting
        id_date     = COMM_STRUCTURE-doc_date
        iv_periv    = 'V9'
      importing
        ev_fiscyear = l_fiscyear.

    * result value of the routine
    RESULT = l_fiscyear.
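    If the wrong_date exception persists, it usually means that some source records contain a blank or implausible date. As a minimal sketch (not from the original reply, and assuming the same 3.x routine interface COMM_STRUCTURE/RESULT as above), you could validate the date before deriving the time characteristic:
    * Sketch: skip the derivation for blank or implausible document dates
    data: l_fiscyear type t009b-bdatj.

    clear RESULT.
    if COMM_STRUCTURE-doc_date is not initial.
      call function 'DATE_CHECK_PLAUSIBILITY'
        exporting
          date                      = COMM_STRUCTURE-doc_date
        exceptions
          plausibility_check_failed = 1
          others                    = 2.
      if sy-subrc = 0.
        call function 'FISCPER_FROM_CALMONTH_CALC'
          exporting
            id_date     = COMM_STRUCTURE-doc_date
            iv_periv    = 'V9'
          importing
            ev_fiscyear = l_fiscyear.
        RESULT = l_fiscyear.
      endif.
    endif.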

  • Error while loading data from PSA to DSO using DTP

    Hi,
    I have a unique alphanumeric identifier of type "Char", length "32". When I load the data from PSA to DSO using a DTP, I get the following error message:
    "An error occurred while executing a transformation rule:
    The exact error message is
    Overflow converting from ' '
    The error was triggered at the following point in the program:
    GP4JJHUI6HD7NYAK6MVCDY4A01V 425
    System response
    Processing the data record has been terminated"
    Any idea how I can resolve this?
    Thanks

    Hi,
    First check whether the data contains any special characters. If not, check the DataSource: on the Fields tab, check the format of the particular field (internal/external) and make sure the internal format is chosen; if a conversion routine is assigned, check that as well.
    You can also use semantic groups in the DTP.
    Try it.
    Thank you,
    lokeeshM

  • Error while loading data from PSA to DSO

    Hi,
    How do I identify the erroneous records in the DSO?
    While loading the data from PSA to DSO through process chains we are getting the following errors:
    "Value '#' (hex. '0023') of characteristic 0BBP_DELREF contains invalid characters"
    "Error when assigning SID: Action VAL_SID_CONVERT InfoObject 0BBP_DELREF"
    There are no error records in the PSA, but it seems some invalid characters exist.
    Could you please help us find the error records in the DSO and correct them.

    Hi,
    These are errors BRAIN290 & RSDRO302.
    The problem here most likely is that BW doesn't recognise a character you are trying to load. Generally the character is not "#",
    as BW displays all symbols it does not recognise as #. You should decode from the hex string what the actual value is. Note that hex values < 20 are not allowed in BW.
    Please review Note 173241 and the document mentioned within.
    This shows what characters are not allowed in BW characteristic values.
    You should check if the character is allowed, and then you can solve the problem in one of the following ways:
    1) add this character to the "permitted character" list in RSKC as described in the note.
    2) correct the value in the source system.
    3) correct the value in the PSA (to do this, you will need to delete the request from the ODS object and then you can change the disallowed character via the PSA maintenance).
    4) follow Note 1075403 so that the characters HEX00 to HEX1F are not checked (this affects only characteristics that do not allow "lower case").
    5) if you cannot use any of the above options, then you will need to create a routine in your transfer rules for the affected InfoObject and change the value to a character which is permitted in BW (a sketch follows below this list).
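    For option 5, a minimal sketch of such a cleansing routine (an illustration only, assuming a BW 7.x field routine interface with SOURCE_FIELDS/RESULT; in 3.x transfer rules the source structure would be COMM_STRUCTURE instead, and the permitted character set should match your RSKC settings):
    * Sketch: overwrite every character that is not in the permitted set.
    * Note: lowercase letters are also replaced unless the InfoObject allows them.
    DATA: l_allowed(60) TYPE c
            VALUE ' !"%&''()*+,-./0123456789:;<=>?ABCDEFGHIJKLMNOPQRSTUVWXYZ_',
          l_value(60)   TYPE c,
          l_len         TYPE i,
          l_off         TYPE i.

    l_value = SOURCE_FIELDS-bbp_delref.   " field name is an assumption
    l_len   = strlen( l_value ).

    DO l_len TIMES.
      l_off = sy-index - 1.
      IF l_value+l_off(1) NA l_allowed.
        l_value+l_off(1) = ' '.           " replace the disallowed character
      ENDIF.
    ENDDO.

    RESULT = l_value.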
    These are the usual ways to solve this issue.
    Rgds,
    Colum

  • ERROR DURING LOADING DATA FROM PSA TO DSO IN "PRUEFMBKT"

    Respected all
    I am trying to load data from an R/3 field (PRUEFMBKT, CHAR length 40) into an InfoObject. The data arrives correctly in the PSA, but when I load it into the DSO or InfoCube the request turns red with the following error message:
    Value '010/12.05-112.15/hold on not operate' (hex.
    '3031302F31322E30352D3131322E31352F686F6C64206F6E20') of characteristic
    PRU_LSIL contains invalid characters
    Message no. BRAIN060
    Diagnosis
        Only the following standard characters are valid in characteristic
        values by default:
        !"%&''()*,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ.+
        Furthermore, characteristic values that only consist of the character #
        or that begin with ! are not valid.
        You are trying to load the invalid characteristic value 1. (hexidecimal
        representation 3031302F31322E30352D3131322E31352F686F6C64206F6E20).
    Procedure for System Administration
        Try to change the invalid characters to valid characters.
        If you want values that contain invalid characters to be admitted into
        the system, make the appropriate setting in BW Customizing. Refer to the
        documentation describing the requirements for special characters and the
        possible consequences.
        For more information on the problems involved with valid and invalid
        characters, click here.
    Kindly tell me how to deal with this problem.
    thanks
    abhay

    Hi Abhay,
    Please check whether the InfoObject corresponding to field PRU_LSIL in the BW system has the lowercase checkbox enabled in the InfoObject properties, as the value '010/12.05-112.15/hold on not operate' coming from R/3 contains lowercase letters.
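    Alternatively (a sketch only, not part of the original reply): if lowercase letters should not be allowed on the InfoObject, the value can be converted to uppercase in the transformation rule for PRU_LSIL. The field name pruefmbkt and the 7.x routine interface (SOURCE_FIELDS/RESULT) are assumptions:
    * Sketch: uppercase the incoming value so that it passes the BRAIN060 check
    * when the InfoObject does not allow lowercase letters
    DATA: l_value TYPE c LENGTH 40.

    l_value = SOURCE_FIELDS-pruefmbkt.
    TRANSLATE l_value TO UPPER CASE.
    RESULT = l_value.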
    Hope this helps.
    Regards,
    Umesh.

  • Long time to load data from PSA to DSO / CUBE. Sequential read RSBKDATA_V

    Hi Gurus
    The process stays for several hours on a sequential read from RSBKDATA_V, the temporary storage for DTPs.
    After several hours the processing finally starts, but even when I assign several background processes to the DTP it takes hours to load the data.
    We are on BI 7.0 with SP22, so all relevant notes are already implemented.
    Has anyone had a similar problem?
    Thanks in advance
    Martin

    Hi Martin,
    this issue has cropped up a few times. Please check and implement the following notes:
    1338465    P22:DTP:LOG: Performance problem when messages are added
    1331544    P22:HINT: Slow performance when accessing RSMONFACT
    1312701    70SP21: Performance on view RSBKDATA_V selects
    1304234    70SP21: Performance on the hashed table p_th_rsbkdata_v
    1168098    70SP19: Performance during DataStore object extraction
    1080027    70SP16: Performance during parallel processing
    These notes should improve the performance of the DTPs in your system.
    After you have implemented the notes, please check in tables RSBKDATA and RSBKDATAINFO whether they contain any data. If the tables are empty, please reorganise them and restart the DTP.
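    A quick way to check those two tables (a sketch; SE16 works just as well):
    * Sketch: count the rows in the DTP temporary-storage tables
    DATA: l_cnt_data TYPE i,
          l_cnt_info TYPE i.

    SELECT COUNT(*) FROM rsbkdata     INTO l_cnt_data.
    SELECT COUNT(*) FROM rsbkdatainfo INTO l_cnt_info.

    WRITE: / 'RSBKDATA rows:',     l_cnt_data,
           / 'RSBKDATAINFO rows:', l_cnt_info.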
    Hope this helps you.
    Rgds,
    Colum

  • DTP load data from DSO to CUBE can't debug

    Hi all,
    I ran into a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, it works fine. But when I load data from the DSO to the cube and debug the routine, I get the following error:
    inconsistent input parameter (parameter: <unknown>, value <unknown>)
    message no. RS_EXCEPTION 101
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to put a small endless loop at the start of your start routine. When you start the load, go to SM50 and debug the work process that is executing the load; from the debugger you can then break out of the loop and carry on with the debugging, as sketched below. Please let me know if you require further information.
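    A minimal sketch of that endless loop (an assumption, not code from the original post): place it at the top of the start routine and, once you are attached to the process via SM50, change the flag in the debugger to continue:
    * Sketch: endless loop so the debugger can be attached to the DTP work process.
    * In SM50, select the work process and choose the debugging function, then set
    * l_exit to 'X' in the debugger to leave the loop and continue the load.
    DATA: l_exit TYPE c LENGTH 1 VALUE ' '.

    WHILE l_exit IS INITIAL.
      " waiting for the debugger; remove or disable this block after debugging
    ENDWHILE.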
    Thanks,
    Arminder

  • Load data from BW 7.0 DSO into a ECC 6.0 SD standard/Custom pricing table

    Gurus,
    We have a scenario where we need to load data from a BW DSO into an SD standard/custom pricing table in ECC 6.0. The data will be a few thousand records.
    To my knowledge, in BPS, retractors are available to update data from BW to ECC, and OpenHub can also be used to handle similar scenarios.
    Has any one of you come across a similar scenario?
    If you have a third option (not BPS retractors or OpenHub) as a solution for this kind of scenario, please share the knowledge; it will be greatly appreciated.
    Thanks in advance,
    Vittal

    Hi Yogesh,
    Thanks for your reply.
    We have large data volumes for the billing DataSource, so moving flat files using PI is not an option, and, as I mentioned, part of the requirement is monitoring of the whole process as well. What I mean by this is: if 1,000 billing document items were passed on to CRM 7 to create member activities and, for some reason, a member activity was not created for one of the billing document items, we should know the problematic record and the reason why the member activity was not created in CRM 7 (reason code), and then be able to fix it. All this requires an end-to-end monitoring capability and guaranteed delivery of data to CRM 7.
    Hence I was trying to explore the enterprise web service option.
    What I am not sure of is how to expose a BW DSO delta request to CRM 7 using a web service, or any other method that gives end-to-end monitoring capability and guaranteed delivery of data to CRM 7.
    Any other suggestions?
    Thanks
    CK

  • How to upload data from Infoset to Infocube

    Hi Gurus,
    I am trying to upload data from an InfoSet to an InfoCube, but it is not allowing me to activate the transformation, showing the error "Error while activating transformation", message no. RSTRAN510.
    The InfoSet contains 3 DataStore objects.
    How does the data get loaded from the InfoSet to the InfoCube?
    How does the delta mechanism work, since I have 3 DSOs?
    Thanks
    Shiva

    Hi,
    But we do have the option: when we create a transformation for the InfoCube, among the source object types we can choose from, InfoSet is also listed.
    I was able to create the transformation, but not able to activate it.
    Let me know; maybe we cannot load from an InfoSet to an InfoCube, but since the option exists, I think there must be some way of loading data from an InfoSet into an InfoCube.

  • Job cancelled While loading data from DSO to CUBE using DTP

    Hi All,
    While I am loading data from the DSO to the cube, the load job is getting cancelled.
    In the job overview I got the following messages:
    SYST: Date 00/01/0000 not expected.
    Job cancelled after system exception ERROR_MESSAGE
    What can be the reason for this error, given that I successfully loaded the data into 2 layers of DSOs before loading it into the cube?

    Hi,
    Are you loading a flat file to the DSO?
    Then check the date field in the PSA data and replace any wrong values with a correct date.
    I think you are using a write-optimized DSO, which is similar to the PSA and takes over all the data from the PSA.
    So clear the data in the PSA, then load to the DSO and then to the cube.
    If you do not need the date field in the cube, remove its mapping in the transformation to the cube, activate the transformation, reactivate the DTP and trigger it; it will work.
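    If the date field has to stay in the cube, here is a minimal sketch of a rule routine that suppresses invalid dates such as 00/01/0000 before they reach the InfoCube (the field name doc_date and the 7.x routine interface SOURCE_FIELDS/RESULT are assumptions):
    * Sketch: pass only dates with a real year on to the cube
    IF SOURCE_FIELDS-doc_date IS INITIAL OR SOURCE_FIELDS-doc_date(4) = '0000'.
      CLEAR RESULT.                       " drop the invalid date
    ELSE.
      RESULT = SOURCE_FIELDS-doc_date.
    ENDIF.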

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contain no requests at all. The cube uses "Delta Update". First the DSO was loaded with 138,300 records successfully, then I activated the DSO. But when I executed the load (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reason for this situation?
    Thank you so much!

    Hi BI User,
    For a delta upload into a data target, there must be an initialization request in the data target with the same selection criteria.
    So first run the initialization and then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • Error when loading data from DSO to Cube

    Hi,
    After upgrading to 2004s we get the following error message when trying to load data from a DSO to a Cube:
    Unexpected error: RSDRC_CONVERT_RANGES_TO_SELDR
    Has anyone experienced a similar problem or have some guidelines on how to resolve it?
    Kind regards,
    Fredrik

    Hi Martin,
    Thanks!
    Problem solved.
    //Fredrik

  • Dump when loading Data from DSO to Cube

    Hi,
    I get a short dump DBIF_REPO_PART_NOT_FOUND when trying to load data from a DSO to a cube. From DSO to DSO it works fine.
    Any ideas?
    thx!
    Dominik

    The dump occurs in the last step of the last data package in the DTP monitor.
    I checked the OSS note, but I do not know how to check the kernel version.
    Then I used another DSO for testing and got another dump with a different name.
    After that I logged in again and it worked with my test DSO data...
    But it is still not working with the original one...
    @Kazmi: Which details do you mean exactly?

  • How to load data from PSA to CUBE & DSO at a time using DTP in BI 7 ?

    HI all,
    I am new to BI 7. How do I load the data to a DSO and an InfoCube at the same time using DTPs?
    Please provide the steps for the load, and please specify which update mode I should use (full or delta) and which one is best.
    Please suggest.
    Thanks & Regards,
    Kiran m.

    Below are the basic steps we follow in any BI 2004s system:
    1) Create the DataSource. Here you can set/check the source system fields.
    2) Create a transformation for that DataSource (there are no more update rules/transfer rules).
    2.1) While creating the transformation for the DataSource, it will ask you for the data target name, so just assign where you want to update your data.
    DataSource -> Transformation -> Data Target
    Now, if you want to load data into the data target from a source system DataSource:
    1) Create an InfoPackage for that DataSource. If you are creating an InfoPackage for a new DataSource, it will only allow you to update up to the PSA; all other options are shown as disabled.
    2) Now create a DTP (Data Transfer Process) for that DataSource.
    3) Now schedule the InfoPackage; once the data is loaded into the PSA, you can execute your DTP, which will load the data into the data target.
    If you are loading data from one data target to another, there is no need to use the PSA; you can execute the DTP directly in that case.
    Data Source -> Transformation (IP/DTP) -> Data Target 1 -> DTP -> Data Target 2
    Use the link below for a detailed example:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/fc61e12d-0a01-0010-2883-e2fc63ef729b
    InfoSources are no longer mandatory with BI 7.0; below is a link to the scenarios in which InfoSources are still used:
    http://help.sap.com/saphelp_nw04s/helpdata/en/44/0243dd8ae1603ae10000000a1553f6/content.htm
    Full or delta depends on your requirement. Check the thread "difference between the various loads" to understand this better.
    Hope it helps.
