Problem in loading data to DSO

Hi,
I have a problem loading data to a DSO through an InfoPackage. The delta request has been running for a long time, but no data has been loaded, and the job details screen shows no progress.
Previously there were no issues with this chain.
Can someone guide me on how to solve this issue, please?
Regards
Ragu

Hi Ragu,
The same issue happened to me a long time back. I solved it by running a Repair Full Request and then an Init Delta without Data Transfer.
Maybe this will also work for you.
Steps:
1. Delete the request by marking it as RED.
2. Run the InfoPackage as a Repair Full Request.
3. Once this is successful, change the InfoPackage and run an Init Delta without Data Transfer.
4. If a pop-up says an Init Delta already exists, delete the previous init from the source system (a quick way to check for an existing init is sketched below).
After the Init Delta has finished successfully, change your InfoPackage back to Delta as it was before.
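A quick way to verify step 4 on the source system is to look at table ROOSPRMSC, which records the delta-initialization status per DataSource and target BW system. This is only a sketch: the table and field names are quoted from memory (please verify them in SE11), and ZMY_DATASOURCE is a placeholder for your DataSource.

    REPORT z_check_delta_init.

    DATA: lt_init TYPE STANDARD TABLE OF roosprmsc.

    * ZMY_DATASOURCE is a hypothetical DataSource name - replace it with yours.
    SELECT * FROM roosprmsc INTO TABLE lt_init
      WHERE oltpsource = 'ZMY_DATASOURCE'.

    IF lt_init IS NOT INITIAL.
      WRITE: / 'A delta init is still registered - delete it via the InfoPackage',
             / '(Scheduler -> Initialization Options for Source System).'.
    ELSE.
      WRITE: / 'No delta init found for this DataSource.'.
    ENDIF.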
Hope it helps.
regards
Deva

Similar Messages

  • DTP load data from DSO to CUBE can't debug

    Hi all, experts.
    I ran into a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, it works fine. But when I load data from the DSO to the cube and debug the routine, I get an error like this:
    Inconsistent input parameter (parameter: <unknown>, value <unknown>)
    Message no. RS_EXCEPTION 101
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to add a small piece of code that creates an infinite loop in your start routine. When you start the load, go to SM50 and debug the work process that is executing the load; from there you can break out of the infinite loop and carry on with the debugging (a minimal sketch follows below). Please let me know if you require further information.
    Thanks,
    Arminder
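    For illustration, a minimal sketch of the temporary debug hook Arminder describes; the variable name is arbitrary, and the hook must be removed once debugging is done.

        * Temporary hook at the top of the start routine - remove after debugging.
        DATA: l_debug_wait TYPE c LENGTH 1 VALUE 'X'.

        WHILE l_debug_wait = 'X'.
          " The DTP work process spins here. Attach the debugger to it from
          " SM50, then set l_debug_wait to space in the debugger to leave the
          " loop and step through the rest of the routine.
        ENDWHILE.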

  • Job cancelled While loading data from DSO to CUBE using DTP

    Hi All,
    While I am loading data from the DSO to the cube, the load job is getting cancelled.
    In the job overview I got the following messages:
        SYST: Date 00/01/0000 not expected.
        Job cancelled after system exception ERROR_MESSAGE
    What can be the reason for this error? I have successfully loaded the data into two layers of DSO before loading it into the cube.

    Hi,
    Are you loading a flat file to the DSO?
    If so, check the date field in the PSA data and replace invalid values with the correct date.
    I think you are using a write-optimized DSO, which is similar to the PSA in that it takes over all the data from the PSA.
    So correct the data in the PSA, then load it to the DSO and then to the cube.
    If you don't need the date field in the cube, remove its mapping in the cube transformation, activate the transformation and the DTP, and trigger the load again; it will work. (A field-routine sketch for catching invalid dates follows below.)
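    If the bad dates cannot simply be corrected in the PSA, a field routine in the transformation can catch them. This is only a sketch under assumptions: SOURCE_FIELDS-doc_date stands in for whatever your source date field is actually called, and whether to clear the value or substitute a default is a business decision.

        * Field routine sketch for the target date field.
        DATA: l_date TYPE sy-datum.

        l_date = source_fields-doc_date.   " doc_date is a placeholder name

        CALL FUNCTION 'DATE_CHECK_PLAUSIBILITY'
          EXPORTING
            date                      = l_date
          EXCEPTIONS
            plausibility_check_failed = 1
            OTHERS                    = 2.

        IF sy-subrc <> 0.
          " Invalid date from the source: pass the initial value (or a
          " default such as sy-datum) so the update does not abort with
          " "SYST: Date 00/01/0000 not expected".
          CLEAR result.
        ELSE.
          result = l_date.
        ENDIF.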

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube, so they contained no requests at all. The cube load uses "Delta Update". First the DSO was loaded with 138,300 records successfully, and then I activated the DSO. But when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reason for this situation?
    Thank you so much!

    Hi BI User,
    For a delta upload into a data target, there must be an initialization request in the data target with the same selection criteria.
    So first do the initialization, and then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • Problem in Loading data for clob column using sql ldr

    Hi,
    I am having a problem loading data into tables that have CLOB columns.
    Could anyone help me correct the control file below so that it loads data in the format shown?
    Any help is really appreciated.
    Table Script
    Create table samp
    (
    no number,
    col1 clob,
    col2 clob
    )
    Ctrl File
    options (skip =1)
    load data
    infile 'c:\1.csv'
    Replace into table samp
    fields terminated by ","
    trailing nullcols
    no,
    col1 Char(100000000) ,
    col2 Char(100000000) enclosed by '"' and '"'
    Data File(1.csv)
    1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
    2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"
    Error Encountered
    ORA-01461: can bind a LONG value only for insert into a LONG column
    Table samp
    Thanks in advance

    I can't reproduce it on my 10.2.0.4.0. CTL file:
    load data
    INFILE *
    Replace into table samp
    fields terminated by ","
    trailing nullcols
    no,
    col1 Char(100000000) ,
    col2 Char(100000000) enclosed by '"' and '"'
    BEGINDATA
    1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
    2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"
    Loading:
    SQL> Create table samp
      2  (
      3  no number,
      4  col1 clob,
      5  col2 clob
      6  );
    Table created.
    SQL> host sqlldr scott/tiger control=c:\temp\samp.ctl log=c:\temp\samp.log
    SQL> select * from samp
      2  /
            NO
    COL1
    COL2
             1
    asdf
    assasadsdsdsd"sfasdfadf"sdsdsa,ssfsf
             2
    sfjass
    dksadk,kd,ss"dfdfjkdjfdk"sasfjaslaljs
            NO
    COL1
    COL2
    SQL>
    SY.

  • Problem with loading data to Essbase

    Hi All,
    I have a problem with loading data into Essbase. I've prepared a MaxL script to load the data, calling a rule file. The source table is located in an Oracle RDBMS. The script works correctly, i.e. it generally loads data into Essbase.
    But the problem is that after deleting the data from Essbase, when I try to load it again from the source table I get the message: WARNING - 1003035 - No data values modified by load of this data file - although there is no data in Essbase... I've also tried changing the load mode from 'overwrite' to 'add to existing values' (in the rule file), but it doesn't help... Any ideas what I can do?

    Below are a few lines from EPM_ORACLE_INSTANCE/diagnostics/logs/essbase/dataload_ODL.err:
    [2013-09-24T12:01:40.480-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received Validate Login Session request
    [2013-09-24T12:01:40.482-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received client request: Get App and Database Status (from user [admin@Native Directory])
    [2013-09-24T12:01:54.488-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: MaxL: Execute (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Received client request: MaxL: Describe (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [MLEXEC-2] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Output columns described
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Define (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [MLEXEC-3] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Record(s) fetched
    [2013-09-24T12:01:54.496-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.498-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received Validate Login Session request
    [2013-09-24T12:01:54.499-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: Get Application State (from user [admin@Native Directory])

  • Problem while loading data from ODS to infoobject

    Hi guys,
    I am facing a problem while loading data from an ODS to an InfoObject.
    If I load the data via the PSA it works fine, but
    if I load the data without the PSA it gives a "duplicate records" error.
    Do you have any idea why this is so?
    Thanks in advance
    savio

    Hi,
    When you load the data via the PSA, what did you select: serial or parallel?
    If you select serial, most likely you don't have duplicates within the same data package and your load can go through.
    When loading directly into the InfoObject, the "duplicate records" error will therefore be raised if the same key occurs in two different packages; you can perhaps flag your InfoPackage with the option "ignore duplicate records" as suggested...
    Hope this helps...
    Olivier.

  • Loading data to DSO in BI 7.0

    Hi,
    How do I load data to a DSO in BI 7.0? Is an InfoSource mandatory if we want to load data to a DSO? What about the 0RECORDMODE field for the DSO? How do I maintain the transformations? Please guide me through the BI 7.0 process.

    Hi Selva,
    InfoSource is not mandatory in BI 7.0...
    0RECORDMODE
    The field 0RECORDMODE is used in delta updates to pass the change indicators of the DataSource from the OLTP system to the BI system.
    Questions and answers on InfoObject 0RECORDMODE
    SAP Note Number:[399739 |https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_bct/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d333939373339%7d]
    0RECORDMODE
    Re: Using 0RECORDMODE
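    For illustration only, the usual 0RECORDMODE values and a tiny field-routine sketch that reacts to them; the "record deleted" flag in the target is an assumption made for this example, not part of any standard content.

        * Typical 0RECORDMODE values delivered with the delta:
        *   ' ' after-image, 'X' before-image, 'N' new record,
        *   'A' additive image, 'D' delete, 'R' reverse image.
        IF source_fields-recordmode = 'D' OR
           source_fields-recordmode = 'R'.
          result = 'X'.   " hypothetical "record deleted" flag in the target
        ELSE.
          CLEAR result.
        ENDIF.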
    Transformation
    The transformation process allows you to consolidate, cleanse, and integrate data. You can semantically synchronize data from heterogeneous sources.
    When you load data from one BI object into a further BI object, the data is passed through a transformation. A transformation converts the fields of the source into the format of the target.
    http://help.sap.com/saphelp_nw04s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/frameset.htm
    Creating Transformations
    http://help.sap.com/saphelp_nw04s/helpdata/en/d5/da13426e48db2ce10000000a1550b0/frameset.htm
    DataStore Object
    A DataStore object serves as a storage location for consolidated and cleansed transaction data or master data on a document (atomic) level.
    Unlike multidimensional data storage using InfoCubes, the data in DataStore objects is stored in transparent, flat database tables. The system does not create fact tables or dimension tables.
    http://help.sap.com/saphelp_nw04s/helpdata/en/f9/45503c242b4a67e10000000a114084/frameset.htm
    Scenario for Using Standard DataStore Objects (check the figure to see how DSO is integrated in data flow...)
    http://help.sap.com/saphelp_nw04s/helpdata/en/bd/c627c88f9e11d4b2c90050da4c74dc/frameset.htm
    Regards
    Andreas

  • Facing  problem while loading data into XI

    Hi All,
    I am facing a peculiar problem while loading data into XI. When we load a small amount of data, say 5-10 records, the records get processed. But when we try to load a somewhat higher volume, say 100-500 records (which comes out to about 7 records per second), the record processing stalls: it just gives the message "recorded for outbound processing".
    Can anybody throw some light on this and tell us why we are facing this problem and how we can resolve it?
    Thanks in Advance.
    Regards
    rahul

    Hi rahul,
    Deregistering and registering the queue is only a temporary solution.
    The better way would be to tune your XI system, as already mentioned by Vijaya and as described in these guides:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7fdca26e-0601-0010-369d-b3fc87d3a2d9
    Regards,
    Bhavesh

  • Problem in loading data from one DSO to another

    Hi All,
    I am using MultiCube 0PM_MP03. This MultiCube uses DSOs 0PM_DS02 and 0PM_DS01; data flows from the PSA to 0PM_DS01 and then to 0PM_DS02 via an update rule. I triggered my InfoPackage and I can see data in 0PM_DS01, and I have activated the data in 0PM_DS01. Then I right-clicked on 0PM_DS01 and chose Additional Functions -> Update 3.x Data to Target. The request status is now green in 0PM_DS02, but I cannot see any data in this DSO, neither in the new nor in the active table. Can anyone help me figure out where I am going wrong?
    Thanks,
    Pritam.

    Hi Pritam,
    This is strange. Check the points below.
    1. Is the number of records added 0 in the manage screen of DSO2?
    2. Confirm whether there is a transformation between DSO1 and DSO2. If there is, you need to use a DTP (not Additional Functions -> Update 3.x Data to Target).
    3. Activate the request and then check again.
    4. Maintain a PSA for the DataSource 80PM_DS01 and, when loading data from DSO1 to DSO2, choose the option "PSA and then into data targets".
    Regards,
    Panksj.

  • Problem in loading data from one DSO to onother DSO in BI7

    Hi
    I am loading data from one DSO to another. After executing the DTP I get the error message "Aggregation NOP not allowed on DB (key figure 0DSCT_DAYS1)". Here 0DSCT_DAYS1 is a key figure whose aggregation type is "NOP" and whose exception aggregation is "No aggregation (X, if more than one value unequal to 0 occurs)".
    Please kindly give me a solution.
    Radha Krishna

    We had this issue in our project. It mainly arises when you have written a start routine in the transformation. At start-routine level the system cannot aggregate the fields for which NOP is set, and that is why it throws the error.
    Please check the load after removing the start routine. If it then works, introduce an intermediate data target that has exactly the same structure as the previous data target, except for the fields with NOP aggregation.

  • Semantic Partitioning Delta issue while load data from DSO to Cube -BW 7.3

    Hi All,
    We have created the semantic partitions with the help of a BAdI. Everything looks good.
    The first time, I loaded the data with a full update.
    The second time, I initialized the delta and then pulled the delta from the DSO to the cube. The DSO is a standard DSO, whereas the cube is partitioned with the help of semantic partitioning. What I can see is that only the records updated in the latest delta show up in the report; all the rest are ignored by the system.
    I tried compression, but it still did not work.
    Has anyone faced this kind of issue?
    Thanks

    Yaseen & Soujanya,
    It is very hard to guess the problem with the amount of information you have provided.
    - What do you mean by cube is not being loaded? No records are extracted from DSO? Is the load completing with 0 records?
    - How is data loaded from DSO to InfoCube? Full? Are you first loading to PSA or not?
    - Is there data already in the InfoCube?
    - Is there change log data for DSO or did someone delete all the PSA data?
    Sincere there are so many reasons for the behavior you are witnessing, your best option is to approach your resident expert.
    Good luck.
    Sudhi Karkada
    <a href="http://main.nationalmssociety.org/site/TR/Bike/TXHBikeEvents?px=5888378&pg=personal&fr_id=10222">Biking for MS Relief</a>

  • Error when loading data from DSO to Cube

    Hi,
    After upgrading to 2004s we get the following error message when trying to load data from a DSO to a Cube:
    Unexpected error: RSDRC_CONVERT_RANGES_TO_SELDR
    Has anyone experienced a similar problem or have some guidelines on how to resolve it?
    Kind regards,
    Fredrik

    Hi Martin,
    Thanks!
    Problem solved
    //Fredrik

  • Issue while loading data from DSO to Cube

    Gurus,
    I'm facing a typical problem while loading a cube from a DSO. The load is based on certain conditions.
    Though the data in the DSO satisfies those conditions, the cube is still not being loaded.
    All the loads are managed through process chains.
    Could there be any specific reason for this type of problem?
    Any pointers would be of the greatest help!
    Regards,
    Yaseen & Soujanya

    Yaseen & Soujanya,
    It is very hard to guess the problem from the amount of information you have provided.
    - What do you mean by "the cube is not being loaded"? Are no records extracted from the DSO? Is the load completing with 0 records?
    - How is the data loaded from the DSO to the InfoCube: full or delta? Are you loading to the PSA first or not?
    - Is there already data in the InfoCube?
    - Is there change-log data for the DSO, or did someone delete all the PSA data?
    Since there are so many possible reasons for the behavior you are witnessing, your best option is to approach your resident expert.
    Good luck.
    Sudhi Karkada

  • Problem in transfering data from DSO to InfoCube

    Hi,
    I am trying to transfer data from a DSO to an InfoCube, but it gives the following message:
    Last delta update is not yet completed.
    Therefore, no new delta update is possible.
    You can start the request again
    if the last delta request is red or green in the monitor (QM activity).
    It does not allow me to load data into the InfoCube from the DSO. I checked the request number, but there is no request in the InfoCube. If there were a request, I could delete it from the InfoCube, delete the data mart status in the DSO, and reload. But there is no request ID in the InfoCube, so please guide me on how I can transfer data to the InfoCube and get past this problem.
    Actually, a further update was running from the DSO to the InfoCube, but it failed. By executing function module RSPC_PROCESS_FINISH I set the chain to green before loading the
    data into the InfoCube, so the chain started executing the next step; that is why there is no request ID. So please guide me on how I can retain this request ID.
    Edited by: Purushotham Reddy on Jan 28, 2008 11:19 AM
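    For reference, the kind of call described above looks roughly like the sketch below. It is only a sketch: the parameter names of RSPC_PROCESS_FINISH are quoted from memory and should be verified in SE37, and all values are placeholders to be taken from the process chain log.

        * Force a hanging/failed chain process to green - use with care,
        * since it skips the process without fixing the underlying load.
        CALL FUNCTION 'RSPC_PROCESS_FINISH'
          EXPORTING
            i_logid    = 'LOG_ID_OF_THE_CHAIN_RUN'   " placeholder
            i_type     = 'LOADING'                   " process type
            i_variant  = 'PROCESS_VARIANT'           " placeholder
            i_instance = 'PROCESS_INSTANCE'          " placeholder
            i_state    = 'G'.                        " 'G' = green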

    Hi Reddy,
    Set the QM status of the failed request to red, even if it is already red,
    and then run the same InfoPackage used in the chain. This will ask for a repeat of the last delta.
    If this process is clear to you, fine; otherwise read on:
    The other option is to check whether the failed request is present in the PSA. If it is, update the failed request from the PSA; in that case you won't need a repeat of the last delta. This option is easier. You can do it from the Manage tab of the DataSource, which I think you surely know already....
    Assign points if it helps you.....
    Thanks,
    Nagesh.
