Data in DSO

Hi Experts,
I am loading data from PSA to DSO using a DTP. I have around 3 million records in PSA, but I got around 18 million records in the DSO. I am wondering how this is possible. The key fields in my DSO are Sales doc no, Material no, and Billing Item. Please advise me.
Thanks,
Rani.

hi,
You may have multiple rule groups that generate more than one record from a single record.
Check the transformation -> rule group.
Otherwise, you may have a start/end routine that splits the records (a sketch follows below).
Check how many records were in each delta load.
Are you checking the data volume in the active table?
Ramesh
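
To illustrate the routine case: the sketch below is only an assumption of how an end routine might multiply records (the split factor and the logic are invented for this example); it is not taken from your actual transformation. RESULT_PACKAGE and the generated types _ty_s_TG_1 / _ty_t_TG_1 are typically the names BW 7.x provides in an end routine.

DATA: ls_result TYPE _ty_s_TG_1,
      lt_extra  TYPE _ty_t_TG_1.

" Example split: create five additional copies of every record, so that
" more records leave the routine than entered it.
LOOP AT RESULT_PACKAGE INTO ls_result.
  DO 5 TIMES.
    APPEND ls_result TO lt_extra.
  ENDDO.
ENDLOOP.
APPEND LINES OF lt_extra TO RESULT_PACKAGE.
" One incoming record now produces six outgoing records (1 + 5),
" i.e. roughly 3 million records would become about 18 million.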

Similar Messages

  • Issue when uploading Sales data from DSO to Cube.

    Dear All,
    I have an issue when I am uploading sales data from DSO to cube. I am using BI 7.0 and I have uploaded all sales document level data to my DSO. Then I use a transformation rule to calculate the Sales Value when I am doing the DTP to the cube. The cube has customer-wise aggregated data.
    In the DSO I have NetPrice (KF) and Delivered_QTY (KF). I do a simple multiplication routine in the transformation from DSO to cube:
    RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY.
    At the moment I use Active Table (Without Archive) on the DSO to get the data, since this is my first load.
    The issue is that the figure (Sales Value) in the cube is incorrect. I am getting very large values, which is impossible.
    Can someone please help me?
    Shanka

    Hi,
    Are you sure that the cube has customer-wise aggregated data? It will always aggregate the values of the key figures for the same set of characteristics.
    Did you check the values of the other key figures as well? Are they also inflated, or is the problem with this key figure only?
    During the data load the records may first be aggregated for the same characteristic values, and only then does the multiplication happen. If that is the case, you may have to multiply the values before they are stored in the data package and then let them aggregate; this can be achieved through a start routine (see the sketch below).
    But first verify whether the other key figures have the same issue.
    Thanks
    Ajeet
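
    To make the start-routine idea concrete, here is a minimal sketch only, not the poster's actual code: it assumes the source structure offers NET_PRICE, DLV_QTY and a hypothetical key figure SALES_VAL to carry the product, and uses SOURCE_PACKAGE and _ty_s_SC_1, the names BW 7.x typically generates for a start routine.

    FIELD-SYMBOLS: <ls_source> TYPE _ty_s_SC_1.

    " Multiply per record in the start routine, before the key figures
    " are aggregated for identical characteristic combinations.
    " SALES_VAL is a made-up field used to carry the product forward.
    LOOP AT SOURCE_PACKAGE ASSIGNING <ls_source>.
      <ls_source>-sales_val = <ls_source>-net_price * <ls_source>-dlv_qty.
    ENDLOOP.

    The rule for the cube key figure can then map SALES_VAL directly instead of multiplying two key figures that may already have been aggregated.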

  • Not able to activate data in DSO

    Hi Gurus,
    We are new to BI 7. We are trying to load data into a DSO. We followed this procedure:
    1) created a flat file
    2) created a DataSource and clicked on the Proposal tab
    3) created InfoObjects (data type, length, ...) according to the proposal
    4) created an InfoPackage and loaded the data into PSA
    5) created the DSO with the settings "activate data target" ...
    6) created the transformation
    7) ran the DTP.
    The data gets as far as the new data table (successfully loaded up to here, green), but it does not move into the active table. I tried both ways of activating, but no use. It does not give any concrete error message; it just says "time out" (red).
    Please let me know if there is any other procedure for loading data into the active table.
    Thanks,
    Praveen

    I'll put my 2 cents here: it could be that you are running the activation job in DIALOG mode and hence timing out.
    You may want to execute the activation process in BACKGROUND mode with 1 dedicated BGD process, configured in transaction RSODSO_SETTINGS.
    FYI:
    DSO Setting - Activate Data
    Also check the SM21 and ST22 logs.
    Hope it Helps
    Chetan
    @CP..

  • Loading data to DSO from flat file

    hi,
    While loading data to a DSO from a flat file, which fields should be specified as key fields and which as data fields? Suppose we have client, company code, record no, controlling area, plant, etc.; which fields should be taken?

    Hi,
    You can take 0FISCPER, 0FISCVARNT, 0CURTYPE, etc. as the key fields.
    The rest of the fields are data fields.
    Regards.

  • Error while activating data in DSO

    Hi Gurus, while activating I am getting these two messages: "Error when assigning SID: Action VAL_SID_CONVERT table 0FISCPER" and "Fiscal year variant K4 not expected". In some other data packages I am getting "Error when assigning SID: Action VAL_SID_CONVERT table 0FISCPER".
    Kindly help in this matter.

    Hi,
    Try these threads.
    Error when activating data in 0FI_GL_04 ODS (SIDS)
    Unable to activate data in DSO
    Error when assigning SID : Action VAL_SID _CONVERT table
    Can'T Activate ODS Data
    FI error in rspc chain
    Same issue as yours.
    Hope this helps.
    Thanks,
    JituK

  • Issue in Data from DSO to DSO Target with different Key

    Hello All,
    I am having an issue loading data from a DSO to a DSO target with a different key.
    The source DSO has Employee + Wage + Post number as key, and the target has Employee + Wage Type as key.
    Is my understanding right that DSO semantic grouping works like a GROUP BY clause in SQL?
    Also if someone can explain this with a small example, it would be great.
    Many Thanks
    Krishna

    Dear, as explained earlier, your issue has nothing to do with semantic grouping.
    Semantic grouping is only useful when you have written a routine in the transformation for calculations, and in error handling.
    Please go through this blog, which explains the use of semantic grouping very clearly:
    http://scn.sap.com/community/data-warehousing/netweaver-bw/blog/2013/06/16/semantic-groups-in-dtp
    Now, coming to your question above:
    Case 1:
    DSO 1
    Employee   WageTyp   Amount
    100        aa        200
    200        aa        200
    100        bb        400
    100        cc        300
    If we have the semantic group on Employee, Employee as the key of the target DSO, and update type summation, then the target DSO will have:
    Emp     Amount
    100     700
    200     200
    In this case the wage type will come from the last record arriving in the data package. If the record 100 cc 300 arrives last, then the wage type will be cc.
    Case 2:
    DSO 1
    Employee   WageTyp   Amount
    100        aa        200
    200        aa        200
    100        bb        400
    100        aa        300
    If we do semantic grouping on Employee and Wage Type, with Employee and Wage Type as the key of the target DSO and update type summation, then the target DSO will have:
    Emp     Wage     Amount
    100     aa       500
    200     aa       200
    100     bb       400
    Hope this helps.
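
    As a standalone illustration of the summation behaviour (this is not transformation code, and the field names are invented for the example), ABAP's COLLECT statement sums the numeric fields of rows whose character-like fields match, which mirrors what the target DSO does when Employee and Wage Type form the key and the update type is summation:

    TYPES: BEGIN OF ty_rec,
             employee TYPE n LENGTH 8,
             wagetyp  TYPE c LENGTH 2,
             amount   TYPE p LENGTH 8 DECIMALS 2,
           END OF ty_rec.
    DATA: lt_source TYPE STANDARD TABLE OF ty_rec,
          lt_target TYPE STANDARD TABLE OF ty_rec,
          ls_rec    TYPE ty_rec.

    " Case 2 records from above
    ls_rec-employee = '100'. ls_rec-wagetyp = 'aa'. ls_rec-amount = 200. APPEND ls_rec TO lt_source.
    ls_rec-employee = '200'. ls_rec-wagetyp = 'aa'. ls_rec-amount = 200. APPEND ls_rec TO lt_source.
    ls_rec-employee = '100'. ls_rec-wagetyp = 'bb'. ls_rec-amount = 400. APPEND ls_rec TO lt_source.
    ls_rec-employee = '100'. ls_rec-wagetyp = 'aa'. ls_rec-amount = 300. APPEND ls_rec TO lt_source.

    " COLLECT sums AMOUNT for identical EMPLOYEE/WAGETYP combinations,
    " giving 100/aa/500, 200/aa/200 and 100/bb/400 as in the table above.
    LOOP AT lt_source INTO ls_rec.
      COLLECT ls_rec INTO lt_target.
    ENDLOOP.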

  • Problem while processing TRANSACTION data from DSO to CUBE

    Hi Guru's,
    We are facing a problem while processing transaction data from DSO to cube. The data packets are processing and updating very slowly. Please help me regarding this.
    Thanks and regards,
    Sridhar

    Hi,
    I will suggest you to check a few places where you can see the status
    1) SM37 job log (give the BI request name): it should give you the details about the request. If it is active, make sure that the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not. See if it is accessing/updating some tables or not doing anything at all.
    If it is running and you are able to see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the Details tab. It may be in the update rules.
    4) ST22: check if any short dump has occurred.
    You can also try SM50 / SM51 to see what is happening at the system level, like reading/inserting into tables etc.
    If you feel it is active and running, you can verify by checking whether the number of records in the cube has increased.
    Thanks,
    JituK

  • Job cancelled While loading data from DSO to CUBE using DTP

    Hi All,
    While I am loading data from DSO to cube, the load job is getting cancelled.
    In the job overview I got the following messages:
        SYST: Date 00/01/0000 not expected.
        Job cancelled after system exception ERROR_MESSAGE
    What can be the reason for this error, as I have successfully loaded data into 2 layers of DSO before loading the data into the cube?

    hi,
    Are you loading a flat file to the DSO?
    Then check the date field in the PSA data and replace it with the correct date.
    I think you are using a write-optimized DSO, which is similar to the PSA: it takes all the data from PSA into the DSO.
    So clear the data in the PSA, then load to the DSO and then to the cube.
    If you don't need the date field in the cube, remove the mapping in the cube transformation, activate it, activate the DTP and trigger it; it will work. A sketch of a date check in the transformation follows below.
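
    A minimal sketch of the kind of check that can catch such a value in a field routine; DOC_DATE is a hypothetical source field name used only for this illustration, while SOURCE_FIELDS, RESULT and CX_RSROUT_SKIP_RECORD are the usual BW 7.x field routine names:

    " Skip records with an empty or 00/01/0000-style date instead of
    " passing an invalid date on to the cube.
    IF SOURCE_FIELDS-doc_date IS INITIAL OR
       SOURCE_FIELDS-doc_date = '00000000'.
      RAISE EXCEPTION TYPE cx_rsrout_skip_record.
    ENDIF.
    RESULT = SOURCE_FIELDS-doc_date.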

  • Error while loading data from DSO to Cube

    Dear Experts,
    I have changed the update routines of the update rule for the DSO and of the update rule between the DSO and the cube. When I upload data into the DSO it works fine, but when uploading from the DSO to the cube it gives an error.
    Error Message :
    DataSource 8ZDSO_ABB does not have the same status as the source system in the Business Information Warehouse.
    The time stamp in the source system is 30.09.2010 14:05:17.
    The time stamp in the BW system is 09.05.2008 16:41:10.
    I executed the program RS_TRANSTRU_ACTIVATE_ALL, but it still gives the same error.
    Regards,
    Anand Mehrotra.

    Dear Experts,
    Thanks for the help. Problem solved.
    Regards,
    Anand Mehrotra.

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from DSO to cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contain no requests at all. The cube uses "Delta Update". First, the DSO was loaded with 138,300 records successfully, and then I activated the DSO. Finally, when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reasons for this situation?
    Thank you so much!

    Hi BI User,
    For a delta upload into the data target, there should be an initialization request in the data target with the same selection criteria.
    So first do the initialization, then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • Regarding loading the data from DSO to cube

    Hello Experts,
    I have a DSO which loads data from PSA using a 7.0 transformation (via DTP), and I have a cube which loads the data from that DSO using 3.x transfer rules. Now I have deleted the init request for an InfoPackage before loading the data into the DSO. But when I load the data from the DSO to the cube by right-clicking the DSO -> Additional Functions -> Update 3.x Data to Targets, it gives me an error like 'Delete init. request REQU_4H7UY4ZXAO72WR4GTUIW0XUKP before running init. again with same selection'.
    Please help me with this.
    I want to load the data in the init request to the cube.
    Thanks

    Hi Shanthi,
    Thanks for the reply. I have already deleted the init request from the source system to the DSO and then tried again, but I am still getting the error.
    Thanks

  • DTP loading data from DSO to CUBE: can't debug

    Hi,all expert.
    I ran into a problem yesterday.
    When I load data from the DataSource to the DSO and debug the start routine, it works fine. But when I load data from the DSO to the cube and debug the routine, I get an error like this:
    inconsistent input parameter (parameter: <unknown>, value <unknown>)
    message no: RS_EXCEPTION 101.
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to put a small piece of code with an infinite loop in your start routine. When you start the load, go to SM50 and debug the work process that is executing the load; from there you can break out of the infinite loop and carry on with the debugging (see the sketch after this reply). Please let me know if you require further information.
    Thanks,
    Arminder
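
    A minimal sketch of the infinite-loop trick, an assumption of how it is usually written rather than Arminder's exact code: put it at the top of the start routine, start the DTP, then in SM50 select the running work process and choose the debugging option for it.

    DATA lv_stop TYPE c LENGTH 1.
    " The routine waits here until you change lv_stop in the debugger.
    WHILE lv_stop IS INITIAL.
      " In the debugger, set lv_stop to 'X' to leave the loop and then
      " step through the rest of the routine as usual.
    ENDWHILE.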

  • Semantic Partitioning delta issue while loading data from DSO to Cube - BW 7.3

    Hi All,
    We have created the semantic partition with the help of a BAdI to perform the partitioning. Everything looks good.
    The first time I loaded the data it was a full update.
    The second time I initialized the data load and pulled the delta from the DSO to the cube. The DSO is standard, whereas the cube is partitioned using semantic partitioning. What I can see is that only the records updated in the latest delta are shown in the report; all the rest are ignored by the system.
    I tried compression, but it still did not work.
    Has someone faced this kind of issue?
    Thanks

  • Issue while loading of data from DSO to InfoCube

    Hi Experts,
    Can you tell me what the root cause might be? The data coming into the DSO from R/3 is correct and fine as required, but after loading it from the DSO into the InfoCube it shows wrong data: some line items that were closed are shown as open in the cube, and the KF values are not right either.
    Also, there is no routine code involved between the DSO and the InfoCube.
    Thanks in advance.
    NP

    I hope you didn't delete some request from the DSO without deleting the change log. This might cause inconsistency.
    If so, delete the data from the DSO (right-click -> Delete Data) and reload.

  • Data from DSO to CUBE

    hi gurus,
    Please tell me from which table of the DSO (change log or active data) the cube will pick the records,
    and also whether the overwrite functionality works with the DSO change log.
    Awaiting a quick response.
    Thank you

    Hi,
    The answer is already in my previous post,
    i.e. delta load - from the change log.
    But in the BI 7.0 data flow, when you are using a DTP, the delta behaviour is based on your selection, as below.
    BW Data Manager: Extract from Database and Archive?
    The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted. For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization), since, because of the delta logic, the following requests must all be extracted from the change log.
    For Extraction from the DataStore Object, you have the following options:
    Active Table (with Archive)
    The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    Active Table (Without Archive)
    The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
    Archive (Only Full Extraction)
    The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
    Change Log
    The data is read from the change log of the DataStore object.
    For Extraction from the InfoCube, you have the following options:
    InfoCube Tables
    Data is only extracted from the database (E table and F table and aggregates).
    Archive (Only Full Extraction)
    The data is only read from the archive or from a near-line storage.
    Hope this gives you a clear idea.

  • Issue while loading data from DSO to Cube

    Gurus,
    I'm facing a typical problem while loading a cube from a DSO. The load is based upon certain conditions.
    Though the data in the DSO satisfies those conditions, the cube is still not being loaded.
    All the loads are managed through process chains.
    Would there be any specific reason for this type of problem?
    Any pointers would be of greatest help !
    Regards,
    Yaseen & Soujanya

    Yaseen & Soujanya,
    It is very hard to guess the problem with the amount of information you have provided.
    - What do you mean by cube is not being loaded? No records are extracted from DSO? Is the load completing with 0 records?
    - How is data loaded from DSO to InfoCube? Full? Are you first loading to PSA or not?
    - Is there data already in the InfoCube?
    - Is there change log data for DSO or did someone delete all the PSA data?
    Since there are so many reasons for the behavior you are witnessing, your best option is to approach your resident expert.
    Good luck.
    Sudhi Karkada
    Biking for MS Relief: http://main.nationalmssociety.org/site/TR/Bike/TXHBikeEvents?px=5888378&pg=personal&fr_id=10222
