Is manual loading to a Standard DSO possible?

Dear Mates,
Is there any program or function module for writing data into a Standard DSO? For cubes we have a program.
Please help to solve this issue.
Thanks,
Ranganath.

Hi Ranganath,
What I did in the past was create an ABAP program and use the INSERT command to write records directly into a Direct Update DSO, but I have not tried this with a Standard DSO.
Maybe you can try writing data directly into the active table of the Standard DSO with an ABAP report.
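For illustration only, a minimal sketch of such a report is shown below. The DSO name ZSALES and its generated active table /BIC/AZSALES00 are invented examples, and the field names depend entirely on your DSO definition. Also note that writing straight to the active table bypasses the activation queue and change log, so downstream delta loads will not see these records.

  REPORT z_manual_dso_load.

  " Work area and internal table typed on the DSO's generated active table
  " (table and field names below are hypothetical)
  DATA: ls_rec TYPE /bic/azsales00,
        lt_rec TYPE STANDARD TABLE OF /bic/azsales00.

  " Fill a test record
  ls_rec-/bic/zdocno  = '0000000001'.
  ls_rec-/bic/zamount = '100.00'.
  APPEND ls_rec TO lt_rec.

  " Write directly into the active table of the Standard DSO
  INSERT /bic/azsales00 FROM TABLE lt_rec ACCEPTING DUPLICATE KEYS.
  IF sy-subrc = 0.
    COMMIT WORK.
    WRITE: / 'Record(s) written to the active table.'.
  ELSE.
    ROLLBACK WORK.
    WRITE: / 'Some records were not inserted (duplicate keys).'.
  ENDIF.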
Regards,
Durgesh.

Similar Messages

  • How to edit data while loading data from W/O to Standard DSO?

    Hello,
    I am loading data from a W/O DSO to a Standard DSO. During activation it errored out with a SID failure for one InfoObject (the error is caused by a lowercase letter), but I can't change the InfoObject setting or the transformation.
    Is there any way to edit the data (either in the W/O DSO or in the new data of the Standard DSO)?
    Thanks and regards,
    Himanshu.

    HI,
    Please check the setting in transaction RSKC. If it is set to ALL_CAPITAL, your options are: change the permitted-character setting, write a routine in the transformation, load to the PSA and modify the record there (not applicable for BI 7), or remove the setting in RSKC (not suggested).
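    For the routine option, a field routine body along these lines converts the value before the SID check (just a sketch; ZCUSTNAME is a made-up InfoObject name):

      " Field routine body in the transformation (BW 7.x), illustrative only
      result = source_fields-/bic/zcustname.
      TRANSLATE result TO UPPER CASE.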
    Cheers
    Vikram

  • Loading of data in the standard dso

    hello everyone,
    I loaded the data into the Standard DSO using a flat file.
    After that I changed one record in the file and uploaded it with a delta update.
    The thing is, I can't see the changed data in the DSO or in its tables.

    Hi,
    With a flat file the system does not understand delta unless some specific code is written.
    Every delta load needs a timestamp, and I guess you are just loading one flat file directly into the system by selecting the delta option in the InfoPackage/DTP. Either you write code so the system can recognise the timestamp, or you do a full load, which is the usual approach for flat files. A full load will overwrite everything, you will get the changed records as well, and you will find entries in all the tables (activate the data after the load).
    You may also try a pseudo delta by loading only the changed records into the DSO. In this case the load is technically full, but the file contains only the changed records, which are added to the existing data.
    I hope this helps.
    Thanks,
    S

  • How to preserve data when converting a Standard DSO into a Write Optimized

    Hi,
    I'm looking for proven strategies for preserving data when converting a standard DSO into a write optimized DSO. The data has to be dropped before the new DSO is transported into the environment.
    1. The DSO is currently in sync with a cube.
    2. The PSA does not have all the data which is in the DSO.
    3. Data volume is incredibly high for a full reload from ECC, so we'd like to avoid that option.
    Appreciate any help!

    Hi Gregg,
    have you considered just deleting the data? I know that sounds simple, but it might be a valid solution.
    If the DSO is just receiving new data (e.g. FI documents), you can continue to deliver a logically correct delta to the cube.
    Should that not be possible and you really want all data that you currently have in your DSO1 in the write optimized future of it, then how about this:
    - Create a new DSO2 same structure as DSO1
    - Load all data into that
    - Delete all data from your DSO1 and import the transport to make it write optimized
    - Load all data back into your now write-optimized DSO1 from DSO2
    The problem you then have is that all the data you have already loaded into your cube will be delivered as a delta from DSO1 again.
    Depending on your transformation / update rules that might or might not be a problem.
    Best,
    Ralf

  • Loading a BW DSO via BODS

    Hello Everyone
    We are using BW 7.3 and BODS 4.1
    I have a requirement to load a BW DSO (Standard DSO) via Business Objects Data Services, but I am not having much luck...
    I have successfully established (tested) the RFC connection between the two systems; however, I am lost at the step that says
    "Create the BW Data Sources --> create or map the data sources needed with a certain RFC connection"
    I am not sure what is meant by that statement. I know that BODS only loads data into the PSA; is that why I am being asked to create a new DataSource? If so, what type of DataSource should I be creating?
    I have created a DSO (which contains all of the fields that are going to be coming in), and it is going to receive data from a non-SAP source that is being loaded into an SQL staging area. I want to transfer the data from that SQL staging area into the BW DSO using BODS (a BODS-triggered and -controlled job).
    I have been following this document -->   BOBJ Data Services and Netweaver BW
    and I have also found the Video Tutorial --> SAP D&T Academy - How to Load Data into SAP BW Using SAP Data Services - YouTube
    Can someone please help me understand, and if possible provide step-by-step guidance as to what I need to do and why? It will be much appreciated, and of course I will share the knowledge with others.

    Hi Kumar
    Thank you for your response. When I try to create a DataSource in the BO Data Services component, I get to the following screen...
    and then I am not sure what would go in the "Source Object" field. Do I put the name (or table name) of the DSO that I want to load via BODS here? When I perform a "Check" I get the following error message,
    and I am not able to select other options or search for all available objects (using the "view all objects" icon with the "sunny mountain" symbol).
    Can you or someone else please help me further?

  • Why can we not see the New Data Table for a Standard DSO in SE16

    Hi,
    We say that a Standard DSO has three tables (New Data Table, Active Data Table and Change Log Table), so why can we not see the New Data Table of a Standard DSO in SE16?
    Regards,
    Sushant

    Hi Sushant,
    It is possible to see the data of all three DSO tables through SE16 (the New Data table is typically the generated table /BIC/A<DSO name>40 and the Active table /BIC/A<DSO name>00). Maybe you do not have authorization to display data through SE16.
    Sankar Kumar

  • Error while Activating the Standard DSO

    Hi,
    I am getting the below error while Activating the Standard DSO.
    Characteristic 0DPM_DCAS__0CSM_SPRO: DS object 0DPM_DCAS__0CSM_STAT (master data check) does not exist
    I tried searching the forum, but didn't find any answer.
    Any suggestions are welcome.
    Thank you,
    Adhvi.

    Hi,
    Are you getting the error while trying to activate the DSO data after loading, or while trying to activate the DSO itself?
    If it is during the activation of a request, please check whether you have loaded data into the DSO that does not have the corresponding master data loaded. To prevent this, you can set "No Update without Master Data" in the InfoPackage/DTP.
    You can also go into RSRV and perform elementary tests for the DSO and check the SID table consistency.
    Thanks,
    Divya.

  • Changing of Write Optimized DSO to Standard DSO

    Dear Experts,
    I have created a few Write-Optimized (WO) DSOs based on the requirement, and I have also created a few reports on these WO DSOs.
    The problem is that when I create an InfoSet on the WO DSO, a Standard DSO and a cube (three InfoProviders in total in this InfoSet), it throws an error when I display data from that InfoSet.
    I came to know that the problem is with the WO DSO, so I want to change this WO DSO to a Standard DSO.
    FYI, we are in the development stage only.
    If I copy the same DSO and make it a Standard DSO, the reports I created on it will be disturbed, so I want an alternative solution.
    Regards,
    Phani.
    Edited by: Sai Phani on Nov 12, 2010 5:25 PM

    Hi Sai
    Write-Optimized DSOs always help with data load performance, and I am sure you created the WOD only to take advantage of this point.
    However, a WOD is not suitable when it comes to reporting.
    So instead of converting the WOD to a Standard DSO (where you do lose the data load performance advantage), why don't you connect the WOD to a Standard DSO, i.e. extract the data from the WOD into a Standard DSO and use the Standard DSO for reporting? This would give you the benefit during data loads as well as reporting.
    Cheers
    Umesh

  • Standard DSO - Write Optimized DSO, key violation, same semantic key

    Hello everybody,
    I'm trying to load a Write-Optimized DSO from a Standard DSO, and then the "famous" error is raised:
    During loading, there was a key violation. You tried to save more than
    one data record with the same semantic key.
    The problematic (newly loaded) data record has the following properties:
    o   DataStore object: ZSD_O09
    o   Request: DTPR_D7YTSFRQ9F7JFINY43QSH1FJ1
    o   Data package: 000001
    o   Data record number: 28474
    I've seen many previous posts regarding the same issue, but none quite equal to mine:
    [During loading, there was a key violation. You tried to save more than]
    [Duplicate data records at dtp]
    ...each of them suggests making some changes to the semantic key. Here's my particular context:
    Dataflow goes: ZSD_o08 (Standard DSO) -> ZSD_o09 (Write-Optimized DSO)
    ZSD_o08 Semantic Keys:
    SK1
    SK2
    SK3
    ZSD_o09 Semantic Keys:
    SK1
    SK2
    SK3
    SK4 (value is taken in a routine as SY-DATUM-1)
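    (For reference, the field routine body that derives SK4 presumably looks something like the sketch below; the actual code may differ.)

      " Field routine body for SK4 (sketch only): yesterday's date
      DATA lv_yesterday TYPE d.
      lv_yesterday = sy-datum - 1.
      result = lv_yesterday.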
    As far as I can see there are no repeated records for the semantic keys in ZSD_o08; this is confirmed by querying the active data table of the ZSD_o08 DSO. Looking at the temporary storage of the crashed DTP, at the specific package for the error, I cannot see anything odd either.
    Let's suppose that the Semantic Key is crucial as is currently set.
    Could you please advise? I look forward to your quick response. Thank you and best regards,
    Bernardo

    Hi  Bernardo:
    By maintaining the settings on your DTP you can indicate whether data should be extracted from the active table or the change log table, as described below.
    >-Double-click the DTP that transfers the data from the Standard DSO to the Write-Optimized DSO, click the "Extraction" tab, and in the group at the bottom select one of the 4 options:
    >Active Table (With Archive)
    >Active Table (Without Archive)
    >Active Table (Full Extraction Only)
    >Change Log
    >Hit the F1 key to access the documentation
    >
    >===================================================================
    >Indicator: Extract from Online Database
    >The settings in the group frame Extraction From... or Delta Extraction From... of the Data Transfer Process maintenance specify the source from which the data of the DTP is extracted.  For a full DTP, these settings apply to all requests started by the DTP. For a delta DTP, the settings only apply to the first request (delta initialization), since because of the delta logic, the following requests must all be extracted from the change log.
    >For Extraction from the DataStore Object, you have the following options:
    >Active Table (with Archive)
    >The data is read from the active table and from the archive or from a near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    >Active Table (Without Archive)
    >The data is only read from the active table. If there is data in the archive or in a near-line storage at the time of extraction, this data is not extracted.
    >Archive (Only Full Extraction)
    >The data is only read from the archive or from a near-line storage. Data is not extracted from the active table.
    >Change Log
    >The data is read from the change log of the DataStore object.
    >For Extraction from the InfoCube, you have the following options:
    >InfoCube Tables
    >Data is only extracted from the database (E table and F table and aggregates).
    >Archive (Only Full Extraction)
    >The data is only read from the archive or from a near-line storage.
    Have you modified the default settings on the DTP? How is the DTP configured right now (or how was it configured before your testing)?
    Hope this helps,
    Francisco Milán.

  • Use of Error Stacks when loading to a DSO

    Hi all,
    Hope you can help with the below.
    I am loading from a flat file datasource to a DSO.
    I do not want to use the DTP option 'No further processing without master data' to identify master data errors, as this causes the load to fail should an error be encountered.
    I want to use the error stack instead, and need the errors in both the transactional data AND the master data to be moved to the error stack so that the load to the DSO can continue and the errors can be corrected later.
    When I load the file, which contains transactional and master data, the only error sent to the error stack is for the transactional data: a 'data field' set to overwrite in the mapping.
    However, all the key fields are master data and their errors are not being reported; they flow through to the DSO and are only reported when we try to activate the DSO.
    So how can I get master data (key field) errors to also be moved to the error stack?
    Many thanks.

    I think it's not possible.
    Check this doc http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10c6c05c-89c4-2c10-89bd-fc5ce1d124e2?quicklink=index&overridelayout=true
    It defines a way to handle these records.

  • Loading data to DSO in BI 7.0

    Hi,
    How do I load data into a DSO in BI 7.0? Is an InfoSource mandatory if we want to load data into a DSO? What about the 0RECORDMODE field for the DSO? How do I maintain the transformations? Please guide me through the BI 7.0 process.

    Hi Selva,
    InfoSource is not mandatory in BI 7.0...
    0RECORDMODE
    The field 0RECORDMODE is used in delta updates to pass the DataSource's record mode indicators from the OLTP system to the BI system.
    Questions and answers on InfoObject 0RECORDMODE
    SAP Note Number:[399739 |https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_bct/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d333939373339%7d]
    0RECORDMODE
    Re: Using 0RECORDMODE
    Transformation
    The transformation process allows you to consolidate, cleanse, and integrate data. You can semantically synchronize data from heterogeneous sources.
    When you load data from one BI object into a further BI object, the data is passed through a transformation. A transformation converts the fields of the source into the format of the target.
    http://help.sap.com/saphelp_nw04s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/frameset.htm
    Creating Transformations
    http://help.sap.com/saphelp_nw04s/helpdata/en/d5/da13426e48db2ce10000000a1550b0/frameset.htm
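    As a small illustration (not taken from the documentation above), a start routine in a transformation can cleanse the incoming data package before it reaches the DSO; the field name here is invented:

      " Start routine body (BW 7.x transformation), illustrative only:
      " drop records without a document number before they reach the DSO
      DELETE source_package WHERE /bic/zdocno IS INITIAL.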
    DataStore Object
    A DataStore object serves as a storage location for consolidated and cleansed transaction data or master data on a document (atomic) level.
    Unlike multidimensional data storage using InfoCubes, the data in DataStore objects is stored in transparent, flat database tables. The system does not create fact tables or dimension tables.
    http://help.sap.com/saphelp_nw04s/helpdata/en/f9/45503c242b4a67e10000000a114084/frameset.htm
    Scenario for Using Standard DataStore Objects (check the figure to see how DSO is integrated in data flow...)
    http://help.sap.com/saphelp_nw04s/helpdata/en/bd/c627c88f9e11d4b2c90050da4c74dc/frameset.htm
    Regards
    Andreas

  • How to add fields to already loaded cube or dso and how to fill records in

    How do I add fields to an already loaded cube or DSO, and how do I fill them? Can anyone tell me the critical issues in the data loading process?

    This is a sensitive task with regard to large volumes of data in InfoProviders.
    The issue is the reload of data in case of adjusted InfoProvider structures.
    Indeed there are some tricks. See the following:
    http://weblogs.sdn.sap.com/cs/blank/view/wlg/19300
    https://service.sap.com/sap/support/notes/1287382

  • Full load from a DSO to a cube processes less records than available in DSO

    We have a scenario where every Sunday I have to make a full load from a DSO with on-hand stock information to a cube, where I register a counter at material and store level if there is stock available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT and 0STOR_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete in the start routine all records where the inventory is not greater than zero, thus eliminating zero and negative inventory records (see the sketch below).
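    (Roughly, the start routine change described above would look like this; the quantity field name is hypothetical.)

      " Start routine body (BW 7.x), /BIC/ZSTOCKQTY is an invented field name
      DELETE source_package WHERE /bic/zstockqty LE 0.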
    Now comes the funny part of the story:
    Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same number of records. Of course, after the change we expected to write out fewer. To my total surprise I was now reading 45 million records with the same unchanged DTP, and writing out fewer records as expected.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the earlier loads only retrieved some 33 million from the same unchanged set of records.
    When checking in PROD, same result: we have some 45 million records in the DSO, but when we do the full load from the DSO to the cube the DTP only processes some 33 million.
    What am I missing? Is there a compression going on? Why would the number of records in a DSO differ from the number of records processed in the data packages when I am making a FULL load without any filter restrictions and only a semantic grouping in place for parts of the DSO key?
    ANY idea or thought is appreciated.

    Thanks Gaurav.
    I did check whether there were any loads done in between; there were none in the test system. As I mentioned, it was a new copy from PROD to TEST, and I compared the number of entries in the DSO; they seem to match between TEST and PROD (OK, a few more in PROD, but they can be accounted for). In TEST I loaded the day before the changes were imported, to have a comparison, and between that load and the one after the changes were imported nothing in the DSO was changed.
    Both DTPs in TEST and PW2 load from the activated DSO [without archive]. The DTPs have not been changed in quite a while, so I ruled that out. Same with activation of data in the DSO: this DSO gets loaded and activated in PROD daily via a process chain, and we load daily deltas into the cube in question. Only on Sundays, for the begin of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million records, but the difference between the number of records in the DSO and the number processed in the data packages is more than 10 million per full load, even in PROD.
    I really appreciate the knowledgeable answer; I just wish it had pointed out something that I missed.

  • Job cancelled While loading data from DSO to CUBE using DTP

    Hi All,
      While I am loading data from DSO to cube, the load job is getting cancelled.
    In the job overview I got the following messages:
        SYST: Date 00/01/0000 not expected.
        Job cancelled after system exception ERROR_MESSAGE
    What can be the reason for this error, as I have successfully loaded data into 2 layers of DSOs before loading the data into the cube?

    Hi,
    Are you loading the flat file to the DSO?
    Then check the date field of the data in the PSA and replace it with the correct date.
    I think you are using a Write-Optimized DSO, which is similar to the PSA; it will take all the data from the PSA into the DSO.
    So clear the data in the PSA, then load to the DSO and then to the cube.
    If you don't need the date field in the cube, remove the mapping in the cube's transformation, activate it, activate the DTP and trigger it; it will work.

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contained no requests at all. The cube uses "Delta Update". First the DSO was loaded with 138,300 records successfully, then I activated the DSO. Lastly, when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reason for this situation?
    Thank you so much!

    Hi BI User,
    For a delta upload into the data target, there must be an initialization request in the data target with the same selection criteria.
    So first do the initialization and then perform the delta upload into the cube.
    Regards,
    Subhashini.
