Disk throughput drops when inserting data packages in write-optimized DSO

Hi all,
we are currently testing our freshly installed SAN.
To see the performance gain in BI, I'm currently doing some test loads.
While monitoring those loads, I noticed something I'd like someone to explain :-):
I execute a DTP from PSA to a write-optimized DSO.
The n° of parallel processes = 9
Update method = serial extraction, immediate parallel processing
N° of records transferred: +23.000.000
OK, in the first phase (reading the PSA) only one process is used (serial extraction). When I look in OS07, I see very good throughput: over 66.000 TransfKB/s. Very nice!
But as soon as BI starts inserting the data packages and parallel processing kicks in, the throughput drops to around 4.000 KB/s, sometimes peaking at 20.000. That's not too good.
We have a massive SAN, but the BI system does not seem to use it?
I was wondering why this is the case. I already toyed around with the package size, but the result is always the same.
I also noticed that not all of the allocated processes seem to be active. I allocated 9 BTC processes to this load.
They are all occupied, but we see only 3 inserts running at the same time, max. In the DTP monitor, too, only 3 packages are processed at the same time. As it's a write-optimized DSO, I presume RSODSO_SETTINGS does not apply.
Any ideas?
tnx!

Hi,
can you please try to apply a filter in the DTP and pull the data again?
I am not sure why the first data package takes a long time while the other data packages take less time.
Do you have a start routine with logic like "IF datapak = 1 ... then do this"? See the sketch below.
Please check.
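For reference, this is the pattern such a check usually takes in a BW 7.x transformation start routine. This is only a sketch: datapackid is the standard importing parameter of the generated routine, and the guarded body stands in for whatever custom logic might be there. One-time work guarded like this runs only for the first data package, which would explain why package 1 runs much longer than the rest.

* Fragment of the generated start routine class - not a complete program
METHOD start_routine.
  IF datapackid = 1.
    " expensive one-off preparation, e.g. filling a global lookup table
    " that the remaining data packages then reuse
  ENDIF.
ENDMETHOD.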
regards
Gopal

Similar Messages

  • Error while deleting data from a write optimized DSO using a Process Chain?

    Dear Bwers,
    I am facing a strange error while using process chains to delete data from a data target which is a write-optimized DSO. The process shows as failed, but the data does get deleted in the DSO. The error message is below. Did anybody have a similar problem? Any suggestions?
    Thanks
    Raj
    Error while deleting content of InfoCube/DataStore object ZLSD_G03
    Message no. RSDODSO153

    Please check whether you get any short dump in ST22 related to this issue.

  • Data archiving for Write Optimized DSO

    Hi Gurus,
    I am trying to archive data in a write-optimized DSO.
    It allows me to archive on a request basis, but it archives the entire requests in the DSO (i.e. all the data).
    What I want is to archive only a selected range of requests (a request selection of my own).
    Please guide me.
    I found the details below on SDN; kindly check.
    Archiving for write-optimized DSOs follows request-based archiving, as opposed to the time-slice archiving of a standard DSO. This means that partial request archiving is not possible; only complete requests can be archived.
    The characteristic for the time slice can be a time characteristic present in the WO DSO, or the request creation date/request loading date. You are not allowed to add additional InfoObjects for semantic groups; the default is 0REQUEST & 0DATAPAKID.
    The actual process of archiving remains the same, i.e.:
    1. Create a Data Archiving Process
    2. Create and schedule archiving requests
    3. Restore archiving requests (optional)
    Regards,
    kiruthika

    Hi,
    Please check the OSS Note below:
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31313338303830%7d
    -Vikram

  • Write-Optimized DSO data deletion

    Hello All,
    We have a requirement to delete old data from a write-optimized DSO. The specific requirement is to delete data older than 15 days (requests older than 15 days) from the DSO. I could not find any process type that could be used in process chains to automatically delete the old data based on our settings (similar to the one used for PSA or change-log deletion). Has any of you come across a process type or a batch job that can be scheduled to delete old data in a write-optimized DSO? Your help is much appreciated.
    Thanks,
    Veera

    Hi Nagesh,
    Thanks for your answer. But the use of these function modules would still require custom development to identify the requests to be deleted. We are trying to see if SAP has any standard process for deleting the old requests out of a write-optimized DSO, or if anyone was successful in achieving this with the least amount of custom development.
    Thanks,
    Veera
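    As a rough illustration of the kind of custom development Veera mentions, the identification step could look like the sketch below. This is only a sketch: the catalogue tables RSICCONT / RSREQDONE and their fields are assumptions taken from typical BW custom code, not from this thread, and must be verified in your release before building on them.

    * Sketch only - table and field names are assumptions, verify before use
    REPORT z_list_old_wodso_requests.

    CONSTANTS: c_dso  TYPE rsinfocube VALUE 'ZWODSO01', " hypothetical WO DSO name
               c_days TYPE i VALUE 15.

    DATA: lv_cutoff TYPE sy-datum,
          lt_cont   TYPE STANDARD TABLE OF rsiccont,
          ls_cont   TYPE rsiccont,
          ls_done   TYPE rsreqdone.

    lv_cutoff = sy-datum - c_days.

    " Requests currently posted to the data target (assumed table RSICCONT)
    SELECT * FROM rsiccont INTO TABLE lt_cont
      WHERE icube = c_dso.

    LOOP AT lt_cont INTO ls_cont.
      " Load date of the request (assumed field DATUM in RSREQDONE)
      SELECT SINGLE * FROM rsreqdone INTO ls_done
        WHERE rnr = ls_cont-rnr.
      IF sy-subrc = 0 AND ls_done-datum < lv_cutoff.
        WRITE: / ls_cont-rnr, ls_done-datum. " candidate for deletion
      ENDIF.
    ENDLOOP.

    Deleting the identified requests would then still need a standard deletion call or report, as discussed in the process-chain thread further down.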

  • Multiple data loads in PSA with write optimized DSO objects

    Dear all,
    Could someone tell me how to deal with this situation?
    We are using write-optimized DSO objects in our staging area. These DSOs are filled with full loads from a BOB SAP environment.
    The content of these DSO objects is deleted before loading, but we would like to keep the data in the PSA for error tracking and analysis. This also gives us the opportunity to see the differences between two data loads.
    For normal operation, the most recent package in the PSA should be loaded into these DSO objects (as in normal data staging in BW 3.5 and before).
    As far as we can see, it is not possible to load only the most recent data into the staging layer. This causes duplicate record errors when there are multiple data loads in the PSA.
    We already tried the "all new records" functionality in the DTP, but that only loads the oldest data package and does not process the new PSA loads.
    Does any of you have a solution for this?
    Thanks in advance.
    Harald

    Hi Ajax,
    I did think about this, but it is more of a workaround. Call me naive, but it should work as it did in BW 3.5!
    The proposed solution will require a lot of maintenance afterwards. Besides that, you also get a problem with changing PSA IDs. If you use the option to delete the content of a PSA table via the process chain, it will fail when the DataSource is changed, because a new PSA table ID is generated.
    Regards,
    Harald

  • Changes to write optimized DSO containing huge amount of data

    Hi Experts,
    We have appended two new fields to a DSO containing a huge amount of data (the new InfoObjects are an amount and a currency).
    We are able to make the changes in Development (with the DSO containing data). But when we try to transport the changes to our QA system, the transport hangs. The transport triggers a job that fills up the logs, so we have to kill the job, which aborts the transport.
    Has anyone of you had the same experience? Do we need to empty the DSO so we can transport successfully? We really don't want to empty the DSOs, as reloading them will take time.
    Any help?
    Thank you very much for your help.
    Best regards,
    Rose

    Emptying the DSO should not be necessary, neither for a normal DSO nor for a write-optimized DSO.
    What is in the logs; some sort of conversion for all the records?
    Marco

  • Duplication Error while loading data in write optimized DSO

    Hi Experts,
    I have an issue. In BI 7, I'm trying to load data into a write-optimized DSO from the Controlling DataSource 0CO_OM_CCA_10. I'm getting the data properly in the PSA, but while loading it into my WO DSO I'm getting a duplication error, although my key fields and data fields are properly placed in my data target (WO DSO).
    Please let me know what the solution is to load it successfully.
    Thanks in Advance.
    Amit

    Hi,
    thanks for your reply
    I'm getting this error message:
    Diagnosis
        During loading, there was a key violation. You tried to save more than
        one data record with the same semantic key.
        The problematic (newly loaded) data record has the following properties:
        o   DataStore object: GWFDSR02
        o   Request: DTPR_4BA3GF8JMQFVQ8YUENNZX3VG5
        o   Data package: 000006
        o   Data record number: 101
    Although I have selected the key fields that identify a unique record, I'm still getting the duplication error.
    I have even referred to the BI Content for this DataSource and found that it has the same key fields as mine.
    Debjani: I need unique records without duplication, and I'm doing a full load in the DTP.
    What is to be done? Please help.
    Thanks in advance.
    Edited by: Amit Kotwani on Sep 26, 2008 10:56 AM
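    One possible workaround, not suggested in this thread, is to drop duplicates in the DTP's start routine before the records reach the WO DSO. The sketch below assumes the usual BW 7.x start routine interface (SOURCE_PACKAGE); COSTCENTER and FISCPER are placeholder field names standing in for whatever the actual semantic key is. Note that this only removes duplicates within one data package; duplicates across packages would still trip the semantic-key check.

    * Fragment of the generated start routine class - field names are placeholders
    METHOD start_routine.
      SORT SOURCE_PACKAGE BY costcenter fiscper.
      DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE
             COMPARING costcenter fiscper.
    ENDMETHOD.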

  • Duplicate Error while loading data to Write Optimized DSO

    Hi,
    When I do a data load for a write-optimized DSO, I am getting the error "Duplicate Data Record Detected". I have Sales Document Number, Fiscal Year Variant & Billing Item as the semantic key in the DSO.
    For this DSO, I am getting data from a test ECC system, in which the Sales Document Number column is mostly blank for this DataSource.
    When I go into the error stack of the DSO, all the rows where the Sales Document Number is blank are displayed. For all these rows, the Item Number is 10.
    Am I getting this duplicate error because the Sales Document Number is blank and the Item Number is 10 for all of them? I read in threads that a write-optimized DSO doesn't care about the key values; it loads the data even if the key values are the same.
    Any help is highly appreciated.
    Regards,
    Murali

    Hi Murali,
    Is the Item Number a key field?
    When all the key fields are the same, the data gets aggregated depending on the setting made in the transformation for the key figures. The two options for key figures are:
    1. Add up the key figures
    2. Replace the key figure
    Since the Sales Document No. is blank and the Item Number is the same, there is a possibility that the key figures for these records get added up or replaced, and due to this property of the SDSO it might not throw an error.
    Check the KF value in the SDSO for that Sales Doc No. and Item No. and try to find out what the KF value is. It may be the sum of all the common records or the KF value of the last common record.
    Regards
    Raj Rai

  • Load Data with 7.0 DataSource from Flat File to Write Optimized DSO

    Hi all,
    we have a problem loading data from a flat file using a 7.0 DataSource.
    We have to load a flat file (monthly) into a WO DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads the data in delta mode from the DataSource into the WO DSO.
    When I load the second file into the DataSource, the DTP loads all the data present in the DataSource and not only the new request, as expected with delta mode.
    Has anyone any tips to help me?
    Thank you for help.
    Regards
    Emiliano

    Hi,
    I am facing a similar problem.
    I am using a write-optimized DSO and I have only 1 request in the PSA (I have deleted all previous requests from the PSA and the DSO).
    When I do a delta load from the PSA to the DSO, I expect to see only that 1 request get loaded into the DSO.
    But it is picking up the data from 3 other requests and doubling the records...
    Can you please help me - how did you manage to get out of that issue?
    Cheers,
    Nisha

  • Unable to delete data target contents of Write-Optimized DSO in Process Chain

    Hi Experts,
    We are using SAP NetWeaver BW 7.01, and we need to delete the entire data target contents of a write-optimized DSO in the process chain before the next data load.
    I included this step in the process chain, but it still fails with the error message "Message not found (in main memory), Drop Cube Failed In Data Target".
    This process type was working in BW 7.0 but not in BW 7.01.
    However, I found that we can use the program RSSM_DELETE_WO_DSO_REQUESTS to delete old requests in the write-optimized DSO as of BW 7.01 SP07, per SAP Note 1437407. But it is still not working even after implementing this program, because the prerequisite for deleting a request is that its data mart status is updated, which does not happen here.
    There is a process type to 'delete the requests from Write-Optimized DSO' directly in BW 7.3, but it is still not available in 7.01.
    Could you please suggest me on how to resolve this issue in BW 7.01?
    Many thanks for your help in advance.
    Regards,
    Madhu

    Create an ABAP program as in the attached code.
    Then you can use that ABAP program in process chains through an ABAP process variant.
    The ABAP variant should have the following properties:
    Select call mode "Synchronous", called from "Local", and type "Program".
    Give your ABAP program name in "Program Name" and create one program variant for each write-optimized DSO.
    Please refer to the documentation on how to use ABAP programs in process chains for further details.
    Hope this helps
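    The attachment itself is not included in this thread, but a program of that kind would roughly follow the sketch below. Everything here is an assumption rather than the author's actual code: the request table RSICCONT and the function module RSSM_DELETE_REQUEST are taken from common BW custom developments and must be verified in your release.

    * Sketch only - not the attachment from the original post
    REPORT z_wodso_drop_all_requests.

    " One process-chain variant per write-optimized DSO supplies this parameter
    PARAMETERS p_dso TYPE rsinfocube OBLIGATORY.

    DATA: lt_cont TYPE STANDARD TABLE OF rsiccont,
          ls_cont TYPE rsiccont.

    " Every request currently posted to the data target (assumed table RSICCONT)
    SELECT * FROM rsiccont INTO TABLE lt_cont
      WHERE icube = p_dso.

    LOOP AT lt_cont INTO ls_cont.
      " Assumed standard FM that removes one request from a data target
      CALL FUNCTION 'RSSM_DELETE_REQUEST'
        EXPORTING
          request  = ls_cont-rnr
          infocube = p_dso.
    ENDLOOP.

    Hooked into the chain as an ABAP process (call mode Synchronous, called from Local, type Program), with one variant per DSO supplying p_dso, this gives the behaviour described above.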

  • Problem loading data into write optimized dso.....

    Hi,
    I am having a problem loading data from the PSA to a write-optimized DSO.
    I have changed a standard DSO into a write-optimized DSO. I have two DataSources to be loaded into the write-optimized DSO: one for demand and one for inventory.
    The loading of the demand data from the PSA to the DSO works fine without any error.
    But while loading the inventory data from the PSA to the DSO, I get the errors below:
    "Data Structures were changed. Start Transaction before hand"
    Exception CX_RS_FAILED logged
    I have tried reactivating the DSO, the transformation, and the DTP, and then loading the data into the write-optimized DSO again, but no luck; I always get the above error message.
    Can some one please suggest me what could be done to avoid the above error messages and load the data successfully.
    Thanks in advance.

    Hi,
    Check the transformation: are there any routines written in it?
    Check the data structure of the target and the DataSource as well.
    Are there any changes to the structure?
    Data will load up to the PSA normally; if there are changes to the structure of the DSO, this error may occur. Just check it.
    Check the blog below:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    Let us know the status.
    Reg
    Pra

  • SSIS package takes longer time when inserting data into temp tables

    Querying records from one server and inserting them into temp tables is taking a long time.
    Are there any settings in the package that would enhance the performance?

    Will a local temp table (#temp) enhance the performance?
    If you're planning to use # tables in SSIS, make sure you read this:
    http://consultingblogs.emc.com/jamiethomson/archive/2006/11/19/SSIS_3A00_-Using-temporary-tables.aspx
    Visakh

  • Error message "disk is locked" when inserting blank DVD

    My G4 is supposed to be able to burn a DVD, but when I insert the sample DVD that came with the computer when it was new, I keep getting the "disk is locked" message. It also happens with other DVDs that I bought specifically to use with the G4. I would appreciate your suggestions.
    G4   Mac OS 9.2.x  

    VenturaGutek
    Any optical drive software that is worth its salt should enable burning of data discs (4.7GB or 8.5GB in the case of DVDs) as well as of specialized file formats. That said, however, one of the commonest reasons for failure to burn, even failure to recognize an optical drive, is extension conflict. This was especially true of Roxio's Toast v5.x, when using which it was necessary to disable all Apple extensions, USB and FireWire, so that control of the drive could be taken by Toast extensions. I had to do this only recently to get a friend's LaCie DVD-R burner to be recognized as a DVD (rather than CD) burner by a B&W under OS 9.2.2, let alone to burn DVDs. That possibility was the reason for asking about your software. However, if you are using only Apple software, there should be no issue of software conflicts of this nature.
    Without wishing or intending to fob you off, I suggest that you will get much more up-to-the-moment guidance from active fellow-sufferers than is likely in this antediluvian category if you approach them in the iDVD specialist forums, which are divided into iDVD 6, 5, and 4 and under.
    Apple IIe; 68K: 11DT + 4PB; PPC: 5DT + 3PB; G3: 6DT     System 6.0.8 to OS 10.4.x

  • SQL API. Database corrupted when inserting data?

    Hi, I have a working sqlite application I was going to migrate to BerkeleyDB to try and solve some locking issues. However I am having some trouble creating the database.
    I downloaded the Windows pre-built binaries and created my schema with dbsql.exe. The DDL worked fine and running db_verify reported no errors. However, after I inserted about twenty rows of data using a SQL script of INSERT commands, db_verify reported about ten "out-of-order key" messages. There are no INSERT triggers in the DDL. I tried running the INSERTs as single commands, and the out-of-order key messages appeared after about ten rows of data. A bit of googling only revealed that these errors often occur when using tools linked to different BerkeleyDB versions. I have never had any version of BerkeleyDB installed before.
    I then downloaded the BerkeleyDB source and built the binaries myself. I again created the schema - no problems, inserted about twenty rows - errors the same as before. When running db_recover the error messages are as below:
    C:\Program Files (x86)\Oracle\Berkeley DB 11gR2 5.1.25\bin\SFTPVT.BDB-journal>..\db_recover.exe
    db_recover.exe: Ignoring log file: log.0000000001: magic number 1563f1a3, not 40988
    db_recover.exe: Invalid log file: log.0000000001: Invalid argument
    db_recover.exe: PANIC: Invalid argument
    db_recover.exe: process-private: unable to find environment
    db_recover.exe: DB_ENV->open: DB_RUNRECOVERY: Fatal error, run database recovery
    I thought the issues might be because I use encrypted database but the same things happen whether I use PRAGMA key="****************" in my scripts or not. I haven't played with DB_CONFIG - should I be creating a DB_CONFIG yet?
    I'm using Win 7 Pro (x64), an i5 processor with 8GB RAM; the disks are a RAID10 array formatted as NTFS. The same errors occur if I try to create the database on a FAT32 USB key, so it's probably not a filesystem problem. I used Visual Studio 10 and MSVCRT version 10 when building the binaries myself. I notice the downloaded binaries were built against MSVCRT version 8, so the problem doesn't look tied to the version of the VC runtime, as it occurs on multiple versions.
    Can anybody help please? I would like to try BerkeleyDB but can't get past the first hurdle.
    Edited by: 856299 on 03-May-2011 10:46

    Hello,
    If you can provide a small stand-alone program to reproduce the problem along with the steps to reproduce it, I will give it a try.
    Thank you,
    Sandra

  • Best Practice for ViewObjects when inserting data through pl/sql procedure

    My application is an Oracle Forms-based enterprise-level application, and we are now developing a new module in ADF 11g, but there is a restriction that all data insertion, update, and deletion must go through Oracle PL/SQL procedures. Now my question is whether the ADF pages should be bound to ViewObjects based on Entity Objects, or to ViewObjects not based on an Entity Object or SQL query. Currently I have developed the pages with programmatic ViewObjects that are based neither on Entity Objects nor on a SQL query. In those ViewObjects I create transient attributes and then use them to build the ADF pages. Then, on save, I extract the data from the ViewObject's current row and pass it to the procedure. This is working fine, but I'm just wondering whether this approach is OK or whether there is a better alternative. Ideally I would like to create ViewObjects based on Entity Objects, but I cannot find any way to synchronize Entity Objects with data inserted through procedures.

    Hi,
    I create an EO for the database view and override the doDML() method. For insert, update and delete I call the PL/SQL functions.
    See "38.5 Basing an Entity Object on a PL/SQL Package API" in Oracle® Fusion Middleware Fusion Developer's Guide for Oracle Application Development
    Framework.
