Preparing the initial data load

Hello again,
What is the best approach to delete all the test data in our SoD environment: batch delete or via web services? Any best practices, experiences and suggestions are welcome...
Thanks in advance
Michael

Hi Michael,
Batch delete is good enough, but note that it is not available for custom objects.
-- Venky CRMIT

Similar Messages

  • OIM Initial Data Load - Suggestion Please

    Hi,
    I have the following scenario :
    1. My client currently has 2 systems: 1. AD and 2. an HR application.
    2. The HR application has only employee information (first name, last name, employee number, etc.).
    3. AD has both employee and contractor information.
    4. The client wants the HR application to be the trusted source, but the IDM login ID of existing users should be the same as the AD sAMAccountName.
    I am using OIM 9.0 and the 9041 connector pack. What would be the best way to do the initial data loading in this case? Thanks in advance.

    Hi,
    Can you tell me how you relate an employee in HR to the corresponding record in AD? Then I will be in a better position to explain how you can do it.
    But even without this information, the following approach will solve your requirement:
    1. Do the trusted recon from AD.
    2. samAccountName will be mapped to the user ID field of the OIM profile.
    3. Do the trusted recon with HR. The matching key should be the answer to my question above.
    4. For the trusted recon with HR, remove the action "Create User" on the No Match Found event.
    Hope this helps.
    Regards
    Nitesh

  • SAP ME 5.2 initial data load problem

    Hi
    I installed SAP ME 5.2 SP3 and now the initial data load script gives me a long error. Any ideas?
    Enter password for admin user:
    Confirm password for admin user:
    Error while processing XML message <?xml version="1.0" encoding="UTF-8"?>
    <DX_RESPONSE>
    <CONCLUSION success='no'/>
    <IMPORT_RESPONSE>
    <TOTALS failed='2' total='4' invalid='0' succeeded='2'/>
    <LOG>
            1 records skipped; update is not allowed
            (the line above is repeated 37 times in total)
            ALARM_TYPE_CONFIG_BO processed
            Database commit occurred
            1 records skipped; update is not allowed
            (the line above is repeated 30 times in total)
            ALARM_CLEAR_CONFIG_BO processed
            Database commit occurred
            ALARM_PROP_BO failed (SITE:*, PROP_ID:1)
            Exception occurred: (Failed in component: sap.com/me~ear) Exception raised from invocation of
            public void com.sap.me.frame.BasicBOBean.DXImportCreate(com.sap.me.frame.Data,com.sap.me.frame.Data,boolean)
            throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.alarm.AlarmPropBOBean@143f363e
            for bean sap.com/me~ear*xml|me.alarm.ejb-5.2.3.1-Base.jar*xml|AlarmPropBO in application sap.com/me~ear.;
            nested exception is: javax.ejb.EJBException: (Failed in component: sap.com/me~ear) Exception raised from
            invocation of public void com.sap.me.dbsequence.DBSequenceBOBean.adjustSequence(java.lang.String,java.lang.String,long)
            throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.dbsequence.DBSequenceBOBean@712676c3
            for bean sap.com/me~ear*xml|me.dbsequence.ejb-5.2.3.1-Base.jar*xml|DBSequenceBO in application sap.com/me~ear.;
            nested exception is: javax.ejb.EJBException: nested exception is:
            com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'TABLE'.
            (the same nested exception chain is then repeated once more)
            Database rollback occurred
            ALARM_FILTER_BO failed (SITE:*, FILTER_ID:11)
            Exception occurred: (Failed in component: sap.com/me~ear) Exception raised from invocation of
            public void com.sap.me.frame.BasicBOBean.DXImportCreate(com.sap.me.frame.Data,com.sap.me.frame.Data,boolean)
            throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.alarm.AlarmFilterBOBean@4e101e8b
            for bean sap.com/me~ear*xml|me.alarm.ejb-5.2.3.1-Base.jar*xml|AlarmFilterBO in application sap.com/me~ear.;
            nested exception is: javax.ejb.EJBException: (Failed in component: sap.com/me~ear) Exception raised from
            invocation of public void com.sap.me.dbsequence.DBSequenceBOBean.adjustSequence(java.lang.String,java.lang.String,long)
            throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.dbsequence.DBSequenceBOBean@75bb17af
            for bean sap.com/me~ear*xml|me.dbsequence.ejb-5.2.3.1-Base.jar*xml|DBSequenceBO in application sap.com/me~ear.;
            nested exception is: javax.ejb.EJBException: nested exception is:
            com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'TABLE'.
            (the same nested exception chain is then repeated once more)
            Database rollback occurred
            Database commit occurred
    </LOG>
    </IMPORT_RESPONSE>
    </DX_RESPONSE> (Message 12943)
    Press any key to continue . . .
    Jennsen

    Hello,
    I think SAP ME is trying to work with DB using Oracle syntax while you have a MS SQL. This can be because of improper -Ddb.vendor setting. Please, refer to Installation Guide for SAP ME 5.2, section 4.1, step 12 (or near that). There should be something like this:
    12. Expand the template node and choose the instance node located in the navigation tree. Choose the
    VM Parameters and Additional tabs containing JVM settings and add the following settings:
    -Dsce.home=set to the AppServer directory specified during installation
    -Ddb.vendor=ORACLE | SQLSERVER
    -Dvm.bkg.processor=true
    Check this settings in your installation, maybe it matters.
    Anton
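
    Given that the exception above comes from the Microsoft JDBC driver, the relevant line for this installation would presumably need to read as follows (my reading of the ORACLE | SQLSERVER alternatives quoted from the guide, not a verified fix):
    -Ddb.vendor=SQLSERVER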

  • xEM - Unable to load the initial data or the variances (delta) data into the system

    I am installing xEM 2.0 SP 10 (SAP xApp Emissions Management) in a Windows environment with SQL 5000.  I installed xEM on NW 2004, usage types AS Java and EP 6.
    I am attempting to load the initial data or the variances (delta) data into the system.  The instructions are on page 15 of the install guide.
    I am supposed to enter the following on the command line:
    java -Djava.ext.dirs=. -jar SAPtrans.jar import logfile=import.log datafile=init.dat connectstring=[JDBC Driver];[JDBCUrl];[User];[Password]
    Example command for import into SQL Server:
    java -Djava.ext.dirs=. -jar SAPtrans.jar import logfile=import.log datafile=init.dat connectstring=com.ddtek.jdbc.sqlserver.SQLServerDriver; jdbc:datadirect:sqlserver://vma03:1433;SAPC11DB;password
    The customer I am with is running the xEM database on a different instance.  This is where I run into a problem.  I am not sure how to specify the instance in the script.  This is what  I have attempted so far:
    C:\>cd temp\load
    C:\Temp\load>java -Djava.ext.dirs=. -jar SAPtrans.jar import logfile=import.log datafile=init.dat connectstring=com.ddtek.jdbc.sqlserver.SQLServerDriver;jdbc:datadirect:sqlserver://PRODSQL43:SQL3:1534;SAPPEMDB;password
    java.lang.Exception: ERROR: Cannot connect to the database:ERROR: Connect failed to database jdbc:datadirect:sqlserver://PRODSQL43:SQL3:1534 as user SAPPEMDB (java.sql.SQLException/[DataDirect][SQLServer JDBC Driver]Unable to connect.  Invalid URL.): [DataDirect][SQLServer JDBC Driver]Unable to connect.  Invalid URL.
            at com.sap.sdm.util.dbaccess.DBTask.dbImport(DBTask.java:356)
            at com.sap.sdm.util.dbaccess.SapTransTask.perform_import(SapTransTask.java:293)
            at com.sap.sdm.util.dbaccess.SapTransTask.execute(SapTransTask.java:51)
            at com.sap.sdm.util.dbaccess.SapTrans.main(SapTrans.java:21)
    import aborted with java.lang.Exception: ERROR: Cannot connect to the database:ERROR: Connect failed to database jdbc:datadirect:sqlserver://PRODSQL43:SQL3:1534 as user SAPPEMDB (java.sql.SQLException/[DataDirect][SQLServer JDBC Driver]Unable to connect.  Invalid URL.): [DataDirect][SQLServer JDBC Driver]Unable to connect.  Invalid URL.
    C:\Temp\load>java -Djava.ext.dirs=. -jar SAPtrans.jar import logfile=import.log datafile=init.dat connectstring=com.ddtek.jdbc.sqlserver.SQLServerDriver;jdbc:datadirect:sqlserver://PRODSQL43;SQL3:1534;SAPPEMDB;password
    java.lang.Exception: ERROR: Cannot connect to the database:ERROR: Connect failed to database jdbc:datadirect:sqlserver://PRODSQL43 as user SQL3:1534 (java.sql.SQLException/[DataDirect][SQLServer JDBC Driver]Error establishing socket. Connection refused: connect): [DataDirect][SQLServer JDBC Driver]Error establishing socket. Connection refused: connect
            at com.sap.sdm.util.dbaccess.DBTask.dbImport(DBTask.java:356)
            at com.sap.sdm.util.dbaccess.SapTransTask.perform_import(SapTransTask.java:293)
            at com.sap.sdm.util.dbaccess.SapTransTask.execute(SapTransTask.java:51)
            at com.sap.sdm.util.dbaccess.SapTrans.main(SapTrans.java:21)
    import aborted with java.lang.Exception: ERROR: Cannot connect to the database:ERROR: Connect failed to database jdbc:datadirect:sqlserver://PRODSQL43 as user SQL3:1534 (java.sql.SQLException/[DataDirect][SQLServer JDBC Driver]Error establishing socket. Connection refused: connect): [DataDirect][SQLServer JDBC Driver]Error establishing socket. Connection refused: connect
    C:\Temp\load>java -Djava.ext.dirs=. -jar SAPtrans.jar import logfile=import.log datafile=init.dat connectstring=com.ddtek.jdbc.sqlserver.SQLServerDriver;jdbc:datadirect:sqlserver://PRODSQL43:1534;SQL3;SAPPEMDB;password
    java.lang.Exception: ERROR: Cannot connect to the database:ERROR: Connect failed to database jdbc:datadirect:sqlserver://PRODSQL43:1534 as user SQL3 (java.sql.SQLException/[DataDirect][SQLServer JDBC Driver][SQLServer]Login failed for user 'SQL3'.): [DataDirect][SQLServer JDBC Driver][SQLServer]Login failed for user 'SQL3'.
            at com.sap.sdm.util.dbaccess.DBTask.dbImport(DBTask.java:356)
            at com.sap.sdm.util.dbaccess.SapTransTask.perform_import(SapTransTask.java:293)
            at com.sap.sdm.util.dbaccess.SapTransTask.execute(SapTransTask.java:51)
            at com.sap.sdm.util.dbaccess.SapTrans.main(SapTrans.java:21)
    import aborted with java.lang.Exception: ERROR: Cannot connect to the database:ERROR: Connect failed to database jdbc:datadirect:sqlserver://PRODSQL43:1534 as user SQL3 (java.sql.SQLException/[DataDirect][SQLServer JDBC Driver][SQLServer]Login failed for user 'SQL3'.): [DataDirect][SQLServer JDBC Driver][SQLServer]Login failed for user 'SQL3'.
    C:\Temp\load>java -Djava.ext.dirs=. -jar SAPtrans.jar import logfile=import.log datafile=init.dat connectstring=com.ddtek.jdbc.sqlserver.SQLServerDriver;jdbc:datadirect:sqlserver://PRODSQL43:1534:SQL3;SAPPEMDB;password
    java.lang.Exception: ERROR: Cannot connect to the database:ERROR: Connect failed to database jdbc:datadirect:sqlserver://PRODSQL43:1534:SQL3 as user SAPPEMDB (java.sql.SQLException/[DataDirect][SQLServer JDBC Driver]Unable to connect.  Invalid URL.): [DataDirect][SQLServer JDBC Driver]Unable to connect.  Invalid URL.
            at com.sap.sdm.util.dbaccess.DBTask.dbImport(DBTask.java:356)
            at com.sap.sdm.util.dbaccess.SapTransTask.perform_import(SapTransTask.java:293)
            at com.sap.sdm.util.dbaccess.SapTransTask.execute(SapTransTask.java:51)
            at com.sap.sdm.util.dbaccess.SapTrans.main(SapTrans.java:21)
    import aborted with java.lang.Exception: ERROR: Cannot connect to the database:ERROR: Connect failed to database jdbc:datadirect:sqlserver://PRODSQL43:1534:SQL3 as user SAPPEMDB (java.sql.SQLException/[DataDirect][SQLServer JDBC Driver]Unable to connect.  Invalid URL.): [DataDirect][SQLServer JDBC Driver]Unable to connect.  Invalid URL.
    C:\Temp\load>
    My last attempt was a command using colons and semicolons, with the results above.  The closest (there was a significant delay before the error or failure) appears to have been //PRODSQL43;SQL3:1534; (second attempt).  The error from this attempt is "Error establishing socket. Connection refused: connect".
    I also checked the default database that user SAPPEMDB has in place, and it is assigned the correct database.
    Please help.
            Mike Smayrabunya

    Hey,
    It looks like one of the following:
    1. The DB is down.
    2. The user SAPPEMDB does not have the right authorization.
    3. The password of the user SAPPEMDB is not "password".
    4. The syntax is incorrect.
    In order to find the problem, please:
    1. Log in to the DB PRODSQL43:1534 with the user "SAPPEMDB" and the password "password"; this will eliminate options 1 (DB down), 2 (SAPPEMDB does not have authorization) and 3 (the password of SAPPEMDB is not "password").
    2. If the login fails, then run an SQL trace with security elements (on the client there is a tool called "SQL Profiler").
    3. If the login is correct, then check the syntax of the command:
    "java -Djava.ext.dirs=. -jar SAPtrans.jar import logfile=import.log datafile=init.dat connectstring=com.ddtek.jdbc.sqlserver.SQLServerDriver; jdbc:datadirect:sqlserver://vma03:1433;SAPC11DB;password"
    According to the error message "Error establishing socket. Connection refused",
    it looks like the DB is down or the syntax is incorrect.
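
    One detail worth pinning down is how to address the named instance SQL3. A named SQL Server instance ultimately listens on its own TCP port, so a sketch of a connect string that targets the port directly and drops the instance name would be (assuming 1534 really is the port the SQL3 instance listens on; verify it in the SQL Server network configuration on PRODSQL43):
    java -Djava.ext.dirs=. -jar SAPtrans.jar import logfile=import.log datafile=init.dat connectstring=com.ddtek.jdbc.sqlserver.SQLServerDriver;jdbc:datadirect:sqlserver://PRODSQL43:1534;SAPPEMDB;password
    This is the shape of the third attempt above minus the stray SQL3 token; that attempt was the only one to reach an actual login ("Login failed for user 'SQL3'"), which suggests host:port is the part of the URL the driver accepted.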

  • How to identify the maximum data loads done for a particular day?

    Hi all,
    A huge volume of data was loaded last Monday, and then the data was deleted from the cubes on the same day. Hence I need to see which cubes were loaded with a lot of records on that particular day.
    I looked at RSMO, but I am unable to see the ODS or cube data loads.
    Where do I see it?
    Thanks

    See if the table RSSELDONE helps. This will give you the recent data load details. Based on those loads, you can search the targets.
    Also check table TBTCO, which will give the latest job details. You will have to analyze those jobs to know what loads were done. Give a selection for the date.

  • 2lis_03_um initial data load

    Dear Consultants,
    Our R/3 system is very busy, so I don't want to stop the R/3 system to extract the stock data. Is there any way to do this? I mean, can I extract the initial data while the R/3 system is running? I am worried about losing data.
    Best regards

    Hi,
    In case of the V3 update or Queued Delta update methods, we have to lock the users only while we are filling the setup tables. But in case of the Direct Delta update mode, we need to lock the R/3 users until we complete both filling the setup tables and the delta init.
    For more information on all these methods, take a look at note 505700:
    1. Why should I no longer use the "Serialized V3 update" as of PI 2002.1 or PI-A 2002.1?
    The following restrictions and problems with the "serialized V3 update" resulted in alternative update methods being provided with the new plug-in and the "serialized V3 update" update method no longer being provided with an offset of 2 plug-in releases.
    a) If, in an R/3 System, several users who are logged on in different languages enter documents in an application, the V3 collective run only ever processes the update entries for one language during a process call. Subsequently, a process call is automatically started for the update entries from the documents that were entered in the next language. During the serialized V3 update, only update entries that were generated in direct chronological order and with the same logon language can therefore be processed. If the language in the sequence of the update entries changes, the V3 collective update process is terminated and then restarted with the new language.
    For every restart, the VBHDR update table is read sequentially on the database. If the update tables contain a very high number of entries, it may require so much time to process the update data that more new update records are simultaneously generated than the number of records being processed.
    b) The serialized V3 update can only guarantee the correct sequence of extraction data in a document if the document has not been changed twice in one second.
    c) Furthermore, the serialized V3 update can only ensure that the extraction data of a document is in the correct sequence if the times have been synchronized exactly on all system instances, since the time of the update record (which is determined using the locale time of the application server) is used in sorting the update data.
    d) In addition, the serialized V3 update can only ensure that the extraction data of a document is in the correct sequence if an error did not occur beforehand in the U2 update, since the V3 update only processes update data, for which the U2 update is successfully processed.
    2. New "direct delta" update method:
    With this update mode, extraction data is transferred directly to the BW delta queues every time a document is posted. In this way, each document posted with delta extraction is converted to exactly one LUW in the related BW delta queues.
    If you are using this method, there is no need to schedule a job at regular intervals to transfer the data to the BW delta queues. On the other hand, the number of LUWs per DataSource increases significantly in the BW delta queues because the deltas of many documents are not summarized into one LUW in the BW delta queues as was previously the case for the V3 update.
    If you are using this update mode, note that you cannot post any documents during delta initialization in an application from the start of the recompilation run in the OLTP until all delta init requests have been successfully updated in BW. Otherwise, data from documents posted in the meantime is irretrievably lost.
    The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method.
    This update method is recommended for the following general criteria:
    a) A maximum of 10,000 document changes (creating, changing or deleting documents) are accrued between two delta extractions for the application in question. A (considerably) larger number of LUWs in the BW delta queue can result in terminations during extraction.
    b) With a future delta initialization, you can ensure that no documents are posted from the start of the recompilation run in R/3 until all delta-init requests have been successfully posted. This applies particularly if, for example, you want to include more organizational units such as another plant or sales organization in the extraction. Stopping the posting of documents always applies to the entire client.
    3. The new "queued delta" update method:
    With this update mode, the extraction data for the affected application is compiled in an extraction queue (instead of in the update data) and can be transferred to the BW delta queues by an update collective run, as previously executed during the V3 update.
    Up to 10,000 delta extractions of documents to an LUW in the BW delta queues are cumulated in this way per DataSource, depending on the application.
    If you use this method, it is also necessary to schedule a job to regularly transfer the data to the BW delta queues ("update collective run"). However, you should note that reports delivered using the logistics extract structures Customizing cockpit are used during this scheduling. There is no point in scheduling with the RSM13005 report for this update method since this report only processes V3 update entries. The simplest way to perform scheduling is via the "Job control" function in the logistics extract structures Customizing Cockpit. We recommend that you schedule the job hourly during normal operation - that is, after successful delta initialization.
    In the case of a delta initialization, the document postings of the affected application can be included again after successful execution of the recompilation run in the OLTP, provided that you make sure that the update collective run is not started before all delta Init requests have been successfully updated in the BW.
    In the posting-free phase during the recompilation run in OLTP, you should execute the update collective run once (as before) to make sure that there are no old delta extraction data remaining in the extraction queues when you resume posting of documents.
    If you want to use the functions of the logistics extract structures Customizing cockpit to make changes to the extract structures of an application (for which you selected this update method), you should make absolutely sure that there is no data in the extraction queue before executing these changes in the affected systems. This applies in particular to the transfer of changes to a production system. You can perform a check when the V3 update is already in use in the respective target system using the RMCSBWCC check report.
    In the following cases, the extraction queues should never contain any data:
    - Importing an R/3 Support Package
    - Performing an R/3 upgrade
    - Importing a plug-in Support Packages
    - Executing a plug-in upgrade
    For an overview of the data of all extraction queues of the logistics extract structures Customizing Cockpit, use transaction LBWQ. You may also obtain this overview via the "Log queue overview" function in the logistics extract structures Customizing cockpit. Only the extraction queues that currently contain extraction data are displayed in this case.
    The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method.
    This update method is recommended for the following general criteria:
    a) More than 10,000 document changes (creating, changing or deleting documents) are performed each day for the application in question.
    b) In future delta initializations, you must reduce the posting-free phase to executing the recompilation run in R/3. The document postings should be included again when the delta Init requests are posted in BW. Of course, the conditions described above for the update collective run must be taken into account.
    4. The new "unserialized V3 update" update method:
    With this update mode, the extraction data of the application in question continues to be written to the update tables using a V3 update module and is retained there until the data is read and processed by a collective update run.
    However, unlike the current default values (serialized V3 update), the data is read in the update collective run (without taking the sequence from the update tables into account) and then transferred to the BW delta queues.
    The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method since serialized data transfer is never the aim of this update method. However, you should note the following limitation of this update method:
    The extraction data of a document posting, where update terminations occurred in the V2 update, can only be processed by the V3 update when the V2 update has been successfully posted.
    This update method is recommended for the following general criteria:
    a) Due to the design of the data targets in BW and for the particular application in question, it is irrelevant whether or not the extraction data is transferred to BW in exactly the same sequence in which the data was generated in R/3.
    5. Other essential points to consider:
    a) If you want to select a new update method in application 02 (Purchasing), it is IMPERATIVE that you implement note 500736. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBV302 report.
    b) If you want to select a new update method in application 03 (Inventory Management), it is IMPERATIVE that you implement note 486784. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV303 report.
    c) If you want to select a new update method in application 04 (Production Planning and Control), it is IMPERATIVE that you implement note 491382. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV304 report.
    d) If you want to select a new update method in application 45 (Agency Business), it is IMPERATIVE that you implement note 507357. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV345 report.
    e) If you want to change the update method of an application to "queued delta", we urgently recommended that you install the latest qRFC version. In this case, you must refer to note 438015.
    f) If you use the new selection function in the logistics extract structures Customizing Cockpit in an application to change from the "Serialized V3" update method to the "direct delta" or "queued delta", you must make sure that there are no pending V3 updates for the applications concerned before importing the relevant correction in all affected systems. To check this, use transaction SA38 to run the RMCSBWCC report with one of the following extract structures in the relevant systems:
        Application 02:   MC02M_0HDR
        Application 03:   MC03BF0
        Application 04:   MC04PE0ARB
        Application 05:   MC05Q00ACT
        Application 08:   MC08TR0FKP
        Application 11:   MC11VA0HDR
        Application 12:   MC12VC0HDR
        Application 13:   MC13VD0HDR
        Application 17:   MC17I00ACT
        Application 18:   MC18I00ACT
        Application 40:   MC40RP0REV
        Application 43:   MC43RK0CAS
        Application 44:   MC44RB0REC
        Application 45:   MC45W_0HDR.
    You can switch the update method if this report does not return any information on open V3 updates. Of course, you must not post any documents in the affected application after checking with the report and until you import the relevant Customizing request. This procedure applies in particular to importing the relevant Customizing request into a production system.
    Otherwise, the pending V3 updates are no longer processed. This processing is still feasible even after you import the Customizing request using the RSM13005 report. However, in this case, you can be sure that the serialization of data in the BW delta queues has not been preserved.
    g) Early delta initialization in the logistics extraction:
    As of PI 2002.1 and BW Release 3.0B, you can use the early delta initialization to perform the delta initialization for selected DataSources.
    Only the DataSources of applications 11, 12 and 13 support this procedure in the logistics extraction for PI 2002.1.
    The early delta initialization is used to admit document postings in the OLTP system as early as possible during the initialization procedure. If an early delta initialization InfoPackage was started in BW, data may be written immediately to the delta queue of the central delta management.
    When you use the "direct delta" update method in the logistics extraction, you do not have to wait for the successful conclusion of all delta Init requests in order to readmit document postings in the OLTP if you are using an early delta initialization.
    When you use the "queued delta" update method, early delta initialization is essentially of no advantage because here, as with conventional initialization procedures, you can readmit document postings after successfully filling the setup tables, provided that you ensure that no data is transferred from the extraction queue to the delta queue, that is, an update collective run is not triggered until all of the delta init requests have been successfully posted.
    Regardless of the initialization procedure or update method selected, it is nevertheless necessary to stop any document postings for the affected application during a recompilation run in the OLTP (filling the setup tables).
    With rgds,
    Anil Kumar Sharma .P

  • What are the frequent data load errors in production support?

    what are the frequent data load errors in production support?

    It is a long list. Here are some of them:
    1. IDoc not arriving from the source system.
    2. Previous processes failed.
    3. Change run unsuccessful or did not run properly, and the data in master data is messed up.
    4. Invalid characters in the data.
    5. Duplicate records found.
    And on and on.
    Ravi Thotahdri

  • I need tables for prepare the master data templates for PM

    Dear Experts,
    Can we prepare the master data templates by using the tables? If it is possible, how can we prepare them and what do we have to consider? Is that the right way or not?
    Please suggest how to prepare master data templates for ongoing use, so that if any small corrections are needed we can make them at usage time.
    Please help me.
    Thanks a lot in advance

    Dear Jalu,
    Master data templates have to be prepared based on the client's requirements, after understanding their business.
    For example, in the equipment master there are ~60 to 70 fields; no client requires all of them. Some are inherited.
    After the client requirements are frozen, record an LSMW and prepare the template based on that.
    Regards,
    MLN Prasad

  • Finding the initial date and time of a modified instance of a recurrent event

    The problem situation is as given below:
    I have created a recurring event using the Outlook client. I then modified one of its instances to occur on a date and time different from the existing date and time.
    Using the CAPI, I wish to retrieve the initial date and time when the instance occurred. According to the documentation, I should have the initial date and time in the Recurrence ID parameter. When I look at the iCalendar data, I see that the recurrence-id matches the start date of the current instance and not the initial date and time. I am using fetchEventByUID in the CAPI API set.
    Is this behaviour of changing the date and time of the recurrence-id the expected one, or is there a problem in the configuration on my machine?
    Counting on some insight into the parameter where the initial date and time of the event are stored.
    Best Regards
    Joyce

    Hi Joyce,
    The SDK always sets the recurrence-id to be the start time of the instance, so the behavior you encountered is what one should expect with the SDK.
    There were some changes in 10g to correct this situation for a given service (i.e. have the initial date and time in the Recurrence ID parameter for instances of the recurring set).
    This fix is not available in the public SDK version, but given that this feature is important for your application, you could log a TAR and it might become available in some 10g patch.
    Regards,
    Jean-Philippe Guguy

  • I want to delete the master data loads which I have already loaded earlier

    Hi BW Gurus,
    I want to delete the master data loads, but I am unable to delete them.
    Is there any option to delete them from RSD1?
    Please send steps.

    Hi,
    If that master data's related info is present in a cube, you cannot delete the master data.
    In that case you have to delete the request in the cube first; then you can delete the master data.
    Let me know if you need anything further.
    *******Assign points if useful*******
    Cheers,
    Satya

  • Is there a REAL method to create the initial data replica TO A USB DRIVE for DPM 2012?

    I've got several remote office locations with file servers containing a good amount of data, but they have very poor (3 Mbps) WAN connections.
    I've been tasked to set up DPM to back up these locations.  Due to the size of the data and the lack of bandwidth, I'm going to need to get the techs out there to copy the data portions of the servers onto USB drives and send them to me so I can import them into the DPM storage here.
    But I'm having some problems finding the document that instructs a tech how to copy the initial data replica to a USB drive, and then docs on how I can import this data.
    I have found this:
    http://technet.microsoft.com/en-us/library/jj642916.aspx
    but it talks about server-to-server copying, which, as previously mentioned, is not something we can do due to bandwidth issues.  We need to copy to a USB drive, then ship the drive to me for import into DPM.
    There's some mention of doing this to tape (then importing to DPM) as well, but I can't find more details either... it's sort of weird, like the documentation or help files on TechNet are incomplete.  Take a look at this link, for example: http://technet.microsoft.com/en-us/library/hh757818.aspx -- it's pretty useless!
    Has anyone done this yet?  Can anyone please point me in the right direction?

    YES - we do this all the time.  In fact we never do the initial replica creation over the WAN.
    Let's say you want to protect the entire D: drive of your server.  Attach the USB drive to the server and use robocopy to copy everything to the USB drive.  Something like (assuming the USB drive is E:) --
    robocopy D:\ E:\ /b /e /r:0
    You don't have to worry about copying permissions and all that stuff to the USB drive. DPM will take care of that during the initial consistency check and will make sure the DPM replica is an exact copy.  (If you DO want to make sure the USB copy has the same permissions, add the /copyall switch to the robocopy command.)
    On the DPM server, set up protection for that server's D: drive and make sure you choose the option to do the initial replica creation MANUALLY.  Once protection has been set up, you basically use robocopy again to transfer data from the USB drive to the replica path on the DPM server.
    You can find the replica path on the DPM server by going to the "Protection" area of the console and selecting the protected member of your file server (like the D: drive in the above example).  In the bottom pane there will be a "Replica Path" line with a hyperlink that says "Click to view details".  Click that and a dialog will open; you can copy the path to the clipboard.  The path will be something like this:
    C:\Program Files\Microsoft DPM\DPM\Volumes\Replica\File System\vol_7a200c36-19b0-4f92-b05b-16458180f0fa\f1295bef-8329-46a4-9761-85e82017603d\Full
    Just use a similar robocopy command to copy everything to the replica path.  Make sure you put quotes around the replica path destination since there are spaces in it.  Once robocopy is complete, you initiate a consistency check in the DPM console.
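    For example, a sketch reusing the switches from the first command and the sample replica path above (the GUID path is installation-specific, so substitute your own):
    robocopy E:\ "C:\Program Files\Microsoft DPM\DPM\Volumes\Replica\File System\vol_7a200c36-19b0-4f92-b05b-16458180f0fa\f1295bef-8329-46a4-9761-85e82017603d\Full" /b /e /r:0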
    That's all there is to it...

  • ADF method call to fetch data from DB before the initiator page loads

    Hello everyone
    I'm developing an application using Oracle BPM 11.1.1.6.0 and JDeveloper 11.1.1.6.0
    I want to fetch some data from the database before the initiator task, so that when the user clicks on the process name, his/her history will be shown to them before proceeding.
    It was possible to have a service task before the initiator task in JDeveloper 11.1.1.5.0, but I have moved to 11.1.1.6.0, which clearly flags this as illegal.
    I came across this thread, which suggested doing this using an ADF method call, but I don't know how, since I'm new to ADF.
    Re: Using Service Task Activity before Initiator Task issue
    Can anyone show me the way?
    Thanks in advance

    Thanks Sudipto,
    I checked that article; however, I think I might be able to do what I want using ADF BC.
    See, what I'm trying to do is to get a record from the database and show it to the user on the initiator UI.
    I have been able to work with ADF BC and View Objects to get all the rows and show them to the user in a table.
    However, when I try to run the same query in parameterized form to return just a single row, I hit a wall.
    In short, my problem is like this:
    I have an Application Module which has an entity object and a view object.
    My database is SQL Server 2008.
    When I try to create a new read-only view object to return a single row, I face the problem.
    The query I have in the query section of my View Object is like this:
    select *
    from dbo.Employee
    where EmployeeCode = 99
    which works fine.
    However, when I define a bind variable, input_code for example, and change the query to the following, it won't validate:
    select *
    from dbo.Employee
    where EmployeeCode = :input_code
    It just keeps saying "incorrect syntax near ':'". I don't know if this has to do with my DB not being Oracle or if I'm doing something wrong.
    Can you help me with this problem please?
    Thanks again,
    bye
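
    A hedged pointer for the bind-variable error: ADF BC view objects have a "Binding Style" property for their SQL, and the Oracle-style named bind (:input_code) is not SQL Server syntax. Switching the view object's binding style to JDBC positional and writing the query with a ? placeholder is presumably what the validator expects here (an assumption from general ADF BC behaviour, not a fix confirmed in this thread):
    select *
    from dbo.Employee
    where EmployeeCode = ?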

  • Error 8 when starting the extraction program - data load error: status 51

    Dear all,
    I am facing a data extraction problem while extracting data from the SAP source system (Development Client 220). The scenario and related settings are as follows:
    A. Setting:
    We have created 2 source systems, one for development and another for quality, in the BW development client
    1. BW server: SAP NetWeaver 2004s BI 7
    PI_BASIS: 2005_1_700 Level: 12
    SAP_BW: 700 Level:13
    Source system (Development Client 220)
    2. SAP ERP: SAP ERP Central Component 6.0
    PI_BASIS: 2005_1_700 Level: 12
    OS: SunOS
    Source system (Quality Client 300)
    2. SAP ERP: SAP ERP Central Component 6.0
    PI_BASIS: 2005_1_700 Level: 12
    OS: HP-UX
    B. The scenario:
    I was able to load the InfoProvider from the source system (Development Client 220); later we created another source system (Quality Client 300) and were able to load the InfoProvider from that.
    After creating the other source system (Quality Client 300), I was initially able to load the InfoProvider from both source systems, but now I am unable to load the InfoProvider from the Development Client 220.
    Source system creation:
    For both 220 and 300, the background user in the source system is the same, of system type, with the same authorizations (sap_all, sap_new, S_BI-WX_RFC), and the user for the source-system-to-BW connection is of dialog type (S_BI-WX_RFC, S_BI-WHM_RFC, SAP_ALL, SAP_NEW).
    1: Now when starting the InfoPackage (start data load immediately), I get the e-mail sent by user RFCUSER with this content:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Service API .
    2:Error in the detail tab of the call monitor as under,
    BI data upload error: status 51 - error: Error 8 when starting the extraction program
    Extraction (messages): Errors occurred
      Error occurred in the data selection
    Transfer (IDocs and TRFC): Errors occurred
    Request IDoc : Application document not posted (red)
      bw side: status 03: "IDoc: 0000000000007088 Status: Data passed to port OK,
                                    IDoc sent to SAP system or external program"
      r/3 side: status 51: "IDoc: 0000000000012140 Status: Application document not posted,
                                   Error 8 when starting the extraction program"
    Info IDoc 1 : Application document posted (green)
       r/3 side: "IDoc: 0000000000012141 Status: Data passed to port OK,
                      IDoc sent to SAP system or external program"
       bw side: "IDoc: 0000000000007089 Status: Application document posted,
                      IDoc was successfully transferred to the monitor updating"
    I have attached screenshots showing the error on the BW side.
    •     Checked that the connection is OK and tried to restore the settings for the BW-R/3 connection, though the same problem remains.
    •     Have checked the partner profile.
    What's wrong with the process?
    Best regards,
    dushyant.

    Hi,
       Refer note 140147.
    Regards,
    Meiy

  • Unable to find the master data loaded request in manage tab

    Hi All,
    I am unable to see the loaded request in the master data manage tab some time after loading.
    Immediately after loading I can see the request, but if I revisit the manage tab, I am unable to see the request.
    However, data is available for the InfoObject.
    Can anyone please advise on this?
    Regards,
    Kavya

    hi Devupalli,
    Note your request number and find it using the RSRQ tcode. Now click on the target icon and see if it takes you to the manage screen of your InfoObject.
    There is a very common error which we sometimes make: are you sure you're checking the right thing? If your InfoObject has text, attributes and hierarchy, check in the manage screen of the relevant one.
    For example, if you're loading text, you have to go to the manage screen of the text.
    Hope this helps!
    Sheen

  • Report with reportData tag still uses the initial data

    I have created a report based upon a file; this works fine.
    When I want to schedule it using the <reportData>...</reportData> tag, it still uses my initial data.
    The data that is provided is encoded.
    This is my SOAP envelope:
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
    <soapenv:Body>
    <pub:scheduleReport xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
    <scheduleRequest>
    <deliveryRequest>
    <printOption>
    <printerName>Ontwik</printerName>
    <printNumberOfCopy>1</printNumberOfCopy>
    <printTray>Tray 2</printTray>
    <printSide>Double Sided Long Edge (Duplex)</printSide>
    </printOption>
    </deliveryRequest>
    <reportRequest>
    <attributeFormat>pdf</attributeFormat>
    <attributeTemplate>DefaultTemplate</attributeTemplate>
    <reportAbsolutePath>/~administrator/basedonFile/basedonFile.xdo</reportAbsolutePath> <reportData>50453576626C4E6F615842745A573530506A784462476C6C626E512B50474669596E4A6C646D6C6864476C76626A34354F544D3350433968596D4A795A585A700D0A595852706232342B504535686257552B51695A686258413755794242546C525852564A515043394F5957316C506A7776513278705A573530506A784A5A4756750D0A64476C6D61574E6864476C76626A3551515535454D446B7A4E5459794D444D384C306C6B5A57353061575A705932463061573975506A786862573931626E512B0D0A4D6A51334E4449344D7A77765957317664573530506A784F645731695A584A505A6B6C305A57317A506A457A4D6A7776546E5674596D567954325A4A644756740D0A637A3438546C4E514C55465556464A4A516C565552564D2B50464E5A553052424C564E356333526C5A57316B59585231625434784E79314B565577744D6A41770D0A4F54777655316C545245457455336C7A6447566C6257526864485674506A7856546B6C554C57566C626D686C6157512B517A77765655354A5643316C5A57356F0D0A5A576C6B506A785454314A555279317A623239796443426E6232566B5A584A6C626A354852447776553039535645637463323976636E51675A32396C5A4756790D0A5A57342B50464242546B52484C584268626D52325A584A7A64484A6C6132746C636A34354F544D335043395151553545527931775957356B646D5679633352790D0A5A5774725A58492B5045684A553152504C55687063335276636D6C6C617A34384C30684A553152504C55687063335276636D6C6C617A343856466C51543049740D0A64486C775A5342685A6D64705A6E526C596D56336157707A506C41384C31525A554539434C5852356347556759575A6E61575A305A574A6C64326C71637A34380D0A51306843525667745132686C59327469623367675A5868775A584A3061584E6C506C6B384C304E49516B56594C554E6F5A574E72596D393449475634634756790D0A64476C7A5A54343851306843533038745132686C59327469623367676132397A64475675506C6B384C304E49516B74504C554E6F5A574E72596D3934494774760D0A6333526C626A34384C303554554331425646525353554A5656455654506A7776546D3975553268706347316C626E512B</reportData><sizeOfDataChunkDownload>-1</sizeOfDataChunkDownload>
    </reportRequest>
    <userJobName>PrintJobs</userJobName></scheduleRequest><userID>Administrator</userID><password>XXXXXXXXXXX</password>
    </pub:scheduleReport>
    </soapenv:Body>
    </soapenv:Envelope>
    Does anyone have a clue?

    Same here; if I run runReport it works okay, but when I use scheduleReport it shows the initial data (preview).
    Do you already know of a solution?
