SAP ME 5.2 initial data load problem

Hi
I installed SAP ME 5.2 SP3 and now the initial data load script gives me a long error. Any ideas?
Enter password for admin user:
Confirm password for admin user:
Error while processing XML message <?xml version="1.0" encoding="UTF-8"?>
<DX_RESPONSE>
<CONCLUSION success='no'/>
<IMPORT_RESPONSE>
<TOTALS failed='2' total='4' invalid='0' succeeded='2'/>
<LOG>
        1 records skipped; update is not allowed
        (line repeated 37 times)
        ALARM_TYPE_CONFIG_BO processed
        Database commit occurred
        1 records skipped; update is not allowed
        (line repeated 30 times)
        ALARM_CLEAR_CONFIG_BO processed
        Database commit occurred
        ALARM_PROP_BO failed (SITE:*, PROP_ID:1)
        Exception occurred: (Failed in component: sap.com/me~ear) Exception raised from invocation of public void com.sap.me.frame.BasicBOBean.DXImportCreate(com.sap.me.frame.Data,com.sap.me.frame.Data,boolean) throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.alarm.AlarmPropBOBean@143f363e for bean sap.com/me~ear*xml|me.alarm.ejb-5.2.3.1-Base.jar*xml|AlarmPropBO in application sap.com/me~ear.; nested exception is: javax.ejb.EJBException: (Failed in component: sap.com/me~ear) Exception raised from invocation of public void com.sap.me.dbsequence.DBSequenceBOBean.adjustSequence(java.lang.String,java.lang.String,long) throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.dbsequence.DBSequenceBOBean@712676c3 for bean sap.com/me~ear*xml|me.dbsequence.ejb-5.2.3.1-Base.jar*xml|DBSequenceBO in application sap.com/me~ear.; nested exception is: javax.ejb.EJBException: nested exception is: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'TABLE'.; nested exception is: javax.ejb.EJBException: (Failed in component: sap.com/me~ear) Exception raised from invocation of public void com.sap.me.frame.BasicBOBean.DXImportCreate(com.sap.me.frame.Data,com.sap.me.frame.Data,boolean) throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.alarm.AlarmPropBOBean@143f363e for bean sap.com/me~ear*xml|me.alarm.ejb-5.2.3.1-Base.jar*xml|AlarmPropBO in application sap.com/me~ear.; nested exception is: javax.ejb.EJBException: (Failed in component: sap.com/me~ear) Exception raised from invocation of public void com.sap.me.dbsequence.DBSequenceBOBean.adjustSequence(java.lang.String,java.lang.String,long) throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.dbsequence.DBSequenceBOBean@712676c3 for bean sap.com/me~ear*xml|me.dbsequence.ejb-5.2.3.1-Base.jar*xml|DBSequenceBO in application sap.com/me~ear.; nested exception is: javax.ejb.EJBException: nested exception is: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'TABLE'.
        Database rollback occurred
        ALARM_FILTER_BO failed (SITE:*, FILTER_ID:11)
        Exception occurred: (Failed in component: sap.com/me~ear) Exception raised from invocation of public void com.sap.me.frame.BasicBOBean.DXImportCreate(com.sap.me.frame.Data,com.sap.me.frame.Data,boolean) throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.alarm.AlarmFilterBOBean@4e101e8b for bean sap.com/me~ear*xml|me.alarm.ejb-5.2.3.1-Base.jar*xml|AlarmFilterBO in application sap.com/me~ear.; nested exception is: javax.ejb.EJBException: (Failed in component: sap.com/me~ear) Exception raised from invocation of public void com.sap.me.dbsequence.DBSequenceBOBean.adjustSequence(java.lang.String,java.lang.String,long) throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.dbsequence.DBSequenceBOBean@75bb17af for bean sap.com/me~ear*xml|me.dbsequence.ejb-5.2.3.1-Base.jar*xml|DBSequenceBO in application sap.com/me~ear.; nested exception is: javax.ejb.EJBException: nested exception is: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'TABLE'.; nested exception is: javax.ejb.EJBException: (Failed in component: sap.com/me~ear) Exception raised from invocation of public void com.sap.me.frame.BasicBOBean.DXImportCreate(com.sap.me.frame.Data,com.sap.me.frame.Data,boolean) throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.alarm.AlarmFilterBOBean@4e101e8b for bean sap.com/me~ear*xml|me.alarm.ejb-5.2.3.1-Base.jar*xml|AlarmFilterBO in application sap.com/me~ear.; nested exception is: javax.ejb.EJBException: (Failed in component: sap.com/me~ear) Exception raised from invocation of public void com.sap.me.dbsequence.DBSequenceBOBean.adjustSequence(java.lang.String,java.lang.String,long) throws com.sap.me.frame.BasicBOBeanException method on bean instance com.sap.me.dbsequence.DBSequenceBOBean@75bb17af for bean sap.com/me~ear*xml|me.dbsequence.ejb-5.2.3.1-Base.jar*xml|DBSequenceBO in application sap.com/me~ear.; nested exception is: javax.ejb.EJBException: nested exception is: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'TABLE'.
        Database rollback occurred
        Database commit occurred
</LOG>
</IMPORT_RESPONSE>
</DX_RESPONSE> (Message 12943)
Press any key to continue . . .
Jennsen

Hello,
I think SAP ME is trying to talk to the database using Oracle syntax while you actually have MS SQL Server. This can happen because of an improper -Ddb.vendor setting. Please refer to the Installation Guide for SAP ME 5.2, section 4.1, step 12 (or near it). There should be something like this:
12. Expand the template node and choose the instance node located in the navigation tree. Choose the VM Parameters and Additional tabs containing JVM settings and add the following settings:
-Dsce.home=set to the AppServer directory specified during installation
-Ddb.vendor=ORACLE | SQLSERVER
-Dvm.bkg.processor=true
Check these settings in your installation; that may well be the cause.
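For your MS SQL Server installation, the block should therefore end up looking roughly like this (just a sketch; the sce.home path below is only a placeholder for whatever AppServer directory you chose during installation):
-Dsce.home=C:\usr\sap\ME\AppServer
-Ddb.vendor=SQLSERVER
-Dvm.bkg.processor=true
The "Incorrect syntax near the keyword 'TABLE'" error fits this theory: Oracle-only statements such as
LOCK TABLE SOME_SEQUENCE_TABLE IN EXCLUSIVE MODE
are valid on Oracle but rejected by MS SQL Server with exactly this kind of parse error. (That statement is only an illustration of the syntax difference, not necessarily the actual SQL that adjustSequence generates.)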
Anton

Similar Messages

  • OIM Initial Data Load - Suggestion Please

    Hi,
    I have the following scenario :
    1. My client currently has 2 systems: 1. AD and 2. an HR application.
    2. The HR application has only employee information (it contains fields like first name, last name, employee number, etc.).
    3. AD has both employee and contractor information.
    4. The client wants my HR application to be the trusted source, but the IDM login ID of existing users should be the same as the AD sAMAccountName.
    I am using OIM 9.0 and the 9041 connector pack. What would be the best way to do the initial data load in this case? Thanks in advance.

    Hi,
    Can you tell me how you relate an employee in HR to the corresponding record in AD? Then I will be in a better position to explain how you can do it.
    But even without this information, the following approach will solve your requirement:
    1. Do the trusted recon from AD.
    2. samAccountName will be mapped to the User ID field of the OIM profile.
    3. Do the trusted recon with HR. The matching key should be the answer to my question above.
    4. For the trusted recon with HR, remove the action "Create User" on the No Match Found event.
    Hope this will help.
    Regards
    Nitesh

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I’m starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is the SRM (Supplier Relationship Management) 5.0.
    BW is working on client 001 while SRM is on client 100 and I want to load data from the SRM into BW.
    I’ve configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added a SAP source system (named SRMCLNT100), installed SRM Business Content, replicated the data sources from this source system and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I created an InfoPackage on one standard metadata DataSource (which has data, as checked through RSA3 on client 100, the source system). I started the data load process, but the monitor says that "no IDocs arrived from the source system" and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records
    – but there are 2 records in RSA3 for this DataSource
    Overall status: Missing messages or warnings
    - Requests (messages): Everything OK
      o Data request arranged
      o Confirmed with: OK
    - Extraction (messages): Missing messages
      o Missing message: Request received
      o Missing message: Number of sent records
      o Missing message: Selection completed
    - Transfer (IDocs and TRFC): Missing messages or warnings
      o Request IDoc: sent, not arrived; Data passed to port OK
    - Processing (data packet): No data
    Transactional RFC (SM58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (SM21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601 "No service for system &1, client &2 in Integration Directory": No documentation exists for message ID 601.
    RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Someone could help ?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it's not bringing the data over. Are you doing a full extraction or delta? If delta extraction, please check whether the extractor is delta enabled or not; sometimes this causes problems.
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat

  • Zero Record Data Load Problem

    Hi,
    Please give your suggestion for following problem.
    we are loading data from ETL (Flat File - Data Stage) into SAP BW 3.1.
    The data may contain zero records. When we try to push the data into BW, the ETL side shows a successful data transfer, but the BW side stays in "Processing" state (yellow light) and all BW resources hang.
    When we then try to send another data load from the ETL side, we cannot push the data because the BW resources are still hung by the previous process.
    Whenever we get this kind of problem, we kill the process and continue with another data reload. But this is not a permanent solution, and it is happening more and more often.
    What is the solution for this problem?
    One of my colleagues offered the following suggestion. Should I consider it?
    Summary: when loading empty files, the data may remain in the processing state in BW.
    Details: When a user loads an empty file (it must be truly empty, with no line returns; the user can check the data file in binary mode), the data is loaded into BW with 0 records. BW will stay in the yellow (processing) state showing 0 records, and in the PSA inside BW one data packet will appear with nothing inside. Depending on how the system is configured, the BW server can either accept the 0-record packet or deny it. When the BW server is configured to accept it, the load request changes to the green (finished) state. When the BW server is configured to deny it, the load request stays in the yellow state.
    Please give me your suggestions.
    Thanks in advance.
    Regards,
    VPR

    hi VPR,
    have you tried setting the traffic light 'judgement'?
    Go to the monitor of one request and choose menu Settings -> Evaluation of Requests (Traffic Light). In the next screen, 'Evaluation of Requests', for 'If no data is available in the system, the request' choose the option 'is judged to be successful' (green).
    This also sets a delta load to complete when there is no delta data.
    hope this helps.

  • Data load problems

    Hello friends
    I am facing a problem with a data load. I modified a cube by adding a few characteristics. The characteristics were first added to the communication structure and the transfer rules. Then I reactivated the update routine. Finally, I deleted all previous data load requests for the cube and did a full load. However, I wasn't able to find any data for the newly added fields in the cube.
    Did I miss something? Any help in this regard will be appreciated.
    Thanks
    Rishi

    How come an ODS came into the picture? This was not mentioned in your previous post. Are you loading from the ODS to the cube and having problems?
    It looks like you are not using a DTP. In that case, check the change log for the newly added fields, make sure the data flows ODS > PSA > Cube, and check the PSA to see whether those fields are present. If yes, check the update rules in debugging mode to see whether the values get deleted there.
    Hope it Helps
    Chetan
    @CP..

  • Data load problem in Apex 3.2

    Hi
    Apex 3.2:
    Using Utilities > Data Load/Unload, I am trying to load spreadsheet data. Despite reducing the data down to the simplest possible example, I keep getting all rows rejected with
    ORA-00926: missing VALUES keyword
    I have tried tab delimited and comma delimited. I can't find any other users hitting this problem, and I was pleasantly surprised when I last used this facility many months ago on 3.1.2, when it worked like a dream.
    Any thoughts?
    thanks...

    PROBLEM SOLVED
    The schema and the workspace names were in the form:
    AAAAAAAAA-AA
    I found that data imports worked OK for other workspaces which happened to have simpler naming. I created a new workspace and schema (shorter and without the hyphen) and everything works OK.
    I then tried the same with a new schema and tablespace, both called TEST-AB, and the load failed with the same error.
    I'm guessing that this is probably a known issue? It would be nice if Apex could reject illegal schema/tablespace names!
    I am grateful for the help offered in the other replies and hope this observation will help others...
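    The hyphen theory matches how Oracle parses unquoted identifiers; here is a minimal illustration (the table and columns are made up):
    INSERT INTO TEST-AB.CITIES (ID, NAME) VALUES (1, 'Oslo');
    -- fails: the parser reads TEST as the table name, hits the hyphen where it
    -- expects a column list or VALUES, and raises ORA-00926: missing VALUES keyword
    INSERT INTO "TEST-AB".CITIES (ID, NAME) VALUES (1, 'Oslo');
    -- works: the quoted identifier keeps the hyphen inside the schema name
    So the load wizard presumably builds its INSERT statements without quoting the schema name, which would explain why only the hyphenated workspaces fail.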

  • 2lis_03_um initial data load

    Dear Consultants,
    Our R/3 system is very busy, so I don't want to stop it to extract the stock data. Is there any way to do this? I mean, can I extract the initial data while the R/3 system is running? I am worried about losing data.
    Best regards

    Hi,
    In the case of the V3 update or Queued Delta update methods, we have to lock the users while we are filling the setup tables. But in the case of Direct Delta update mode, we need to lock the R/3 users until we complete both the setup table fill and the delta init.
    For more information on all these methods, take a look at note 505700:
    1. Why should I no longer use the "Serialized V3 update" as of PI 2002.1 or PI-A 2002.1?
    The following restrictions and problems with the "serialized V3 update" resulted in alternative update methods being provided with the new plug-in and the "serialized V3 update" update method no longer being provided with an offset of 2 plug-in releases.
    a) If, in an R/3 System, several users who are logged on in different languages enter documents in an application, the V3 collective run only ever processes the update entries for one language during a process call. Subsequently, a process call is automatically started for the update entries from the documents that were entered in the next language. During the serialized V3 update, only update entries that were generated in direct chronological order and with the same logon language can therefore be processed. If the language in the sequence of the update entries changes, the V3 collective update process is terminated and then restarted with the new language.
    For every restart, the VBHDR update table is read sequentially on the database. If the update tables contain a very high number of entries, it may require so much time to process the update data that more new update records are simultaneously generated than the number of records being processed.
    b) The serialized V3 update can only guarantee the correct sequence of extraction data in a document if the document has not been changed twice in one second.
    c) Furthermore, the serialized V3 update can only ensure that the extraction data of a document is in the correct sequence if the times have been synchronized exactly on all system instances, since the time of the update record (which is determined using the locale time of the application server) is used in sorting the update data.
    d) In addition, the serialized V3 update can only ensure that the extraction data of a document is in the correct sequence if an error did not occur beforehand in the U2 update, since the V3 update only processes update data, for which the U2 update is successfully processed.
    2. New "direct delta" update method:
    With this update mode, extraction data is transferred directly to the BW delta queues every time a document is posted. In this way, each document posted with delta extraction is converted to exactly one LUW in the related BW delta queues.
    If you are using this method, there is no need to schedule a job at regular intervals to transfer the data to the BW delta queues. On the other hand, the number of LUWs per DataSource increases significantly in the BW delta queues because the deltas of many documents are not summarized into one LUW in the BW delta queues as was previously the case for the V3 update.
    If you are using this update mode, note that you cannot post any documents during delta initialization in an application, from the start of the recompilation run in the OLTP until all delta init requests have been successfully updated in BW. Otherwise, data from documents posted in the meantime is irretrievably lost.
    The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method.
    This update method is recommended for the following general criteria:
    a) A maximum of 10,000 document changes (creating, changing or deleting documents) are accrued between two delta extractions for the application in question. A (considerably) larger number of LUWs in the BW delta queue can result in terminations during extraction.
    b) With a future delta initialization, you can ensure that no documents are posted from the start of the recompilation run in R/3 until all delta-init requests have been successfully posted. This applies particularly if, for example, you want to include more organizational units such as another plant or sales organization in the extraction. Stopping the posting of documents always applies to the entire client.
    3. The new "queued delta" update method:
    With this update mode, the extraction data for the affected application is compiled in an extraction queue (instead of in the update data) and can be transferred to the BW delta queues by an update collective run, as previously executed during the V3 update.
    Up to 10,000 delta extractions of documents to an LUW in the BW delta queues are cumulated in this way per DataSource, depending on the application.
    If you use this method, it is also necessary to schedule a job to regularly transfer the data to the BW delta queues ("update collective run"). However, you should note that reports delivered using the logistics extract structures Customizing cockpit are used during this scheduling. There is no point in scheduling with the RSM13005 report for this update method since this report only processes V3 update entries. The simplest way to perform scheduling is via the "Job control" function in the logistics extract structures Customizing Cockpit. We recommend that you schedule the job hourly during normal operation - that is, after successful delta initialization.
    In the case of a delta initialization, the document postings of the affected application can be included again after successful execution of the recompilation run in the OLTP, provided that you make sure that the update collective run is not started before all delta Init requests have been successfully updated in the BW.
    In the posting-free phase during the recompilation run in OLTP, you should execute the update collective run once (as before) to make sure that there are no old delta extraction data remaining in the extraction queues when you resume posting of documents.
    If you want to use the functions of the logistics extract structures Customizing cockpit to make changes to the extract structures of an application (for which you selected this update method), you should make absolutely sure that there is no data in the extraction queue before executing these changes in the affected systems. This applies in particular to the transfer of changes to a production system. You can perform a check when the V3 update is already in use in the respective target system using the RMCSBWCC check report.
    In the following cases, the extraction queues should never contain any data:
    - Importing an R/3 Support Package
    - Performing an R/3 upgrade
    - Importing a plug-in Support Packages
    - Executing a plug-in upgrade
    For an overview of the data of all extraction queues of the logistics extract structures Customizing Cockpit, use transaction LBWQ. You may also obtain this overview via the "Log queue overview" function in the logistics extract structures Customizing cockpit. Only the extraction queues that currently contain extraction data are displayed in this case.
    The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method.
    This update method is recommended for the following general criteria:
    a) More than 10,000 document changes (creating, changing or deleting documents) are performed each day for the application in question.
    b) In future delta initializations, you must reduce the posting-free phase to executing the recompilation run in R/3. The document postings should be included again when the delta Init requests are posted in BW. Of course, the conditions described above for the update collective run must be taken into account.
    4. The new "unserialized V3 update" update method:
    With this update mode, the extraction data of the application in question continues to be written to the update tables using a V3 update module and is retained there until the data is read and processed by a collective update run.
    However, unlike the current default values (serialized V3 update), the data is read in the update collective run (without taking the sequence from the update tables into account) and then transferred to the BW delta queues.
    The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method since serialized data transfer is never the aim of this update method. However, you should note the following limitation of this update method:
    The extraction data of a document posting, where update terminations occurred in the V2 update, can only be processed by the V3 update when the V2 update has been successfully posted.
    This update method is recommended for the following general criteria:
    a) Due to the design of the data targets in BW and for the particular application in question, it is irrelevant whether or not the extraction data is transferred to BW in exactly the same sequence in which the data was generated in R/3.
    5. Other essential points to consider:
    a) If you want to select a new update method in application 02 (Purchasing), it is IMPERATIVE that you implement note 500736. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBV302 report.
    b) If you want to select a new update method in application 03 (Inventory Management), it is IMPERATIVE that you implement note 486784. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV303 report.
    c) If you want to select a new update method in application 04 (Production Planning and Control), it is IMPERATIVE that you implement note 491382. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV304 report.
    d) If you want to select a new update method in application 45 (Agency Business), it is IMPERATIVE that you implement note 507357. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV345 report.
    e) If you want to change the update method of an application to "queued delta", we urgently recommended that you install the latest qRFC version. In this case, you must refer to note 438015.
    f) If you use the new selection function in the logistics extract structures Customizing Cockpit in an application to change from the "Serialized V3" update method to the "direct delta" or "queued delta", you must make sure that there are no pending V3 updates for the applications concerned before importing the relevant correction in all affected systems. To check this, use transaction SA38 to run the RMCSBWCC report with one of the following extract structures in the relevant systems:
        Application 02:   MC02M_0HDR
        Application 03:   MC03BF0
        Application 04:   MC04PE0ARB
        Application 05:   MC05Q00ACT
        Application 08:   MC08TR0FKP
        Application 11:   MC11VA0HDR
        Application 12:   MC12VC0HDR
        Application 13:   MC13VD0HDR
        Application 17:   MC17I00ACT
        Application 18:   MC18I00ACT
        Application 40:   MC40RP0REV
        Application 43:   MC43RK0CAS
        Application 44:   MC44RB0REC
        Application 45:   MC45W_0HDR.
    You can switch the update method only if this report does not return any information on open V3 updates. Of course, you must not post any documents in the affected application after checking with the report and until you import the relevant Customizing request. This procedure applies in particular to importing the relevant Customizing request into a production system.
    Otherwise, the pending V3 updates are no longer processed. This processing is still feasible even after you import the Customizing request using the RSM13005 report. However, in this case, you can be sure that the serialization of data in the BW delta queues has not been preserved.
    g) Early delta initialization in the logistics extraction:
    As of PI 2002.1 and BW Release 3.0B, you can use the early delta initialization to perform the delta initialization for selected DataSources.
    Only the DataSources of applications 11, 12 and 13 support this procedure in the logistics extraction for PI 2002.1.
    The early delta initialization is used to admit document postings in the OLTP system as early as possible during the initialization procedure. If an early delta initialization InfoPackage was started in BW, data may be written immediately to the delta queue of the central delta management.
    When you use the "direct delta" update method in the logistics extraction, you do not have to wait for the successful conclusion of all delta Init requests in order to readmit document postings in the OLTP if you are using an early delta initialization.
    When you use the "queued delta" update method, early delta initialization is essentially of no advantage because here, as with conventional initialization procedures, you can readmit document postings after successfully filling the setup tables, provided that you ensure that no data is transferred from the extraction queue to the delta queue, that is, an update collective run is not triggered until all of the delta init requests have been successfully posted.
    Regardless of the initialization procedure or update method selected, it is nevertheless necessary to stop any document postings for the affected application during a recompilation run in the OLTP (filling the setup tables).
    With rgds,
    Anil Kumar Sharma .P

  • Data loading problem with Movement types

    Hi Friends,
    I extracted data to the BW side using the DataSource General Ledger: Line Item Data (0FI_GL_4).
    The problem is that the movement types are missing for some documents.
    But when I check in RSA3, they show up correctly.
    When I restrict the data at the InfoPackage level on the BW side to one particular document, the data loads perfectly.
    This DataSource has 53,460 records; among all the records, the movement types are missing for 400 records of document type 'WE'.
    Please give me a solution for how to load the data with the movement types.
    I checked the particular document 50000313 in RSA3 and it shows the movement types. When I then loaded the data on the BW side, those movement types did not come across. When I restricted the load to document 50000313 at the InfoPackage level, the movement types loaded correctly. This extractor has 55,000 records.
    This is a very urgent problem; please reply urgently. I am waiting for your replies.
    Thanks & Regards,
    Guna.
    Edited by: gunasekhar raya on May 8, 2008 9:40 AM

    Hi,
    We enhanced the General Ledger (0FI_GL_4) extractor with the movement type field (MSEG-BWART).
    This field is populated with data for all accounting document numbers.
    Only in the range 50000295 to 50000615 are we not getting the movement type values.
    We didn't write any routines at the transfer or update rules level; we just mapped the BWART field to the 0MOVETYPE InfoObject.
    When we restrict to the particular document number 50000313 at the InfoPackage level, the data loads into the cube with the movement types.
    But when we remove the restriction at the InfoPackage level and load the data, the movement types are missing for document numbers 50000295 to 50000615.
    Please give me a solution for this; I need to solve it very urgently.
    I am waiting for your reply.
    Thanks,
    Guna.

  • ERPI: Data Loading problem Hyperion Planning & Oracle EBS

    Hi
    I am trying to load data from Oracle EBS to Hyperion Planning.
    When I push data, zero rows are inserted in the target.
    When I look at the staging table (SELECT * FROM TDATASEG), it shows me data, but the data is not committed to the target application.
    The reason is a data difference between the source (EBS) and the target: in my source the year is 2013 but in the target it is 'FY14'; likewise, for Entity the source value is '21' but the target is '2143213'.
    Can you please let me know how to solve this issue?
    Can I place a lookup table for this in ERPI?
    I am using ERPI and ODI to push the data.
    Regards
    Sher

    Have you set up the data load mapping correctly to map the source value to the proper target value? Based on what you are describing, it seems the system-generated * to * map is being used; if you are mapping a source to a different target value, this needs to be added to the data load mapping.
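    One quick way to verify this from the database side is to compare the source and mapped-target columns in the staging table you were already querying (a sketch, assuming the standard TDATASEG layout where ENTITY/ACCOUNT hold the source members and ENTITYX/ACCOUNTX the mapped targets):
    SELECT ENTITY, ENTITYX, ACCOUNT, ACCOUNTX
    FROM   TDATASEG
    WHERE  ENTITYX IS NULL OR ACCOUNTX IS NULL;
    -- assuming the standard layout: rows returned here were never mapped to a
    -- target member, which is exactly what a missing data load mapping looks like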

  • EIS data load problem

    Hi,
    I use user-defined SQL to do a data load using EIS 7x.
    When I run the SQL in Toad just to see the data, I see 70,000 rows.
    But when I use the same SQL as the user-defined SQL and try to load data, I don't know why, but EIS says that it loaded around 69,000 rows, and I also see no rejections.
    I even made some SQL changes to find out which records are not being loaded; there are rows I can see when I run the SQL in Toad that EIS does not load for the very same SQL.
    This is very strange. Can anyone help me with this?

    Glenn-
    I don't think there is anything unique about the missing rows.
    Actually, a piece of code was added to the previously working view (which I use to load data) to bring in some additional data.
    I took that part alone and tried to load data, and I see no data being loaded or rejected; it just says records loaded "0", rejected 0.
    But the same part brings in the required additional data when executed in Toad.
    And about the Excel sheet lock-and-send: I did that a week ago and, as you said, to my surprise it loads everything.
    This was the test by which I figured out that EIS is not even able to load a part of the data, and I found the exact part of the data by going through it closely.
    So I think this is something to do with the SQL interface.
    And I did not understand the last lines in your post:
    /////I know when I do regular SQL interface some data types don't load and I have to convert them to varchar to get them to work. If the Excel file and flat file loads work, look at that//////
    What did you convert to varchar?
    The Excel load works fine, so I think it's something similar to what you mentioned in your post.
    Can you please explain it?
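    For reference, the varchar conversion described in the quoted post is normally done inside the user-defined SQL itself, so that non-character key columns reach EIS as text. A sketch in Oracle syntax (the table and column names here are made up):
    SELECT CAST(PRODUCT_ID AS VARCHAR2(20)) AS PRODUCT_ID,   -- numeric key loaded as text
           TO_CHAR(TRANS_DATE, 'YYYY-MM-DD') AS TRANS_DATE,  -- date rendered as text
           SALES_AMOUNT
    FROM   SALES_FACT;
    If the rows that Toad shows but EIS silently drops all share a non-character key column, a cast like this is the first thing to try.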

  • Data Loading Problem In BI

    Hi,
    here is a very simple question: I have one flat file with only 2 fields, Country and City, such as:
    Andorra,Andorra la Vella 
    Angola,Luanda 
    Anguilla,The Valley 
    Antigua and Barbuda,Saint John's (Antigua) 
    Argentina,Buenos Aires 
    Armenia,Yerevan 
    Aruba,Oranjestad 
    Australia,Canberra 
    Austria,Vienna 
    Azerbaijan,Baku (Baki) 
    Bahrain,Manama 
    Bangladesh,Dhaka 
    Barbados,Bridgetown 
    When I upload the file, it shows only the country, not the cities...
    Screenshots are here:
    http://www.flickr.com/photos/25222280@N03/3277198674/sizes/o/
    http://www.flickr.com/photos/25222280@N03/3277203778/sizes/o/
    http://www.flickr.com/photos/25222280@N03/3277220212/sizes/o/
    http://www.flickr.com/photos/25222280@N03/3276411379/sizes/o/
    Please, can anyone tell me where I am making a mistake...
    Thanks

    Hi TK,
    You have to map both fields, i.e., both Country and City. But I have one doubt: is City a compounding characteristic of Country? If so, you have to load the compounding characteristic separately; it is independent of the master data loads, but it needs to be mapped.
    Regards,
    Debjani

  • CRM 5.0 to R/3 BP Initial & Delta load problems

    Hi Experts,
    I am having problems trying to replicate a BP to R/3 (ECC). The BDoc status is all green and shows 'Processed'... but my BP is not created on the OLTP side!! I have followed all the steps from the best practice document - CRM is the lead system and I have created a dedicated account group with an external number range in OLTP.
    I am using BUPA_MAIN in R3AC1 as the load object with a filter setting for a single BP. As the sender I am using CRM and as the receiver OLTP... Am I missing something?? There is no error, so I do not know what to debug in the outbound scheduler... please advise. (FYI - OLTP to CRM BP replication works fine.)

    Hi Paul,
    If you need to work with just one or, say, a handful of BPs, why don't you use a synchronization load instead of the initial load?
    Go to transaction R3AR2 (Define Request).
    Create a new entry.
    Give a request name - say Z_BUPA_TEST
    Adapter object - BUPA_MAIN
    Click on request detail.
    Again click on new entries:
    table name - BUT000
    fieldname - PARTNER
    Incl/Ex - I
    Option - EQ
    LOW - your BP number (with leading zeros)
    Save this request.
    Go to R3AR4 to execute this request: enter the request name (Z_BUPA_TEST) and the source and destination systems, then execute.
    To monitor this data transfer, go to transaction R3AR3.
    Please let me know if this helps.
    Regards
    Kaushal

  • Sold-to's missing link to ship-to's - SAP ECC -to CRM data load

    I have recently extended my middleware between my CRM 2007 and SAP ECC systems. I have adjusted the CRM middleware filters to include a newly created sales org from our ECC system. I'm loading the data from SAP to CRM and the accounts are coming across for this new sales org; however, they all seem to be assigned to the sold-to role, and none of the accounts are assigned to the ship-to, bill-to or payer roles. This is despite the fact that PIDE is maintained correctly in SAP ECC. My middleware objects also seem to be fine. All my other sales orgs worked perfectly in the past.
    Which objects (business, customizing and condition) do I need to run to get these relationships into CRM from SAP?
    The main ones that I have run are CUSTOMER_MAIN and CUSTOMER_REL.
    Any ideas as to what the problem might be?
    Thanks in advance
    eddie

    This issue definitely relates back to our SAP ECC upgrade.
    It looks to me like any NEW accounts created in SAP (after our SAP ECC upgrade from SAP 4.7) and loaded into CRM do not include the relevant account relationships.
    That is to say, if an account in SAP, say 100001, has 4 sold-tos (100002 - 100005), 1 payer (100006) and 1 bill-to (100006), then that same account will not have any relationships in CRM. The 7 accounts will be created in CRM but there will be no relationships between them. All the historical data is fine: any master data created prior to the SAP upgrade is unaffected - the relationships exist - even if this master data was loaded into CRM after the upgrade. The issue is with master data created in SAP after the ECC upgrade; when this is loaded into CRM, the problem is evident.
    This never came to light during a recent business unit rollout, as their master data was created in SAP before the ECC upgrade and came across to CRM without issue. We are only seeing this issue now because we are dealing with a new business unit that had all its master data created in SAP after the SAP ECC upgrade. It is this master data that is affected in CRM in terms of missing relationships.
    Any advice would be greatly appreciated. Has anyone seen this issue before?
    Thanks
    Eddie

  • SAP ABAP Proxy - recursive data structure problem

    Hi,
    For our customer we are trying to connect SAP with the GW Calendar using GW Web Services. We tried to generate an ABAP proxy from the groupwise.wsdl file, but there is a problem: GW uses recursive data structures, which an ABAP proxy cannot handle. Is there any simple solution to this problem?
    Best regards
    Pawel

    At least I don't have a clue about ABAP proxies. You are pretty much on your own unless someone else has tried it.
    Preston

  • BSEG table data load problem

    Hi friends,
    I have a problem: I need to load the BSEG table data, but after it loads about one million records it gets stuck and gives an error. There are more than 5 million records in the table. Is there any way to load this data to the PSA? I suspect it is a memory issue, but how do I increase the memory, or what enhancement is needed, for loading the data?
    Thanks
    Ahmed

    Hi Ahmed,
    Don't load all the data in a single go; split the load using selections. If it is a transaction data DataSource, copy the same InfoPackage, enter a selection in the data selection field of each copy, and execute them in parallel, since fact tables never lock each other.
    If it is a master data DataSource, don't run the loads in parallel, since master data tables lock each other; enter the selections one by one and load the data.
    Hope this helps.
    Regards,
    Debjani
