Data load in prod system

Hi,
In 0CRM_C04 I have a Z InfoObject that holds the transaction number of each opportunity. For each opportunity created, a transaction number is created as well, and the number of document headers is set to 1.
The transaction number is created and the data is transferred to the BI side, from the InfoSource via update rules to the DSO. In the DSO I have the transaction number and the number of document headers as well.
But when loading from the DSO to the cube, not all opportunity numbers are transferred, and for a few of the transferred transaction numbers the number of document headers is set to zero.
In the update rules from DSO to cube, the transaction number and the number of document headers are direct assignments.
Please suggest.

Hi Raj,
For 0CRM_C04 there is a start routine in the update rules that deletes the documents with status "Incorrect", so those records will be missing from the InfoCube:
DELETE DATA_PACKAGE WHERE STATECSYS2 = '10'.   " documents with status 'Incorrect' are removed
Also, the number of document headers is set to 1 only once, on the document header record; in all other places it is 0, to ensure a correct count. A short sketch of the routine is below.
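A minimal sketch of that start routine logic, if you want to verify it in your system (BW 3.x update-rule start routines work on DATA_PACKAGE; the field name STATECSYS2 is taken from the routine above, so please check it against your own transfer structure):
* Start routine sketch - count how many records the filter removes
DATA: LV_BEFORE TYPE I,
      LV_AFTER  TYPE I.

DESCRIBE TABLE DATA_PACKAGE LINES LV_BEFORE.

* Drop documents flagged as 'Incorrect' in the source system
DELETE DATA_PACKAGE WHERE STATECSYS2 = '10'.

DESCRIBE TABLE DATA_PACKAGE LINES LV_AFTER.
* LV_BEFORE - LV_AFTER = number of records that will not reach the cube
If that difference matches the number of missing opportunity numbers, the start routine explains the gap and the load itself is fine.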
Regards,
Sathya

Similar Messages

  • Data load from Legacy system to BW Server through BAPI

    Requirements: We have different kinds of legacy systems and an SAP BW server. We want to load all legacy system data into the SAP BW server using BAPIs. Before loading we have to validate all data. If there is bad or missing data, we have to let the legacy system user/operator know, with a detailed explanation, so they can fix the data in their system. When it is fixed, we have to load the data again.
    Load Scenario:  We have two options to load data from legacy systems to BW Server.
    1.     We need to load data directly from legacy system to BW Server using BAPI program.
    2.     Legacy Systems data would be in workstations or flash drive as .txt (one line separated by comma) or .csv file. Need to load from .txt /.csv file to BW Server using BAPI program.
    What do we want in the BAPI program code?
    It will read / retrieve data from the text / csv file and put it into an internal table. The internal table structure would be based on the BAPI InfoObject structure.
    It will call the BAPI InfoObject function module ‘BAPI_IOBJ_CREATE’ to create the InfoObject, include all necessary / default components, do the error check, load the data and return the status.
    Could someone help me with the sample code, please? I am new to ABAP / BAPI coding.
    Is there any better idea to load data from a legacy system to the BW server? By the way, we are using BW 3.10. Is there a better option with BI 7.0 to resolve the issue? I appreciate your help.
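    A minimal sketch of the file-read step described above, assuming a comma-separated file on the user's PC read with GUI_UPLOAD (the structure, field names and path are placeholders, not taken from the actual InfoObject):
    TYPES: BEGIN OF TY_REC,
             FIELD1(20) TYPE C,     " placeholder fields - align with the InfoObject structure
             FIELD2(40) TYPE C,
           END OF TY_REC.
    DATA: LT_RAW TYPE TABLE OF STRING,
          LV_RAW TYPE STRING,
          LT_REC TYPE TABLE OF TY_REC,
          LS_REC TYPE TY_REC.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\legacy\data.csv'   " example path
        filetype = 'ASC'
      TABLES
        data_tab = LT_RAW.
    " Split each comma-separated line into the target structure
    LOOP AT LT_RAW INTO LV_RAW.
      SPLIT LV_RAW AT ',' INTO LS_REC-FIELD1 LS_REC-FIELD2.
      APPEND LS_REC TO LT_REC.
    ENDLOOP.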

    my answers:
    1. This is a scenario for a data push into SAP BW. You can only use the SOAP-based transfer of data.
    http://help.sap.com/saphelp_nw04/helpdata/en/fd/8012403dbedd5fe10000000a155106/frameset.htm
    (here for BW 3.5, but you'll find something similar for 7.0)
    In this scenario an RFC is dynamically created for every InfoSource you need to transfer data to.
    2. You can create a process chain for each data load and call the RFC-enabled function module "RSPC_API_CHAIN_START" to start the chain from outside; see the sketch below.
    The second solution is simpler and available on every release.
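    A minimal sketch of that call from an external SAP system (the destination and chain names are examples, and the parameter names should be verified in SE37):
    DATA LV_LOGID(25) TYPE C.                " log ID of the chain run
    CALL FUNCTION 'RSPC_API_CHAIN_START'
      DESTINATION 'BWCLNT001'                " RFC destination pointing to the BW system (example)
      EXPORTING
        i_chain = 'ZPC_LEGACY_LOAD'          " hypothetical chain name
      IMPORTING
        e_logid = LV_LOGID.
    The returned log ID can then be used to monitor the chain run on the BW side.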
    Regards,
    Sergio

  • Data Loads in Development System

    Hi,
    I need to stop all the data loads (master/transaction) in the development system. As the Basis team is doing a backup and patch upgrade, they asked me to stop all data loads in BW development.
    As this is my first week on this project: there are some loads scheduled in process chains and some in InfoPackages.
    Is there any way I can stop all the data loads in one go?
    Thanks

    Hi,
    Go to SM37, enter * as the job name and * as the user name, select the Scheduled and Released status check boxes and press the clock (execute) icon; it will display all scheduled and released jobs. Based on that information you can cancel the jobs. You can also cancel any active jobs here that are running in the background: in the menu bar go to the Job menu, where you can cancel the active job. In the same way you can move a released job back to scheduled, so that the job never becomes active.
    Regards
    Sankar

  • Data loading from source system takes long time.

    Hi,
    I am loading data from R/3 to BW and I am getting the following message in the monitor.
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    in the source system
    Is there anything wrong with the partner profile maintenance in the source system?
    Cheers
    Senthil

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (in the source system if the load is from R/3, or in BW if it is a datamart load): give the request name and it should show you the details of the request. If it is active, make sure that the job log is being updated at frequent intervals.
    Also see if there is any 'sysfail' for any data packet in SM37.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not (in the source system if the load is from R/3, or in BW if it is a datamart load). See whether it is accessing/updating some tables or not doing anything at all.
    3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
    4) ST22: check whether any short dump has occurred (in the source system if the load is from R/3, or in BW if it is a datamart load).
    5) SM58 and BD87: check for pending tRFCs and IDocs.
    Once you identify the cause you can rectify the error.
    If all the records are in the PSA you can pull them from the PSA to the target. Otherwise you may have to pull them again from the source.
    If it is running and you can see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at the system level, like reading/inserting into tables etc.
    If you feel it is active and running, you can verify this by checking whether the number of records in the data tables has increased.
    SM21 (system log) can also be helpful.
    Thanks,
    JituK

  • Data loads running even after system is down?

    All,
    I have a data load from the CRM system to the BI system, currently loading to the PSA. While the data load was in process, the BI system/server went down for recycling, yet the loads are still running. Is this because the loads only go up to the PSA? So, to get my basics right: are the PSA tables on the source system? My understanding was that they have the same structures as in the source system but reside on the BI system.
    Inputs appreciated.
    Thanks!

    Hi,
    Please check the job names in the R/3 system and find out whether they are active or not. If they are active, the job is still running; if they are not active or were cancelled, change the QM status for those loads and rerun the InfoPackage. It will work.
    Regards
    Srinivas

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I’m starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is the SRM (Supplier Relationship Management) 5.0.
    BW is working on client 001 while SRM is on client 100 and I want to load data from the SRM into BW.
    I’ve configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the DataSources from this source system, and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I’ve created an InfoPackage on one standard DataSource (which has data, checked through RSA3 on client 100, the source system). I’ve started the data load process, but the monitor says that “no IDocs arrived from the source system” and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records
    – but there are 2 records on RSA3 for this data source
    Overall status: Missing messages or warnings
    -     Requests (messages): Everything OK
    o     Data request arranged
    o     Confirmed with: OK
    -     Extraction (messages): Missing messages
    o     Missing message: Request received
    o     Missing message: Number of sent records
    o     Missing message: Selection completed
    -     Transfer (IDocs and TRFC): Missing messages or warnings
    o     Request IDoc: sent, not arrived ; Data passed to port OK
    -     Processing (data packet): No data
    Transactional RFC (SM58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (SM21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601, "No service for system &1, client &2 in Integration Directory": no further documentation exists for this message.
    RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Could someone help?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it's not bringing the data. Are you doing a full extraction or a delta? If a delta extraction, please check whether the extractor is delta-enabled or not. Sometimes this can cause problems.
    Also check this weblog on data load errors and basic checks; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
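    If you want to double-check the delta capability directly in the source system, here is a minimal sketch (table and field names from memory; please verify them in SE11 before relying on this):
    DATA LV_DELTA TYPE ROOSOURCE-DELTA.
    " Read the delta process of the DataSource from its active version
    SELECT SINGLE delta FROM roosource
      INTO LV_DELTA
      WHERE oltpsource = 'ZMY_DATASOURCE'   " hypothetical DataSource name
        AND objvers    = 'A'.
    IF LV_DELTA IS INITIAL.
      WRITE: / 'DataSource is not delta-enabled'.
    ELSE.
      WRITE: / 'Delta process:', LV_DELTA.
    ENDIF.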
    Thanks
    Sat

  • Data Load to the SAP System

    <Moderator Message: Even as a newbie you should be able to search for this information on your own. We are not here to spoonfeed>
    Hi All,
    I am very new to SAP BI.
    Could anybody please tell me the steps for loading data to SAP BI system?
    I wanted to do it through RFC and also through flat file.
    Thanks a lot in advance,
    Amilie
    Edited by: Siegfried Szameitat on Jul 16, 2009 2:15 PM

    >
    Amilie wrote:
    > I am very new to SAP BI.
    >
    > Thanks a lot in advance,
    > Amilie
    Please read the Rules of Engagement of this forum.

  • Loading of transaction data from SAP ECC system failed

    Hi!
    I have successfully connected the SAP ECC system to the SAP BI system.
    The following steps have been executed:
    - user ALEREMOTE with max. authorization
    - RFC destination
    - Distributing Data model
    - Generated Partner profile
    - Maintaining message types in WE20
    Now when I try to load any data from the SAP ECC system, the loading process hangs in status "yellow" and never completes.
    [0FI_AR_4|http://www.file-upload.net/view-1447743/0FI_AR_4.jpg.html]
    The following steps within Load process are yellow:
    Extraction (messages): Missing messages
      Missing message: Request received
      Missing message: Number of sent records
      Missing message: Selection completed
    Transfer (IDocs and TRFC): Missing messages or warnings
      Request IDoc : Application document posted (is green)
      Data Package 1 : arrived in BW ; Processing : 2nd processing step not yet finished
      Info IDoc 1 : sent, not arrived ; Data passed to port OK
      Info IDoc 2 : sent, not arrived ; Data passed to port OK
      Info IDoc 3 : sent, not arrived ; Data passed to port OK
      Info IDoc 4 : sent, not arrived ; Data passed to port OK
    Subseq. processing (messages) : Missing messages
        Missing message: Subseq. processing completed
        DataStore Activation (Change Log) : not yet activated
    Question:
    Can someone give me some technical steps (transaction code, report) to solve this problem?
    Thank you very much!
    Holger

    Hi!
    Many thanks for your answer.
    Via BD87 on the BW system I can see that all the IDocs (type RSRQST) are received from the SAP ECC system.
    Via transaction SM58 I could not find any entries.
    However the loading status from yesterday is set to "red".
    The errors are:
    Extraction (messages): Missing messages
    Data Package 1 : arrived in BW ; Processing : 2nd processing step not yet finished
    Info IDoc 1 : sent, not arrived ; Data passed to port OK
    Info IDoc 2 : sent, not arrived ; Data passed to port OK
    Can you investigate my issue again?
    Thank you very much!

  • Error is data loading from 3rd party source system with DBCONNECT

    Hi,
    We have just finished an upgrade of SAP BW 3.10 to SAP NW 7.0 EHP1.
    After the upgrade, we are facing a problem with data loads from a third party Oracle source system using DBConnect.
    The connection is working OK and we can see the tables in the source system. But we cannot load the data.
    The error in the monitor is as follows:
    'Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.'
    But, unfortunately, the error message has no further information.
    If we look at the job log in SM37, the job finished with the following log:
    27.10.2009 12:14:19 Job started                                                                                00           516          S 
    27.10.2009 12:14:19 Step 001 started (program RSBATCH1, variant &0000000000119, user ID RXSAHA)                    00           550          S 
    27.10.2009 12:14:23 Start InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG                                              RSM1          797          S 
    27.10.2009 12:14:24 Element NOAUTHORITYCHECK is not available in the container                                     OL           356          S 
    27.10.2009 12:14:24 InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG created request REQU_4FMXSQ6TLSK5CYLXPBOGKF31G     RSM1          796          S 
    27.10.2009 12:14:24 Job finished                                                                                00           517          S 
    In a BW 3.10 system, there is no  message related to element NOAUTHORITYCHECK. So, I am wondering if this is something new in NW 7.0.
    Thanks in advance,
    Rajib

    Errors like this can have several causes:
    1. The RFC connection failed.
    2. A problem in the source system - check it.
    3. Check with the Oracle consultants whether they are filling up the loads; if so, tell them to stop.
    4. Check the IDoc processing.
    5. Memory issues.
    6. As a fix, pick up the DataSource, change it, then activate it again and run the load.
    Also check the RFC connection in SM59. If it is OK, then check SAP Note 692195 for authorization.
    Santosh

  • Data loads at Repository, but error in dashboard, System DSN

    Error in Dashboard UI:
    ORA-01017: invalid username/password; logon denied at OCI call OCISessionBegin. [nQSError: 17014] Could not connect to Oracle database. (HY000)
    While creating the System DSN with Oracle 11g, the error is as follows -
    Unable to connect
    SQLState=IM003
    Specified driver could not be loaded due to system error 127
    Any inputs would be appreciated.
    Thanks
    KSK

    hi
    1. tnsping the OBAW database directly from the cmd prompt or PuTTY.
    2. Set the ODBC.INI tns file on the Linux box.
    3. Set the tnsnames.ora file under the Oracle client on the Windows box.
    4. Try to view the data directly from the repository.
    5. Sometimes it is a network issue, so ask your DBA to bounce the database.
    6. Take the help of your DBA to investigate further from the database side.
    Hope it helps.
    Points, please.
    thanks
    rakesh

  • Data extract/convert from SAP system and load to Unix system - Urgent

    Hi,
    I have a requirement to extract data from SAP, where each data field has some conditions, and to map/load it to a legacy Unix system.
    There are four kinds of data objects in the HR system; I need to extract them individually and map them into the Unix system.
    Can anyone please suggest, step by step, how to go about it, for example:
    do we need to write a report for that?
    do we need to use Function modules? etc.,
    How to set mappings of that Data to a file?
    How to map that file to Unix system?
    These need to run on a daily basis, and after the first run only the changed data needs to be loaded. How do I do this?
    thanks
    Alankaar
    Points will be rewarded.

    Hi
    I can't say exactly, because I don't know the problem in detail; anyway, you have to do it yourself.
    You should create:
    - a document where you define the file format: data type, size and name of each field of the Unix system;
    - a document for the mapping: the link between the SAP fields and the Unix system fields.
    Based on these documents you'll create a structure in the report:
    DATA: BEGIN OF STR_UNIX_SYSTEM,
            FIELD1(20) TYPE C,    " field names and lengths are placeholders
            FIELD2(40) TYPE C,
            FIELDN(10) TYPE C,
          END OF STR_UNIX_SYSTEM.
    and the code to transfer the data from SAP to the legacy structure:
    MOVE <SAP TABLE>-FIELD TO STR_UNIX_SYSTEM-FIELD1.
    So all the steps have to be defined in the report before downloading the file.
    I don't know which legacy system you're speaking about; after creating the file, the legacy system should read it in order to upload the data.
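    For the file creation itself, a minimal sketch that writes the mapped records to a file on the Unix application server (the path is just an example):
    DATA: LV_FILE TYPE STRING VALUE '/usr/sap/trans/data/hr_extract.txt',  " example path
          LT_UNIX LIKE STANDARD TABLE OF STR_UNIX_SYSTEM.
    OPEN DATASET LV_FILE FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    LOOP AT LT_UNIX INTO STR_UNIX_SYSTEM.
      TRANSFER STR_UNIX_SYSTEM TO LV_FILE.   " one line per record, fields in the defined layout
    ENDLOOP.
    CLOSE DATASET LV_FILE.
    The legacy side can then pick the file up from that directory (or you can transfer it via FTP/NFS, depending on your landscape).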
    You can create a daily job for that report. If you need to transfer only the changed data, you should read the change document tables (CDHDR and CDPOS), but I don't know HR, so I'm not sure those tables are used for HR data.
    Max

  • Spread the Data Loads in a SAP BW System

    Gurus,
    I want to spread out the data loads in our BW system. As a Basis person, how do I identify whether the jobs are full loads or delta loads? Our goal is to distribute the load on the system evenly, as we see too many data loads starting and running around the same time. Can you suggest the right approach to achieve our goal?
    Thanks in advance
    Siva

    Hello Siva,
    As already mentioned, the solution is to include the different steps of the data flow (extraction, ODS activation, rollup etc.)
    in process chains and schedule these chains to run at different times, so that they do not place too much load on the system.
    If the problem is specific to the extraction step of the data loads, then I guess you are seeing the resource problem on the
    source system side. If you don't have load distribution switched on in the RFC connection to the source system, it is
    possible to specify that the source system extraction jobs are executed on a particular application server;
    please see the information in the 'Solution' part of note 147104 and read it carefully.
    Best Regards,
    Des

  • Data load into SAP ECC from Non SAP system

    Hi Experts,
    I am very new to BODS and I want to load historical data from a non-SAP source system into SAP R/3 tables like VBAK and VBAP using BODS. Can you please provide steps/documents or guidelines on how to achieve this?
    Regards,
    Monil

    Hi
    In order to load into SAP you have the following options:
    1. Use IDocs. There are several standard IDocs in ECC for specific objects (MATMAS for materials, DEBMAS for customers, etc.). You can generate and send IDocs as messages to the SAP target using BODS.
    2. Use LSMW programs to load into the SAP target. These programs require input files in specific layouts, generated using BODS.
    3. Direct input - writing ABAP programs targeting specific tables. This approach is very complex and hence needs a lot of thought.
    The OSS Notes supplied in previous messages are all excellent guidance to steer you in the right direction on the choice of load, etc.
    However, the data load into SAP needs to be object-specific. So targeting merely the sales tables will not help, as the sales document data held in the VBAK and VBAP tables you mentioned is related to articles. These tables hold sales document data for already created articles. So if you want to specifically target these tables, you may need to prepare an LSMW program for the purpose.
    To answer your question on whether it is possible to load objects like materials, customers, vendors etc. using BODS: yes, you can.
    Below is a list of standard IDocs that you can use to load into the SAP ECC system from a non-SAP system:
    Customer Master - DEBMAS
    Article Master - ARTMAS
    Material Master - MATMAS
    Vendor Master - CREMAS
    Purchase Info Records (PIR) - INFREC
    The list is endless.
    In order to achieve this, you will need to get the functional design consultants to provide the ETL mapping from the legacy data to the IDoc target schema and fields (better to have the technical table names and fields too). You should then prepare the data after putting it through the standard check table validations for each object, along with any business-specific conversion rules and validations. Having prepared this data, you can either generate flat file output for loading into SAP using LSMW programs, or generate IDoc messages to the target SAP system.
    If you are going to post IDocs directly into the SAP target using BODS, you will need to create a partner profile for BODS to send IDocs and define the IDocs you need as inbound IDocs. There are a few more settings, like RFC connectivity, authorizations etc., required for BODS to successfully send IDocs into the SAP target.
    Do let me know if you need more info on any specific queries or issues you may encounter.
    kind regards
    Raghu

  • Master Data Loading from APO--- BW System

    Hi Experts,
    I am loading master data from the APO system to the BW system. After loading the data into the BW system, I found that the number of data records does not match, for example:
    APO (8340) ---> BW (9445)
    I also tried to delete the data in the InfoObject, but only some records are deleted.
    And one more doubt:
    in APO I have the InfoObject ZA9MATNR, CHAR 40, with conversion routine PRODU;
    in BW I have the same InfoObject ZA9MATNR, CHAR 40, with conversion routine ALPHA.
    I have also ticked the check box in the transfer rules; maybe because of this I am getting more records.
    Could any body help me
    Thanks in Advance...
    Regards,
    Nagamani.

    Hi,
    Normally it is impossible to get an increased count of data records when you load into BW, unless you are using the RETURN TABLE concept.
    Usually the record count decreases, as records with the same keys get overwritten or added up.
    Coming to your second question, the ALPHA conversion routine has nothing to do with the record count. It pads the value coming into that InfoObject with leading zeros. For example, if the length of the object is 5 and the data comes in with only 3 digits, it will add leading zeros to make its length 5.
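    A small illustration of that behaviour, using the function module behind the ALPHA routine (the variable is just an example):
    DATA LV_VALUE(5) TYPE C.
    LV_VALUE = '123'.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = LV_VALUE
      IMPORTING
        output = LV_VALUE.
    " LV_VALUE is now '00123' - the value is only padded, no records are added or removed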
    Hope this info helps you.
    Regards,
    Yogesh.

  • Master data load from 2 source system

    Hi all,
            I am working on a project which has 2 source systems.
    Now I have to load master data from the 2 source systems without making 0LOGSYS a compounding
    attribute, because all the objects for which I would use the compounding attribute are reference
    objects for a lot of other InfoObjects. So I don't want to change the structure of the Business Content
    objects. Please guide me.
    Thanks

    Hi,
    In the cube there is nothing to do with the compounding attribute; it is handled at the master data InfoObject level. As your requirement is to separate the transactions that happen in the different source systems, add another InfoObject such as ZSOURCE, make it a navigational attribute, and then design your report based on that navigational attribute.
    Note that this separation is only possible for material-related transactions, not for others.
    Thanks
    BVR
