Data load delay after DB upgrade in source system

We upgraded our source system's database to Oracle 11g over the weekend. Since then, the first master data loads of our process chain are taking a huge amount of time, with these messages in the monitor:
Data Package 000001: sent, not arrived
Info IDoc 2 : sent, not arrived; Data passed to port OK
Info IDoc 1 : sent, not arrived; Data passed to port OK
Info IDoc 3 : sent, not arrived; Data passed to port OK
Info IDoc 4 : sent, not arrived; Data passed to port OK
Request IDoc : Application document posted
Processing (data packet): No data
Can anyone share ideas on this?
thanks and regards
Edited by: AAL123 on Sep 26, 2011 7:17 AM

Hi,
Check the SAP Notes below:
Note 580869 (RFC calls can be processed with report RSARFCEX) and Note 530997.
Regards,
Durgesh.

Similar Messages

  • Data Load Error after Client copy on Source System

    We did a client copy to our QAS source system. After it completed, I tried to run a couple of full data loads, but they fail. The only errors I see are on the source system when I try to manually update the IDocs. They are:
    EDI: Partner profile not available
    Entry in inbound table not found
    Is there a step I need to take after the client copy and before the loads? I have a feeling I'm forgetting something.
    Thanks,
    Chris

    Hi Chris,
    Check the following:
    1. In WE20, check whether you can see two logical systems under partner type LS: one for the source system and one for 'Myself'.
    2. If yes to <1>, check that in both logical systems the partner status on the classification tab is 'Active'.
    3. Check that you have used the remote user in the main screen and in the 'Post Processing: Permitted Agent' tab of the outbound and inbound parameters for each message type.
    4. For the outbound message types, check that the receiver port points to the correct RFC destination for both logical systems (source and 'Myself').
    If your answer to <1> is no, perform all of the steps.
    Hope this helps.
    Suneel

  • Missing data in ap_check_stocks after R12 upgrade

    Hello,
    We are currently on release 12.1.2, and I have an existing query from R11 which gets data from the ap_check_stocks table. However, I am not able to find any data in ap_check_stocks after the R12 upgrade. Has this table been discontinued? I don't find anything on Metalink that says so either.
    The query joins ap_check_stocks to ap_checks on ap_checks.check_stock_id = ap_check_stocks.check_stock_id. Has something changed in R12?
    Please help!
    Ramya

    You can find AP_CHECK_STOCKS synonym in R12 and the base object is AP.AP_CHECK_STOCKS_ALL table.
    Please refer to eTRM for details.
    SYNONYM: APPS.AP_CHECK_STOCKS
    http://etrm.oracle.com/pls/et1211d9/etrm_pnav.show_object?c_name=AP_CHECK_STOCKS&c_owner=APPS&c_type=SYNONYM
    TABLE: AP.AP_CHECK_STOCKS_ALL
    http://etrm.oracle.com/pls/et1211d9/etrm_pnav.show_object?c_name=AP_CHECK_STOCKS_ALL&c_owner=AP&c_type=TABLE
    Thanks,
    Hussein
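    To illustrate, here is a minimal sketch of the R12 join, using Python with an in-memory SQLite database standing in for Oracle; the tables and the join condition follow the thread, but every column other than check_stock_id is an assumption for illustration only:

```python
import sqlite3

# Simulate the R12 base tables locally (SQLite stands in for Oracle;
# columns other than check_stock_id are illustrative, not the real layout).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE ap_check_stocks_all (check_stock_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE ap_checks_all (check_id INTEGER, check_stock_id INTEGER, check_number TEXT)")
cur.execute("INSERT INTO ap_check_stocks_all VALUES (1, 'Primary Stock')")
cur.execute("INSERT INTO ap_checks_all VALUES (10, 1, 'CHK-0001')")

# Same join condition as the R11 query, now written against the _ALL
# base tables (in a real R12 instance you would normally query the APPS
# synonyms instead, which apply the org context).
rows = cur.execute("""
    SELECT c.check_number, cs.name
      FROM ap_checks_all c
      JOIN ap_check_stocks_all cs
        ON c.check_stock_id = cs.check_stock_id
""").fetchall()
```

    In a real R12 environment you would also need the multi-org access control (org context) set before querying the synonyms.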

  • Deletion of Data from Infoprovider  after upgradation of source system

    HI All,
    In our project the source system was upgraded from 4.7 to ECC 6.0.
    I have kept the same logical system name and changed the RFC connection so that it points to the new source system.
    Do I now need to delete all data before loading from the source system, or will it be fine since the source system name is the same?
    If I do need to delete data, can you help me with how to delete master data, especially hierarchy data? Do we need to delete the hierarchies and recreate them?
    Please help me resolve this issue.

    Hi,
    It depends on your requirements whether you have to keep the history or not.
    If you delete the data from your InfoProviders, you have to reload everything from scratch.
    If you want to delete all the data:
    1. First delete the data in the data targets (cubes, DSOs/ODS objects), then
    2. delete the master data.
    Use these programs (run them from SE38):
    -> RSDRD_DELETE_FACTS - for cubes and DSOs/ODS objects
    -> RSDMD_DEL_BACKGROUND - for master data, in the background
    -> RSDMD_DEL_MASTER_DATA_TEXTS - for master data, immediately
    Check your master data InfoObjects in the manage screen after the deletion; if entries remain, go to SE14 and delete each table individually (attributes, texts, SIDs, hierarchies).
    Hope this helps.
    That said, I don't recommend deleting - try full repair loads for all your loads instead.

  • Error during data load due to special characters in source data

    Hi Experts,
    We are trying to load Billing data into the BW using the billing item datasource. It seems that there are some special characters in the source data. When the record with these characters is encountered, the request turns red and the package is not loaded even into the PSA. The error we get in the monitor is something like
    'RECORD 5028: Contents from field ****  cannot be converted into type CURR',
    where the field **** is a key figure of type currency. We managed to identify the said record in RSA3 on the source system and found that one of the fields contains some invalid (special) characters that show up as squares in RSA3. The data in the rest of the fields, including the fields mentioned in the error  looks correct.
    Our source system is a non-Unicode system whereas the BW system is Unicode-enabled. I figure that the data in the rest of the fields is getting misaligned due to the presence of the invalid characters in the above field. This was confirmed when we unassigned the field with the special characters from the transfer rules and removed the source field from the transfer structure. After doing this the data was loaded successfully and the request turned green.
    Can anyone suggest a way to either filter out such invalid characters from the source data or make some settings in the BW systems such that the special characters are no longer invalid in the BW system? We cannot write code in the transfer rules because the data package does not even come into the PSA. Is there any other method to solve this problem?
    Regards,
    Ted

    Hi Ted,
    I was wondering: whether the system is Unicode or non-Unicode should not matter for the amount and currency fields, as currencies are defined by SAP and a currency code is a plain three-character value.
    Could this be caused by some inconsistency in the data?
    I would like to know which currency appears in that particular record, and whether the special characters are in that field.
    Hope that helps.
    Regards
    Mr Kapadia
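    As a rough illustration of the kind of cleansing you could apply on the source side before the transfer (in practice this would be an ABAP user exit or transfer routine; Python is used here only as a sketch, and the record and field names are invented), simply dropping non-printable characters from each field:

```python
def clean_field(value: str) -> str:
    """Drop control and other non-printable characters of the kind that
    show up as squares in RSA3 and misalign fields during the
    non-Unicode -> Unicode transfer. Note this also strips legitimate
    non-ASCII text, so the allowed range may need widening."""
    return "".join(ch for ch in value if 32 <= ord(ch) < 127)

# Hypothetical extracted record: the text field contains control bytes.
record = {"doc_no": "0090001234", "text": "ACME\x00\x1f Corp", "amount": "120.50"}
cleaned = {field: clean_field(value) for field, value in record.items()}
```

    The same filter applied in the extractor would let the package reach the PSA, after which normal transfer rule logic can take over.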

  • Problems with loading mailbox after GW8 upgrade

    I just recently upgraded to GroupWise 8 in a school district. I support many school districts, and this is the first time I've had this problem. Users are complaining that from outside the network (home access), when they log in to WebAccess, the mailbox just hangs at "Loading...."
    The toolbars, menus, and folder list all display properly, but it hangs trying to load the message list for the mailbox. It works fine on the internal LAN; it only happens from home. And the problem is not constant - sometimes it works, sometimes it doesn't. I've had users complain about the issue from the entire spectrum of web browsers, from IE 6, 7, and 8 to Safari and Firefox. I've had users clear their cache/temporary internet files and they still have the problem.
    I do see a Java error on the logger screen; I'm not sure if it's related:
    com.novell.webaccess.providers.gwap.GWAPException: Invalid Parameter
    The only reference I could find to the above error on these forums(or anywhere for that matter) was in the following thread:
    http://forums.novell.com/novell-prod...unloading.html
    Which doesn't describe the problem I am having....
    Has anyone else seen this problem or have an idea what to check?
    Thanks in advance for any help
    -Rich

    John Cunningham wrote:
    > I'm getting exactly the same error after my upgrade from 7.0.3 to 8.0.1.
    > WebAccess 'kinda' works: you can log into it but there are java (Tomcat,
    > I presume) errors on the console:
    >
    > Novell GroupWise WebAccess
    > Version 8.0.1
    > (C) Copyright 1993-2009 Novell, Inc. All rights reserved.
    >
    >
    > <GroupWise WebAccess> WebAccess Servlet is ready for work
    > Sep 4, 2009 6:45:49 PM org.apache.jk.common.ChannelSocket init
    > INFO: JK: ajp13 listening on /0.0.0.0:9010
    > Sep 4, 2009 6:45:49 PM org.apache.jk.server.JkMain start
    > INFO: Jk running ID=0 time=0/85 config=null
    > com.novell.webaccess.providers.gwap.GWAPException
    >     at com.novell.webaccess.providers.gwap.XGWAP.xlatAvailableDictionaries(Unknown Source)
    >     at com.novell.webaccess.providers.gwap.XGWAP.xlatMessage(Unknown Source)
    >     at com.novell.webaccess.providers.gwap.XGWAP.itemRead(Unknown Source)
    >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    >     at java.lang.reflect.Method.invoke(Method.java:324)
    >     at com.novell.webaccess.providers.gwap.XGWAP.dispatchServiceHandler(Unknown Source)
    >
    >
    > . . . you get the picture.
    >
    > I've re-installed the WebAccess Application from scratch, no diff.
    >
    > Users can log into it, but when they open a message they get a dialog
    > box "Unable to process request. Please contact your system
    > administrator." Then it displays the email message but it does not mark
    > the messages as read in the main browser window.
    >
    > Besides that, Tomcat has crashed on me and hung completely, requiring me
    > to reboot the server.
    >
    > I've spent a lot of hours trying to get this resolved.
    >
    > The WebAccess agent is on another (clustered) node. The WebAccess
    > Application is installed on a single NW65SP7 server (i.e. not clustered).
    >
    > Any suggestions would be greatly appreciated.

  • Do we need to re-create data load rules if we upgrade from EBS 11 to 12?

    If so, please explain the reason. Thanks.

    If you upgrade from EBS 11 to EBS 12, you need to create a new source system registration in ERPi.
    Once the new source system is created, you then need to initialize the source system in ERPi.
    From there you need to associate it with an import format and locations in ERPi, and your data load rules are then based on the location.

  • Data loading delay

    Hi Friends,
    I'd like an answer for one issue.
    The issue is: every day I load to one InfoCube; whichever cube it is, every load takes 2 hours, but once it took 5 hours. What might be the reason? I'm just confused by this - can anybody clarify?
    Regards.,
    Balaji Reddy K.

    Reddy,
    1. Is the time taken to load to the PSA, or to load from the PSA to the cube? If it is the load to the PSA, then usually the problem lies at the extractor.
    2. If it is the load to the cube, check whether statistics are being maintained for the cube; they give an accurate picture of where the data load is spending most of its time.
    Do an SQL trace during the data load. If you find a lot of master data lookups, make sure that master data is loaded first; and if there are a lot of lookups to table NRIV, check whether number range buffering is on so that dimension IDs get generated faster.
    Check whether the data load is faster if you drop any existing indexes.
    Are you loading any aggregates after the data load? Check whether the aggregates are necessary and whether they have been enabled for delta loads.
    If you have active indexes and there is a huge data load, the load can get delayed, depending on the index.
    If the cube is not compressed, the data load can sometimes get delayed as well.
    Also, while the data load is running, check SM50 and SM37 to see whether the jobs are active - this confirms that the load is active on both sides.
    Always update the statistics for the cube before and after the load; this helps in deciphering how long the data load takes. After activating the statistics, check table RSDDSTAT or the standard reports that are part of the BW technical content.
    Hope it helps.
    Arun

  • Is data restored automatically after Mavericks upgrade

    Is data restored automatically after the Mavericks upgrade?

    Welcome to Apple Support Communities
    The Mavericks upgrade doesn't delete your files, so there is nothing to restore after upgrading to OS X Mavericks.
    However, you should make a backup of your files with Time Machine before upgrading to OS X Mavericks > http://support.apple.com/kb/HT1427 If the installation fails, you will lose your files unless you have a backup. Also, check that your apps are compatible > http://www.roaringapps.com

  • Plug-Ins and Extractors - Related to Upgrade of Source system

    Hi,
    We are upgrading from R/3 to ECC.
    Can I say that if the plug-in in the current R/3 system and in the new (post-upgrade) ECC system is the same, then the extractors will also remain the same, and hence nothing will be affected in BW?
    For example, if the current R/3 system has plug-in PI 2004.1 and the upgraded ECC system also has plug-in PI 2004.1, can I say that the extractors will be the same?
    Regards,
    Rekha .

    You have to make sure all your deltas are picked up into BW before the upgrade.
    Sometimes you may be required to replicate the DataSources after the upgrade.
    You may lose the connection between your R/3 and your BW.
    Please do a source system check after the upgrade.
    You have to take care of all deltas:
    As a standard practice, we drain the delta queues by running the InfoPackage/chain multiple times.
    As a prerequisite, we cancel/reschedule the V3 jobs to a future date during this activity.
    The V3 extraction delta queues must be emptied prior to the upgrade to avoid any possible data loss.
    V3 collector jobs should be suspended for the duration of the upgrade.
    They can be rescheduled after reactivation of the source systems upon completion of the upgrade.
    See SAP Notes 506694 and 658992 for more details.
    Load and empty all data mart delta queues in SAP BW (e.g. for all export DataSources).
    The SAP BW Service API, which is used for internal and BW-to-BW data mart extraction, is upgraded during the SAP BW upgrade. Therefore, the delta queues must be emptied prior to the upgrade to avoid any possibility of data loss.
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/472443f2-0c01-0010-20ab-fbd380d45881/message/3221895#3221895 [original link is broken]
    OSS Notes 328181 and 762951 are prerequisites.
    http://wiki.ittoolbox.com/index.php/Upgrade_BW_to_Netweaver_2004s_from_v3.0B

  • Data loading to Infocube from SAP R/3 System

    Hello Friends,
    I started a load in BW four days ago in the background. Initially I could see the load running in transaction SM50. The load has 28 data packets; out of 28, five data packets have successfully updated data to the cube, while the remaining data packets are not through yet and are still yellow. Now there is no job running in the background in SM50, and I didn't find any runtime error in ST22. The source system is SAP R/3, and the data is related to employee HR positions.
    Could anyone please throw some light on this? Thanks in advance - your response will be appreciated.
    Regards,
    Sreekanth

    Hi,
    For this you have to check the LUW entries in SM58 on the R/3 side; if there are entries there, your load won't be successful.
    Also check all of the following:
    1. In the extraction details, has your data selection ended?
    2. After that, check SM58 again - there should be no entries there.
    3. If your load is still yellow, you can manually change the status to red, then go to each data packet, right-click, and select manual update; this takes some time to turn green.
    Do the same for all yellow packets. After all packets are green, you can change the status from red back to the technical status.
    Regards,

  • Data load into SAP ECC from Non SAP system

    Hi Experts,
    I am very new to BODS, and I want to load historical data from a non-SAP source system into SAP R/3 tables like VBAK and VBAP using BODS. Can you please provide steps/documents or guidelines on how to achieve this?
    Regards,
    Monil

    Hi
    In order to load into SAP you have the following options
    1. Use IDocs. There are several standard IDocs in ECC for specific objects (MATMAS for materials, DEBMAS for customers, etc.). You can generate and send IDocs as messages to the SAP target using BODS.
    2. Use LSMW programs to load into the SAP target. These programs require input files in specific layouts, which you can generate with BODS.
    3. Direct input - write ABAP programs targeting specific tables. This approach is very complex and hence needs a lot of thought.
    The OSS Notes supplied in previous messages are all excellent guidance to steer you in the right direction on the choice of load, etc.
    However, the data load into SAP needs to be object specific. Merely targeting the sales tables will not help, as the sales document data held in the VBAK and VBAP tables you mentioned is related to articles: these tables hold sales document data for already created articles. So if you want to specifically target these tables, you may need to prepare an LSMW program for the purpose.
    To answer your question on whether it is possible to load objects like Materials, customers, vendors etc using BODS, it is yes you can.
    Below is a standard list of IDocs that you can use for this purpose to load into SAP ECC system from a non SAP system.
    Customer Master - DEBMAS
    Article Master - ARTMAS
    Material Master - MATMAS
    Vendor Master - CREMAS
    Purchase Info Records (PIR) - INFREC
    The list is endless.........
    In order to achieve this, you will need the functional design consultants to provide ETL mappings from the legacy data to the IDoc target schema and fields (better to have the technical table names and fields too). You should then prepare the data after putting it through the standard check-table validations for each object, along with any business-specific conversion rules and validations. Having prepared this data, you can either generate flat-file output for load into SAP using LSMW programs, or generate IDoc messages to the target SAP system.
    If you are going to post IDocs directly into the SAP target using BODS, you will need to create a partner profile for BODS to send IDocs and define the IDocs you need as inbound IDocs. There are a few more settings, such as RFC connectivity and authorizations, for BODS to successfully send IDocs into the SAP target.
    Do let me know if you need more info on any specific queries or issues you may encounter.
    kind regards
    Raghu
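    As a sketch of the mapping step Raghu describes (Python for illustration; E1KNA1M is the real KNA1 data segment of a DEBMAS IDoc, but the legacy field names and the truncation/padding rules shown are assumptions to be confirmed against the actual segment documentation in WE60):

```python
def to_debmas_segment(legacy: dict) -> dict:
    """Map one legacy customer record to a simplified E1KNA1M-style
    segment dict. Field lengths and conversions are illustrative."""
    return {
        "E1KNA1M": {
            "KUNNR": legacy["customer_id"].zfill(10),  # numeric keys zero-padded to 10
            "NAME1": legacy["name"][:35],              # assume CHAR 35 - truncate
            "LAND1": legacy["country"].upper(),        # country code, upper case
            "ORT01": legacy["city"][:35],
        }
    }

segment = to_debmas_segment(
    {"customer_id": "1234", "name": "ACME Corp", "country": "us", "city": "Boston"}
)
```

    The real ETL mapping would cover many more segments and fields per object; the point is that each legacy field goes through conversion and validation before the IDoc is generated.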

  • Key Figure data type/length differs in BI and source system

    Hi all.
    It is a strange question, but we need an explanation for our "strange" client. Why do many fields which are DEC 13 in the source system become CURR 09 in BI? The DEC-to-CURR type switch seems reasonable, but not the length - I would guess that the shorter a numeric field is, the less precision it can hold.
    Does SAP give any explanation for that?

    Hi,
    If you are expecting the data to be aggregated in the cube and the data is loaded by different requests, the data will not be aggregated.
    Example:
    CHAR1  CHAR2  CHAR3  CHAR4  CHAR5  CHAR6  CHAR7  KF1  KF2  KF3
    A      B      C      D      E      F      G      10   20   30
    A      B      C      D      E      F      G      10   20   30
    If the above two rows are loaded in the same request, you can expect the data to be aggregated, and the output of the KFs will be 20, 40 and 60 respectively.
    If the loading is done with separate requests, your data will not be aggregated.
    This is because the request ID (under the data package dimension) is one of the dimensions in the cube, and it differs when the requests are different.
    Regards
    Sunil
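    The request-ID behaviour described above can be sketched in a few lines of Python (a toy model only - the request ID is part of the aggregation key, which is not how BW physically stores the data):

```python
from collections import defaultdict

def load_to_cube(rows):
    """Toy InfoCube: rows with identical characteristics AND the same
    request ID are aggregated; rows from different requests stay
    separate, because the request ID belongs to the data package
    dimension and is therefore part of the key."""
    cube = defaultdict(lambda: [0, 0, 0])
    for request_id, chars, key_figures in rows:
        bucket = cube[(request_id, chars)]
        for i, value in enumerate(key_figures):
            bucket[i] += value
    return dict(cube)

chars = ("A", "B", "C", "D", "E", "F", "G")
# Same request: the two identical rows collapse into one aggregated row.
same_request = load_to_cube([(1, chars, (10, 20, 30)), (1, chars, (10, 20, 30))])
# Two requests: the identical rows stay as two separate cube entries.
two_requests = load_to_cube([(1, chars, (10, 20, 30)), (2, chars, (10, 20, 30))])
```

    Compressing the cube (which collapses the request dimension) would merge the two rows in the second case.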

  • Patch upgrade in Source system ECC 6

    Hi,
    We are planning a patch upgrade of the ECC 6 R/3 system. This is the source system for BI 7 (NW2004s).
    As BI consultants, we want to know:
    1. What do we need to check in ECC after the patch upgrade?
    2. What precautions do we need to take to better facilitate the patch upgrade in ECC 6?
    3. Any other related points.
    Can someone please reply?

    Hi
    To check the patch level in the BI system:
    Log on to your SAP system.
    Choose System > Status, then click the search button under 'SAP System data'.
    There you can find your patch level.
    BI Content and Plug-in Information
    These are the links to get detailed information about BI Content and Plug-in.
    Check the release notes in this link:
    http://help.sap.com/saphelp_nw70/helpdata/en/bc/52a4373f6341778dc0fdee53d224c8/frameset.htm
    Dependencies of BI_CONT Add-Ons: Functional Correspondences
    https://websmp207.sap-ag.de/~sapdownload/011000358700007362962004E/func_corresp.htm
    Dependencies of BI_CONT Add-Ons: Technical Prerequisites:
    https://websmp207.sap-ag.de/~sapdownload/011000358700007362972004E/tech_prereq.htm
    BI Extractors and Plug-in Information:
    http://service.sap.com/~form/sapnet?_SHORTKEY=01100035870000682135&
    http://sap.seo-gym.com/copa.pdf
    http://help.sap.com/saphelp_nw04s/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm
    /thread/178564 [original link is broken]
    Asset Mgt Data
    http://gnanamax.com/Documents/BI%207.0%20Functional%20Upgrade%20-%20Migration%20points.pdf
    /thread/507115 [original link is broken]
    In the SAP Logon pad, at the top left there is a small menu option beside the logon GUI title (e.g. SAP Logon 710 or 640). Click it and select 'About SAP Logon'; there you can see the patch level, build, file name, file version, etc.
    Hope it helps
    regards
    gaurav

  • Master Data cleansing and transformation from non-SAP source systems

    Hi all,
    Our client (media) wants to cleanse and transform their master data from a non-SAP source system to be uploaded into BW (no R/3 yet). If anybody has a document on this topic that I could use, I would appreciate it if you sent it to me.
    thanks.

    Hi,
    https://websmp203.sap-ag.de/~sapidb/011000358700001965262003
    https://websmp203.sap-ag.de/~sapidb/011000358700006591612001
    https://websmp203.sap-ag.de/~sapidb/011000358700001971392004
    https://websmp203.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000471477&_OBJECT=011000358700008927932002E
    /manfred
