IDoc adapter issue: some metadata loads, some doesn't

Hey all,
I just started trying to set up the IDoc adapter connection between our R/3 system and our XI system. I decided to start by pulling in some metadata via IDX2. The weird thing is that I get some IDoc structures successfully, but not others. I think it has to do with some requirements that I am missing on the R/3 side. For instance, outside of setting up the RFC connection, creating the tRFC port and the partner profile for the logical system, what else is needed? Do I have to assign the basic type to the outbound partner profile to access it? Do I have to actually create IDocs for the basic type? Just curious. Here are the details of what I have done so far. A lot of this I pulled from other people's postings here on the forum; I think I am just missing a few steps!
Our current configuration is as follows:
SAP R/3 Enterprise posting IDocs to XI.
- R/3 Configuration:
Logical system created: DXICLNT200
WE20: partner type LS, partner number DXICLNT200.
WE21: IDoc ports, transactional RFC, port name DXICLNT200, IDoc record type SAP Release 4.x, RFC destination DXICLNT200.
SM59: created RFC destination DXICLNT200. Target host: the XI box. Logon security: client 200, user XIAPPLUSER, password (for XIAPPLUSER). Testing the connection works.
- XIHOST Configuration:
SM59: created RFC destination DDSCLNT210. Gateway options used; no client or user specified. Are these needed, and if so, is there a particular user to use?
Test connection works.
IDX1: created port DDSCLNT210, client 210, RFC destination DDSCLNT210.
IDX2: able to load metadata for IDoc type CHRMAS01 under port DDSCLNT210, but NOT able to import CREMAS03 or MATMAS03!
Any help is greatly appreciated!
Thanks,
Chris

There was a problem in the RFC connection: the target host was not populated correctly. All metadata can now be pulled.
However, I am curious where the metadata for CHRMAS and CLSMAS came from. The RFC connection was NOT working, so where could it have pulled this information from?
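In case someone else runs into the same symptom: the IDX2 metadata load stands or falls with the RFC destination maintained against the IDX1 port. Below is a minimal sketch of how you could verify, from outside SAP, that the host and credentials behind that destination actually reach the backend and can serve IDoc metadata. It assumes the pyrfc library is installed; host, system number, client and user are placeholders, and the function module IDOCTYPE_READ_COMPLETE with its PI_IDOCTYP parameter is an assumption you should verify in SE37 on your release.

    from pyrfc import Connection

    # Use the same values as the XI-side RFC destination (placeholders here).
    conn = Connection(
        ashost="r3-host.example.com",  # the target host that was wrong in this case
        sysnr="00",
        client="210",
        user="RFCUSER",
        passwd="secret",
    )

    # Basic ping, roughly what the SM59 "Test connection" button exercises.
    print(conn.call("STFC_CONNECTION", REQUTEXT="ping")["ECHOTEXT"])

    # Try to read one basic type's definition; if this fails, IDX2 will fail too.
    # Function module name and parameter names are assumptions -- check SE37.
    meta = conn.call("IDOCTYPE_READ_COMPLETE", PI_IDOCTYP="MATMAS03")
    print(len(meta.get("PT_SEGMENTS", [])), "segments returned")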

Similar Messages

  • Invalidate Adaptive Web Service Meta Data Cache?

    As the subject says: how can I invalidate the Adaptive Web Service metadata cache? Currently, the only way we can invalidate the cache is to restart the app server, which takes quite a while.
    I've tried restarting the Web Dynpro Runtime service, but this didn't do anything.
    We're running NW 7.0.

    Hi,
    Please consider the following:
    For JCO Models running on NW 7.2 you go to:
    1) NWA
    2) Availability and Performance Management --> Resource Monitoring
    3) JCO Monitoring
    4) Metadata Cache
    5) Click on Clear for the desired cache region.
    For the Web Services cache, please follow the instructions in the notes below:
    Note 1088382 - WSIL results are cached after search
    Note 1123574 - Caching functionality of the Web service connector
    Cheers,
    Ivan

  • IDoc meta data load problem in IDX2

    I have been trying to load metadata for some IDocs and am getting the message 'Basic type <idoc_type> is unknown'. Interesting to note is that I could load the ECMMAS01 IDoc type, but for other IDoc types like MATMAS01..04, ORDERS01..04, DOCUMENT_LOAD01 and many others that I tried, I got the same message. The config has been done in IDX1 and IDX2 (that's how I got ECMMAS01 in).
    There is an OSS note, 0000751839 'IDoc Adapter: Basic type 'XYZ' is unknown'. It seems we are already on that patch level; I can't verify whether the note has really been applied, due to authorization issues for now. The note also mentions a workaround of setting the category level to '0' (i.e. blank) in the Integration Server configuration, but that doesn't seem to help either.
    Anyone else had this problem and resolved it?
    Harshad

    Problem stands resolved. I realized that the RFC destination in XI for the R/3 system had a problem. What is still not answered is how I was able to import ECMMAS01. One explanation could be that the RFC connection was OK at that time. Don't know, but I am happy that it works now.

  • Receiver Idoc adapter - issue

    Hi,
    I need to send an IDoc (CRMXIF_ORDER_SAVE_M01) to SAP CRM. The trouble occurs when I want to send only the segments that apply to release 620.
    To do that, I have specified the value '620' in the Segment Version field of my receiver IDoc adapter.
    As soon as I do this, my IDocs fail in the adapter with the following error:
    - <!--  Call Adapter
      -->
    - <SAP:Error xmlns:SAP="http://sap.com/xi/XI/Message/30" xmlns:SOAP="http://schemas.xmlsoap.org/soap/envelope/" SOAP:mustUnderstand="">
      <SAP:Category>XIAdapter</SAP:Category>
      <SAP:Code area="IDOC_ADAPTER">ATTRIBUTE_IDOC_METADATA</SAP:Code>
      <SAP:P1>Segment 'E101CRMXIF_ACTIVITY_X', segmentnumber '000001' not correct in structure CRMXIF_ORDER_SAVE_M01</SAP:P1>
      <SAP:P2 />
      <SAP:P3 />
      <SAP:P4 />
      <SAP:AdditionalText />
      <SAP:ApplicationFaultMessage namespace="" />
      <SAP:Stack>Error: Segment 'E101CRMXIF_ACTIVITY_X', segmentnumber '000001' not correct in structure CRMXIF_ORDER_SAVE_M01</SAP:Stack>
      <SAP:Retry>M</SAP:Retry>
      </SAP:Error>
    The segment specified above corresponds to the very first sub-segment of the main segment (BUSTRANS). If I remove that segment, the error jumps to the next segment (which would then be the first sub-segment).
    It should be noted that I have no trouble sending IDocs if I either do not specify a segment version (meaning the latest version is automatically used) or explicitly request the latest by entering the latest segment release (700 in my case).
    <Trace level="1" type="T">Convert one IDoc</Trace>
      <Trace level="2" type="T">Convert Control Record</Trace>
      <Trace level="2" type="T">Convert Data Records</Trace>
      <Trace level="3" type="T">Segment= E101CRMXIF_BUSTRANS</Trace>
      <Trace level="2" type="T">Ignore unknown Segment E101CRMXIF_BUSTRANS</Trace>
      <Trace level="2" type="T">Ignore segment field: APPL_SNAME</Trace>
      <Trace level="2" type="T">Ignore segment field: OBJECT_TASK</Trace>
      <Trace level="2" type="T">Ignore segment field: OBJECT_GUID</Trace>
      <Trace level="2" type="T">Ignore segment field: OBJECT_ID</Trace>
      <Trace level="2" type="T">Ignore segment field: PROCESS_TYPE</Trace>
      <Trace level="2" type="T">Ignore segment field: OBJECT_TYPE</Trace>
      <Trace level="2" type="T">Ignore segment field: POSTING_DATE</Trace>
      <Trace level="2" type="T">Ignore segment field: DESCR_LANGUAGE</Trace>
      <Trace level="2" type="T">Ignore segment field: DESCR_LANGUAGE_ISO</Trace>
      <Trace level="2" type="T">Ignore segment field: LOGICAL_SYSTEM</Trace>
      <Trace level="2" type="T">Ignore segment field: CRM_RELEASE</Trace>
      <Trace level="2" type="T">Ignore segment field: CLIENT</Trace>
      <Trace level="2" type="T">Ignore segment field: CREATED_AT</Trace>
      <Trace level="2" type="T">Ignore segment field: CREATED_BY</Trace>
      <Trace level="2" type="T">Ignore segment field: CHANGED_AT</Trace>
      <Trace level="2" type="T">Ignore segment field: CHANGED_BY</Trace>
      <Trace level="2" type="T">Ignore segment field: LOCAL_TIMEZONE</Trace>
      <Trace level="2" type="T">Ignore segment field: OBJECT_ID_OK</Trace>
      <Trace level="2" type="T">Ignore segment field: HIGHEST_ITEM_NO</Trace>
      <Trace level="2" type="T">Ignore segment field: CRM_CHANGED_AT</Trace>
      <Trace level="2" type="T">Ignore segment field: CALC_SCHEMA</Trace>
      <Trace level="2" type="T">Ignore segment field: SCENARIO</Trace>
      <Trace level="2" type="T">Ignore segment field: VALID_FROM_EXT</Trace>
      <Trace level="2" type="T">Ignore segment field: VERIFY_DATE</Trace>
      <Trace level="3" type="T">Segment= E101CRMXIF_ACTIVITY_X</Trace>
      <Trace level="3" type="T">Segment= E101CRMXIF_ACTIVITY</Trace>
      <Trace level="3" type="T">Segment= E101CRMXIF_ADDRESS</Trace>
      <Trace level="3" type="T">Segment= E103CRMXIF_ADDRESS</Trace>
      <Trace level="3" type="T">Segment= E107CRMXIF_ADDRESS</Trace>
      <Trace level="2" type="T">Ignore unknown Segment E107CRMXIF_ADDRESS</Trace>
      <Trace level="3" type="T">Segment= E101CRMXIF_ADDRESS_F</Trace>
      <Trace level="3" type="T">Segment= E101CRMXIF_ACT_REASON</Trace>
      <Trace level="3" type="T">Segment= E101CRMXIF_ACT_REASON_F</Trace>
    I hope someone can help me so that I can successfully limit the segment version used when sending IDocs.
    Best regards,
    Daniel

    Hi,
    Thanks for your quick reply.
    The funny thing is that even if I try to post the exact same IDoc (except for the EDI part) that I received from CRM, it still fails in XI with the error mentioned before.
    But to answer your questions:
    1. I'm not using Java mapping but traditional message mapping (the metadata is up to date and the structure is newly loaded).
    2. No blank segments are being created.
    3. All segments have value 1 or blank (blank works fine for me when I'm not trying to limit the segment version, but using 1 makes no difference).
    Best Regards,
    Daniel

  • Issues with ondemand Data loader

    Hello,
    We are facing 2 issues with on demand data loader.
    Issue 1
    While inserting 'Contacts' and 'Assets', if the 'Account' information is wrong, the records are created without accounts even though 'Account' is a required field.
    Issue 2
    While inserting records, the data loader is not checking for duplicates, so duplicate records are getting created.
    Kindly advise if anyone has come across similar issues. Thanks
    Dipu
    Edited by: user11097775 on Jun 20, 2011 11:46 PM

  • Log Issue in HFM data load

    Hi,
    I'm new to Oracle Data Integrator.
    I have an issue with the log file name. I'm loading data into Hyperion Financial Management through ODI. In the interface, when we select the IKM SQL to HFM Data, we have an option to enable a log file. I set it to true and gave the log file name as 'HFM_dataload.log'. After executing the interface, when I navigate to that log folder and view the log file, the file is blank; instead, a new file 'HFM_dataloadHFM6064992926974374087.log' is created and the log details are written to it. Since I have to automate the process of picking up the everyday log file,
    * I need the log details to be written to the specified log name, i.e. 'HFM_dataload.log'.
    Also, I was not able to perform any action on that newly generated log file (copy it to another file or send it by mail), since I cannot predict the numbers generated along with the specified log file name.
    Kindly help me to overcome this issue.
    Thanks in advance.
    Edited by: user13754156 on Jun 27, 2011 5:08 AM
    Edited by: user13754156 on Jun 27, 2011 5:09 AM

    Thanks a lot for the idea.
    I am wondering about HFM data loads: in the ODI Operator, a warning symbol is shown when a few records get rejected, instead of an error. Is it possible to make the step fail if one or more records get rejected?
    I have experience with Essbase data loads, where the Operator step fails once a specified number of rejected records is reached.
    Please guide me if I am missing something.
    Regards,
    PrakashV
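    On the original question about the unpredictable numeric suffix: one possible workaround is not to predict it at all, but to pick up the most recently modified log that starts with the fixed prefix and copy it to the stable name the downstream automation expects. A minimal sketch, plain Python with the standard library only; the folder path is a placeholder:

        import glob
        import os
        import shutil

        log_dir = "/odi/logs"  # placeholder: the folder where the IKM writes its logs
        pattern = os.path.join(log_dir, "HFM_dataloadHFM*.log")  # generated name pattern

        candidates = glob.glob(pattern)
        if candidates:
            # Assume the most recently modified file is today's load log.
            newest = max(candidates, key=os.path.getmtime)
            shutil.copyfile(newest, os.path.join(log_dir, "HFM_dataload.log"))
            print("Picked up", newest)
        else:
            print("No generated HFM data load log found")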

  • Issue in master data loading

    Dear Experts
    When loading hierarchy data from CRM for some of my master data, I am facing the problem below.
    Every day I load my master data through process chains, but for some hierarchies the changes are not getting displayed.
    When checked in the InfoPackage monitor it is green, but when I double-click and check the details, the overall status is red, and underneath that the third step, TRANSFER (IDOCS AND TRFC), is also showing red.
    If I then load manually, everything becomes green and I am able to see the changes. What could be the possible reason and how can I solve it?
    Thanks and regards
    Neel

    Dear Neel,
    Thanks for the info. We have checked the option of activating the hierarchy in the InfoPackage itself, but even so the load shouldn't be red in overall status.
    Thanks and regards
    Neel

  • Performance issues with Planning data load & Agg in 11.1.2.3.500

    We recently upgraded from 11.1.1.3 to 11.1.2.3. Post upgrade, we face performance issues with one of our Planning jobs (e.g. Job E). It takes 3x the time to complete in our new environment (11.1.2.3) compared to the old one (11.1.1.3). This job loads the actual data and then does the aggregation. The pattern we noticed is: if we run a restructure on the application and execute this job immediately, it completes in the same time as on 11.1.1.3. However, in current production (11.1.1.3) the job runs in the sequence Job A -> Job B -> Job C -> Job D -> Job E and completes on time, but if we do the same test in 11.1.2.3 in the above sequence it takes 3x the time. We don't have a window to restructure the application before running Job E every time in Prod. The specs of the new environment are much higher than those of the old one.
    We have Essbase clustering (MS active/passive) in the new environment and the files are stored on the SAN drive. Could this be the cause? Has anyone faced performance issues in a clustered environment?

    Do you have exactly the same Essbase config settings and calculations performing the AGG? Remember, something very small like UPDATECALC ON/OFF can make a BIG difference in timing.

  • DSO data Load issue

    Hi all,
    I have an issue with a DSO data load. While loading, the data comes into the PSA perfectly (238 records arrive), but when I trigger the DTP, I am getting only 6 records.
    Can anyone please suggest what might be wrong?
    Thanks,
    Gayatri.

    Hi Gayatri,
    If you have already loaded some data to the DSO and are now trying to do a delta, it is possible that it is picking up only the delta data.
    (or)
    You may have start/end routines or rule routines written that delete records based on some conditions.
    (or)
    It also depends on the key field you have selected in the DSO. If the key field you selected has repeated values, the records are aggregated while loading into the DSO, i.e. if you have 10 rows for a key field with value, say, 101, then the DSO will be loaded with only one row with value 101 (10 rows becoming 1 row), and the key figure is either summed or overwritten depending on what you have selected in the rule details for the key figure (right-click the key figure mapping > Rule Details > there you can see whether it is Overwrite/Summation).
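    To make the key field behaviour concrete, here is a small illustration (plain pandas, not SAP code; the column names are made up) of how 10 incoming rows with the same key collapse to one row, with the key figure either summed or overwritten:

        import pandas as pd

        # Ten incoming rows sharing the same key field value (illustrative data only).
        incoming = pd.DataFrame({
            "keyfield": [101] * 10,
            "keyfigure": range(1, 11),
        })

        # "Summation" rule: the key figure is aggregated over the duplicates.
        summed = incoming.groupby("keyfield", as_index=False)["keyfigure"].sum()

        # "Overwrite" rule: only the last record per key survives.
        overwritten = incoming.groupby("keyfield", as_index=False)["keyfigure"].last()

        print(summed)       # one row, keyfigure = 55
        print(overwritten)  # one row, keyfigure = 10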
    Also, as mentioned in the above posts, you can check in the DSO --> Manage the number of records transferred and the number of records added.
    Hope it is clear & helpful!
    Regards,
    Pavan

  • Master data loading issue

    Hi gurus,
    Presently I am working on BI 7.0. I have a small issue regarding master data loading.
    I have a generic DataSource for master data loading and have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object; no issues, the data loaded successfully. Whenever I run the InfoPackage a second time and run the DTP, I get an error saying that there are duplicate records.
    How can I handle this?
    Best Regards
    Prasad

    Hi Prasad,
    Following is happening in your case:
    Loading 1st time:
    1. Data is loaded to the PSA through the InfoPackage. It is a full load.
    2. Data is loaded to the InfoObject through the DTP.
    Loading 2nd time:
    1. Data is again loaded to the PSA. It is a full load.
    2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate record error.
    Please clear the PSA after the data is loaded to the InfoObject.
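    Purely to illustrate the mechanics (plain Python, not SAP code; the key field name MATNR is just an example): the DTP reads every request still sitting in the PSA, so two identical full loads produce duplicate keys, while keeping only the newest request avoids the error.

        # Two full loads of the same master data land as two requests in the "PSA".
        request_1 = [{"MATNR": "1000"}, {"MATNR": "1001"}]
        request_2 = [{"MATNR": "1000"}, {"MATNR": "1001"}]  # second full load, same data

        psa = request_1 + request_2                # the DTP reads *all* requests
        keys = [row["MATNR"] for row in psa]

        # A master data object requires unique keys, hence the duplicate record error.
        duplicates = {k for k in keys if keys.count(k) > 1}
        print("Duplicate keys:", duplicates)       # -> {'1000', '1001'}

        # Clearing the PSA (keeping only the newest request) avoids the error.
        psa = request_2
        keys = [row["MATNR"] for row in psa]
        print("No duplicates now:", len(keys) == len(set(keys)))  # True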
    Assign points if helpful.
    Regards,
    Tej Trivedi

  • ERPi Data load mapping Issue

    Hi,
    We are facing an issue with ERPi data load mappings. The mapping file (txt file) has 36k records; whenever we try to load the mappings, it takes a very long time, nearly 1 hour 30 minutes, and we want to reduce that time. Is there any way to reduce the data load mapping time?
    Hyperion version: 11.1.2.2.300
    Please help, thanks in advance!!
    Thanks.

    Has anyone faced the same kind of issue?

  • Regarding master data loading for different source systems

    Hi Friends,
    I have an issue regarding master data loading.
    We have two source systems: one is 4.6C and the other is ECC 6.0.
    First I am loading the master data from 4.6C to BI 7.0.
    Now this 4.6C is being upgraded to ECC 6.0.
    In both 4.6C and ECC 6.0 the master data is changing.
    After some time there will be no 4.6C, only ECC 6.0.
    Now if I load master data from ECC 6.0 to BI 7.0, what will happen?
    Is it possible?
    Could you please tell me?
    Regards,
    ramnaresh.

    Hi ramnaresh porana,
    Yes, it's possible. You can load data from ECC.
    The data will not change. You may get more fields in the DataSource on the R/3 side, but on the BW/BI side there is no change; the mappings and structures are the same, so the data is the same as well.
    You need to take care of the deltas before and after the upgrade.
    Hope it helps.
    Srini

  • May I know how data load takes place in essbase

    Hi all,
    Since I am new to Hyperion, I need to know some details about Essbase. Last month we faced one issue: whenever a data load takes place, it clears the previous year's data and loads only the current year's data. May I know the reason for this?
    I would also like to know how the data load takes place in Essbase.
    With thanks and regards,
    babu

    I am okay with the above statements; now I am going to add something else.
    The above covers the ways we can load data from different files. What I am trying to say is that we can also use a calc script to load data.
    You know how?
    Through DATACOPY we can copy data value(s) into other member(s). While running the calc script (the one containing the DATACOPY), you can see in the session dialog box that a data load is in progress.
    Also bear in mind that you won't always be copying data from SQL, Excel, etc.; you may be in a situation where you need to copy data from one server to another. Thinking of it that way is another good advantage for you.
    I hope you understand this.
    Regards,
    Prabhas

  • SPM Data Loads : Less number of records getting loaded in the Invoice Inbound DSO

    Dear Experts,
    We are working on a project where data from different non-SAP source systems is being loaded into SPM via flat file loads. We came across a very weird situation.
    For the other master and transaction data objects it worked fine, but when we loaded the invoice file, fewer records were loaded into the inbound DSO. The invoice file contained 80,000 records, but the inbound DSO has only 78,500 records; we are losing 1,500 records.
    We are unable to figure out which 1,500 records we are missing. We couldn't find any logs in the inbound invoice DSO, and we cannot tell whether the records are erroneous or whether there is an issue with something else. Is there a way to analyze the situation / the inbound invoice DSO?
    If there is an issue with the outbound DSO or the cube, we know it is possible to check the data load request, but for the inbound DSO we do not know how to analyze the issue and why it is taking fewer records.
    Regards
    Pankaj

    Hi,
    Yes, this can happen in a DSO: the data records share the same semantic keys, so with the key field selection you may end up with fewer records.
    If you have any routines, check the code (for any condition that filters out records).
    Regards.

  • Data load through DTP giving Error while calling up FM RSDRI_INFOPROV_READ

    Hi All
    We are trying to load data into a cube through a DTP from a DSO. In the transformation, we look up InfoCube data through the SAP standard function module RSDRI_INFOPROV_READ. The problem we are facing is that our loads are failing with the error 'Unknown error in SQL Interface' and a parallel process error.
    In the DTP, we have changed the number of parallel processes from 3 (the default) to 1, but the issue with the data loads still exists.
    We had a similar flow developed in 3.5 (the BW 3.5 way) where we used this function module RSDRI_INFOPROV_READ, and there our data loads run fine.
    We suspect a compatibility issue of this FM with BI 7.0 data flows but are not sure. If anybody has any relevant input on this or has used this FM with a BI 7.0 flow, please let me know.
    Thanks in advance.
    Kind Regards
    Swapnil

    Hello Swapnil.
    Please check note 979660, which mentions this issue.
    Thanks,
    Walter Oliveira.
