Issue with CIF Integration model for Transaction data

Hi Gurus,
I have activated the integration model for POs & PReqs by location, and I assumed that transaction data transfer happens online, so we would not need to reactivate the model for any new product/location combination created in the system.
But the issue is that every time a new material is created, we need to deactivate and reactivate the integration model for transaction data in order to get the transaction data into APO.
Is there any way to avoid this exercise, as the activation takes a very long time?
Please guide me, as it is very urgent.
Thanks in advance for your help...
Thanks & Regards,
Jagadeesh

I assume the 160,000 location products are spread across different locations.
Rather than one integration model, it is better to have multiple integration models.
For example: one for each region, such as North America, Europe, and Asia (assuming you have operations spread across the world), or create an integration model per country.
This way you reduce the number of products in each integration model.
It is very important to have a manageable set of integration models. Let me give an example: suppose there is a problem and the single material master integration model is down (inactive). At that time, any PP or PDS transfer will not have the header or component products transferred to APO (in effect, the PDS/PPM cannot be transferred). If you are creating or converting planned orders, they will not transfer to R/3 (as the header product is not part of an active integration model).
But if you have country-specific or region-specific integration models, only that country is affected, not all of them.
In fact, you should structure the other integration models (PDS/PPM, procurement relationships, planned/production orders, sales orders, stocks) in the same manner, i.e. either per country or per group of countries by region. The risk of models becoming inactive, or taking too long to activate after regeneration, easily outweighs the overhead of managing a larger number of integration models (compared to one global integration model per object).
Hope this gives you some direction.
Somnath

Similar Messages

  • Integration model for transactional data

    Hi
    What is the transaction code to reactivate the integration model for transactional data? For example, after activating the integration model there are changes to a PO or a sales order. Which transaction code do we use to reactivate it? Please, can anyone help me?

    Hi Somnath, online transfer of transaction data using BTEs is activated in R/3 SPRO (it should be enabled for the ND-APO application).
    I think what you mentioned above is transaction code CFC9. That is mainly meant for material master, vendor master, and customers, i.e. purely for master data. But for transactional data like POs and SOs it does not come into the picture, right?
    Please clarify:
    1) Online transfer of transaction data using BTEs is activated in R/3 SPRO (should be enabled for the ND-APO application).
    2) Publication settings (distribution definitions) maintained in APO for the locations in SPRO.
    Can you tell me the path or transaction code for the above?

  • OIM 11g - Issue with Bulk Load Utility for Account Data

    Hi,
    We are trying to load the account data for users in OIM 11g using bulk load utility.
    We are trying to load the account data for the resource "iPlanet". For testing purposes, we made one account entry in the CSV file and ran the bulk load utility. After the bulk load process completed, we noticed that the resource was provisioned to the user multiple times and multiple entries had been created in the process form table.
    We have tried to run the utility multiple times with a different user record each time.
    The output of the SQL query below:
    SELECT MSG FROM OIM_BLKLD_LOG
    WHERE MODULE = 'ACCOUNT' AND LOG_LEVEL = 'PROGRESS_MSG'
    ORDER BY MSG_SEQ_NO;
    is coming as follows:
    MSG
    Number of Records Loaded: 126
    Number of Records Loaded: 252
    Number of Records Loaded: 504
    Number of Records Loaded: 1008
    Number of Records Loaded: 2016
    Number of Records Loaded: 4032
    We have noticed that each run loads double the number of records of the previous run, even though the CSV file contains only one record.
    Provided below are the parent and child csv file entries.
    Parent file:
    UD_IPNT_USR_USERID,UD_IPNT_USR_FIRST_NAME,UD_IPNT_USR_LAST_NAME,UD_IPNT_USR_COMMON_NAME,UD_IPNT_USR_NSUNIQUEID
    KPETER,Peter,Kevin,Peter Kevin,
    Child file 1:
    UD_IPNT_USR_USERID,UD_IPNT_GRP_GROUP_NAME
    KPETER,group1
    Child file 2:
    UD_IPNT_USR_USERID,UD_IPNT_ROL_ROLE_NAME
    KPETER,role1
    Can you please shed some light on the potential cause of this issue and how it could be resolved?
    Thanks
    Deepa
    Edited by: user10955790 on Jun 25, 2012 6:45 AM

    Hi Deepa,
    From a 'User load' perspective, I know that it is required to restart Oracle Identity Manager when we need to reload data that was not loaded during the first run.
    So my suggestion is to restart it before reloading.
    Reference: http://docs.oracle.com/cd/E21764_01/doc.1111/e14309/bulkload.htm#CHDEICEH
    I hope this helps,
    Thiago Leoncio.

  • Integration Model for the resources terminates with an error

    Hi
    We have deleted the existing integration model for resources and are trying to create new ones. When I activate the new resource integration model, it terminates with the error "Resource already exists (mapping error occurred)".
    We did the consistency check for the resources and ran OM17; everything seems OK. When we tried the same in the test system, it works fine.
    How do we rectify this?

    Deleting integration models for master data is a poor way to start any process.
    The message implies that the offending resource was probably created locally in SCM, or perhaps created to support a different Business System Group. The easiest solution is to delete the resource and then recreate it via the Core Interface.
    Best Regards,
    DB49

  • More than one CIF integration model possible for material?

    Hi Gurus,
    we want to have two different integration models (with non-overlapping selections) for materials so we can maintain separate selections, but we cannot make it work.
    When we create them the first time and then activate them, everything works fine: both models send their products to APO in the initial load. However, if there are changes to products in both models, then when we generate and activate one of the models, the changes in the other are lost: the generation/activation of the second model does not send any products to APO even when the master data has changed (and we see the ALE change pointers marked as processed in the table).
    Is there any workaround for this issue in a system without CIF change pointers in BDCP2? The ECC system is below the relevant release and will not be upgraded in the near future.
    We could not find any mention of a one-model limitation in either OSS notes or the help pages.
    thanks a lot,
    Pablo

    Pablo,
    It is common to have multiple Integration Models for Materials.
    I have not worked with table BDCP2, and I have never seen the problem you describe.
    I usually prefer not to use change pointers at all for master data such as materials. You can alter this behavior in CFC9 and instead use Business Transaction Events (BTEs). This means that all fields relevant for change transfer to SCM move across almost immediately after the changed material master is saved.
    http://help.sap.com/saphelp_scm70/helpdata/EN/c8/cece3be9cd4432e10000000a11402f/frameset.htm
    Also read the links contained in this page.
    If you wish to actually perform a new 'initial load', then run the Integration model through program RIMODINI.  It may be a lengthy run.
    As always, check first in your Qual system before committing to production.
    Best Regards,
    DB49

  • Modeling "Status" for transaction data in a cube

    Hello all,
    I saw this thread created sometime ago about putting status in a cube.
    How to model Status filed
    I would like to model a status for transactions within a cube. One suggestion was to just create a CHAR characteristic of a certain length (I assume this would sit in the E and F fact tables and not with the master data at all).
    To me, this would mean that if the status changed, a whole new row would have to be loaded and the old row removed. Would one use some kind of selective deletion?
    It also seems that a navigational attribute might be better than placing the status in the E/F fact tables.
    Best Regards,
    Casey

    @kkc - thanks for the reply, but I already knew about the E and F fact tables; I just did not know that compression takes care of the status field.
    @Wond - above you said: "Yes it will be a whole new record will be sitting in the cube, once the status changes the cube will have three records when doing compression the two will wipe out itself and you will remain with one."
    So how does it decide which record stays? By the loading date? Also, is there any good documentation out there that confirms this?
    Thanks a lot!
    PS - I also found the thread below, but it speaks of the ODS and using 0RECORDMODE (it does not say how to set this with the new BI 7.0 "Transformations"/"Data Transfer Processes"):
    Modeling "Status" for transaction data in a cube
    Edited by: Casey Harris on Feb 29, 2008 11:28 AM

  • How to deal with CIF integration models during SCM server shutdown

    Hi
    How should CIF integration models be handled during an SCM server shutdown - should they be kept active (in R/3) or changed to inactive?
    Any ideas or experience sharing around this topic would be great !
    Thanks
    Sunny

    Hi,
    This depends on which IMs you are using. The most important IM to deactivate is the IM for the ATP check. If this is kept active, sales order confirmations will be searched for in APO, which will not be available.
    All other IMs, for master data and other transaction data, do not need to be deactivated.
    However, once your APO system is back up, make sure that you re-initialize all IMs (except the IM for locations) in sequence, using program RIMODINI. Rather than a simple reactivation, this deletes the transaction data and re-initializes it, so both systems end up in sync.
    Regards,
    Bipin

  • ALE Configuration for Transactional Data (Purchase Order)

    Dear Experts,
    I want to configure ALE for Purchase Orders(Transactional Data).
    For that:
    I have created the necessary condition records in NACE with the required output type configuration, configured the port and partner profile with message types ORDERS and ORDCHG (as outbound parameters), assigned those message types to the defined sender and receiver in the customer distribution model, and configured the port and partner profile on the receiver system as well.
    Now I have two queries:
    i) Is this much ALE configuration enough for transactional data communication between the two systems, or do I have to do something more?
    ii) For master data we configure change pointers (we can see field assignments against the change document object in BD52 for master data message types). Is it necessary to configure change pointers for transactional data, or is it handled automatically by the system (field assignments cannot be seen against change document objects in BD52 for transactional data message types)?
    Regards
    Arnab

    Hi kumar,
    what Alexander said is absolutely correct.
    Regarding your second question: no master data distribution (change pointer) configuration is required for transactional data distribution.
    let us know once if you face any problems.
    ~linagnna

  • Problem in data sources for transaction data through flat file

    Hello Friends,
    While creating the data source for transaction data from a flat file, I am getting the following error: "Error 'The argument '1519,05' cannot be interpreted as a number' while assigning character to application structure", message no. RSDS016.
    If any one come across this issue, please provide me the solution.
    Thanks in Advance.
    Regards
    Ravi

    Hello,
    just for information: I had the same problem.
    I changed the field type from CURR to DEC and set the format to external instead of internal.
    After that, the flat file import worked fine.
    Thank you.

  • [svn:fx-trunk] 12982: Fix for issue with exposing accessible names for combobox list items

    Revision: 12982
    Author:   [email protected]
    Date:     2009-12-15 20:44:23 -0800 (Tue, 15 Dec 2009)
    Log Message:
    Fix for issue with exposing accessible names for combobox list items
    QE notes: none
    Doc notes: none
    Bugs: n/a
    Reviewer: Gordon
    Tests run: checkintests
    Is noteworthy for integration: no
    Modified Paths:
        flex/sdk/trunk/frameworks/projects/spark/src/spark/accessibility/ComboBoxAccImpl.as
        flex/sdk/trunk/frameworks/projects/spark/src/spark/accessibility/ListBaseAccImpl.as

    Add this to the end of your nav p CSS selector at Line 209 of your HTML file, after 'background-repeat...':
    margin-bottom: -2px;
    Your nav p will then look like this:
    nav p {
              font-size: 90%;
              font-weight: bold;
              color: #FFC;
              background-color: #090;
              text-align: right;
              padding-top: 5px;
              padding-right: 20px;
              padding-bottom: 5px;
              border-bottom-width: 2px;
              border-bottom-style: solid;
              border-bottom-color: #060;
              background-image: url(images/background.png);
              background-repeat: repeat-x;
          margin-bottom: -2px;
    }

  • CIF Integration Model Enhancement

    Guys,
    From ECC to APO, I want to send across all materials that are marked with procurement type "E" in the material master's MRP 2 view.
    But the standard CIF integration model does not support selecting materials by procurement type (in transaction CFM1 there is no "Procurement Type" selection).
    Does anybody know how to add the field "Procurement Type" to the integration model, i.e. how to build an enhanced integration model containing custom fields?
    Thanks
    Kumar

    It can be done: you can add a new selection option. However, I would recommend not disturbing the original CFM1 screen.
    Instead, write a wrapper program that runs the simple query and writes the resulting values into TVARVC variables.
    You can then use the TVARVC variable in the CFM1 variant.
    So whenever you run CFM1 in a batch job, run the wrapper program as the first step, and run CFM1 as the second step.
    Let me know if you need details of the wrapper program.
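    A minimal sketch of such a wrapper program, assuming the TVARVC variable is named Z_CFM1_MATNR (a made-up name; use whatever your CFM1 variant actually references) and that procurement type "E" is read from MARC-BESKZ:

```abap
* Hypothetical wrapper: collect all materials with procurement type 'E'
* (in-house production, MARC-BESKZ) and store them as a TVARVC selection
* option so a CFM1 variant can reference them.
REPORT z_cif_fill_tvarvc.

DATA: lt_matnr  TYPE STANDARD TABLE OF matnr,
      lv_matnr  TYPE matnr,
      ls_tvarvc TYPE tvarvc.

" All materials that have procurement type 'E' in at least one plant
SELECT DISTINCT matnr FROM marc INTO TABLE lt_matnr
  WHERE beskz = 'E'.

" Remove the previous run's values for this selection variable
DELETE FROM tvarvc WHERE name = 'Z_CFM1_MATNR' AND type = 'S'.

" One selection line (SIGN/OPTION/LOW) per material
LOOP AT lt_matnr INTO lv_matnr.
  CLEAR ls_tvarvc.
  ls_tvarvc-name = 'Z_CFM1_MATNR'.
  ls_tvarvc-type = 'S'.
  ls_tvarvc-numb = sy-tabix.
  ls_tvarvc-sign = 'I'.
  ls_tvarvc-opti = 'EQ'.
  ls_tvarvc-low  = lv_matnr.
  INSERT tvarvc FROM ls_tvarvc.
ENDLOOP.

COMMIT WORK.
```

    Schedule this as step 1 of the batch job, with CFM1 as step 2, using a variant whose material selection is bound to the TVARVC variable via a selection variable of type "T" in the variant attributes.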
    Regards
    Kumar

  • Problem with activating integration model

    When I try to activate an integration model that includes a scheduling agreement (type LF), the activation is not successful. It is not blocked, but it never finishes.
    On screen I get the message "Determining delta model", but the activation never continues.
    Does anybody know what could cause this problem?
    thanks in advance
    Mark Smeets

    Hi,
    I am facing the same problem as mentioned in the original post.
    Whenever I activate the integration model for vendors, a "CIF_LOAD" window appears with the message "Should interval be created?" with YES and NO options. Why does this window appear? After clicking NO, the message "Determining delta model" appears in the status bar, and after some time the program terminates with a timeout.
    I have tried the following solutions mentioned in this thread:
    Refreshed the indexes of the two tables BDCPS and BDCP using DB20.
    Executed transaction BD22 in test run mode, but I am not sure which change pointer option (obsolete or processed) to select for deletion, nor which message type. I selected the obsolete option, but here too the program terminated with a timeout. Besides, I am doing this in the production environment, and other integration models (plant etc.) activated fine.
    Please help us with this.
    Regards,
    Gaurav Patil

  • Time phase Min/max replenishment models for future dates

    Hi,
    We are working on a safety stock requirement of Maximum/Minimum replenishment model in APO.
    It seems to work great, except that it cannot be time-phased. Our business wants different safety stock strategies at different times of the year.
    Please suggest whether there is a way we could "time-phase" min/max replenishment models for future dates.
    Thanks in advance for your help!
    regards
    Yogendra

    Many thanks for this.
    I can see entirely why it's designed as such, but I just find it slightly frustrating that there's no way to break the link between the order and the shipment out to the depot. Just to clarify, we're not requiring the orders to change - they will still be made and will come in - but just that the orders themselves don't specifically need to be the stock that is used for the replenishment.
    So -
    1. Min Max identifies depot needs replenishing.
    2. Central distribution does not have (enough) stock to replenish.
    3. An order is made to replenish central distribution's stock.
    4. We ship whatever we've got, when we've got it, to depot to replenish.
    It's the bit where min-max is trying to replenish a specific depot rather than our central distribution centre that's my problem.
    I suspect that, as you say, that specific issue is not directly fixable without getting our IT contractors to do a customisation.
    I'm going to look into your Supply Date Offset suggestion now, though I'm not sure how this affects the shipping after the orders are placed. The orders themselves are approved manually after we've checked our stock position (i.e. what's in with the recycling team), but we recycle & refurb probably 60% of our maint stock so there'll always be kit turning up after the order has been made because of the long lead times.
    Thanks again.

  • CMOD exit EXIT_SAPLRSAP_001 for transactional data ABAP code

    Hi, please confirm that you mean I can write the actual code in the CMOD exit EXIT_SAPLRSAP_001 for transactional data. If I put the enhancement code for 20 data sources in there, isn't that too bulky, and won't all extractors fail if one piece of code is wrong? If you still suggest I go ahead with 20 similar blocks, please see my code below, which I have used for all 20 data sources with little modification.
    If you can recommend some changes to the code to improve performance, that would be great.
    CASE i_datasource.
      WHEN '0CUSTOMER_ATTR'.
        LOOP AT i_t_data INTO l_s_biw_kna1_s.
          l_tabix = sy-tabix.
          CLEAR i_knvp.
          " read the partner function record for this customer
          SELECT SINGLE * FROM knvp INTO i_knvp
            WHERE kunnr = l_s_biw_kna1_s-kunnr.
          IF sy-subrc = 0.
            l_s_biw_kna1_s-zzparfn   = i_knvp-parvw.
            l_s_biw_kna1_s-zzcusnobp = i_knvp-kunn2.
            MODIFY i_t_data FROM l_s_biw_kna1_s INDEX l_tabix.
          ENDIF.
        ENDLOOP.
    ENDCASE.
    Thanks
    Poonam

    Check this simple code for the Z... include in FM EXIT_SAPLRSAP_001, where ZCOMP is the new field added to data source 8ZSALES2:
    DATA: l_s_ce8zsales2 TYPE /bic/ce8zsales2,
          l_tabix        TYPE sy-tabix.
    CASE i_datasource.
      WHEN '8ZSALES2'.
        LOOP AT c_t_data INTO l_s_ce8zsales2.
          l_tabix = sy-tabix.
          " fill the new field ZCOMP
          SELECT SINGLE comp_code FROM /bi0/mcomp_code
            INTO l_s_ce8zsales2-zcomp
            WHERE comp_code = '1000'.
          IF sy-subrc = 0.
            MODIFY c_t_data FROM l_s_ce8zsales2 INDEX l_tabix.
          ENDIF.
        ENDLOOP.
    ENDCASE.
    Edited by: Rohan Kumar on Jan 16, 2008 8:21 AM
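    On Poonam's performance question above: the SELECT SINGLE inside the loop issues one database round trip per record in the data package. A hedged sketch of the same logic using a single FOR ALL ENTRIES read instead (structure and field names are taken from the post; verify them against your actual extract structure):

```abap
TYPES: BEGIN OF ty_knvp,
         kunnr TYPE knvp-kunnr,
         parvw TYPE knvp-parvw,
         kunn2 TYPE knvp-kunn2,
       END OF ty_knvp.

DATA: lt_knvp        TYPE SORTED TABLE OF ty_knvp
                     WITH NON-UNIQUE KEY kunnr,
      ls_knvp        TYPE ty_knvp,
      l_s_biw_kna1_s TYPE biw_kna1_s,
      l_tabix        TYPE sy-tabix.

CASE i_datasource.
  WHEN '0CUSTOMER_ATTR'.
    IF i_t_data[] IS NOT INITIAL.
      " one database access for the whole data package
      SELECT kunnr parvw kunn2 FROM knvp
        INTO TABLE lt_knvp
        FOR ALL ENTRIES IN i_t_data
        WHERE kunnr = i_t_data-kunnr.
    ENDIF.
    LOOP AT i_t_data INTO l_s_biw_kna1_s.
      l_tabix = sy-tabix.
      " fast in-memory lookup instead of SELECT SINGLE per row
      READ TABLE lt_knvp INTO ls_knvp
        WITH TABLE KEY kunnr = l_s_biw_kna1_s-kunnr.
      IF sy-subrc = 0.
        l_s_biw_kna1_s-zzparfn   = ls_knvp-parvw.
        l_s_biw_kna1_s-zzcusnobp = ls_knvp-kunn2.
        MODIFY i_t_data FROM l_s_biw_kna1_s INDEX l_tabix.
      ENDIF.
    ENDLOOP.
ENDCASE.
```

    The sorted table keeps the per-row lookup at logarithmic cost; with 20 data sources in the exit, this pattern keeps each CASE branch down to one database access per package.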

  • Integration model for sales orders failing repeatedly

    The integration model for sales orders to APO is failing, and in the CIF the error says "Customer requirement G BR 0082372353 900010 0000: liveCache problem, retu".
    This error appears every time we run the integration model, and the job log says: ABAP/4 Processor: SYSTEM_CANCELED.
    Note that the job is "not" stopped manually, but it still gives this error.

    Hi Kailash,
    Run /SAPAPO/SDRQCR21 in APO for the part contained in your failing delivery document. While running the report, select the radio button "Build the requirements from Doc. Flow table". This will remove the inconsistencies remaining in the requirements table on the R/3 side.
    Try this and let us know if you succeed.
    Regards
    Sanjeev
