Integration model for transaction data

Hi,
What is the transaction code to reactivate an integration model for transaction data? For example, there are changes to a PO or sales order after the integration model was activated. Which transaction code do we use to reactivate it? Please, can anyone help me?

Hi Somnath,
I think what you mentioned above is transaction code CFC9. That is mainly meant for material master, vendor master and customer master, i.e. purely master data. For transactional data like POs and SOs it does not come into the picture, right?
Please clarify.
1) Online Transfer of Transaction Data using BTE is activated in R/3 SPRO (should be enabled for the ND-APO application).
2) Publication settings (Distribution Definitions) maintained in APO for the locations in SPRO.
Can you tell me the path or transaction code for the above?

Similar Messages

  • Issue with CIF Integration model for Transaction data

    Hi Gurus,
I have activated the integration model for POs & PReqs by location, and I assumed that transaction data transfer is online, so we need not reactivate it for any new product/location combination created in the system.
But the issue is that every time a new material is created, we need to deactivate and activate the integration model for transaction data to get the transaction data into APO.
Is there any way to avoid this exercise, as the activation takes a very long time?
Please guide me, as it is very urgent.
Thanks for the help in advance...
    Thanks & Regards,
    Jagadeesh

I assume the 160,000 location products are spread across different locations.
Rather than one integration model, it is better to have multiple integration models.
For example: one for each region, like North America, one for Europe, one for Asia (assuming you have countries spread across the world). Or create an integration model per country.
This way you reduce the number of products in each integration model.
It is very important to have a manageable set of integration models. Let me give an example: you have some problem, so the single material master integration model is down (inactive). At that time no PP or PDS transfer will have the header or component products transferred to APO (in effect the PDS/PPM cannot be transferred). If you are creating or converting planned orders, they will not transfer to R/3 (as the header product is not part of an active integration model).
But if you have country-specific or region-specific integration models, only that country is affected, not all.
In fact you should set up the other integration models (PDS/PPM, procurement relationships, planned/production orders, sales orders, stocks) in the same manner, i.e. either per country or per group of countries by region. The risk of models becoming inactive, or taking too much time to activate after regeneration, easily outweighs the overhead of managing a larger number of integration models (compared to one global integration model per object).
    Hope this gives you some direction.
    Somnath

  • Integration Model for the resources terminates with an error

    Hi
We have deleted the existing integration model for the resources and are trying to create new ones. When I activate the new resource integration model, it terminates with the error "Resource already exists (mapping error occurred)".
We ran the consistency check for the resources and ran OM17; everything seems OK. When we tried the same in the test system, it works fine...
How do we rectify this?

    Deleting integration models for master data is a poor way to start any process.
    The message implies that the offending resource probably was created locally in SCM, or perhaps to support a different Business System Group. The easiest solution is to delete the resource and then recreate via Core Interface.
    Best Regards,
    DB49

  • Integration model for sales orders failing repeatedly

The integration model for sales orders to APO is failing, and in the CIF the error says "Customer requirement G BR 0082372353 900010 0000: liveCache problem, retu".
This error appears every time we run the integration model, and the job log says: ABAP/4 Processor: SYSTEM_CANCELED.
Note that the job is not manually stopped, but it still gives this error.

    Hi Kailash,
Run report /SAPAPO/SDRQCR21 in APO for the part contained in the failing delivery document. While running the report, select the radio button "Build the requirements from Doc.Flow table". This will remove the inconsistencies remaining in the requirements table on the R/3 side.
    Try this and let us know, if you could succeed.
    Regards
    Sanjeev

  • Time phase Min/max replenishment models for future dates

    Hi,
    We are working on a safety stock requirement of Maximum/Minimum replenishment model in APO.
    It seems to work great except it cannot be time phased.  Our business wants different safety stock strategies at different times of the year. 
    Please suggest if there is a way that we could “time phase” min/max replenishment models for future dates?
    Thanks in advance for your help!
    regards
    Yogendra

    Many thanks for this.
    I can see entirely why it's designed as such, but I just find it slightly frustrating that there's no way to break the link between the order and the shipment out to the depot. Just to clarify, we're not requiring the orders to change - they will still be made and will come in - but just that the orders themselves don't specifically need to be the stock that is used for the replenishment.
    So -
1. Min-max identifies that a depot needs replenishing.
2. Central distribution does not have (enough) stock to replenish it.
3. An order is made to replenish central distribution's stock.
4. We ship whatever we've got, when we've got it, to the depot to replenish it.
It's the bit where min-max tries to replenish a specific depot, rather than our central distribution centre, that's my problem.
    I suspect that, as you say, that specific issue is not directly fixable without getting our IT contractors to do a customisation.
    I'm going to look into your Supply Date Offset suggestion now, though I'm not sure how this affects the shipping after the orders are placed. The orders themselves are approved manually after we've checked our stock position (i.e. what's in with the recycling team), but we recycle & refurb probably 60% of our maint stock so there'll always be kit turning up after the order has been made because of the long lead times.
    Thanks again.

  • Integration model for planned orders ..?

    Hi,
When I create planned orders in R/3, they come to APO. But when I create them in APO, I don't see them in R/3; they do not appear in R/3 table AFPO.
What could be wrong? Which integration models are responsible?
    thanks,
    Ashish

    Some basic things to check first are (I quote from a 4.1 system):
    Have you maintained the distribution definition (SPRO > Integration with SAP Components > Integration of SAP SCM and SAP R/3 > Basic Settings for Data Transfer > Publication > Maintain Distribution Definition)?
    Do you have the "No SNP Planned orders" unchecked at SPRO > Advanced Planning and Optimization > Supply Chain Planning > Supply Network Planning (SNP) > Basic Settings > Configure Transfer to OLTP Systems?
There may also be some BAdIs involved, but I'll check later.

  • Ideal model for accesing data

I'm trying to transfer some data (about 100,000 rows) to a MySQL database. I already have my Data Access Object (DAO), so I reuse it. When my DAO has added about 500 rows, this exception occurs:
    com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:
    ** BEGIN NESTED EXCEPTION **
    java.net.SocketException
    MESSAGE: java.net.BindException: Address already in use: connect
    STACKTRACE:
    java.net.SocketException: java.net.BindException: Address already in use: connect
            at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:151)
            at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:280)
            at com.mysql.jdbc.Connection.createNewIO(Connection.java:1691)
            at com.mysql.jdbc.Connection.<init>(Connection.java:405)
            at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:268)
            at java.sql.DriverManager.getConnection(DriverManager.java:512)
            at java.sql.DriverManager.getConnection(DriverManager.java:171)
            at trade.connector.ConnectionProvider.getConnection(ConnectionProvider.java:71)
            at trade.access.LoginValidator.validate(LoginValidator.java:29)
            at trade.access.UserAccess.getUser(UserAccess.java:124)
            at trade.access.UserAccess.addUser(UserAccess.java:162)
            at trade.access.UserAccess.main(UserAccess.java:254)
    ** END NESTED EXCEPTION **
            at com.mysql.jdbc.Connection.createNewIO(Connection.java:1754)
            at com.mysql.jdbc.Connection.<init>(Connection.java:405)
            at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:268)
            at java.sql.DriverManager.getConnection(DriverManager.java:512)
            at java.sql.DriverManager.getConnection(DriverManager.java:171)
            at trade.connector.ConnectionProvider.getConnection(ConnectionProvider.java:71)
            at trade.access.LoginValidator.validate(LoginValidator.java:29)
            at trade.access.UserAccess.getUser(UserAccess.java:124)
            at trade.access.UserAccess.addUser(UserAccess.java:162)
            at trade.access.UserAccess.main(UserAccess.java:254)
Exception in thread "main"
I retry the transfer, but this exception occurs again and again. If I retry after about one minute, it can resume transferring about 500 rows again.
Is this problem caused by database protection against a network flood attack? I use a single-shot connection method (connect when needed, disconnect when the job is done), so if the add method needs 2 queries (2 connects), the transfer takes about 1,000 re-connections.
I changed from single-shot connections to connection pooling, and the problem still occurred... so I assumed the protection applies to statements (I recreate the statement about 1,000 times). Finally I wrote extra code for the transfer: reuse the statement (PreparedStatement) instead of closing and recreating it, and the problem was solved.
So what is the 'ideal model' for a DAO (logical layer)? For every access method we create a Statement, use it, and close it. This causes the database server to block our connection when the method is used repeatedly (over 1,000 times). If we leave the statement open, it costs network traffic to keep it alive, and we often use only some of the access methods, not all of them.
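The fix the poster eventually landed on (one connection, one reused PreparedStatement, rows sent in batches) might be sketched roughly like this. The `users` table, its columns, and the class name are hypothetical, purely for illustration; the small helper just shows why connect-per-query chews through local ports so quickly:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertSketch {

    // Why single-shot connections fail: every row costs fresh TCP connections,
    // and each closed socket lingers in TIME_WAIT, holding an ephemeral port.
    static long connectionsOpened(long rows, int queriesPerRow, boolean reuseConnection) {
        return reuseConnection ? 1L : rows * queriesPerRow;
    }

    // The reuse pattern: ONE connection, ONE PreparedStatement, rows batched.
    // Table and column names here are made up for the sketch.
    static void transfer(Connection con, String[][] users) throws SQLException {
        String sql = "INSERT INTO users (name, email) VALUES (?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (String[] u : users) {
                ps.setString(1, u[0]);
                ps.setString(2, u[1]);
                ps.addBatch();
            }
            ps.executeBatch(); // one round trip for the whole batch, not one per row
        }
    }

    public static void main(String[] args) {
        // 100,000 rows at 2 queries each = 200,000 connects when reconnecting per query...
        System.out.println(connectionsOpened(100_000, 2, false)); // 200000
        // ...versus a single connection held open for the whole transfer.
        System.out.println(connectionsOpened(100_000, 2, true));  // 1
    }
}
```

With the default Windows ephemeral port range of roughly 4,000 ports and a multi-minute TIME_WAIT, the single-shot approach runs out of ports after a few hundred rows, which matches the ~500-row failure point described above.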

You are right, malcolmmc. I'm running out of local ports. I checked with netstat and here is the output:
      TCP    TOSHIBA2450:3306       localhost:3987         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3988         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3989         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3990         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3991         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3992         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3993         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3994         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3995         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3996         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3997         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3998         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:3999         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4000         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4001         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4002         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4003         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4004         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4005         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4006         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4007         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4008         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4009         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4010         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4011         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4012         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4013         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4014         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4015         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4016         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4017         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4018         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4020         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4021         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4022         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4023         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4024         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4025         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4026         TIME_WAIT
      TCP    TOSHIBA2450:3306       localhost:4027         TIME_WAIT
  TCP    TOSHIBA2450:3306       localhost:4028         TIME_WAIT
I already tried changing my code to use only one connection, and the result is the same. I assume creating a statement creates a new connection: although I'm not re-creating the java.sql.Connection, when I create a java.sql.Statement it opens a new TCP/IP connection. But I do call:
preparedstatement.close();
pool.returnConnection(con); // con.close();
I wonder how to close this connection (TCP/IP)...

  • Data flow model for Master data extraction

    Hello friends,
I have the following scenario, where the report layout looks like this. What would be the data flow model for this example if I'm loading master data from R/3?
Table:        KNVV       KNA1    KNVP      KNA1      KNA1     KNVP       KNA1
Field:        VKORG      KATR1   KUNN2     NAME1     KTOKD    KUNNR      NAME1
Description:  Sales Org  BCG     Payer No  Payer Nm  Acc Grp  Sold 2 No  Sold 2 Name
    I would appreciate any help.
    Thanks

    Balaji,
Are you trying to develop a master data InfoObject for the given field combination and want to know how to proceed?
Did you check whether the current 0CUSTOMER master data already has the required fields?
Otherwise you can:
1. Create a view on the master tables, or
2. Create a function module and use that as a datasource.
I would suggest you look at the customer master that comes as part of Business Content first.
    Arun
    Assign points if useful

  • Error Message while Activating Integration Model for PDS

    Hi,
When we try to activate the IM for the SNP subcontracting PDS, it gives us the error message
"No authorization to create Product".
All master data for that particular product, plant and subcontractor is in place in APO.
The product is also extended to the plant and subcontractor.
The T-lane is also there in APO; it was created via CIF (subcontracting purchasing info record).
    Need your expertise.
    Regards
    Abhi

Hello Abhi,
I recently faced the same problem. If you are facing the problem for a particular product/location, I would suggest:
1) Delete that product master globally in APO and re-CIF the data.
If that doesn't work, then the problem must be with authorization.
Please revert in case of any issue.
    Thanks
    Mangesh A. Kulkarni

  • CIF Queues generated without active integration model

    Hello All,
We are facing a unique issue: queues are seen in the SCM queue manager although the products are not in any active integration model for transaction data.
The scenario is as follows:
1. We are CIFing only master data, as we have only DP and release to R/3
2. There is an active integration model for master data
3. There is no integration model for transaction data
However, in the SCM queue manager I am seeing queues of type CFFCC "Location" "Product".
The function module name is /SAPAPO/CIF_IRQ_REDUCT_INBOUND.
We observed in R/3 that the goods movement transaction MB1A is triggering tRFCs which generate these queues.
Has anyone encountered a similar scenario?
    Regards,
    Kedar

Hi Jackeline!
    Hello, all!
    I think Vikas got the point, but let me be more specific...
The transfer of requirement reduction depended in previous releases (until PI 2002.1) only on an integration model for material master data being active. Then a new type of integration model was created to control the sending of requirement reduction.
To activate the new logic, a flag has to be set in customizing.
Use transaction CFC9
OR
R/3 SAP Customizing Implementation Guide
-> Integration with Other SAP Components
 -> Advanced Planning and Optimization
  -> Application-Specific Settings and Enhancements
   -> Settings and Enhancements for Requirement Reduction
    -> Evaluate the Requirements Reduction Filter Object Type
Then set the flag 'Filter Object Requirements Reduction'.
If you set this flag, no queue of type CFFCC* will be sent unless there is an explicit active integration model for requirements reduction. Please set this flag in production and delete any existing queues of type CFFCC*. No new such queues should be created in the future as long as you don't activate the transfer via an integration model.
    I hope this information helps to clarify this issue.
    Please give your feedback concerning the given solution.
    Thank you!
    Will

  • Integration model generation

    Hi,
We have an integration model for master data, schedule lines and contracts. We have a background job that generates and activates the model. If I regenerate the model in CFM1, does it affect the job in any way?
I think every time you generate a model, a new model version is generated, right? Does this cause any discrepancy?
    Thanks.

There are two ways of sending data to APO through CIF:
1) Sending only changes (delta)
2) Sending everything (delta + new creations)
If there are, say, three materials which changed on a certain day, then just activating the integration model will work perfectly fine.
If along with the three changed materials a brand-new material was also created in R/3, then merely generating the integration model will not help.
You will have to delete the original IM and recreate the same IM using variants, then activate it. This is treated as an initial transfer and sends the three old materials' changes and the brand-new material to APO.
If you feel it is useful, please reward points.
The reason behind this behaviour: when you activate an already existing integration model, it only activates those objects which were generated and present during the generation of the IM.
So if a new material is created, you need more than just generation: you will have to delete the model, generate the model, and activate the model.
Thanks,
My

  • More than one CIF integration model possible for material?

    Hi Gurus,
We want to have two different integration models (with non-overlapping selections) for materials, to be able to maintain separate selections, but we cannot make it work.
When we create them the first time and then activate them, everything works fine: both models send their products to APO in the initial load. However, if there are changes to products in both models, when we generate and then activate one of the models, the changes in the other one are lost: the generation and activation of the second model does not send any products to APO even if there were changes in the master data (and we see the ALE pointers as processed in the table).
Is there any workaround for this issue in a system without CIF pointers in BDCP2? The ECC system is below the relevant releases and will not be upgraded in the near future.
We could not find any mention of a limitation to a single model, neither in OSS notes nor in the help pages.
    thanks a lot,
    Pablo

    Pablo,
    It is common to have multiple Integration Models for Materials.
I have never seen the problem you describe with BDCP2.
I usually prefer not to use change pointers at all for master data, such as materials. You can alter this behavior in CFC9 and instead use Business Transaction Events (BTEs). This means that all fields that are relevant for 'change transfer' to SCM move across almost immediately after the changed material master is saved.
    http://help.sap.com/saphelp_scm70/helpdata/EN/c8/cece3be9cd4432e10000000a11402f/frameset.htm
    Also read the links contained in this page.
    If you wish to actually perform a new 'initial load', then run the Integration model through program RIMODINI.  It may be a lengthy run.
    As always, check first in your Qual system before committing to production.
    Best Regards,
    DB49

  • Reuse Dimdate across different database for different data models

    Hi,
I am designing a new data model for a data mart. I need to add a DimDate dimension to this data model. I noticed that DimDate already exists in another database and is being used by another data model. Is it possible to re-use the existing DimDate table for my new data model? If so, what about the foreign key constraints? Normally we link the date columns from the fact table to the DimDate keys. How would we achieve that if we are using the same table across different databases?
    Any opinion on this will be highly appreciated.
    Thanks in Advance.
    Cheers!!

You can create a copy of the DimDate table in your new data warehouse.
If both data marts were in a single data warehouse, you would not need to copy it; but as these are in two different databases, just copy it.
Regarding the FK relationship: you can connect any fact table to your date dimension. Even if you want to use more than one instance of your date dimension, it is simply a matter of adding multiple FK columns to your fact table (a role-playing dimension).
For the date dimension, be sure that it covers most of the attributes required. Here is an example of a date dimension:
    http://www.rad.pasfu.com/index.php?/archives/156-Script-to-Generate-and-Populate-Date-Dimension-Version-2-Adding-Multiple-Financial-Years.html
    Regards,
    Reza
    SQL Server MVP
Blog: http://rad.pasfu.com
    SQL Server Integration Services 2012 Tutorial Videos:
    http://www.radacad.com/CoursePlan.aspx?course=1

  • Error activating integration model in APO

    Hi,
I would like a solution to the following problem.
I've tried to activate integration models in ECC with transaction CFM2.
I've built several integration models for classes, customers, master data and transaction data.
When I try to activate one of these integration models, the system gives the following error message:
Outbound queue blocked
Function: /SAPAPO/CIF_CBASE_INB
Text: Reference product has not been classified
    I would like to know what causes the problem and how to solve this problem.
    I hope to receive an answer soon.
    Thanks for the help!
    kind regards,
    Marco

    Hi,
    A program enhancement allows you to exclude the relationship between a product and a reference material from the transfer.
    The prerequisite for this solution is that this "reference material for materials that can be packaged in the same way" is not required in SCM and is not relevant for the transfer.
    Procedure:
Start transaction SMOD. Display the components of enhancement CIFMAT01. On the following screen, double-click function module EXIT_SAPLCMAT_001 (user exit '001' of function group 'CMAT', program SAPLCMAT). In the program editor that appears, double-click the include ZXCIFU01. You may have to create the include first if it does not exist yet. Implement the system response you want in this include (following the sample coding below).
    Example:
DATA: ls_matkeyx TYPE cif_matkyx.

LOOP AT ct_cif_matkeyx INTO ls_matkeyx.
  " Keep the link to the reference material where it is needed
  IF link_to_rmatp_needed = 'X'.
    CONTINUE.
  ENDIF.
  " Otherwise clear the reference material before the transfer
  CLEAR ls_matkeyx-extrmatp.
  MODIFY ct_cif_matkeyx FROM ls_matkeyx.
ENDLOOP.
    Regards
    Vinod

  • Regarding Integration Model to CIF from ECC to APO

    Hi ,
    Please assist me in the following issues:
What will the impact be if a material is CIFfed twice or multiple times from ECC to APO through an IM? Will it generate inconsistencies? If so, how can we avoid this?
While I'm CIFfing the material from ECC to APO through the IM, 'Find Master Data Object' shows a certain number, say 'X', but when I run CTM for it, the BOM explosion of the master data shows fewer materials than 'X'. What might be the possible issue?
While generating an IM, can I club sales order, sales order stock, planned order and production order under the same variant? Or will that generate inconsistencies?
    TIA
    -Michael

    Hi Michael,
1. It is not a problem. It is quite common to CIF materials twice a day or even more. It should not generate inconsistencies.
2. You could have more materials than materials with a BOM. In other words, maybe not all your materials have a BOM.
3. It's recommended to have a separate integration model per object.
Read this document with the best practices:
https://websmp102.sap-ag.de/~sapidb/011000358700000715082008E
Also, as mentioned by Dog Boy, it is convenient to ask single questions in your threads. This way you will get more responses.
    Kind Regards,
    Mariano
