Parallel Sequencing of Production between different resources

Hi all,
I want to be able to produce a product in parallel on other resources, if they are available and it is cost efficient (parallel sequencing of production between resources).
I have maintained setup groups for the products, and also the resources (machines);
every machine can do the same job (with different durations).
According to setup optimization it may be useful to produce everything on one resource, but if I need to meet the requirement date, I have to parallelize the work.
Is there anything in SCM (like parallel sequencing in R/3) that leads to a more feasible solution?

Hi Mehmet,
In R/3, in the routing operation, we can enter a "split" number. It splits one operation into several partial operations during scheduling or capacity leveling, so that the total execution time is reduced accordingly.
But when you transfer such an operation with a split number, the activity duration is automatically calculated as the machine time divided by the number of splits.
In PP/DS, however, there is a split function. I'm not sure if it is what you want.
http://help.sap.com/saphelp_scm50/helpdata/en/3d/7e653a0c52425fe10000000a114084/frameset.htm
Hope it can help.
Watson

Similar Messages

  • Deletion of Serial numbers/ Parallel sequences from production order

    HI ,
    I am trying to delete the serial numbers from the production order programmatically using the FM SERNR_DEL_FROM_PP. I don't get any error, but the serial number is not deleted from the production order. If anyone has used this FM for deletion, please let me know what extra has to be done.
    Similarly, I have a requirement for deletion of a parallel sequence from the production order. Is there any FM to do the same?
    Code snippet below:
    CALL FUNCTION 'SERNR_DEL_FROM_PP'
      EXPORTING
        material              = i_matnr
        j_vorgang             = 'PMP2'
        ppaufnr               = i_paufnr
        ppposnr               = i_itemno
      IMPORTING
        anzsn                 = l_num_serno
        " zeilen_id and serial_commit (optional) are omitted here
      TABLES
        sernos                = gt_sernos
      EXCEPTIONS
        serialnumber_errors   = 1
        serialnumber_warnings = 2
        OTHERS                = 3.

    DATA ls_afpod_po TYPE afpod.
    MOVE-CORRESPONDING gs_afpod_po TO ls_afpod_po.
    " Update the number of serial numbers in the parent order
    ls_afpod_po-anzsn = l_num_serno.
    UPDATE afpo
      SET anzsn = l_num_serno
      WHERE aufnr = i_paufnr
        AND posnr = i_itemno.
    COMMIT WORK AND WAIT.

    Hello,
    the function module SERNR_DEL_FROM_PP internally calls the function module SERNR_DEL_FROM_DOCUMENT. There, the changes are only written to memory.
    See also the code below:
    CALL FUNCTION 'STATUS_BUFFER_EXPORT_TO_MEMORY'             "P99K058111
            EXPORTING                                             "P99K058111
                 I_MEMORY_ID = MEMID_STATUS.                      "P99K058111
    The variable memid_status has the value 'SN_STATS'.
    So if you want the data to actually be updated on the database, you have to call the following function module afterwards:
    CALL FUNCTION 'SERIAL_LISTE_POST_PP'
      EXPORTING
        memory_id_status = 'SN_STATS'.
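    A minimal sketch of where this could go in the report above, assuming it follows the SERNR_DEL_FROM_PP call (the sy-subrc check and the final commit are my additions, not from the original post):
    IF sy-subrc = 0.
      " Post the serial number status changes buffered under 'SN_STATS'
      CALL FUNCTION 'SERIAL_LISTE_POST_PP'
        EXPORTING
          memory_id_status = 'SN_STATS'.
      COMMIT WORK AND WAIT.
    ENDIF.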
    Regards Simon

  • Difference between parallel sequence and parallel operation in a routing.

    Hi Experts,
    Can anyone explain to me, with an example, the difference between a parallel sequence and a parallel operation in a routing? When can we use a parallel operation and a parallel sequence with component allocation?
    Regards
    Deepak sharma

    I think you need to modify your question; I believe you are asking about parallel sequences and alternative sequences. Below are the details from the SAP site.
    A parallel sequence enables you to process several operations at the same time.
    You use an alternative sequence, for example, if:
    - The production flow is different for certain lot-size ranges.
    For instance, you can machine a workpiece on a conventional machine or on NC machines. An NC machine has a longer setup time than a conventional machine, but the machining costs are considerably less. Therefore, whether you use NC machines will depend on the lot size.
    - The production flow changes under certain business conditions.
    For instance, if you have a capacity problem, you have some production steps performed externally by a vendor.

  • Schedule parallel activities on different resources for the same period

    Hi,
    I am trying to create a PPM where I need to define parallel activities on different resources for the same time period.
    For example:
    Input prod: P1
    Output prod: O1
    Now I define a dummy activity A1 to define this consumption i.e. 1 PC of P1 gives 1 PC of O1.
    Now I have got 3 resources R1, R2, R3.
    I define 3 activities A2, A3, A4 for these resources.
    Now I have to arrange activities A2, A3, A4 such that they start simultaneously after activity A1 and also end simultaneously.
    The Fixed duration for all the resources is 1 week.
    My requirement is that, out of these resources, the system should plan the demand considering the resource with the minimum capacity,
    i.e. a kind of bottleneck resource.
    I am using CTM engine for planning.
    I have defined A1 as predecessor for all the 3 activities A2, A3, A4 and after completion all the activities have a dummy successor activity  S.
    But this setting is not working and the PPM is not getting exploded. If I use a normal linear relationship between activities it works fine, but then my requirement is not satisfied.
    Can anyone please help me in this case or suggest some alternative method?
    Thanks & Regards,
    Sanjog Mishrikotkar

    Hi,
    You cannot use the old solman key, as the systems are on different servers.
    I think you have to delete the old Dev system and then generate a new solman key.
    Thanks
    Sunny

  • Scheduling of Parallel Sequence in Routing & Production Order

    Hello Friends,
    I have gone through the SAP Library and many other threads related to maintaining and scheduling a parallel sequence in the routing as well as in the production order.
    For the example, I have 3 operations: 0010, 0020 & 0030.
    I want to maintain 0020 & 0030 as parallel operations (parallel sequence), i.e. for both operations I have maintained the same lead times so that both should start and end at the same time, in parallel.
    First of all, I have maintained all operations (0010, 0020 & 0030) in standard sequence 0 and also ticked the Required splitting checkbox.
    In the same way, I have maintained operations 0020 & 0030 in a parallel sequence with branch operation 0020 and return operation 0030. The reference sequence I have maintained as 0 (standard sequence).
    Now when I check the graphic, it does not show operations 0020 & 0030 as scheduled in parallel, and hence the same is not reflected in the production order either.
    Kindly share your valuable views regarding other settings for the parallel sequence that I may be missing.
    Thanks in advance.
    Regards,
    Tejas

    Hi Abhijt,
    Thanks for your reply. I went through your thread and followed the same steps as mentioned in your suggested threads.
    I have removed the parallel sequence completely and maintained Required overlapping for all my 3 operations: 0010, 0020 & 0030.
    I have maintained workcenters as below :
    0010 A  Set up 2.0 Hrs   Machine 5.0 Hrs   Labour 1.0 Hr
    0020 B  Set up 2.0 Hrs   Machine 5.0 Hrs   Labour 1.0 Hr
    0030 A  Set up 2.0 Hrs   Machine 5.0 Hrs   Labour 1.0 Hr
    I checked scheduling in the routing itself, the same way you suggested, with today's date as the start date and forward scheduling; it gives the following results:
    0010 start time 07:00:00
    0020 start time 07:13:42
    0030 start time 09:14:39
    My concern is mainly with the start times of operations 0010 & 0020: since these are processed on different work centers A & B respectively, why do they not start at exactly the same time, 07:00:00, instead of one operation starting at 07:13:42? I am unable to find where this difference of 13 minutes 42 seconds comes from.
    Kindly provide your feedback. Also let me know if you need any more details.
    Regards,
    Tejas

  • Error in scheduling parallel sequences with flow production operations

    We have a routing with a standard sequence whose operations have "Continuous flow prod"
    and "Min. send-ahead qty = 1" set.
    We have a parallel sequence as well, with operations set in the same way ("Continuous flow prod"
    and "Min. send-ahead qty = 1").
    When you run transaction MD11 to create a planned order for the item, the scheduling of the operations is correct
    as regards the standard sequence: the system correctly schedules the operations, overlapping
    them according to the lead time set in the operation details.
    But the last operation of the parallel sequence is scheduled to finish the whole lot before the operation
    following the "return operation". The system doesn't overlap this operation. In this way the
    last item of the last operation (in the parallel sequence) isn't finished at the same time
    as the last item of the "return operation" (in the standard sequence), as we would like it to be.
    We have set the "Backwards scheduling" type in Customizing and we use "Alignment key = 2"
    for the standard sequence and in the parallel sequence as well. We have also tried setting "Alignment key = 1"
    in the parallel sequence, but the results are still not correct.
    In conclusion, the scheduling of continuous flow production operations
    doesn't work properly between operations belonging to parallel sequences.
    Could you suggest what to do?
    Thank you.

    Hi,
    I tried with 5 operations 10, 20, 30, 40, 50 (same work center WC1),
    setup time 1 min and process time 1 min per piece,
    and set continuous flow production at operation 20 with min. send-ahead qty 1.
    A parallel sequence was also created with 3 operations 20, 30, 40 (same work center WC2);
    setup time was 2 min for all work centers and processing time
    was 1 min per piece.
    In that sequence, continuous flow production was set at operation 20 with min. send-ahead qty 1.
    The scheduling type was forward.
    The alignment key was 2 for both the standard sequence and the parallel sequence.
    After scheduling, I found that operation 20 in the standard sequence starts at 6:14 am and operation 40 ends
    at 6:47 am, whereas in the parallel sequence operation 20 starts at 6:11 am and operation 40 ends
    at 6:47 am, which is correct (branch operation is 20 and return operation is 40).
    The three-minute later start of operation 20 in the standard sequence is due to the three minutes
    of extra setup time defined for the three work centers.
    I also tried with alignment key 1 in both the standard sequence and the parallel sequence.
    Kindly check that you have maintained the continuous flow production radio button
    in the same operation in both the standard sequence and the parallel sequence.
    Regards,
    S.Nandhakumar

  • Can LabVIEW global variables be shared between UUT sequence steps in the parallel sequence model?

    Sorry for this simple question; I'm having a hard time finding the answer.
    If I launch a sequence to run on two UUTs in the parallel sequence model, can I get both UUTs to run LabVIEW VIs that use global variables such that:
    1) UUT 0 executes 20 LabVIEW VI steps asynchronously, 5 of which access data from LabVIEW global variables A, B, C (strings)
    2) UUT 1 executes the same 20 LabVIEW-based steps asynchronously, 5 of which access data from the same LabVIEW global variables A, B, C (strings)
    I am a little worried that using file globals may introduce delays or more of a race condition than using native LabVIEW global variables, as in a single LabVIEW application.
    QUESTION 2: Are file globals actually written to the hard drive and shared between parallel sequences through file transfer? Or are they in memory?
    Brad Whaley
    LabVIEW Certified Engineer

    Hi bdwhaley,
    Are your parallel sequences only reading from the global variable, or are they writing to the variable as well? If it is just reading, then you should be okay. File globals are read from memory; every time they are read, a copy of the data is made in memory.
    Humphrey H.
    Applications Engineer
    National Instruments

  • Can we have different products between B2B and B2C

    Hi Experts,
    I have one basic question: can we maintain different products between B2B and B2C users? Our requirement is that B2B users should get all the products, but B2C users should get a limited set of products. Is there a way we can set this up to meet the requirement? Please let me know. Thank you very much in advance.
    With Regards,
    Sudheer.

    You do this by creating two product catalogs (PCAT_B2C and PCAT_B2B, for example). Then, in ShopAdmin, you assign the B2C site to PCAT_B2C and the B2B site to PCAT_B2B.
    Another possibility, although not very user friendly, is to use one PCAT and allow B2B users to add items to their cart directly, even though they're not in the PCAT. The drawback here is that they MUST know the exact product number and they cannot search for it. I can give more info on this if you feel it might work for you, but it is not a very good option as a general rule.

  • Production Order not printing with Parallel Sequence ?

    Hi All,
    The user is unable to print production orders when a parallel sequence is used, but orders using the standard sequence print fine.
    Printing is done in the update task, so the user can release the order, and when he saves the order, it should get printed.
    Currently this is not happening; there is no spool generated.
    Can anyone please tell me the printer configuration required for this, or any other configuration we are missing?
    Thanks,
    Bhuvan

    Dear Bhuvan,
    I'm not sure if this is the issue, but at a high level, check whether the control key assigned to the operations in the
    parallel sequence supports printing in general (checkbox set).
    Regards
    Mangalraj.S

  • How to use parallel sequence for split the operation qty. urgent or other o

    PP guru
    My scenario is as follows.
    I have one material, suppose xyz.
    For that material I've created a BOM and routing.
    In the routing I've maintained only one operation, 0010.
    Now I have a production order for 1000 kg.
    I want to use two work centers to run this production order (e.g. work centers A and B).
    In the routing I used work center A.
    In short, I want to split that operation.
    So what do I have to do? Should I use a parallel sequence, and how?
    Or is there any other option to split the order across two or more machines/work centers?
    Please explain briefly.
    Regards,
    Ram

    Hi,
      If you want to carry out the production with two different work centers, first you need to split the operation quantity.
    For this, in the order-type-dependent parameters (OPL8),
    on the Controlling tab page, enable the Cost collector indicator
    and set the default rule to PP2.
      Create a product cost collector with KKF6N.
      Now in the production order, select the operation and choose Functions -> Split.
      Now enter the operation split quantity and execute.
      Now in MD04 you can see two production orders.
      In the second production order, change the work center in the operation overview as desired and save.
      Regards,
    nandha

  • Difference between different RFCs

    Hi All,
    Could you please provide me with some useful material or brief information where I can find out:
    what are sRFC, tRFC and qRFC?
    Where are they used?
    And what is the difference between them?
    Regards,
    Arkesh Sharma

    Please check this link
    http://help.sap.com/saphelp_nw2004s/helpdata/en/62/73241e03337442b1bc1932c2ff8196/content.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/6f/1bd5b6a85b11d6b28500508b5d5211/content.htm
    The qRFC Communication Model
    qRFC Properties and Possible Uses
    All types of applications need to communicate with other applications. This communication may take place within an SAP system, with another SAP system, or with an application in a remote external system. An interface that can be used for this task is the Remote Function Call (RFC). RFCs can be used to start applications in remote systems and to execute particular functions.
    Whereas the first version of the RFC, the synchronous RFC (sRFC), required both systems involved to be active in order to produce synchronous communication, the subsequent generations of RFC had a greater range of features at their disposal (such as serialization, a guarantee of one-time-only execution, and the fact that the receiver system does not have to be available). These features were further enhanced through the queued RFC with inbound/outbound queue.
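    To make the terminology concrete, here is a minimal sketch of a plain synchronous RFC (sRFC) call in ABAP; the destination name 'TARGET_SYS' is an assumption, and RFC_SYSTEM_INFO is just the standard demo function module:
    DATA lv_info TYPE rfcsi.

    " Synchronous call: the program waits for the remote result
    CALL FUNCTION 'RFC_SYSTEM_INFO'
      DESTINATION 'TARGET_SYS'
      IMPORTING
        rfcsi_export          = lv_info
      EXCEPTIONS
        communication_failure = 1
        system_failure        = 2
        OTHERS                = 3.

    IF sy-subrc = 0.
      WRITE: / 'Remote system ID:', lv_info-rfcsysid.
    ENDIF.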
    Communication between applications within an SAP system and also with a remote system can basically be achieved using the Remote Function Call (RFC). Here, the following scenarios are possible:
    · Communication between two independent SAP systems
    · Communication between a calling SAP system and an external receiving system
    · Communication between a calling external system and an SAP receiving system
    The following communication model shows what these communication scenarios may look like in reality. The actual sending process is still done by the tRFC (transactional Remote Function Call). Inbound and outbound queues are added to the tRFC, leaving us with a qRFC (queued Remote Function Call). The sender system is also called the client system, while the target system corresponds to the server system.
    Scenario 1: tRFC
    This scenario is appropriate if the data being sent is independent. A calling application (or client) in system 1 uses a tRFC connection to a called application (or server) in system 2. In this scenario, data is transferred by tRFC, meaning that each function module sent to the target system is guaranteed to be executed one time only. You cannot define the sequence in which the function modules are executed, nor the time of execution. If an error occurs during the transfer, a batch job is scheduled, which sends the function module again after 15 minutes.
    Scenario 2: qRFC with outbound queue
    In this scenario, the sender system uses an outbound queue to serialize the data that is being sent. This means that function modules which depend on each other (such as update and then change) are put into the outbound queue of the sender system and are guaranteed to be sent to the target system one after the other and one time only. The called system (server) has no knowledge of the outbound queue in the sender system (client), meaning that in this scenario every SAP system can also communicate with a non-SAP system. (Note: the programming code of the server system does not need to be changed; however, it must be tRFC-capable.)
    Scenario 3: qRFC with inbound queue (and outbound queue)
    In this scenario, as well as an outbound queue in the sender system (client), there is also an inbound queue in the target system (server). If a qRFC with inbound queue exists, this always means that an outbound queue exists in the sender system. This guarantees the sequence and efficiently controls the resources in the client system and server system. The inbound queue only processes as many function modules as the system resources in the target system (server) allow at that time. This prevents a server being blocked by a client. A scenario with only an inbound queue in the server system (and no outbound queue) is not possible, since the outbound queue is needed in the client system in order to guarantee the sequence and to prevent individual applications from blocking all work processes in the client system.
    Properties of the Three Communication Types
    To help you decide which communication type you should use in your system landscape for your requirements, the advantages of the three communication types are listed below:
    1. tRFC: for independent function modules only
    2. qRFC with outbound queue: guarantees that mutually dependent function modules are sent one after the other and one time only (serialization). Suitable for communication with non-SAP servers.
    3. qRFC with inbound queue: in addition to the outbound queue in the client system, an inbound queue makes sure that only as many function modules are processed in the target system (server) as the current resources allow. Client and server system must be SAP systems. One work process is used for each inbound queue.
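    As a rough illustration of scenario 2, a common ABAP pattern is to register a queue name and then make the tRFC call in the same LUW; the queue name, the destination 'TARGET_SYS' and the remote-enabled function module Z_UPDATE_CUSTOMER are assumptions for this sketch:
    DATA lv_kunnr TYPE kunnr VALUE '0000001000'.

    " Assign the outbound queue for the next tRFC call in this LUW
    CALL FUNCTION 'TRFC_SET_QUEUE_NAME'
      EXPORTING
        qname = 'CUSTOMER_QUEUE'.

    " The call is only registered here, not yet sent
    CALL FUNCTION 'Z_UPDATE_CUSTOMER'
      IN BACKGROUND TASK
      DESTINATION 'TARGET_SYS'
      EXPORTING
        iv_kunnr = lv_kunnr.

    " The LUW is handed over to the qRFC scheduler when it is committed
    COMMIT WORK.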
    The qRFC Communication Model
    Purpose
    Communication within an SAP system or with a remote system can take place using Remote Function Call (RFC). This enables the following scenarios:
    · Communication between two independent SAP systems
    · Communication between a calling SAP system and an external receiving system
    · Communication between a calling external system and an SAP system as the receiving system
    Implementation Considerations
    The following communication model shows how these communication scenarios can occur in practice. tRFC (transactional Remote Function Call) is still responsible for actually sending communications. tRFC is preceded by inbound and outbound queues, which have led to the name qRFC (queued Remote Function Call). The sending system is called the client system, and the target system represents the server system.
    There are three data transfer scenarios:
    Scenario 1: tRFC
    This scenario is suitable if the data being sent is not interrelated. A calling application (or client) in system 1 uses a tRFC connection to a called application (or server) in system 2. In this scenario, the data is transferred using tRFC. This means that each function module sent to the target system is guaranteed to be processed once. The order in which the function modules are executed, and the time they are executed, cannot be determined. If a transfer error occurs, a background job is scheduled that resends the function module after a defined period of time.
    Scenario 2: qRFC with Outbound Queue
    In this scenario, the sending system uses an outbound queue to serialize the data being sent. This means that mutually dependent function modules are placed in the outbound queue of the sending system and are guaranteed to be sent in the correct sequence, and only once, to the receiving system. The called system (server) has no knowledge of the outbound queue in the sending system (client). Using this scenario, every SAP system can communicate with a non-SAP system (the program code of the server system does not need to be changed, but it must be tRFC-compliant).
    Scenario 3: qRFC with Inbound Queue (and Outbound Queue)
    In this scenario, in addition to the outbound queue in the sending system (client), there is also an inbound queue in the target system (server). qRFC with an inbound queue always means that an outbound queue exists in the sending system. This guarantees that the sequence of communications is preserved, and at the same time the resources in the client and in the server system are controlled efficiently. The inbound queue is processed using an inbound scheduler, which only processes as many queues in parallel as the current resources in the target system (server) will allow. This prevents a server from being blocked by a client.
    Features
    Features of the Three Communication Types
    To help you decide which communication types you need to implement according to your system landscape and your requirements, the advantages of the three types of communication are explained below:
    · tRFC
    Suitable only for independent function module calls; the sequence of the calls is not preserved
    · qRFC with outbound queue
    Function modules in a queue are guaranteed to be processed only once and in sequence (serialization). Also suitable for communication with non-SAP servers.
    · qRFC with inbound queue
    The function modules created in the outbound queue are transferred from the outbound queue to the inbound queue; the sequence of the function modules is preserved. An inbound scheduler processes the inbound queues in accordance with the specified resources. Both the client and the server system must be SAP systems. One work process is used for each inbound queue.
    Queued Remote Function Call (qRFC)
    Purpose
    All types of applications need to communicate with other applications. This communication may take place within an SAP system, with another SAP system, or with an application in a remote external system. An interface that can be used for this task is the Remote Function Call (RFC). RFCs can be used to start applications in remote systems and to execute particular functions.
    Integration
    In contrast to the first version of RFC, synchronous RFC (sRFC), which required both participating systems to be active to enable synchronous communication, subsequent generations of RFC provide a considerably extended range of functions (for example, serialization, a guarantee that processing occurs only once, and the fact that the receiving system does not have to be available). These features were further enhanced through the queued RFC with inbound/outbound queue.
    Contents:
    The information about qRFC is organized into the following main sections, with more detailed subsections:
    The qRFC Communication Model
    · qRFC with Outbound Queues
    · qRFC with Inbound Queues
    qRFC Administration
    · qRFC Administration: Introductory Example
    · Outbound Queue Administration
    · Inbound Queue Administration
    qRFC Programming
    · qRFC Programming: Introductory Example
    · Outbound Queue Programming
    · Inbound Queue Programming
    · qRFC API
    For an introduction to the new bgRFC (Background RFC), use the following links:
    bgRFC (Background RFC)
    · bgRFC Administration
    · bgRFC Programming
    Using Asynchronous Remote Function Calls
    Asynchronous remote function calls (aRFCs) are similar to transactional RFCs, in that the user does not have to wait for their completion before continuing the calling dialog. There are three characteristics, however, that distinguish asynchronous RFCs from transactional RFCs:
    · When the caller starts an asynchronous RFC, the called server must be available to accept the request.
    The parameters of asynchronous RFCs are not logged to the database, but sent directly to the server.
    · Asynchronous RFCs allow the user to carry on an interactive dialog with the remote system.
    · The calling program can receive results from the asynchronous RFC.
    You can use asynchronous remote function calls whenever you need to establish communication with a remote system, but do not want to wait for the function's result before continuing processing. Asynchronous RFCs can also be sent to the same system. In this case, the system opens a new session (or window). You can then switch back and forth between the calling dialog and the called session.
    To start a remote function call asynchronously, use the following syntax:
    CALL FUNCTION Remotefunction STARTING NEW TASK Taskname
    DESTINATION ...
    EXPORTING...
    TABLES ...
    EXCEPTIONS...
    The following calling parameters are available:
    · TABLES
    passes references to internal tables. All table parameters of the function module must contain values.
    · EXPORTING
    passes values of fields and field strings from the calling program to the function module. In the function module, the corresponding formal parameters are defined as import parameters.
    · EXCEPTIONS
    See Using Predefined Exceptions for RFCs
    RECEIVE RESULTS FROM FUNCTION Remotefunction is used within a FORM routine to receive the results of an asynchronous remote function call. The following receiving parameters are available:
    · IMPORTING
    · TABLES
    · EXCEPTIONS
    The addition KEEPING TASK prevents an asynchronous connection from being closed after receiving the results of the processing. The relevant remote context (roll area) is kept for re-use until the caller terminates the connection.
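    A minimal sketch of an asynchronous call with a result callback, using the standard demo function module RFC_SYSTEM_INFO; the destination 'NONE' (a new session in the same system) and the form name are assumptions:
    DATA: gv_rfcsi TYPE rfcsi,
          gv_done  TYPE abap_bool.

    START-OF-SELECTION.
      " Fire the call and return immediately; results arrive in the callback
      CALL FUNCTION 'RFC_SYSTEM_INFO'
        STARTING NEW TASK 'INFO'
        DESTINATION 'NONE'
        PERFORMING receive_info ON END OF TASK.

      " Wait until the callback has flagged completion
      WAIT UNTIL gv_done = abap_true.
      WRITE: / 'Remote system ID:', gv_rfcsi-rfcsysid.

    FORM receive_info USING p_task TYPE clike.
      RECEIVE RESULTS FROM FUNCTION 'RFC_SYSTEM_INFO'
        IMPORTING
          rfcsi_export          = gv_rfcsi
        EXCEPTIONS
          communication_failure = 1
          system_failure        = 2.
      gv_done = abap_true.
    ENDFORM.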
    Transactional RFC (tRFC)
    Transactional RFC (tRFC, previously known as asynchronous RFC) is an asynchronous communication method that executes the called function module just once in the RFC server. The remote system need not be available at the time when the RFC client program is executing a tRFC. The tRFC component stores the called RFC function, together with the corresponding data, in the SAP database under a unique transaction ID (TID).
    If a call is sent, and the receiving system is down, the call remains in the local queue. The calling dialog program can proceed without waiting to see whether the remote call was successful. If the receiving system does not become active within a certain amount of time, the call is scheduled to run in batch.
    tRFC is always used if a function is executed as a Logical Unit of Work (LUW). Within a LUW, all calls
    · are executed in the order in which they are called
    · are executed in the same program context in the target system
    · run as a single transaction: they are either committed or rolled back as a unit.
    Implementation of tRFC is recommended if you want to maintain the transactional sequence of the calls.
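    A minimal sketch of two calls bundled into one LUW; the destination 'TARGET_SYS' and the remote-enabled function module Z_UPDATE_CUSTOMER are assumptions:
    DATA: lv_kunnr1 TYPE kunnr VALUE '0000001000',
          lv_kunnr2 TYPE kunnr VALUE '0000002000'.

    " Both calls belong to one LUW and are executed in this order
    CALL FUNCTION 'Z_UPDATE_CUSTOMER'
      IN BACKGROUND TASK
      DESTINATION 'TARGET_SYS'
      EXPORTING
        iv_kunnr = lv_kunnr1.

    CALL FUNCTION 'Z_UPDATE_CUSTOMER'
      IN BACKGROUND TASK
      DESTINATION 'TARGET_SYS'
      EXPORTING
        iv_kunnr = lv_kunnr2.

    " The LUW is transferred (once only) when it is committed
    COMMIT WORK.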
    Disadvantages of tRFC
    · tRFC processes all LUWs independently of one another. Due to the number of activated tRFC processes, this procedure can reduce performance significantly in both the sending and the target systems.
    · In addition, the sequence of LUWs defined in the application cannot be kept. It is therefore impossible to guarantee that the transactions will be executed in the sequence dictated by the application. The only thing that can be guaranteed is that all LUWs are transferred sooner or later.

  • Parallel Routing in Production

    Hi all,
    I am using a parallel routing. When we create a production order, the routing gets copied into the production order. I am able to see only the standard sequence, but not the parallel sequence. How can we see it in the production order, and how does it work?
    It's urgent.
    Mayur

    Hi Mayur Phalak ,
    Please follow Pradeep's reply to view the parallel sequence.
    You will not be able to view the standard & parallel sequences together in the same screen.
    Coming to your second question, how it works:
    The alignment key controls the scheduling part.
    As you know, the operation with the longest lead time will come in the standard sequence.
    If you set the alignment key to "1", then all the operations (both standard & parallel) will be scheduled to start at the same time but will end at different times, as the lead time varies for each operation.
    If you set it to "2", then all the operations (both standard & parallel) will end at the same time; the floats will be at the beginning, i.e. the start will vary for each operation depending on the lead time.
    Hope this is helpful; revert if you have a further query.
    Regards,
    SVP

  • Confirmation of parallel sequence operation.

    Hello friends,
    I want to do a confirmation of a parallel sequence operation. I use 3 or 5 work centers at a time for one operation, so I have defined parallel sequences for the standard sequence. When I confirm the quantity and activity time in CO11N for one parallel sequence, the quantity and activity time of the other sequences should be decreased. But the system takes the actual quantity and time.
    How do I make the settings so that the quantity and activity values are decreased automatically for the next parallel sequence?
    Waiting for Reply!

    Dear,
    You are not able to get the parallel sequence operation in CO11N because the production order copies the routing based on the routing setup in the order-type-dependent parameters (OPL8).
    Here, basically, you might have checked the Alternative Sequence and Sequence Exchange field (0), so while creating the production order, the routing with the operations in the alternative sequence is copied and used in the CO11N confirmation.
    2. You need to maintain "Operation sequence is not checked" for the operation sequence for the order type/plant combination in the confirmation parameters, using transaction OPK4 (confirmation parameters of the order).
    Try it and revert back.
    Regards,
    JH

  • Mapping in interconnect between different Business Objects

    I want to know how to do transformation and mapping between different business objects in InterConnect.
    We always have very complex SQL when we do integration
    with Oracle InterConnect. We use the DB Adapter or JDBC Adapter, but the complex SQL has to be executed in the source DB or the destination DB, which may put a lot of pressure on them. I wonder: can we use different Business Objects and do the mappings in InterConnect, so that the heavy load is on the InterConnect server, just like with ETL tools? But I have found that InterConnect can do transformation and mapping only within one Business Object. How can I do this? Has anyone else met this problem? Thanks for the discussion.

    For me, Business Objects are logical groupings of business processes. For example, we have a Business Object called "Maintain_Employees". Under this we have 1 Procedure (Create_Employee) and 2 Events (Update_Employee and Delete_Employee).
    We have 1 Oracle system interfacing with 23 other legacy systems. Some of these legacy systems will be using this "Maintain_Employees" Business Object (Common View), and our main transformations will be between the Common View and the legacy Application Views.
    We are using a number of techniques to assist in "validating" data in the InterConnect. The main ones are using 'Cross Reference Tables (XREF)' and 'DatabaseOperation' transformations. By using 'Content Based Routing' we are able to send the right message to the right legacy system, and therefore do the right transformation/validation on the message payload. However, this is only a small part of a complex puzzle.
    I also have the "problem" of having "very complex SQL" on our Oracle system too. This is not unusual when using the InterConnect.
    To my mind, the InterConnect does 2 main operations. Firstly, it performs some message transformation (mapping), and secondly, it acts as a transportation engine (routing) using the adapters.
    The remainder of the effort required to create or consume the message resides with the Applications themselves. Whether it is parsing an XML CLOB payload, inserting data into staging tables, writing to log files, pre-processing data, calling API's or something else, your Application side programming and processing overhead can get large.
    The trade-off is to ask the question: do I want to be able to track and manage messages from start to finish in high detail? Or can I trust that all message payload data will be consumed with no additional processing on the Application side?
    My experience has shown that the bottleneck is always at the Application side, and almost never in the InterConnect.
    The short answer to your first question is "You are right. Mappings can take place only between Application Views and Common Views only - not between Business Objects.".
    To answer your second question "Probably everyone reading this forum has this problem. The intelligence that is able to really interpret message data, validate it and process it is only found in the Application, not the InterConnect. You could, however, use the Workflow engine within OAI in order to provide additional pre-validation, human interaction and logic, but this too could be complex."
    At my current client, we are architecting an Application OAI Message handling schema. This will contain staging tables, pre-processing tables, "OAI" wrapper PL/SQL scripts, "APPS" wrapper PL/SQL scripts and Message Logging and Exception tables. Ours will be a complex set of PL/SQL processes too.
    I hope this helps, just in letting you know that you are not alone with this problem.
    I wonder if anyone else would like to share how they have architected their InterConnect and Application side mapping and transformation solutions.

  • How to know the database changes between different releases

    Where can I get the information of database changes, like for example new columns new tables, etc, between different releases?
    My goal is to try to determine the current version; or is there a better way?
    Thank you

    For JD Edwards World or JD Edwards EnterpriseOne, you can go to upgradejde.com. Once the screen comes up, pick Resources. You will then have the option to choose JD Edwards EnterpriseOne or JD Edwards World. On the next screen, you can select the Compare Releases tab. This allows you to enter the release you are currently on and the latest GA release of the software. You will then see all the differences between your current release and the release you should upgrade to. You can also format this into a printed report.
