Help with Integration Processes (ccBPM)

Please help me to understand Process Container variables and "instance of process". I don't quite have a handle on the concepts. Thanks.
When a message is sent to an Integration Process and it is a receiver/start message, is a new process instance instantiated to process the message?
Say another message is sent to the same Integration Process, is another instance created (ignoring correlation)?
My last question is whether the Process Container variables are Process Instance specific?
Thank you for any help.

Hi Chris,
>> Please help me to understand Process Container variables and "instance of process"
For an integration process to be able to process data such as messages or counters correctly, you must first define the data as container elements. Container elements are similar to variables in a programming language.
Container elements hold the message interfaces; since data flows into and out of the Integration Server (XI) as messages, they are effectively containers that hold data while a business process is executing.
>> When a message is sent to an Integration Process and it is a receiver/start message, is a new process instance instantiated to process the message?
You use a receive step to receive a message. By receiving a message, you transfer the data it brings into the process. You can use a receive step to start a process or within a process that is already running.
Yes: when a message arrives for the receive step that starts the process (and no correlation ties it to an already running instance), a new process instance is instantiated. A receive step inside an already running process does not create a new instance; each receive waits for its particular message, so there can be many receive steps in a single executing business process.
If you use a block step in ParForEach mode, a separate parallel branch is also created within the instance for each line of the multiline container element.
>> Say another message is sent to the same Integration Process, is another instance created (ignoring correlation)?
If the message is for the start receive step and no correlation routes it to an already running instance, then yes, another instance is created. Within one running instance, receive steps arranged one after the other consume messages in sequence: the first message that arrives is assigned to the first receive step, the second message to the second receive step, and so on. The first message is therefore not assigned to all receive steps that are waiting for a message from this message interface.
If you are using a Fork step, you can have multiple receive steps waiting in parallel.
>> My last question is whether the Process Container variables are Process Instance specific?
Yes, container elements are instance-specific: each process instance works with its own copy of the container data. Container elements can be single-line or multiline (a multiline element holds a list of values). See the sketch below.
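To make the instance/container relationship concrete, here is a rough sketch in Python. It is only an analogy: the class names, the correlation on an order number, and the message fields are all made up for illustration and are not SAP APIs.

class ProcessInstance:
    def __init__(self, start_message):
        # each instance gets its OWN container, filled from the start message
        self.container = {"order": start_message, "invoice": None, "counter": 0}

    def receive_invoice(self, message):
        # a later receive step fills another container element of the SAME instance
        self.container["invoice"] = message

class Engine:
    def __init__(self):
        self.instances = {}              # correlation value -> running instance

    def on_message(self, message):
        key = message["order_no"]        # the correlation field
        if key not in self.instances:
            # start message with no matching correlation: a new instance is created
            self.instances[key] = ProcessInstance(message)
        else:
            # correlation matches: the message goes to the existing instance
            self.instances[key].receive_invoice(message)

engine = Engine()
engine.on_message({"order_no": "4711", "type": "ORDER"})    # new instance
engine.on_message({"order_no": "4712", "type": "ORDER"})    # another new instance
engine.on_message({"order_no": "4711", "type": "INVOICE"})  # routed to instance 4711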
Also, please go through these links:
BPM- BPM in practice modeling Business Process:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/nw/a-c/bpm251 - bpm in practice modelling business processes.pdf
BPM from modeling to monitoring,
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/events/sap-teched-04/bpm from modeling to monitoring.pdf
I hope this helps.
Thanks & Regards,
Varun Joshi

Similar Messages

  • Prob with Integration Process

    Hi,
    Just a simple file-to-file scenario with an integration process, and I am not getting the output file for any integration process.
    1) The file is picked up. No problem with that.
    2) SXMB_MONI shows just one chequered flag for the central client.
    3) Receiver grouping is scheduled for outbound processing.
    4) The inbound message is still green.
    5) I checked the return code in SXI_CACHE; it is 0, so no problem there either.
    Why doesn't the integration process run for any scenario?
    Cheers,
    *RAJ*

    Hey Gouri,
    Thanks for the interest shown. I didn't know about logical delete yet, so it was good to learn. Thanks a lot for that.
    Source and target are different in this case.
    What I am doing is:
    Source interface:
    1) data type
    2) message type
    3) source MI (outbound), plus receive abstract and send abstract interfaces (both abstract interfaces referring to the source message type)
    4) integration process with two steps: receive and send
    5) message mapping between the source and target message types
    6) interface mapping between the send abstract interface and the target interface
    As the mapping is between the send abstract interface and the target interface, there is no need for a transformation step.
    Target system
    1) data type
    2) msg type
    3) src MI Inbound.
    This scenario did work before... a long time back.
    The mapping is fine, no errors.
    Cheers,
    *RAJ*

  • R3 to File with integration process- need help

    Hi all,
    Can you please help me with the steps involved for the following scenario:
    IDoc (R/3) to XI (the mapping is done here in an integration process) and then to file.
    I have selected an IDoc (MATMAS).
    I have created a business system (R/3).
    I want to send only a few fields from R/3 (MATMAS) to the file, so I do the mapping in the integration process.
    In the configuration, I don't know how to link this through the integration process.
    Can anyone please help me with the steps.
    I appreciate all your efforts.
    Thanks
    felix.

    Felix, I can think of the following steps; try them and let me know if they help (a schematic summary follows the list):
    1) First receiver determination: BS1 (R/3) as the sender service, MATMAS as the sender interface, and the integration process as the receiver (you will have to import this integration process from the Integration Repository).
    2) Now create an interface determination for the above receiver determination; here you specify the abstract interface (the one referred to by your container object in the receive step of the IP). An interface mapping is not required, since MATMAS is received here as MATMAS without any change.
    3) Second receiver determination: the integration process as the sender service, the abstract interface used in the send step of the IP as the sender interface, and the business system/service that has the file communication channel configured as the receiver.
    4) Now create an interface determination for step 3; here you specify the asynchronous inbound interface used by the file adapter and, optionally, the mapping program to be used.
    5) Now create a receiver agreement to associate your asynchronous inbound interface with the communication channel (file adapter).
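    Purely as an illustration of the two routing hops above, here is a plain-data sketch in Python; the keys and object names (BS1_R3, IP_MATMAS, BS_FILE, CC_File_Receiver and so on) are invented labels, not Integration Directory object names.

    # Schematic only: the two routing hops as plain data.
    routing = [
        {   # hop 1: R/3 sends MATMAS to the integration process
            "receiver_determination": {"sender": "BS1_R3", "interface": "MATMAS",
                                       "receiver": "IP_MATMAS"},
            "interface_determination": {"inbound_interface": "MATMAS_Abstract",
                                        "mapping": None},
        },
        {   # hop 2: the integration process sends the mapped message to the file system
            "receiver_determination": {"sender": "IP_MATMAS", "interface": "SendAbstract",
                                       "receiver": "BS_FILE"},
            "interface_determination": {"inbound_interface": "File_Inbound",
                                        "mapping": "MATMAS_to_File"},
            "receiver_agreement": {"interface": "File_Inbound",
                                   "channel": "CC_File_Receiver"},
        },
    ]

    for hop in routing:
        rd = hop["receiver_determination"]
        print(rd["sender"], "--", rd["interface"], "-->", rd["receiver"])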
    -Saravana

  • Problem with Integration Process

    Hi all,
    I have a problem within an integration process.
    I created a loop with the help of a local integer container element.
    On the development system it ran without any issues, but on the productive machine the integration process runs only until it has to increase my local element by one.
    The system is raising the following error message. (Within the graphical BPM log / Process Engine)
    Expression ''1'{TYPE=SWFXST_INTEGER}' does not correspond to data type () of element
    In general I understand the error message: the integer expression 1 (which should increase the variable by one) has the wrong data type.
    But I am not able to understand why it is running on the development system but not on the productive system.
    I already compared the main configuration settings for BPM between development and production but the settings are equal.
    Do you have any ideas about such a problem, or about additional logs to check?
    cheers,
    Stefan

    Hi,
    Your container operation is fine; there is no issue with assigning a value to the counter.
    I am talking about the condition in the loop step.
    In a BPM loop step you need to provide a condition; it is basically a while loop (see the sketch below).
    I guess you are providing a condition based on the counter element.
    What are you putting there as the loop condition? Please check that.
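    For readers new to the loop step, here is a minimal sketch in Python of what it amounts to; the bound and the variable names are made up, and the container operation shown corresponds to the increment that raised the type error in this thread.

    # The loop step is essentially a while loop over a condition on a container element.
    counter = 0               # local integer container element
    MAX_ITERATIONS = 5        # hypothetical bound used in the loop condition

    while counter < MAX_ITERATIONS:     # the loop-step condition
        # ... steps inside the loop (send, transformation, ...) ...
        counter = counter + 1           # the container operation that increments the counter

    print("loop finished after", counter, "iterations")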

  • Help with scheduling Process Flows in Workflow in 10gR2

    Dear All
    I am after some help with the scheduling of process flows (PF) in Workflow in OWB 10gR2. I am trying to set up some PFs to handle the refresh of some staging tables from various source systems. I have created a separate Process Flow Module for each source system, mainly just to keep them separate and organised. I have a number of mappings which all run fine if executed manually or linked directly to a schedule/job.
    The problem I am encountering is when I try to run process flows from schedules. I have created the process flows and have run them manually, and they complete fine, so I know that the content of each PF is OK. After linking and deploying the jobs I can never get all the process flows to run from the various schedules. What appears to happen is that the first PF works OK, and any other PF within the same Process Flow Module/package also runs OK, even if it is running off a different schedule. However, PFs under the other Process Flow Module fail with the following error:
    CC_DAILY_0400
    Description :
    Runtime User : OWBRT_USER
    Started : 2006-08-31 04:00:00.0
    Status      Log
    Error      ORA-20002: 3114: Activity 'PACK_1/CC_DAILY_0400' is not a process.
    ORA-06512: at "OWF_MGR.WF_ENGINE", line 3920
    ORA-06512: at line 1
    Following this error the PF will not even run manually! If I then stop the schedule and either drop or replace the failed PF (thus redeploying it), the PF runs fine manually, and if I restart the schedule it runs OK the next evening. My problem is that this then appears to impact the other PFs, which, although they have not been touched and ran OK the previous evening, fail the following evening with the same error:
    WS_DAILY_2400
    Description :
    Runtime User : OWBRT_USER
    Started : 2006-09-01 00:00:01.0
    Status      Log
    Error      ORA-20002: 3114: Activity 'PACK_1/WS_DAILY_2400' is not a process.
    ORA-06512: at "OWF_MGR.WF_ENGINE", line 3920
    ORA-06512: at line 1
    ORA-20002: 3114: Activity is not a process.
    I basically cannot get both sets to run, even though they are in separate modules and on separate schedules. Has anyone any idea what could be wrong, or whether I am setting something up in a strange way that would cause these symptoms?
    All help or advice greatly appreciated
    Regards Kevin

  • Help with batch processing

    Hi, I am having trouble with batch processing. I want to apply adaptive noise reduction (ANR) to several files from the same source tapes. So, following the help instructions, I create a script from one file. When I try to apply it to the other files, instead of applying ANR to them, it simply saves them as is. The steps I perform are as follows:
    I select the entire wave (I have also tried it without first selecting)
    I open the scripts box in the file menu
    I click on open/new script collection and create a new script
    I title the new script
    I click 'record'
    I close the scripts box
    I run ANR on the file
    I go back to the scripts box and click 'stop current script'
    I click add to collection
    I close the scripts box
    I open the batch processing box
    I select the files, then the script, then I click to start processing
    When I try to apply this script to other files, it saves them but doesn't apply ANR.
    Can anyone suggest what I am doing wrong? Also, in the scripts box, there are three options for the type of script that I can create. One is highlighted: 'Scripts start from scratch.' Is that the correct one to use? If not, how do I select one of the others? I click on them but nothing happens. Finally, when creating a script, do I save the file before or after clicking 'stop record'? I have tried it both ways but don't know which is correct (neither seems to make a difference).
    Just to be as thorough as possible, below is the text of one of my scripts, in case that contains any clues. Thanks so much for any and all help!!
    Collection: doubletnr
    Title: d2
    Description:
    Mode: 2
    Undo: 0
    Selected: none at 0 scaled 109479495 SR 44100
    Freq: Off
    cmd: Channel Both
    Selected: 0 to 109479494 scaled 109479495 SR 44100
    Freq: Off
    Comment: Restoration\Adaptive Noise Reduction
    cmd: {EA93BBBE-0B8F-47D6-AC6D-B67B46524E41}
    1: 52,AAA€ÛÚZ€AAQ€ãýÇÛÚÚAAA€Ďč¸āĂ}úƒÓ~ip€k•iëåÁëåÁ
    2:
    3: 13
    4: 0
    5: 0
    6: 3190728
    7: 0
    8: 0
    9: 0
    10: 1
    Freq: Off
    End:

    This is what my script looks like. Although I was worried about the strange characters in the script, it does work.
    Make sure that the script you use is suitable for the sample rate and bit depth of the file. E.g. if you have 44100/32-bit files, then don't use a script that was recorded on a file with e.g. 48000/16-bit. If required, resample or make different scripts for different rates/depths.
    Collection: Test ANR AA Forum
    Title: Test ANR AA Forum
    Description:
    Mode: 4
    Undo: 1
    Selected: 0 to 9233563 scaled 9233563 SR 44100
    Freq: Off
    cmd: Channel Both
    Selected: 0 to 9233563 scaled 9233563 SR 44100
    Freq: Off
    Comment: Restoration\Adaptive Noise Reduction
    cmd: {EA93BBBE-0B8F-47D6-AC6D-B67B46524E41}
    1: 52,AAA€ÛÚZ€AAQ€ãýÇÛÚÚAAA€Ďč¸āĂ}úƒÓ~ip€k•iëåÁëåÁ
    2:
    3: 13
    4: 0
    5: 0
    6: 154705552
    7: 0
    8: 0
    9: 0
    10: 1
    Freq: Off
    End:

  • Urgent help with BPEL process

    Hello there,
    I need help with BPEL project.
    I have created a table Employee in the database.
    I created the application, the BPEL project, and the connection to the database using the Database Adapter.
    I need to read the records from the database, convert them into XML format, and send them for approval to the BPM Worklist.
    Can someone please describe step by step what I need to do?
    Thanks,
    Dps

    Read the demo examples that ship with Oracle BPEL Process Manager.
    I have reached a proficient level in BPEL over the past year, but I still need to reach the excellent mark.
    Thanks & Regards,
    Gopal D. Kalsekar
    Sr. Software Developer
    Business Solutions (eGroup)
    M.H. Alshaya Company W.L.L.
    www.alshaya.com
    Jai Maharashtra
    P :- (965) 224 3598
    F :- (965) 224 2488
    E :- [email protected]

  • Serial processing with integration processes

    Hi all,
    I have the following problem. I receive an XML message in an integration process; this XML message contains several business partners. I loop over these business partners with a 'ForEach'. The resulting messages are sent to a second integration process, in which a business partner is updated in SAP CRM.
    The problem is that several instances of this second process are started, and this results in locking issues in SAP CRM. How can I configure XI so that only one message is processed at a time?
    I read all the queue-related documentation; according to it, a message should only be picked up from the queue when the queue is ready.
    I use only one queue (without buffering). The system is XI with the latest service pack.
    Thank you in advance for your reply.
    Kind regards,
    Pieter Alting

    > I loop over these business partners with a 'ForEach'. The messages are sent to a second integration process. In this integration process a business partner is updated in SAP CRM.
    Do you really need another IP here? Maybe you should look at avoiding it, and check whether you are using ParForEach or just ForEach (see the sketch below).
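    To illustrate the difference the reply is pointing at, here is a rough Python sketch; update_partner, the sleep, and the thread pool are stand-ins for the CRM update and the parallel process instances, not XI or CRM calls.

    from concurrent.futures import ThreadPoolExecutor
    import time

    def update_partner(bp):
        time.sleep(0.1)                  # stand-in for the CRM update that takes a lock
        return "updated " + bp

    partners = ["BP-1", "BP-2", "BP-3"]

    # ForEach-like: strictly one at a time, so only one update (and one lock) is active
    serial = [update_partner(bp) for bp in partners]

    # ParForEach-like: all at once; parallel instances are where locking conflicts come from
    with ThreadPoolExecutor() as pool:
        parallel = list(pool.map(update_partner, partners))

    print(serial)
    print(parallel)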
    VJ

  • How to Cancel the integration process (ccBPM) after a specific period

    Dear Experts,
    I have a ccBPM integration process with two receive steps using one correlation:
    1. The first receive step receives the delivery IDoc.
    2. The second receive step receives the invoice IDoc of the corresponding delivery, using the correlation.
    This works perfectly when both documents arrive in XI.
    The problem is that if the delivery comes but the invoice doesn't come for some reason, the process stays active and waits indefinitely. We want to cancel the integration process if the invoice IDoc is not received within 48 hours.
    Please guide me on how to achieve this in ccBPM.
    We are working on PI 7.0.
    thanks and regards,
    Ravi Siddam

    Hi,
    You can use a block step with a deadline and cancel the process after a particular time (48 hours in your case).
    The control step has three operations: throw exception, raise alert, and cancel process (see the sketch below).
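    As a rough, framework-neutral illustration of the wait-with-deadline idea, here is a Python sketch; the queue, the function names and the 48-hour constant are placeholders, not the ccBPM runtime or an actual deadline branch.

    import queue

    DEADLINE_SECONDS = 48 * 60 * 60
    invoice_inbox = queue.Queue()        # stand-in for the correlated second receive step

    def run_process(delivery_idoc, timeout=DEADLINE_SECONDS):
        try:
            # wait for the correlated invoice, but only up to the deadline
            invoice_idoc = invoice_inbox.get(timeout=timeout)
        except queue.Empty:
            # deadline branch: raise an alert / throw an exception / cancel the process
            return "cancelled: no invoice within the deadline for " + delivery_idoc
        return "matched invoice " + invoice_idoc + " with delivery " + delivery_idoc

    print(run_process("DELVRY-0001", timeout=1))   # demo with a 1-second deadline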
    Thanks,
    Madhu
    Edited by: Madhu sudhan Reddy on Jul 31, 2008 8:18 AM

  • I need help with the processes running a media server.

    Hi there! I need some help with the following log, please. I am assuming the processes listed are the current processes in use between my MacBook Pro and the media server? Is that correct? Are these common processes?
    Incident Identifier: EC931B64-E141-4C64-B428-427DF014C7E8
    CrashReporter Key:   b16be41bf16206d8f231e7e71676ab2a9c4dd25e
    Hardware Model:      iPhone4,1
    OS Version:          iPhone OS 5.0.1 (9A405)
    Kernel Version:      Darwin Kernel Version 11.0.0: Tue Nov  1 20:34:16 PDT 2011; root:xnu-1878.4.46~1/RELEASE_ARM_S5L8940X
    Date:                2012-08-24 16:06:18 -0400
    Time since snapshot: 152 ms
    Free pages:        1195
    Wired pages:       88383
    Purgeable pages:   0
    Largest process:   mediaserverd
    Processes
             Name                 UUID                    Count resident pages
                 atc <2271ed33ec773eeb9f381bf1baac9dee>     390
           securityd <e31a714c227a3d1c98ef8aacd44d91ee>     243
             assetsd <281396d3e7d831fbb6a5374157663dbc>    1370
          MobileMail <7064f2baf3f23db987bc8ec99855fe53>    1438 (jettisoned)
            mstreamd <cbe9881735043a389e7cdad3b5bcf5ce>    1099 (jettisoned)
              Camera <88291709452932ac9cbd0f1c06902214>    3105 (active)
         dataaccessd <b4f61f117ee635c48329af8572733d30>    1760
         MobilePhone <fe38c6944a053c9187b41ee50aa151b0>    5549
            networkd <6ee7a78e56073f6e8db4c2cc3265fdb4>     170
          aosnotifyd <58089d732ab43bbea0aec4a6f812f446>     320
            BTServer <e03baab8e0103188979ce54b87591065>     261
          aggregated <68a25a1690cb372096543a46abed14d7>     337
                apsd <e4b6e6e4f31e36f79815747ecbf52907>     291
       fairplayd.N94 <2c0105776e393b39ba95edffaf3bdd17>     294
           fseventsd <78af02202422321885dfc85c24534b0e>     170
                iapd <3ee7f82879033b4fb93b9cf1f4ecae29>     366
             imagent <8e2042f2ec9e3af9ba400f031f1bbfa7>     416
       mDNSResponder <b75f43f012ad3d9ea172d37491994e22>     265
        mediaremoted <b9fa7d1381013c2fa90ea134ff905f59>     258
        mediaserverd <478e5e8345c83be5ba1868906813bb75>    6774
                 ubd <7eaf0b0ca5b83afabecb0dfaa38c7a19>     389
               wifid <e176ab123beb3000bdb89e020612c1d6>     284
           locationd <91c84ab19dd03e4ab1b4cc30178ab1c0>     831
              powerd <25ddef6b52e4385b819e777dd2eeed3c>     167
           lockdownd <a68aa1526ef13a9bb4426bb71ffc1e3c>     250
          CommCenter <51922c9a50e73fe3badccaa4b1b1123b>     781
             syslogd <dd3766bcb1213e91b66283635db09773>     107
         SpringBoard <7506c20d86da3f1dbe9bf38f8bda253d>    5673 (active)
             configd <3430c0025ed13f56800a329b7254d2ae>     418
             notifyd <3793fabace3a385687b3c29c1fa1fcac>     252
      UserEventAgent <6e1cabc1ec6d372c90a6bdeaa7b258fa>     433
             launchd <cc35dd7a872334319ed028e6bbeae081>     133
    **End**
    Thanks a bunch!!!

    Could not have been bought brand new in 2011** apologies

  • Integration Process/ccBPM - When to be used

    Hi All,
    I have a scenario where I need to trigger web services in a third-party Java-based system, based on a value provided by SAP Workflow. Two different web services are involved, and they need to be triggered based on the response from the user.
    Below are the steps that are involved in the entire process:
    1. The user requests an order; if the material quantity is found, service 1 is triggered.
    2. If the material quantity is not found, service 2 is triggered, providing the user with alternate options.
    3. Based on the user's response, if they agree to an alternate option, service 1 is triggered.
    Can anybody explain how to do this in XI, or particularly in ccBPM?
    Please let me know if you need further information.
    Help will be appreciated.
    Thanks and best regards,
    Kulwant

    Hi Kulwant,
    Though I mentioned some steps earlier as well, I am putting them here again for you.
    The following are the steps for the Integration Repository:
    1) You need an outbound proxy program to send data from SAP Workflow. I am not proficient in SAP Workflow, but I know you can write code in the workflow steps to send data to SAP XI; you will find examples of writing an outbound proxy program on SDN.
    2) Your outbound proxy then triggers this particular scenario. It brings the input parameters for the first Java-based web service into XI.
    3) You need to create a data type, message type, and outbound message interface for your SAP Workflow structure. You need not create a data type and message type for the Java-based web service, as you will import its WSDL directly and use it as your inbound message interface in the message and interface mappings.
    4) If the third-party web service is synchronous, then you need to create two message mappings for the first Java-based web service: one for the request and one for the response. You will also need a response structure to map the first web service's response; alternatively, you can pass the output of the first web service directly as input to the second Java-based web service.
    5) Create a response mapping for the response of the second web service in a similar fashion.
    Configuration / Integration Directory:
    1) On the sender side you do not need a communication channel; on the receiver side you do, as I mentioned earlier. To create a communication channel, you first need to create or import (if already created) a business system or a business service (you can read about these on help.sap.com; ask me if you need the link). I would suggest a business system, as the web service provider is a third party. How to do it:
    ID --> <your configuration scenario> --> Service without Party --> Business Service / Business System --> right-click to create or assign.
    Note: looking at yesterday's posts, I believe it is fine for you to create just a business service.
    2) When you are done with this, a communication channel node appears inside your business system or business service. Right-click to create a communication channel, give it a name, and press F4 to select the adapter type SOAP; it will be a receiver channel. See help.sap.com for more information on it.
    Note: you will need two receiver SOAP communication channels for executing the two Java-based web services.
    3) I have already mentioned the other parameters generally needed for configuring the communication channel in my earlier posts.
    Hint: the target URL needed to configure the communication channel is usually found in the <SOAP:action> tag in the WSDLs of the respective web services.
    4) Next create a Receiver determination --> Interface determination --> Sender Agreement --> Receiver Agreement
    5) For the receiver determination, you need to enter a business service/business system and the sender message interface. If you have chosen your own service, you need to register your message interface in the service: double-click on the service name to do it; there are options to register inbound/outbound interfaces.
    You can use a conditional receiver determination to evaluate the user response and call the relevant web service based on it. On the 'Edit Receiver Determination' screen there is a 'Configured Receivers' tab, where you can press F4 in the condition field. When you press F4 a new screen opens; in the left operand field press F4 again, another screen opens, select 'XPath', and there select the field in your sender structure (the structure coming from Workflow) that contains the user choice (say 1 or 2, as you mentioned).
    Similarly, in the same receiver determination, based on the condition you can call either web service 1 or 2 (a small sketch of this condition follows the steps).
    6) For the interface determination, perform the same steps as in step 5; you also need to enter your receiver service or business system.
    7) In the sender and receiver agreements you specify the sender and receiver communication channels. In your case you do not need a sender agreement, as you are sending data from R/3 to XI directly.
    If you configure these objects completely, I believe your scenario should run.
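    Just to make the conditional routing idea tangible, here is a small Python sketch using the standard library; the payload structure (OrderRequest/UserChoice) and the receiver names are invented for the example and are not the actual Workflow structure or XI objects.

    import xml.etree.ElementTree as ET

    # A made-up sender payload; in XI the condition would be an XPath on the real structure.
    payload = """<OrderRequest>
                   <Material>M-100</Material>
                   <UserChoice>2</UserChoice>
                 </OrderRequest>"""

    doc = ET.fromstring(payload)
    choice = doc.findtext("UserChoice")          # the field you would pick via the XPath F4 help

    # condition in the 'Configured Receivers' tab: choice = 1 -> web service 1, else web service 2
    receiver = "WebService1" if choice == "1" else "WebService2"
    print("route message to", receiver)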
    In case of more queries, feel free to ask,
    Thanks,
    Varun

  • Need help with batch processing picture packages

    Hi, I am having trouble batch processing picture packages in CS2 (Windows).
    I have hundreds of images that need to be processed into picture packages and would love to find a speedier way to do this.
    I know how to create an action.  I know how to batch process from this action.  I also know how to create picture packages, but I cannot get the final result I am after - please read on....
    I have separated all the images into separate folders for each style of picture package required.
    I can create an action for the required picture package and then run a batch process on the particular folder, but this leaves all the picture packages open on the desktop: when you choose close-and-save in the batch process, it only closes and saves the original image, while the picture package has been created as a new document and is still open on the desktop, named Picture Package 1, Picture Package 2, and so on.
    I hope I am making some kind of sense here... (??!!)
    What I would like is for the picture package to be saved over the original file (or to a new folder) with the original image's file name, or maybe even with an adjusted file name (e.g. original file name sc1234.jpeg, new file name sc1234packA.jpeg).
    So is this possible to do? I'm thinking there must be a way... I'm sure there are many group photographers out there who come across this every day.
    Otherwise I have to save each picture package manually under the original file name (searching through files to match the original image to the picture package), which is very time consuming.
    Thanks for your help (in anticipation)...
    Jodie

    Hmm, thanks for that. It sounds like I will have to find some info and assistance regarding scripting; it may be something I need to look into at a later time.
    At the moment though I will have to plod along with this method I guess!
    Thanks for your assistance...
    Jodie

  • Help with frozen process, retry delay and transaction rolled back

    Hi all,
    Is there a way to create a JPD that can retry a specific piece of logic several times, with a delay in between and a freeze on the last retry, while at the same time not rolling back any database commits?
    I have tried several approaches, but they did not work.
    1) Create a JPD with a transaction block and the start node set to "freeze on failure", and set the retry count and delay in the transaction block. The outcome: the JPD freezes, retries correctly, and unfreezes properly, but all DB transactions are rolled back.
    2) Create a JPD without a transaction block, set the start node to "freeze on failure", group the nodes I would like to retry, and add the retry count in the exception path, using a timer in the exception path to introduce the delay. The outcome: the JPD freezes and retries correctly, but unfreezes incorrectly (when trying to unfreeze the process, it starts at the onTimeout method of the timer instead of from the beginning of the JPD); DB transactions are committed correctly.
    Any help or suggestions would be much appreciated...
    Thanks!
    Carol

    The issue may be due to a transaction timeout; verify the configured timeouts and the processing time of process B.
    Try increasing the Sync Max Time Out and the other timeouts accordingly and test it.
    Refer the below URL for the details of configuring the timeouts.
    http://www.albinsblog.com/2011/11/oracle-soa-11g-configure-transaction_20.html
    Regards
    Albin I
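    For the retry-with-delay part of the question, here is a minimal, framework-neutral sketch in Python; with_retries, flaky_step, the attempt count and the delay are all invented names for illustration, not WebLogic Integration or Oracle SOA APIs.

    import time

    def with_retries(work, attempts=3, delay_seconds=5):
        """Run `work` up to `attempts` times with a delay between tries; work that
        was already committed by earlier successful steps is not touched here."""
        for attempt in range(1, attempts + 1):
            try:
                return work()
            except Exception as err:
                if attempt == attempts:
                    # last retry exhausted: surface the error ("freeze" the process)
                    raise RuntimeError("frozen after %d attempts" % attempts) from err
                time.sleep(delay_seconds)

    def flaky_step():
        # stand-in for the node group that talks to the unreliable downstream system
        raise IOError("downstream system unavailable")

    # with_retries(flaky_step, attempts=3, delay_seconds=1)   # would raise after three tries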

  • Help with inline processing for Memory Optimization

    Hello all. I have an embedded PXI system whose sole purpose is to gather digital data. I've been tasked with seeing just how much data we can gather on our PXI-8106 Real-Time controller before we run out of our 2 GB of memory.
    The digital data is captured by a PXI FPGA card and DMA'd up to the real-time process running on the controller. The storage for the data on the controller uses a functional global that is pre-allocated before the test begins, to maintain determinism and prevent jitter. Each 32-bit digital word that the FPGA captures has a 32-bit word counter and a 32-bit timestamp attached to it before being sent up through the DMA channel. Once the test is complete, the large "compressed data" array is de-interlaced into three separate arrays (word count, data, time tag) and wrapped up in a cluster; this is where I see a problem. After I reformat the compressed data into its cluster of arrays, I have doubled the amount of allocated memory, even though I no longer need the "compressed data" array. I was hoping somebody could offer some help on how I can do this conversion in place before storing the data, so that only the final format of the data is left in memory, cutting my memory needs in half and thus doubling the amount of data I can gather. We are stuck with LabVIEW 8.2, so I don't think we have access to any fancy memory deallocation VIs that I've read about.
    Here is the functional global used to store the "compressed data" that we get back every time we do a DMA read. This functional global has three methods: clear data, add data, and read data.
    Here is the data conversion VI that converts the compressed data into its final form, ready to be TCP'd up to the host computer. This VI is passed the "CD array" from the "Read Data" case of the functional global above.
    Thanks in advance for your help.

    SiegeX wrote:
    Ravens Fan wrote:
    Do you really need a cluster of 3 arrays as your end result data structure.  Why not just go with the 3 arrays?
    Rather than a cluster of 3 arrays, why not make it a 1-D array of the cluster?
    The final output decision of a cluster of 3 arrays was made long ago (3 years, IIRC). Immediately after the de-interlacing, this cluster is flattened to a string and then sent to the host PC via TCP/IP. Wrapping the arrays in a cluster made this very easy to do. At this point the format is unfortunately set in stone for all intents and purposes, as changing it would require a rewrite of some upstream APIs in released code that expects it. To go down this path, I would have to prove that changing the output format is the only way to fix this memory-copy problem. I don't believe that is the case, is it?
    Why not do all of your deinterlacing inside your functional global variable to create whatever final data structure makes the most sense.  That way you only maintain one copy of the final large data structure rather than a copy of the original, a copy of the final, and copies of the intermediate data structures?
    On the way home I was thinking about possible solutions, and that is one I thought of and wrote down to test tomorrow. Just to be sure we are on the same page, I was thinking of altering the "Read Data" case by tapping the compressed-data wire off to the de-interlacing VI and then using a cluster indicator as the output from the subVI. I'm hoping this would prevent the double copy.
    If not, my other idea was to de-interlace at the very beginning of the functional global, before the data even enters the case structure. I would have to maintain three separate arrays, each 1/3 the size of the current compressed-data array, and then in the "Read Data" case I would simply wrap up the three arrays in a cluster.
    I hope one of these two ideas does the trick, otherwise I'm at a loss on how to do this and still keep a cluster of arrays as the output data structure.
    That may help, along with taking the next step of putting the logic that converts and transmits the cluster into that same state, playing "chase the dots" as you go.
    Another approach is to convert the AE over to use the final cluster format and take advantage of the in-place operators (were they available in 8.2? I think so).
    Have fun,
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction
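    As an aside for readers following the memory discussion, here is an illustrative sketch in Python/NumPy (not LabVIEW) of the "de-interlace as the data arrives" idea from this thread, so the packed array and the three final arrays never have to coexist at full size; the sizes, names and block layout are made up.

    import numpy as np

    TOTAL_WORDS = 1_000_000                  # hypothetical number of (count, data, time) triples
    word_count = np.empty(TOTAL_WORDS, dtype=np.uint32)   # pre-allocated, like the functional global
    data       = np.empty(TOTAL_WORDS, dtype=np.uint32)
    timestamp  = np.empty(TOTAL_WORDS, dtype=np.uint32)
    written = 0

    def add_block(block):
        """De-interleave one DMA block (count, data, time repeating) straight into
        the pre-allocated arrays, so no full-size packed copy is ever kept."""
        global written
        n = block.size // 3
        word_count[written:written + n] = block[0::3]
        data[written:written + n]       = block[1::3]
        timestamp[written:written + n]  = block[2::3]
        written += n

    add_block(np.arange(3000, dtype=np.uint32))            # one simulated DMA read
    final = {"word_count": word_count[:written],
             "data": data[:written],
             "timestamp": timestamp[:written]}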

  • Help with uploading process

    Why is the uploading process not working? I am trying to convert a PDF into a Word document.

    Hi dburtz2468,
    We would like to help! What specific error messages are you receiving? Have you been able to convert any PDFs? What is the size of the document?
    Looking forward to hearing back from you!
    Regards, Stacy
