Sequential Processing for MDBs in a Cluster

Dear All,
I am facing a strange problem with MDBs. I am deploying an MDB on a cluster, and I have specified that only one instance should be available. The following was put in the deployment descriptor:
<pool>
     <max-beans-in-free-pool>1</max-beans-in-free-pool>
     <initial-beans-in-free-pool>0</initial-beans-in-free-pool>
</pool>
When I deployed, I ended up with two instances, i.e., one instance per node in the cluster, so the messages were processed in parallel.
My requirement is to do sequential processing for one set of beans and parallel processing for another set of beans. I want a real-time system, so I cannot have a batch program do the sequential processing for me.
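For reference, here is a minimal EJB 2.x-style sketch (the class name and business logic are hypothetical) of the kind of MDB this concerns. As far as I know, the <pool> limits in weblogic-ejb-jar.xml are applied per server instance, which is why a two-node cluster still ends up with two concurrent consumers; strict ordering requires a single consumer on the queue across the whole cluster.

import javax.ejb.MessageDrivenBean;
import javax.ejb.MessageDrivenContext;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

// Hypothetical MDB: the <pool> element above limits instances per server,
// not per cluster, so each node still attaches its own consumer to the queue.
public class OrderedProcessingMDB implements MessageDrivenBean, MessageListener {

    private MessageDrivenContext ctx;

    public void setMessageDrivenContext(MessageDrivenContext ctx) {
        this.ctx = ctx;
    }

    public void ejbCreate() {}

    public void ejbRemove() {}

    public void onMessage(Message message) {
        try {
            if (message instanceof TextMessage) {
                // Ordering is only guaranteed if exactly one consumer
                // (one node, one bean instance) is attached to the queue.
                String body = ((TextMessage) message).getText();
                // ... sequential business logic goes here ...
            }
        } catch (Exception e) {
            ctx.setRollbackOnly(); // assumes container-managed transactions
        }
    }
}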
Any help is welcome!

Hey, I have got a poor solution that is working:
I have made the MQ queue that is configured for my MDB not shareable.
If anyone has a better solution, kindly let me know.

Similar Messages

  • Best way to modify Sequential Process Model for report generation.

    I am using the Sequential Process Model in my application, and the TestStand Reference Manual (Figure A-1) clearly shows the following processing sequence:
    ...<part removed>
        Call the Test Sequence
        Display the UUT results
        Generate a Report
        Log Result to a Database
    ...<more removed>
    I want to generate the report BEFORE displaying the results to the operator, or at a minimum, I want to generate the report in parallel with displaying the results to the operator.  Currently, the problem I have is that when the test is done, I have some automated scripts that take the data file and do some statistical processing on it, but the way the Sequential Process Model is set up, the test might finish, yet until the operator acknowledges the PASS/FAIL results display, the result file is never created.  It could be overnight, over the weekend, or several days before an operator comes back and says, "Oh, that last test finished, I guess I can press the OK button!", but until they do, I get no data.  So I want the report generated no matter what, right after the test finishes.
    Any ideas as to how that might be best accomplished?
    Thanks a billion -  Ski (noob)

    Ray,
    Is that new in 4.2 that the engine won't call a callback with nothing in it?  I just did it and it seemed to work fine.  I'm using 4.1.1 though.
    Ski,
    Maybe there is a better solution for what you want.  Are you using the SequentialModel?  What version of TS do you use?  Why does the report have to be written before the pass/fail banner displays?  The pass/fail banner gets displayed in the PostUUT callback.  Like Ray said, if you just put that in your client sequence you won't see the banners.  However, I'm assuming there is more to this than just that.  I'm assuming you want to see the report because of your external analyzer that is gathering the statistical data, and then based on that data you want to allow the user other options.  Is this correct?
    If so, then I would override the PostUUT callback and use a different callback (possibly the ProcessCleanup callback) to display the banners.  You could even do this without modifying the process model (which I always try to avoid).  Just override both the PostUUT and ProcessCleanup callbacks, and then put code in ProcessCleanup to behave the way you need.
    Or if you want you can modify the process model and create a new callback lower in the process model.  Then have that new one do the post report analysis.
    Just some thoughts.
    jigg
    CTA, CLA
    teststandhelp.com
    ~Will work for kudos and/or BBQ~

  • Upgrade process for SQL Server 2005 Service Pack 4 on standalone and cluster servers

    Hi All,
    We have initiated a process of upgrading to SP4 for all SQL 2005 standalone and cluster servers.
    Please provide the step-by-step process for installing SP4 and a rollback plan for the SQL 2005 servers. Also, before proceeding with the SP installation, what pre-checks/proactive things do we need to take care of?
    Maheshwar Reddy

    Hello,
    For applying the SP to a SQL 2005 cluster or standalone environment, please refer to the link below:
    http://www.sqlcoffee.com/Tips0007.htm
    Please mark this reply as the answer or vote as helpful, as appropriate, to make it useful for other readers

  • Process for Adding Additional WAE to 4.0.11 Cluster with CIFS

    I will be adding an additional WAE 612 with 4 GB of memory to an existing single WAE 612/4GB cluster in the future. What is the best-practice process for adding the second WAE to the cluster? I would like to do the WCCP configuration last. What's the easiest way to move the policies from the existing WAE to the second WAE? Is it as simple as creating a Cluster Device Group and then adding both devices to the group? Will the second WAE inherit the settings for WAFS as well? I am not concerned about the WAAS setup.
    I don't have access to the credentials for the CIFS connection on the core WAE today, so I would really like to know if I can complete this install by just copying settings from the existing WAE using the device group settings.
    I am also going to reverse the 61 and 62 settings to ensure that the inbound client connections get load balanced across the WAE properly.
    Any other things to consider?
    Thanks in advance
    Mike

    Mike,
    The edges will always reconnect to the same core unless it's not available. In 4.0.x, WCCP does not affect CIFS connections (4050 connections), only load balancing of the TCP connections outside the 4050 connections. I have manually load balanced edges before via expert mode on the edge GUI, but it's fairly unwieldy. How many edges do you have? I can dig up the procedure if you want to use it.
    You can get decent load balancing by setting up both cores so they are both taking edge connections, taking the old core offline (stopping the service in the GUI) for several minutes, and then bringing it back up. A good number of your edges should now be on the new core and some will still be on the old core.
    Also, remember it's best practice to try to keep N+1 at your core, so that if you lose a core WAE in your cluster, a single core can still handle the CIFS traffic and you will continue to get CIFS optimization.
    Hope that helps,
    Dan

  • Error in executing a process for compilation for jsp

    We have an iView which has JSP pages in it. We deployed the PAR and tried to access the iView, and we get an exception. The issue is that the iView has a JSP page. At run time, this JSP is converted into a .java file without problems, but the EP engine is having issues compiling this Java file into a .class file.
    On the other hand, this same iView works just fine on our Windows installation. Only the Solaris EP install is having problems. The version on Windows as well as on Solaris is EP6 SP9.
    Here is the exact version on the Solaris EP:
    sap.com/SAP-JEECOR 6.40 SP9 (1000.6.40.9.0.20041119045253) 20041122132733
    sap.com/SAP-JEE 6.40 SP9 (1000.6.40.9.0.20041119045409) 20041122132741
    When I copy the .class file from windows to unix machine, the iView works fine. Here is the exception I am getting:
    >>> JSPCompiler >>> error
    [email protected]a188b
    [EXCEPTION]
    com.sapportals.portal.prt.servlets_jsp.server.compiler.CompilingException: Error in executing a process for compilation
    at com.sapportals.portal.prt.servlets_jsp.server.compiler.impl.J2eeCompiler_6_30.launchCompilerProcess(J2eeCompiler_6_30.java:574)
    at com.sapportals.portal.prt.servlets_jsp.server.compiler.impl.J2eeCompiler_6_30.compileExternal(J2eeCompiler_6_30.java:370)
    at com.sapportals.portal.prt.servlets_jsp.server.compiler.impl.J2eeCompiler_6_30.compile(J2eeCompiler_6_30.java:672)
    at com.sapportals.portal.prt.servlets_jsp.server.jsp.JSPParser.parse(JSPParser.java:2143)
    at com.sapportals.portal.prt.servlets_jsp.server.jsp.JSPCompiler.compile(JSPCompiler.java:76)
    at com.sapportals.portal.prt.servlets_jsp.server.jsp.JSPCompiler.run(JSPCompiler.java:122)
    at com.sapportals.portal.prt.core.broker.JSPComponentItem.compileJSP(JSPComponentItem.java:279)
    at com.sapportals.portal.prt.core.broker.JSPComponentItem.getComponentInstance(JSPComponentItem.java:129)
    at com.sapportals.portal.prt.core.broker.PortalComponentItemFacade.service(PortalComponentItemFacade.java:355)
    at com.sapportals.portal.prt.core.broker.PortalComponentItem.service(PortalComponentItem.java:934)
    at com.sapportals.portal.prt.core.PortalRequestManager.dispatchRequest(PortalRequestManager.java:435)
    at com.sapportals.portal.prt.core.PortalRequestManager.dispatchRequest(PortalRequestManager.java:527)
    at com.sapportals.portal.prt.component.AbstractComponentResponse.include(AbstractComponentResponse.java:89)
    at com.sapportals.portal.prt.component.PortalComponentResponse.include(PortalComponentResponse.java:232)
    at com.sapportals.portal.htmlb.page.JSPDynPage.doOutput(JSPDynPage.java:76)
    at com.sapportals.htmlb.page.PageProcessor.handleRequest(PageProcessor.java:129)
    at com.sapportals.portal.htmlb.page.PageProcessorComponent.doContent(PageProcessorComponent.java:134)
    at com.sapportals.portal.prt.component.AbstractPortalComponent.serviceDeprecated(AbstractPortalComponent.java:209)
    at com.sapportals.portal.prt.component.AbstractPortalComponent.service(AbstractPortalComponent.java:114)
    at com.sapportals.portal.prt.core.PortalRequestManager.callPortalComponent(PortalRequestManager.java:328)
    at com.sapportals.portal.prt.core.PortalRequestManager.dispatchRequest(PortalRequestManager.java:136)
    at com.sapportals.portal.prt.core.PortalRequestManager.dispatchRequest(PortalRequestManager.java:189)
    at com.sapportals.portal.prt.component.PortalComponentResponse.include(PortalComponentResponse.java:215)
    at com.sapportals.portal.prt.pom.PortalNode.service(PortalNode.java:646)
    at com.sapportals.portal.prt.core.PortalRequestManager.callPortalComponent(PortalRequestManager.java:328)
    at com.sapportals.portal.prt.core.PortalRequestManager.dispatchRequest(PortalRequestManager.java:136)
    at com.sapportals.portal.prt.core.PortalRequestManager.dispatchRequest(PortalRequestManager.java:189)
    at com.sapportals.portal.prt.core.PortalRequestManager.runRequestCycle(PortalRequestManager.java:753)
    at com.sapportals.portal.prt.connection.ServletConnection.handleRequest(ServletConnection.java:232)
    at com.sapportals.portal.prt.dispatcher.Dispatcher$doService.run(Dispatcher.java:522)
    at java.security.AccessController.doPrivileged(Native Method)
    at com.sapportals.portal.prt.dispatcher.Dispatcher.service(Dispatcher.java:405)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
    at com.sap.engine.services.servlets_jsp.server.servlet.InvokerServlet.service(InvokerServlet.java:153)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
    at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:385)
    at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:263)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:340)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:318)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:821)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:239)
    at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
    at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:147)
    at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:37)
    at com.sap.engine.core.cluster.impl6.session.UnorderedChannel$MessageRunner.run(UnorderedChannel.java:71)
    at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
    at java.security.AccessController.doPrivileged(Native Method)
    at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:94)
    at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:162)
    I am having the same issue with another iView which has a JSP page in it.
    The Web Dynpro iView works fine.

    Hi, it seems like there's a typo in your JSP. Check the .java file listed with a Java editor (like Eclipse or DevStudio); maybe you'll find the typo faster this way.
    Most often, a multi-line page import hampers JSP compilation,
    e.g.
    <%@page import="java.util.List,
                    java.util.Map"%>
    needs to be
    <%@page import="java.util.List,java.util.Map"%>
    Regards,
    Armin

  • Error in executing a process for compilation

    Hi:
    Our portal version is 6.0.2.28.0 (SAPJ2EE PL29)
    we have the following problem:
    1.- The developer uploads a component to the portal, but when he tries to run it, the following error appears:
    Mar 28, 2005 4:22:06 PM # Client_Thread_8      Fatal           >>> JSPCompiler >>> ERROR in Compiling :JSPFileInfo :4782283
    JSP File : /usr/sap/J2EE_DP3/j2ee/j2ee_00/cluster/server/services/servlet_jsp/work/jspTemp/irj/root/WEB-INF/portal/portalapps/PRUEBA_PCCLibragestMostrarPlantillas/pagelet/MostrarPlantillas.jsp
    Class Name: sapportalsjspMostrarPlantillas
    Java File : /usr/sap/J2EE_DP3/j2ee/j2ee_00/cluster/server/services/servlet_jsp/work/jspTemp/irj/root/WEB-INF/portal/portalapps/PRUEBA_PCCLibragestMostrarPlantillas/work/pagelet/_sapportalsjsp_MostrarPlantillas.java
    Package Name : pagelet
    Class File : /usr/sap/J2EE_DP3/j2ee/j2ee_00/cluster/server/services/servlet_jsp/work/jspTemp/irj/root/WEB-INF/portal/portalapps/PRUEBA_PCCLibragestMostrarPlantillas/work/pagelet/_sapportalsjsp_MostrarPlantillas.class
    Is out dated : false
    com.sapportals.portal.prt.servlets_jsp.server.compiler.CompilingException: Error in executing a process for compilation
    2.- If I stop/start SAPJ2EE, the component that was uploaded before can be run without problems, but if you upload it again and try to run it, the compilation error appears again.
    Any suggestions???
    Thanks

    Hi:
    The problem was not resolved by deleting/re-uploading the components; it was resolved by increasing the memory on the host and tuning the JVM memory parameters.
    The memory on our host had been reduced, and this was the origin of the problem.
    Thanks

  • Regarding GP Process for Adobe Forms

    Hi,
    I created a form with Adobe LiveCycle Designer and I want to use this form for a GP process in the portal. I configured the GP process for Adobe and followed the Time Off process document, but I am unable to do the same process for a new form.
    Please help, as I am new to this GP process.
    with regards
    Pradeep.B

    Hello
    I would suggest trying to configure the parameters using the Config Tool. Here is a small procedure to help you with the navigation:
    1. Open the Config Tool.
    2. Navigate to cluster-data -> template <name of template> -> instance <name of instance> -> services -> caf/eu/gp/model.
    The template represents the usage type installed on your server (for example, EP). It consists of one or more instances that can be configured in a different way.
    3. Modify the required system parameters:
       a. On the Service properties screen, select a parameter from the table.
       b. In the Custom value field, enter a value for the parameter.
       c. Choose Set.
    The new value appears in the Value column.
    In the Template value field, you can see the default value (if any) of the parameter you have selected. You can restore the default template value for any parameter by choosing Restore to Template.
    Hopefully, this way round it will work
    Best regards,
    Petra

  • Sequential Processing in XI

    Hi Experts,
    I have to execute a scenario which involves sequential processing of files coming from a legacy system and sending the data to R/3 via proxy. We have files coming in for different inspection lots. Each inspection lot has one or more results files.
    My requirement is:
    All the files pertaining to the same lot must be processed sequentially. If one of the files fails, all the others must be blocked. But XI should not block the files of other inspection lots.
    Options that I have tried:
    1) Setting EOIO in the channel and giving queue names. This works fine for a particular batch lot, but it also blocks the files of other batch lots.
    2) Configuring rules in the R/3 moni to break the QoS, make it EO, and let the R/3 code handle the sequencing.
    3) BPM with the option "Multiple queues (content-specific)" and using a correlation on the inspection lot. I did all the settings in SWF_INB_CONF, but I cannot see separate queues as per the correlation.
    Can anyone help in solving this? Is this possible in XI?
    Thanks in advance.
    Pragati

    Your third option seems to be the most logical one. There could be some config problem in the BPM. You will find multiple queues only if your correlation instantiates multiple integration processes in parallel. Also make sure that you have only one correlation in your scenario.
    Regards,
    Prateek

  • First JavaFX application with long sequential processes

    Hi all,
    I'm working on my first JavaFX application. I have UI elements (view) in place, I have a controller that works properly (with the exception I'll mention below), and I have a model that runs correctly (processes the data properly), but the model code was not originally written to run under a UI.
    What I want to happen is the following.  When the "Run" button is clicked, the input of the user is passed to the model via the controller and the data is processed.  It's a do-once kind of thing where the input is transformed into the output and that's the end of it.  The UI can be reset and new input provided, and the whole thing happens again with the click of the Run button.  Here's the problem:  The processing has several steps that need to happen sequentially, but some can take a long time.  I want to provide feedback via a message (what's being done), and overall progress via a progress bar (both elements in the current UI).  The problem is I'm still very confused about how to accomplish the user feedback.
    The UI was built in SceneBuilder, and I've assigned the onMouseClicked event to call the runButtonClicked() method in the controller.
    The runButtonClicked() method calls private methods that execute the processing steps (I'm going to do something more intelligent with the exceptions - please ignore that for the time being)...
        public void runButtonClicked() {
            runButton.setDisable(true);
            sdfFileBrowseButton.setDisable(true);
            outputFileBrowseButton.setDisable(true);
            try {
                prepareSDFFile();
                prepareModels();
                collectDescriptors();
                makeSamples();
                evaluateSamples();
                writeResultsToTextFile(results, outputFile);
            } catch (IOException exception) {
                System.out.println("Caught IOException: " + exception.getMessage());
            } catch (JAXBException exception) {
                System.out.println("Caught JAXBException: " + exception.getMessage());
            }
        }
    Each of the calls in the try-catch are tasks that need to occur sequentially in the order shown. A few can take a long time (e.g. collectDescriptors(), evaluateSamples(), and writeResultsToTextFile()). The actual work isn't done in the controller. The controller creates instances of the classes that do the actual work (this seems to be a departure from most of the tutorials I've seen - it seems the model and the controller are mixed in the tutorials).
    I want to alert the user about what step is executing (by way of a message in the UI) and what stage the whole process is in (by way of a progress bar).  It would be nice if the progress bar could reflect the progress of each individual step (returning to 0 at the beginning of the next step), but having the bar just reflect the over all progress (e.g., how many of the six steps have been finished) would be sufficient.
    I also want to be able to handle a Cancel button event and just stop the whole process and reset the application (losing whatever work has been done is OK).
    So - I really can't tell what the best approach should be.  None of the examples in several tutorials I've gone over have really touched on this type of sequential process.  I'd appreciate suggestions.
    Thanks,
    Dave.

    Suggested Approach
    What you describe is an absolutely classic case for using the javafx.concurrent utilities (Task and Service).
    Your application is a perfect fit for using these services, which will provide you:
    1. a concurrent execution thread to run your long-running processes.
    2. a threadsafe feedback mechanism for progress that you can bind to a ProgressBar for interactive progress feedback.
    3. a threadsafe feedback mechanism for messages that you can bind to a Label for interactive status feedback.
    4. sample code that demonstrates the ability to cancel and restart processes.
    5. a success callback method setter that can be set to provide a result and take action on success.
    6. a failure callback method setter that can be set to provide an exception and take action on failure.
    Task and Service Samples
    There are many examples of Task usage if you search for JavaFX Task.
    There is also great documentation in the Task and Service javadoc.
    Here is an example of using a JavaFX Service. The example is a little old and doesn't use some newer API features such as setOnSucceeded or setOnFailed, and it also uses the now-deprecated Builder pattern; however, the bulk of it should still be relevant to what you want.
    Here is another more complicated sample to Render 300 charts off screen and save them to files in JavaFX, though I doubt your task needs to be that complicated, and likely a single controlling service is all you need, so don't pay too much attention to the more complicated example unless you must have that kind of functionality.
    Here is another, simpler example, Re: Creating multiple parallel tasks by a single service, that actually demonstrates sequential and parallel execution of step tasks within a service.
    The progress can be reset between steps, so that the message indicates the step being performed and the progress indicates the progress of the step rather than overall progress.  You could expose separate mechanisms for feeding back overall progress in addition to step progress.
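    As an illustration, here is a minimal sketch (the step names and the runStep() body are hypothetical placeholders, not your actual model code) of a javafx.concurrent.Service that runs the steps sequentially on a background thread and publishes message/progress updates the UI can bind to:

    import javafx.concurrent.Service;
    import javafx.concurrent.Task;

    public class PipelineService extends Service<Void> {

        private static final String[] STEPS = {
            "Preparing SDF file", "Preparing models", "Collecting descriptors",
            "Making samples", "Evaluating samples", "Writing results"
        };

        @Override
        protected Task<Void> createTask() {
            return new Task<Void>() {
                @Override
                protected Void call() throws Exception {
                    for (int i = 0; i < STEPS.length; i++) {
                        if (isCancelled()) {
                            break;                        // honour a Cancel button
                        }
                        updateMessage(STEPS[i]);          // bind a Label to messageProperty()
                        updateProgress(i, STEPS.length);  // bind a ProgressBar to progressProperty()
                        runStep(i);                       // placeholder for the real work
                    }
                    updateProgress(STEPS.length, STEPS.length);
                    return null;
                }
            };
        }

        private void runStep(int index) throws Exception {
            Thread.sleep(500); // hypothetical: call your existing model classes here
        }
    }

    In the controller (using whatever control names you actually have in your FXML) you would then bind statusLabel.textProperty() to service.messageProperty() and progressBar.progressProperty() to service.progressProperty(), call service.restart() from runButtonClicked(), and call service.cancel() from the Cancel handler.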
    For further help
    If you are still having issues, I'm sure somebody on the forum could easily mock up a sample application stub that demonstrates the techniques based on the info you provided in your question, just ask if you would like that...

  • Reg: IDoc Process for Shipment

    Hi All,
    For the EDI outbound process I am using the exit EXIT_SAPLV56K_002 to populate the new extended IDoc type for SHIPMNT01.
    The problem is that when I append the records into the IDoc internal table in the exit and process the IDoc, all segments are displayed at the same level in WE02; there are no subnodes.
    When I don't append, the segments are displayed in a hierarchical manner in WE02, with subnodes.
    So I need to display the segments in a hierarchical manner with subnodes, as in the basic IDoc type in WE30.
    ASAP, please.
    Regards,
    Hemanth......

    Hi keerthipati,
    When you append (or insert?) new segments into the IDoc internal table, you should make sure that all the administrative fields are filled. SEGNUM (the segment number) is a sequential number for every segment; the relevant fields are:
    DOCNUM - IDoc number
    SEGNUM - Number of the SAP segment
    SEGNAM - Segment type
    PSGNUM - Number of the hierarchically higher SAP segment
    HLEVEL - Hierarchy level
    Set PSGNUM to the segment number of the segment directly above in the hierarchy. Look at an existing IDoc in SE16 to see the admin data, compare this to the view you get in WE02 (or similar), and set the correct HLEVEL.
    This will help.
    Regards,
    Clemens

  • Dynamic selector for MDB

    Hello,
    I want to achieve the following using JMS,
    I want to receive messages from producers based on specific criteria, e.g., give me only the messages that have a date less than now.
    If I use an MDB, then I can set the selector only once; is there a way to change it from time to time?
    If not, how do I achieve this using a combination of session beans and MDBs?
    I want to do some processing on the messages received for that criteria and then resend them to other receivers.

    Generally you cannot do this. The selector syntax does not allow comparisons against System.currentTimeMillis(), and consumers don't allow changes to the selector. Within the specification, the best one can do is recreate the consumer over and over again. This is of course made even worse in that selectors for MDBs are buried inside a deployment descriptor. Nasty stuff.
    WebLogic JMS allows you to specify a delivery time on a message. However, it is considered an extension. Other vendors may have a similar feature, but there is no common API for it.
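    To make the "recreate the consumer" idea concrete, here is a minimal plain-JMS sketch (the property name "dueDate", the polling interval, and the processing logic are all hypothetical) that periodically closes the consumer and recreates it with a freshly computed selector:

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.Message;
    import javax.jms.MessageConsumer;
    import javax.jms.Queue;
    import javax.jms.Session;

    public class RollingSelectorConsumer {

        public void poll(ConnectionFactory factory, Queue queue) throws Exception {
            Connection connection = factory.createConnection();
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            try {
                while (true) {
                    // Assumes the producer stamps a custom long property "dueDate" on each message.
                    String selector = "dueDate <= " + System.currentTimeMillis();
                    MessageConsumer consumer = session.createConsumer(queue, selector);
                    Message message;
                    while ((message = consumer.receive(1000)) != null) {
                        process(message);      // forward/resend to the other receivers here
                    }
                    consumer.close();          // recreated with a new cutoff on the next pass
                    Thread.sleep(5000);        // how often to refresh the selector
                }
            } finally {
                connection.close();
            }
        }

        private void process(Message message) {
            // hypothetical processing before resending to other destinations
        }
    }

    Something like this could run from a timer-driven session bean instead of an MDB, since the MDB's selector itself cannot be changed at runtime.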

  • Create sequential numbering for chapters in Pages

    Can anyone tell me how to create sequential numbering for my 'scenes' or 'chapters' when writing scripts in Pages?
    I need to have them set up, so if for example, I have...
    Scene 01
    Scene 02
    Scene 03
    and down through scenes to say...
    Scene 52
    But if I write another new scene in between scene 02 and 03... I need all the other scene numbers below the new scene to change automatically, instead of me having to manually go through and change every single scene number...
    How do I do this?

    Thanks... this is a reasonable compromise, which I have already done.
    The problem with script writing is that the scenes are generally headed like this...
    Scene 22: Int - Dining room - Night
    Then they might have a subtitle, which is a short description of the scene, such as...
    Scene 22: Int - Dining room - Night
    (Xmas Turkey dilemma)
    So, it could work as you suggest, but it is far from a perfect formula. I am dumbfounded as to why Pages doesn't have this simple function, when just about every other word processing program does. Final Draft is an industry standard for script writing, but I actually prefer Pages; I have a lot more creative control with formatting, except for this one vital function... darn!
    Thanks for your suggestion, Peter,
    Cheers

  • Configuring Automatic Failover for EPM Planning Cluster

    We are trying to test automatic failover using a Planning(11.1.2.2)/weblogic cluster containing 2 physical servers and a Weblogic proxy plug-in for OHS.
    I understand that to enable this we must configure in-memory replication of HTTP session states, and to do this (according to various sources, including ID 779350.1) the weblogic.xml file must include a descriptor set up as follows:
    <session-descriptor>
        <session-param>
            <param-name>PersistentStoreType</param-name>
            <param-value>replicated</param-value>
        </session-param>
    </session-descriptor>
    Where should weblogic.xml be created or amended (if it already exists) for a Planning cluster in a standard scaled-out EPM deployment in order to effect failover between the two servers?
    Thanks

    Yes, it can be load balanced in the Hyperion registry, I believe; I have seen it done once. The only drawback is that if a JVM goes down while processing a request, it needs to be manually started; however, the URL will switch over automatically.

  • Synchronizing multiple threads = sequential processing ???

    Hi !
    I am trying to solicit some comments and views.
    I tested a program which updates an account with two amounts each of 1000, via two different threads.
    I can see thread-1 kick off and wait for some resource (e.g. from the DB server) while thread-2 kicks in and grabs the CPU.
    Without synchronisation, they overwrote each other, so that at the end I have only one amount credited to the account (plus the original balance).
    Now I simply add the "synchronized" keyword and both amounts are added to the balance correctly!
    However, I can also see (via much use of PrintWriter) that thread-1 completes the synchronised block of Java code first (along with a palpable delay) before thread-2 kicks in.
    In the synchronized case, the CPU was idle while thread-1 waited for I/O and idle again while thread-2 waited for I/O.
    Isn't this simple sequential processing ?
    Is this an optimal model for concurrency ?
    I do not see any significant advantages !

    I know the book: Zukovski Java 2 from Sybex.
    Two threads are fighting for a single resource. In such a case, synchronization does not defeat the purpose of concurrent programming; it just introduces order, which is not too much of a good thing. Sure, you would be better off with only one thread updating your account, but consider that the threads can service several different accounts at once, not just one.
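    To illustrate that last point, here is a minimal sketch (a hypothetical Account class, not taken from the book) showing that synchronization serializes access to one account while still letting threads work on different accounts concurrently:

    public class Account {

        private double balance;

        public Account(double openingBalance) {
            this.balance = openingBalance;
        }

        // Only one thread at a time may execute this on a given Account instance.
        public synchronized void credit(double amount) {
            double current = balance;      // read
            balance = current + amount;    // a lost update can no longer happen here
        }

        public synchronized double getBalance() {
            return balance;
        }

        public static void main(String[] args) throws InterruptedException {
            final Account account = new Account(0);
            Thread t1 = new Thread(() -> account.credit(1000));
            Thread t2 = new Thread(() -> account.credit(1000));
            t1.start();
            t2.start();
            t1.join();
            t2.join();
            System.out.println(account.getBalance()); // always 2000.0 with synchronization
        }
    }

    Two threads crediting the same Account are forced into order, which is exactly what makes the final balance correct; threads crediting different Account objects never contend for the same lock and still run in parallel.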

  • Job fail with Timeout for parallel process (for SID Gener.): 006000

    Hello all,
    I'm getting the error below and am not able to find any issue on the Basis side. Can anyone please help with this?
    Job started
    Data package has already been activated successfully (will be skipped)
    Process started
    Process started
    Process started
    Process started
    Process started
    Import from cluster of the data package to be activated () failed
    Process 000001 returned with errors
    Process 000002 returned with errors
    Process 000003 returned with errors
    Process 000004 returned with errors
    Background process BCTL_4XU7J1JPLOHYI3Y5RYKD420UL terminated due to missing confirmation
    Process 000006 returned with errors
    Data pkgs 000001; Added records 1-; Changed records 0; Deleted records 0
    Log for activation request ODSR_4XUG2LVXX3DH4L1WT3LUFN125 data package 000001...000001
    Errors occured when carrying out activation
    Analyze errors and activate again, if necessary
    Activation of M records from DataStore object CRACO20A terminated
    Activation is running: Data target CRACO20A, from 1,732,955 to 1,732,955
    Overlapping check with archived data areas for InfoProvider CRACO20A
    Data to be activated successfully checked against archiving objects
    Parallel processes (for Activation); 000005
    Timeout for parallel process (for Activation): 006000
    Package size (for Activation): 100000
    Task handling (for Activation): Backgr Process
    Server group (for Activation): No Server Group Configured
    Parallel processes (for SID Gener.); 000002
    Timeout for parallel process (for SID Gener.): 006000
    Package size (for SID Gener.): 100000
    Task handling (for SID Gener.): Backgr Process
    Server group (for SID Gener.): No Server Group Configured
    Activation started (process is running under user *****)
    Not all data fields were updated in mode "overwrite"
    Data package has already been activated successfully (will be skipped)
    Process started
    Process started
    Process started
    Process started
    Process started
    Import from cluster of the data package to be activated () failed
    Process 000001 returned with errors
    Process 000002 returned with errors
    Process 000003 returned with errors
    Process 000004 returned with errors
    Errors occured when carrying out activation
    Analyze errors and activate again, if necessary
    Activation of M records from DataStore object CRACO20A terminated
    Report RSODSACT1 ended with errors
    Job cancelled after system exception ERROR_MESSAGE

    Thanks for the link, TSharma; I will try that today.
    UPDATE:
    I ran a non-parallel Data Pump export and just let it run overnight. This time it finished after 9 hours. In this run I set the STATUS=300 parameter in the PARFILE, which basically echoes STATUS updates to standard out every 300 seconds (5 minutes).
    And as before, after 2 hours it had finished 99% of the export and then just spit out WAITING status for the last 7 hours until it completed. The remaining tables it exported (a few hundred) were all very small or had zero rows. There is clearly something going on that is not normal. I've done this expdp before on clones of this database and it usually takes about 2-2.5 hours to finish.
    The database is about 415 Gigabytes in size.
    I will update what the TRACE finds and I'm also opening a case with MOS.
