Remote process execution

I want to send a code snippet over a Bluetooth connection from one mobile device and execute it on the other.
How can I do this? Can I send it as a file and execute it there, or does it have to be in another format? And how does the execution occur?
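Whatever transport carries the snippet (a file push or an RFCOMM stream), the receiving device must know where the snippet ends before it can hand the bytes to an interpreter. A minimal length-prefixed framing sketch over plain Java streams; the Bluetooth socket itself is platform-specific (e.g. Android's BluetoothSocket) and is assumed here, not shown:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class SnippetFraming {
    // Length-prefixed framing: works over any stream transport, including
    // the input/output streams of an RFCOMM (Bluetooth serial) socket.
    static void send(OutputStream out, String snippet) throws IOException {
        byte[] bytes = snippet.getBytes(StandardCharsets.UTF_8);
        DataOutputStream d = new DataOutputStream(out);
        d.writeInt(bytes.length); // 4-byte length prefix
        d.write(bytes);
        d.flush();
    }

    static String receive(InputStream in) throws IOException {
        DataInputStream d = new DataInputStream(in);
        byte[] bytes = new byte[d.readInt()];
        d.readFully(bytes); // read exactly one framed snippet
        return new String(bytes, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        // Demonstrate a round trip through an in-memory buffer.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        send(buf, "print('hello')");
        System.out.println(receive(new ByteArrayInputStream(buf.toByteArray())));
    }
}
```

What to do with the received text (interpret it, compile it, or reject it) is a separate question; executing arbitrary received code is a serious security risk and needs its own safeguards.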

I have a query that runs on Host H1 (the client host) connecting to database A on Host H2 (the remote DB server), and I get this error. It is a fairly large query that touches several tables.
ORA-04030: out of process memory when trying to allocate 64544 bytes (sort subheap,sort key)
The same query runs fine if I execute it directly on the DB server (Host H2). I thought differences in ulimit between the H1 and H2 hosts might be preventing it from using local memory, but the same query runs against other DB hosts (database B on H3, C on H4, D on H5, etc.) with no issues at all.
All these hosts run different flavors of UNIX and have different capacities, but H1 is my central location where the data is collected and stored for reporting purposes.
I verified all memory parameters such as pga_aggregate_target and workarea_size_policy, and they all look good on Host H2 (which is why the local execution succeeds). So what could be going wrong with my remote execution of the same query? Any ideas? Please check the master note for these errors.
*Master Note for Diagnosing ORA-4030 [ID 1088267.1]*

Similar Messages

  • Remote script execution.

    Hello.
    I would like to know how to run a script on a remote Linux machine. Can I do this without caring what OS the local JVM is running on?
    I haven't done this before, and I have searched various places for a concise answer without success. Will something like the following work: Runtime.exec("remote.machine\remote.script.sh")? I have also noticed that my searches keep leading me to discussions of Java RMI, so thinking RMI might have something to say about what I'm doing, I have posted my question here.
    Background: I have a java application (a web service in fact) that needs to get information from a back-end system to do its job. It needs to invoke a script that's running on a remote host where the coveted business system resides.
    The Runtime.exec() method appears to run processes only on the local machine. From the research I have done, the usual fudge is to know the operating system of the local machine so that you can use its remote process execution facilities. However, I don't want to know anything about the local OS, because it could differ between development, testing, and production. I do know that the remote OS is Linux.
    Is there any way I can get a process to run on a remote machine without having to know about the local machine's OS?
    Many thanks in advance,
    Owen.

    The remote system obviously needs to support this through some facility like remote shell (rsh) or secure shell (ssh), so it depends on how that remote Linux system is set up and what access (user ID) you can get to it.
    Are you granted any access to that remote Linux machine?
    If yes, what kind?
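    If ssh access is available, one portable approach is to shell out to the local ssh client, which behaves the same on any local OS that has ssh on the PATH. A minimal sketch; the user@host and script path below are placeholders:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.List;

public class RemoteScript {
    // Builds the ssh command as an argument list, so no OS-specific
    // shell quoting is needed on the client side.
    static List<String> sshCommand(String userAtHost, String remoteCommand) {
        return List.of("ssh", userAtHost, remoteCommand);
    }

    public static void main(String[] args) throws Exception {
        // Pass a real user@host as the first argument to actually run it.
        List<String> cmd = sshCommand(args.length > 0 ? args[0] : "user@remote.machine",
                "/path/to/remote.script.sh");
        if (args.length == 0) {
            // No host given: just show the command that would run.
            System.out.println(String.join(" ", cmd));
            return;
        }
        Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line); // remote stdout and stderr
            }
        }
        System.out.println("exit code: " + p.waitFor());
    }
}
```

    This still depends on ssh being installed locally and key-based (non-interactive) authentication being configured for the remote account.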

  • Use web service to invoke a new process execution

    Hello,
    I'm using BPM 10g3, and I'm trying to invoke a fully automated process via a web service.
    I've set up my web service at http://<host>:<port>/albpmServices/<engine name>/ws/<process name>ServiceListener?wsdl
    I've accessed it from a C# application I wrote, and Visual Studio identified my functions and such.
    I can successfully start a new session with the web process (startSession(string password, string username) returns a sessionID).
    But when I try to invoke my own function (which is defined in the process web service as a Process Execution on my beginIn parameters), I get a response with a FaultException and no visible error code.
    In the BPM log I can see the following lines:
    1) Invoking service with id:<Process Name>ServiceListener
    2) Executing item: IMMEDIATE Inst [-1, -1,-1] Act [No activity] Proc [No Process] Due 1318860646000000]
    3) TransactionAction: Rollback!
    While "TransactionAction: Rollback!" means that something bad happened, I can't see the reason.
    Any ideas?

    OK, I've got it working.
    I did all the things you suggested before, but when you said I needed to check that the right activity was being started, I understood what was wrong.
    The chosen activity must be in a lane with the same role as the user logging in to the session.
    So I added a role on Begin, created a participant for that role, logged in with it, and now it works =)
    Edit: I'm the same guy that opened the thread; I just needed to make a new user for enterprise support (didn't get to use that, though).
    Edited by: 891957 on 23:59 17/10/2011

  • How can I put an output stream (HTML) from a remote process on my JSF page

    Hello,
    I've a question if someone could help.
    I have a JSF application that needs to execute some remote work in a different process (it is a SAS application). This remote process produces as output an HTML table that I want to display in my JSF page.
    So I use a SAS socket class to set up a server socket in a separate thread. The primary use of this class is to set up a socket listener, submit a command to a remote process (such as SAS) to generate a data stream (such as HTML or graphics) back to the listening socket, and then write the contents of the stream back to the servlet stream.
    Now the problem is that I lose my JSF page entirely. I need a suggestion, if someone would help, to understand how I can use this HTML data stream without writing to my servlet output stream.
    Thank you in advance
    A.
    Just if you want to look at the details .....
    // Create the remote model
    com.sas.sasserver.submit.SubmitInterface si =
        (com.sas.sasserver.submit.SubmitInterface)
            rocf.newInstance(com.sas.sasserver.submit.SubmitInterface.class, connection);
    // Create a work dataset
    String stmt = "data work.foo;input field1 $ field2 $;cards;\na b\nc d\n;run;";
    si.setProgramText(stmt);
    // Set up our socket listener and get the port that it is bound to
    com.sas.servlet.util.SocketListener socket =
        new com.sas.servlet.util.SocketListener();
    int port = socket.setup();
    socket.start();
    // Get the localhost address
    String localhost = (java.net.InetAddress.getLocalHost()).getHostAddress();
    stmt = "filename sock SOCKET '" + localhost + ":" + port + "';";
    si.setProgramText(stmt);
    // Set up the ODS options
    stmt = "ods html body=sock style=brick;";
    si.setProgramText(stmt);
    // Print the dataset
    stmt = "proc print data=work.foo;run;";
    si.setProgramText(stmt);
    // Close
    stmt = "ods html close;run;";
    si.setProgramText(stmt);
    // Get my output stream
    context = FacesContext.getCurrentInstance();
    HttpServletResponse response = (HttpServletResponse) context.getExternalContext().getResponse();
    ServletOutputStream out = response.getOutputStream();
    // Write the data from the socket to the response
    socket.write(out);
    // Close the socket listener
    socket.close();

    The System Exec function is on the Communication palette. It's for executing system commands. On my Win2K system, the help for FTP is:
    "Ftp
    Transfers files to and from a computer running an FTP server service (sometimes called a daemon). Ftp can be used interactively. Click ftp commands in the Related Topics list for a description of available ftp subcommands. This command is available only if the TCP/IP protocol has been installed. Ftp is a service, that, once started, creates a sub-environment in which you can use ftp commands, and from which you can return to the Windows 2000 command prompt by typing the quit subcommand. When the ftp sub-environment is running, it is indicated by the ftp command prompt.
    ftp [-v] [-n] [-i] [-d] [-g]
    [-s:filename] [-a] [-w:windowsize] [computer]
    Parameters
    -v
    Suppresses display of remote server responses.
    -n
    Suppresses autologin upon initial connection.
    -i
    Turns off interactive prompting during multiple file transfers.
    -d
    Enables debugging, displaying all ftp commands passed between the client and server.
    -g
    Disables file name globbing, which permits the use of wildcard characters (* and ?) in local file and path names. (See the glob command in the online Command Reference.)
    -s:filename
    Specifies a text file containing ftp commands; the commands automatically run after ftp starts. No spaces are allowed in this parameter. Use this switch instead of redirection (>).
    -a
    Use any local interface when binding data connection.
    -w:windowsize
    Overrides the default transfer buffer size of 4096.
    computer
    Specifies the computer name or IP address of the remote computer to connect to. The computer, if specified, must be the last parameter on the line."
    I use tftp all of the time to transfer files in a similar manner. Test the transfer from the Windows command line and copy it into a VI. Pass the command line to System Exec and wait until it's done.
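    The same idea can be driven from code instead of a VI: write the ftp subcommands to a file and pass it with the -s:filename switch described in the help text above. A minimal Java sketch; the host name and transfer script contents are hypothetical placeholders:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class FtpScript {
    // Builds the Windows ftp command line described in the help text above:
    // -n suppresses auto-login, -s:filename runs the subcommands in the file.
    static List<String> ftpCommand(Path script, String host) {
        return List.of("ftp", "-n", "-s:" + script, host);
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        // Hypothetical transfer script: log in, switch to binary, fetch one file, quit.
        Path script = Files.createTempFile("ftp", ".txt");
        Files.writeString(script, String.join(System.lineSeparator(),
                "user anonymous guest", "binary", "get report.dat", "quit"));
        List<String> cmd = ftpCommand(script, args.length > 0 ? args[0] : "ftp.example.com");
        if (args.length == 0) {
            // No host given: just show the command line that would run.
            System.out.println(String.join(" ", cmd));
            return;
        }
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        System.out.println("exit code: " + p.waitFor());
    }
}
```

    As the help text notes, -s:filename is the scripted alternative to input redirection, which is exactly what you want when the command is launched from another program rather than typed interactively.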

  • How can I set a BPS formula to run as a remote process

    Hi Masters
    Now I have a copy formula at my BPS level.
    I want to know how I can set it up as a remote process and run it every day.
    Thanks & Regards.

    Create a process chain, add a process type that calls the report program UPC_PLANFUNCTION_EXECUTE with the proper variants, assign the planning function, and trigger this process chain daily.

  • Process execution engine execution error. from the latest HF in BPM Ent.

    Hi All,
    After applying the latest hotfix, I'm now getting this error when trying to launch any of the applications.
    My setup is AquaLogic BPM Enterprise 6.0.4 for Weblogic.
    I've tried redeploying all the deployments within Weblogic and also the Admin Center. Nothing has worked.
    Any idea what's causing this?
    Process execution engine execution error.
    Caused by: fuego.io.ObjectSerialization.customWriteObject(Ljava/lang/Object;Ljava/io/ObjectOutputStream;Ljava/lang/Class;)V
    fuego.papi.impl.EngineExecutionException: Process execution engine execution error.
    at fuego.papi.impl.j2ee.EJBProcessControlHandler.doInvoke(EJBProcessControlHandler.java:158)
    at fuego.papi.impl.j2ee.EJBProcessControlHandler.invoke(EJBProcessControlHandler.java:70)
    at $Proxy154.runGlobalActivity(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at fuego.lang.JavaClass.invokeMethod(JavaClass.java:1410)
    at fuego.lang.JavaObject.invoke(JavaObject.java:227)
    at fuego.papi.impl.j2ee.EJBExecution.next(EJBExecution.java:189)
    at fuego.web.execution.InteractiveExecution.process(InteractiveExecution.java:177)
    at fuego.web.execution.impl.WebInteractiveExecution.process(WebInteractiveExecution.java:54)
    at fuego.web.execution.InteractiveExecution.process(InteractiveExecution.java:223)
    at fuego.web.papi.TaskExecutor.runApplicationTask(TaskExecutor.java:349)
    at fuego.web.papi.TaskExecutor.execute(TaskExecutor.java:95)
    at fuego.workspace.servlet.ExecutorServlet.doAction(ExecutorServlet.java:117)
    at fuego.workspace.servlet.BaseServlet.doPost(BaseServlet.java:228)
    at fuego.workspace.servlet.BaseServlet.doGet(BaseServlet.java:219)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
    at fuego.workspace.servlet.AuthenticatedServlet.service(AuthenticatedServlet.java:61)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:226)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:124)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:283)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at fuego.web.filter.SingleThreadPerSessionFilter.doFilter(SingleThreadPerSessionFilter.java:64)
    at fuego.web.filter.BaseFilter.doFilter(BaseFilter.java:63)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at fuego.web.filter.CharsetFilter.doFilter(CharsetFilter.java:48)
    at fuego.web.filter.BaseFilter.doFilter(BaseFilter.java:63)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3368)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(Unknown Source)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2117)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2023)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1359)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:200)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:172)
    Caused by: java.lang.NoSuchMethodError: fuego.io.ObjectSerialization.customWriteObject(Ljava/lang/Object;Ljava/io/ObjectOutputStream;Ljava/lang/Class;)V
    at BT_QW.MyProcess.Default_1_0.Instance.writeObject(Instance.xcdl)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:890)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1333)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1284)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1073)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:291)
    at fuego.server.ProcInst.getComponentData(ProcInst.java:780)
    at fuego.server.ProcInst.mustStoreComponent(ProcInst.java:2793)
    at fuego.server.persistence.jdbc.JdbcProcessInstancePersMgr.createInstance(JdbcProcessInstancePersMgr.java:1018)
    at fuego.server.persistence.Persistence.createProcessInstance(Persistence.java:669)
    at fuego.server.execution.EngineExecutionContext.persistInstances(EngineExecutionContext.java:1810)
    at fuego.server.execution.EngineExecutionContext.persist(EngineExecutionContext.java:1109)
    at fuego.transaction.TransactionAction.beforeCompletion(TransactionAction.java:132)
    at fuego.connector.ConnectorTransaction.beforeCompletion(ConnectorTransaction.java:685)
    at fuego.connector.ConnectorTransaction.commit(ConnectorTransaction.java:368)
    at fuego.transaction.TransactionAction.commit(TransactionAction.java:302)
    at fuego.transaction.TransactionAction.startBaseTransaction(TransactionAction.java:481)
    at fuego.transaction.TransactionAction.startTransaction(TransactionAction.java:551)
    at fuego.transaction.TransactionAction.start(TransactionAction.java:212)
    at fuego.server.execution.DefaultEngineExecution.executeImmediate(DefaultEngineExecution.java:123)
    at fuego.server.execution.EngineExecution.executeImmediate(EngineExecution.java:66)
    at fuego.server.AbstractProcessBean.runGlobalActivity(AbstractProcessBean.java:2708)
    at fuego.ejbengine.EJBProcessControlAdapter.runGlobalActivity(EJBProcessControlAdapter.java:1036)
    at fuego.ejbengine.EJBProcessControl_1zamnl_EOImpl.runGlobalActivity(EJBProcessControl_1zamnl_EOImpl.java:3450)
    at fuego.ejbengine.EJBProcessControl_1zamnl_EOImpl_WLSkel.invoke(Unknown Source)
    at weblogic.rmi.internal.ServerRequest.sendReceive(ServerRequest.java:174)
    at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:335)
    at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:252)
    at fuego.ejbengine.EJBProcessControl_1zamnl_EOImpl_1000_WLStub.runGlobalActivity(Unknown Source)
    at fuego.papi.impl.j2ee.EJBProcessControlInterfaceWrapper.runGlobalActivity(EJBProcessControlInterfaceWrapper.java:2033)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at fuego.papi.impl.AbstractProcessControlHandler.invokeInternal(AbstractProcessControlHandler.java:72)
    at fuego.papi.impl.j2ee.EJBProcessControlHandler.doInvoke(EJBProcessControlHandler.java:116)
    ... 39 more

    I ended up rebuilding my ALBPMDir and ALBPMEngine schemas and WebLogic server domains from scratch. When executing the process that was giving me a headache, all of a sudden I got the following error:
    fuego.server.exception.MaxInstanceSizeRuntimeException: Max instance size exceeded.
    Current size is 16538, whereas the maximum size is 16384. This occurs with instance '
    Anyway, the sizes were not terribly far off, but I doubled the process instance size from 16k to 32k and the error went away.
    Something to try on your end, perhaps.
    Chris

  • How to invoke a remote process

    Hi,
    I was wondering how to execute a process like 'gedit', 'ls', etc. on a remote machine from a Java program on another machine. In my case the server program will invoke the client program on different machines. The spec said:
    Runtime.getRuntime().exec("ssh " + host + " ; cd " + path + " ; java Site " + ... arguments ... );
    This is one way to start remote processes in Java, but it provides very limited functionality.
    I was wondering what other, more elegant ways there are.
    I would be grateful if anybody could give some suggestions about this.
    Thanks.

    Look up JNI and RMI
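    Note that the exec string quoted in the spec above will not behave as written: Runtime.exec() does not interpret shell metacharacters such as ';', so the 'cd' and 'java' parts would be passed to ssh as extra arguments rather than run remotely. A safer sketch passes the whole remote command line as a single ssh argument; the host, path, and class name are placeholders:

```java
import java.util.List;

public class RemoteInvoke {
    // ssh passes everything after the host to the remote shell, so the
    // "cd ... && java ..." part is interpreted on the remote side.
    static List<String> remoteCommand(String host, String path, String mainClass) {
        return List.of("ssh", host, "cd " + path + " && java " + mainClass);
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = remoteCommand("user@host", "/opt/app", "Site");
        if (args.length == 0) {
            // No argument given: just show the command that would run.
            System.out.println(String.join(" ", cmd));
            return;
        }
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        System.out.println("exit code: " + p.waitFor());
    }
}
```

    Using '&&' instead of ';' also means the java command only runs if the cd succeeded.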

  • How do you load subReports in ReportViewer in remote processing mode using LoadReportDefinition

    We have a ReportViewer control on a web page that renders reports in remote ProcessingMode.
    However, it does not access the reports from the SSRS server; it loads them from our own repository and sends the stream of the RDL file to LoadReportDefinition.
    This works fine when there are no subreports. However, I am not sure how to handle subreports.
    I know that in local ProcessingMode you have to handle events to load the subreports manually, and that in normal remote ProcessingMode, when running a report stored on the server, it handles the subreports for you.
    But what is the process for handling subreports for reports in remote ProcessingMode that are rendered with LoadReportDefinition?
    Should I handle the event and load the subreport manually, somehow using LoadReportDefinition again? Or is it simply not supported?

    Hi jth001,
    From your description, do you mean the ReportExecutionService.LoadReportDefinition method or the LocalReport.LoadReportDefinition method? If it is the former, subreports and data source references with relative paths are not supported with this method; however, absolute paths to catalog items can be used. For more information, please see:
    ReportExecutionService.LoadReportDefinition Method
    If it is the latter, you need to call the LoadSubreportDefinition method and provide data for any subreports. For more information, please refer to:
    LocalReport.LoadReportDefinition Method
    LocalReport.LoadSubreportDefinition Method
    Regards,
    Mike Yin
    TechNet Subscriber Support
    If you are a TechNet Subscription user and have any feedback on our support quality, please send your feedback here.

  • Remote command execution via ssh on ips sensor...

    I am attempting to execute a command remotely via ssh so that I can collect the information on another host.
    ex: ssh -t username@sensor show tech-support
    Instead of the output I expect, I receive an error message: Error: Received invalid command line argument.
    Is this type of remote command execution supported by the sensor?
    Kevin Riggins

    Not true, I already created a script to automatically back up the IPS.

  • Remote Query execution option

    Hello Experts,
    I have installed and configured Intercompany 2.0 for testing, during which I could not find the remote query execution option that was there in the earlier version.
    Is there any setting to enable this feature in the new version?
    Thanks
    Deepak

    Hi Deepak,
    To access the Remote Query Execution feature of Intercompany, you need to enable the SAP B1 user for it. Follow these steps:
    1. Log in to SAP B1, navigate to Administration --> Setup --> Users, and search for the user.
    2. On the Users - Setup screen, select the Can Use Remote Query Execution check box and choose the Update button.
    3. Using this user, log in to SAP B1 and open the Query Manager window.
    4. On the Query Manager window, select a query, choose one of the following options, and execute the query:
       Local: execute and pull data from the logged-in company.
       All: execute and pull data from all intercompany-configured companies.
       Selection: execute and pull data from a selected set of intercompany-configured companies.
    Refer to the Intercompany Integration Solution User Guide for further information on Remote Query Execution.
    Regards,
    Agneesh Jain
    SAP Intercompany Team

  • Debugger unable to connect to remote process

    Hi There,
    I am running JDev on a PC and OAS 10g in a UNIX environment. I followed the steps in the doc to set up remote debugging. However, when I start the remote debugger from JDev, I get the following error message:
    Debugger attempting to connect to remote process at ax-portal-www6.dev.ipass.com 4000..
    Debugger unable to connect to remote process.
    Your help would be greatly appreciated.

    A couple of things to try:
    1) Verify your JDK version and build (see bug 4411232).
    2) Increase your Debugger preferences' Connection Retries setting to double the default (especially if multiple apps are running, memory is low, or the machine is slow).
    Without more specifics on versions, the type of debugging, or other details, there is not much more I can suggest.

  • Source System Conversion in Remote Process Chain

    Hi all!
    I'm working with BI7 SP12 and facing the following problem:
    <b>There is no source system conversion for Remote Process Chains, when transporting Process Chains with Remote Process Chains from DEV to QA.</b>
    First the following warning appears:
    DEV destination of Remote Process Chain does not exist on QA system.
    ==> This is correct, because the DEV destination should have been converted to the QA destination.
    Then the following error appears:
    Process chain with the Remote Process Chain was not successfully checked.
    <b>All source system conversion tables are maintained correctly!</b>
    The source system conversion works correctly for InfoPackages, for example.
    But for Remote Process Chains there is source system conversion neither for the Destination nor for the Callback Destination. Both should be converted.
    Do I have to do it manually??
    Or do I have to create the DEV-Destination in SM59 on the QA-System??
    ==> We want to avoid this...
    Thanks for any ideas...
    kj

    Please note:
    When I create the DEV destination in SM59 on the QA system, the transport gets no warnings and no errors...
    But there is still no conversion of the source systems for the Destination and the Callback Destination of the Remote Process Chain to QA destinations...
    So I think there is no automatic source system conversion for Remote Process Chains, and I have to adjust it manually in the QA system...

  • How to run remote process chain asynchronous?

    Hi,
    I have a situation where I would like to call a remote process chain asynchronously instead of in the usual synchronous way.
    The docs on http://help.sap.com/saphelp_nw04/helpdata/en/86/6ff03b166c8d66e10000000a11402f/content.htm
    states that: <i>The remote process transfers the communication with the other system and <b>synchronously</b> starts the executed process chain</i>.
    ...but since I don't want it to be synchronous, I am looking for other ideas...
    Best regards,
    Christian Frier

    I found that it did not work when I tried to trigger the event on a remote system.
    If I call BP_EVENT_RAISE, I can trigger an event and thereby start a process chain on my source (local) system. If I specify TARGET_INSTANCE, nothing seems to happen in the remote system...
    Also, I cannot call BP_EVENT_RAISE via RFC, as it is not remote-enabled.
    I think the solution is to trigger a local chain that takes care of starting the remote chain. The net effect is that it is asynchronous.

  • OPM Process Execution and OPM Standard Costing for Poultry Business

    I have a business requirement from a client at their poultry processing plant: the client feels that the OPM Execution steps are too cumbersome, right up to closing the batches. The client wants to maintain a minimum number of batches in production: one for Primary, one for Secondary and one for Further Processing, rather than having multiple batches based on different formulas. There are 3 main types of processing, briefly described as follows:
    A.) Primary Processing:
    Live Birds (Ingredient) become Whole Dressed Chicken with Neck, without Neck, Liver, Heart, etc... (40 Finished Products)
    e.g. Live Birds (Broiler) -> Whole Dressed Chicken with/without neck, Whole Dressed Chicken with/without neck, etc.
    B.) Secondary Processing:
    Only one type of Whole Dressed Chicken (from Primary Processing) become Chicken Cuts (80 Finished Products).
    e.g.[Scenario B1] Whole Dressed Chicken (Broiler) -> Drumsticks, Thighs, Wings, Breasts, Backbone, etc.
    [Scenario B2] Breasts -> Filets and Deboned Breasts (2 FGs)
    [Scenario B3] Thighs -> Thighs in Plain Bag
    Please note that the Whole Dressed Chicken is also sold as a Finished Good as well as an Ingredient for the Secondary Processing.
    C.) Further Processing:
    Chicken Cuts become 40 Finished Products
    e.g Filets -> Chicken Nuggets, Chicken Fries, etc...
    The client is using the Standard Costing method for its OPM Financials, and all the finished products (160 products in all) have different production costs. Please note that 1 kg of chicken wings does not cost the same as 1 kg of chicken thighs; different body parts have different costs. In order to reduce the maintenance of multiple batches per day on the production floor, the client wishes to have a minimum number of batches. We would therefore like you to confirm that the approach below is correct.
    a.) OPM Process Execution:
    ==========================
    EITHER should I (Option 1):
    a.) Create ONLY 3 formulas, for Primary, Secondary and Further Processing. The 3 formulas will have all the processing-line finished products grouped together (as per the scenarios explained above). The formulas for Primary and Secondary Processing can also be combined, reducing it to only 2 batches to maintain per day.
    b.) Can set any Quantity Values for the Ingredient, Product and By-Product in the Formula details with ANY Cost Allocation (amounting to a total of '1') in the Products section
    c.) Set the Validity Rules as 'PRODUCTION' for the 3 Recipes
    d.) Complete the steps defined in the 'OPM Cost Management' (as described below)
    e.) Create the 2 or 3 batches and record the appropriate quantities at the end of the day before closing the batches
    OR should I (Option 2):
    a.) Create MULTIPLE Formulas (above 100 formulas) for Primary, Secondary and Further Processing based on the different products processed.
    b.) Can set any Quantity Values for the Ingredient, Product and By-Product in the Formula details with ANY Cost Allocation (amounting to a total of '1') in the Products section
    c.) Set the Validity Rules as 'PRODUCTION' for the 3 Recipes
    d.) Complete the steps defined in the 'OPM Cost Management' (as described below)
    e.) Create the MULTIPLE batches and record the appropriate quantities at the end of the day before closing the batches.
    b.) OPM Cost Management:
    ========================
    Whether (Option 1 or Option 2) selected, the below needs to be set for OPM Costing:
    a.) Define multiple formulas (above 100), as in Option 2.
    b.) Set the Quantity value to be '1' for the Ingredient, Product and By-Product in the Formula details with the appropriate Cost Allocation in the Products section
    c.) Set the Validity Rules as 'COSTING' and 'PRODUCTION' for each Recipe
    d.) Run Cost Rollup at least once so that the products can have an item cost per unit
    As per me, for the purpose of costing, it would be imperative to have multiple batches (created one time only) with appropriate cost allocation in the formulas, and the 'Recipe Use' in the validity rules should be set to 'COSTING'. Then set the profile option 'GMF: Use Only Costing Validity Rules for Cost Rollup' to 'Yes'. In this way, we are sure that the different products in the formula, with correct cost allocation, will have their item cost calculated after performing the Cost Rollup. And for the purpose of operations, we can have only one combined formula of live birds (as ingredients) yielding all FGs for Primary and Secondary Processing with cuts; but this time, the 'Recipe Use' in the validity rules of the recipe should be set to 'PRODUCTION'.
    I want to confirm which approach (Option 1 or Option 2) is more appropriate in terms of operations, and that the proposed steps above are correct with no circular references (as certain finished products are also used as ingredients for other products in the Secondary Processing).
    Thanks and regards
    Raveesh Nobeen
    [email protected]
    Edited by: user12189219 on Jan 20, 2010 3:48 PM

    Hi Raveesh
    I am implementing OPM R12 in a poultry processing business, and I think Option 1 (create 3 formulas/recipes/batches) is more appropriate. In my case we are using actual costing (moving average) as the costing method.
    I have set the profile option GMF: Cost Allocation Factor Calculation to "Dynamic", to calculate the batch cost allocations as the ratio of the actual quantity of each product produced to the total batch output quantity, and I have used the OPM Financial Cost Allocation process to allocate GL expenses to the different products based on a fixed percentage according to product "value" (filets return more than wings).
    Can you please share your knowledge in this business area and confirm to what extent my approach is correct?
    Thank you and best regards
    Mamdouh Ragab

  • 'Dataguard remote process startup failed'

    Hi
    When I try to use Data Guard to configure my physical standby database (which, for testing purposes, is planned to be on the same machine as the primary), I get the error
    'Dataguard remote process startup failed'
    Please help!
    Can anyone give me the details of how to create a physical standby from scratch, considering that I am new to Oracle? I am doing it as per the documentation, but I am not able to start the standby database, which is on the same machine, using the pfile...
    Thanks
    Radhika

    Are you creating the standby database using Data Guard Manager through Oracle Management Server? If so, have you applied the patches?
    Here is an excerpt from Metalink note 204848.1:
    ====
    Data Guard Manager reports "Data Guard remote process startup failed." Error Dialog
    Fact
    Oracle Enterprise Manager Data Guard Manager 9.2.0.1.0
    Symptom
    "Data Guard remote process startup failed" error dialog in the Oracle
    Enterprise Manager 'Create Configuration' Wizard is displayed when you
    attempt to run Data Guard Manager and your environment is not setup
    correctly.
    Fix
    Check and address the following items in the order listed.
    1) Ensure the Oracle installations for both the primary and standby Oracle
    Homes contain Perl.
    On UNIX systems, verify this by looking in each Oracle Home for the
    $ORACLE_HOME/Apache/perl/bin/perl file. On NT systems, look for the
    file $ORACLE_HOME\apache\perl\5.00503\bin\mswin32-x86\perl.
    If this file is present, Perl is installed; and proceed to step 2.
    If not, then you must install the Perl that comes with Oracle.
    Follow these steps to install Perl on both the primary and standby
    Oracle Homes:
    a. Run the Oracle Installer and choose the Oracle Home in question.
    b. Choose the "Oracle9i Database 9.2.0.1.0" option.
    c. Choose "Custom" for the installation type.
    d. Under "Oracle Enterprise Manager Products 9.2.0.1.0", choose to
    install the "Enterprise Manager Website 9.2.0.1.0" item.
    e. Complete the install.
    f. Verify that the Perl installation succeeded by checking for the
    presence of the above file. If successful, proceed to step 2.
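    The UNIX check in step 1 can be sketched as a small shell helper. This is only an illustration: the helper name is hypothetical, and the only thing taken from the note is the `$ORACLE_HOME/Apache/perl/bin/perl` path.

    ```shell
    # Hypothetical helper: check whether the Oracle-bundled Perl is present
    # in a given Oracle Home (UNIX path cited in the note above).
    check_oracle_perl() {
      home="$1"
      if [ -x "$home/Apache/perl/bin/perl" ]; then
        echo "perl found"
        return 0
      else
        echo "perl missing"
        return 1
      fi
    }
    ```

    Run it once against the primary Oracle Home and once against the standby Oracle Home; if either reports "perl missing", follow the installation steps a-f above.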
    2) Ensure that all Data Guard Manager patches are installed in
    the OEM installation.
    The following two patches are independent of each other and must both be
    installed on all versions of 9.2.0:
    Patch Number 3409886 (supersedes Patch 2670975)
    Patch Number 2671349
    Download these patches from Oracle Support Services Website, Metalink,
    http://metalink.oracle.com. Activate the "Patches" button to get the
    patches Web page. Enter the Patch number and activate the "Submit"
    button. The patches are listed for the Windows and Solaris operating
    systems only. However, they are generic patches that will work on
    other platforms.
    After verifying that both patches are installed, proceed to step 3.
    3) Ensure the Oracle Intelligent Agent is running on both the primary and
    standby nodes.
    The Agent must be running for Data Guard to function. Check it by
    running agentctl status on the primary and standby nodes. If it is not
    running, start it by running the command agentctl start. After verifying
    that it is running on both nodes, proceed to step 4.
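    Step 3 is easy to wrap in a check-then-start sketch. The helper below is hypothetical; it takes the status and start commands as arguments (in practice `agentctl status` and `agentctl start`, as the note says) so the logic itself is self-contained.

    ```shell
    # Sketch: start the Intelligent Agent only if it is not already running.
    # Run on both the primary and standby nodes.
    ensure_agent() {
      status_cmd="$1"   # e.g. "agentctl status"
      start_cmd="$2"    # e.g. "agentctl start"
      if $status_cmd >/dev/null 2>&1; then
        echo "agent already running"
      else
        $start_cmd
        echo "agent started"
      fi
    }
    ```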
    4) Ensure that the user specified in the primary and standby node preferred
    credentials can successfully run an OEM job.
    Data Guard Manager requires that OEM preferred credentials are specified
    for both the primary and standby node. Furthermore, these credentials
    must specify a user(s) who has read/write permissions in the primary
    and standby Oracle Homes, respectively.
    Note well: databases on Windows NT/2000 require that the user(s)
    specified in the credentials be granted the "Log on as a batch job"
    privilege.
    After checking that preferred node credentials are set correctly on both
    nodes, run a test OEM job to verify that jobs are functioning properly.
    Do this as follows:
    a) Select the Create Job item in the OEM Console Job menu.
    b) Give the job a name, select Node as the Target Type, select and
    add the primary node as the target.
    c) Go to the Tasks tab, select and add "Run OS Command".
    d) Go to the Parameters tab and specify a simple OS command such
    as ls on UNIX, or dir on NT.
    e) Click Submit.
    f) Once the job is finished, select the Jobs icon in the OEM Console
    tree, select the History tab, and examine the Job output.
    If it was successful, jobs are working and you can try the Data
    Guard Create Configuration wizard again.
    If the job failed, you must troubleshoot this problem and fix it
    before continuing with Data Guard Manager.
    5) If problems persist, obtain additional information and log a
    Support Request (iTAR).
    If everything in the previous steps is OK, and Data Guard Manager
    continues to fail, additional information is required to diagnose the
    problem. The following two items must be obtained and submitted to
    Oracle Support Services:
    a) OEM client tracing.
    The Create Configuration wizard must be run again in trace mode.
    This can be done by starting OEM from a command prompt (i.e., not
    the Windows NT start menu) as follows:
    oemapp trace console
    Client tracing messages will appear in the command window from
    which this command is run. Data Guard Manager should be started and
    the Create Configuration wizard should then be run up to the point
    where the error occurs. All tracing messages from Create
    Configuration wizard start-up to this point should then be captured
    and saved into a file.
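    Since the tracing messages appear in the command window, capturing them "into a file" can be done with a redirect. A generic sketch (the helper name is hypothetical; the only command taken from the note is `oemapp trace console`):

    ```shell
    # Sketch: run a command and save its combined stdout/stderr to a log
    # file while still showing it on screen, so the tracing can be watched
    # up to the point where the error occurs.
    capture_trace() {
      logfile="$1"; shift
      "$@" 2>&1 | tee "$logfile"
    }

    # Actual invocation from the note would be:
    #   capture_trace dg_trace.log oemapp trace console
    ```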
    b) Data Guard job output
    The output from any failed Data Guard jobs must be captured. This
    can be done as follows:
    1) Select the Jobs icon in the OEM Console tree.
    2) Select the History tab.
    3) Double-click on the latest "DataGuardCtrlXXXX" job that has a
    Failed status.
    4) An Edit Job window will pop up; double-click on the Failed line
    to obtain the output.
    5) Put the output into a file.
    If a Data Guard Manager bug is submitted, the tracing output and the
    job output will be required.
    ====
    Jaffar
