ETL process without using RSA1

Hi All,
How can I run the ETL process without using transaction RSA1?
Thanks...
Tom Pemalang

Hi Eddy,
   Is the file on the presentation server? If so, get the file loaded to the application server and get a process chain created for the data load. Then you need to give authorization for RSPC and have the data loaded via a process chain.
  Once the file is on the application server, you even have the option of setting up a background job.
Hope this helps.
best regards,
Kazmi

Similar Messages

  • Monitor ETL process using SNMP

    Hi,
    I have an ETL process that runs continuously and adds rows to a table under Oracle 11g on Linux. I would like to monitor the proper operation of the ETL using an external SNMP manager (e.g. Zabbix). To do this the algorithm would be like this:
    - List and count the rows that have been added to the table during the last n minutes;
    - Update an SNMP counter
    - Repeat every n minutes.
    I was thinking about creating a small Perl subagent for net-snmp that would execute an SQL query using JDBC, but is there a better way to do this? For instance by using an Ora
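    A minimal sketch of the counting step, assuming a hypothetical target table ETL_TARGET with a LOAD_TS timestamp column (both names are placeholders for whatever the ETL actually populates):

    -- Count rows added to the hypothetical ETL_TARGET table in the last :n minutes;
    -- LOAD_TS is assumed to be a timestamp column set at insert time.
    SELECT COUNT(*)
    FROM   etl_target
    WHERE  load_ts >= SYSTIMESTAMP - NUMTODSINTERVAL(:n, 'MINUTE');

    The subagent would run this query every n minutes and push the result into the SNMP counter.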

    I assume you're talking about scripting this? Probably the easiest thing to do would be to use curl. I've attached an example. You'll need to change a couple of values for it to work.
    The output is different from what you'd get if you did a "df" using the service account. You'll have to decide which output you require.

  • Bi Content in 7.0 - Did SAP Convert all content to use new ETL process?

    Hello.
    I am working with BI 7.0 for the first time.  I have 8 years of BW 3.x experience.  We are starting a new BI project and we are in the process of activating Business Content.  We have BI_CONT level 7 (the latest as of today, I believe).  It appears that SAP has not converted its Business Content over to use the new ETL process (Transformations, DTPs, etc.).  For example, I am looking at basic areas such as AP, AR, SD, MM, GL, etc., and the BI content is still using the 3.x ETL (transfer rules, update rules). Is something not installed right on my system, or has SAP not converted its content to the new ETL yet?
    Thanks in advance for your help.

    Jose,
    Some new content is released using the new DTP.  Most content remains in its delivered 3.x format.  If you want to use DTPs for this content, you have to manually create the DTPs after installing the 3.x objects.  If you right-click on an InfoCube, you will see some new options in the context menu, including "Create Transformation..." and "Create Data Transfer Process...".  The context menu for a DataSource now contains a "Migrate" option that will allow you to migrate from 3.x to the new DTP.  Also, other objects such as Transfer Rules and Update Rules contain the context menu option "Create Transformation."
    Hope this helps.

  • I have a Nikon D600 and D800 and I shoot everything in Raw. I use Photoshop Elements 9 for processing but I have been unable to convert any RAW files to open in PSE. Says unable to parse the file. What does this mean and how do I fix it?

    Since PSE 9 can only use up to Camera Raw 6.5, and the D600 needs at least 7.3 and the D800 at least 6.7, you can use the 8.6 Adobe DNG Converter to convert those files to DNG copies, which PSE 9 should then open.
    8.6 DNG Converter
    Windows
    Adobe - Adobe Camera Raw and DNG Converter : For Windows : Adobe DNG Converter 8.6
    Mac
    Adobe - Adobe Camera Raw and DNG Converter : For Macintosh : Adobe DNG Converter 8.6
    Note:
    If you have Windows XP or Vista, or Mac OS X 10.6, then you'll need to use the 8.3 DNG Converter instead
    Windows
    Adobe - Adobe Camera Raw and DNG Converter : For Windows : Adobe DNG Converter 8.3
    Mac
    Adobe - Adobe Camera Raw and DNG Converter : For Macintosh : Adobe DNG Converter 8.3
    How to use the DNG Converter
    Camera Raw: How to use Adobe DNG Converter - YouTube
    More info on supported cameras and Camera Raw plug-ins required
    Camera Raw plug-in | Supported cameras
    Camera Raw-compatible Adobe applications

  • Data/database availability between ETL process

    Hi
    I am not sure whether this is the right forum to ask this question, but I am still requesting help.
    We have a DSS database of size 1 TB. The batch process runs from 12:00 midnight until 6:00 am. The presentation/reporting schema is 400 GB in size. Through the nightly batch job, most of the tables in the presentation layer get truncated/modified. Due to the nature of the business requirement, this presentation layer needs to be available 24*7. As the ETL process modifies the database, the system is not available between 12:00 and 6 am.
    The business requirement is: before the ETL process starts, take a backup of the presentation layer, point the application to this backed-up area, and then let the ETL process proceed. Once the ETL process finishes, point the application back to the freshly loaded area.
    Given the size of the database and schema, does anyone know how to back up and restore the presentation layer within the same database in an efficient way?

    There are a couple of possibilities here. You certainly need two sets of tables, one for loading and the other for presentation. You could use synonyms to control which one is the active reporting set and which is the ETL set, and switch them over at a particular time. Another approach would be to use partition exchange to swap data and index segments between the two table sets.
    I would go for the former method myself.
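    A minimal sketch of the synonym-switch approach, using hypothetical table names SALES_A and SALES_B (the application always queries the synonym, never the base tables):

    -- Suppose SALES_A is currently live and SALES_B has just been reloaded.
    -- Repointing the synonym makes the fresh set visible to reporting at once;
    -- SALES_A then becomes the load target for the next ETL run.
    CREATE OR REPLACE SYNONYM sales FOR sales_b;

    The next night the roles reverse and the synonym is pointed back at SALES_A.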

  • Custom ETL processes creation in OWB

    Hi, we are working in a Oracle Utilities BI implementation project.
    Some of the identified KPIs need custom development - the creation of complete ETL processes:
    - Extractor in CC&B (in COBOL)
    - Workflows in OWB
    - Configuration in BI
    We were able to create 4 custom ETL processes (2 fact and 2 related dimensions) including the COBOL extract program, the OWB entities (Files, Staging tables, Tables, Mappings and Workflows) and the BI tables.
    We already have the data in the BI database and in some cases we are able to configure zones to show the data in BI.
    We are facing some problems when we try to use a custom Fact and a custom Dimension together, for instance:
    Case 1. When we create a zone to show: Number of quotes - Measure 1 : tf=CM_CF_QUOTE.FACT_CNT func=SUM dt=END_DATE_KEY, the graph is displayed.
    Case 2. When we create a zone to show: Number of accepted quotes - Measure 1 : tf=CM_CF_QUOTE.FACT_CNT func=SUM dt=END_DATE_KEY and Fixed Dimensional Filter 1 : tf=CM_CD_QUOTE_TYPE.QUOTE_STATUS_CD oper==(10), the graph is displayed.
    Case 3. When we create a zone to show: Number of ongoing quotes - Measure 1 : tf=CM_CF_QUOTE.FACT_CNT func=SUM dt=END_DATE_KEY and Fixed Dimensional Filter 1 : tf=CM_CD_QUOTE_TYPE.QUOTE_STATUS_CD oper==(20), the graph is displayed.
    Case 4. But when we create a zone to show: Number of quotes sliced by quote status - Measure 1 : tf=CM_CF_QUOTE.FACT_CNT func=SUM dt=END_DATE_KEY and Dimensional Filter 1 : tf=CM_CD_QUOTE_TYPE.QUOTE_STATUS_CD, no graph is displayed.
    Case 5. A different problem occurs with the single fact. We try to show the number of processes of a given type; the type of the process is a UDDGEN. When we load the zone - Measure 1 : tf=CM_F_CF_CSW.FACT_CNT func=SUM dt=ACTV_DATE_KEY and Fixed Dimensional Filter 1 : tf=CM_F_CF_CSW.UDDGEN1 oper==(ENTRADA) - the following error appears:
    An error is displayed: No join defined between fact table CM_F_CF_CSW and dimension table CM_F_CF_CSW. The dimension table entered in the zone parameter could be invalid for this fact table.
    Has anyone had the same problem?

    Hi user11256032, I just stumbled upon this by accident. The reason no-one has answered yet is that it is in the wrong forum. (I can understand why you thought it belonged here.) Please post the question to the Oracle Utilities forum, which is here: Utilities. If that link doesn't work, go to Forum Home, then choose Industries, then Utilities. You may have to select "More ..." under Industries.
    Actually, I suspect there was an SR created for these, so your question may have been answered already.
    If you don't mind me asking, which customer is this for?
    Jeremy

  • ETL processing Performance and best practices

    I have been tasked with enhancing an existing ETL process. The process includes dumping data from a flat file to staging tables and processing records from the initial tables into the permanent tables. The first step, extracting data from the flat file to staging tables, is done by BizTalk; no problems here. The second part, processing records from the staging tables and updating/inserting the permanent tables, is done in .NET. I find this process inefficient and prone to deadlocks, because the code loads the data from the initial tables (using stored procs), loops through each record in .NET, makes several subsequent calls to stored procedures to process the data, and then updates the record. I see a variety of problems here; the process is very chatty with the database, which is a big red flag. I need some opinions from ETL experts so that I can convince my co-workers that this is not the best solution.
    Anonymous

    I'm not going to call myself an ETL expert, but you are right on the money that this is not an efficient way to work with the data. Indeed very chatty. Once you have the data in SQL Server - keep it there. (Well, if you are interacting with another data source, it's a different game.)
    Erland Sommarskog, SQL Server MVP, [email protected]
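    As a hedged illustration of the set-based alternative, assuming hypothetical staging and permanent tables stg.Orders and dbo.Orders keyed on OrderID (all names are placeholders):

    -- One set-based statement in place of the per-row loop of procedure calls.
    -- Schema, table and column names are hypothetical.
    MERGE dbo.Orders AS tgt
    USING stg.Orders AS src
       ON tgt.OrderID = src.OrderID
    WHEN MATCHED THEN
        UPDATE SET tgt.Amount = src.Amount,
                   tgt.Status = src.Status
    WHEN NOT MATCHED THEN
        INSERT (OrderID, Amount, Status)
        VALUES (src.OrderID, src.Amount, src.Status);

    Processing the whole batch in one statement keeps the work inside SQL Server and removes the per-row round trips.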

  • In-memory lookups in ETL processing

    Hi All,
    I am writing a function in PL/SQL to convert units of measurement, e.g.:
    convert(from_unit, to_unit) returns float
    I have a couple of lookup tables that will be referenced in the function to convert units: table_1 has 100 rows and table_2 has 1000 rows.
    This function will be called during ETL processing for about half a million rows, which can mean significant I/O. I am considering two options:
    1) PL/SQL table - pre-load both lookups in PL/SQL tables and use them in the function.
    2) Cache the lookup tables.
    However, I would like some opinions on my approach. Could you please suggest which one is the better approach, or whether there is another method to achieve good performance?
    Thanks
    techTD
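    A minimal sketch of option 1, pre-loading a lookup into a package-level associative array so each conversion is a memory access rather than a query (UNIT_CONV and its columns are hypothetical names):

    CREATE OR REPLACE PACKAGE unit_pkg AS
      FUNCTION convert_unit(p_from VARCHAR2, p_to VARCHAR2) RETURN NUMBER;
    END unit_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY unit_pkg AS
      -- Keyed on 'FROM>TO'; loaded once per session on first use.
      TYPE t_factors IS TABLE OF NUMBER INDEX BY VARCHAR2(61);
      g_factors t_factors;

      PROCEDURE load_factors IS
      BEGIN
        FOR r IN (SELECT from_unit, to_unit, factor FROM unit_conv) LOOP
          g_factors(r.from_unit || '>' || r.to_unit) := r.factor;
        END LOOP;
      END load_factors;

      FUNCTION convert_unit(p_from VARCHAR2, p_to VARCHAR2) RETURN NUMBER IS
      BEGIN
        IF g_factors.COUNT = 0 THEN
          load_factors;
        END IF;
        -- Raises NO_DATA_FOUND if the pair is not in the lookup.
        RETURN g_factors(p_from || '>' || p_to);
      END convert_unit;
    END unit_pkg;
    /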

    910150 wrote: "Ok, I thought initially to write the function using a couple of methods (PL/SQL table etc)."
    There is no such thing as a PL/SQL "table". The correct term is array (and there are a couple of flavours that can be used).
    Yes, these reside in memory. But in the "wrong" memory in terms of scalability, and also in the "wrong" memory for optimal access and use by the SQL engine.
    That makes this a problematic approach for SQL use. If PL/SQL functions are to be used, consider defining them as deterministic in order to reduce the number of context switches to the PL/SQL engine when executing them.
    Better still is moving that PL/SQL function code inside the SQL statement itself.
    910150 wrote: "However, the logic inside is complex and would take a good amount of effort."
    SQL is vastly underestimated in terms of power and flexibility.
    910150 wrote: "I thought I would check in the forum whether someone has already done a similar experiment."
    No need for "experimenting". The facts are simple: calling PL/SQL functions from the SQL engine requires a context switch from the SQL engine to the PL/SQL engine.
    Context switching in any form (from between CPU rings inside a CPU, to between the SQL and PL/SQL engines in an application software layer) is an overhead that needs to be carefully considered and implemented so as to reduce it to a minimum.
    If you have such a context switch from the SQL engine to the PL/SQL engine in a SQL projection, resulting in a switch per row for half a million rows, that overhead will add significant run-time to the SQL process.
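    A hedged sketch of the two suggestions above, reusing the hypothetical UNIT_CONV table (and a hypothetical STAGING table for the join variant):

    -- DETERMINISTIC lets Oracle reuse results for repeated (from, to) pairs
    -- within a statement; this is only safe if the lookup data is static
    -- for the duration of the ETL run.
    CREATE OR REPLACE FUNCTION convert_unit(
        p_from VARCHAR2, p_to VARCHAR2) RETURN NUMBER DETERMINISTIC IS
      v_factor NUMBER;
    BEGIN
      SELECT factor INTO v_factor
      FROM   unit_conv
      WHERE  from_unit = p_from AND to_unit = p_to;
      RETURN v_factor;
    END;
    /

    -- Better still: no function, no context switch - just a join.
    SELECT s.id, s.qty * c.factor AS qty_converted
    FROM   staging s
    JOIN   unit_conv c ON c.from_unit = s.from_unit
                      AND c.to_unit   = s.to_unit;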

  • Not able to process open items using F-53

    Hello All,
    I am trying to process open items using F-53 and I am getting the error message "Entry SAG1 is missing in table T043G". This was working fine for GL accounts, but is failing for vendors.
    I checked the tolerance group for this and it is fine, and I also tried checking the table in SE16, but I was not able to find the root cause of the issue. Please let me know how I can resolve this.
    Thanks in advance.
    --Sagar

    Hi,
    Check the following...
    1. Make sure you have created one null (blank) tolerance group using transaction OBA3 for your company code SAG1...
    2. If you do not want to create a null tolerance group, then make sure you have mapped the tolerance group you created to the customer master, and also check the user assignment...
    Regards,
    Chintan Joshi

  • ETL Stops but no error logged

    Hi All,
    I have an ETL that runs OK in several environments, but having just migrated it to a new UAT environment we are experiencing regular unexplained shutdowns.
    There are no failed tasks, it just stops. This message appears in the server logs.
    1861 SEVERE Thu Aug 16 22:46:05 BST 2012 ETL seems to have completed. Invoking shut down dispatcher after the notification from the last running task.
    1862 SEVERE Thu Aug 16 22:46:05 BST 2012 Finishing ETL Process.
    The last task to complete was an SDE; it has an associated SIL which hasn't run, so why does the DAC think the ETL has finished?
    I see there have been similar posts in the past, but the difference here is that more than three hundred tasks had completed OK, and when I restart the ETL it runs the rest OK.
    Any ideas anyone?
    thanks
    Ed

    Hello Florian,
    this may be the usual problem of an active index during loading of a data cube. The index maintenance has to recreate the indexes while loading, and it has to do this in parallel if you have more than one data package loading. This can mean deadlocks or memory problems.
    Please drop the index of the cube (Manage->Performance->Delete Index (Now)) and retry your load. Don't forget to rebuild the index after the load.
    You can do this in a process chain too.
    Kind regards,
    Jürgen

  • RPM - How to re-process prices without using the front end.

    Hi all,
    Is there a way of re-processing prices without using the front end? For instance, using the front end, a new clearance is created with state ‘pricechange.State.conflictCheckforApproved’ in rpm_clearance, which is changed to ‘pricechange.State.worksheet’ after the conflict check.
    Now, I want to re-process the same clearance without using the front end. I’ve tried changing the STATE back to ‘pricechange.State.conflictCheckforApproved’, but with no success. Any suggestions?
    Thanks in advance.
    Regards

    Can you please attach your VI?
    Prashanth N
    National Instruments

  • Video transfer from computer to new ipad (ret disp) shows transferring process, but it's not in video or movie... Where'd it go?

    Video transfer from computer to new ipad (ret disp) shows transferring process, but it's not in video or movie... Where'd it go? Sync automatically downloaded some - but was a bit prejudiced and selective with content, mainly all edited by me in Movie Maker (Windows 7, probably the reason).
    I have used technology since Neanderthals roamed the planet... don't quite understand all the nuances. So, my all-things-"i" (Apple/Mac) guru has arranged a meeting, and I'm impatient. I know some of my videos are in several formats, etc., and I don't quite understand the differences... to me, it's complicated, as Alec Baldwin would say.
    Unfortunately, I'm "late" to the game, metaphorically (Windows vs Apple/Mac). I have a couple of iPods and actually endorse the iJet remote device (a company from San Diego). I have Pro Tools and other high-tech gear in the pro keyboards/sequencing arena, where I've been writing computer music since the 80's and its evolution.

    The Apple video app is very limited in what it will play. If you have multiple video formats and they will play on an iPad, then you likely need a 3rd party video app. One such is BUZZPlayer, which is available in the iTunes App Store on a Mac or PC, or from the App Store app on an iOS device.
    You transfer the videos to 3rd party video apps using iOS File Sharing.
    http://support.apple.com/kb/ht4094

  • Lightroom opens fine initially, but when trying to open a different catalog it will go through the process but never reopen again. I can't get it to open at all until the computer is reset.

    Lightroom opens fine initially, but when trying to open a different catalog it will go through the process but never reopen again. I can't get it to open at all until the computer is reset. I have been using Lightroom 4 for 18 months now, and this problem just started happening in the last month.

    No error messages. It acts totally normal but just never reopens when I try to open a new catalog. I create a catalog for each of my clients... I have done this since I started using Lightroom, as it seems tidy to me. But I have never had this problem until now. I am using a PC with Vista and Lightroom 4. Thanks.

  • Problem with creating Process order confirmation using BAPI

    Hello,
    While creating a Process Order confirmation using BAPI_PROCORDCONF_CREATE_TT, the material document is getting created, but a line item is inserted in table AFRU without the material document number. When the confirmation is created manually using transaction COR6, the table is updated with the material document in the line item. Can anyone let me know what other attributes need to be passed in order to update the same?
    Thanks in Advance.
    Regards, Senthil G.

    Hello, I am working with the same BAPI. Can you please send me the code you use to fill the parameters? If I run into the same error as you, I will send you the solution if I manage to correct it.
    Thanks for your help.
    [email protected]
    Guatemala, Cempro ADATSA

  • Can deploy process but not workflow (10.1.3.10 beta)

    I tried to deploy several of the demo BPEL processes, but each time the process is deployed successfully while the workflow is not. For example OrderApproval.
    I tried setting properties in the build.properties file, but apparently to no avail. Changing them via the project's Ant properties did not help either.
    It looks like the payload JSP did not get deployed -- most of the process works, including the workflow bit, as long as one does not try to look into the worklist entry contents (which gives the 404 Not Found error, same as in the 10.1.2 version when the JSP page is not in the proper directory of hw/worklistexpress).
    I noticed some strange behavior in the build properties as stored in the project file (OrderApproval.jpr). There are a lot of duplicate ("fromjdev") entries in the oracle.jdeveloper.ant.AntRunConfiguration hash table. Also, each time I run a deploy, the value for oc4jinstancename is removed and a corresponding <null/> entry shows up in the file. I tried cleaning up the redundant fromjdev entries, but it did not help.
    Also hard coding the oc4jinstancename attribute to soademo did not help.
    Any suggestions on how to fix the build? See below for a log; I added printing of the deploy attributes to the build.xml file.
    Apache Ant version 1.6.5 compiled on June 2 2005
    Buildfile: C:\JDevSOA10.1.3.10beta\jdev\mywork\demos\OrderApproval\build.xml
    Detected Java version: 1.5 in: C:\JDevSOA10.1.3.10beta\jdk\jre
    Detected OS: Windows XP
    parsing buildfile C:\JDevSOA10.1.3.10beta\jdev\mywork\demos\OrderApproval\build.xml with URI = file:///C:/JDevSOA10.1.3.10beta/jdev/mywork/demos/OrderApproval/build.xml
    Project base dir set to: C:\JDevSOA10.1.3.10beta\jdev\mywork\demos\OrderApproval\bpel
    Property ${home} has not been set
    [property] Loading C:\JDevSOA10.1.3.10beta\jdev\mywork\demos\OrderApproval\bpel\${home}\samples\common.properties
    [property] Unable to find property file: C:\JDevSOA10.1.3.10beta\jdev\mywork\demos\OrderApproval\bpel\${home}\samples\common.properties
    Build sequence for target(s) `validateTask' is [validateTask]
    Complete build sequence is [validateTask, deployTaskForm, loadproperty, compile, deployProcess, all, ]
    validateTask:
    [echo] --------------------------------------------------------------
    [echo] // Validating workflow
    [echo] --------------------------------------------------------------
    [validateTask] url is file:/C:/JDevSOA10.1.3.10beta/integration/bpm/support/files/WorkflowTaskDefinition.xsd
    [validateTask] Validation of workflow task definitions is completed without errors
    Build sequence for target(s) `deployProcess' is [deployProcess]
    Complete build sequence is [deployProcess, validateTask, deployTaskForm, loadproperty, compile, all, ]
    deployProcess:
    [echo] --------------------------------------------------------------
    [echo] // Deploying bpel process OrderApproval on localhost and port 8888
    [echo] --------------------------------------------------------------
    [deployProcess] Deploying process C:\JDevSOA10.1.3.10beta\jdev\mywork\demos\OrderApproval\bpel\..\output\bpel_OrderApproval_1.0.jar
    [deployProcess] Successfully deployed the process "OrderApproval" on server "localhost" and port "8888"
    Build sequence for target(s) `deployTaskForm' is [deployTaskForm]
    Complete build sequence is [deployTaskForm, validateTask, loadproperty, compile, deployProcess, all, ]
    deployTaskForm:
    [echo] --------------------------------------------------------------
    [echo] // Deploying workflow form on localhost and port 8888
    [echo] --------------------------------------------------------------
    [echo] hostname=localhost httpport=8888 rmiport=23791
    [echo] platform=oc4j opmnrequestport=23791
    [echo] oc4jinstancename= asinstancename=soademo domain=default rev=1.0
    BUILD FAILED
    C:\JDevSOA10.1.3.10beta\jdev\mywork\demos\OrderApproval\build.xml:52: Error while deploying the form on server "{0}" Error message :
    com.evermind.client.orion.AdminCommandException: Could not connect to the remote server. Please check if the server is down or the client is using invalid host, ORMI port or password to connect: Connection refused: connect
    at com.evermind.client.orion.Oc4jAdminConsole.createCannotConnectException(Oc4jAdminConsole.java:175)
    at com.evermind.client.orion.Oc4jAdminConsole.executeCommand(Oc4jAdminConsole.java:128)
    at com.collaxa.cube.ant.taskdefs.DeployForm.deployOC4J(DeployForm.java:612)
    at com.collaxa.cube.ant.taskdefs.DeployForm.deployForm(DeployForm.java:521)
    at com.collaxa.cube.ant.taskdefs.DeployForm.deployForms(DeployForm.java:780)
    at com.collaxa.cube.ant.taskdefs.DeployForm.execute(DeployForm.java:806)
    at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:275)
    at org.apache.tools.ant.Task.perform(Task.java:364)
    at org.apache.tools.ant.Target.execute(Target.java:341)
    at org.apache.tools.ant.Target.performTasks(Target.java:369)
    at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1216)
    at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
    at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:40)
    at org.apache.tools.ant.Project.executeTargets(Project.java:1068)
    at oracle.jdevimpl.ant.runner.AntLauncher.launch(AntLauncher.java:321)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at oracle.jdevimpl.ant.runner.InProcessAntStarter.runAnt(InProcessAntStarter.java:295)
    at oracle.jdevimpl.ant.runner.InProcessAntStarter.mav$runAnt(InProcessAntStarter.java)
    at oracle.jdevimpl.ant.runner.InProcessAntStarter$1.run(InProcessAntStarter.java:71)
    Caused by: javax.naming.CommunicationException: Connection refused: connect [Root exception is java.net.ConnectException: Connection refused: connect]
    at com.evermind.server.rmi.RMIClient.lookup(RMIClient.java:258)
    at com.evermind.server.rmi.RMIClientContext.lookup(RMIClientContext.java:51)
    at com.evermind.client.orion.Oc4jAdminConsole.executeCommand(Oc4jAdminConsole.java:126)
    ... 20 more
    Caused by: java.net.ConnectException: Connection refused: connect
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
    at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
    at java.net.Socket.connect(Socket.java:507)
    at java.net.Socket.connect(Socket.java:457)
    at java.net.Socket.<init>(Socket.java:365)
    at java.net.Socket.<init>(Socket.java:207)
    at com.evermind.server.rmi.RMIClientConnection.createSocket(RMIClientConnection.java:645)
    at oracle.oc4j.rmi.ClientSocketRmiTransport.createNetworkConnection(ClientSocketRmiTransport.java:58)
    at oracle.oc4j.rmi.ClientRmiTransport.connectToServer(ClientRmiTransport.java:78)
    at oracle.oc4j.rmi.ClientSocketRmiTransport.connectToServer(ClientSocketRmiTransport.java:68)
    at com.evermind.server.rmi.RMIClientConnection.connect(RMIClientConnection.java:609)
    at com.evermind.server.rmi.RMIClientConnection.sendLookupRequest(RMIClientConnection.java:153)
    at com.evermind.server.rmi.RMIClientConnection.lookup(RMIClientConnection.java:137)
    at com.evermind.server.rmi.RMIClient.lookup(RMIClient.java:249)
    ... 22 more
    at com.collaxa.cube.ant.taskdefs.DeployForm.execute(DeployForm.java:818)
    at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:275)
    at org.apache.tools.ant.Task.perform(Task.java:364)
    at org.apache.tools.ant.Target.execute(Target.java:341)
    at org.apache.tools.ant.Target.performTasks(Target.java:369)
    at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1216)
    at org.apache.tools.ant.Project.executeTarget(Project.java:1185)
    at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:40)
    at org.apache.tools.ant.Project.executeTargets(Project.java:1068)
    at oracle.jdevimpl.ant.runner.AntLauncher.launch(AntLauncher.java:321)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at oracle.jdevimpl.ant.runner.InProcessAntStarter.runAnt(InProcessAntStarter.java:295)
    at oracle.jdevimpl.ant.runner.InProcessAntStarter.mav$runAnt(InProcessAntStarter.java)
    at oracle.jdevimpl.ant.runner.InProcessAntStarter$1.run(InProcessAntStarter.java:71)
    Total time: 2 seconds
    Snippet from project file:
    <hash n="oracle.jdeveloper.ant.AntRunConfiguration">
    <url n="buildfileURL" path="build.xml"/>
    <value n="outputLevel" v="3"/>
    <list n="propertyNames">
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="user"/>
    <string v="password"/>
    <string v="type"/>
    <string v="host"/>
    <string v="port"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="asinstancename"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="fromjdev"/>
    <string v="opmn.requestport"/>
    <string v="admin.user"/>
    <string v="platform"/>
    <string v="domain"/>
    <string v="oc4jinstancename"/>
    <string v="hostname"/>
    <string v="http.port"/>
    <string v="admin.password"/>
    <string v="rmi.port"/>
    <string v="rev"/>
    </list>
    <list n="propertyValues">
    <string v="true"/>
    <string v="true"/>
    <string v="true"/>
    <string v="true"/>
    <string v="oc4jadmin"/>
    <string v="welcome1"/>
    <string v="oc4j"/>
    <string v="localhost"/>
    <string v="8888"/>
    <string v="true"/>
    <string v="true"/>
    <string v="true"/>
    <string v="23791"/>
    <string v="soademo"/>
    <string v="true"/>
    <string v="true"/>
    <string v="true"/>
    <string v="true"/>
    <string v="true"/>
    <string v="23791"/>
    <string v="oc4jadmin"/>
    <string v="oc4j"/>
    <string v="default"/>
    <null/>
    <string v="localhost"/>
    <string v="8888"/>
    <string v="welcome1"/>
    <string v="23791"/>
    <string v="1.0"/>
    </list>
    <list n="targetList">
    <string v="validateTask"/>
    <string v="deployProcess"/>
    <string v="deployTaskForm"/>
    <string v="deployDecisionServices"/>
    </list>
    </hash>

    When I change the rmi ports to 23793 via the project properties and save, they get updated correctly in the .jpr file. However, the deploy still fails, because somehow JDeveloper resets the values back to 23791 (just as it routinely wipes out the oc4jinstancename) and writes them to the .jpr file.
    How can I make the new value 'stick'?
    I tried changing the port of the LocalBPELApplicationServer from 23791 to 23793 but got:
    Error while getting remote MBeanServer for url: ormi://localhost:23793/default:
    Error reading application-client descriptor: Error communicating with server: Connection refused: connect; nested exception is:
         javax.naming.CommunicationException: Connection refused: connect [Root exception is java.net.ConnectException: Connection refused: connect]
    When I click on the [+] next to the connection I get this:
    oracle.oc4j.admin.jmx.shared.exceptions.JMXRuntimeException: Error while getting remote MBeanServer for url: ormi://localhost:23793/default
         at oracle.oc4j.admin.jmx.client.CoreRemoteMBeanServer.fetchMBeanServerEjbRemote(CoreRemoteMBeanServer.java:499)
         at oracle.oc4j.admin.jmx.client.CoreRemoteMBeanServer.<init>(CoreRemoteMBeanServer.java:160)
         at oracle.oc4j.admin.jmx.client.RemoteMBeanServer.<init>(RemoteMBeanServer.java:128)
         at oracle.oc4j.admin.jmx.client.RemoteMBeanServer.getMBeanServer(RemoteMBeanServer.java:158)
         at oracle.oc4j.admin.jmx.client.ClientMBeanServerProxyFactory.getMBeanServer(ClientMBeanServerProxyFactory.java:68)
    <snip>
    And even then, the deploy process still forces the port to 23791.
    I restarted the server and JDeveloper, but the problem persists.
