OPA and DSC?

Hi Dev Team,
I work with a prospect where we have to demonstrate how OPM can work with external databases and get data from them for executing the assessment. The Developer help guide describes the process, but I only have OPM and OPA installed on my laptop. What DSC tool do you recommend installing that works with the tool?
Thanks, Peter
Edited by: user807507 on 2010.09.06. 2:08

Standard practice is to retrieve the data from the database, pass it to Determinations Server via a SOAP request, then save the answer(s) back to the database.
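As a rough illustration of that flow, here is a minimal Python sketch. The endpoint URL, SOAP element names, and database schema below are placeholders, not the actual Determinations Server contract - the real request/response format comes from the WSDLs the server publishes:

# Sketch: read a row from a database, send it to a SOAP endpoint,
# and store the returned answer. All names are hypothetical placeholders;
# consult the Determinations Server WSDL for the real message format.
import sqlite3
import urllib.request

DB_PATH = "cases.db"  # hypothetical local database
ENDPOINT = "http://localhost:8080/determinations-server/assess/soap"  # placeholder URL

conn = sqlite3.connect(DB_PATH)
row = conn.execute("SELECT id, income FROM cases WHERE id = ?", (1,)).fetchone()

# Build a SOAP envelope by hand (placeholder element names).
envelope = f"""<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <assess-request>
      <income>{row[1]}</income>
    </assess-request>
  </soap:Body>
</soap:Envelope>"""

req = urllib.request.Request(
    ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
with urllib.request.urlopen(req) as resp:
    answer_xml = resp.read().decode("utf-8")

# Parse the answer(s) out of answer_xml (omitted) and write them back.
conn.execute("UPDATE cases SET answer_xml = ? WHERE id = ?", (answer_xml, row[0]))
conn.commit()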
If you want to do this in batch mode, then you could consider creating a .csv extract of the database, running it through the Data Source Connector (included in the Oracle Policy Automation packages), and then pushing the results back into the database.
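The .csv extract side of that batch route is plain standard-library work; a sketch follows (table and column names are made up, and the Data Source Connector run itself happens between the two halves):

# Sketch: dump a database table to a .csv file for a batch run,
# then load the assessed results back in afterwards.
# Table and column names are hypothetical.
import csv
import sqlite3

conn = sqlite3.connect("cases.db")

# Export: one row per case for the batch run.
with open("cases_in.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["case_id", "income", "dependants"])
    writer.writerows(conn.execute("SELECT id, income, dependants FROM cases"))

# Import: push the assessed results back into the database.
with open("cases_out.csv", newline="") as f:
    for case_row in csv.DictReader(f):
        conn.execute("UPDATE cases SET eligible = ? WHERE id = ?",
                     (case_row["eligible"], case_row["case_id"]))
conn.commit()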
It really depends on what you are trying to achieve in terms of real-time responsiveness, and what the orchestrating process is aiming to do.
Edited by: Davin Fifield on 12/09/2010 17:19

Similar Messages

  • 100% CPU using LabVIEW 7.1 and DSC module

    Seven years ago I used BridgeView and PXI to execute a project, and the results were so good that after that initial system, I installed one more using BridgeView, and three more using LabVIEW 7.1.
    BridgeView is working very well, but lately, LabVIEW 7.1 is giving me more and more problems. It all started in a system running LabVIEW 7.1 and DSC on Windows XP, using a PXI-8187 controller with a PXI-1042 chassis. I have two PXI-6533 digital I/O boards, one PXI-6030E multifunction, one PXI-6713 analog output board, one PXI-6602 counter board and one PXI-8422 serial com. board. In May 2006, all of a sudden, CPU usage went to 100%. Since I am running PIDs to control a flash-butt welding machine in a steel factory, this was extremely dangerous. I changed the controller and everything went back to normal. It was running fine until December 2006, when the same behavior appeared again. This time changing the controller did not help. We increased the system memory from 256 MB to 512 MB and everything worked fine. But one month later (January 2007) the fault reappeared. This time we changed the chassis, and from that moment until now we have been running smoothly.
    Application problems were suspected from the very first time this problem appeared, but I was unable to identify the source, if any. I used Profile VIs, and apparently the write/read tags were taking all the processor resources. However, changing the routines, disabling communications, and optimizing CPU-intensive programs never solved the CPU overload.
    Until now, I was aware of only one system showing this erratic behavior. But today I went to check another system, with a totally different application. This one is used to measure the thickness of the steel sheet on a rolling mill. It is not so resource-intensive, but the maintenance folks told me that every time they turned off this particular system, they always had problems trying to turn it on.
    I suspected a PLC-communication-related issue, but what I found set off my alarms. I turned off the system, which was fully functional, and turned it back on, and there it was: CPU load at 100%! It is exactly the same problem that I have on the welder. This system has a PXI-8184, a PXI-1042 chassis, Windows XP, one PXI-6030E, one PXI-6713 and one PXI-6533. I battled for 20 minutes to put the system back in normal condition.
    I specified every single board, installed everything and programmed all the applications. The BridgeView applications have never shown anything like this. I am using the DSC (Datalogging and Supervisory Control) Module on both applications, and the Lookout protocol drivers are communicating with Modicon PLCs.
    Please, I need help solving this issue. I believe none of your current Knowledge Base "100% CPU" articles apply to my case...
    Thanks in advance...
    Antonio Jimenez
    [email protected]

    Thanks for your reply...
    Yes, sometimes I have the feeling that the system works again because some file or database gets initialized after so many reboots.
    I intentionally turned off all event and alarm logging to disk. The historical data logging is also disabled. This is done in code every time the main VI is started, and was included precisely to save CPU processing power. However, I am communicating with PLCs, and of course I have to declare variables inside the Citadel database to make the communication possible.
    Right now I can't access the application, because the mill is rolling, but during the next maintenance stop I will check the database location and size, and I could change the directory location the next time the fault comes up.

  • LabVIEW: resource not found error code 24 when building executable using 8.20 and DSC module

    I am trying to build an executable and I keep getting an error and cannot seem to figure out what is wrong.  The whole error is "LabVIEW: resource not found.  An error occurred loading "interface 3.vi"  LabVIEW load error code 24: this VI cannot be loaded because it is broken and has no block diagram."
    I am using LV 8.20 and DSC 8.2.  I am trying to create an executable that has the web server running for using remote panels; I have network-published shared variables, am connecting to a FieldPoint unit, and I am using the NI system tray icon VIs to load the program to the system tray.
    I have my shared variable library added as a support file, as well as the lvdsc.ini and the tray icon files.  I have selected "Enable Enhanced DSC run time support" in the app builder, and did not remove any type defs, etc.
    I cannot get the .exe to run on my development machine or another machine; I get the same error.  I tried creating an installer that installed the 8.2 run-time, DSC, FieldPoint, the variable engine and manager, and MAX.
    I could not find anything in the help files on the DSC; maybe someone else has some hints.  Thanks
    Kenny

    Hi Kenny,
    I hope you are doing well today! What is interface 3.vi? Is it possible for you to post your project over here? Also, I would recommend creating a simple DSC application and building it into an executable. Do you still get errors?
    Adnan Zafar
    Certified LabVIEW Architect
    Coleman Technologies

  • How OPA and OPM work

    Hello Guys,
    It's me again! :) I already deployed and installed OPA and OPM and was able to access the sample rulebases. I just have some questions / clarifications on how OPA and OPM works.
    1. When you build and run a project via OPM, the web-determinations page only opens locally. It will not be directly loaded/deployed to the web server unless the compiled project (rulebase) is copied into the *.war file. I think OPM is really for developers’ use only. Once the project is tested, the developer will request to deploy it to the web server. Please confirm if my understanding is correct.
    2. Is there a technical architecture diagram that shows how the OPA components work, or how they are connected with each other? When I access the sample rulebase, the only URL I see is the web-determinations one. I'm not sure what the purpose of the other components is, or whether they work independently.
    3. Should the contents of the web-determinations and determinations-server directories be the same? For instance, if we deploy a new rulebase/plugin to web-determinations, should we also copy those files to determinations-server?
    4. For the interview portlet, will it work with 'Oracle WebCenter Portal 11g Patch Set 5 (11.1.1.6.0)'? The OPA install guide states that 'Consume the Interview Portlet on WebCenter can be done on any instance of WebCenter and will require you to have a working portal application'. What does the portal application mean? If we integrate OPA with PeopleSoft, do we need to install the PeopleSoft Portal application? For now, we only plan to use OPA with the PeopleSoft HRMS, Financials and CRM applications. Please confirm if we need to install the PS Portal application as well.
    This is all I have for now... Thanks in advance for your assistance. :)
    Regards,
    Ann M.

    1. Generally, OPM is for business users, not developers (i.e. policy or business analysts who communicate with and understand business terminology and express business policies using that terminology). Developers play a role in integrating the policy model with another system's data model - but that should primarily be a data mapping exercise, plus design and implementation of integration when necessary.
    2. There is a good bit of information on architecture, components, etc. covered in both the developer documentation on OTN ( http://docs.oracle.com/html/E38272_01/toc.htm ) and high level overviews on the OPA YouTube Channel: http://www.youtube.com/user/OraclePAVideos
    3. You only need both if you plan on using both (i.e. Oracle Web Determinations is for running interviews to collect data to be used in making a determination; Oracle Determinations Server is for SOA - i.e. a SOAP endpoint which publishes WSDLs that define the request/response formats for integration via web services with other apps).
    4. Portal apps / infrastructure is only required if you are deploying the OPM project within a portal (i.e. as an interview running within a portal app).
    (3 and 4 above are related. You didn't ask, but a fourth option for deployment integration is via an API, so basically an OPM project can be deployed as one or more of the following: an interview app (Web Determinations), an interview within a portal, a web services endpoint (Determinations Server), or integrated via an API. The YouTube content on architecture overview and integration will explain in a bit more detail, and the OTN content will go into much more detail on each of the options.)

  • Inquiry on OPA and OPM Minimum Disk Space Requirement

    Hi,
    We would like to know the required disk space (or minimum disk space needed) to install OPA and OPM. Thanks!
    Regards,
    Ann Miranda

    aclm_219 wrote:
    Hi Frank,
    Thanks for the reply. We are new to OPA and have no knowledge of how it works and integrates with PeopleSoft applications (beyond its features and the OPA runtime components). Your assistance is highly appreciated.
    Just some follow-up questions:
    - How much memory should we allocate for OPM and OPA? We'll use Oracle WebLogic for OPA on Java.
    - Do we need to create a separate database instance for OPA? If yes, would you know how much disk space we should allocate for it? If not, where will OPA store its data?
    Regards,
    Ann
    Ann,
    Oracle Policy Automation communicates with systems like PeopleSoft via web services. For this the runtime components must run in a J2EE application server (Java) or IIS (.NET). While the footprint of the OPA runtime is very small, application servers usually require a fair amount of memory allocated to them.
    There is no OPA-to-PeopleSoft connector as a product, although Oracle Consulting has built several integrations for customers specific to their needs.
    OPA does not require any database at all to run. However, it does not persist results. An integration between OPA and another system typically involves OPA sending the results of an interview or assessment to that system, which is then responsible for doing something with the results.
    You can find lots of information at http://www.oracle.com/technetwork/apps-tech/policy-automation
    Cheers
    Frank

  • Process Failure when communicating over MODBUS using LabVIEW 2011 and DSC

    I'm currently trying to read from a PLC's holding registers using MODBUS/TCP. I've confirmed that the PLC is updating the values and responding to MODBUS communication correctly using a third party program called Modbus Poll. However, when I try to poll the PLC using LabVIEW's shared variable engine, I am unable to read any values from the same addresses that I'm viewing with Modbus Poll.
    My setup simply consists of a PC connected directly to the PLC over Ethernet, with no router in between. I am using LabVIEW 2011 SP1 with the DSC module.
    I opened the NI Distributed Systems Manager to view the status of all shared variables in the Modbus library that I created and I've noticed that the CommFail bit is permanently set to "true". All other variables with a "read" access mode report "Process Failure". I've tried restarting the process as well as stopping and starting the local variable engine with no success. I've also restarted my computer several times to see if any services were failing, but this does not seem to have fixed the problem.
    I finally resorted to monitoring communications over the network card that I have the PLC plugged into via Ethernet using Wireshark and I've found that while Modbus Poll is communicating with the PLC, many MODBUS and TCP packets are sent and received. However, when solely using LabVIEW or the NI DSM to communicate with the PLC, there does not appear to be any communication over the network card.
    Something that may be worth noting is that I was able to communicate with the PLC and read values from it with the DSM on just one occasion, when I first figured out which addresses I should be reading from. It all stopped working shortly thereafter. Prior to this, "CommFail" was not usually set to "true" with my current configuration. Thinking that it was my firewall, I have since turned my firewall off, but this seems to have had no effect on the problem either.
    Any help on this matter would be appreciated.

    Just a thought, but I think the register addresses used by LabVIEW are one off of the actual register number. I was using a cRIO as a Modbus I/O server and had to shift the register addresses by 1 to get things to work correctly (can't recall if it was +1 or -1). This is documented somewhere on ni.com but I can't seem to find it now. But here is another link that may help:
    http://zone.ni.com/reference/en-XX/help/371618E-01/lvmve/dsc_modbus_using/
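    To see that off-by-one outside LabVIEW: Modbus holding register "40001" is address 0 on the wire, so two tools can disagree about which number you type. A quick check is sketched below with the third-party pymodbus library - an assumption, since the poster used Modbus Poll and the Shared Variable Engine, and the pymodbus API varies slightly between versions:

    # Sketch: read the same holding register at two adjacent addresses to spot
    # an off-by-one between tools. Register "40001" is protocol address 0.
    from pymodbus.client import ModbusTcpClient  # pymodbus 3.x

    client = ModbusTcpClient("192.168.1.10", port=502)  # hypothetical PLC address
    client.connect()

    for addr in (0, 1):  # compare address N and N+1
        result = client.read_holding_registers(addr, count=1)
        if result.isError():
            print(f"address {addr}: error {result}")
        else:
            print(f"address {addr}: value {result.registers[0]}")

    client.close()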
    Dan

  • Remote Panel and DSC interfering?

    Hi, I wonder if DSC or OPC is interfering with viewing my remote panel from an executable, since I installed them and now I cannot see my web page. I mean, the instructions say to close LabVIEW to gain access with the executable; is DSC, the shared variable engine, or something else interfering?
    What happened:
    I have a program which monitors the temperature of rooms with fruit. We acquire the temps through Modbus RTU, and we view that info over the internet.
    First I did it with the Modbus libraries, and it worked; we had the web page working through remote panels and the web publishing tool.
    But then I saw the DSC module for LabVIEW and, excited about that, I migrated the whole functionality to the I/O servers. At the very end, when everything else works OK, the web page / remote panel doesn't: it works with the VI but not with the EXE.
    The first time I followed this, and I did so again this time with no success:
    http://digital.ni.com/public.nsf/allkb/7F95D43D3F50FCAC8625710E000068E1
    Current state:
    My web server is started and works in development. I made a webpage named TRY.html, and it works in LabVIEW with the VI.
    I know that the plugin doesn't work on Chrome, but just to show you that the problem is not Internet Explorer: when the web page is static, it works on Chrome too.
    [Screenshots: the remote panel page working in Internet Explorer, failing in Chrome]
    So I followed the steps from the instructions, added the web page to my project, added it to the build, and built.
    Then I modified my NI config server file and CLOSED LabVIEW entirely, as the tutorial says; I closed at least as much as I did before.
    My NI config:
    # Web server configuration file.
    # Generated by LabVIEW 14.0
    # 13/06/2015 11:44:16 p. m.
    # Global Directives
    NI.AddLVRouteVars
    ErrorLog "$LVSERVER_ROOT/logs/error.log", level=2, anew
    TypesConfig "$LVSERVER_ROOT/mime.types"
    LimitWorkers 10
    LoadModulePath "$LVSERVER_ROOT/..;$LVSERVER_ROOT/modules;$LVSERVER_ROOT/LVModules"
    LoadModule LVAuth lvauthmodule
    LoadModule LVSnapshot lvsnapshotmodule
    LoadModule LVRFP lvrfpmodule
    Listen 8000
    # Directives that apply to the default server
    NI.ServerName LabVIEW
    DocumentRoot "C:/SMC/data"
    InactivityTimeout 60
    SetConnector netConnector
    AddHandler LVAuth
    AddHandler LVSnapshot snap
    AddHandler LVRFP
    AddHandler LVSnapshot
    AddHandler fileHandler ""
    AddOutputFilter chunkFilter
    DirectoryIndex index.html
    My ini:
    [SMC]
    server.app.propertiesEnabled=True
    server.ole.enabled=True
    server.tcp.paranoid=True
    server.tcp.serviceName="My Computer/VI Server"
    server.vi.callsEnabled=True
    server.vi.propertiesEnabled=True
    WebServer.Enabled=True
    WebServer.TcpAccess="c+*"
    WebServer.ViAccess="+*"
    DebugServerEnabled=False
    DebugServerWaitOnLaunch=False
    saveFloaterLocations=True
    find.viListFlags=0
    LastErrorListSize=0,0,0,0
    paletteStyle="NamedIcons"
    And so, I open my EXE, then reload the web pages, and I can see that the web server is started, because it shows the HTML, but I cannot link to my EXE. It says that it can't find my VI:
    Requested VI is not loaded into memory on the server computer
    If I close the EXE, the server stops:
    Remote Panel connection refused by specific server, Make sure Labview Web server is enabled
    And that's my story.
    It's always hard for me to get the web page working, but this time I've spent hours without success.
    I did the test on my development PC and on the target PC and it is the same; on these PCs the webpage was working before using DSC.
    So any help will be appreciated
    My best regards

    The DSC Run-Time Engine will be sufficient for deploying the project to a new computer. The Run-Time Engine is equivalent to the DSC Module, but you can't do any programming with it. You will also need to purchase a separate run-time license for each deployment computer. 
    What is the DSC Run-Time System and When Should I Use It?
    http://digital.ni.com/public.nsf/allkb/E56DB8726DB68F288625770E00594351
    Thanks,
    Frank
    Application Engineer
    National Instruments

  • Release Management and DSC - How to queue a build when a release is in progress?

    I have VSO and Release Management 2013.4 working and deploying to Azure VMs via DSC.
    The build profile is set to trigger on commit to the Git repo, and the release template is set to be triggered on successful build from TFS.
    However, if a developer commits in quick succession, the resultant builds cause releases to overlap in RM - this causes some of the releases to fail with a DSC error ("the consistency check or pull cmdlet is in progress....").
    Is there a way to force RM to prevent concurrent releases? (Based on the same release template and build profile.)

    That's correct - in this instance it's VSO as source control, using the hosted Release Management to attempt to deploy to Azure VMs.
    And yes, the build takes about 5 minutes, but the release template can take up to 15 minutes to run - this means that a second build can cause the same release template to run again (it's set to trigger on build).
    Error log from RM:
    Exception Message: New deployment is not allowed as an another deployment is in progress. Retry the deployment after sometime. (type OperationFailedException)
    Exception Stack Trace: at Microsoft.TeamFoundation.Release.EnvironmentProvider.Azure.Implementation.AzureDeploymentProvider.ReadDeploymentResponse(DeploymentResponse response)
    at Microsoft.TeamFoundation.Release.EnvironmentProvider.Azure.Implementation.AzureDeploymentProvider.DownloadBuilds(DeploymentMachineSpecification deploymentMachineSpecification, AzureStorageSpecification azureStorageSpecification)
    at Microsoft.TeamFoundation.Release.EnvironmentProvider.Azure.Implementation.AzureDeploymentProvider.RunScript(String scriptPath, String configurationPath, MachineSpecification machine, StorageSpecification storage, Dictionary`2 configurationVariables)
    at Microsoft.TeamFoundation.Release.Tasks.DeployDsc.Execute(DscComponentParametersV2 dscComponentParameters, AzureStorage azureStorage, String[] dnsNameAndUpdatedWinRmPort, String userName, String password, String dscScriptPath, String dscConfigurationPath, Boolean skipCACheck)
    at Microsoft.TeamFoundation.Release.Automation.Tasks.DeployDscTask.DscExecute(DscComponentParametersV2 dscComponentParameters, AzureStorage azureStorage, String[] dnsNameAndUpdatedWinRmPort, String userName, String password, String dscScriptPath, String dscConfigurationPath, Boolean skipCACheck)
    at Microsoft.TeamFoundation.Release.Automation.Tasks.DeployDscTask.DscDeploy(AzureStorage azureStorage, DscComponentParametersV2 dscComponentParameters, String userName, String password, String dscScriptPath, String dscConfigurationPath, String skipCACheck)
    at Microsoft.TeamFoundation.Release.Automation.Tasks.DeployDscTask.Execute(IAutomationContext context)
    at Microsoft.TeamFoundation.Release.DistributedTask.TaskProcessor.TaskExecutor.Execute(TaskExecutionContext context)
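    No built-in RM setting for serializing releases appears in this thread. Purely as a generic mitigation sketch (not an RM feature), the deployment step itself can retry with backoff while the previous DSC consistency check finishes; in the Python below, deploy() is a hypothetical stand-in for whatever actually triggers the deployment:

    # Sketch: retry a deployment step with exponential backoff while a previous
    # DSC deployment/consistency check is still in progress.
    import time

    class DeploymentInProgressError(Exception):
        """Raised when the target reports that another deployment is running."""

    def deploy():
        # Hypothetical stand-in: replace with the real deployment trigger,
        # raising DeploymentInProgressError on the "consistency check or
        # pull cmdlet is in progress" failure.
        print("deploying...")

    def deploy_with_retry(max_attempts=5, initial_delay=60.0):
        delay = initial_delay
        for attempt in range(1, max_attempts + 1):
            try:
                deploy()
                return
            except DeploymentInProgressError:
                if attempt == max_attempts:
                    raise  # give up after the last attempt
                print(f"attempt {attempt} blocked; retrying in {delay:.0f}s")
                time.sleep(delay)
                delay *= 2  # exponential backoff

    deploy_with_retry()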

  • Shared Variable Properties and DSC

    Is there a way to assign engineering units to a shared variable as a
    configuration parameter?  This should be on the "Scaling" page of
    the shared variable properties.  It seems this is a logical and
    convenient place to track units.  Assigning units programmatically
    using the Scaling:Units property is awkward (to say the least).
    On similar lines, why aren't shared variable properties automatically
    saved to the DSC historical database?  Every trace should have a
    set of information that exposes ALL the shared variable properties that
    created it.  Take something like engineering units for
    example:  then you would know what kind of historical trace you
    are looking at!  This seems so basic I can't imagine why it was
    missed.
    Unless I missed something -- please enlighten me.
    Regards,
    David Moerman
    TruView Technology Integration Ltd.

    Hello David,
    You are correct in that the units property is not exposed on the
    Scaling page of the Shared Variable properties dialog, and it sounds
    like you are already well aware of the existing method to access this
    property through the property node interface.  As you have also
    discovered, the Citadel database that we use with DSC does not have a
    built-in provision for storing metadata about the shared variable from
    which a trace originates.  You can emulate this with, for example, an
    array of strings which is also logged to the database, with each string
    containing the metadata for a particular trace. 
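    Citadel is accessed from LabVIEW, so purely as a language-neutral illustration of that workaround - logging one metadata string per trace alongside the data - here is a sketch using Python's sqlite3 and JSON; the schema and names are made up:

    # Sketch: store per-trace metadata (e.g. engineering units) as a JSON string
    # next to the logged samples, emulating what the historical database
    # doesn't do natively. Table names are hypothetical.
    import json
    import sqlite3

    conn = sqlite3.connect("history.db")
    conn.execute("CREATE TABLE IF NOT EXISTS trace_meta (trace TEXT PRIMARY KEY, meta TEXT)")
    conn.execute("CREATE TABLE IF NOT EXISTS samples (trace TEXT, t REAL, value REAL)")

    # One metadata string per trace, written when the trace is created.
    meta = {"units": "degC", "range": [0, 150], "description": "furnace zone 3"}
    conn.execute("INSERT OR REPLACE INTO trace_meta VALUES (?, ?)",
                 ("zone3_temp", json.dumps(meta)))

    conn.execute("INSERT INTO samples VALUES (?, ?, ?)", ("zone3_temp", 0.0, 23.5))
    conn.commit()

    # Later, a viewer can recover the units for any historical trace.
    row = conn.execute("SELECT meta FROM trace_meta WHERE trace = ?",
                       ("zone3_temp",)).fetchone()
    print(json.loads(row[0])["units"])  # -> degC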
    Since it seems like having these features would benefit you, I
    encourage you to let our DSC development group know your needs by
    filling out product feedback, accessible on our site at
    http://www.ni.com/contact   .  This will send a feature request
    directly to the appropriate R & D group, who reads and evaluates
    every suggestion made.  This is your most direct way to let us know
    what features would best meet your needs.
    Cheers,
    Matt Pollock
    National Instruments

  • OPA and Subversion

    We are using OPA to set up an enterprise capability for business rules management. We will be calling the Determinations Server from a web service (controlled by Fusion orchestrations).
    Subversion is the source control tool we'll be using.
    Do you have any examples or reference sites for using OPA & Subversion? I am interested in the file layout that will best enable reuse of rules, and ease of discovery of existing rules.
    With regards to reuse, how can rules from one project be reused in another?
    thanks
    Andrew

    We do have many customers that are using Subversion as the version control for Oracle Policy Modeling.
    One way to share rules across OPM projects using subversion is:
    1. Have separate subversion folders for Project 1, Project 2, and Shared rules
    2. When working on Project 1, check out Project 1 and Shared rules to your local folder for Project 1
    3. When working on Project 2, check out Project 2 and Shared rules to your local folder for Project 2
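    As an illustration of that layout (repository and folder names are placeholders):

    repo/
      project1/        <- Project 1 rulebase files
      project2/        <- Project 2 rulebase files
      shared-rules/    <- rule documents reused by both projects

    working copy for Project 1:
      project1/        (checked out from repo/project1)
      shared-rules/    (checked out from repo/shared-rules)

    The shared folder can also be attached to each project checkout automatically with Subversion's svn:externals property, so steps 2 and 3 happen as part of a single checkout.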
    Davin Fifield

  • SignalExpress and DSC combination??

    Has anyone used the new SignalExpress software in combination with the tag engine to save/record array data from tags??

    Hello,
    To my knowledge an interface between SignalExpress and LabVIEW DSC has not been created. There aren't any built-in features of SignalExpress that allow interfacing to the DSC module and vice-versa.
    It is conceivable that this interface could be created since SignalExpress allows calling a LabVIEW VI. A VI could be written which would access data from LabVIEW DSC.
    Hope this helps!
    Ken S.
    Applications Engineering
    National Instruments

  • OPA and Siebel E Commerce

    Hi,
    Can OPA be integrated with Siebel E Commerce Order Process?
    Siebel E Commerce is a portal based on Oracle application server and Siebel business applications. It is integrated with Siebel business apps using web services.
    Thanks,
    Sachin

    Sachin,
    I haven't heard of any integrations with Siebel E Commerce, but the OPA Connector for Siebel is designed to integrate with any Siebel 8.x install. OPA uses web services to communicate with Siebel, and as its general integration approach.
    Cheers
    Frank

  • OPA and configuration folder rename

    Hi All,
    As we all know, there is a configuration folder present in WEB-INF/classes. I wanted to rename this configuration folder to configuration_mm, and to that end I edited the application.properties file. However, I now need two folders: configuration_mm, where I keep my messages_en-US.properties file, and configuration, holding application.properties.
    Is there any way I can keep my application.properties file in configuration_mm and discard the need for the configuration folder, i.e. tell my OPM to read application.properties from configuration_mm\application.properties?

    Kunal,
    This can be achieved using the shared library concept.

  • Problem with DSC shared variables and error 1950679023

    I'm having trouble using shared variables on my laptop PC (with Windows XP) using LabVIEW 8.2 and DSC.   I've set up a simple project with one VI and one shared variable library.  In the VI, I have a while loop that writes a number from a front panel control to the shared variable.  Another loop running in parallel reads the shared variable and displays it on an indicator on the front panel.   The VI resides on my PC (no remote targets). This program executes just fine on my desktop PC; however, I get the following error codes out of the shared variable nodes when I run the project and VI on my laptop, for the write and read respectively: ni_tagger_lv_Write 180121604, ni_tagger_lv_Read -1950679023. I've tried to manually deploy the shared variable library without success. The NI Knowledge Base and other entries on this forum suggested that the Windows firewall should be disabled for a quick fix; however, the firewall was already disabled on my laptop. I didn't see any entries regarding the 180121604 error code.
    Any suggestions would be appreciated.

    The problem has mysteriously gone away.  For about the 10th time, I manually deployed the shared variable library, and it started to work properly.

  • 11g (11.1.1.4) ADF and OPA

    Hi,
    I am using JDeveloper/WLS 11g (11.1.1.4). I have also downloaded and installed the latest OPA modeling and server components (Oracle Policy Automation 10.2.0 modeling and runtime).
    I then went ahead and installed the determinations-server.war in 11g (embedded in JDeveloper) after modifying the ‘application.properties’ to include the following:
    load.rulebase.from.classpath=true
    rulebase.path=rulebases
    I was able to test the http://localhost:7101/web-determinations/ URL.
    I was also able to create a basic rule from the modeler and run it against "Oracle Web Determinations" from the tool (instead of "Oracle Determinations Server").
    My questions are as follows.
    Q1. Deployment of Rules with Fusion: I would like my business users to be able to create rules and test them against a test WLS server, preferably the WebLogic server that hosts my web-determinations. In this regard, I would like to know how I can configure the OPM tool to deploy/test against WebLogic. In other words, should I set specific values in Tools->Options->Rule based development->Embedded Server? The only settings I see are for Tomcat, but I am assuming I can put something similar for WebLogic. Any sample will be appreciated.
    Note that I do understand the issue about the rulebase.path directory. In other words, I can either
    a) explode-deploy the determinations-server.war, or
    b) add a directory that contains my rules to the WebLogic classpath, or
    c) use something like -Ddeterminations.server.rulebase.dir=/some/other/dir/rulebases in my startWeblogic.cmd
    Q2. OPA with ADF Example: Can anyone point me to a working example of OPA that has been integrated with ADF 11g? In other words, I am looking for an end-to-end scenario where I can develop my business rules in OPA and use them in an ADF application. It would be better if the example uses some values of the business rules that reside in a database.
    Q3. OPA versus OBR: Can someone articulate lucidly the use cases where someone would choose OBR (Business Rules in ADF) versus OPA (the Haley product), and the implications for ADF development?
    Thanks very much,

    1. In the current version, OPM doesn't directly deploy to anything other than local Tomcat. You'll need to manually refresh your WebLogic environment with the latest rulebase (a sketch of this follows at the end of this reply).
    2. I'm not aware of an ADF sample. I assume your goal is to host Web Determinations interviews in an ADF UI?
    3. You should use OPA instead of OBR if you need any of the following:
    a) Rules in natural language in any of the supported OPA languages (English, Chinese, French, Spanish, Portuguese, etc....)
    b) Rules in Word/Excel document format that can be easily shared with business users
    c) Interactive interviews for guided decision making, where questions can be automatically generated from rules, and screens are shown only as needed to make progress towards an answer
    d) Detailed decision reports of which rules and data were used to reach a decision
    e) Productized integration with Siebel or SAP
    f) Rich test case development and regression testing capabilities
    More generally, OPA is very good at determinations, i.e. reaching the one right decision given a set of input data. It is not a good fit for optimization problems where the goal is to seek the best outcome given a series of constraints. It is also good at taking a large set of law/policy documents and managing them as interconnected rules. It is possible to efficiently manage thousands of pages of material as an OPM project.
    For ADF development, anywhere you have a process that involves a complex decision step, or a calculation that depends on complicated rules that may change frequently or need good business visibility, OPA is a good choice.
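    (Re point 1: a minimal sketch of that manual refresh, assuming the rulebases-directory approach from options (b)/(c) in the question; all paths here are hypothetical:)

    # Sketch: copy the freshly compiled rulebase into the directory the server
    # reads from (the rulebase.path / determinations.server.rulebase.dir setting).
    import shutil

    SRC = r"C:\projects\MyRules\output\MyRules.zip"  # OPM build output (made up)
    DST = r"D:\weblogic\rulebases\MyRules.zip"       # server rulebase directory (made up)
    shutil.copy2(SRC, DST)
    print("rulebase refreshed; redeploy or restart if the server caches rulebases")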
    Davin.
