Multiple independent instances of the same task flow

Hello,
How do I deploy multiple instances of the same task flow (in panelWindow), but with total independence at the BC level?
The idea is, for example, to have several containers (N panelWindows) so that performing N clicks on a menu loads the containers with the same task flow (same definition, different instances). But if you perform an action in one of them (create, delete, search, etc.), the others remain unchanged.
The task flow should keep every VO it uses independent (Master-Detail,
Master-Detail1 and Master-Detail2, or even Master1-Detail1 and Master2-Detail2, etc.).
I think the framework is not intended for this, but out of the blue a user requests something like this :-).
Greetings.

You may configure the task flows to use isolated DataControl frames. In this way each task-flow instance will have its own set of DataControl instances (and AM instances as a consequence) and the separate TF instances will not share their data.
You can set that in the TF definition: use the Property Inspector, go to the section "Behavior" and uncheck the checkbox "Share data controls with calling task flow".
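For reference, a minimal sketch of what the task-flow definition XML contains once the checkbox is unchecked (the id value is illustrative; compare with the <shared/> variant quoted later in this thread):

    <data-control-scope id="__2">
      <isolated/>
    </data-control-scope>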
Dimitar

Similar Messages

  • Custom ADF Application with Process Instance Details Task Flow

    Hi,
I want to use the Process Instance Details task flow referred to in the doc below.
http://download.oracle.com/docs/cd/E21764_01/user.1111/e15175/bpmug_ws_taskflows.htm#BACDBDJA
But I could not find the task flow in the JAR files or in the libraries described in the doc (actually, I could not find one of the JARs itself, adflibWorkspaceTaskFlows.jar).
Does anyone know where I can find it and how to use it?
It would be more convenient if I could create a custom ADF application which contains information on process instance details using a Java API for BPM.
So I'd like to also ask whether such an API is available.
    Regards,
    Kenji

    +1
I also can't find adflibWorkspaceTaskFlows.jar in the directory specified in the [BPM Users Guide|http://download.oracle.com/docs/cd/E17904_01/user.1111/e15175/bpmug_ws_taskflows.htm#BACBBDCE]: "$FMW_HOME/AS11gR1SOA/soa/modules/oracle.soa.worklist_11.1.1"
When I dig into OracleBPMWorkspace.ear from $SOA_HOME\soa\applications, it has OracleBPMWorkspace.war, which contains adflibWorkspaceTaskFlows.jar in its WEB-INF\lib directory, but that JAR does not contain "processApplicationsTaskflow.xml" or "processInstancesTaskflow.xml". Those task flows are present in the WEB-INF of OracleBPMWorkspace.war itself, which is not deployed as an ADF library.
Please advise,
    Rommel Pino
    http://soadev.blogspot.com

  • OracleJSP error while integrating BPM Instance Details Task Flow

    Guys,
I'm using JDeveloper and BPM 11.1.1.6.
I have a requirement to show the BPM 'Process Instance Details' (which contains 'Details', 'Open Activities', 'Audit Trail', etc. for a BPM instance).
I followed this link to implement it:
    http://soadev.blogspot.com/2011/07/adf-uishell-application-with-oracle-bpm.html
    Everything was working fine.
But when I click on 'Graphical View' in 'Audit Trail', I don't get the graphical view. Instead, I get the following error message:
    OracleJSP error: oracle.jsp.parse.JavaCodeException: Line # 1, oracle.jsp.parse.JspParseTagScriptlet@16b871b3
    Error: Java code in jsp source files is not allowed in ojsp.next mode.
Any pointers to solve this problem?
    Thanks in advance
    Dev

Did you check these posts?
    https://forums.oracle.com/thread/993257
    https://forums.oracle.com/thread/1097866
    http://middiu.blogspot.com/2012/02/oracle-webcenter-spaces-and-webxml.html

• Single ADF task flow portlet with multiple pages and a parameter from the URL

    Hi ,
I have built an ADF task flow portlet with a parameter.
I have built a portal application and added multiple pages; in each page I am consuming that portlet through WSRP2.
I have mapped the task flow parameter in the page bindings of the pages with #{param.code}, where code is the GET parameter.
I have edited the navigational template so that when I click the pages, the code parameter also gets added to the URL.
Issue:
When I run the application and visit the first page with the GET parameter, I get the right result. But when I click on another page which has some other value for the same GET parameter, it doesn't display the value. The other page shows me the correct value if I go to it first, but then the later page doesn't display any value.
    thanks

1001446 wrote:
> I have edited the navigational template so that when I click the pages, the code parameter also gets added to the URL.

Can you paste the code from the template here?

  • Application Module instance not found in task flow

    Hi,
I am working with Oracle JDeveloper 11g Release 1.
I have created a bounded task flow with the following properties:
<transaction id="__14">
  <new-transaction/>
</transaction>
<data-control-scope id="__15">
  <shared/>
</data-control-scope>
<task-flow-reentry id="__13">
  <reentry-not-allowed/>
</task-flow-reentry>
I have declared a page-flow-scope bean in the task flow as:
<managed-bean id="__5">
  <managed-bean-name id="__6">trainBean</managed-bean-name>
  <managed-bean-class id="__8">oracle.sysman.core.gccompliance.view.library.rule.patchRule.PatchRuleTrainBean</managed-bean-class>
  <managed-bean-scope id="__7">pageFlow</managed-bean-scope>
</managed-bean>
But when I try to access the AM Impl instance from a bean method as below:
    public static final String DATA_CONTROLLER = "ComplianceLibraryAMDataControl";

    public ComplianceLibraryAMImpl getDataControl() {
        DCBindingContainer bc =
            (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
        ApplicationModule am = bc.findDataControl(DATA_CONTROLLER).getApplicationModule();
        return (ComplianceLibraryAMImpl) am;
    }
I am getting a NullPointerException at the line: ApplicationModule am = bc.findDataControl(DATA_CONTROLLER).getApplicationModule();
I am using the same way to get the bean in other task flows as well, so I think the code to get the bean is working correctly.
What am I missing?

Thanks Puthanampatti,
This is working and I am able to get the AM instance from this, but I am not sure what the difference is between:
    public ComplianceLibraryAMImpl getComplianceLibraryAM() {
        ComplianceLibraryAMImpl am =
            (ComplianceLibraryAMImpl) ADFUtils.getApplicationModuleForDataControl("ComplianceLibraryAMDataControl");
        return am;
    }
Referred methods:
    /**
     * Get application module for an application module data control by name.
     * @param name application module data control name
     * @return ApplicationModule
     */
    public static ApplicationModule getApplicationModuleForDataControl(String name) {
        return (ApplicationModule) JSFUtils.resolveExpression("#{data." + name + ".dataProvider}");
    }

    /**
     * Method for taking a reference to a JSF binding expression and returning
     * the matching object (or creating it).
     * @param expression EL expression
     * @return Managed object
     */
    public static Object resolveExpression(String expression) {
        FacesContext facesContext = getFacesContext();
        Application app = facesContext.getApplication();
        ExpressionFactory elFactory = app.getExpressionFactory();
        ELContext elContext = facesContext.getELContext();
        ValueExpression valueExp =
            elFactory.createValueExpression(elContext, expression, Object.class);
        return valueExp.getValue(elContext);
    }
And my previous approach:
    public ComplianceLibraryAMImpl getDataControl() {
        DCBindingContainer bc =
            (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
        ApplicationModule am = bc.findDataControl(DATA_CONTROLLER).getApplicationModule();
        return (ComplianceLibraryAMImpl) am;
    }
Why am I not able to find my data control's AM from the Binding Container?
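A note on the difference, as far as I understand it: bc.findDataControl(name) only searches the data controls referenced by the current binding container (the current PageDef) and returns null when that PageDef has no binding against the data control, whereas #{data.<name>.dataProvider} resolves through the whole binding context and instantiates the data control on demand. A sketch combining both approaches, reusing ADFUtils and DATA_CONTROLLER as defined earlier in this thread:

    public ComplianceLibraryAMImpl getDataControl() {
        DCBindingContainer bc =
            (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
        // Returns null when the current PageDef does not reference the data control.
        DCDataControl dc = bc.findDataControl(DATA_CONTROLLER);
        if (dc != null) {
            return (ComplianceLibraryAMImpl) dc.getApplicationModule();
        }
        // Fall back to the binding context, which creates the data control on
        // demand - effectively what #{data.<name>.dataProvider} does.
        return (ComplianceLibraryAMImpl) ADFUtils.getApplicationModuleForDataControl(DATA_CONTROLLER);
    }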

  • Multiple task flows on same page

    Surely appreciate any help on this:
    I am dropping the same bounded task flow as a region twice on the same jspx.
    The region ids are different.
    The task flow has a single fragment which is bound to a page definition.
    The page def exposes a web service data control consuming two web methods: a fetch and a corresponding merge.
    I am setting the data control scope to isolated.
    My desire is to have two autonomous copies of the same bindings, one for each task flow.
    Instead, setting the scope to isolated causes the second fragment to bind to nothing.
    Setting the scope back to shared causes both fragments to access the same bindings.
    Any help?
    Thanks and Best regards,
    Josh

Actually, let me correct this post, since the above is a workaround and not the real issue.
The parent page publishes a contextual event that is subscribed to by the bounded task flows running in the regions.
    These are two instances of the same task flow.
    It appears that once the first task flow consumes this event, it is removed from the event queue, so it never reaches the second task flow.
    This appears to be a bug in the ADF framework. I suspect it may be extending a limitation of the underlying JSF controller.
    Please respond if you may have a resolution.
    Thank you and Regards,
    Josh

  • Single task flow with multiple tabs not working properly in UIShell

    Hi,
I am using the UIShell dynamic tabs concept. I am creating a task flow with a fragment which I bind to a backing bean. If I try to open the same task flow in two tabs, the getters of the backing bean are called for both tabs when I open the second tab.
Please provide a solution, or explain why the backing bean getter fires for the first tab even though I opened the second tab.
    Reg,
    Brahma B.

    There are a few problems that I have noticed so far.
Line #1 <!doctype html5> is not recognised, hence the page will revert to HTML 4.01 Transitional. It should read <!doctype html>
    Line #51 (links to) style rules should be placed in the <head> element, not in the <body> element
    Lines #124, #159 and #188 all link to the same JS file. Delete two of them.
    Lines #126, #161 and #207 contain a constructor for the same widget. The former two should be deleted.
    Lines #190 and #208 contain a constructor for the same widget. The former should be deleted
    Although JS can be placed anywhere in a document, I tend to place all JS at the bottom, just above the ending body tag (</body>) unless the JS is required for rendering purposes in which case it should be placed where it is required. This way, you can keep check of what you have linked to and which constructors you have created thus eliminating the problems mentioned above.
Also you might like to have a look at a collapsible panel group: http://labs.adobe.com/technologies/spry/samples/collapsiblepanel/CollapsiblePanelGroupSample.html. This will simplify the code somewhat.
    Gramps

  • Using ADS in UI Shell with multiple task flows (static and dynamic)

    Greetings,
    I am developing an application in UI Shell and ADS. Following is description of the Application.
In the ThirdPartyComponentArea of the UI Shell I have a task flow with a page fragment (JSFF), which uses ADS (push mechanism) to get updates from the server.
Now we have added a Task List in the regional area. The list has the following items:
1. One of the items in the Task List opens a dynamic task flow in the local area.
2. Another item in the Task List opens a static task flow in the local area.
The issue is that whenever we click either of the links, the ADS framework used by the JSFF in the ThirdPartyComponentArea stops working (it stops receiving data from ADS, and the stopActiveData method is invoked by the framework).
It seems that the task flow initiating in the local area is causing the ADS framework in the other task flow to end.
I have read through the documentation, the book "Oracle Fusion Developer Guide - Building Rich Internet Applications with Oracle ADF Business Components and Oracle ADF Faces", and other examples/blogs.
From my understanding it seems that the ActiveDataManager and EventManager are stopping in this use-case.
From my understanding, opening the task flow in the local area does not change any properties or the HTML view of the task flow in the ThirdPartyComponentArea. Why, then, is such behaviour observed when using ADS in such cases?
Are such use-cases supported by ADS? From my understanding of these documents, it seems they are.
If these use-cases are supported but require some tweaking/workaround in the current flow/logic, then please also guide me on that.
Please let me know if these are not supported by ADS.
    Thanks,
    PShah

    Hi,
sounds like a product issue to me that you should file with support, providing them a simple test case for analysis
    Frank

• How to use a no-DB-connection app with a task flow defined as 'New Transaction'

    Hi,
My application fully depends on a custom Java datasource implementation and requires no DB connection at all. I've done the necessary implementation (http://andrejusb.blogspot.in/2012/03/use-case-for-adf-bc-with-no-database.html) by creating a CustomDatabaseConnectionStrategy that says:
    @Override
    public ApplicationModule createApplicationModule(Hashtable env) {
        env.put(Configuration.DB_REQUIRES_CONNECTION, Boolean.FALSE);
        env.put(PropertyMetadata.ENV_DO_FAILOVER.pName, PropertyConstants.FALSE);
        return super.createApplicationModule(env);
    }
My application works fine as long as I keep my task-flow transaction as 'No Controller Transaction'. Yes, I use the dynamic tab shell template as the UI. But when I set it to 'Always Begin New Transaction' and do not share the data control, the screen is not rendered at all and I get an exception. (I guess that the moment I try to open the task flow, it tries to get a connection; since there is no connection available, it gives me this error. Is that so?)
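For completeness, the custom strategy class is registered against the Application Module configuration in bc4j.xcfg, typically via the jbo.ampool.connectionstrategyclass property (the names below are illustrative):

    <AppModuleConfig DeployPlatform="LOCAL" name="MyServiceLocal"
                     ApplicationName="my.app.model.service.MyService">
      <AM-Pooling jbo.ampool.connectionstrategyclass="my.app.model.service.CustomDatabaseConnectionStrategy"/>
    </AppModuleConfig>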
One of my client's requirements is to open the same screen under multiple tabs as a fresh screen (i.e. with a different transaction; we do transactions through a Tuxedo transaction server). If I load a screen under the first tab with some data, add some records, and delete some records, and then load the same screen under another tab, it should not reflect the data I have under the first tab.
    Requirement:
I don't have a DB connection in my app, but I should be able to define the task flow to open with a new instance of the Application Module whenever it is opened.
    Your help on this would be appreciated.
    Raghu

If I understand your requirements correctly, you do not need ADF task-flow transactions; you just need fresh DataControls. If so, you do not have to set the task flow's transaction behaviour to "Always Begin New Transaction" (i.e. keep it at "No Controller Transaction"), but set the DataControl frame to "isolated" (i.e. uncheck the "Share data controls with the calling task flow" checkbox in the task flow's Property Inspector). In this way each task-flow instance will be started in a different DataControl frame (i.e. it will instantiate its own set of ApplicationModule instances).
    Dimitar

• How to return the name (or ID) of the Task Flow in a script

Situation: two task flows created which can be accessed via Tools > TaskFlows within FDQM
    Task Flow "1.1 Multi Load - Import" --> Should run Batch Process Up to Import (enmBatchProcessLevel: 2)
    Task Flow "2.1 Multi Load - Import Up To Validate" --> Should run Batch Process Up to Validate (enmBatchProcessLevel: 4)
I have developed one generic script which I would like to use for each task flow.
Only the enmBatchProcessLevel differs between the task flows, and therefore I would like to pass this enmBatchProcessLevel as a parameter to my generic script.
To be able to do this, the script needs to know which task flow a user has clicked. So I am looking for a function or statement which returns the name (or ID) of the task flow. Based on this name (or ID), a conditional statement can be performed in which a variable is dynamically filled. This variable can then be passed as a parameter to my generic script.
    For instance:
    Sub GenericRoutine
         Dim strTaskFlow
         Dim intBatchProcessLevel
         '--Get the Task Flow Name
         strTaskFlow = ......<How to return the TaskFlow name or ID?>
         '--Validate the task flow and fill variable intBatchProcessLevel dynamically
         Select Case strTaskFlow
              Case "1.1 Multi Load - Import"
                   intBatchProcessLevel = 2
              Case "2.1 Multi Load - Import Up To Validate"
                   intBatchProcessLevel = 4
         End Select
         '--Execute generic script
     '--Call the batch script and pass intBatchProcessLevel as a parameter:
         Call sBatchProcess(intBatchProcessLevel)
         '--Execute generic script
    End Sub
    Sub sBatchProcess(Byval intBatchProcessLevel)
         Dim lngProcessLevel
         Dim strDelimiter
         Dim blnAutoMapCorrect
         '--Use intBatchProcessLevel to fill lngProcessLevel
         lngProcessLevel = intBatchProcessLevel
         strDelimiter = "_"
         blnAutoMapCorrect = 0
         Set BATCHENG.PcolFiles = BATCHENG.fFileCollectionCreate(CStr(strDelimiter))
         BATCHENG.mFileCollectionProcess BATCHENG.PcolFiles, CLng(lngProcessLevel), , CBool(blnAutoMapCorrect)
    End Sub

    Hi, thanks for your reply.
The generic script contains 600+ records, which I would like to maintain in one place when having multiple task flows for Import, UpToValidate, ValidateOnly, UpToExport, ExportOnly, etc.
Is there a central storage for scripts in the FDQM Workbench, like a "Module" in the Excel Visual Basic environment? Thanks!

  • Af:table filter date format : task-flow navigation issue

    hi
    When trying to use the date format configured on the Entity Object, with Format Type as Simple Date and Format as "dd-MM-yyyy", there seems to be a problem when using task-flows.
The approach involves an explicitly configured attributeValues binding to use in f:validator and af:convertDateTime components in the af:inputDate in the filter facet, as discussed in the forum thread "af:table filter date format".
    I used JDeveloper 11.1.1.3.0 to create the example application
    in http://www.consideringred.com/files/oracle/2010/TableFilterDateFormatIssueApp-v0.03.zip
    - The page filterEmp.jspx shows expected behaviour, the filter uses the configured date format and there is no problem when navigating to another page and back.
    see the screencast at http://screencast.com/t/CtQ9rsVFH3k
    - The page menuBTFPage.jspx allows for some navigation after using the filter resulting in the filter showing a date in the wrong format, using scenario (sc1)
    -- (sc1-a) : run menuBTFPage.jspx
    -- (sc1-b) : on "menu-btf : menu", click the "do go-filter-emp-btf" link
    -- (sc1-c) : on "filter-emp-btf : filterEmpFragment", filter on HireDate using "10-03-2005"
    -- (sc1-d) : click the "do goReturnSuccess" button
    -- (sc1-e) : back on "menu-btf : menu", click the "do go-filter-emp-btf" link again
    -- (sc1-f) : back on "filter-emp-btf : filterEmpFragment", see the HireDate filter value in the wrong format as "2005-03-10"
    -- (sc1-g) : click the "do goReturnSuccess" button again, which results in an error "The date is not in the correct format."
    see the screencast at http://www.screencast.com/t/ORHauBd3oQ
    questions:
    - (q1) Can the behaviour in scenario (sc1) be reproduced?
    - (q2) Why is the filter value in the wrong date format in step (sc1-f)?
    - (q3) What can be done to have the filter value consistently in the configured date format, so that errors as in step (sc1-g) can be avoided?
    many thanks
    Jan Vervecken

    hi
    First a short summary of relevant aspects of service request 3-2190488381:
    - development has reviewed bug 10193260
    - development identified some code where a pattern was not applied and started fixing the problem
- out of the blue, development asked "Will clearing out the filter field completely when moving out of a taskflow be an acceptable behavior?"
    - I pointed out some concerns (even in a phone call with development), but development did not see any alternative not "perceived to be very risky because of the current design", so the question whether the clearing-all-filter-fields approach would be acceptable became superfluous.
    - following this, bug 10193260 suddenly became an enhancement request (for reasons I still don't understand)
    - a workaround was suggested (for behaviour not perceived as a bug), "Clearing the search fields during taskflow exit in the backing bean (in the app)." for which I also received a modified version of my example application TableFilterDateFormatIssueApp-v0.04.zip with an implementation of the suggested workaround
As an exercise to try and understand the suggested workaround (and because my example application seemed to have been modified using the currently yet-to-be-released JDeveloper 11.1.1.4.0), I re-implemented it in the example application
    at http://www.consideringred.com/files/oracle/2010/TableFilterDateFormatIssueApp-v0.05.zip
    It has a filter-emp-workaround-btf task-flow with a method-call activity on a managed-bean method, responsible for clearing the search fields, resulting in behaviour where the error "The date is not in the correct format." does not occur,
    as can be seen in the screencast at http://screencast.com/t/Nq7TkkRQ
  public void clearFilterFields() {
      BindingContainer vBindingContainer =
          BindingContext.getCurrent().getCurrentBindingsEntry();
      DCBindingContainer vDCBindingContainer = (DCBindingContainer) vBindingContainer;
      DCDataControl vDCDataControl = vDCBindingContainer.getDataControl();
      ApplicationModule vApplicationModule = vDCDataControl.getApplicationModule();
      ViewObject vViewObject = vApplicationModule.findViewObject("EmployeesVOVI");
      ViewCriteriaManager vViewCriteriaManager = vViewObject.getViewCriteriaManager();
      vViewCriteriaManager.clearViewCriterias();
      vViewObject.clearCache();
  }
Because the managed-bean method requires access to the ADF Model binding layer to get to the View Object instance used for the filtered table, the method-call activity has a page element configured in DataBindings.cpx referring to the same usageId as the page element for the page fragment showing the filtered table, so that both the method-call and view activity depend on one and the same Binding Container (i.e. PageDef file).
    The method-call activity, responsible for clearing the search fields, would need to be called before each task-flow-return activity.
    As there can be multiple view activities with multiple filtered tables in a bounded task-flow, would that result in multiple method-call activities responsible for clearing search fields (all to be called before each task-flow-return activity)?
    It looks like a more general/generic approach is desirable for the suggested workaround to be feasible.
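One possible generalization (a sketch only, same imports as clearFilterFields() above): instead of hard-coding a single view object name, iterate over all view object instances of the application module and clear the view criteria of each, so that one method-call activity can serve any number of filtered tables in the bounded task-flow:

  public void clearAllFilterFields() {
      DCBindingContainer vDCBindingContainer =
          (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
      ApplicationModule vApplicationModule =
          vDCBindingContainer.getDataControl().getApplicationModule();
      // clear the view criteria and row cache of every view object instance
      for (String vName : vApplicationModule.getViewObjectNames()) {
          ViewObject vViewObject = vApplicationModule.findViewObject(vName);
          vViewObject.getViewCriteriaManager().clearViewCriterias();
          vViewObject.clearCache();
      }
  }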
    - (q5) Does the suggested workaround imply (as bug 10193260 is not a bug) that all bounded task-flows with filtered tables should implement it to avoid errors about formatting?
    thanks
    Jan

  • The subtle use of task flow "No Controller Transaction" behavior

    I'm trying to tease out some subtle points about the Task Flow transactional behavior option "<No Controller Transaction>".
OTN members familiar with task flows in JDev 11g onwards will know that task flows support options for transactions and data control scope. Some scenarios based on these options:
a) When we pick options such as "Use Existing Transaction" and shared data control scope, the called Bounded Task Flow (BTF) will join the Data Control Frame of its caller. A commit by the child does essentially nothing, a rollback of the child rolls any data changes back to the point when the child BTF was called (i.e. an implicit savepoint), while a commit of the parent commits any changes in both the child and parent, and a rollback of the parent loses the changes of both the child and parent.
A key point to realize about this scenario is that the shared data control scope gives both the caller and the called BTF the possibility to share a db connection from the connection pool. However, this is dependent on the configuration of the underlying services layer. If ADF BC Application Modules (AMs) are used and they use separate JNDI datasources, this won't happen.
    b) When we pick options such as "Always Begin New Transaction" and isolated data control scope, the called BTF essentially has its own Data Control Frame separate to that of the caller. A commit or rollback in either the parent/caller or child/called BTF are essentially isolated, or in other words separate transactions.
    Similar to the last point but the exact opposite, regardless how the underlying business services are configured, even if ADF BC AMs are used with the same JNDI data source, essentially separate database connections will be taken out assisting the isolated transactional behavior with the database.
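For readers mapping these scenarios back to the task-flow definition XML, the relevant elements would look something like the following (the scenario (b) elements appear verbatim elsewhere in this thread; the <requires-existing-transaction/> element for scenario (a) is quoted from memory, so verify it against the adfc schema):

    <!-- scenario (a) -->
    <transaction>
      <requires-existing-transaction/>
    </transaction>
    <data-control-scope>
      <shared/>
    </data-control-scope>

    <!-- scenario (b) -->
    <transaction>
      <new-transaction/>
    </transaction>
    <data-control-scope>
      <isolated/>
    </data-control-scope>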
    This brings me back to my question, of the subtle behavior of the <No Controller Transaction> option. Section 16.4.1 of the Fusion Guide (http://download.oracle.com/docs/cd/E17904_01/web.1111/b31974/taskflows_parameters.htm#CIHIDJBJ) says that when this option is set that "A new data control frame is created without an open transaction." So you could argue this mode is the same as isolated data control scope, and by implication, separate connections will be taken out by the caller/called BTF. Is this correct?
Doesn't this in turn have implications for database read consistency? If we have one BTF participating in a transaction with the database, reading then writing data, and a separate BTF with the <No Controller Transaction> option set, it's possible the latter won't see the data of the first BTF unless it is committed before the No Controller Transaction BTF is called and queries its own dataset, correct?
    An alternative question which takes a different point of view, is why would you ever want this option, don't the other options cover all the scenarios you could possibly want to use a BTF?
    Finally as a separate question based around the same option, presumably an attempt to commit/rollback the Data Control Frame of the associated No Controller Transaction BTF will fail. However what happens if the said BTF attempts to call the Data Control's (not the Data Control Frame's) commit & rollback options? Presumably this will succeed?
    Your thoughts and assistance appreciated.
    Regards,
    CM.

    For other readers this reply is a continuation of this thread and another thread: Re: Clarification?: Frank & Lynn's book - task flow "shared" data control scope
    Hi Frank
    Thanks for your reply. Okay I get the idea that were setting the ADFc options here, that can be overridden by the implementation of data control, and in my specific case that's the ADF BC AM implementation. I've always known that, but the issue became complicated because it didn't make sense what "No Controller Transaction" actually did and when you should use it, and in turn data control frames and their implementation aren't well documented.
I think a key point from your summation is that "No Controller Transaction" in the context of ADF BC, with either data control scope option selected, is effectively (as far as we can tell) already covered by the other options. So if our understanding is correct, the recommendation for ADF BC programmers is, I think: don't use this option, as future programmers/maintainers won't understand the subtlety.
    However as you say for users of other data controls, such as those using web services, then it makes sense and possibly should be the only option?
    Also regarding your code harvest pg 14 entry on task flow transactions: http://www.oracle.com/technetwork/developer-tools/adf/learnmore/march2011-otn-harvest-351896.pdf
    ....and the following quote in context of setting the transaction option to Begin New Transaction:
    >
    When a bounded task flow creates a new transaction, does it also mean it creates a new database connection? No.
    >
....I think you need to be a little more careful in this answer, as again it depends on the underlying data control implementation, as you point out in this thread. Considering ADF BC, this is correct if you assume only one root AM. However, if the BTFs have separate root AMs, this should result in 2 connections and transactions..... well, at least I assume it does, though I wonder what will happen if both AMs share the same JNDI data source.... is the framework smart enough to join the connections/transactions in this case?
Also in one of your other code harvests (apologies, I can't find which one at the moment) you point out that sharing data control scopes is only possible if the BTF data controls have the same name. In the context of an ADF BC application with only one root AM used by multiple BTFs, this would of course be the case. Yet the obvious implication for your summary of transaction outcomes in this thread is that if the developers for whatever reason change the DC name across the DataBindings.cpx files sourced from the ADF Libraries containing the BTFs, then no, it won't.
    Overall the number of variables in this gets really complicated, creating multiple dimensions to the matrix.
    Going to your last point, how can the documentation be improved? I think as you say the documentation is right in context of the options for ADFc, but, as the same documentation is included in the Fusion Dev Guide which assumes ADF BC is being used, then it isn't clear enough and can be misleading. It would seem to me, that depending on the underlying data control technology used, then there needs to be documentation that talks about the effect of ADFc task flow behavior options in the context of each technology. And God knows how you describe a scenario where BTFs use DCs that span technologies.
From the context of ADF BC, one thing I've found hard in analyzing all of this is that there doesn't seem to be an easy way from the middle tier to check how many connections are being taken out from a data source. Unfortunately, when sampling db connections taken out from a JNDI data source pool, FMW Control doesn't sample quickly enough to see how many were consumed. Are you aware of an easy method to check the number of db connections opened/closed?
Finally, considering an Unbounded Task Flow as separate from BTFs, do you have any conclusions about how it participates in the transactions? From what I can determine, the UTF lives in its own data control frame and is effectively isolated from the BTF transactions, unless the BTF calls commit/rollback at the ADF BC data control level (as distinct from commit/rollback at the data control frame level) and the data control is used by both the UTF and the BTF.
    As always thanks for your time and assistance.
    CM.

  • How to use a menu model with a dynamic region and a task flow parameter

    I am using JDeveloper/ADF 11.1.2.1
I have a menu model that changes which task flow is displayed in a given dynamic region using a backing bean. That works fine. I would like to be able to pass parameters to that task flow based on which menu item is clicked. For example: I have a task flow which shows a page where input fields are used to filter a table. Depending on the value of the task flow parameter, I want to change which input fields are displayed. So I will have multiple menu items which refer to the same task flow but have a different set of parameters. I have tried using request-scope variables and setting them in the backing bean for the dynamic region, which works until the query is submitted, at which point the request scope has changed and the value is no longer available. I have tried a number of other 'creative' approaches but have not gotten anything to work. Has anyone done this before? Or does anyone have an idea how to solve it?

    Frank,
I did a fair bit of digging based on your suggestions and some things I found in your Oracle Fusion Developer's Guide book, and I came up with something that works really well. It is fairly elegant but requires code. It would be nice if something like a setPropertyListener could be rolled into the menu model; that would make my solution completely declarative.
    Here is my solution:
My task flow requires the value #{pageFlowScope.type} to be set. My application has a dynamic region that is changed on the fly using a menu model. The region uses a backing bean (mainRegionManagerBean) in view scope to manage which task flow is shown in the region. There are multiple menu items in the menu model that point to the same task flow but pass different values to the #{pageFlowScope.type} parameter. So I wired the menu items up to different methods in mainRegionManagerBean which set the value for me. See the relevant code below.
    I would be very interested in the feedback from someone with more experience than I on my solution. Maybe there is a more elegant way...
In the backing bean there is a primary method, created by generating a dynamic region link, which sets the task flow id, and then other methods which call it and set the relevant parameters (JSFUtils is a helper class I wrote to centralize some common tasks):
    public String shipmentTraceMasterTaskflow() {
        taskFlowId = "/WEB-INF/taskflow/master/shipmentTraceMasterTaskflow.xml#shipmentTraceMasterTaskflow";
        JSFUtils.setValue("pageFlowScope.type", "");
        return null;
    }

    public String shipmentTraceProNumber() {
        shipmentTraceMasterTaskflow();
        JSFUtils.setValue("pageFlowScope.type", "pronumber");
        return null;
    }

    public String shipmentTraceBOLNumber() {
        shipmentTraceMasterTaskflow();
        JSFUtils.setValue("pageFlowScope.type", "bolnumber");
        return null;
    }
    In the menu model (notice that these reference the different methods from above):
    <itemNode id="itemNode_ProNumberTrace" label="ProNumber Trace" action="#{viewScope.mainRegionManagerBean.shipmentTraceProNumber}" focusViewId=""/>
    <itemNode id="itemNode_BOLNumberTrace" label="BOL Number Trace" action="#{viewScope.mainRegionManagerBean.shipmentTraceBOLNumber}" focusViewId=""/>
    On the page:
    <af:region value="#{bindings.dynamicRegion1.regionModel}" id="r1"/>
    In the pagedef:
    <taskFlow id="dynamicRegion1" taskFlowId="${viewScope.mainRegionManagerBean.dynamicTaskFlowId}" activation="deferred" xmlns="http://xmlns.oracle.com/adf/controller/binding" Refresh="ifNeeded">
    <parameters>
    <parameter id="type" value="#{pageFlowScope.type}"/>
    </parameters>
    </taskFlow>
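(JSFUtils above is the poster's own helper class; a minimal sketch of what its setValue method might look like, assuming it simply wraps the standard EL API:)

    public static void setValue(String expression, Object value) {
        FacesContext fc = FacesContext.getCurrentInstance();
        ELContext elContext = fc.getELContext();
        ExpressionFactory factory = fc.getApplication().getExpressionFactory();
        // wrap the scope-qualified name in an EL value expression and set it
        ValueExpression ve =
            factory.createValueExpression(elContext, "#{" + expression + "}", Object.class);
        ve.setValue(elContext, value);
    }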

  • How to share the same Database Connection when using several Task Flows ?

    Hi All,
    I’m using JDev 11.1.1.3.0.
    I’m developing ADF Fusion Applications (ABC BC, ADF Faces…)
    These applications are deployed on a Weblogic server.
    Each application has only one Application Module.
    All Application Modules have the same connection type defined: JDBC DataSource : jdbc/GCCDS
    It is working fine.
    I’ve also developed Task Flow Applications for small thinks that are reused in multiple main applications.
    Each Task Flow Application has also one Application Module with the same connections type as main applications.
    All these task flows are deployed to JAR file (ADF Library JAR File) and are reused on my main applications. (drag and drop from the Resource Palette to ADF Regions….).
    There are some parameters passed to Task Flows, so that they can filter data depending on which main applications they are called from.
    Everything is working perfectly.
    All my main applications are using more and more task flows. Which is nice for the reusability etc…?
    Only ONE PROBLEM: DATABASE CONNECTIONS.
    Every Task Flows service made a database connection. So one user may have 10 database connections for the same adf page. And when there are 100 users that are working at the same time, it becomes a problem.
    How to share the same database connections for the main applications and all task flows which are used in the main application?
    Best Regards
    Nicolas

    Hi John,
When I open the ADF Library JAR file of one of my task flows (gcc_tf_recentSites.jar),
I can see TF_RecentSitesService.xml and TF_RecentSitesServiceImpl.class in the gcc_tf_recentSites.jar\mu\gcc\tf\recentSites\model\service folder,
plus bc4j.xcfg in the gcc_tf_recentSites.jar\mu\gcc\tf\recentSites\model\service\common folder.
The bc4j.xcfg contents are:
<?xml version = '1.0' encoding = 'UTF-8'?>
<BC4JConfig version="11.1" xmlns="http://xmlns.oracle.com/bc4j/configuration">
  <AppModuleConfigBag ApplicationName="mu.gcc.tf.recentSites.model.service.TF_RecentSitesService">
    <AppModuleConfig DeployPlatform="LOCAL" jbo.project="mu.gcc.tf.recentSites.model.TF_RecentSites_Model" name="TF_RecentSitesServiceLocal" ApplicationName="mu.gcc.tf.recentSites.model.service.TF_RecentSitesService">
      <Security AppModuleJndiName="mu.gcc.tf.recentSites.model.service.TF_RecentSitesService"/>
      <Custom JDBCDataSource="jdbc/GCCDS"/>
    </AppModuleConfig>
    <AppModuleConfig name="TF_RecentSitesServiceShared" ApplicationName="mu.gcc.tf.recentSites.model.service.TF_RecentSitesService" DeployPlatform="LOCAL" JDBCName="gccdev" jbo.project="mu.gcc.tf.recentSites.model.TF_RecentSites_Model">
      <AM-Pooling jbo.ampool.maxpoolsize="1" jbo.ampool.isuseexclusive="false"/>
      <Security AppModuleJndiName="mu.gcc.tf.recentSites.model.service.TF_RecentSitesService"/>
    </AppModuleConfig>
  </AppModuleConfigBag>
</BC4JConfig>
So it seems that the Application Module is packaged with the task flow...
Is that normal?
    Regards
    Nicolas

  • Multiple BW Instance question

I'm getting lost wading through the plethora of information available, so any direct pointers to information that directly addresses the following would be appreciated, in addition to any direct answers of course!
Consider a multiple-BW landscape, with a staging layer, an integration layer, and multiple analytic layers (one large one, and a couple of smaller ones that operate at different service levels).
If we upgrade staging to 3.5, do all the rest have to tag along? (Staging feeds integration and one other; integration gets data from staging and feeds all the rest.)
If we upgrade to 7.0, or whatever they are going to call it, will every system have to upgrade, or are the BW-to-BW connections backward compatible?
The major reason for considering this architecture is the ability to have a couple of target BWs at higher or lower release/service-pack levels as warranted by the specific applications using them (e.g. SEM wanting SPs faster than the uber-BW can test and apply them). However, if we have to keep all systems in sync, then this is NOT a viable option.
    Mark Marty
    EBIS Architect
    Mckesson

    This is based on some work being done by another member of the Terabyte club, and we have similar sized landscapes.
The key concerns are: with a single instance of BW and ~10M rows of raw transactions migrating into the system each night (SD - BO, DD, SOs, MM, COPA - it all adds up), it is not clear to us that BW could handle all the data rationalization, transformation, and dissemination tasks. We have significant legacy transactions and master data that need to be merged with a good portion of the R/3 data.
In our current environment, 90% of this is done outside of BW, using DataStage and a Data Provisioning Area. Even with sending only clean, merged, ready-to-load data to the targets, our load windows are increasing, and our backup window is scary.
This new effort is an attempt to separate key data needs so that certain things that need to be more responsive can be made available earlier. Additionally, this landscape is going to test out and prove or disprove the ability to do this volume of ETL wholly within a BW environment, rather than leveraging other marketplace tools and techniques. Needless to say, this is a hot topic; internally we have a slant towards best-of-breed, and consultatively everyone wants it all to be the SAP suite of tools. Hence, we will build it and see.
Splitting Staging from Integration is not necessary (likely) from a purely technical standpoint; however, it has proven useful at another company, and we are going to explore it and determine the efficacy ourselves.
