Modelling complex real-time workflows in BPEL

I am trying to create a workflow. The scenario is as follows: a user fills in a project initiation form, which goes to a group called submission_group for review; any member of that group may approve or reject it. If it is approved, the form is forwarded to the project manager. The project manager only has to fill in some comments and forward the form to the executive committee; the project manager has no rejection rights on the project. Once the executive committee acts on the form, it may also reject the project. The form then goes to two people in parallel, one of whom has rejection rights.

How do I model this kind of workflow, where some participants have reject rights and others do not? I tried modelling all the human tasks as individual tasks and using the "include task history" functionality, but that throws up lots of compilation problems. Has anybody modelled such a workflow with BPEL? I appreciate your help.
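One pattern for mixed reject rights (a sketch only, with a hypothetical SUBMIT outcome name, not taken from the attached sources): give each stage's task definition only the outcomes its assignees are actually allowed to choose. The stage without reject rights then simply has no REJECT outcome, and the process-level switch for that stage needs no REJECT branch.

```xml
<!-- Hypothetical sketch for the project manager's task definition:
     only a SUBMIT outcome is exposed, so rejection is not selectable
     at this stage. Stages with reject rights keep APPROVE and REJECT. -->
<outcomes xmlns="http://xmlns.oracle.com/bpel/workflow/configuration">
  <outcome>SUBMIT</outcome> <!-- comment and forward only -->
</outcomes>
```

The submission group, the executive committee, and the parallel reviewer who can reject would keep both APPROVE and REJECT in their outcome lists, as in the attached HumanTask1 definition.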

I have successfully built single-approver, sequential-approver, and similar workflows. Here is the sample workflow I created; I am attaching the .bpel source and the sources for HumanTask1 and HumanTask2. HumanTask2 uses "include task history" from HumanTask1. The compilation errors are given at the end of this mail.
.bpel source
<?xml version = "1.0" encoding = "UTF-8" ?>
<!--
Oracle JDeveloper BPEL Designer
Created: Mon Mar 19 16:00:13 SGT 2007
Author: Ankita.Sonal
Purpose: Asynchronous BPEL Process
-->
<process name="BPELProcess8"
targetNamespace="http://xmlns.oracle.com/BPELProcess8"
xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:xp20="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.Xpath20"
xmlns:bpws="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
xmlns:wfcommon="http://xmlns.oracle.com/bpel/workflow/common"
xmlns:ids="http://xmlns.oracle.com/bpel/services/IdentityService/xpath"
xmlns:ldap="http://schemas.oracle.com/xpath/extension/ldap"
xmlns:client="http://xmlns.oracle.com/BPELProcess8"
xmlns:ora="http://schemas.oracle.com/xpath/extension"
xmlns:taskservice="http://xmlns.oracle.com/bpel/workflow/taskService"
xmlns:hwf="http://xmlns.oracle.com/bpel/workflow/xpath"
xmlns:ehdr="http://www.oracle.com/XSL/Transform/java/oracle.tip.esb.server.headers.ESBHeaderFunctions"
xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
xmlns:task="http://xmlns.oracle.com/bpel/workflow/task"
xmlns:orcl="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.ExtFunc">
<!--
PARTNERLINKS
List of services participating in this BPEL process
-->
<partnerLinks>
<!--
The 'client' role represents the requester of this service. It is
used for callback. The location and correlation information associated
with the client role are automatically set using WS-Addressing.
-->
<partnerLink name="client" partnerLinkType="client:BPELProcess8"
myRole="BPELProcess8Provider"
partnerRole="BPELProcess8Requester"/>
<partnerLink myRole="TaskServiceCallbackListener" name="TaskService"
partnerRole="TaskService"
partnerLinkType="taskservice:TaskService"/>
</partnerLinks>
<!--
VARIABLES
List of messages and XML documents used within this BPEL process
-->
<variables>
<!-- Reference to the message passed as input during initiation -->
<variable name="inputVariable"
messageType="client:BPELProcess8RequestMessage"/>
<!-- Reference to the message that will be sent back to the requester during callback -->
<variable name="outputVariable"
messageType="client:BPELProcess8ResponseMessage"/>
<variable name="HumanTask1_1_globalVariable"
messageType="taskservice:taskMessage"/>
</variables>
<!--
ORCHESTRATION LOGIC
Set of activities coordinating the flow of messages across the
services integrated within this business process
-->
<sequence name="main">
<!-- Receive input from requestor. (Note: This maps to operation defined in BPELProcess8.wsdl) -->
<receive name="receiveInput" partnerLink="client"
portType="client:BPELProcess8" operation="initiate"
variable="inputVariable" createInstance="yes"/>
<!--
Asynchronous callback to the requester. (Note: the callback location and correlation id is transparently handled using WS-addressing.)
-->
<scope name="HumanTask1_1"
xmlns:wf="http://schemas.oracle.com/bpel/extension/workflow"
wf:key="HumanTask1_1_globalVariable">
<bpelx:annotation xmlns:bpelx="http://schemas.oracle.com/bpel/extension">
<bpelx:pattern patternName="bpelx:workflow"></bpelx:pattern>
</bpelx:annotation>
<variables>
<variable name="initiateTaskInput"
messageType="taskservice:initiateTaskMessage"/>
<variable name="initiateTaskResponseMessage"
messageType="taskservice:initiateTaskResponseMessage"/>
</variables>
<correlationSets>
<correlationSet name="WorkflowTaskIdCor"
properties="taskservice:taskId"/>
</correlationSets>
<sequence>
<assign name="HumanTask1_1_AssignTaskAttributes">
<copy>
<from expression="concat(ora:getProcessURL(), string('/HumanTask1/HumanTask1.task'))"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:taskDefinitionURI"/>
</copy><copy>
<from expression="number(3)"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:priority"/>
</copy><copy>
<from expression="string('task 1')"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:title"/>
</copy><copy>
<from expression="bpws:getVariableData('inputVariable','payload','/client:BPELProcess8ProcessRequest/client:input')"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:creator"/>
</copy><bpelx:append>
<bpelx:from variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
<bpelx:to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:payload"/>
</bpelx:append><bpelx:append>
<bpelx:from variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
<bpelx:to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:payload"/>
</bpelx:append></assign>
<assign name="HumanTask1_1_AssignSystemTaskAttributes">
<copy>
<from expression="ora:getInstanceId()"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:processInfo/task:instanceId"/>
</copy>
<copy>
<from expression="ora:getProcessId()"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:processInfo/task:processName"/>
</copy>
<copy>
<from expression="ora:getProcessId()"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:processInfo/task:processId"/>
</copy>
<copy>
<from expression="ora:getProcessVersion()"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:processInfo/task:processVersion"/>
</copy>
<copy>
<from expression="ora:getDomainId()"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:processInfo/task:domainId"/>
</copy>
<copy>
<from expression="string('BPEL')"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:processInfo/task:processType"/>
</copy>
</assign>
<invoke name="initiateTask_HumanTask1_1"
partnerLink="TaskService"
portType="taskservice:TaskService"
operation="initiateTask"
inputVariable="initiateTaskInput"
outputVariable="initiateTaskResponseMessage">
<correlations>
<correlation initiate="yes" set="WorkflowTaskIdCor"
pattern="in"/>
</correlations>
</invoke>
<receive name="receiveCompletedTask_HumanTask1_1"
partnerLink="TaskService"
portType="taskservice:TaskServiceCallback"
operation="onTaskCompleted"
variable="HumanTask1_1_globalVariable"
createInstance="no">
<correlations>
<correlation initiate="no" set="WorkflowTaskIdCor"/>
</correlations>
</receive>
</sequence>
</scope>
<switch name="taskSwitch">
<case condition="bpws:getVariableData('HumanTask1_1_globalVariable', 'payload', '/task:task/task:systemAttributes/task:state') = 'COMPLETED' and bpws:getVariableData('HumanTask1_1_globalVariable', 'payload', '/task:task/task:systemAttributes/task:outcome') = 'REJECT'">
<bpelx:annotation>
<bpelx:pattern>Task outcome is REJECT</bpelx:pattern>
</bpelx:annotation>
<sequence>
<assign name="CopyPayloadFromTask">
<copy>
<from variable="HumanTask1_1_globalVariable"
part="payload"
query="/task:task/task:payload/client:BPELProcess8ProcessRequest"/>
<to variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
</copy>
</assign>
</sequence>
</case>
<case condition="bpws:getVariableData('HumanTask1_1_globalVariable', 'payload', '/task:task/task:systemAttributes/task:state') = 'COMPLETED' and bpws:getVariableData('HumanTask1_1_globalVariable', 'payload', '/task:task/task:systemAttributes/task:outcome') = 'APPROVE'">
<bpelx:annotation>
<bpelx:pattern>Task outcome is APPROVE</bpelx:pattern>
</bpelx:annotation>
<sequence>
<assign name="CopyPayloadFromTask">
<copy>
<from variable="HumanTask1_1_globalVariable"
part="payload"
query="/task:task/task:payload/client:BPELProcess8ProcessRequest"/>
<to variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
</copy>
</assign>
</sequence>
</case>
<otherwise>
<bpelx:annotation>
<bpelx:pattern>Task outcome is EXPIRED, STALE, WITHDRAWN or ERRORED</bpelx:pattern>
</bpelx:annotation>
<sequence>
<assign name="CopyPayloadFromTask">
<copy>
<from variable="HumanTask1_1_globalVariable"
part="payload"
query="/task:task/task:payload/client:BPELProcess8ProcessRequest"/>
<to variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
</copy>
</assign>
</sequence>
</otherwise>
</switch>
<scope name="HumanTask2_1"
xmlns:wf="http://schemas.oracle.com/bpel/extension/workflow"
wf:key="HumanTask1_1_globalVariable">
<bpelx:annotation xmlns:bpelx="http://schemas.oracle.com/bpel/extension">
<bpelx:pattern patternName="bpelx:workflow"></bpelx:pattern>
</bpelx:annotation>
<variables>
<variable name="reinitiateTaskInput"
messageType="taskservice:reinitiateTaskMessage"/>
<variable name="reinitiateTaskResponseMessage"
messageType="taskservice:reinitiateTaskResponseMessage"/>
</variables>
<correlationSets>
<correlationSet name="WorkflowTaskIdCor"
properties="taskservice:taskId"/>
</correlationSets>
<sequence>
<assign name="HumanTask2_1_AssignTaskAttributes">
<copy>
<from variable="HumanTask1_1_globalVariable" part="payload"
query="/task:task"/>
<to variable="reinitiateTaskInput" part="payload"
query="/taskservice:reinitiateTask/task:task"/>
</copy><copy>
<from expression="concat(ora:getProcessURL(), string('/HumanTask2/HumanTask2.task'))"/>
<to variable="reinitiateTaskInput" part="payload"
query="/taskservice:reinitiateTask/task:task/task:taskDefinitionURI"/>
</copy><copy>
<from expression="number(3)"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:priority"/>
</copy><copy>
<from expression="string('task2')"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:title"/>
</copy><copy>
<from expression="bpws:getVariableData('inputVariable','payload','/client:BPELProcess8ProcessRequest/client:input')"/>
<to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:creator"/>
</copy><bpelx:append>
<bpelx:from variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
<bpelx:to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:payload"/>
</bpelx:append><bpelx:append>
<bpelx:from variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
<bpelx:to variable="initiateTaskInput" part="payload"
query="/taskservice:initiateTask/task:task/task:payload"/>
</bpelx:append></assign>
<invoke name="reinitiateTask_HumanTask2_1"
partnerLink="TaskService"
portType="taskservice:TaskService"
operation="reinitiateTask"
inputVariable="reinitiateTaskInput"
outputVariable="reinitiateTaskResponseMessage">
<correlations>
<correlation initiate="yes" set="WorkflowTaskIdCor"
pattern="in"/>
</correlations>
</invoke>
<receive name="receiveCompletedTask_HumanTask2_1"
partnerLink="TaskService"
portType="taskservice:TaskServiceCallback"
operation="onTaskCompleted"
variable="HumanTask1_1_globalVariable"
createInstance="no">
<correlations>
<correlation initiate="no" set="WorkflowTaskIdCor"/>
</correlations>
</receive>
</sequence>
</scope>
<switch name="taskSwitch">
<case condition="bpws:getVariableData('HumanTask1_1_globalVariable', 'payload', '/task:task/task:systemAttributes/task:state') = 'COMPLETED' and bpws:getVariableData('HumanTask1_1_globalVariable', 'payload', '/task:task/task:systemAttributes/task:outcome') = 'REJECT'">
<bpelx:annotation>
<bpelx:pattern>Task outcome is REJECT</bpelx:pattern>
</bpelx:annotation>
<sequence>
<assign name="CopyPayloadFromTask">
<copy>
<from variable="HumanTask1_1_globalVariable"
part="payload"
query="/task:task/task:payload/client:BPELProcess8ProcessRequest"/>
<to variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
</copy>
</assign>
</sequence>
</case>
<case condition="bpws:getVariableData('HumanTask1_1_globalVariable', 'payload', '/task:task/task:systemAttributes/task:state') = 'COMPLETED' and bpws:getVariableData('HumanTask1_1_globalVariable', 'payload', '/task:task/task:systemAttributes/task:outcome') = 'APPROVE'">
<bpelx:annotation>
<bpelx:pattern>Task outcome is APPROVE</bpelx:pattern>
</bpelx:annotation>
<sequence>
<assign name="CopyPayloadFromTask">
<copy>
<from variable="HumanTask1_1_globalVariable"
part="payload"
query="/task:task/task:payload/client:BPELProcess8ProcessRequest"/>
<to variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
</copy>
</assign>
</sequence>
</case>
<otherwise>
<bpelx:annotation>
<bpelx:pattern>Task outcome is EXPIRED, STALE, WITHDRAWN or ERRORED</bpelx:pattern>
</bpelx:annotation>
<sequence>
<assign name="CopyPayloadFromTask">
<copy>
<from variable="HumanTask1_1_globalVariable"
part="payload"
query="/task:task/task:payload/client:BPELProcess8ProcessRequest"/>
<to variable="inputVariable" part="payload"
query="/client:BPELProcess8ProcessRequest"/>
</copy>
</assign>
</sequence>
</otherwise>
</switch>
<invoke name="callbackClient" partnerLink="client"
portType="client:BPELProcess8Callback" operation="onResult"
inputVariable="outputVariable"/>
</sequence>
</process>
HumanTask1 source
<?xml version = '1.0' encoding = 'UTF-8'?>
<taskDefinition targetNamespace="http://xmlns.oracle.com/HumanTask1" xmlns:xp20="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.Xpath20" xmlns:ora="http://schemas.oracle.com/xpath/extension" xmlns:orcl="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.ExtFunc" xmlns:task="http://xmlns.oracle.com/bpel/workflow/task" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://xmlns.oracle.com/bpel/workflow/taskDefinition">
<name>HumanTask1</name>
<id>${domain_id}_${process_id}_${process_revision}_HumanTask1</id>
<title>task 1</title>
<priority>3</priority>
<process processId="" processVersion=""/>
<routingSlip xmlns="http://xmlns.oracle.com/bpel/workflow/routingSlip">
<globalConfiguration>
<earlyCompletion>
<outcome>REJECT</outcome>
</earlyCompletion>
</globalConfiguration>
<participants isAdhocRoutingSupported="false">
<sequentialParticipant name="Assignee1">
<resource isGroup="false" type="STATIC">jcooper</resource>
<resource isGroup="false" type="STATIC">jstein</resource>
</sequentialParticipant>
</participants>
<notification includeTaskAttachments="false" actionable="false"
secureNotifications="false">
<action name="ASSIGN" recipient="ASSIGNEES"><![CDATA[concat(string('Task '), /task:task/task:title, string(' requires your attention. Please access the task from the worklist application.'))]]></action>
<action name="COMPLETE" recipient="CREATOR"><![CDATA[concat(string('Task '), /task:task/task:title, string(' requires your attention. Please access the task from the worklist application.'))]]></action>
<action name="ERROR" recipient="OWNER"><![CDATA[concat(string('Task '), /task:task/task:title, string(' requires your attention. Please access the task from the worklist application.'))]]></action>
</notification>
</routingSlip>
<workflowConfiguration xmlns="http://xmlns.oracle.com/bpel/workflow/configuration"
xmlns:ns0="http://xmlns.oracle.com/BPELProcess8">
<outcomes>
<outcome>APPROVE</outcome>
<outcome>REJECT</outcome>
</outcomes>
<restrictedActions/>
<payload xmlSchemaDefinition="HumanTask1_payload.xsd">
<messageAttribute name="BPELProcess8ProcessRequest"
attributeType="ELEMENT"
type="ns0:BPELProcess8ProcessRequest"
updatable="true"/>
</payload>
<bpelEventListener>false</bpelEventListener>
</workflowConfiguration>
</taskDefinition>
HumanTask2 source
<?xml version = '1.0' encoding = 'UTF-8'?>
<taskDefinition targetNamespace="http://xmlns.oracle.com/HumanTask2" xmlns:xp20="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.Xpath20" xmlns:ora="http://schemas.oracle.com/xpath/extension" xmlns:orcl="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.ExtFunc" xmlns:task="http://xmlns.oracle.com/bpel/workflow/task" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://xmlns.oracle.com/bpel/workflow/taskDefinition">
<name>HumanTask2</name>
<id>${domain_id}_${process_id}_${process_revision}_HumanTask2</id>
<title>task2</title>
<priority>3</priority>
<process processId="" processVersion=""/>
<routingSlip xmlns="http://xmlns.oracle.com/bpel/workflow/routingSlip">
<globalConfiguration/>
<participants isAdhocRoutingSupported="false">
<sequentialParticipant name="Assignee1">
<resource isGroup="false" type="STATIC">cdickens</resource>
<resource isGroup="false" type="STATIC">cdoyle</resource>
</sequentialParticipant>
</participants>
<notification includeTaskAttachments="false" actionable="false"
secureNotifications="false">
<action name="ASSIGN" recipient="ASSIGNEES"><![CDATA[concat(string('Task '), /task:task/task:title, string(' requires your attention. Please access the task from the worklist application.'))]]></action>
<action name="COMPLETE" recipient="CREATOR"><![CDATA[concat(string('Task '), /task:task/task:title, string(' requires your attention. Please access the task from the worklist application.'))]]></action>
<action name="ERROR" recipient="OWNER"><![CDATA[concat(string('Task '), /task:task/task:title, string(' requires your attention. Please access the task from the worklist application.'))]]></action>
</notification>
</routingSlip>
<workflowConfiguration xmlns="http://xmlns.oracle.com/bpel/workflow/configuration"
xmlns:ns0="http://xmlns.oracle.com/BPELProcess8">
<outcomes>
<outcome>APPROVE</outcome>
<outcome>REJECT</outcome>
</outcomes>
<restrictedActions/>
<payload xmlSchemaDefinition="HumanTask2_payload.xsd">
<messageAttribute name="BPELProcess8ProcessRequest"
attributeType="ELEMENT"
type="ns0:BPELProcess8ProcessRequest"
updatable="true"/>
</payload>
<bpelEventListener>false</bpelEventListener>
</workflowConfiguration>
</taskDefinition>
compilation errors
Project: D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\BPELProcess8.jpr
D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel
Error(267):
[Error ORABPEL-10014]: unresolved variable
[Description]: in line 267 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", Variable "initiateTaskInput" of the <to> is not defined.
[Potential fix]: make sure the variable "initiateTaskInput" is defined and in the scope of this activity.
Error(267):
[Error ORABPEL-10083]: data spec error 2
[Description]: in line 267 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", part or query attribute is used without the variable being specified.
[Potential fix]: make sure the variable is specified.
Error(271):
[Error ORABPEL-10014]: unresolved variable
[Description]: in line 271 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", Variable "initiateTaskInput" of the <to> is not defined.
[Potential fix]: make sure the variable "initiateTaskInput" is defined and in the scope of this activity.
Error(271):
[Error ORABPEL-10083]: data spec error 2
[Description]: in line 271 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", part or query attribute is used without the variable being specified.
[Potential fix]: make sure the variable is specified.
Error(275):
[Error ORABPEL-10014]: unresolved variable
[Description]: in line 275 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", Variable "initiateTaskInput" of the <to> is not defined.
[Potential fix]: make sure the variable "initiateTaskInput" is defined and in the scope of this activity.
Error(275):
[Error ORABPEL-10083]: data spec error 2
[Description]: in line 275 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", part or query attribute is used without the variable being specified.
[Potential fix]: make sure the variable is specified.
Error(280):
[Error ORABPEL-10014]: unresolved variable
[Description]: in line 280 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", Variable "initiateTaskInput" of the <to> is not defined.
[Potential fix]: make sure the variable "initiateTaskInput" is defined and in the scope of this activity.
Error(280):
[Error ORABPEL-10083]: data spec error 2
[Description]: in line 280 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", part or query attribute is used without the variable being specified.
[Potential fix]: make sure the variable is specified.
Error(285):
[Error ORABPEL-10014]: unresolved variable
[Description]: in line 285 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", Variable "initiateTaskInput" of the <to> is not defined.
[Potential fix]: make sure the variable "initiateTaskInput" is defined and in the scope of this activity.
Error(285):
[Error ORABPEL-10083]: data spec error 2
[Description]: in line 285 of "D:\product\10.1.3.1\OracleAS_1\bpel\samples\references\Catch\Catch\BPELProcess8\bpel\BPELProcess8.bpel", part or query attribute is used without the variable being specified.
[Potential fix]: make sure the variable is specified.
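For what it is worth, all five ORABPEL-10014/10083 pairs point into the HumanTask2_1 scope: several of its copies still write to initiateTaskInput with /taskservice:initiateTask queries, but that variable is declared only inside the HumanTask1_1 scope; HumanTask2_1 declares reinitiateTaskInput. A sketch of one corrected copy (the remaining ones would follow the same pattern):

```xml
<!-- Sketch: inside the HumanTask2_1 scope, write to the variable that this
     scope actually declares (reinitiateTaskInput), with the matching
     reinitiateTask query, instead of the out-of-scope initiateTaskInput. -->
<copy>
  <from expression="string('task2')"/>
  <to variable="reinitiateTaskInput" part="payload"
      query="/taskservice:reinitiateTask/task:task/task:title"/>
</copy>
```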

Similar Messages

  • Any real time complex validations and assignments

    Hi frds,
    Any one can share few real time complex validations and Assignments on product object?
    Thanks in advance
    Regards.

    Rajeev,
    I do not have real time validation exaples. There are couple of blogs on SDN
    search for
    MDM Expression Engine
    (the document name is MDM Express Engine: Validations,Assignments and More)
    Parsing and Validating Numbers
    (used for check digits but gives more insight on writing validations.)
    I hope this helps
    Regards
    Bala Pochareddy

  • Workflow - real time

    Hi,
    Can some one give me common workflow senarious in the real time concerning vendor/customer/material.  2 or 3 senarios on each of vendor/customer/material will help me to have good insight of each specific workflows.
    Thanks

    Hi
    a typical example can be:
    Start-> Validate->Process-> Match-> Merge-> Approve-> Notify->Syndicate->Stop
    This workflow can be used for validating all records in MDM based on the validation rules, editing records which fail validation, finding if the records are duplicates and then approving by the business. Final step is to syndicate approved and clean golden data to a remote system like ECC where transactions are done on Vendor/customer/material master data.
    This way we ensure only approved de duplicated records are passed on to ECC.
    MDM offers a host of such features and supports looping, branching, Grouping of steps which are used for modeling workflows.
    Hope this helps.
    regards
    ravi

  • Calling Oracle Workflow in Real-Time

    I want to modify the logic in Oracle Apps (both Oracle self-service (pure HTML and JSP) and Oracle Forms applications) to make real-time calls to a mainframe system. The page needs to take different actions depending on the data from the mainframe system.
    I want to avoid actually customizing the Apps and Forms pages if possible due to customization maintenance and support issues. I have been told that I can use Oracle Workflow to modify the actions of the pages without having to resort to customizations, as calls to Workflow are already embedded. Is this true, and if so, how would I go about inserting a call to a mainframe system?
    I am considering mainframe calls using either a proprietary ODBC driver from a third-party vendor, or devising some sort of SOA interface to the mainframe (I know it would require custom development on the mainframe side).
    Thanks!

    I'm pretty sure the sbRIO cards only have 1 FPGA on them.  So when you load the second FPGA code, the first is being overwritten.  You need to make a single FPGA VI that can handle both functions.  If you post your code, we can give more detailed advice.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Why can't Real Time Workshop create a model DLL?

    When trying to create a model DLL in Real-time Workshop I get the flowing error:
    Error using ==> RTW.makertw.make_rtw
    Error using ==> tlc_c
    Error using ==> tlc_c>InvokeTLC
    Error File: C:\SimulationInterfaceToolkit\ModelInterface\basic.tlc Line: 388 Column: 45
    Attempt to call a non-function value: SLibGetBlockPath
    Installed software : LabView 7.1, Matlab release 14 s1. Real-time Workshop 6.1, and Visual Studio Pro 6.0

    This is a new install and I�m only trying to get things setup properly. The MDL file is a simple MDL created to test out SIT. It was taken from the example in the SIT User Guide. I have installed the following software:
    Operating System: Microsoft Windows XP Version 5.1 (Build 2600: Service Pack 2)
    MATLAB Version 7.0.1.24704 (R14) Service Pack 1
    MATLAB Version 7.0.1 (R14SP1)
    Simulink Version 6.1 (R14SP1)
    Real-Time Workshop Version 6.1 (R14SP1)
    Microsoft Visual Studio Pro 6.0
    LabView 7.1
    LabView Real Time 7.1
    Simulation Interface Toolkit 2.0.3
    I will include the MDL file. Is there something in the setup I might have missed?
    Attachments:
    SineWave.mdl ‏17 KB
    SineWave.vi ‏265 KB

  • Workflow real time approvers

    Hi,
    How can i get the real time level approvers in a workflow situation, example for a certain document the approvers are already defined, but during the process sometimes they change, how can i get the real approvers to show in a report, using abap ?
    Thanks in Advance.

    There are in general three solutions:
    1) The easy path is to update a table (application table, not workflow table) with information about when and by who the document was approved. Easy in the sense that the information is easily available, not always so easy with respect to keeping the information updated - especially if approvals can be withdrawn.
    2) If change documents are written, and there is a status (there often is) indicating whether the document has been approved or not, you can check the change documents. I have posted ABAP code here on SDN using this technique to display the approver in purchase requisition release. Unfortunately I can't recall which forum I posted it in, but that's why there is a search possibility. Have a look at PR First release by initiator.
    3) Use the workflow logs as outlined by Arghadip, except his solution won't work at all. This requires no table updates anywhere, other than what is done by the workflow engine. The disadvantage is that you most likely will end up hard-coding (or read from a table) the tasks that are relevant, in order to improve performance and avoid false positives. Thus you have to remember updating the table (or report) if the workflow solution changes.
    What you need to do is not check SWWUSERWI (this table is for work items that have not been completed), you need to check the logged data for the work items, using the SAP_WAPI function modules. For instance, you would be interested in the decision that was made (approve or reject), and the actual agent of the work item.

  • Trouble with deploying models in NI Veristand to real-time target

    Hi All,
      I desperately need some help with some application i’m working on. I’m trying to read some accelerometer measurements into NI Veristand but coming up with an error all the time during the deployment stage to the real target which i have atttached. I can’t quite figure out what to do about it. I’m using a real-time device with a PXI-4461 module. I have checked that i can read all sensor measurements in MAX as attached. The error message is as follows:
      Initializing deployment...
    Waiting for the target to report its state...
    Initiating FTP connection...
    System Definition File -> Acquisition.in4
    Restarting system...
    Restarting target into run mode...
    The target encountered an error and reset. Verify that the system definition file and the target resources are valid. You must deploy a new system definition file or reboot the controller to correct this problem.
      Error -200757 occurred at DAQmx Start Task.vi:1
      Possible reason(s):
    Measurements: Sample Timing Type is set to On Demand which is not supported for analog input on this device.
    Set Sample Timing Type to Sample Clock. You can achieve this whlie setting related properties through DAQmx VIs or functions for configuring timing.
    Task Name: Dev6_AI

    Duplicate Post.

  • Hi Experts! Clarififcation regardng the phases of project in real time

    Hi ,
    Can any body please explain the phases of project and thier details like wht all will be done at each stage in the real time since i am very new to tht kind of phases ..
    Please donot kindly send me any links for reference rather plz describe it in detail..
    Regards,
    Eshwant....

    Hi,
    Implementation processes:
    Project preparation
    The project preparation phase, depicted below, focuses at two main activities, i.e. to make a setup for the TSO and to define a solution vision. These activities allow an organization to put in on the right track towards implementation.
    Design and initially staff the SAP TSO
    TSO chart exampleThe first major step of the project preparation phase is to design and initially staff an SAP technical support organization (TSO), which is the organization that is charged with addressing, designing, implementing and supporting the SAP solution. This can be programmers, project management, database administrators, test teams, etc. At this point, the focus should be at staffing the key positions of the TSO, e.g. the high-level project team and SAP professionals like the senior database administrator and the solution architect. Next to that, this is the time to make decisions about choosing for internal staff members or external consultants.
    The image at the right shows a typical TSO chart.
    Craft solution vision
    The second project preparation job is to define a so-called solution vision, i.e. a vision of the future-state of the SAP solution, where it is important to address both business and financial requirements (budgets). The main focus within the vision should be on the company’s core business and how the SAP solution will better enable that core business to be successful. Next to that, the shortcomings of the current systems should be described and short but clear requirements should be provided regarding availability (uptime), security, manageability and scalability of the SAP system.
    Sizing and blueprinting
    The next phase is often referred to as the sizing and blueprinting phase and forms the main chunk of the implementation process.
    Perform cost of ownership analysis
    Figure 5: Solution stack delta analysis
    This phase starts with performing a total cost of ownership analysis (TCO analysis) to determine how to get the best business solution at the lowest cost. This means comparing SAP solution stack options and alternatives and then determining what costs each part of the stack will bring and when these costs will be incurred. Parts of the stack are, for example, the hardware, operating system and database, which form the acquisition costs. Beyond that, recurring costs like maintenance costs and downtime costs should also be considered. Instead of performing a complete TCO analysis for each of the solution stack alternatives one would like to compare, it can be wise to do a so-called delta analysis, where only the differences between solutions (stacks) are identified and analyzed. The image at the right depicts the essence of a delta analysis.
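    The delta idea can be sketched in a few lines of JavaScript (chosen to match the code elsewhere on this page; the cost figures and stack parts below are purely hypothetical):

```javascript
// Hypothetical acquisition/recurring costs for two candidate solution
// stacks, keyed by stack part (illustrative figures only).
var stackA = { hardware: 200000, os: 15000, database: 80000, maintenance: 40000 };
var stackB = { hardware: 150000, os: 0, database: 95000, maintenance: 55000 };

// A delta analysis only identifies and analyzes the differences between
// the stacks, rather than computing each stack's full TCO.
function deltaAnalysis(a, b) {
  var delta = {};
  Object.keys(a).forEach(function (part) {
    if (b[part] !== a[part]) {
      delta[part] = b[part] - a[part]; // positive: stack B costs more here
    }
  });
  return delta;
}

var delta = deltaAnalysis(stackA, stackB);
// e.g. delta.hardware is -50000: stack B saves 50000 on hardware
```

    Parts with identical costs drop out entirely, which is exactly why a delta analysis is cheaper to produce than a full TCO per stack.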
    Identify high availability and disaster recovery requirements
    The next step is identifying the high availability requirements and the more serious disaster recovery requirements. This is to plan for possible later downtime of the SAP system, caused by e.g. hardware failures, application failures or power outages. It should be noted that it is very important to calculate the cost of downtime, so that an organization has a good idea of its actual availability requirements.
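    The downtime-cost calculation itself is simple arithmetic; a minimal sketch in JavaScript (the hourly figure below is an assumed example, not a benchmark):

```javascript
// Yearly cost of downtime: hours down per year times the cost of one
// hour of downtime (lost revenue, penalties, idle staff, ...).
function downtimeCostPerYear(downHoursPerYear, costPerHour) {
  return downHoursPerYear * costPerHour;
}

// Availability implied by a given number of down hours,
// against 8760 hours in a year.
function availabilityPercent(downHoursPerYear) {
  return (1 - downHoursPerYear / 8760) * 100;
}

var yearlyCost = downtimeCostPerYear(20, 5000); // 20 h down at 5000 per hour
var availability = availabilityPercent(20);     // just under 99.8%
```

    Running the numbers both ways (cost of an hour down vs. cost of an extra "nine" of availability) is what lets an organization pick a realistic requirement instead of an arbitrary one.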
    Engage SAP solution stack vendors
    Figure 6: Simplified SAP solution stack
    A true sizing process is to engage the SAP solution stack vendors, which is the next step. This means selecting the best SAP hardware and software technology partners for all layers and components of the solution stack, based on a side-by-side sizing comparison. The most important factors of influence here are the estimated numbers of (concurrent) users and batch sizes. A wise thing to do is to involve SAP AG itself to let them create a sizing proposal stating the advised solution stack, before moving to SAP's technology partners/SAP vendors, like HP, Sun Microsystems and IBM. A simplified solution stack is depicted at the right, showing the many layers for which software and hardware have to be acquired. Note the overlap with the OSI model.
    Staff TSO
    The TSO is the most important resource for an organization that is implementing SAP, so staffing the TSO is a vital job which can consume a lot of time. In a previous phase, the organization should already have staffed the most vital positions. At this point the organization should staff the bulk of the TSO, i.e. fill the positions that directly support the near-term objectives of the implementation, which are to develop and begin the installation/implementation of the SAP data center. Examples are: data center experts, network infrastructure experts, security specialists and database administration experts.
    There are many ways to find the right people within or outside the organization for all of the TSO positions and it depends on the organization how much time it wants to spend on staffing.
    Training
    One of the most vital stages of the implementation process is training. Very few people within an organization are SAP experts or even have worked with SAP software. It is therefore very important to train the end users but especially the SAP TSO: the people who design and implement the solution. Many people within the TSO need all kinds of training. Some examples of these positions:
    SAP Network Specialists
    SAP Database Administrators
    SAP Security specialists
    Documentation specialists
    Et cetera
    All of these people need to acquire the required SAP knowledge and skills, or even SAP certifications, through training. Moreover, people need to learn to do business in a totally new way. To define how much SAP training every person needs, a company can make use of a skillset matrix. With this matrix, a manager can identify who possesses what knowledge, to manage and plan training, by rating the level of expertise with a number between e.g. 1 and 4 for each skill for each employee.
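    The skillset matrix is just a small data structure; a sketch in JavaScript (names, skills and the 1–4 ratings below are made up for illustration):

```javascript
// Hypothetical skillset matrix: expertise per skill rated
// 1 (novice) to 4 (expert) for each employee.
var skillMatrix = {
  alice: { basis: 4, abap: 2 },
  bob:   { basis: 1, abap: 3 }
};

// Required expertise level per skill for the positions being staffed.
var required = { basis: 3, abap: 3 };

// Returns, per employee, the skills where training is still needed.
function trainingNeeds(matrix, req) {
  var needs = {};
  Object.keys(matrix).forEach(function (person) {
    Object.keys(req).forEach(function (skill) {
      if ((matrix[person][skill] || 0) < req[skill]) {
        (needs[person] = needs[person] || []).push(skill);
      }
    });
  });
  return needs;
}

var needs = trainingNeeds(skillMatrix, required);
// needs maps each person to the skills below the required level
```

    In practice the same comparison is usually done in a spreadsheet, but the logic (current rating vs. required rating, per person, per skill) is exactly this.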
    Setup SAP data center
    The next step is to set up the SAP data center. This means either building a new data center facility or transforming the current data center into a foundation capable of supporting the SAP solution stack, i.e. all of the technology layers and components (SAP software products) in a productive SAP installation. The most important factor when designing the data center is availability. The high availability and disaster recovery requirements which should have been defined earlier, give a good idea of the required data center requirements to host the SAP software. Data center requirements can be a:
    Physical requirement like power requirements
    Rack requirement
    Network infrastructure requirement or
    Requirement to the network server.
    Perform installations
    The following step is to install the required SAP software parts, which are called components, and technological foundations like a web application server or enterprise portals, to a state ready for business process configuration. The most vital sub-steps are to prepare the OS, prepare the database server and then start installing the SAP software. Here it is very important to use the installation guides, which are published for each SAP component or technology solution by SAP AG. Examples of SAP components are:
    R/3 Enterprise — Transaction Processing
    mySAP BI — Business Information Warehouse
    mySAP CRM — Customer Relationship Management
    mySAP KW — Knowledge Warehouse
    mySAP PLM — Product Lifecycle Management
    mySAP SCM — Supply Chain Management
    mySAP SEM — Strategic Enterprise Management
    mySAP SRM — Supplier Relationship Management
    Round out support for SAP
    Before moving into the functional development phase, the organization should identify and staff the remaining TSO roles, e.g. roles that relate to helpdesk work and other such support providing work.
    Functional development
    The next phase is the functional development phase, where it is all about change management and testing. This phase is depicted below.
    Figure 7: Functional development phase
    Address change management
    The next challenge for an organization is all about change management / change control, which means to develop a planned approach to the changes the organization faces. The objective here is to maximize the collective efforts of all people involved in the change and to minimize the risk of failure of implementing the changes related to the SAP implementation.
    The implementation of SAP software will most surely come with many changes and an organization can expect many natural reactions, i.e. denial, to these changes. To fight this, it is most important to create a solid project team dedicated to change management and to communicate the solution vision and goals of this team. This team should be prepared to handle the many change issues that come from various sources like:
    End-user requests
    Operations
    Data center team
    DBA group
    Systems management
    SAP systems and operations management
    The next step is to create a foundation for SAP systems management and SAP computer operations, by creating an SAP operations manual and by evaluating SAP management applications. The manual is a collection of current-state system documentation, day-to-day and other regularly scheduled operations tasks, various installation and operations checklists and how-to process documents.
    Functional, integration and regression testing
    Testing is very important before going live with any system. Before going live with a SAP system, it is vital to do many different kinds of testing, since there is often a large, complex infrastructure of hardware and software involved. Both requirements as well as quality parameters are to be tested. Important types of testing are:
    Functional testing: to test using functional use cases, i.e. a set of conditions or variables under which a tester will determine if a certain business process works
    Integration testing
    Regression testing
    All tests should be preceded by creating solid test plans.
    Final preparation
    The last phase before going live can be referred to as the final preparation phase and is depicted below.
    Figure 8: Final preparation phase
    Systems and stress testing
    Another vital preparation activity before going live with SAP is systems and stress testing. This means planning, scripting, executing and monitoring system and stress tests, to see if the expectations of the end users, defined in service level agreements, will be met. This can be done with SAP’s standard application benchmarks, to benchmark the organization’s configurations against configurations that have been tested by SAP’s hardware technology partners. Again, a test plan should be created at first.
    Prepare for cutover
    The final phase before going live with SAP is often referred to as the cutover phase, which is the process of transitioning from one system to a new one. The organization needs to plan, prepare and execute the cutover, by creating a cutover plan that describes all cutover tasks that have to be performed before the actual go-live. Examples of cutover tasks are:
    Review and update all systems-related operations procedures like backup policies and system monitoring
    Assign ownership of SAP’s functional processes to individuals
    Let SAP AG do a GoingLive check, to get their blessing to go live with the system
    Lock down the system, i.e. do not make any more changes to the SAP system
    Go Live
    All of the previously described phases lead towards this final moment: the go-live. Go-live means turning on the SAP system for the end users, obtaining feedback on the solution and monitoring it. It is also the moment where product software adoption comes into play. More information on this topic:
    Product Software Adoption: Big Bang Adoption
    Product Software Adoption: Parallel Adoption
    Product Software Adoption: Phased Adoption
    HTH
    Regards,
    Dhruv Shah

  • InDesign auto-size frame feature not working in real time in InCopy why?

    We have just recently migrated from InCopy CS4 to CS6 to take advantage of the new features like the auto-resize frame option; however, it now seems that this feature is not working in real time.
    Basically, these steps need to be completed before it auto-resizes the frame in InCopy; we use both layout- and assignment-based workflows:
    1. From an ID document ('doc1'), export a 'layer' to IC; certain frames are set to auto-size in height using the text frame options, so that editorial can review and make changes to text and the frame should resize according to the specifications set. IC stories are saved to a folder located in a content folder inside the top issue working folder.
    2. Editorial opens the IC software, then opens the ID 'doc1'. Checks out the correct .icml file and makes edits to the frame with auto-resize.
    3. The frame does not resize according to the text frame options set, and the InCopy file does not respond in the same fashion as InDesign.
    4. The change only occurs when the InCopy file is closed and updated in InDesign, which is frustrating, as this feature would save huge amounts of time serving editorial requests.
    Has anybody experienced this type of workflow problem? If anyone can provide me with some pointers as to what I can do to get this to update in real time (perhaps run a script, or update the file in InCopy and refresh?), I will very much appreciate the assistance. I have run out of ideas.
    Thanks!

    We've had all sorts of problems with this feature, as it should've worked straight out of the box, but after some testing we have found that it's something to do with the way you open the actual file in InCopy. Which is far from ideal and should have been UAT'd by Adobe before release.
    This will not work consistently if you open the designed .indd or .icma file in InCopy using the file open command within the application.
    If you need this to work, the InCopy user has to open the .indd or .icma file by dragging and dropping from OS Windows Explorer into InCopy; we use Windows 7 across all the teams. Check out the .icml files and add text changes to the auto-resize frames; this process will expand/collapse the frames to fit the content. But as you have to use the drag-and-drop method to open the .indd and .icma file, two users cannot access the same doc at the same time (a serious flaw in the programming architecture!), which stops people working in parallel. Save changes, check in the .icml content and close the .indd or .icma.
    However, the flaw comes in if you then open the .indd or .icma file in InCopy using the file open command within the application, before an InDesign user opens and saves the file (updates the design). The corrections added in the previous stage will not show the frames expanded/collapsed to take in the added text, and will instead show overmatter. The only way around this is to ask an InDesign user to open, update and save the design; that way the InCopy user will see the same result no matter what file open method they use.
    Another suggestion is to design the page to have some of the auto-resize frames anchored within the main body of text; that way the frames will expand/collapse when checking out and editing the content. However, this does cause issues with InDesign crashing etc., so we have tried to stop this method within the working group.
    Have you experienced other more serious issues with InDesign crashing consistently when re-importing .icml files? See other forums here:
    http://forums.adobe.com/thread/671820?start=80&tstart=0
    http://forums.adobe.com/message/5045608#5045608
    As far as we can see this is a major flaw in how the application(s) work, we have an enterprise agreement with Adobe and purchase a large volume of Adobe products globally but so far the technical support team are unable to find a solution to this and I'm not hopeful of any resolution soon even with the new release of Adobe CC.

  • How can I generate a real-time highchart from my database data?

    I have looked at several links; however, I couldn't find a working demo showing how to implement a highchart using data from a database.
    Objective: I want to generate a real-time highchart line graph getting data from my database. What I want is very similar to the HighChart Demo, which provides a real-time highchart with randomly generated values. It is also similar in its X-axis and Y-axis, for I want my x-axis to be "Time" (I have a DateTime column in my database) and my y-axis to be an integer (I have a variable for that as well in my database).
    I need help sending the model data to my Razor view.
    Note that I am already using SignalR to display a real-time table. I also want to know if it can be used to automatically update the highchart as well.
    Below is the code snippet of my script in the view. I have used the code provided in the HighChart Demo link for generating the highchart. Please tell me where I should apply the changes in my code.
    @section Scripts{
        <script src="~/Scripts/jquery.signalR-2.2.0.js"></script>
        <!--Reference the autogenerated SignalR hub script. -->
        <script src="~/SignalR/Hubs"></script>
        <script type="text/javascript">
            $(document).ready(function () {
                // Declare a proxy to reference the hub.
                var notifications = $.connection.dataHub;
                //debugger;
                // Create a function that the hub can call to broadcast messages.
                notifications.client.updateMessages = function () {
                    getAllMessages();
                };
                // Start the connection.
                $.connection.hub.start().done(function () {
                    alert("connection started");
                    getAllMessages();
                }).fail(function (e) {
                    alert(e);
                });
                //Highchart
                Highcharts.setOptions({
                    global: {
                        useUTC: false
                    }
                });
                //Fill chart
                $('#container').highcharts({
                    chart: {
                        type: 'spline',
                        animation: Highcharts.svg, // don't animate in old IE
                        marginRight: 10,
                        events: {
                            load: function () {
                                // set up the updating of the chart each second
                                var series = this.series[0];
                                setInterval(function () {
                                    var x = (new Date()).getTime(), // current time
                                        y = Math.random();
                                    series.addPoint([x, y], true, true);
                                }, 1000); //300000
                            }
                        }
                    },
                    title: {
                        text: 'Live random data'
                    },
                    xAxis: {
                        type: 'datetime',
                        tickPixelInterval: 150
                    },
                    yAxis: {
                        title: {
                            text: 'Value'
                        },
                        plotLines: [{
                            value: 0,
                            width: 1,
                            color: '#808080'
                        }]
                    },
                    tooltip: {
                        formatter: function () {
                            return '<b>' + this.series.name + '</b><br/>' +
                                Highcharts.dateFormat('%Y-%m-%d %H:%M:%S', this.x) + '<br/>' +
                                Highcharts.numberFormat(this.y, 2);
                        }
                    },
                    legend: {
                        enabled: false
                    },
                    exporting: {
                        enabled: false
                    },
                    series: [{
                        name: 'Random data',
                        data: (function () {
                            // generate an array of random data
                            var data = [],
                                time = (new Date()).getTime(),
                                i;
                            for (i = -19; i <= 0; i += 1) {
                                data.push({
                                    x: time + i * 1000,
                                    y: Math.random()
                                });
                            }
                            return data;
                        }())
                    }]
                });
                function getAllMessages() {
                    var tbl = $('#messagesTable');
                    var data = @Html.Raw(JsonConvert.SerializeObject(this.Model));
                    $.ajax({
                        url: '/nurse/GetMessages',
                        data: {
                            id: data.id
                        },
                        contentType: 'application/html ; charset:utf-8',
                        type: 'GET',
                        dataType: 'html'
                    }).success(function (result) {
                        tbl.empty().append(result);
                        $("#g_table").dataTable();
                    }).error(function (e) {
                        alert(e);
                    });
                }
            });
        </script>
    }
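    Replacing the demo's random values with database rows mostly comes down to converting each row into the [x, y] pair that `series.addPoint` expects. A minimal sketch, assuming a hub callback named `updatePoint` and a row shape `{ Timestamp, Value }` (both are hypothetical names; wire them to whatever your dataHub actually broadcasts):

```javascript
// Convert a database row (DateTime column + integer value) into the
// [millisecond timestamp, number] pair Highcharts expects on a datetime axis.
function toChartPoint(row) {
  return [new Date(row.Timestamp).getTime(), row.Value];
}

// In the view, the SignalR client callback would then push points into the
// chart, replacing the setInterval with Math.random():
//
//   notifications.client.updatePoint = function (row) {
//     var chart = $('#container').highcharts();
//     chart.series[0].addPoint(toChartPoint(row), true, true); // redraw, shift
//   };

var p = toChartPoint({ Timestamp: '1970-01-01T00:00:01Z', Value: 7 });
// p is [1000, 7]
```

    So yes, the same SignalR hub that refreshes your table can drive the chart: have the server broadcast the new row, and in the client callback call `addPoint` with the converted pair instead of rebuilding the whole series.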

    Hi Sihem,
    Thank you for contacting National Instruments.  Using the LabVIEW Real-Time module, you can do development without actually having a target.  While viewing the project explorer window, you can do the following steps:
    Right click on the project
    Select New >> Targets and Devices
    Select the "New Target or Device" radio button
    Select the target you would like to develop on.
    Information about the LabVIEW Real-Time Module can be found here.
    Regards,
    Kevin H
    National Instruments
    WSN/Wireless DAQ Product Support Engineer

  • Is there a way to create dependency on the real-time jobs

    Hi,
    We have around 80 real-time services running and loading the changed data into the target.
    The process being used is
    IBM Informix > IBM CDC > JMS (xml messages) > DS real-time services > Oracle EDW.
    While using the above process, whenever there is a change in the fact table and the dimension table, both real-time services load the data into the target at the same time. This causes lookup issues because of the timing.
    Is there a way we can create a dependency to resolve the timing issue and make sure the lookup table is loaded before the master table?
    Please let me know.
    Thanks,
    C

    Hello
    With the design you currently have, you will have potential sequencing issues. There is no magic in Data Services to solve this.
    You might want to consider building more complex real-time jobs that accept more complex data structures and have logic to process the data in dependency order.
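    The "process in dependency order" idea can be sketched in JavaScript (for illustration only; a Data Services real-time job would express this with its own transforms): buffer fact rows until the dimension key they reference has been loaded.

```javascript
// Buffer fact-table rows until their dimension key has been loaded, so the
// lookup table is always populated before the dependent row is written.
function makeDependentLoader() {
  var dimensionKeys = {};  // keys already loaded into the lookup table
  var pendingFacts = [];   // facts waiting for their dimension row
  var loaded = [];         // rows "written" to the target, in safe order

  return {
    loadDimension: function (row) {
      dimensionKeys[row.key] = true;
      loaded.push({ table: 'dimension', row: row });
      // flush any facts that were waiting on this key
      pendingFacts = pendingFacts.filter(function (fact) {
        if (dimensionKeys[fact.dimKey]) {
          loaded.push({ table: 'fact', row: fact });
          return false; // no longer pending
        }
        return true;
      });
    },
    loadFact: function (row) {
      if (dimensionKeys[row.dimKey]) {
        loaded.push({ table: 'fact', row: row });
      } else {
        pendingFacts.push(row); // dimension not there yet: wait
      }
    },
    loadedRows: function () { return loaded; }
  };
}

var loader = makeDependentLoader();
loader.loadFact({ dimKey: 42, amount: 10 });       // arrives first: buffered
loader.loadDimension({ key: 42, name: 'widget' }); // flushes the waiting fact
```

    The cost is that the job must hold state across messages, which is why this only works if the two streams are consumed by one job rather than two independent services.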
    Michael

  • How to save data in a 4D array and make partial plots in real time?

    Hi, this is a little complex, so bear with me...
    I have a test system that tests a number of parts at the same time. The
    experiment I do consists of measuring a number of properties of the
    parts at various temperatures and voltages. I want to save all the
    measured data in a 4-dimensional array. The indices represent,
    respectively, temperature, voltage, part, property.
    The way the experiment is done, I first do a loop in temperature, then
    in voltage, then switch the part. At this point, I measure all the
    properties for that condition and part and want to add them as a 1D
    array to the 4D array.
    At the same time, I want to make a multiple plot (on an XY graph) of
    one selected property and part (using two pull-down selectors near the
    XY graph) vs. voltage. (The reason I need to use an XY graph and not a
    waveform graph, which would be easier, is that I do not have
    equidistant steps in voltage, although all the voltage values I step
    through are the same for all cases). The multiple plots are the data
    sets at different temperatures. I would like to draw connection lines
    between the points as a guide to the eye.
    I also want the plot to be updated in the innermost for loop in real
    time as the data are measured. I have a VI working using nested loops
    as described above and passing the 4D array through shift registers,
    starting with an array of the right dimensions initialized by zeroes. I
    know in advance how many times all the loops have to be executed, and I
    use the ReplaceArraySubset function to add the measured properties each
    time. I then use IndexArray with the part and property index terminals
    wired to extract the 2D array containing the data I want to plot. After
    some transformation to combine these data with an array of the voltage
    values in the form required to pass to the XYGraph control, I get my
    plot.
    The problem is: During program execution, when only partial data is
    available, all the zero elements in the array do not allow the graph to
    autoscale properly, and the lines between the points make little sense
    when they jump to zero.
    Here is how I think the problem could be solved:
    1. Start with an empty array and have the array grow gradually as the
    elements are measured. I tried to implement this using Insert Into
    Array. Unfortunately, this VI is not as flexible as the Replace Array
    Subset, and does not allow me to add a 1D array to a 4D array. One
    other option would be to use the Build Array, but I could not figure
    out if this is usable in this case.
    2. The second option would be to extract only the already measured data
    points from the 4D array and pass them to the graph
    3. Keep track of the min. and max. values (only when they are different
    from zero) and manually reset the graph Y axis scale each time.
    Option 3 is doable, but more work for me.....
    Option 2: I first tried to use Array Subset, but this always returns an
    array of the same dimensionality of the input array. It seems to be
    very difficult, but maybe not impossible, to make this work by using
    Index Array first followed by Array Subset. Option 3 seems easier.
    Ideally, I would like option 1, but I cannot figure out how to achieve
    this.
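    The combination of option 1 (grow the structure as values arrive) and option 2 (pass only measured points to the graph) can be sketched in JavaScript, since LabVIEW block diagrams can't be pasted here; in G this would map to nested arrays held in shift registers (all index names below follow the temperature/voltage/part/property order from the post):

```javascript
// results[t][v][part] holds the 1D array of property values measured for one
// temperature/voltage/part combination; entries that haven't been measured
// simply don't exist yet, so zeros never pollute the plot or the autoscale.
function addMeasurement(results, t, v, part, propertyValues) {
  results[t] = results[t] || [];
  results[t][v] = results[t][v] || [];
  results[t][v][part] = propertyValues;
}

// Extract one XY series per temperature for the selected part and property,
// including only the points that have actually been measured.
function plotData(results, part, property, voltages) {
  return results.map(function (byVoltage) {
    var points = [];
    (byVoltage || []).forEach(function (byPart, v) {
      if (byPart && byPart[part]) {
        points.push([voltages[v], byPart[part][property]]);
      }
    });
    return points;
  });
}

var results = [];
var voltages = [0.5, 1.2, 2.0]; // non-equidistant steps, same for all cases
addMeasurement(results, 0, 0, 0, [3.1, 7.5]); // temp 0, voltage 0, part 0
addMeasurement(results, 0, 2, 0, [3.3, 7.9]); // voltage index 1 not yet done
var series = plotData(results, 0, 1, voltages); // part 0, property 1
// series[0] contains only the two measured points
```

    The key point for the XY graph is in plotData: skipping unmeasured slots gives the graph only real points, so autoscaling works and no connection lines jump to zero mid-run.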
    Your help is appreciated, thanks in advance!
    germ Remove "nospam" to reply

    In article <[email protected]>,
    chutla wrote:
    > Greetings!
    >
    > You can use any of the 3D display vi's to show your "main" 3d
    > data, and then use color to represent your fourth dimension. This can
    > be accessed via the property node. You will have to set thresholds
    > for each color you use, which is quite simple using the comparison
    > functions. As far as the data is concerned, the fourth dimension will
    > be just another vector (column) in your data file.
    chutla, thanks for your post, but I don't want a 3D display of the
    data....
    > Also, check out
    > the BUFFER examples for how to separate out "running" data in real
    > time.
    Not clear to me what you mean, but will check the BUFFER examples.
    > As far as autoscaling is concerned, you might have to disable
    > it, or alternatively, you could force a couple of "dummy" points into
    > your data which represent the absolute min/max you should encounter.
    > Autoscaling should generally be regarded as a default mode, just to
    > get things rolling, it should not be relied on too heavily for serious
    > data acquisition. It's better to use well-conditioned data, or some
    > other means, such as a logarithmic scale, to allow access to all your
    > possible data points.
    I love autoscaling, that's the way it should be.
    germ Remove "nospam" to reply

  • How to create a Real Time Interactive Business Intelligence Solution in SharePoint 2013

    Hi Experts,
    I was recently given the below requirements to architect/implement a business intelligence solution that deals with instant/real-time data modifications in the data sources. After going through many articles, e-books and expert blogs, I am still unable to piece together the right information to design an effective solution to my problem. Also, the client is ready to invest in the best infrastructure in order to achieve all the below requirements, but it still seems like a sword of Damocles hanging over my neck in every direction I go.
    Requirements
    1) Reports must be created against many-to-many table relationships and against multiple data sources(SP Lists, SQL Server Custom Databases, External Databases).
    2) The Report and Dashboard pages should refresh/reflect with real time data immediately as and when changes are made to the data sources.
    3) The Reports should be cross-browser compatible(must work in google chrome, safari, firefox and IE), cross-platform(MAC, Android, Linux, Windows) and cross-device compatible(Tabs, Laptops &
    Mobiles).
    4) Client is Branding/UI conscious and wants the reports to look animated and pixel perfect similar to what's possible to create today in Excel 2013.
    5) The reports must be interactive, parameterized, slice able, must load fast and have the ability to drill down or expand.
    6) Client wants to leverage the Web Content Management, Document Management, Workflow abilities & other features of SharePoint with key focus being on the reporting solution.
    7) Client wants the reports to be scalable, durable, secure and other standard needs.
    Is SharePoint 2013 Business Intelligence a good candidate? I see the below limitations with the Product to achieve all the above requirements.
    a) Cannot use Power Pivot with Excel deployed to SharePoint, as the minimum granularity of the refresh schedule is daily. This violates Requirement 2.
    b) Excel Services, PerformancePoint and Power View work in in-memory representation mode. This violates Requirements 1 and 2.
    c) SSRS does not render the reports as stated above in requirements 3 and 4. The report rendering on the page is very slow even for sample data. This violates Requirements 6 and 7.
    Has someone been able to achieve all of the above requirements using the SharePoint 2013 platform or any other platform? Please let me know the best possible solution. If possible, redirect me to whitepapers, articles or material that will help me design an effective solution. Eagerly looking forward to hearing from you, experts!
    Please feel free to write in case you have any comments/clarifications.
    Thanks, 
    Bhargav

    Hi Experts,
    Request your valuable inputs and support on achieving the above requirements.
    Looking forward for your responses.
    Thanks,
    Bhargav

  • What is the best way to match clip and sequence settings for real-time editing?

    I am a long time FCP user and trying to make the After Effects workflow a little smoother by utilizing the Adobe suite.
    One of the most valuable things I miss about FCP is that when I drop a clip in a blank sequence (no matter what the settings of the seq) it will ask if I want to match the settings of the clip. This ensures that no rendering is required to quickly edit the video in real-time.
    Is there a plugin or function in Premiere CS4 that I am missing that will afford me this feature? Currently I am doing everything I can to match the clip settings when creating a sequence, but sadly I'm still having to render the timeline to preview the edit.
    lh

    And in CS4, if there is not a matching Preset, you can choose the Desktop Preset, which will allow you to customize nearly every attribute to match your source footage. The name is a bit of a misnomer, but has been around for a long time. I would have called it the "Custom Preset," but Adobe never called.
    Good luck,
    Hunt
    PS - I just learned something new about CS5 from Curt, because of your question - thanks!

  • In a Real time project what would be the agent assignment attribute setting

    Dear Experts,
    I have never worked on a workflow project; I have only been practising workflow on an IDES system. I would like to know, in a real live production system, what settings we use when we define the agent assignment at the task level. Do we set the attribute as GENERAL TASK always?
    To be more specific as in the task attributes we have many options like
    GENERAL TASK
    GENERAL FORWARDING ALLOWED,
    GENERAL FORWARDING NOT ALLOWED AND
    FORWARDING NOT ALLOWED.
    Of the first three options, which is most generally used in real-time projects?
    And is it necessary to always set the attribute as GENERAL TASK before transporting the workflow definition to other systems from the development system?
    appreciate your help on the same.
    cheers
    chky

    Hello Learner,
    It depends on the requirement, but in most cases we assign the task as a general task.
    To have some more information on the various attributes,
    • General task
    If you define a task as a general task, all users can execute the task. This is useful if the task is used in a workflow and you only want to define the recipients in the step definition. A recipient can forward associated work items to all users.
    Work items whose tasks are defined as general tasks and for which no responsible agents or default rules are defined are offered to all users of the SAP System for execution in their Business Workplaces.
    • General forwarding allowed
    A work item that represents a task with this property can be forwarded by one of its recipients to all users, even if they are not possible agents of the task.
    • General forwarding not allowed
    A work item that represents a task with this property can be forwarded by one of its recipients only to the possible agents of the task.
    • Forwarding not allowed
    A work item that represents a task with this property cannot be forwarded by one of its recipients.
    Hope this will help.
    Regards,
    Sam
