Assigning today's date to a process flow start activity
Hi,
I am trying to assign today's date to my process flow which will be used as a common load date for multiple transformations within the process flow. I'm at a loss on how to do this. Any help would be appreciated.
Thanks
You can add a variable to the start activity, or create a stand-alone variable (global to the process flow) in the left-side panel (where you see selected objects/events), and set the variable's value to sysdate. Make sure the Literal property is set to false, meaning the value is not the literal text SYSDATE but the evaluated value of sysdate.
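As a quick sanity check of what the non-literal value evaluates to, this is plain Oracle SQL (not OWB-specific), just to illustrate the distinction:

```sql
-- When Literal = false, the value "sysdate" is treated as a PL/SQL
-- expression, so the process flow variable receives the actual date,
-- equivalent to:
SELECT SYSDATE                                   AS common_load_date,
       TO_CHAR(SYSDATE, 'YYYY-MM-DD HH24:MI:SS') AS load_date_text
FROM   dual;
```

Every mapping in the flow that binds to this variable then shares the same load date for the run.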
It worked for me.
- Manohar
Similar Messages
-
OWB Error: Connection lost when assign a schedule to a process flow
I have:
- OWB 11Gr2
- Oracle DB 10Gr2 Source/Target
I have created a schedule module and set the location to be one of the target DB locations.
Created a one-off schedule, scheduled for daily execution.
Opened the configuration of a process flow and assigned the schedule to it.
After this, I tried to save the repository, and this error message occurred:
"Repository Connection Error: The connection to the repository was lost, because of the following database error: Closed Connection
Exit OWB without committing."
The connection to the repository was lost and I couldn't do anything further.
I tried to create a separate location for the schedule, but it didn't make a difference.
What´s happening?
Thanks,
Afonso
We are running 11.2.0.2, and when looking at the trace log:
Dump continued from file: /data02/oramisc/diag/rdbms/dpmdbp/dpmdbp/trace/dpmdbp_ora_13503.trc
ORA-00600: internal error code, arguments: [15570], [], [], [], [], [], [], [], [], [], [], []
With this query:
========= Dump for incident 16022 (ORA 600 [15570]) =========
*** 2011-10-18 09:52:25.445
dbkedDefDump(): Starting incident default dumps (flags=0x2, level=3, mask=0x0)
----- Current SQL Statement for this session (sql_id=7a76281h0tr2p) -----
SELECT usage.locuoid, usage.locid, cal_property.elementid, cal_property.classname,
       schedulable.name || '_JOB' name, schedulable.name || '_JOB' logicalname,
       cal_property.uoid, cal_property.classname, cal_property.notm,
       cal_property.editable editable, cal_property.customerrenamable,
       cal_property.customerdeletable, cal_property.seeded, cal_property.UpdateTimestamp,
       cal_property.strongTypeName, cal_property.metadatasignature signature, 0 iconobject
FROM Schedulable_v schedulable, CMPStringPropertyValue_v cal_property,
     CMPCalendarInstalledModule_v cal_mod, CMPPhysicalObject_v sched_phys, CMPCalendar_v cal,
     ( select prop.firstclassobject moduleid, prop.value locuoid, 0 locid
       from CMPPhysicalObject_v phys, CMPStringPropertyValue_v prop
       where prop.logicalname = 'SCHEDULE_MODULE_CONFIG.DEPLOYMENT.LOCATION'
         and prop.propertyOwner = phys.elementid and phys.namedconfiguration = 42636
       union
       select installedmodule, '' locuoid, location locid
       from CMPLocationUsage_v locUse
       where deploymentdefault = 1
         and not exists ( select null
                          from CMPPhysicalObject_v phys, CMPStringPropertyValue_v prop
                          where prop.logicalname = 'SCHEDULE_MODULE_CONFIG.DEPLOYMENT.LOCATION'
                            and prop.firstclassobject = locUse.installedmodule
                            and prop.propertyOwner = phys.elementid
                            and phys.namedconfiguration = 42636) ) usage
WHERE cal_mod.elementid = usage.moduleid and cal_mod.elementid = cal.calendarmodule
  and substr(cal_property.value,0,32) = cal.uoid
  and cal_property.logicalname = 'SCHEDULABLE.PROPERTY'
  and (cal_property.firstclassobject = schedulable.elementid
       or cal_property.firstclassobject = sched_phys.elementid
          and sched_phys.logicalobject = schedulable.elementid)
  and cal_mod.owningproject = 42631
  and cal_property.propertyowner = sched_phys.elementid
  and sched_phys.namedconfiguration = 42636
ORDER BY schedulable.name || '_JOB'
Could this possibly be this bug?
Bug 12368039 ORA-600 [15570] with UNION ALL views with Parallel steps
This note gives a brief overview of bug 12368039.
The content was last updated on: 18-MAY-2011
Fixed:
This issue is fixed in
* 11.2.0.3 -
Mapping sensor data to process flows
Hi,
I've implemented a sensor action that uses a JMS queue to report its data. I've specified a few sensors in the process I'm calling, as well as in all processes called by that first one.
This is meant to monitor the process workflow, resulting in a graph that represents the flow of the different processes visited.
To get to the point: when that first process is called twice (or even more often) at a time, I get a whole bunch of sensor data but cannot distinguish their origin, i.e. whether they came from the first flow or the second one. Since I'm using more than one process, I cannot use the instance id as a reference. I was thinking of passing a sort of generated id (a timestamp) through all the processes.
Any ideas or suggestions?
Thanks in advance,
Max
You can use a .wrl file with the 3D picture control. How you map your values is up to you. I have used color-coded spheres at the locations of interest.
Have fun,
Ben
Ben Rayner
-
Process flow wait activity usage
Hi all,
I'm having some issues while trying to use the wait activity on a process flow...
I have a simple process flow:
START ---> WAIT ---> END
the parameter MINIMUM_DELAY on the wait activity is set to 10 and the UNTIL_DATE parameter was left blank.
The following error occurs when I try to execute the process flow:
ORA-01867: the interval is invalid
ORA-06512: at "OWBSYS.WB_RT_CONVERSIONS", line 371
ORA-06502: PL/SQL: numeric or value error string
ORA-06512: at line 3
Am I doing this right? Any help would be much appreciated...
Best regards
Anyone modifying this should be very careful. We have had problems in some mappings that use the TO_CHAR function to convert a date to the day number in the week, which varies from region to region.
If you change the language that the Control Center Service is invoked under, you also change the regional settings (NLS_LANGUAGE and NLS_TERRITORY, etc.).
A better solution I think would be to add a preprocess mapping that executes an ALTER SESSION SET NLS_LANGUAGE = ... and ALTER SESSION SET NLS_TERRITORY=....
I do not have the possibility of testing this, but that is what I would see as the correct way forward (and not changing the way the control center is invoked).
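A sketch of such a pre-process step, assuming it is deployed as a standalone transformation that the flow calls first; the procedure name and NLS values are illustrative, not from the thread:

```sql
-- Hypothetical pre-process transformation: pin the session's NLS settings
-- so date-to-text conversions (e.g. TO_CHAR(d, 'D') for day-of-week)
-- behave the same regardless of the Control Center's regional settings.
CREATE OR REPLACE PROCEDURE set_session_nls IS
BEGIN
  EXECUTE IMMEDIATE q'[ALTER SESSION SET NLS_LANGUAGE = 'AMERICAN']';
  EXECUTE IMMEDIATE q'[ALTER SESSION SET NLS_TERRITORY = 'AMERICA']';
END set_session_nls;
/
```

ALTER SESSION is DDL from PL/SQL's point of view, hence the EXECUTE IMMEDIATE calls.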
Good luck! -
Hi All,
I want to run my process flow on a predefined schedule, and I have created a schedule for it. But how do I specify which mapping needs to run on this schedule, and how do I deploy the schedule? I am not able to see the schedule under my Location.
Thanks & Regards,
Danish
Hi
1. Deploy the process flow.
2. Create a schedule and set the time parameters.
3. Right-click the process flow you want to schedule.
4. Select Configure.
5. Assign the schedule to the process flow using the referred calendar. This creates a process flow job (a scheduled job) under the location used for creating the schedule. The scheduled job has the name of the process flow with the suffix "_JOB".
6. Deploy the scheduled job.
7. Start the scheduled job manually the first time. This starts/enables the schedule, and the process flow is then executed at every interval.
Hope this helps
Regards
Vibhuti -
Date parameter in a process flow
Hi,
I'm working with OWB 10.1.0.2.0.
I have a process flow with a transform.
The transform has a date parameter, and I've defined a date parameter in the start activity, which is bound to the transform's parameter.
If I try to pass the date parameter as the expression sysdate, the process doesn't recognize it, and in the all_rt_audit_execution_params table I can see that the process parameter has no value.
If I pass the date parameter in the format 2006/08/07, the process works fine.
I need to pass the date parameter from the start activity to the process activity using expressions with sysdate.
Can anyone help me? Please and thanks.
Beatriz
Hi,
there are several tasks you want to accomplish:
1. "I need to check if the parameter file exists"
For that, OWB has the File Exists activity (in process flows). It checks whether a file with the specified name exists at the specified location. If so, it returns SUCCESS; if not, it returns a WARNING. In the second case, sleep for a while and then re-check.
2. "accessing the date value from the parameter file"
There are several ways to do this. The easiest may be to declare an external table on this file and simply select from it (i.e. use it as a source in your mapping).
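A sketch of that approach; the directory object, file name, and single-column layout are assumptions for illustration, not taken from the thread:

```sql
-- External table over the parameter file (one date per line, YYYY-MM-DD).
CREATE TABLE param_file_ext (
  load_date_txt VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY param_dir          -- pre-created directory object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
  )
  LOCATION ('params.txt')
);

-- The mapping can then read the date value like any other source:
SELECT TO_DATE(load_date_txt, 'YYYY-MM-DD') AS load_date
FROM   param_file_ext;
```

No load step is needed: the file contents are visible to SQL as soon as the table is created.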
3. "After the ETL is finished running, the name of the parameter file needs to be changed"
In your process flow, create a user-defined activity. In it, start cmd.exe (on Windows) and execute any DOS commands you like, in this case a move or rename.
Hope I could help you...
Bye,
iOn. -
How to start/stop process flow from sql*plus?
Hi,
I know how to start a process flow via sqlplus_exec_template.sql, but I cannot find any information on how to stop (and roll back) a running flow from SQL*Plus. Any help would be appreciated.
Greetings
Christoph
Message was edited by:
ctrierweiler
Hi,
I've had a go.
How should I interpret the results of list_requests:
owner_owr@ORKDEV01> @list_requests
====================
DEPLOYMENTS
====================
Audit ID Status Name Date Owner
2706 READY Deployment Fri Nov 11 10:46:37 CET 2005 11-NOV-05 10:49:59 OWNER_OWR
====================
DEPLOYMENT UNITS
====================
Audit ID Status Name Date Owner
2707 READY Unit0 11-NOV-05 10:49:59 OWNER_OWR
====================
EXECUTIONS
====================
no rows selected.
owner_owr@ORKDEV01>
Whilst a process flow is executing the last query will list executions, all of which have status BUSY:
owner_owr@ORKDEV01> @list_requests
====================
DEPLOYMENTS
====================
Audit ID Status Name Date Owner
2706 READY Deployment Fri Nov 11 10:46:37 CET 2005 11-NOV-05 10:49:59 OWNER_OWR
====================
DEPLOYMENT UNITS
====================
Audit ID Status Name Date Owner
2707 READY Unit0 11-NOV-05 10:49:59 OWNER_OWR
====================
EXECUTIONS
====================
Audit ID Status Name Date Owner
394512 BUSY PF_ONB01 04-MAY-06 09:11:12 OWNER_OWX
395328 BUSY ONB:FULL_PREPARE 04-MAY-06 09:11:55 OWNER_OWR
395324 BUSY PF_ONB01:ONB 04-MAY-06 09:11:55 OWNER_OWR
owner_owr@ORKDEV01>
As an aside, I will attempt to get rid of the READY deployment and deployment unit using deactive_deployment.sql
Now, if I attempt to use deactivate_execution.sql on any of the executions with status BUSY I get:
owner_owr@ORKDEV01> @deactivate_execution
Enter a value for 1: 396136
declare
ERROR at line 1:
ORA-20003: The object is not in a valid state for the requested operation
ORA-06512: at "OWNER_OWR.WB_RTI_EXCEPTIONS", line 94
ORA-06512: at "OWNER_OWR.WB_RTI_EXECUTION", line 774
ORA-06512: at "OWNER_OWR.WB_RT_EXECUTION", line 90
ORA-06512: at line 4
owner_owr@ORKDEV01>
So all that seems to remain is to use abort_exec_request.sql.
This does the job, but the script itself hangs.
I think it has to do with l_stream_id not being checked again after the initial IF; it should probably be part of the loop condition, since it is reset again in the do_acks inside the loop.
Cheers & thanks,
Colin -
Analysing Task Audit, Data Audit and Process Flow History
Hi,
The Internal Audit department has requested a bunch of information that we need to compile from the Task Audit, Data Audit, and Process Flow History logs. We have all the info available, but not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export the required audit information?
We do have housekeeping in place, so the logs are partly "live" db tables and partly purged tables that were exported to Excel to archive the historical log info.
Many thanks.
I thought I posted this on Friday, but I just noticed I never hit the 'Post Message' button, ha ha.
The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly, or move it to another appropriate data table for later analysis. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information, as it is not 'human readable' in the database.
For instance, if I wanted to pull Rules Load, Metadata Load, and Member List Load activity, I could run a query like the one below. (NOTE: @strAppName should be set to the name of your application.)
The main tricks to know, at least for the task audit table, are how to convert the times and which activity code corresponds to which user-friendly name.
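The time-conversion trick can be sketched in isolation; the sample value below is an assumption for illustration, not taken from a real log:

```sql
-- HFM stores StartTime/EndTime as OLE automation dates (fractional days
-- since 1899-12-30). SQL Server's datetime epoch is 1900-01-01, two days
-- later, hence the "- 2" before the cast.
declare @StartTime float
set @StartTime = 41152.5          -- illustrative raw audit value (noon)
select cast(@StartTime - 2 as smalldatetime) as converted_time
```

The same `cast(... - 2 as smalldatetime)` expression appears throughout the full query below.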
-- Declare working variables --
declare @dtStartDate as nvarchar(20)
declare @dtEndDate as nvarchar(20)
declare @strAppName as nvarchar(20)
declare @strSQL as nvarchar(4000)
-- Initialize working variables --
set @dtStartDate = '1/1/2012'
set @dtEndDate = '8/31/2012'
set @strAppName = 'YourAppNameHere'
--Get Rules Load, Metadata, Member List
set @strSQL = '
select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (1)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
union all
select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (21)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
union all
select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
from ' + @strAppName + '_task_audit ta, hsv_activity_users au
where au.lUserID = ta.ActivityUserID and activitycode in (23)
and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
exec sp_executesql @strSQL
In regards to activity codes, here's a quick breakdown of those:
ActivityID ActivityName
0 Idle
1 Rules Load
2 Rules Scan
3 Rules Extract
4 Consolidation
5 Chart Logic
6 Translation
7 Custom Logic
8 Allocate
9 Data Load
10 Data Extract
11 Data Extract via HAL
12 Data Entry
13 Data Retrieval
14 Data Clear
15 Data Copy
16 Journal Entry
17 Journal Retrieval
18 Journal Posting
19 Journal Unposting
20 Journal Template Entry
21 Metadata Load
22 Metadata Extract
23 Member List Load
24 Member List Scan
25 Member List Extract
26 Security Load
27 Security Scan
28 Security Extract
29 Logon
30 Logon Failure
31 Logoff
32 External
33 Metadata Scan
34 Data Scan
35 Extended Analytics Export
36 Extended Analytics Schema Delete
37 Transactions Load
38 Transactions Extract
39 Document Attachments
40 Document Detachments
41 Create Transactions
42 Edit Transactions
43 Delete Transactions
44 Post Transactions
45 Unpost Transactions
46 Delete Invalid Records
47 Data Audit Purged
48 Task Audit Purged
49 Post All Transactions
50 Unpost All Transactions
51 Delete All Transactions
52 Unmatch All Transactions
53 Auto Match by ID
54 Auto Match by Account
55 Intercompany Matching Report by ID
56 Intercompany Matching Report by Acct
57 Intercompany Transaction Report
58 Manual Match
59 Unmatch Selected
60 Manage IC Periods
61 Lock/Unlock IC Entities
62 Manage IC Reason Codes
63 Null -
Hello All,
We are using a two tier architecture.
Our Corp server calls the refinery server.
Our CORP MII server uses user id abc_user to connect to the refinery data server.
The user id abc_user has the SAP_xMII_Dynamic_Query role.
The data server also has the checkbox for allow dynamic query enabled.
But we are still getting the following error
Error has occurred while processing data stream
Dynamic Query role is not assigned to the Data Server; Use query template
Once we add the SAP_xMII_Dynamic_Query role to the data server everything works fine. Is this feature by design ?
Thanks,
Kiran
Thanks, Anushree!
I thought that just adding the role to the user and enabling the dynamic query checkbox on the data server should work.
But we also needed to add the role to the data server.
Thanks,
Kiran -
"Exception Processing Message" error when clicking the Accessing Server Data link on the start page
When I first started the application, I clicked the Accessing Server Data link on the start page. I immediately got the error "Exception Processing Message c0000013 Parameters 75b6bf7c 4 75b6bf7c 75b6bf7c" in a dialog box titled "Windows - No Disk". I had to use Task Manager to remove the box after closing Flash Builder 4. I then tried the video tutorial on "PHP Services in Flash Builder 4". I keep receiving an error during service introspection when trying to connect to the PHP class; it was the same error as the other individual's, "Cannot Connect to PHP Service". So I tried another tutorial and generated a sample PHP class. I moved a datagrid onto the stage, dropped in the getAllItems function, and configured the return type. However, when I ran the app, I received the same "Exception Processing Message" error again. I have attached a screenshot of the error in addition to my phpinfo and log files.
The "Exception Processing Message" error is definitely an issue. Is it happening consistently? If so, can you file a bug at bugs.adobe.com/flex and give your machine config details and the error log?
With regards to generating a new PHP file and the introspection error you are getting, looking at the logs it is probably twofold.
First off, the name of the file and the class name should be the same; do you have employeeServices as the name of the class in the employeeServices.php file?
Second, it seems that prior to generating this new file, you had a syntax error at line 5.
It would be easier to figure out what the error is if you can also attach the PHP file, after removing any sensitive information.
Hope this helps.
Thanks
-Sunil -
Data Type for Process Flow... PB with Date?
I've got a problem by passing parameters in process flow.
I have a mapping with a parameter DATE_EXEC (data type: DATE) and a default value of TO_DATE('20/01/2007', 'dd/mm/yyyy'). My mapping works fine when I launch it directly.
I have a process flow which contains the mapping. The process flow has a parameter DATE_EXEC (data type: DATE), and I bound the two DATE_EXEC parameters. But when I launch the mapping this way, the value is not recognized. I tried with:
- TO_DATE('20/01/2007' , 'dd/mm/yyyy')
- 20/01/2007
- 2007.01.20
- 2007-01-20
My question is: what are the data types in a process flow? They are not Oracle types.
For example, a parameter in a mapping which is a VARCHAR2 must be entered between quotes, but if you bind it to a parameter of a process flow which is a STRING (not an Oracle data type), must you enter it without quotes?
Does anybody have some rules about that?
I apologize for my English, I'm French.
Here is some information on the literal quote-or-not-quote question, and what I think you need to do at the end; hope it helps. It's not exactly intuitive, since the flow designer (you) has to know what is a PL/SQL object and what is not.
1. Literal = FALSE
When Literal = FALSE is set, the value entered must be a valid PL/SQL expression, which is evaluated at the Control Center, e.g.
'Hello World!'
22 / 7
2. Literal = TRUE
When Literal = TRUE, the value depends on the type of activity. If the activity is a PL/SQL object, i.e. a Mapping or Transformation, then the value is a PL/SQL snippet. The critical difference here is that the value is macro-substituted into the call for the object. The format of the value is identical to that entered as a default value in the Mapping editor, e.g.
'Hello World!'
sysdate()
If the activity type is not a PL/SQL object then the value is language independent. e.g.
Hello World
3.1427571
What you should try......
Check the map activity parameter in your process flow to see whether Literal is false (an expression); set it to false, then try using your TO_DATE('20/01/2007', 'dd/mm/yyyy') expression, deploy your flow, and execute. Alternatively, the user guide defines the DATE type for flows with the format YYYY-MM-DD, so you can set the parameter value to '2007-01-20' with Literal equal to true; remember to quote your value.
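Both variants David describes should resolve to the same date; here is a quick check of the two formats in plain SQL (run directly, not through OWB):

```sql
-- Literal = FALSE: the value is a PL/SQL expression, e.g.
--   TO_DATE('20/01/2007', 'dd/mm/yyyy')
-- Literal = TRUE : the value is a plain string in the flow's DATE
--   format, e.g. 2007-01-20
SELECT TO_DATE('20/01/2007', 'dd/mm/yyyy') AS expression_form,
       TO_DATE('2007-01-20', 'YYYY-MM-DD') AS literal_form
FROM   dual;
```

Both columns return 20 January 2007; only the place where the conversion happens differs.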
Cheers
David -
Start OWB Process Flow Schedule Job in Workflow Manager
I am trying to get an Oracle Workflow schedule to execute my Oracle Warehouse Builder (OWB) process flow. Please note that executing the process flow from the OWB Deployment Manager works fine. I did the following, but nothing gets executed:
Info:
OWB: 10g
Oracle DB: 10g
Oracle Workflow: 2.6.3
OracleAS: NONE
1. From the Oracle Workflow Manager browser, click the Background Engines->Submit New
2. Insert the following parameter:
Work Item Type: LD_STG
Run Date and Time: 10/18/2004 17:40:00
Run Every: 0 days, 00 hours, 01 minutes, 00 seconds
3. Click ok and system showed "Successfully submitted Background Engine with job number 107"
However, after the specified schedule time, nothing happened. No error was shown in the Work Item.
Any help is greatly appreciated. Thanks.
This is an Oracle Workflow client problem. Can you post this to the workflow forum (Workflow
Regards:
Igor -
Process Flow - mapping not inserting data
I have a process flow that executes three mappings in sequence i.e Map1 -> Map2 -> Map3.
However, sometimes the process flow will execute and no data is inserted by MAP2; other times it runs fine. If I run the mappings directly, data is always inserted.
The only workaround I've found is to add a WAIT of 10sec between Map1 and Map2 in the process flow.
Has anyone come across anything like this and how did you work around or fix it?
OWB 10.2.0.3 btw.
Cheers
Si
PS. transitions are ok.Hi
If you have any map inputs make sure they are also declared in the processflow map - you basicly need to bind them.
You probably need to investigate from the beginning to understand your result of incoherent data.
What I would to first is.
1. Generete an intermediete code - choose your table as outgroup.
2. Review the code - are there any loading hints ?
3. Debug the map
Check your locations.
If everything seems right then the problem should lay on your workflow manager.
Remember that the workflow manager has only one namespace, so watchout for mapping name. I had strange executions on my PF because one of my maps had a "½" in the name.
Check the settings of the map - if you have enterd a set based run in your map and a row based run in the processflow.
Remember the setting in the processflow is the "superior" one
Just ideas what you can do.
Cheers -
Process Flow Execution of Mapping results in ORA-01403: no data found
Hi,
When the mapping is run standalone it runs fine, but when run through a process flow it generates ORA-01403 on a FILTER within the mapping.
What should the debug steps be? The RTE_ROWKEY is given as 0 and does not help.
The OWB version is 9.2.0.2.8.
- Jojo
So, when I submit the form, the APEX error I see is:
ORA-01403: no data found
Error: Unable to process row of table fRQ (the debug output was also posted in an earlier mail).
Thanks
Edited by: skomar on Jan 29, 2011 5:39 PM -
Cash flow start date and first posting date in REFX-RECN contract
Hi,
While reviewing old contracts, I have noticed that in contracts where the first posting date is filled in, the cash flow start date is not editable on the contract change screen. In contracts where the first posting date is not filled in, the cash flow start date is editable on the change screen.
Is this standard system behaviour? I have read in this forum that the cash flow start date is not editable after the first posting of the conditions.
I tried to change the field status of the cash flow start date through RECACUST, but despite setting the field status to display, the cash flow start date remains editable.
Please help on how to make the field not editable.
Sadhana
Hi Sidhharth,
Thanks a lot for your immediate reply.
I still have one query. You have written that:
"Where the first posting date is not updated, it indicates that posting of the contract is yet to be done; not a single posting has been performed, hence the cash flow date is editable."
This means the first posting date is always filled in after a condition is posted. But in our system I find that the date is blank and greyed out even after the first posting.
Could it be due to having several conditions in the contract, some with one-time posting and some with monthly posting? E.g. most of our contracts have at least two conditions: one for the deposit, which is a one-time payment, and one for rent. The first posting date is updated for the individual rent condition on the conditions tab, but the first posting date on the terms tab is blank. Even in single-condition contracts, the first posting date is not getting updated on the terms tab (the field gets greyed out once the contract is activated).
Thanks once again.
Sadhana.