BPEL process to receive CSV file in SOAP attachment
Hi,
I have a requirement where I need to develop a BPEL process that receives a CSV file in a SOAP attachment. The BPEL process should take the CSV file from the attachment and store it in a local directory. Thanks in advance.
Similar Messages
-
How to process large input CSV file with File adapter
Hi,
Could someone recommend the right BPEL way to process a large input CSV file (4 MB or more, with at least 5000 rows) with the File Adapter?
My idea is to receive data from file (poll the UX directory for new input file), transform it and then export to one output CSV file (input for other system).
I developed my process that consists of:
- File adapter partnerlink for reading data
- Receive activity with checked box to create instance
- Transform activity
- Invoke activity for writing to output CSV.
I tried this with a small input file and everything was OK, but now when I try to use the complete input file, the process doesn't start and automatically goes to the OFF state in BPEL Console.
Could I use the MaxTransactionSize parameter as in the DB adapter? Should I batch the input file, or could another way help me?
Any hint from you? I have to solve this problem by this Thursday.
Thanks,
Milan K.
This is a known issue. Martin Kleinman has posted several issues on the forum here, with a similar scenario using ESB. This can only be solved by completely tuning the BPEL application itself and throwing in big hardware.
Also, switching to the latest 10.1.3.3 version of the SOA Suite (assuming you haven't already) will show some improvements.
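For reference, the batching idea raised in the question (processing rows in chunks rather than as one 4 MB message) can be sketched generically. This is an illustrative sketch only, not the File Adapter API; the class and method names are invented:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;

public class BatchedCsvReader {
    // Stream the CSV and hand rows downstream in fixed-size chunks instead of
    // materialising all 5000+ rows as a single message. Returns the number of
    // chunks flushed into 'sink' (which stands in for the downstream system).
    public static int processInBatches(Reader source, int batchSize,
                                       List<List<String>> sink) throws IOException {
        BufferedReader in = new BufferedReader(source);
        List<String> batch = new ArrayList<>();
        int batches = 0;
        String line;
        while ((line = in.readLine()) != null) {
            batch.add(line);
            if (batch.size() == batchSize) {
                sink.add(new ArrayList<>(batch)); // flush one chunk downstream
                batch.clear();
                batches++;
            }
        }
        if (!batch.isEmpty()) {                   // flush the final, partial chunk
            sink.add(new ArrayList<>(batch));
            batches++;
        }
        return batches;
    }
}
```

With 5000 rows and a batch size of 500, the downstream invoke would fire 10 times on small payloads instead of once on a huge one, which is the same effect debatching aims for.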
HTH,
Bas -
Two senders and One receiver for File to SOAP scenario
Dear Friends,
I got a new requirement: we have two different ECC systems, and from there they will send two different files to XI; XI will send the same file to the receiver, but we have only one receiver.
For the SOAP adapter they provided only one web service URL.
Please suggest how to handle multiple senders and a single receiver.
Regards,
Shalini Shah>
shalini shah wrote:
> Dear Madhu,
>
> I am receiving the files from two different ECC servers.
> ECC1 will send the files at different time
> ECC2 will send the files at different time
> but receiver webservice is only one .
> two sender communication channels are required for pointing to two different ECC servers, and one receiver CC is for the receiver purpose.
>
>
> Regards,
> Shalini Shah
Use BPM; it is not possible to collect the messages otherwise!
Extract BPEL process from the JAR file deployed to BPEL Server
Hello All-
We have a BPEL process deployed in our production environment but unfortunately we do not have a back-up copy in our test environment.
I got the BPEL deployment JAR file from the production server but I am unable to extract the BPEL Process as well as the XSL file.
I used WinRAR to extract it, but it reported that the JAR file is not a valid archive. We need the prod-deployed version of the XSL and the BPEL process for a change, and we are unable to proceed.
If anybody has faced a similar kind of issue, please let me know how it can be resolved. Also please let me know if there are any tools which can extract the files.
Please note that we are on BPEL 10.1.2.0.2.
Appreciate your help and thanks in advance.
Thanks,
Dibya
Hi Dibya,
jar -xvf <filename> will work as others said.
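If the jar tool is not at hand, the same extraction can be done with a few lines of JDK code, since a BPEL deployment JAR is an ordinary ZIP archive. A hedged sketch (the class and method names are invented):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class JarExtract {
    // Unpack every entry of a JAR/ZIP archive into targetDir; the JDK can
    // read the archive even when a desktop archiver mis-detects the format.
    // Returns the number of files written.
    public static int extract(File jar, File targetDir) throws IOException {
        int files = 0;
        try (ZipInputStream zin = new ZipInputStream(new FileInputStream(jar))) {
            ZipEntry entry;
            while ((entry = zin.getNextEntry()) != null) {
                File out = new File(targetDir, entry.getName());
                if (entry.isDirectory()) {
                    out.mkdirs();
                    continue;
                }
                out.getParentFile().mkdirs();   // create bpel/, xsl/ etc. as needed
                try (FileOutputStream fos = new FileOutputStream(out)) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = zin.read(buf)) > 0) {
                        fos.write(buf, 0, n);
                    }
                }
                files++;
            }
        }
        return files;
    }
}
```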
However, please make sure you have another TEST/DEV environment running on par with PROD.
It is always suggested to test any new patches/config changes on the TEST/DEV environment first, then migrate them appropriately to the actual PROD environment.
Also, always take a complete backup before you do any R&D on the SOA server.
Regards
A -
SOAP Receiver Acknowledgements (File XI SOAP)
Hello Friends,
Can someone help me with this issue? I want to suppress all acknowledgements on the receiver SOAP adapter. Is there any how-to guide for SOAP acknowledgements?
By the way, I have configured
SXMB_ADM > Runtime > ACK_SYSTEM_FAILURE > 0
but nothing changed and I get the same error (see below):
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
- <!-- Technical Routing of Response
-->
- <SAP:Error xmlns:SAP="http://sap.com/xi/XI/Message/30" xmlns:SOAP="http://schemas.xmlsoap.org/soap/envelope/" SOAP:mustUnderstand="">
<SAP:Category>XIServer</SAP:Category>
<SAP:Code area="OUTBINDING">CO_TXT_ROUTING_BACK_ERROR</SAP:Code>
<SAP:P1>,XXXXXXX</SAP:P1>
<SAP:P2>XXXXXXX,XXXXXX,,</SAP:P2>
<SAP:P3 />
<SAP:P4 />
<SAP:AdditionalText />
<SAP:ApplicationFaultMessage namespace="" />
<SAP:Stack>An error occurred during back-routing; an error occurred in the communication channel.</SAP:Stack>
<SAP:Retry>M</SAP:Retry>
</SAP:Error>
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
- <!-- Technical Routing of Response
-->
- <SAP:Ack xmlns:SAP="http://sap.com/xi/XI/Message/30" xmlns:SOAP="http://schemas.xmlsoap.org/soap/envelope/" SOAP:mustUnderstand="1">
<SAP:Status>AckRequestNotSupported</SAP:Status>
<SAP:Category>permanent</SAP:Category>
</SAP:Ack>
Thanks,
Fth
Hi Fatih,
Check if these links can help you:
JDBC Adapter / Acknowledgements
ALEAUD not coming to SAP from XI & "Acknowledgment not possible" in idx5
**Reward if helpful** -
File Adapter BPEL Process getting switched off
The file adapter BPEL process reads a CSV file, which has a series of records in it, from /xfer/chroot/data/aramex/accountUpdate/files. In between reading the files, the BPEL process gets switched off. The snippet below is the error we found in the domain.log. Can anybody please suggest what to do?
<2010-11-25 16:22:28,025> <WARN> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound>
java.io.FileNotFoundException: /xfer/chroot/data/aramex/accountUpdate/files/VFQ-251120101_1000.csv (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:106)
at oracle.tip.adapter.file.FileUtil.copyFile(FileUtil.java:947)
at oracle.tip.adapter.file.inbound.ProcessWork.defaultArchive(ProcessWork.java:2341)
at oracle.tip.adapter.file.inbound.ProcessWork.doneProcessing(ProcessWork.java:614)
at oracle.tip.adapter.file.inbound.ProcessWork.processMessages(ProcessWork.java:445)
at oracle.tip.adapter.file.inbound.ProcessWork.run(ProcessWork.java:227)
at oracle.tip.adapter.fw.jca.work.WorkerJob.go(WorkerJob.java:51)
at oracle.tip.adapter.fw.common.ThreadPool.run(ThreadPool.java:280)
at java.lang.Thread.run(Thread.java:619)
<2010-11-25 16:22:28,025> <INFO> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound> Processer thread calling onFatalError with exception /xfer/chroot/data/aramex/accountUpdate/files/VFQ-251120101_1000.csv (No such file or directory)
<2010-11-25 16:22:28,025> <FATAL> <PreActivation.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(root)]Resource Adapter requested Process shutdown!
<2010-11-25 16:22:28,025> <INFO> <PreActivation.collaxa.cube.activation> <AdapterFramework::Inbound> Adapter Framework instance: OraBPEL - performing endpointDeactivation for portType=Read_ptt, operation=Read
<2010-11-25 16:22:28,025> <INFO> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound> Endpoint De-activation called in adapter for endpoint : /xfer/chroot/data/aramex/accountUpdate/files/
<2010-11-25 16:22:28,095> <WARN> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound> ProcessWork::Delete failed, the operation will be retried for max of [2] times
<2010-11-25 16:22:28,095> <WARN> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound>
ORABPEL-11042
File deletion failed.
File : /xfer/chroot/data/aramex/accountUpdate/files/VFQ-251120101_1000.csv as it does not exist. could not be deleted.
Delete the file and restart server. Contact oracle support if error is not fixable.
at oracle.tip.adapter.file.FileUtil.deleteFile(FileUtil.java:279)
at oracle.tip.adapter.file.FileUtil.deleteFile(FileUtil.java:177)
at oracle.tip.adapter.file.FileAgent.deleteFile(FileAgent.java:223)
at oracle.tip.adapter.file.inbound.FileSource.deleteFile(FileSource.java:245)
at oracle.tip.adapter.file.inbound.ProcessWork.doneProcessing(ProcessWork.java:655)
at oracle.tip.adapter.file.inbound.ProcessWork.processMessages(ProcessWork.java:445)
at oracle.tip.adapter.file.inbound.ProcessWork.run(ProcessWork.java:227)
at oracle.tip.adapter.fw.jca.work.WorkerJob.go(WorkerJob.java:51)
at oracle.tip.adapter.fw.common.ThreadPool.run(ThreadPool.java:280)
at java.lang.Thread.run(Thread.java:619)
<2010-11-25 16:22:28,315> <ERROR> <PreActivation.collaxa.cube> <BaseCubeSessionBean::logError> Error while invoking bean "cube delivery": Process state off.
The process class "BulkAccountUpdateFileConsumer" (revision "1.0" ) has not been turned on. No operations on the process or any instances belonging to the process may be performed if the process is off.
Please consult your administrator if this process has been turned off inadvertently.
This patch is not for 10.1.3.1.
I have provided a response on the following post:
BPEL Process Going into Dead State Automatically.
cheers
James -
Csv file processing using external table
Dear All,
Our database is Oracle 10g R2 and the OS is Solaris.
We would receive csv files to a particular directory on server each day.
File Format look like:
H00,SOURCE_NAME,FILE_CREATED_DATE
RECORD_TYPE,EMP_ID,EMP_NAME,EMP_DOB(DDMMYYYY),EMP_HIRE_DATE(DDMMYYYY),EMP_LOCATION
T00,RECORD_COUNT
EMPLOYEE TABLE STRUCTURE
EMP_ID NOT NULL NUMBER ,
EMP_NAME NOT NULL VARCHAR2(10) ,
EMP_DOB DATE,
EMP_HIRE_DATE NOT NULL DATE,
EMP_LOCATION VARCHAR2(80)
Sample File:
H00,ABC,21092013
"R01",1,"EMP1","14021986","06072010","LOC1"
"R01",20000000000,"EMP2","14021-987","06072011",""
,***,"EMPPPPPPPPPPP3","14021988","060**012","LOC2"
"R01",4,4,"14021989","06072013",
T00,4
We need to validate each record, excluding the header and trailer, for:
DATATYPE, LENGTH, OPTIONALITY, and other date validations, such as EMP_HIRE_DATE cannot be less than EMP_DOB.
In case of any data errors we need to send a response file for the corresponding source file.
We have predefined error codes to be sent in the response file.
ERR001 EMP_ID can not be null
ERR002 EMP_ID exceeds 10 digits
ERR003 EMP_ID is not a number
ERR004 EMP_NAME has to be text
ERR005 EMP_NAME length can not exceed 10
ERR006 EMP_NAME can not be null
ERR007 EMP_DOB is not a date
ERR008 EMP_DOB is not in ddmmyyyy format
ERR009 EMP_HIRE_DATE is not a date
ERR010 EMP_HIRE_DATE is not in ddmmyyyy format
ERR011 EMP_HIRE_DATE can not be null
ERR012 EMP_LOCATION has to be text
ERR013 EMP_LOCATION length can not exceed 80
ERR014 EMP_HIRE_DATE can not be less than EMP_DOB
ERR015 Field missing in the record
ERR016 More number of fields than allowed
1. Do I need to create an external table before processing each file (EMP1.txt, EMP2.txt)?
2. How do I generate these error codes for the respective failure scenarios and log them into an exception table?
3. The response file needs to have the entire record and a concatenation of all the error codes on the next line.
4. Which would be the better approach:
creating an external table with all CHAR(2000) fields and writing a select statement
such as select * from ext_table where (emp_id is not null and length(emp_id)<=10 and ... to select only proper data);
or creating the external table to be the same as the employee table and creating a bad file? If this is preferred, how can I generate the custom error codes?
Could you please help me in achieving this!
Warm Regards,
Shankar.
You can do a one-time creation of an external table. After that, you can either overwrite the existing text file or alter the location. In the example below I have split your original sample file into two files, in order to demonstrate altering the location.
You can make the external table all VARCHAR2 fields, as large as they need to be to accommodate all possible data. If you then create a staging table and a rejects table of the same structure, you can create a trigger on the staging table such that, when you insert from the external table into the staging table, it inserts into either the employee table or the rejects table with the concatenated error list.
If you want this in a file, then you can just spool a select from the rejects table or use some other method such as utl_file. The following is a partial example. Another alternative would be to use SQL*Loader to load directly into the staging table without an external table.
SCOTT@orcl12c> HOST TYPE emp1.txt
H00,ABC,21092013
"R01",1,"EMP1","14021986","06072010","LOC1"
"R01",20000000000,"EMP2","14021-987","06072011",""
T00,4
SCOTT@orcl12c> HOST TYPE emp2.txt
H00,ABC,21092013
,***,"EMPPPPPPPPPPP3","14021988","060**012","LOC2"
"R01",4,4,"14021989","06072013",
T00,4
SCOTT@orcl12c> CREATE OR REPLACE DIRECTORY my_dir AS 'c:\my_oracle_files'
2 /
Directory created.
SCOTT@orcl12c> CREATE TABLE external_table
2 (record_type VARCHAR2(10),
3 emp_id VARCHAR2(11),
4 emp_name VARCHAR2(14),
5 emp_dob VARCHAR2(10),
6 emp_hire_date VARCHAR2(10),
7 emp_location VARCHAR2(80))
8 ORGANIZATION external
9 (TYPE oracle_loader
10 DEFAULT DIRECTORY my_dir
11 ACCESS PARAMETERS
12 (RECORDS DELIMITED BY NEWLINE
13 LOAD WHEN ((1:3) != 'H00' AND (1:3) != 'T00')
14 LOGFILE 'test.log'
15 FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
16 MISSING FIELD VALUES ARE NULL
17 REJECT ROWS WITH ALL NULL FIELDS
18 (record_type, emp_id, emp_name, emp_dob, emp_hire_date, emp_location))
19 LOCATION ('emp1.txt'))
20 /
Table created.
SCOTT@orcl12c> CREATE TABLE staging
2 (record_type VARCHAR2(10),
3 emp_id VARCHAR2(11),
4 emp_name VARCHAR2(14),
5 emp_dob VARCHAR2(10),
6 emp_hire_date VARCHAR2(10),
7 emp_location VARCHAR2(80))
8 /
Table created.
SCOTT@orcl12c> CREATE TABLE employee
2 (emp_id NUMBER NOT NULL,
3 emp_name VARCHAR2(10) NOT NULL,
4 emp_dob DATE,
5 emp_hire_date DATE,
6 emp_location VARCHAR2(80))
7 /
Table created.
SCOTT@orcl12c> CREATE TABLE rejects
2 (record_type VARCHAR2(10),
3 emp_id VARCHAR2(11),
4 emp_name VARCHAR2(14),
5 emp_dob VARCHAR2(10),
6 emp_hire_date VARCHAR2(10),
7 emp_location VARCHAR2(80),
8 error_codes VARCHAR2(4000))
9 /
Table created.
SCOTT@orcl12c> CREATE OR REPLACE TRIGGER staging_air
2 AFTER INSERT ON staging
3 FOR EACH ROW
4 DECLARE
5 v_rejects NUMBER := 0;
6 v_error_codes VARCHAR2(4000);
7 v_num NUMBER;
8 v_dob DATE;
9 v_hire DATE;
10 BEGIN
11 IF :NEW.emp_id IS NULL THEN
12 v_rejects := v_rejects + 1;
13 v_error_codes := v_error_codes || ',' || 'ERR001';
14 ELSIF LENGTH (:NEW.emp_id) > 10 THEN
15 v_rejects := v_rejects + 1;
16 v_error_codes := v_error_codes || ',' || 'ERR002';
17 END IF;
18 BEGIN
19 v_num := TO_NUMBER (:NEW.emp_id);
20 EXCEPTION
21 WHEN value_error THEN
22 v_rejects := v_rejects + 1;
23 v_error_codes := v_error_codes || ',' || 'ERR003';
24 END;
25 IF :NEW.emp_name IS NULL THEN
26 v_rejects := v_rejects + 1;
27 v_error_codes := v_error_codes || ',' || 'ERR006';
28 ELSIF LENGTH (:NEW.emp_name) > 10 THEN
29 v_rejects := v_rejects + 1;
30 v_error_codes := v_error_codes || ',' || 'ERR005';
31 END IF;
32 BEGIN
33 v_dob := TO_DATE (:NEW.emp_dob, 'ddmmyyyy');
34 EXCEPTION
35 WHEN OTHERS THEN
36 v_rejects := v_rejects + 1;
37 v_error_codes := v_error_codes || ',' || 'ERR008';
38 END;
39 BEGIN
40 IF :NEW.emp_hire_date IS NULL THEN
41 v_rejects := v_rejects + 1;
42 v_error_codes := v_error_codes || ',' || 'ERR011';
43 ELSE
44 v_hire := TO_DATE (:NEW.emp_hire_date, 'ddmmyyyy');
45 END IF;
46 EXCEPTION
47 WHEN OTHERS THEN
48 v_rejects := v_rejects + 1;
49 v_error_codes := v_error_codes || ',' || 'ERR010';
50 END;
51 IF LENGTH (:NEW.emp_location) > 80 THEN
52 v_rejects := v_rejects + 1;
53 v_error_codes := v_error_codes || ',' || 'ERR013';
54 END IF;
55 IF v_hire IS NOT NULL AND v_dob IS NOT NULL AND v_hire < v_dob THEN
56 v_rejects := v_rejects + 1;
57 v_error_codes := v_error_codes || ',' || 'ERR014';
58 END IF;
59 IF :NEW.emp_id IS NULL OR :NEW.emp_name IS NULL OR :NEW.emp_dob IS NULL
60 OR :NEW.emp_hire_date IS NULL OR :NEW.emp_location IS NULL THEN
61 v_rejects := v_rejects + 1;
62 v_error_codes := v_error_codes || ',' || 'ERR015';
63 END IF;
64 IF v_rejects = 0 THEN
65 INSERT INTO employee (emp_id, emp_name, emp_dob, emp_hire_date, emp_location)
66 VALUES (:NEW.emp_id, :NEW.emp_name,
67 TO_DATE (:NEW.emp_dob, 'ddmmyyyy'), TO_DATE (:NEW.emp_hire_date, 'ddmmyyyy'),
68 :NEW.emp_location);
69 ELSE
70 v_error_codes := LTRIM (v_error_codes, ',');
71 INSERT INTO rejects
72 (record_type,
73 emp_id, emp_name, emp_dob, emp_hire_date, emp_location,
74 error_codes)
75 VALUES
76 (:NEW.record_type,
77 :NEW.emp_id, :NEW.emp_name, :NEW.emp_dob, :NEW.emp_hire_date, :NEW.emp_location,
78 v_error_codes);
79 END IF;
80 END staging_air;
81 /
Trigger created.
SCOTT@orcl12c> SHOW ERRORS
No errors.
SCOTT@orcl12c> INSERT INTO staging SELECT * FROM external_table
2 /
2 rows created.
SCOTT@orcl12c> ALTER TABLE external_table LOCATION ('emp2.txt')
2 /
Table altered.
SCOTT@orcl12c> INSERT INTO staging SELECT * FROM external_table
2 /
2 rows created.
SCOTT@orcl12c> SELECT * FROM employee
2 /
EMP_ID EMP_NAME EMP_DOB EMP_HIRE_DATE EMP_LOCATION
1 EMP1 Fri 14-Feb-1986 Tue 06-Jul-2010 LOC1
1 row selected.
SCOTT@orcl12c> COLUMN error_codes NEWLINE
SCOTT@orcl12c> SELECT * FROM rejects
2 /
RECORD_TYP EMP_ID EMP_NAME EMP_DOB EMP_HIRE_D EMP_LOCATION
ERROR_CODES
R01 20000000000 EMP2 14021-987 06072011
ERR002,ERR008,ERR015
*** EMPPPPPPPPPPP3 14021988 060**012 LOC2
ERR003,ERR005,ERR010
R01 4 4 14021989 06072013
ERR015
3 rows selected. -
Processing Several Records in a CSV File
Hello Experts!
I'm currently using XI to process an incoming CSV file containing Accounts Payable information. The data from the CSV file is used to call a BAPI in the ECC (BAPI_ACC_DOCUMENT_POST). Return messages are written to a text file. I'm also using BPM. So far, I've been able to get everything to work: financial documents are successfully created in the ECC, and I also receive the success message in my return text file.
I am, however, having one small problem... No matter how many records are in the CSV file, XI only processes the very first record. So my question is this: Why isn't XI processing all the records? Do I need a loop in my BPM? Are there occurrence settings that I'm missing? I kinda figured XI would simply process each record in the file.
Also, are there some good examples out there that show me how this is done?
Thanks a lot, I more than appreciate any help!
Matthew,
First let me explain the BPM Steps,
Recv -> Transformation1 -> For-Each Block -> Transformation2 -> Synch Call -> Container (to append the response from BAPI) -> Transformation3 -> Send
Transformation3 and Send must be outside Block.
Transformation1
Here, the source and target must be the same. I think you already know how to split the messages; if not, see the example below.
Source
<MT_Input>
<Records>
<Field1>Value1</Field1>
<Field2>Value1</Field2>
</Records>
<Records>
<Field1>Value2</Field1>
<Field2>Value2</Field2>
</Records>
<Records>
<Field1>Value3</Field1>
<Field3>Value3</Field3>
</Records>
</MT_Input>
Now, I need to split the messages for each Records element, so what do I do?
In Message Mapping, choose the source and target as the same, and in the Messages tab, choose the target occurrence as 0..unbounded.
Now,if you come to Mapping tab, you can see Messages tag added to your structure, and also you can see <MT_Input> occurrence will be changed to 0..unbounded.
Here is the logic now
Map Records to MT_INPUT
Constant(empty) to Records
Map the rest of the fields directly. Now your output looks like:
<Messages>
<Message1>
<MT_Input>
<Records>
<Field1>Value1</Field1>
<Field2>Value1</Field2>
</Records>
</MT_Input>
<MT_Input>
<Records>
<Field1>Value2</Field1>
<Field2>Value2</Field2>
</Records>
</MT_Input>
<MT_Input>
<Records>
<Field1>Value3</Field1>
<Field3>Value3</Field3>
</Records>
</MT_Input>
</Message1>
</Messages>
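Outside of XI, the effect of that 0..unbounded target occurrence can be mimicked in plain Java: one MT_Input document with N Records elements becomes N single-record MT_Input documents. A rough illustration with the JDK's DOM API (the class name is mine; this mirrors the idea, not the XI runtime):

```java
import java.io.StringReader;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class MessageSplitter {
    // Turn one MT_Input holding N <Records> into N MT_Input documents
    // holding one <Records> each, i.e. a 1:n message split.
    public static List<String> split(String mtInput) throws Exception {
        Document src = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(mtInput)));
        NodeList records = src.getElementsByTagName("Records");
        List<String> messages = new ArrayList<>();
        TransformerFactory tf = TransformerFactory.newInstance();
        for (int i = 0; i < records.getLength(); i++) {
            Document out = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = out.createElement("MT_Input");
            out.appendChild(root);
            // importNode copies the record into the new single-record document
            root.appendChild(out.importNode(records.item(i), true));
            StringWriter w = new StringWriter();
            tf.newTransformer().transform(new DOMSource(out), new StreamResult(w));
            messages.add(w.toString());
        }
        return messages;
    }
}
```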
raj. -
How to design multiple file polling BPEL process?
Hi all,
I am trying to solve/design the following in BPEL process:
- The integration process flow I am trying to design is an interaction b/n system A (FTP server), an Integration platform and system B.
- The integration platform will poll files from an internal FTP server. It has 2 Time files (“Time1” and “Time2”) alternatives to poll from the FTP server. It will then transform the message/information inside the files (time information) to an internal XML format. At the end it shall send the message to system B.
- The information inside the files is identical. The reason to have 2 files with identical information is for redundancy. The integration-platform shall only poll file “TIME1” as long as the conditions are met. If the conditions are not met, then it polls file “Time2”.
Normal succeeded scenario:
1- The integration platform polls the latest “Time1” file from the internal FTP server. The polling is done every 15th minute.
2- The integration platform logs to BAM and reports that the file is received.
3- The platform then converts the message to an internal format.
4- The platform sends the converted message/time information to system B.
5 - The use-case ends.
Not normal scenario:
1- The platform cannot get/receive file “Time1” file.
- The platform logs to BAM and reports that the file could not be received.
- The platform polls file “Time2” from the FTP server and it does the same normal scenario as above. The use case ends.
2 - The time message/information from the Time1 file does not have full information (fewer than 24 rows). The file is supposed to have 24 rows (for 24 hours).
- The platform logs to BAM and reports that there is missing information on the message.
- The platform gets/polls instead the “Time2” file from the FTP server and it does the normal scenario.
- The use case ends.
3- The platform can find/get neither file “Time1” nor “Time2”.
- The platform alarms to BAM that it could not get/find the files.
I am considering either the pick activity or the switch activity solution. Does any one of you have an idea how to design this in an Oracle BPEL process?
Thanks.
Not sure I understand your use case completely, so if I'm off track I apologise.
I think there are 3 ways to solve this. The 3rd may be closest to what you are looking for.
1. Have 2 processes that read into a common area, e.g. a database, where you can compare. From there you can choose which file to pick. This is high level, but I think it would be difficult to implement.
2. Implement a combination of a rejected-message handler and error processing within the BPEL process itself. The rejected-message handler can call another BPEL process if the time1 file is in an invalid state. That process will then read the time2 file. The issue with this is that the rejected-message handler will only get invoked when the file doesn't conform to the schema.
3. Use the Quartz scheduler, which invokes a BPEL process every 15 minutes. Use the file adapter to perform a synchronous read of time1 and analyse it; if that fails, read time2 using a synchronous read. Then perform the appropriate action if this also fails.
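Stripped of the adapter specifics, the read-time1-else-time2 logic in option 3 is an ordered fallback over two files plus a completeness check (24 rows in this use case). A hedged Java sketch, with the BAM logging reduced to comments and all names invented:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class TimeFileFallback {
    // Try Time1 first; if it is missing/unreadable or has fewer than
    // expectedRows lines (24 in the use case), fall back to Time2.
    // Returns the usable lines, or null when neither file qualifies.
    public static List<String> readWithFallback(Path time1, Path time2,
                                                int expectedRows) {
        for (Path candidate : new Path[] { time1, time2 }) {
            try {
                List<String> rows = Files.readAllLines(candidate);
                if (rows.size() >= expectedRows) {
                    return rows;                 // complete file: use it
                }
                // incomplete file: a BAM "missing information" log would go here
            } catch (IOException e) {
                // file absent/unreadable: a BAM "not received" log would go here
            }
        }
        return null; // neither Time1 nor Time2 usable: raise the alarm
    }
}
```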
Hope this gives some ideas.
cheers
James -
How to read values from Property file into BPEL process local variable?
I would like to use a Property file with some parameters e.g.
<myparm1>12345</myparm1>
How can I read such a property file from a BPEL process and assign a value from it to e.g. a local variable "intparm1"?
Where (in which directory) should I put this XML Property file to have it always available for reading?
Peter
Hi,
You can also use the
ora:readFile() function, as follows:
ora:readFile(XML file location (e.g. "file:///D:\\SOA\\FileAdapters\\readFile\\test\\test.xml"), XSD file location (e.g. "file:///D:\\SOA\\FileAdapters\\readFile\\test\\test.xsd"))
inside the assign activity, and assign the result to the variable you want.
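Whatever ora:readFile returns is ordinary XML, so picking <myparm1> out of the property file is just a node lookup; assigning its text value to intparm1 is then a normal copy rule. The lookup, sketched in plain Java (the element name myparm1 comes from the question; the class name is illustrative):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class PropertyReader {
    // Parse an XML property file and return the text content of the first
    // element with the given name, or null when the element is absent.
    public static String readProperty(String xml, String name) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        NodeList nodes = doc.getElementsByTagName(name);
        return nodes.getLength() > 0 ? nodes.item(0).getTextContent() : null;
    }
}
```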
regards
arababah
Edited by: arababah on Aug 10, 2009 12:55 AM -
How many different ways can we call a BPEL process
Hi Guys,
How many ways can we call a BPEL process?
I know 3 ways:
1) through PL/SQL, 2) using the Java API, 3) from another BPEL process.
Are there any other ways? If so, what are they?
Thanks
Tom...
Hi Tom,
your normal BPEL process (a regular process you create with the "new BPEL process" wizard), can be configured in Oracle application server to be accessible
- via SOAP over HTTP or
- via JMS
since it has a WSDL-based service interface.
JDeveloper additionally allows more options for starting a BPEL process, such as a file-based interface, an e-mail as a starter, etc.
Still, I like to see the ESB as a virtualizer of the endpoint protocol, so I suggest you model the BPEL process regularly, exposing its functionality through WSDL and use ESB services that wrap your BPEL process to deal with the actual endpoints.
Cheers,
Hajo -
Replace String in csv file with file adapter content conversion
Hello experts,
I have a sender file channel to receive CSV files from an external server. I can process the CSV files, but I have a little problem: the separator in the CSV file is a comma, and sometimes a comma also appears in the data, where it is not a separator.
Example:
1234,20120123,ABCD,customer, which has a comma in it's name,...
5678,20120123,FGHI,customer without comma,...
My plan is to remove all commas when a blank follows.
My current content conversion looks like this:
Record.fieldNames field1,field2,...
Record.fieldSeparator ,
Record.endSeparator 'nl'
Is there something like Record.fieldReplaceString or something similar which I can use? Or is there perhaps a better way to do this?
I'd appreciate every help I can get here.
Best regards.
Oliver.
Hi Oliver,
I think I've got a solution for you after all. All you need to do is have your customer name enclosed in quotes, like this:
1234,20120123,ABCD,"customer, which has a comma in it's name",...
Use the rest of the FCC configuration as usual. You might also want to have a look at this wiki entry for a configuration example:
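The reason the enclosure works: the parser treats a comma as a field separator only while outside the enclosure sign. The same rule written out in plain Java, as a generic illustration rather than the actual FCC implementation:

```java
import java.util.ArrayList;
import java.util.List;

public class QuotedCsvSplit {
    // Split a CSV line on commas, but only when outside double quotes,
    // so "customer, which has a comma in it's name" stays one field.
    public static List<String> split(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuotes = false;
        for (char c : line.toCharArray()) {
            if (c == '"') {
                inQuotes = !inQuotes;          // toggle the enclosure state
            } else if (c == ',' && !inQuotes) {
                fields.add(cur.toString());    // separator: close the field
                cur.setLength(0);
            } else {
                cur.append(c);
            }
        }
        fields.add(cur.toString());            // last field has no trailing comma
        return fields;
    }
}
```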
http://wiki.sdn.sap.com/wiki/display/XI/FileContentConversion
Moreover, see the description for "NameA.enclosureSign" in the help document below:
http://help.sap.com/saphelp_nwpi71/helpdata/en/44/6830e67f2a6d12e10000000a1553f6/frameset.htm
Hope this helps,
Greg -
File adapter SyncRead Operation adds "???" to first record of csv file
Hi All,
One interface is reading a CSV file (containing only two fields) using the SyncRead operation of the File Adapter. But I found that when the file adapter parsed the data, it appended "???" to the first field of the first record, which causes further data validation issues in the process.
Test.csv file
123456,159357
147258,987654
Trace of the FileAdapter:
<messages>
<Invoke_FILE_Inbound_InputVariable>
<part name="Empty">
<empty/>
</part>
</Invoke_FILE_Inbound_InputVariable>
<Invoke_FILE_Inbound_OutputVariable>
<part name="body">
<Root-Element>
<TestElements>
<Emplid>���123456</Emplid>
<SupId>159357</SupId>
</TestElements>
<HRRespIDElements>
<Emplid>147258</Emplid>
<SupId>987654</SupId>
</HRRespIDElements>
</Root-Element>
</part>
</Invoke_FILE_Inbound_OutputVariable>
</messages>
Can anyone help me to understand what the reason is, or whether I missed anything?
Thanks & Regards,
Sharmistha
Unfortunately, this is documented behaviour (solution/workaround, anyone?!):
Elapsed Time Exceeds:
Specify a time which, when exceeded, causes a new outgoing file to be created.
Note:
The Elapsed Time Exceeds batching criteria is evaluated and a new outgoing file is created, only when an invocation happens.
For example, if you specify that elapsed time exceeds 15 seconds, then the first message that is received is not written out, even after 15 seconds, as batching conditions are not valid. If a second message is received, then batching conditions become valid for the first one, and an output file is created when the elapsed time exceeds 15 seconds. -
How to share schemas between BPEL-processes?
Hello everyone!
I'm trying to build a simple BPEL-SOA-like application using Oracle SOA Suite. How can I share these schemas between the different processes?
For example, I have a "main" BPEL process which receives a customer as input and later has to pass this customer on to some other web service. As they should both use the same data structure (I created an XML schema definition, a complexType), I'd like to know how one would share this structure between the two processes.
Or do I really have to copy the .xsd files to every process and include them there, or define the types inline? But then I'd have to use the same namespace overall, because the types wouldn't be compatible otherwise.
Thanks for any hints and best-practices!
Jan.If you want to have a generic xsd that you can use
for multiple purposes in several independent
processes (not calling each other) you can either put
the xsd in each of them or put it on a website where
bpel can download them. Since Oracle AS comes with
Oracle HTTP server, you can put it into its htdocs or
create a virtual name server for it.
I tried to make it accessible via HTTP and then imported it in the BPEL process' XSD file. But unfortunately, BPEL Console is then not able to create a valid payload message. The form with the input fields is displayed correctly, but the XML is empty! When I use inline-defined types, everything works fine!
If you want to use some types in dependent process
(that calls the master process), there is no need to
do this, just create partnerlink and JDeveloper will
create an import for you in bpel project. Then you
can create variables of those types. In this
scenario, putting the same xsd with the same target
namespace is not recommended, as you could run into
weird errors (for example if one xsd changed without
synchronizing the other one) or it might work.
I don't really understand how this should work. Let's say I have process A and process B. Process A defines "AProcessRequest" in file A.xsd. In this type, I want to use, for example, the type "Customer". Process B also has to use the Customer type in its "BProcessRequest"... or should I rather define the types in the A.wsdl? I'm a bit lost here, sorry ;-)
Thanks and greets,
Jan. -
Sender "Mail" adapter - CSV file attachment
Hi there
I'm looking for some help in configuring a sender mail adapter that receives ".csv" files. I did read some blogs that mention using the "PayloadSwapBean" module to read the mail attachment instead of the mail content. My problem is now to convert the ".csv" file into a message. Is there a module that I can use (is it the "MessageTransformBean"?) and how? Any help would be appreciated.
Thanks
Salil
Hi Salil,
If you want to send a mail with a body and attachments, the message sender HAS to provide an XI message with attachments. I doubt a CSV file does justice.
As Renjith said, you need to convert CSV to XML.
A short description of the standard modules:
MessageTransformationBean is a standard module used to apply an XSLT mapping in the adapter module by using Transform.class. (This XSLT mapping is done to create a mail package; don't confuse it with the actual mapping. In your case this is NOT for converting CSV to XML.)
This module can also be used to change the name and type of payloads by using Transform.contentType, Transform.contentDisposition, and Transform.contentDescription.
PayloadSwapBean is a standard module for replacing payloads with other payloads (swap).
If you want to give each attachment a certain name, use the parameters swap.keyname for the name of the payload and swap.keyvalue.
I hope the use of the standard modules is understood.