Counter in Flat file
Hi All,
In my IDOC there are three item segments: E1J3AIT, E1J3AIS, and E1TXTH9. They are linked to three different structures: SI1, SI2, SI3. I am creating a flat file.
In the target structure there is one common field, counter. Now if my first segment has 3 items, my 2nd segment has 2 items, and my 3rd segment has 3 items, the counter field in my output should start from 0 and end with 8. How can I achieve this? Your help will be highly appreciated.
korobee
Shabarish
So you mean to say that for the count field I have to map my three segments and change the context? I have now done what you told me and changed the context to my IDOC, and I got errors when I displayed the queue. The errors are:
Source code has syntax error:
/usr/sap/JXD/DVEBMGS00/j2ee/cluster/server0/./temp/classpath_resolver/Map0c9da750126f11dbbfb1001125a6cdc6/source/com/sap/xi/tf/_MM_DELVRY03_to_PickTicket_.java:617: cannot resolve symbol: class integer (integer counter;)
_MM_DELVRY03_to_PickTicket_.java:3249: cannot resolve symbol: variable length, location: class java.lang.String (i = a.length;)
_MM_DELVRY03_to_PickTicket_.java:3250: cannot resolve symbol: variable length, location: class java.lang.String (j = b.length;)
_MM_DELVRY03_to_PickTicket_.java:3251: cannot resolve symbol: variable length, location: class java.lang.String (k = c.length;)
_MM_DELVRY03_to_PickTicket_.java:3256: cannot resolve symbol: variable result (result.addValue(val);)
5 errors
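For what it's worth, the compile errors themselves are plain Java issues: `integer` is not a Java type (use `int`); `length` is a field only on arrays, so if `a` is a `String` it needs `a.length()`; and `result` is predefined only in queue/context (advanced) UDFs. The counting logic the mapping needs is just one running index across the items of all three segments; a rough sketch of that logic in Python (segment contents hypothetical):

```python
# Hypothetical item lists for the three segments (E1J3AIT, E1J3AIS, E1TXTH9).
items_a = ["A1", "A2", "A3"]
items_b = ["B1", "B2"]
items_c = ["C1", "C2", "C3"]

# One running counter across all segments, starting at 0.
all_items = items_a + items_b + items_c
counters = list(range(len(all_items)))
print(counters)  # [0, 1, 2, 3, 4, 5, 6, 7]
```

In the mapping, the equivalent is feeding all three item queues into one UDF and emitting the index of each value.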
korobee
Similar Messages
-
Reg: how to count no.of records in flat file
Hi,
Very good morning to all. I am facing a problem like this; I am new to Jython code. My problem is that I want to count the number of records in a flat file before loading it to the table. Is there an easier way to count the number of records in the file than Jython?
please share immediately.
Regards,
sh
Hi actdi,
Thanks for the reply. I am working in a Windows environment, and my previous project's developers used code like the one
below:
filesrc = open('#filename', 'r')
first = filesrc.readline()
lines = 0
while first:
    lines += 1
    first = filesrc.readline()
filesrc.close()
s1 = str(lines)
s2 = ' in the file ------'
s3 = '#filename'
final = s1 + s2 + s3
raise ' \n\n The Number of Lines in the File are ---', final
They wrote this code in a procedure, but I have a doubt about it: where does it finally store the row count?
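To answer the doubt: the snippet keeps the count only in the local variable `lines` (and in the raised exception text); nothing is persisted unless you write it somewhere. A shorter, equivalent count in plain Python (file name hypothetical):

```python
# Count records in a flat file without loading it all into memory.
def count_lines(path):
    with open(path, "r") as f:
        return sum(1 for _ in f)
```

The count could then be stored in a variable or logged to a table rather than raised as an exception.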
Kindly share any solutions regarding this in a Windows environment.
Regards,
sh. -
Loading Multiple Flat File in to BI System with one InfoPackage
Greetings,
I have developed a routine to load multiple flat files in one InfoPackage, but it doesn't work. Only the last file is loaded. Who can help me?
Loading 'R:\Verzeichnis \ing_wan_b_002.fall.csv' to 'R:\Verzeichnis \ing_wan_b_240*.fall.csv'
Routine:
DATA: VerzUndDateiname TYPE string VALUE 'R:\Verzeichnis \ing_wan_b_000.fall.csv',
      VerzUndDateinameKomplett TYPE string,
      count TYPE i VALUE 2,
      countN(3) TYPE n.
WHILE count <= 240.
  countN = count.
  VerzUndDateinameKomplett = VerzUndDateiname.
  REPLACE '000' WITH countN
    INTO VerzUndDateinameKomplett.
  p_filename = VerzUndDateinameKomplett.
  count = count + 1.
ENDWHILE.
Best Regards
Jens
Edited by: JB6493 on May 18, 2009 1:03 PM
Edited by: JB6493 on May 18, 2009 1:07 PM
Hello Jens,
you have to process the InfoPackage 239 times. Your routine is executed only once per run of the InfoPackage; the WHILE loop runs to completion and ends up with the last filename after 239 iterations.
Either try to concatenate the files you want to load (if possible) and load them in one go. Alternatively you could try to use a process chain to run the InfoPackage 239 times.
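The underlying effect is easy to see in any language: a loop that keeps reassigning a single variable leaves only its last value behind, which is why only file 240 gets loaded. A sketch (filename pattern as in the post, path shortened):

```python
# The routine's loop, reduced to its effect: p_filename is overwritten on
# every pass, so only the final name (the ...240... file) survives.
p_filename = None
for count in range(2, 241):
    p_filename = "ing_wan_b_%03d.fall.csv" % count
print(p_filename)  # ing_wan_b_240.fall.csv
```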
Kind regards,
Christoph -
I am attempting to perform a fairly standard operation, extract a table to a flat file.
I have set all the schemas, models and interfaces, and the file is produced how I want it, apart from one thing. In the source, one field is 100 characters long, and in the output, it needs to be 25.
I have set the destination model to have a column physical and logical length of 25.
Looking at the documentation presented at http://docs.oracle.com/cd/E25054_01/integrate.1111/e12644/files.htm - this suggests that setting the file driver up to truncate fields should solve the issue.
However, building a new file driver using the string 'jdbc:snps:dbfile?TRUNC_FIXED_STRINGS=TRUE&TRUNC_DEL_STRINGS=TRUE' does not appear to truncate the output.
I noticed a discrepancy in the documentation - the page above notes 'Truncates strings to the field size for fixed files'. The help tooltip in ODI notes 'Truncates the strings from the fixed files to the field size'. Which might explain the observed lack of truncation.
My question is - what is the way to enforce field sizes in a flat file output?
I could truncate the fields separately in each of the mapping statements using substr, but that seems counter-intuitive and loses the benefits of the tool.
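If the driver option doesn't take effect, the per-column truncation mentioned above is at least mechanical; a sketch of what a substr-style mapping does (widths hypothetical):

```python
# Truncate each output field to its declared width, as substr() would in
# the mapping (25 here, matching the target column definition).
def truncate_field(value, width=25):
    return value[:width]

fields = ["x" * 100, "short"]
fixed = [truncate_field(v) for v in fields]
```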
Using ODI Version:
Standalone Edition Version 11.1.1 - Build ODI_11.1.1.5.0_GENERIC_110422.1001
bump
If this is an elementary issue, please let me know what I've missed in the manual. -
Reading data from flat file Using TEXT_IO
Dear Gurus
I already posted this question, but this time I need some other changes. Sorry for that.
I am using 10G forms and using TEXT_IO for reading data from flat file ..
My data is like this :-
0|BP-V1|20100928|01|1|2430962.89|27|2430962.89|MUR|20100928120106
9|2430962.89|000111111111|
1|61304.88|000014104113|
1|41961.73|000022096086|
1|38475.65|000023640081|
1|49749.34|000032133154|
1|35572.46|000033093377|
1|246671.01|000042148111|
Here each column is separated by '|'. I want to read all the columns and do some validation on them.
How can I do that?
Initially my requirement was to read only 2 or 3 columns so i did like this ...
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle utl_file.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr( lfile_name , '.*\\' ),'\'); -- For the Path
v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
v_handle := UTL_FILE.fopen (v_path, v_file, 'r');
LOOP
UTL_FILE.get_line (v_handle, v_filebuffer);
IF SUBSTR (v_filebuffer, 0, 1) = '0' THEN
SELECT line_0 INTO line_0_date
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_0 INTO line_0_Purp
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 4;
SELECT line_0 INTO line_0_count
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 7;
SELECT line_0 INTO line_0_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 8;
SELECT line_0 INTO line_0_ccy
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 9;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '9' THEN
SELECT line_9 INTO line_9_Acc_no
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_9 INTO line_9_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 2;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '1' THEN
line_1_flag := line_1_flag+1;
SELECT line_1 INTO line_1_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_1, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
Line_1_tot := Line_1_tot + line_1_sum;
END IF;
END LOOP;
DBMS_OUTPUT.put_line (Line_1_tot);
DBMS_OUTPUT.PUT_LINE (Line_1_flag);
UTL_FILE.fclose (v_handle);
END;
But now how can I do it? Should I use a SELECT statement like this for all the columns?
Sorry for that.
As per our requirement ...
I need to read the flat file and it looks like like this .
*0|BP-V1|20100928|01|1|2430962.89|9|2430962.89|MUR|20100928120106*
*9|2430962.89|000111111111|*
*1|61304.88|000014104113|*
*1|41961.73|000022096086|*
*1|38475.65|000023640081|*
*1|49749.34|000032133154|*
*1|35572.46|000033093377|*
*1|246671.01|000042148111|*
*1|120737.25|000053101979|*
*1|151898.79|000082139768|*
*1|84182.34|000082485593|*
I have to check the file :-
The validations are: the 1st line should start with 0; otherwise raise an error and insert that error into a table.
The same for the 2nd line: it should start with 9, otherwise raise an error and insert that error into a table.
Then the 3rd line onward should start with 1, otherwise raise an error and insert that error into a table.
After that I validate the 1st line's 2nd column: it should be BP-V1, otherwise raise an error and insert it into a table. Then I check the 3rd column, which is 20100928; it should be in YYYYMMDD format, otherwise the same error handling.
Each of the remaining columns has its own validation like this.
Then I check the 2nd line's 3rd column, which is an account number: first it should be 12 characters, otherwise an error. Then I compare it with what the user entered in the form. For example, if the user entered 111111111, I compare it with the 000111111111 in the 2nd line; I have to prefix the account number the user entered with 000.
Then, for all the lines starting with 1, I take the 2nd column and sum the values. I compare that sum with the 6th column of the 1st line (the one starting with 0). It should be the same, otherwise an error.
In the same way I count all the lines starting with 1 and compare the count with the 7th column of the 1st line. It should be the same; in this file it should be 9.
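A sketch of those checks in Python, to pin down the rules before coding them in PL/SQL (column positions as described above; function name and error texts are hypothetical):

```python
import re

def validate(lines):
    # Sketch of the checks described above; error texts are illustrative.
    errors = []
    header = lines[0].split("|")
    if not lines[0].startswith("0"):
        errors.append("line 1 must start with 0")
    if header[1] != "BP-V1":
        errors.append("line 1, column 2 must be BP-V1")
    if not re.fullmatch(r"\d{8}", header[2]):
        errors.append("line 1, column 3 must be YYYYMMDD")
    if not lines[1].startswith("9"):
        errors.append("line 2 must start with 9")
    if len(lines[1].split("|")[2]) != 12:
        errors.append("line 2, column 3 must be 12 characters")
    detail = lines[2:]
    if any(not l.startswith("1") for l in detail):
        errors.append("lines 3+ must start with 1")
    if round(sum(float(l.split("|")[1]) for l in detail), 2) != float(header[5]):
        errors.append("detail sum must equal line 1, column 6")
    if len(detail) != int(header[6]):
        errors.append("detail count must equal line 1, column 7")
    return errors
```

In the form, each appended error would instead become an INSERT into the error table, as in the code below.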
MY CODE IS :-
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle TEXT_IO.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
LC$String VARCHAR2(50) ;--:= 'one|two|three|four|five|six|seven' ;
LC$Token VARCHAR2(100) ;
i PLS_INTEGER := 2 ;
lfirst_char number;
lvalue Varchar2(100) ;
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr( lfile_name , '.*\\' ),'\'); -- For the Path
--v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
Message(lfile_name);
v_handle := TEXT_IO.fopen(lfile_name, 'r');
BEGIN
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
--Message('First Char '||lfirst_char);
IF lfirst_char = '0' Then
Loop
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
Message('VAL - '||LC$Token);
lvalue := LC$Token;
EXIT WHEN LC$Token IS NULL ;
i := i + 1 ;
End Loop;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (9999,'0002','First line should always start with 0');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if ;
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '9' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (8888,'0016','Second line should start with 9');
Forms_DDL('Commit');
raise form_Trigger_failure;
End IF;
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '1' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (7777,'0022','The third line onward should start with 1');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if;
END LOOP;
--END IF;
END LOOP;
EXCEPTION
When No_Data_Found Then
TEXT_IO.fclose (v_handle);
END;
Exception
When Others Then
Message('Other error');
END;
I am calling the function you gave, Split, as mcb_simulator_pkg.Split. -
Different formats of the flat file for the same target
In our deployment, we use plugin code to extract the CSV files in the required format. The customers are on the same version of the datamart, but they are on different versions of the source database, from 3.x to 4.5, depending on which version of the application they are using. In 4.0 we introduced a new column, email, in the user table of the source database; accordingly, the plugin adds the field to the CSV file. But not all customers will get the upgraded plugin at the same time, so the ETL code needs to decide which data flow to run, depending on the format of the CSV file, in order to load the same target table. I made the email field in the target table nullable, but it still expects the same CSV format, with a delimiter for the null value.
I need help to achieve this. Can I read the structure of the flat file in DS, or get the count of delimiters, so that I can use a conditional to choose a different data flow based on the format of the flat file?
Can I make the email column in the flat file optional?
Thanks much in advance.
You can add an email column that maps to NULL in a query transform for the source that does not contain this column.
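On the conditional idea from the question: counting delimiters in the first line is enough to tell the two layouts apart. A hypothetical sketch (delimiter and threshold depend on your actual column counts):

```python
# Decide which data flow to use from the delimiter count of the first line.
def detect_format(first_line, delimiter=",", cols_with_email=6):
    cols = first_line.count(delimiter) + 1
    return "with_email" if cols >= cols_with_email else "without_email"
```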
Or else you can define two different file formats that map to the same file: one with the column and one without. -
AR Invoice Flat Files For Customers
Hi All,
There's a requirement to provide a data file with outstanding AR invoice to one of our customers.
For this we are going to develop a program to extract the relevant outstanding invoices. However, we are going to have the following controls as well.
01) Allow the user to extract the invoices as many times as they want, until they upload the file to the customer's system.
02) To allow that facility we are going to define a process with a draft extraction and a final extraction.
03) Once a file has been successfully uploaded, they should confirm that the batch has been uploaded successfully.
04) If a batch has been confirmed as successful, users cannot extract the invoices again. If they want to, they need to clear the values populated in the DFF that is enabled for this purpose.
05) If a batch has been generated in final mode, it must be confirmed prior to generating a new file. Otherwise no output will be generated.
My question is: what other types of controls do you normally implement in these types of scenarios?
And what sort of information do we need to capture on our side during the extractions?
Thanks,
Hi:
A few inputs -
1. I assume you would extract based on the invoice 'complete' status, but you may need to include logic to make sure no change happens to the invoice after your final extract (e.g. a user can incomplete the invoice, make some change, and complete it again).
2. You may also include some reconciliation mechanism, such as: the number of invoices and the amount extracted = the number of invoices and the amount in the flat file. You can use the staging table information to do this reconciliation.
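Point 2's reconciliation is just comparing a (count, total) pair from the staging tables with the same pair computed over the flat file; a minimal sketch (data hypothetical):

```python
def totals(rows):
    # rows: (invoice_id, amount) pairs
    return len(rows), round(sum(amount for _, amount in rows), 2)

staged = [("INV1", 100.00), ("INV2", 250.50)]
in_file = [("INV1", 100.00), ("INV2", 250.50)]
reconciled = totals(staged) == totals(in_file)
```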
Hope this is useful.
Tarun -
Flat File to XML using content conversion.
Hi Experts,
I am converting a flat file to an XML structure.
I need the structure of the XML to be like:
<LineA>
</LineA>
<LineB>
<LineC>
</LineC>
</LineB>
At the moment I am only able to generate it as a flat sequence:
<LineA>
</LineA>
<LineB>
</LineB>
<LineC>
</LineC>
How should I write the Recordset structure so that LineC comes under the LineB tag? I also don't want the name of the Recordset in my output; how can I do that?
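On the second question: if memory serves, the sender adapter's `ignoreRecordsetName` parameter suppresses the Recordset element (worth verifying in the FCC documentation). As for the nesting, plain content conversion emits the records sequentially; the LineC-under-LineB hierarchy is normally produced in the mapping. The target shape, sketched with ElementTree just to make it concrete:

```python
import xml.etree.ElementTree as ET

# Desired hierarchy: LineC nested inside LineB.
root = ET.Element("Root")  # stands in for the (suppressed) recordset
ET.SubElement(root, "LineA")
line_b = ET.SubElement(root, "LineB")
ET.SubElement(line_b, "LineC")
xml = ET.tostring(root, encoding="unicode")
```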
Kulwinder.
Hi
Refer
File Receiver with Content Conversion
/people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
Configuring the Receiver File/FTP Adapter
http://help.sap.com/saphelp_nw04/helpdata/en/95/bb623c6369f454e10000000a114084/frameset.htm
File content conversion sites
/people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
/people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
/people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
/people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
/people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
/people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
/people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
/people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
http://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
Please see the below links for file content conversion..
/people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter - FCC
File Content Conversion for Unequal Number of Columns
/people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns - FCC
Content Conversion (Pattern/Random content in input file)
/people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file - FCC
/people/harrison.holland5/blog/2006/12/20/xi-configuration-for-mdm-integration--sample-scenario - FCC - MDM
XI in the role of a FTP
/people/shabarish.vijayakumar/blog/2006/04/03/xi-in-the-role-of-a-ftp - FCC
File to R/3 via ABAP Proxy
/people/prateek.shah/blog/2005/06/14/file-to-r3-via-abap-proxy - FCC
/people/mickael.huchet/blog/2006/09/18/xipi-how-to-exclude-files-in-a-sender-file-adapter - EOIO - File
http://help.sap.com/saphelp_nw04/helpdata/en/ee/c9f0b4925af54cb17c454788d8e466/frameset.htm - cc
http://help.sap.com/saphelp_erp2005vp/helpdata/en/95/bb623c6369f454e10000000a114084/content.htm - fcc cOUNTER
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/da1e7c16-0c01-0010-278a-eaed5eae5a5f - conversion agent -
Positional Flat File Handling in B2B
Problem:
There is a big problem with handling positional flat files in Oracle B2B. We are trying to handle a positional flat file, in UTF-8 with BOM encoding. After creating the document definition and setting up the agreement, we received:
B2B-51507 Error: The data starting at position 0 is not recognized as a valid data transmission.
What's more, we remembered to place the parser file into the Config/schema location, which usually fixes the problem.
Full description:
We have a sample flat file, with UNIX (LF) end-line characters, encoded in UTF-8 with BOM:
11B6Kamil Mazur Lazurowa 8 12 CityABCD PL
Junior Consultant MyCompanyX
StreetAAAAAA 00-000 CityABCD
First of all, we created a guideline (*.ecs) and an XSD schema, like this:
(screenshot)
During XSD creation we remembered to confirm that our end-line character is the UNIX standard (LF).
After that, we created the parser schema file and placed it in the schema location on our server:
/oracle/Middleware/MyDomain/soa/thirdparty/edifecs/XEngine/config/schema,
and then added the schema location to the XRegistry file:
(screenshot)
After that action, we restarted the SOA Server (which was running during these changes).
When the SOA Server was up again, we created a document definition:
(screenshot)
The starting and ending positions are 2-5 because of the BOM character. Then we created a Trading Partner and an inbound agreement based on this document definition:
(screenshot)
Then we posted our test file via a Generic File Listening Channel, with no Java callout or other properties. Unfortunately, we received an error like this:
Business Message:
Id: AC18017B146145ADDCD00000178AA4F9
Message Id: AC18017B146145ADDCB00000178AA4F8-1
Sender Type: Name
Sender Value: Szkolenie
Receiver Type: Name
Receiver Value: MyCompany
Sender: Szkolenie
Receiver: MyCompany
Agreement Id: PersonIn
Agreement: PersonIn
Document Type: Person
Document Protocol: PositionalFlatFile
Document Version: Logistics2013
Message Type: REQ
Direction: INBOUND
State: MSG_ERROR
Acknowledgement Mode: NONE
Response Mode: ASYNC
Send Time Stamp: 2014-05-19 14:00
Receive Time Stamp: 2014-05-19 14:00
Document Retry Interval(Channel): 0
Document Remaining Retry(Channel): 0
Native Message Size: 141
Label: soa_b2b_ - Thu May 15 15:04:40 CEST 2014 - 1
Collaboration Id: AC18017B146145ADD9200000178AA4F7
Exchange Protocol Name: Generic File
Exchange Protocol Version: 1.0
Error Code: B2B-51507
Error Description: Machine Info: (soatest.corp.prv) The data starting at position 0 is not recognized as a valid data transmission.
Error Level: ERROR_LEVEL_COLLABORATION
Error Severity: ERROR
Error Text: Payload validation error.
Wire Message:
Id: AC18017B146145ADAB400000178AA4F1
Message Id: AC18017B146145ADAB400000178AA4F1
Business Message: AC18017B146145ADDCD00000178AA4F9
Protocol Message Id: test_4value.txt@AC18017B146145ADAE500000178AA4F5
Protocol Transport Binding: filename=test_4value.txt filesize=143 ChannelName=SzkolenieChannel file_ext=txt fullpath=/home/oracle/POC/positional/Krzysiek/test_4value.txt timestamp=2014-05-07T10:23:30.000+01:00 tp_profile_name=Szkolenie MSG_RECEIVED_TIME=Mon May 19 14:00:37 CEST 2014
Transport Protocol: File
Transport Protocol Version: 1.0
Url: file://localhost//home/oracle/POC/positional/Krzysiek
Transport Headers: filename=test_4value.txt filesize=143 ChannelName=SzkolenieChannel file_ext=txt fullpath=/home/oracle/POC/positional/Krzysiek/test_4value.txt timestamp=2014-05-07T10:23:30.000+01:00 tp_profile_name=Szkolenie MSG_RECEIVED_TIME=Mon May 19 14:00:37 CEST 2014
State: ERROR
Error Code: B2B-51507
Error Description: Machine Info: (soatest.corp.prv) The data starting at position 0 is not recognized as a valid data transmission.
Error Text: Payload validation error.
Message Size: 141
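An aside, hedged: "data starting at position 0 is not recognized", together with a UTF-8-with-BOM source file, is consistent with the parser tripping over the BOM bytes (EF BB BF) before the first real character. One quick way to test that theory is to strip the BOM from a copy of the file and resubmit it:

```python
import codecs

def strip_bom(raw: bytes) -> bytes:
    # Remove a leading UTF-8 BOM (EF BB BF) if present.
    if raw.startswith(codecs.BOM_UTF8):
        return raw[len(codecs.BOM_UTF8):]
    return raw

sample = codecs.BOM_UTF8 + b"11B6Kamil Mazur"
clean = strip_bom(sample)
```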
The payload seems to be unchanged. The problem appears when our B2B works in a cluster (though both configuration parameters b2b.HAInstance and b2b.HAInstanceName are set properly for both SOA servers):
(screenshot)
There is no additional info in the logs:
[2014-05-19T14:07:47.994+02:00] [soa_server1] [NOTIFICATION] [] [oracle.soa.b2b.engine] [tid: DaemonWorkThread: '20' of WorkManager: 'wm/SOAWorkManager'] [userId: <anonymous>] [ecid: a8bc74c6eb84aa5b:452af1da:146046c88de:-8000-00000000000598a0,0] [APP: soa-infra] Engine: processIncomingMessageImpl: Message id = AC18017B14614616E1A00000178AA510-1 FromParty = Szkolenie Doctype = Person version = Logistics2013
That's unfortunately all. Is there any configuration that we missed? How can we fix this problem?
Hi Anuj,
I have added a transformation in the mediator component, and I'm now able to transfer the opaque data from Oracle B2B to the composite and then write the contents of the flat file to another specified file through the file adapter.
However I still have few more issues in different scenarios:
I have tried the excercise that was mentioned in your blog:
http://www.anuj-dwivedi.blogspot.com/2011/10/handling-positionaldelimited-flat-files.html
I tried to create a flat file with two records (lines) as mentioned in the blog. The first line contains the header information, 57 characters long with 5 fields. The second line contains the actual data, also 57 characters with 5 fields. In the agreement created to handle this flat file, I checked the Validate & Translate checkbox.
During validation, B2B fails to treat the second line as a new record; it treats both lines as one record, and validation fails because nothing is specified from the 58th character onward in the ECS and XSD files.
Following is the error:
Error -: B2B-51507: Payload validation error.: Machine Info: (corpdevsoa10) Extra Field was found in the data file as part of Record HEADER. Record HEADER is defined in the guideline at position 1.{br}{br}This error was detected at:{br}{tab}Record Count: 1{br}{tab}Field Count: 6{br}{tab}Characters: 57 through 116
Please advise me on resolving this issue.
Thanks,
Krishna. -
Hello experts,
In the flat file, '|' (vertical line) is used as the separator.
Flat file format: J|1|A|
Here:
J is the data of column 1,
1 is the data of column 2,
A is the data of column 3,
'|' is the separator.
I want to upload this.
When I tried to upload the file into an internal table, it gives me this:
Column1 column2 column3
J | 1
How do I resolve this problem?
Any sample code?
OK.
I have added new code so that it can be converted into internal table format, i.e. it will show with a header line.
Just check this code; it will definitely work.
with regards,
Kiran.Gouni
DATA : BEGIN OF itab OCCURS 0,
char(150) TYPE c,
END OF itab.
DATA : name TYPE rlgrap-filename.
DATA : string TYPE string.
DATA : BEGIN OF itab1 OCCURS 0,
no(10) TYPE c,
END OF itab1.
DATA : BEGIN OF itab2 OCCURS 0,
no(10) TYPE c,
no1(10) type c,
no2(10) type c,
no3(10) type c,
NO4(10) TYPE C,
END OF itab2.
PARAMETERS : p_name TYPE rlgrap-filename.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_name.
CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
EXPORTING
program_name = syst-repid
dynpro_number = syst-dynnr
CHANGING
file_name = p_name
EXCEPTIONS
MASK_TOO_LONG = 1
OTHERS = 2.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
START-OF-SELECTION.
MOVE p_name TO string.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = string
FILETYPE = 'ASC'
tables
data_tab = itab.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
DATA : char(10) TYPE c.
DATA : COUNT TYPE I.
LOOP AT itab.
SPLIT itab-char AT '|' INTO TABLE itab1.
COUNT = 0.
LOOP AT itab1.
  COUNT = COUNT + 1.
  CASE COUNT.
    WHEN 1.
      itab2-no = itab1-no.
    WHEN 2.
      itab2-no1 = itab1-no.
    WHEN 3.
      itab2-no2 = itab1-no.
    WHEN 4.
      itab2-no3 = itab1-no.
    WHEN 5.
      itab2-no4 = itab1-no.
      APPEND itab2.
  ENDCASE.
ENDLOOP.
CLEAR ITAB1.
CLEAR ITAB1[].
ENDLOOP. -
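The same split-and-distribute idea, sketched outside ABAP for comparison (a trailing '|' produces an empty final token, which is dropped here):

```python
# Split one pipe-delimited row into its columns.
row = "J|1|A|"
cols = [c for c in row.split("|") if c != ""]
print(cols)  # ['J', '1', 'A']
```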
Hi experts,
I have a bit of a problem here wit a flat file to IDOC interface.
My input CSV file structure (shortened) is as follows -
<?xml version="1.0" encoding="UTF-8"?>
<ns0:POINBOUND_MT xmlns:ns0="urn:xx:xi:dwn:xx:pf:xxx:poinbound:100">
<Record>
<Row>
<PO_HEADER_ID/>
<TYPE_LOOKUP_CODE/>
<LINE_NUM/>
<AMOUNT_LIMIT/>
</Row>
</Record>
</ns0:POINBOUND_MT>
the target side is an idoc - PORDCR1.PORDCR102.
In the input file, the PO_HEADER_ID field will have repeated values across rows. The structure, for example:
PO_HEADER_ID < other fields> < otherfields> ...
12345 <other fields>..<.. >
12345 <other fields>..<..>
12345 <other fields>..<..>
56789 <other fields>..<..>
56789 <other fields>..<..>
Now I need two IDocs (2 distinct header values) on the target side: the 1st IDoc should have the first 3 line items and the 2nd IDoc the last 2 line items. I have already edited the IDoc in XSD format to make it unbounded.
Problem: I am able to create two IDocs by handling contexts (splitByValue and a UDF), but with the line items it is not working, i.e. the first IDoc gets the 3 line items created but the 2nd IDoc does not have any!
My mapping for line items is:
PO_HEADER_ID -> splitByValue (value changed) -> count -> UDF (result.addValue using count) -> IDoc LINE_ITEM node.
This logic does not work when I need to create more than one IDoc on the target side; it only works for the first IDoc. So what can I do?
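The grouping being aimed for, sketched in Python with hypothetical row data: one IDoc per distinct PO_HEADER_ID, each carrying only its own line items.

```python
from itertools import groupby

rows = [("12345", "item1"), ("12345", "item2"), ("12345", "item3"),
        ("56789", "item4"), ("56789", "item5")]

# groupby needs the rows sorted (or at least clustered) by the key.
idocs = [(header, [item for _, item in group])
         for header, group in groupby(rows, key=lambda r: r[0])]
```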
Please help.
Thanks in advance!
Raju.
Hi Raju,
check these links...
/people/anish.abraham2/blog/2005/12/22/file-to-multiple-idocs-xslt-mapping
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/877c0d53-0801-0010-3bb0-e38d5ecd352c
Sachin -
Flat file header different from remaining file (function module GUI_UPLOAD)
Hello gurus,
I need to upload a flat file, but the header is a control record with the number of lines in the file, and it has a different structure from the remaining records.
I'm using GUI_UPLOAD in ASCII mode for the flat file, and my first line with the control record does not "fit" the structure of the remaining records.
What's the best way to read different line structures in the flat file, so that I can get a counter from the first line and all the other lines into an internal table?
Using GUI_UPLOAD twice with different internal tables would degrade performance...
regards.
Sérgio.
Hi,
Declare an itab as
ITAB TYPE STANDARD TABLE OF STRING
Variable for Tab limited
data :gc_con_tab TYPE c VALUE cl_abap_char_utilities=>horizontal_tab.
pass the control record to the internal table first, then:
loop at it_final into wa_final.
concatenating gc_con_tab will add TABs in your output.
CONCATENATE wa_final-f1 wa_final-f2
into ITABl separated by gc_con_tab.
append ITAB.
endloop.
Then use GUI_DOWNLOAD to download ITAB.
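The same idea, sketched outside ABAP: read every line as a raw string first, then parse the control record (first line) with its own structure and the remaining lines with theirs, all in one pass. A minimal Java illustration (the tab delimiter and the control-record layout are assumptions):

```java
import java.util.List;

public class FlatFileSplit {
    // Parse a flat file whose first line is a control record (assumed here
    // to carry the expected number of data lines) and whose remaining
    // lines share a different, tab-delimited structure.
    // Returns the count from the control record; the split body lines are
    // appended to the supplied items list.
    public static int parse(List<String> rawLines, List<String[]> items) {
        // Header line: control record with the expected line count
        int expected = Integer.parseInt(rawLines.get(0).trim());
        // Body lines: a different structure, split per their own layout
        for (int i = 1; i < rawLines.size(); i++) {
            items.add(rawLines.get(i).split("\t"));
        }
        return expected;
    }
}
```

Reading the whole file into a table of raw strings once, then parsing each line by its role, avoids the double-upload the question worries about.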
Regards,
Shanmugavel chandrasekaran -
Dear All,
Can I load a flat file containing physical inventory data (material no., batch and count quantity) into MI10 for background processing, so that all the data from the flat file goes into the corresponding fields in MI10?
I have to post the document with the inventory count and difference in MI10.
Can I run a BDC for it?
Please reply quickly.
Thanks
Prasant Sekhar
See these links; I hope they will help you
http://www.northcode.com/blog.php/2008/07/28/Fixing-the-Windows-Start-Command-in-Vista
http://board.flashkit.com/board/showthread.php?t=772097 -
Converting a flat file to IDOC
Greetings to All,
I have a requirement where I need to convert a flat IDOC file into an IDOC.
The IDOC is SHPMNT04. Presently I am trying to implement it using FCC on the sender side, as the flat file is of fixed-length type.
Can anyone suggest a better method, such as XSLT or Java mapping, that would make the work easier? I am finding it very difficult to count the field lengths and write the FCC for an IDOC that has more than 60 segments.
Any help will be greatly appreciated.
Thanks,
Anika.
Hi
I still have a doubt whether this method will work in a real-time scenario. I mean, I have to pick files from a 3rd-party FTP location (polling and picking as and when I get a file starting with, say, ABC*.txt). Added to this, there will be an issue related to the volume of files (1000 per day); will that create any performance issues?
It is only now that you are sharing the complete scenario/problem; just glance back at your first post at the top and at the replies, because it is the problem statement that steers the discussion.
Making the files available for the report is a separate task. You have to develop an FTP-to-File XI scenario (or a regular FTP process to send the files across, which normally will not be allowed on the XI server); I don't see any other way, and I 'assume' that any other way might complicate the situation.
Performance - it is not something to do with your scenario alone, but with the overall system load, configuration etc. (b.t.w. 1000 per day is normal)
Can we schedule a job for running the report periodically?
Yes, schedule a batch job calling this report (with the necessary variants) at the required frequency or at an exact time (if there is any agreement with the sender)
I have started a POC
I appreciate that; it is the best way at this moment. You can reverse-engineer from the errors to configure your scenario. If for any reason you think it does not work, you can always stop and/or look for a different solution.
need your help
- Definitely I will share my thoughts, if time permits
how will i get the flat file xml(IDOC) in the source side
Do you mean the source structure? You can use the same structure as your target.
Once you are convinced of your solution approach, it is better to close this thread and start a new one for new issues, for more views and suggestions and better readability.
@Rajesh - I appreciate your efforts to solve the issue, but the statement below is not correct or possible w.r.t. this issue:
Here what you need is to do configure File Sender CC with NFS protocol.
Regards
Vishnu -
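Returning to the Java-mapping alternative raised in the original question: the core of such a mapping is slicing each fixed-length line into fields by offset and emitting one XML segment per line, driven by an offset table per segment type - which replaces the 60-plus hand-written FCC entries with data. A minimal sketch (the segment and field names below are illustrative, not the real SHPMNT04 layout):

```java
public class FixedLengthSegment {
    // Turn one fixed-length line into an XML segment by slicing it at
    // known field widths. The widths/field names would come from one
    // offset table per segment type in a real SHPMNT04 mapping; the
    // values used here are illustrative only.
    public static String toXml(String line, String segment,
                               String[] fields, int[] widths) {
        StringBuilder xml = new StringBuilder("<" + segment + ">");
        int pos = 0;
        for (int i = 0; i < fields.length; i++) {
            // Guard against short lines, then trim fixed-width padding
            int end = Math.min(pos + widths[i], line.length());
            String value = line.substring(pos, end).trim();
            xml.append('<').append(fields[i]).append('>')
               .append(value)
               .append("</").append(fields[i]).append('>');
            pos += widths[i];
        }
        return xml.append("</").append(segment).append(">").toString();
    }
}
```

A driver would pick the right segment name and offset table from a key field at the start of each line, so adding the 61st segment means adding a table row, not more FCC configuration.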
Header in flat file destination (ssis)
Hi
I am creating a header in the flat file destination which has some hard-coded values and a record count, which gets its value from a variable assigned to a Row Count transformation in the package.
There is no problem with the hard-coded values, but when I try to get the value of the record count, I always get zero.
This is how i am doing it.
In the flat file destination, when we open the Flat File Destination Editor, I hard-code the values which I want in the header right there in the editor.
For the record count, I go to the properties of the flat file and select Expressions. In the expression editor I select the 'HeaderRowDelimiter' property and drag the variable into the expression box.
Let me know if there is any suggestion.
Thanks
The value of the row count is only known AFTER the Data Flow Task has completely finished... and the header is written before that.
Here are some examples:
http://agilebi.com/jwelch/2008/02/08/adding-headers-and-footers-to-flat-files/
http://social.msdn.microsoft.com/Search/en-US/sqlserver?query=add%20header%20and%20footer%20ssis&rq=meta:Search.MSForums.ForumID(00e50af7-5f43-43ad-af05-d98b73c1f760)+site:microsoft.com&rn=SQL+Server+Integration+Services+Forum
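One common workaround along the lines of the first link: let the data flow write the rows to an intermediate file, and in a later task produce the final file with the header (including the now-known record count) prepended. A sketch of that post-processing step in Java terms (file names and the header layout are assumptions; in SSIS this would typically be a Script Task):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class HeaderAfterwards {
    // The record count only exists once all rows are written, so the
    // header is produced in a second pass: read the data file back,
    // prepend a header line carrying the count, and write the final file.
    // The "HDR;" prefix stands in for the hard-coded header values.
    public static void writeWithHeader(Path dataFile, Path finalFile)
            throws IOException {
        List<String> rows = Files.readAllLines(dataFile);
        List<String> out = new ArrayList<>();
        out.add("HDR;" + rows.size());   // header: hard-coded text + record count
        out.addAll(rows);
        Files.write(finalFile, out);
    }
}
```

The same two-pass structure is why the variable-based approach in the question reads zero: the expression is evaluated when the header is written, before the Row Count transformation has run.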
Please mark the post as answered if it answers your question | My SSIS Blog:
http://microsoft-ssis.blogspot.com