DMEE Tree Change from flat file to XML
Hi DMEE Gurus,
We have a custom DMEE tree that generates a flat file. I need to convert it so that it generates XML instead. I could not find this option in transaction DMEE. Is this possible? Can you please tell me how to do this?
Or is it correct that I need to create another tree (for the XML file) from scratch? A pop-up appears when creating a new DMEE tree that asks which format you want (flat file or XML). Please advise.
Thank you.
Best regards,
Brando
Hi,
Could you please mention the source message/data type, the target data type, and the mapping rules required? Then it may be possible to give some hints.
By the way, what is the error in the mapping?
Regards,
Moorthy
Similar Messages
-
Flat File to XML mapping Problem
I am facing some issues in mapping from a flat file to XML. I have not yet reached the stage of actually executing the scenario; the problem shows up already in mapping tests.
Following is the structure of the inbound flat file (as specified in the content conversion parameters of the sender adapter):
Document Name: MMADealerStatementInbound_MT
Recordset Name: MMADealerStmtInbound_Type
Recordset Structure: BatchHeader,1,DealerHeader1,*,DealerHeader2,*,DealerDetail,*,DealerAgeing,*,Trailer,1
Recordset Sequence: Ascending
Key Field Name: RecordType
Key Field Type: String (Case Sensitive)
Following is the structure of the outbound XML file needed.
Currently all DealerDetail nodes end up lumped together under the first DealerStmt node; they need to be distributed across the individual DealerStmt nodes.
Company CompanyName 25
Company CompanyAddress StreetAddress Company Street Address 50
Company CompanyAddress City Company City 40
Company CompanyAddress State Company State 3
Company CompanyAddress ZIP Company ZIP 10
Company CompanyAddress Country Company Country 3
Company RunParams RunDate Statement Run date 8
Company RunParams StmtPeriod Statement Period 20
DealerStmt 0-unbounded DealerNumber 10
DealerStmt 0-unbounded DealerName 35
DealerStmt 0-unbounded DealerAddress StreetAddress Dealer Street Address 50
DealerStmt 0-unbounded DealerAddress City Dealer City 40
DealerStmt 0-unbounded DealerAddress State Dealer State 3
DealerStmt 0-unbounded DealerAddress ZIP Dealer ZIP 10
DealerStmt 0-unbounded DealerAddress Region Dealer Region 4
DealerStmt 0-unbounded DealerRemitToAddress CompanyName Company Name 50
DealerStmt 0-unbounded DealerRemitToAddress StreetAddress1 Street Address 1 50
DealerStmt 0-unbounded DealerRemitToAddress StreetAddress2 Street Address 2 50
DealerStmt 0-unbounded DealerRemitToAddress StreetAddress3 Street Address 3 50
DealerStmt 0-unbounded DealerDetail 0-unbounded DocDate Document Date 8
DealerStmt 0-unbounded DealerDetail 0-unbounded InvoiceNumber Invoice Number 10
DealerStmt 0-unbounded DealerDetail 0-unbounded TransDescription Transaction Description 50
DealerStmt 0-unbounded DealerDetail 0-unbounded TransAmount Transaction Amount 16
DealerStmt 0-unbounded DealerDetail 0-unbounded AmountSign Amount Sign 1
DealerStmt 0-unbounded DealerAgeing CurrentBal Current Balance 15
DealerStmt 0-unbounded DealerAgeing Bal1to31days Balance 1 to 31 Days 15
DealerStmt 0-unbounded DealerAgeing BalOver31 Over 31 days Old 15
DealerStmt 0-unbounded DealerAgeing BalOver61 Over 61 days Old 15
DealerStmt 0-unbounded DealerAgeing BalOver92 Over 92 days Old 15
DealerStmt 0-unbounded DealerAgeing BalOver123 Over 123 days Old 15
Trailer DealersCount 5
Trailer TransCount 10
Trailer TotalAmount 20
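Abridged, the field list above corresponds to XML of roughly this shape (nesting assumed from the occurrence columns; element content elided):

```xml
<MMADealerStatementOutbound_MT>
  <MMADealerStmtOutbound_Type>
    <Company>...</Company>
    <DealerStmt>                        <!-- 0..unbounded, one per dealer -->
      <DealerNumber>...</DealerNumber>
      <DealerDetail>...</DealerDetail>  <!-- 0..unbounded inside each DealerStmt -->
      <DealerAgeing>...</DealerAgeing>
    </DealerStmt>
    <Trailer>...</Trailer>
  </MMADealerStmtOutbound_Type>
</MMADealerStatementOutbound_MT>
```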
Here is the mapping I used for this node.
/ns0:MMADealerStatementOutbound_MT/MMADealerStmtOutbound_Type/DealerStmt/DealerDetail=ifWithoutElse(stringEquals(/ns0:MMADealerStatementInbound_MT/MMADealerStmtInbound_Type/DealerDetail/RecordType=, const()), SplitByValue(/ns0:MMADealerStatementInbound_MT/MMADealerStmtInbound_Type/DealerDetail=))
My email address is [email protected] and i can send some screen shots to understand better.
I need help of XI mapping Gurus.
Thanks
Rajesh
Hi,
Could you please mention the source message/data type, the target data type, and the mapping rules required? Then it may be possible to give some hints.
By the way, what is the error in the mapping?
Regards,
Moorthy -
Hi guys,
Does anyone know of standard functions to convert a flat file to XML? Or perhaps from an internal table to XML?
Thanks in advance,
Ricardo.
Hello Ricardo,
Check this sample:
*& Report ZULH_MM_RP_XMLtest001
REPORT ZULH_MM_RP_XMLtest001.
PARAMETERS: GK_RUTA TYPE RLGRAP-FILENAME DEFAULT 'C:\test01.XML'.
* TYPE TURNOS *
TYPES: BEGIN OF TURNOS,
LU LIKE T552A-TPR01,
MA LIKE T552A-TPR01,
MI LIKE T552A-TPR01,
JU LIKE T552A-TPR01,
VI LIKE T552A-TPR01,
SA LIKE T552A-TPR01,
DO LIKE T552A-TPR01,
END OF TURNOS.
* TYPE TURNOS *
* TYPE SOCIO *
TYPES: BEGIN OF SOCIO,
NUMERO LIKE PERNR-PERNR,
REPOSICION LIKE PA0050-ZAUVE,
NOMBRE LIKE PA0002-VORNA,
TURNOS TYPE TURNOS,
END OF SOCIO.
* TYPE SOCIO *
* ESTRUCTURA ACCESOS *
DATA: BEGIN OF ACCESOS OCCURS 0,
SOCIO TYPE SOCIO,
END OF ACCESOS.
* ESTRUCTURA ACCESOS *
* START OF SELECTION *
START-OF-SELECTION.
PERFORM LLENA_ACCESOS.
PERFORM DESCARGA_XML.
END-OF-SELECTION.
* END OF SELECTION *
* FORM LLENA_ACCESOS *
FORM LLENA_ACCESOS.
REFRESH ACCESOS.
CLEAR ACCESOS.
MOVE: '45050' TO ACCESOS-SOCIO-NUMERO,
'MOISES MORENO' TO ACCESOS-SOCIO-NOMBRE,
'0' TO ACCESOS-SOCIO-REPOSICION,
'T1' TO ACCESOS-SOCIO-TURNOS-LU,
'T2' TO ACCESOS-SOCIO-TURNOS-MA,
'T3' TO ACCESOS-SOCIO-TURNOS-MI,
'T4' TO ACCESOS-SOCIO-TURNOS-JU,
'T5' TO ACCESOS-SOCIO-TURNOS-VI,
'T6' TO ACCESOS-SOCIO-TURNOS-SA,
'T7' TO ACCESOS-SOCIO-TURNOS-DO.
APPEND ACCESOS.
CLEAR ACCESOS.
MOVE: '45051' TO ACCESOS-SOCIO-NUMERO,
'RUTH PEÑA' TO ACCESOS-SOCIO-NOMBRE,
'0' TO ACCESOS-SOCIO-REPOSICION,
'T1' TO ACCESOS-SOCIO-TURNOS-LU,
'T2' TO ACCESOS-SOCIO-TURNOS-MA,
'T3' TO ACCESOS-SOCIO-TURNOS-MI,
'T4' TO ACCESOS-SOCIO-TURNOS-JU,
'T5' TO ACCESOS-SOCIO-TURNOS-VI,
'T6' TO ACCESOS-SOCIO-TURNOS-SA,
'T7' TO ACCESOS-SOCIO-TURNOS-DO.
APPEND ACCESOS.
ENDFORM.
* FORM LLENA_ACCESOS *
* FORM DESCARGA_XML *
FORM DESCARGA_XML.
DATA: L_DOM TYPE REF TO IF_IXML_ELEMENT,
M_DOCUMENT TYPE REF TO IF_IXML_DOCUMENT,
G_IXML TYPE REF TO IF_IXML,
W_STRING TYPE XSTRING,
W_SIZE TYPE I,
W_RESULT TYPE I,
W_LINE TYPE STRING,
IT_XML TYPE DCXMLLINES,
S_XML LIKE LINE OF IT_XML,
W_RC LIKE SY-SUBRC.
DATA: XML TYPE DCXMLLINES.
DATA: RC TYPE SY-SUBRC,
BEGIN OF XML_TAB OCCURS 0,
D LIKE LINE OF XML,
END OF XML_TAB.
CLASS CL_IXML DEFINITION LOAD.
G_IXML = CL_IXML=>CREATE( ).
CHECK NOT G_IXML IS INITIAL.
M_DOCUMENT = G_IXML->CREATE_DOCUMENT( ).
CHECK NOT M_DOCUMENT IS INITIAL.
WRITE: / 'Converting DATA TO DOM 1:'.
CALL FUNCTION 'SDIXML_DATA_TO_DOM'
EXPORTING
NAME = 'ACCESOS'
DATAOBJECT = ACCESOS[]
IMPORTING
DATA_AS_DOM = L_DOM
CHANGING
DOCUMENT = M_DOCUMENT
EXCEPTIONS
ILLEGAL_NAME = 1
OTHERS = 2.
IF SY-SUBRC = 0.
WRITE 'Ok'.
ELSE.
WRITE: 'Err =',
SY-SUBRC.
ENDIF.
CHECK NOT L_DOM IS INITIAL.
W_RC = M_DOCUMENT->APPEND_CHILD( NEW_CHILD = L_DOM ).
IF W_RC IS INITIAL.
WRITE 'Ok'.
ELSE.
WRITE: 'Err =',
W_RC.
ENDIF.
CALL FUNCTION 'SDIXML_DOM_TO_XML'
EXPORTING
DOCUMENT = M_DOCUMENT
IMPORTING
XML_AS_STRING = W_STRING
SIZE = W_SIZE
TABLES
XML_AS_TABLE = IT_XML
EXCEPTIONS
NO_DOCUMENT = 1
OTHERS = 2.
IF SY-SUBRC = 0.
WRITE 'Ok'.
ELSE.
WRITE: 'Err =',
SY-SUBRC.
ENDIF.
LOOP AT IT_XML INTO XML_TAB-D.
APPEND XML_TAB.
ENDLOOP.
CALL FUNCTION 'WS_DOWNLOAD'
EXPORTING
BIN_FILESIZE = W_SIZE
FILENAME = GK_RUTA
FILETYPE = 'BIN'
TABLES
DATA_TAB = XML_TAB
EXCEPTIONS
OTHERS = 10.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
ENDFORM.
Regards,
Vasanth -
Data loading from flat file to cube using bw3.5
Hi Experts,
Kindly give me the detailed steps with screens about Data loading from flat file to cube using bw3.5
...............Please
Hi,
Procedure
You are in the Data Warehousing Workbench in the DataSource tree.
1. Select the application components in which you want to create the DataSource and choose Create DataSource.
2. On the next screen, enter a technical name for the DataSource, select the type of DataSource and choose Copy.
The DataSource maintenance screen appears.
3. Go to the General tab page.
a. Enter descriptions for the DataSource (short, medium, long).
b. As required, specify whether the DataSource builds an initial non-cumulative and can return duplicate data records within a request.
c. Specify whether you want to generate the PSA for the DataSource in the character format. If the PSA is not typed it is not generated in a typed structure but is generated with character-like fields of type CHAR only.
Use this option if conversion during loading causes problems, for example, because there is no appropriate conversion routine, or if the source cannot guarantee that data is loaded with the correct data type.
In this case, after you have activated the DataSource you can load data into the PSA and correct it there.
4. Go to the Extraction tab page.
a. Define the delta process for the DataSource.
b. Specify whether you want the DataSource to support direct access to data.
c. Real-time data acquisition is not supported for data transfer from files.
d. Select the adapter for the data transfer. You can load text files or binary files from your local work station or from the application server.
Text-type files only contain characters that can be displayed and read as text. CSV and ASCII files are examples of text files. For CSV files you have to specify a character that separates the individual field values. In BI, you have to specify this separator character and an escape character which specifies this character as a component of the value if required. After specifying these characters, you have to use them in the file. ASCII files contain data in a specified length. The defined field length in the file must be the same as the assigned field in BI.
Binary files contain data in the form of Bytes. A file of this type can contain any type of Byte value, including Bytes that cannot be displayed or read as text. In this case, the field values in the file have to be the same as the internal format of the assigned field in BI.
Choose Properties if you want to display the general adapter properties.
e. Select the path to the file that you want to load or enter the name of the file directly, for example C:/Daten/US/Kosten97.csv.
You can also create a routine that determines the name of your file. If you do not create a routine to determine the name of the file, the system reads the file name directly from the File Name field.
f. Depending on the adapter and the file to be loaded, make further settings.
■ For binary files:
Specify the character record settings for the data that you want to transfer.
■ Text-type files:
Specify how many rows in your file are header rows and can therefore be ignored when the data is transferred.
Specify the character record settings for the data that you want to transfer.
For ASCII files:
If you are loading data from an ASCII file, the data is requested with a fixed data record length.
For CSV files:
If you are loading data from an Excel CSV file, specify the data separator and the escape character.
Specify the separator that your file uses to divide the fields in the Data Separator field.
If the data separator character is part of the value, the file indicates this by enclosing the value in particular start and end characters. Enter these start and end characters in the Escape Characters field.
You chose the ; character as the data separator. However, your file contains the value 12;45 for a field. If you set " as the escape character, the value in the file must be "12;45" so that 12;45 is loaded into BI. The complete value that you want to transfer has to be enclosed by the escape characters.
If the escape characters do not enclose the value but are used within it, the system interprets them as a normal part of the value. If you have specified " as the escape character, the value 12"45 is transferred as 12"45 and 12"45" is transferred as 12"45".
In a text editor (for example, Notepad) check the data separator and the escape character currently being used in the file. These depend on the country version of the file you used.
Note that if you do not specify an escape character, the space character is interpreted as the escape character. We recommend that you use a different character as the escape character.
If you select the Hex indicator, you can specify the data separator and the escape character in hexadecimal format. When you enter a character for the data separator and the escape character, these are displayed as hexadecimal code after the entries have been checked. A two character entry for a data separator or an escape sign is always interpreted as a hexadecimal entry.
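These separator and escape rules can be checked outside BI with any CSV parser. A minimal Python sketch of the behaviour described above (illustrative only, not part of the BI configuration):

```python
import csv
import io

# ';' is the data separator and '"' the escape character, as in the
# example above: the value 12;45 must appear in the file as "12;45".
raw = 'Account;Amount\nA-100;"12;45"\n'

rows = list(csv.reader(io.StringIO(raw), delimiter=';', quotechar='"'))
# The enclosed value survives as one field containing the separator:
# rows[1] == ['A-100', '12;45']
```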
g. Make the settings for the number format (thousand separator and character used to represent a decimal point), as required.
h. Make the settings for currency conversion, as required.
i. Make any further settings that are dependent on your selection, as required.
5. Go to the Proposal tab page.
This tab page is only relevant for CSV files. For files in different formats, define the field list on the Fields tab page.
Here you create a proposal for the field list of the DataSource based on the sample data from your CSV file.
a. Specify the number of data records that you want to load and choose Upload Sample Data.
The data is displayed in the upper area of the tab page in the format of your file.
The system displays the proposal for the field list in the lower area of the tab page.
b. In the table of proposed fields, use Copy to Field List to select the fields you want to copy to the field list of the DataSource. All fields are selected by default.
6. Go to the Fields tab page.
Here you edit the fields that you transferred to the field list of the DataSource from the Proposal tab page. If you did not transfer the field list from a proposal, you can define the fields of the DataSource here.
a. To define a field, choose Insert Row and specify a field name.
b. Under Transfer, specify the decision-relevant DataSource fields that you want to be available for extraction and transferred to BI.
c. Instead of generating a proposal for the field list, you can enter InfoObjects to define the fields of the DataSource. Under Template InfoObject, specify InfoObjects for the fields in BI. This allows you to transfer the technical properties of the InfoObjects into the DataSource field.
Entering InfoObjects here does not equate to assigning them to DataSource fields. Assignments are made in the transformation. When you define the transformation, the system proposes the InfoObjects you entered here as InfoObjects that you might want to assign to a field.
d. Change the data type of the field if required.
e. Specify the key fields of the DataSource.
These fields are generated as a secondary index in the PSA. This is important in ensuring good performance for data transfer process selections, in particular with semantic grouping.
f. Specify whether lowercase is supported.
g. Specify whether the source provides the data in the internal or external format.
h. If you choose the external format, ensure that the output length of the field (external length) is correct. Change the entries, as required.
i. If required, specify a conversion routine that converts data from an external format into an internal format.
j. Select the fields that you want to be able to set selection criteria for when scheduling a data request using an InfoPackage. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage.
k. Choose the selection options (such as EQ, BT) that you want to be available for selection in the InfoPackage.
l. Under Field Type, specify whether the data to be selected is language-dependent or time-dependent, as required.
7. Check, save and activate the DataSource.
8. Go to the Preview tab page.
If you select Read Preview Data, the number of data records you specified in your field selection is displayed in a preview.
This function allows you to check whether the data formats and data are correct.
For More Info: http://help.sap.com/saphelp_nw70/helpdata/EN/43/01ed2fe3811a77e10000000a422035/content.htm -
Reading data from flat file Using TEXT_IO
Dear Gurus
I already posted this question, but this time I need some other changes. Sorry for that.
I am using Forms 10g and TEXT_IO to read data from a flat file.
My data is like this :-
0|BP-V1|20100928|01|1|2430962.89|27|2430962.89|MUR|20100928120106
9|2430962.89|000111111111|
1|61304.88|000014104113|
1|41961.73|000022096086|
1|38475.65|000023640081|
1|49749.34|000032133154|
1|35572.46|000033093377|
1|246671.01|000042148111|
Here each column is separated by |. I want to read all the columns and do some validation.
How can I do this?
Initially my requirement was to read only 2 or 3 columns, so I did it like this:
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle utl_file.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr(lfile_name, '.*\\'), '\'); -- For the path
v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
v_handle := UTL_FILE.fopen (v_path, v_file, 'r');
LOOP
UTL_FILE.get_line (v_handle, v_filebuffer);
IF SUBSTR (v_filebuffer, 0, 1) = '0' THEN
SELECT line_0 INTO line_0_date
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_0 INTO line_0_Purp
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 4;
SELECT line_0 INTO line_0_count
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 7;
SELECT line_0 INTO line_0_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 8;
SELECT line_0 INTO line_0_ccy
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 9;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '9' THEN
SELECT line_9 INTO line_9_Acc_no
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_9 INTO line_9_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 2;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '1' THEN
line_1_flag := line_1_flag+1;
SELECT line_1 INTO line_1_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_1, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
Line_1_tot := Line_1_tot + line_1_sum;
END IF;
END LOOP;
DBMS_OUTPUT.put_line (Line_1_tot);
DBMS_OUTPUT.PUT_LINE (Line_1_flag);
UTL_FILE.fclose (v_handle);
END;
But now how can I do this? Should I use a SELECT statement like that for all the columns? Sorry for that.
As per our requirement:
I need to read the flat file, and it looks like this:
*0|BP-V1|20100928|01|1|2430962.89|9|2430962.89|MUR|20100928120106*
*9|2430962.89|000111111111|*
*1|61304.88|000014104113|*
*1|41961.73|000022096086|*
*1|38475.65|000023640081|*
*1|49749.34|000032133154|*
*1|35572.46|000033093377|*
*1|246671.01|000042148111|*
*1|120737.25|000053101979|*
*1|151898.79|000082139768|*
*1|84182.34|000082485593|*
I have to check the file:
Validations: the 1st line should start with 0; otherwise raise an error and insert that error into a table.
The same for the 2nd line: it should start with 9, else raise an error and insert it into the table.
Then the 3rd line (and onward) should start with 1, else raise an error and insert it into the table.
After that I have to validate the 1st line's 2nd column: it should be BP-V1, else raise an error and insert it into the table. Then I check the 3rd column (20100928): it should be in YYYYMMDD format, else the same error handling.
I have different validations like this for all the columns.
Then check the 2nd line's 3rd column, an account number: first, it should be 12 characters, else error. Then I compare it with what the user entered in the form. For example, if the user entered 111111111, I compare it with 000111111111 from the 2nd line; I have to prefix the user's account number with 000.
Then for all the lines starting with 1, I take the 2nd column and sum the values. I compare that sum with the 6th column of the 1st line (the one starting with 0); they should match, else error.
In the same way I count all the lines starting with 1 and compare the count with the 7th column of the 1st line; they should be equal. In this file it should be 9.
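The checks described can be sketched as follows (Python used purely for illustration; the actual Forms code would use TEXT_IO and the Split function; column positions are taken from the sample file):

```python
def validate(lines):
    """Apply the record-type, sum and count checks described above."""
    errors = []
    header = lines[0].split('|')
    if header[0] != '0':
        errors.append('first line must start with 0')
    if lines[1].split('|')[0] != '9':
        errors.append('second line must start with 9')
    details = [ln.split('|') for ln in lines[2:]]
    if any(d[0] != '1' for d in details):
        errors.append('third line onward must start with 1')
    # 6th column of line 0 holds the detail total, 7th the detail count
    if round(sum(float(d[1]) for d in details), 2) != float(header[5]):
        errors.append('detail sum does not match header total')
    if len(details) != int(header[6]):
        errors.append('detail count does not match header count')
    return errors

# Small made-up file that satisfies every rule:
sample = [
    '0|BP-V1|20100928|01|1|31.00|2|31.00|MUR|20100928120106',
    '9|31.00|000111111111|',
    '1|10.50|000014104113|',
    '1|20.50|000022096086|',
]
# validate(sample) -> []
```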
MY CODE IS :-
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle TEXT_IO.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
LC$String VARCHAR2(50) ;--:= 'one|two|three|four|five|six|seven' ;
LC$Token VARCHAR2(100) ;
i PLS_INTEGER := 2 ;
lfirst_char number;
lvalue Varchar2(100) ;
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr(lfile_name, '.*\\'), '\'); -- For the path
--v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
Message(lfile_name);
v_handle := TEXT_IO.fopen(lfile_name, 'r');
BEGIN
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
--Message('First Char '||lfirst_char);
IF lfirst_char = '0' Then
Loop
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
Message('VAL - '||LC$Token);
lvalue := LC$Token;
EXIT WHEN LC$Token IS NULL ;
i := i + 1 ;
End Loop;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (9999,'0002','First line should always start with 0');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if ;
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '9' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (8888,'0016','Second line should start with 9');
Forms_DDL('Commit');
raise form_Trigger_failure;
End IF;
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '1' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (7777,'0022','The third line onward should start with 1');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if;
END LOOP;
--END IF;
END LOOP;
EXCEPTION
When No_Data_Found Then
TEXT_IO.fclose (v_handle);
END;
Exception
When Others Then
Message('Other error');
END;
I am calling the SPLIT function you gave as mcb_simulator_pkg.Split. -
Error in Flat File to XML conversion
Hi all,
I am trying to convert a flat file to XML using the Sender File Adapter and I am getting the following error message.
2006-01-23 17:23:00 EST: Error: Conversion of complete file content to XML format failed around position 0: Exception: ERROR converting document line no. 2 according to structure 'GL_FileUpload_SAPECC_Header_DT1':java.lang.Exception: ERROR in configuration: more elements in file csv structure than field names specified!
My flat file looks like this,
--Start
GL,GLI,1,RefTest,4011,Test,1234567890,12032005,12032005,GL,RK
GL,GLI,4011,3011,,,,,,AU,600,7000,8000,9000,5000,RK,,,,,,,,,,,,,,,,,,,,
---End
The adapter configuration is like this:
Document Name: GL_FileUpload_SAPECC_Item_MT1
Document Namespace: urn:corptech.qld.gov.au:sss_std_offering:gl
RecordSet Name: GL_FileUpload_SAPECC_Record_DT1
RecordSet Namespace: urn:corptech.qld.gov.au:sss_std_offering:gl
RecordSet Structure: GL_FileUpload_SAPECC_Header_DT1,1,GL_FileUpload_SAPECC_Item_DT1,*
RecordSet Sequence: Ascending
Key FieldName: TransType
On the Adapter Properties, I have got:
--Start
GL_FileUpload_SAPECC_Header_DT1.fieldNames: TransType,RowType,SequenceNo,ReferenceKey,SenderSystem,HeaderText,CompanyCode,DocumentDate,PostingDate,DocumentType,ReferenceNo
GL_FileUpload_SAPECC_Header_DT1.fieldSeparator: ,
GL_FileUpload_SAPECC_Item_DT1.fieldNames: TransType,RowType,SequenceNo,GLAccount,CostCentre,ProfitCentre,InternalOrder,WBSElement,TaxCode,Currency,GLAmount,VendorAmount,CustomerAmount,AssetAmount,DRCR,ItemText,VendorNo,CustomerNo,Name,Street,City,PostCode,PoBox,State,Country,BankKey,BankAccount,BankCountry,CalcTax,PaymentTerms,BaseDate,PaymentBlock,PaymentMethod,Assignment,AssetNo,AssetSubNo,AssetTransaction
GL_FileUpload_SAPECC_Item_DT1.fieldSeparator: ,
ignoreRecordsetName: true
GL_FileUpload_SAPECC_Header_DT1.keyFieldValue: GL
GL_FileUpload_SAPECC_Item_DT1.keyFieldValue: GL
---End
The structure defined on the data type looks like this,
--Start
GL_FileUpload_SAPECC_Record_DT1 Complex Type
GL_FileUpload_SAPECC_Header_DT1 Element
TransType Element
RowType Element
Sequence Number Element
GL_FileUpload_SAPECC_Item_DT1 Element
TransType Element
RowType Element
Sequence Number Element
GLAccount Element
---End
Any help or suggestion please.
Thank you.
Warm Regards,
Ranjan
Hi, Ranjan.
First of all, let's look at the meaning of the error.
> ...Exception: ERROR converting document line no. 2 according to
> structure 'GL_FileUpload_SAPECC_Header_DT1':java.lang.Exception:
> ERROR in configuration: more elements in file csv structure than
> field names specified!
It seems that XI interpreted the 2nd line as Header_DT1, not as the Item_DT1 you intended.
> GL,GLI,4011,3011,,,,,,AU,600,7000,8000,9000,5000,RK,,,,,,,,,,,,,,,,,,,,
That's why it says this line has more elements than the defined structure (Header_DT1).
And the reason XI misinterpreted the line above as a header is that you used the same keyFieldValue for both structures.
> ...Header_DT1.keyFieldValue: GL
> ...Item_DT1.keyFieldValue: GL
According to the following help,
http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/content.htm
it says the following:
Key Field Name
If you specified a variable number of substructures for Recordset
Structure, in other words, at least one substructure has the value
*, then the substructures must be identified by the parser from
their content. This means that a key field must be set with different
constants for the substructures. In this case, you must specify a key
field and the field name must occur in all substructures.
How about using different constants for header and item if possible?
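For example (hypothetical marker values; any two distinct constants in the key field column would do), the file and the parameters could look like this:

```
HDR,GLI,1,RefTest,4011,Test,1234567890,12032005,12032005,GL,RK
ITM,GLI,4011,3011,,,,,,AU,600,7000,8000,9000,5000,RK,,,,,,,,,,,,,,,,,,,,

GL_FileUpload_SAPECC_Header_DT1.keyFieldValue: HDR
GL_FileUpload_SAPECC_Item_DT1.keyFieldValue: ITM
```

With distinct constants in the key field (TransType here), the parser can tell header and item lines apart.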
Good luck. -
How to convert a flat file to XML without a key field value
Hi experts,
I have an input CSV flat file which has one HEADER line and multiple DETAIL records.
I do not have any key field value to convert this flat file to XML using normal file content conversion.
Is there any module or bean which can be used to convert this flat file to XML?
Or any other remedy to overcome this problem?
Thanks in advance.
Regards
Pradeep
Hi Saurabh,
Thanks for the reply.
Your understanding is perfectly right. Let's say my file is like below.
Account Number,Account Name,Currency,Unclear Balance,Account Balance
010205000033,VAISHNAVI SALES CORPN,0.00,0.00,350000.00
010205000034,CHAKKA ENTERPRISES,0.00,-641350.47,8649.53
010205000035,SEHGAL TRADING COMPANY,338665.00,-220.00,461115.00
010205000036,SHUBH LAXMI AGENCIES,0.00,0.00,0.00
010205000037,EMPIRE AGENCIES,0.00,-245.11,0.00
010205000038,PIONEER AGENCIES,0.00,-696386.00,303614.00
I am not using the first line in the mapping.
Are you saying that I need to ignore the first line during the conversion, using the Recordset Offset?
Or do I need to make a single structure with all the fields from the header as well as the details, and ignore the header fields while defining the structure?
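Just to make the offset idea concrete, ignoring one header line amounts to this (an illustrative Python sketch using the sample rows above):

```python
flat = [
    'Account Number,Account Name,Currency,Unclear Balance,Account Balance',
    '010205000033,VAISHNAVI SALES CORPN,0.00,0.00,350000.00',
    '010205000034,CHAKKA ENTERPRISES,0.00,-641350.47,8649.53',
]

# Offset 1: skip the header row; every remaining row has the same
# detail structure, so no key field is needed to tell records apart.
details = [row.split(',') for row in flat[1:]]
```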
Please suggest. -
Converting flat file to XML using JMS
Hi,
I want to convert a flat file to XML. My sender adapter is JMS.
Can anyone tell me how to do that conversion? The conversion is very simple.
Can we use file content conversion in the JMS sender? Any link or blog? Or any other idea to achieve this?
Regards
Kulwinder
Hi,
The PDF seems to have been removed from that link.
Anyway, go through the help below; everything in "HowToConveModuleJMS.pdf" is covered there:
http://help.sap.com/saphelp_nw04/helpdata/en/24/4cad3baabd4737bab64d0201bc0c6c/content.htm
Hope it helps.
You can also go through the link below for better understanding:
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a05b2347-01e7-2910-ceac-c45577e574e0
Regards,
Swetha.
Edited by: Swetha Reddy on Feb 26, 2009 5:30 AM -
Hello,
I have a flat file to XML scenario. In the sender communication channel I configured content conversion for the file, and the payload in SXMB_MONI looks OK. The output is declared as a file, but I receive an empty XML file.
What am I missing?
Why doesn't it go through the mapping I declared, so that I get a valid XML with its data?
Kfir
Hi,
Can you please take the sender-side payload from SXMB_MONI and test the mapping in Message Mapping -> Test tab?
See if you get the same results. Also check whether you have properly mapped the header nodes in the mapping.
Thanks
Swarup
Edited by: Swarup Sawant on Aug 4, 2008 8:08 AM -
IDOC to Flat File and XML ( Need both the Output)
Hi ,
My scenario is IDoc to File.
I need output in 2 formats, i.e. 1) flat file and 2) XML file.
I plan to go with interface determination and 2 interface mappings, but what should the condition be?
Or else 2 receiver determinations, but with what condition?
How do I solve this?
I need the output from the IDoc in 2 formats (flat file and XML file).
Any blog, thread, tips, etc.?
Regards,
Jude
Hi Jude,
Do like this:
1. Create two inbound Service Interface (with the same target Message Type)
2. Create separate Operation mapping with the above service interfaces as target (with the same source service interface)
3. Create Interface Determination and include these two operation mapping (without any condition)
4. Create 2 receiver agreements (with two separate communication channels), one for each target service interface. In one channel use content conversion; the other one will give you normal XML output.
Regards
Suraj -
Flat files to XML conversion module?
Hi Experts,
I would like to know from where I can have free download for flat files to XML conversion module which I can test in development server. Your help will be well appreciated.
Thanks and Regards,
Praveen.

You have the file adapter with FCC (File Content Conversion), which does the flat file to XML conversion. What is your requirement?
Refer to the help or search SDN for file content conversion.
http://help.sap.com/saphelp_nw04/helpdata/en/e3/94007075cae04f930cc4c034e411e1/frameset.htm
few examples:-
Introduction to simple(File-XI-File)scenario and complete walk through for starters(Part1)
Key value:
How to send a flat file with various field lengths and variable substructures to XI 3.0
File Receiver with Content Conversion
chirag -
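To see what file content conversion conceptually produces, here is a minimal Python sketch (not the adapter itself; the record layouts and element names below are hypothetical, loosely based on the structures in the question), assuming a semicolon-separated file whose first field is the RecordType key:

```python
import xml.etree.ElementTree as ET

# Hypothetical record layouts keyed by the RecordType field (first column),
# loosely modeled on a Recordset Structure like "BatchHeader,1,DealerDetail,*".
LAYOUTS = {
    "H": ("BatchHeader", ["RecordType", "RunDate"]),
    "D": ("DealerDetail", ["RecordType", "DealerNumber", "Amount"]),
}

def flat_to_xml(lines):
    """Convert key-field-typed flat-file lines into an XML document string."""
    root = ET.Element("MMADealerStatementInbound_MT")
    recordset = ET.SubElement(root, "MMADealerStmtInbound_Type")
    for line in lines:
        fields = line.rstrip("\n").split(";")
        name, columns = LAYOUTS[fields[0]]  # route line by its key field
        record = ET.SubElement(recordset, name)
        for column, value in zip(columns, fields):
            ET.SubElement(record, column).text = value
    return ET.tostring(root, encoding="unicode")

xml = flat_to_xml(["H;20090226", "D;1000123;99.50"])
```

Routing each line to a record layout by its key field is essentially what the Key Field Name and Recordset Structure parameters do in the sender channel.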
Convert the flat file to XML format
Hi,
I need to write an interface program in R/3 to pull flat-file data from the UNIX application server, do some manipulation, and place it back on the UNIX application server in XML format; from the UNIX box the XML file is picked up by the XI server.
Please give me some idea how to convert the flat file to XML format, whether through a function module or any other logic.
With regards,
Thambee

Hi Thambee,
In addition to the above posts:
Program to convert flat file to XML file.
Please download the tool from this link:
http://www.download.com/Stylus-Studio-2008-XML-Enterprise-Suite/3000-7241_4-10399885.html?part=dl-StylusStu&subj=dl&tag=button&cdlpid=10399885
how to use:
http://www.stylusstudio.com/learn_convert_to_xml.html
http://www.sap-img.com/abap/sample-xml-source-code-for-sap.htm
Flat file to XML
CONVERTION OF FLAT FILE TO XML : NO OUT PUT FOUND
Converting Idoc flat file representation to XML
how to convert flat file into IDOC-XML
Thanks
sandeep sharma
PS: if helpful, kindly reward points. -
Conversion of flat file to XML using DOM
Hi!
I need help converting a flat file to XML using DOM, validating against a specified DTD. Can anybody help me in this regard?
Bye,
lat

First you have to decide how the flat file will map to XML. Will you use attributes or PCDATA for your fields, or both? Will there be a hierarchy, or will it be mostly flat?
Once decided, you'd probably just use a BufferedReader to read the lines one at a time, generate DOM nodes as appropriate, and stick them in the tree. -
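A minimal Python sketch of that line-by-line DOM approach (the element names and the attribute-vs-PCDATA split are illustrative choices, not a fixed rule, and DTD validation is not covered here):

```python
from xml.dom.minidom import Document

def lines_to_dom(lines):
    """Build a DOM document from comma-separated flat-file lines.

    Design decision, as the post suggests: the first field becomes an
    attribute (the record key) and the remaining fields become PCDATA
    children.
    """
    doc = Document()
    root = doc.createElement("records")
    doc.appendChild(root)
    for line in lines:
        key, *values = line.strip().split(",")
        record = doc.createElement("record")
        record.setAttribute("id", key)  # field mapped to an attribute
        for i, value in enumerate(values):
            field = doc.createElement(f"field{i + 1}")  # field as PCDATA
            field.appendChild(doc.createTextNode(value))
            record.appendChild(field)
        root.appendChild(record)
    return doc

doc = lines_to_dom(["P1,100,120", "P2,125,123"])
```

`doc.toxml()` (or `toprettyxml()`) then serializes the tree; validation against a DTD would need a separate validating parser.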
Help required regarding: validation on data loading from flat file
Hi Experts,
I need your help with the following issue.
I need to validate the transactional data being loaded to the GL cube from a flat file:
1) The transactional data should be loaded to the cube only if a master data record exists for the 0GL_ACCOUNT InfoObject.
2) If the master data record does not exist, the record should be skipped from the load, and after the load the system should throw a message saying how many records were skipped (if any).
I would really appreciate your help and suggestions on solving this issue.
Regds
Hari

Hi, write a start routine in transfer rules like this:
DATA: l_s_datapak_line TYPE transfer_structure,
      l_s_errorlog     TYPE rssm_s_errorlog_int,
      l_s_glaccount    TYPE /bi0/pglaccount,
      new_datapak      TYPE tab_transtru.

refresh new_datapak.
loop at datapak into l_s_datapak_line.
  select single * from /bi0/pglaccount into l_s_glaccount
    where chrt_accts eq l_s_datapak_line-<field name in transfer structure/DataSource for CHRT_ACCTS>
      and gl_account eq l_s_datapak_line-<field name in transfer structure/DataSource for GL_ACCOUNT>
      and objvers eq 'A'.
  if sy-subrc eq 0.
    append l_s_datapak_line to new_datapak.
  endif.
endloop.
datapak = new_datapak.
if datapak[] is initial.
* abort <> 0 means skip the whole data package
  ABORT = 4.
else.
  ABORT = 0.
endif.
I have already made some modifications, but you can slightly change it to suit your needs.
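For readers less familiar with ABAP, the filtering logic of the start routine above can be sketched in Python (the master-data key values here are made up for illustration):

```python
# Keys (chart of accounts, G/L account) that exist as active master data;
# this set stands in for the SELECT SINGLE on /BI0/PGLACCOUNT above.
master_keys = {("CAUS", "0000100000"), ("CAUS", "0000200000")}

def filter_datapak(datapak):
    """Keep only records whose master-data key exists; count the skips."""
    kept = [r for r in datapak
            if (r["chrt_accts"], r["gl_account"]) in master_keys]
    skipped = len(datapak) - len(kept)
    return kept, skipped

datapak = [
    {"chrt_accts": "CAUS", "gl_account": "0000100000", "amount": 100},
    {"chrt_accts": "CAUS", "gl_account": "0000999999", "amount": 50},
]
kept, skipped = filter_datapak(datapak)  # skipped == 1
```

The skipped count is what you would report in the message after the load; the ABOR­T flag logic above additionally drops the whole package when nothing survives.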
regards
Emil -
Tracking history while loading from flat file
Dear Experts
I have a scenario wherein I have to load data from a flat file at regular intervals, and I also want to track the changes made to the data.
The data will look like this; it keeps changing, and the key is P1,A1,C1.
DAY 1-- P1,A1,C1,100,120,100
DAY 2-- P1,A1,C1,125,123,190
DAY 3-- P1,A1,C1,134,111,135
DAY 4-- P1,A1,C1,888,234,129
I am planning to load the data into an ODS and then into an InfoCube for reporting. What will the result be, and how can I track history, e.g. if I want to see both the Day 1 data and the current data for P1,A1,C1?
Just to mention: as I am loading from a flat file, I am not mapping RECORDMODE in the ODS from the flat file.
Thanks and regards
Neel
Message was edited by:
Neel Kamal

Hi
You don't mention your BI release level, so I will assume you are on the current release, SAP NetWeaver BI 2004s.
Consider loading to a write-optimized DataStore object to store the data. That way, you will automatically have a unique technical key for each record, and it will be kept for historical purposes. In addition, load to a standard DataStore object, which will track the changes in its change log. Then load the cube from the change log (this avoids your summarization concern), as the changes will be updates (after images) in the standard DataStore object.
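As a rough illustration of the idea (not the actual DataStore object mechanics), this Python sketch derives a change log of after images from daily full snapshots keyed on (P, A, C):

```python
def after_images(snapshots):
    """Derive a change log (after images) from daily full snapshots.

    Each snapshot maps a key tuple like (P1, A1, C1) to its value tuple;
    a log entry is written only when the values for a key change.
    """
    log, current = [], {}
    for day, rows in enumerate(snapshots, start=1):
        for key, values in rows.items():
            if current.get(key) != values:
                log.append((day, key, values))  # after image of the change
                current[key] = values
    return log

days = [
    {("P1", "A1", "C1"): (100, 120, 100)},  # Day 1 load
    {("P1", "A1", "C1"): (125, 123, 190)},  # Day 2 change
]
log = after_images(days)  # two entries: initial load, then the change
```

Loading the cube from such a log of after images is what lets you report both the Day 1 state and the current state for the same key.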
Thanks for any points you choose to assign
Best Regards -
Ron Silberstein
SAP