Sender FILE Adapter Content Conversion: Header and Item Data
Hi
I need to pick up a file and perform content conversion. The XML that is produced should have the format shown below; the input is a CSV file that needs to be converted.
Kindly suggest how I can maintain the parameters for header- and item-level data in the Recordset Structure.
How can I specify in the Recordset parameters which data is header level (Customerno, Doc type and Address) and which is item level (Line Item)?
<?xml version="1.0" encoding="UTF-8" ?>
<customernumber></customernumber>
<documenttype></documenttype>
<Address>
  <name1></name1>
  <name2></name2>
</Address>
<LineItem>
  <material>100016</material>
  <amount>1000</amount>
</LineItem>
Hi Swetank,
The file after conversion should have one Header and you can have any number of line items.
Once you choose File Content Conversion as the message protocol while configuring the Sender File Adapter, you get the following entries under Content Conversion Parameters:
Document Name
Document Namespace
Document Offset
Recordset Name
Recordset Namespace
Recordset Structure
Recordsets per Message
Key Field Name
To maintain the header information, some entries are mandatory:
Recordset Name: Specify the name of the structure here. It is included in the XML schema.
Recordset Structure: Enter the sequence and the number of substructures here. Since there is one header but there can be many line items, you write:
customernumber,1,documenttype,1,Address,1,LineItem,*
This format is clearly explained in the link provided by Divya.
In the additional fields you then name the fields belonging to the different recordset structures and add properties to them, such as fixed lengths, separators, and so on.
The most important field is:
Key Field Name: If you specified a variable number of substructures for Recordset Structure, in other words, at least one substructure has the value *, then the substructures must be identified by the parser from their content. This means that a key field must be set with different constants for the substructures. In this case, you must specify a key field and the field name must occur in all substructures.
Since LineItem is the only substructure with an *, the parser has to recognize each record from its content: specify here a field name that occurs in all substructures, and give each substructure its own keyFieldValue constant.
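A hedged sketch of how the Content Conversion Parameters could look for this structure. The CSV is assumed to carry a one-character record-type constant in the first column (the values C/D/A/L and the field name recordType are hypothetical — use whatever constants your file actually contains). Note also that FCC nests the fields under each structure name, so a mapping may still be needed to reach the exact target XML:

```text
Recordset Structure: customernumber,1,documenttype,1,Address,1,LineItem,*
Key Field Name:      recordType

customernumber.fieldSeparator = ,
customernumber.fieldNames     = recordType,customernumber
customernumber.keyFieldValue  = C
documenttype.fieldSeparator   = ,
documenttype.fieldNames       = recordType,documenttype
documenttype.keyFieldValue    = D
Address.fieldSeparator        = ,
Address.fieldNames            = recordType,name1,name2
Address.keyFieldValue         = A
LineItem.fieldSeparator       = ,
LineItem.fieldNames           = recordType,material,amount
LineItem.keyFieldValue        = L
```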
I hope this solves your problem,
Thanks and Regards,
Varun Joshi
Similar Messages
-
Content Conversion: Header and Item
Hi everybody,
as the parameters are not explained in detail, I have to ask you:
I have a file as follows:
SENDER;RECEIVER //this line occurs only once
1111;2222 //this line occurs only once
ARTICELNUMBER;AMOUNT //this line occurs only once
444;10 //this line occurs unbounded
555;20
The result should look like this (lines 1 and 3 have to be ignored!):
<ORDER>
  <HEAD>
    <SENDER>1111</SENDER>
    <RECEIVER>2222</RECEIVER>
  </HEAD>
  <ITEM>
    <ARTICELNUMBER>444</ARTICELNUMBER>
    <AMOUNT>10</AMOUNT>
  </ITEM>
  <ITEM>
    <ARTICELNUMBER>555</ARTICELNUMBER>
    <AMOUNT>20</AMOUNT>
  </ITEM>
</ORDER>
What parameters and which values do I need?
Thanks a lot,
Regards, Mario
Hi,
Check some links on FCC; hope they help you out.
File content conversion sites
/people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
/people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
/people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
/people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
/people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
/people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
/people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
/people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
/people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
http://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
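As a hedged sketch for this particular file (structure and field names are assumptions): line 1 can be skipped with Document Offset, and since FCC cannot skip line 3 in the middle of the file, one workaround is to convert it as its own one-row substructure and simply leave that node unmapped. Using a large fixed count for ITEM instead of * also avoids the key-field requirement, since this file has no key column:

```text
Document Offset:     1
Recordset Structure: HEAD,1,ItemLabel,1,ITEM,999999

HEAD.fieldSeparator      = ;
HEAD.fieldNames          = SENDER,RECEIVER
ItemLabel.fieldSeparator = ;
ItemLabel.fieldNames     = col1,col2
ITEM.fieldSeparator      = ;
ITEM.fieldNames          = ARTICELNUMBER,AMOUNT
```

In the mapping, map HEAD to <HEAD>, ITEM to <ITEM>, and ignore ItemLabel.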
Regards,
phani -
How to distinguish between the header and item data from the given legacy excel file, so that I can correctly extract the Sales Order header and item level data while doing LSMW? Do we need to do any extra coding for doing that?
Hi Diwakar,
Please read below. This is the way to do it. It resolved my issue.
Hi, I think you can use one file but with a record type identifier; the field will appear in all source structures. In the Maintain Source Fields step, amend the field (double-click on it) and point Identifying Field Content to the appropriate level, e.g. 1 for header, 2 for item. Your input file will look like:
REC_TYPE .. FIELDS
1 only fields relevant for the header and blanks for line items
2 only fields relevant for the item data blanks for header fields
2
2
1 Next record header
2 next record item 1
2 next record item 2
1
2
2
2
2
1
2
Diwakar,
Please don't be oversmart, and don't mislead people or hurt others by making nasty comments.
Thanks,
Raj. -
How to upload header and item data of a sales order using a BAPI interface
hi friends,
I am Geetha. I have a problem: how can I upload sales order header and item data (as in transaction VA01) using BAPI function modules?
I need the BAPI function modules for header and item data and a brief explanation of how to pass the importing and tables parameters to get the exact output.
regards
Geetha
Use: BAPI_SALESORDER_CREATEFROMDAT2
Sales order: Create Sales Order
Functionality
You can use this method to create sales orders.
You must enter at least sales order header data (via ORDER_HEADER_IN structure) and partner data (via the ORDER_PARTNERS table) as input parameters.
Enter the item data via the ORDER_ITEMS_IN table. You can allocate item numbers manually, by filling in the relevant fields, or the system does it, according to the settings for Customizing, by leaving the relevant fields blank.
If you have configurable items, you must enter the configuration data in the ORDER_CFGS_REF, ORDER_CFGS_INST, ORDER_CFGS_PART_OF and ORDER_CFGS_VALUE tables.
Credit cards can be transferred via the BAPICCARD structure: on the one hand, data for card identification, and on the other, data for a transaction that has taken place in an external system.
Once you have created the sales order successfully, you will receive the document number (SALESDOCUMENT field). Any errors that may occur will be announced via the RETURN parameter.
If no sales area has been entered in the sales order header, the system derives the sales area from the sold-to party or ship-to party entered in the partner table. If a unique sales area cannot be determined, you will receive a system message and the sales order will not be created.
Notes
1. Mandatory entries:
ORDER_HEADER_IN : DOC_TYPE Sales document type
SALES_ORG Sales organization
DISTR_CHAN Distribution channel
DIVISION Division
ORDER_PARTNERS..: PARTN_ROLE Partner role, SP sold-to party
PARTN_NUMB Customer number
ORDER_ITEMS_IN..: MATERIAL Material number
2. Ship-to party:
If no ship-to party is entered, use the following: Ship-to party =
sold-to party.
3. Commit control:
The BAPI does not perform a database commit. This means that the calling application must trigger the commit so that the changes are written to the database. The BAPI BAPI_TRANSACTION_COMMIT is available for this.
4. German key words:
The following key words must be entered in German, independently of the logon language:
DOC_TYPE Sales document type, for example: TA for standard order
PARTN_ROLE Partner role, for example: WE for ship-to party
Further information
You can find further information in the OSS. The note 93091 contains general information on the BAPIs in SD.
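As an illustration only, a minimal ABAP sketch of the mandatory entries listed above (the organizational values, customer number and material are placeholders, and the sold-to partner role follows the German convention from note 4):

```abap
DATA: ls_header   TYPE bapisdhd1,
      ls_partner  TYPE bapiparnr,
      lt_partners TYPE STANDARD TABLE OF bapiparnr,
      ls_item     TYPE bapisditm,
      lt_items    TYPE STANDARD TABLE OF bapisditm,
      lt_return   TYPE STANDARD TABLE OF bapiret2,
      lv_vbeln    TYPE bapivbeln-vbeln.

" Mandatory header data (note 1); TA = standard order (note 4)
ls_header-doc_type   = 'TA'.
ls_header-sales_org  = '1000'.        " placeholder
ls_header-distr_chan = '10'.          " placeholder
ls_header-division   = '00'.          " placeholder

" Sold-to party (German role key) and customer number
ls_partner-partn_role = 'AG'.
ls_partner-partn_numb = '0000001000'. " placeholder
APPEND ls_partner TO lt_partners.

" At least the material per item
ls_item-material = '100016'.
APPEND ls_item TO lt_items.

CALL FUNCTION 'BAPI_SALESORDER_CREATEFROMDAT2'
  EXPORTING
    order_header_in = ls_header
  IMPORTING
    salesdocument   = lv_vbeln
  TABLES
    return          = lt_return
    order_items_in  = lt_items
    order_partners  = lt_partners.

" No database commit inside the BAPI (note 3)
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
```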
Parameters
SALESDOCUMENTIN
ORDER_HEADER_IN
ORDER_HEADER_INX
SENDER
BINARY_RELATIONSHIPTYPE
INT_NUMBER_ASSIGNMENT
BEHAVE_WHEN_ERROR
LOGIC_SWITCH
TESTRUN
CONVERT
SALESDOCUMENT
RETURN
ORDER_ITEMS_IN
ORDER_ITEMS_INX
ORDER_PARTNERS
ORDER_SCHEDULES_IN
ORDER_SCHEDULES_INX
ORDER_CONDITIONS_IN
ORDER_CONDITIONS_INX
ORDER_CFGS_REF
ORDER_CFGS_INST
ORDER_CFGS_PART_OF
ORDER_CFGS_VALUE
ORDER_CFGS_BLOB
ORDER_CFGS_VK
ORDER_CFGS_REFINST
ORDER_CCARD
ORDER_TEXT
ORDER_KEYS
EXTENSIONIN
PARTNERADDRESSES
Exceptions
Function Group
2032 -
Hi,
What are all the tables from which the data would be taken to display in header and item data in process PO ? From where these data would be taken ?
Thanks alot.
Regards, Sunayana N
ADR10 Printer (Business Address Services)
ADR11 SSF (Business Address Services)
ADR12 FTP and URL (Business Address Services)
ADR13 Pager (Business Address Services)
ADR2 Telephone Numbers (Business Address Services)
ADR3 Fax Numbers (Business Address Services)
ADR4 Teletex Numbers (Business Address Services)
ADR5 Telex Numbers (Business Address Services)
ADR6 E-Mail Addresses (Business Address Services)
ADR7 Remote Mail Addresses (SAP - SAP - Communication; BAS)
ADR8 X.400 Numbers (Business Address Services)
ADR9 RFC Destinations (Business Address Services)
ADRCOMC Comm. Data Serial Number Counter (Business Address Services)
ADRCT Address Texts (Business Address Services)
ADRG Assignment of Addresses to Other Address Groups (BAS)
ADRGP Assignment of Persons to Further Person Groups (BAS)
ADRT Communication Data Text (Business Address Services)
ADRU Table for Communication Usages
ADRV Address Where-Used List (Business Address Services)
ADRVP Person Where-Used List (Business Address Services)
BBP_PDACC Account Assignment
BBP_PDATT Document Attachment
BBP_PDBEH Backend Specific Header Data
BBP_PDBEI Backend Specific Item Data
BBP_PDBGP Partner Extension Gen. Purchasing Data
BBP_PDBINREL Transaction Object Linkage (EBP)
BBP_PDCON Purchase Order Item Confirmation
BBP_PDHAD_V Business Transaction Versions
BBP_PDHCF Set for Tabular Customer and Solution Fields on Hdr
BBP_PDHGP Business Transaction Purchasing Information
BBP_PDHSC Header Extension for Customer Fields
BBP_PDHSS Hdr Extension for SAP Internal Enhancements (IBUs and so on)
BBP_PDIAD_V Business transaction item
BBP_PDICF Set for Tabular Customer and Solution Fields on Itm
BBP_PDIGP Business Transaction Item-Purchasing Information
BBP_PDISC Item Extension for Customer Fields
BBP_PDISS Item Ext. for SAP Internal Enhancements (IBUs and so on)
BBP_PDLIM Value Limit
BBP_PDLINK_V Transaction - Set - Link
BBP_PDORG Purchasing Organizational Unit
BBP_PDPSET Further Procurement Information
BBP_PDTAX Tax
BBP_PDTOL Tolerances
CDCLS Cluster structure for change documents
CDHDR Change document header
CDPOS_STR Additional Change Document - Table for STRINGs
CDPOS_UID Additional Table for Inclusion of TABKEY>70 Characters
CRMD_LINK Transaction - Set - Link
CRMD_ORDERADM_H Business Transaction
CRMD_ORDERADM_I Business Transaction Item
CRMD_PARTNER Partners
CRM_JCDO Change Documents for Status Object (Table JSTO)
CRM_JCDS Change Documents for System/User Statuses (Table JEST)
CRM_JEST Individual Object Status
CRM_JSTO Status Object Information
SROBLROLB Persistent Roles of BOR Objects
SROBLROLC Persistent Roles of Business Classes
SRRELROLES Object Relationship Service: Roles
STXB SAPscript: Texts in non-SAPscript format
STXH STXD SAPscript text file header
STXL STXD SAPscript text file lines
TOA01 Link table 1
TOA02 Link table 2
TOA03 Link table 3
TOAHR Container table for HR administration level
TCURR Exchange rate table -
How to get header and item data in ME_PROCESS_PO_CUST ?
Hi all,
How can I get header and item data in ME_PROCESS_PO_CUST~PROCESS_ACCOUNT? I have to do some validation for the account assignment category in the item overview.
Hi,
Further make the following changes in method IF_EX_ME_PROCESS_REQ_CUST~PROCESS_ITEM.
Get Line item data using method:
CALL METHOD IM_ITEM->GET_DATA
RECEIVING
RE_DATA = W_ITM_DATA.
and then validate the item data from structure W_ITM_DATA..
The same approach works for the header records:
in method IF_EX_ME_PROCESS_REQ_CUST~PROCESS_HEADER.
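For the PO BAdI the question actually asks about, the same pattern could look like this hedged sketch (the check on the account assignment category KNTTP is a purely hypothetical validation):

```abap
METHOD if_ex_me_process_po_cust~process_item.
  DATA: ls_item TYPE mepoitem.

  " Read the current item data
  ls_item = im_item->get_data( ).

  " Hypothetical check on the account assignment category
  IF ls_item-knttp = 'K'.
    " raise a message / mark the item invalid as required
  ENDIF.
ENDMETHOD.
```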
Thanks
Shambhu -
Header and Item data in a module pool
Hi Friends,
I designed a screen in which I have header data and item data.
In my screen header I have the fields vendor number, vendor name, bank name, branch, LC value and due date.
For the item details I have to use a table control; there I have the fields Sl No, PO Number and PO Value.
I also need two push buttons, SAVE and CANCEL.
When I click SAVE, all my screen details should be saved in a Z table, and an LC document number should be generated by the system.
So my question is: shall I take two tables for header and item data, or is one enough? And if I take two tables (one for header and one for item data), how do I write the logic to store into the Z tables?
How can I proceed? Please provide me with your inputs.
Thanks and Regards.
Hi!
You should take one table into which you move both the header and the item data, because the items are displayed according to the header data.
Select the data from whichever tables it comes from, then take a final table that includes all the fields of header and item. Loop over the tables from which you took the data and move the values of those fields into the fields of the final table.
For the ALV you then call the function module to display the ALV with that final table containing all your data, and prepare a field catalog of the fields you want to display in the item list.
For example, here is some sample code:
types:
begin of type_s_vbak,
vkorg type vbak-vkorg, " Sales Organization
aufnr type vbak-aufnr, " Sales Order Number
auart type vbak-auart, " Sales Order Type
kunnr type vbak-kunnr, " Customer Number
vbeln type vbak-vbeln, " Sales Document
knumv type vbak-knumv, " Number of Document Condition
end of type_s_vbak. " BEGIN OF TYPE_S_VBAK
types:
begin of type_s_konv,
kwert type konv-kwert, " Condition Value
kbetr type konv-kbetr, " Rate (Condition Amount)
knumv type konv-knumv, " Number of Document Condition
kschl type konv-kschl, " Condition Type
end of type_s_konv. " BEGIN OF TYPE_S_KONV
types:
begin of type_s_material,
kunnr type vbak-kunnr, " Customer Number
matnr type vbap-matnr, " Material Number
arktx type vbap-arktx, " Material Description
steuc type marc-steuc, " Fiscal Classification
kwmeng type vbap-kwmeng, " Quantity
knumv type vbak-knumv, " Number of Document Condition
vbeln type vbak-vbeln, " Document Number
kwert type konv-kwert, " Condition Value
total type p length 14 decimals 2,
" Total
ipitype type p length 9 decimals 3,
" IPI Type
ipivalue type konv-kwert, " IPI Value
end of type_s_material. " BEGIN OF TYPE_S_MATERIAL
data:
fs_vbak type type_s_vbak.
data:
fs_mati type type_s_material.
data:
fs_konv type type_s_konv.
data:
t_vbak like
standard table
of fs_vbak.
data:
t_konv like
standard table
of fs_konv.
data:
t_mati like
standard table
of fs_mati.
data:
w_total type konv-kwert. " Work variable for totals
loop at t_konv into fs_konv where kschl eq 'ZPNF'.
move fs_konv-kwert to fs_mati-kwert.
modify t_mati from fs_mati
transporting kwert
where knumv eq fs_konv-knumv.
endloop. " LOOP AT T_KONV INTO FS_KONV
* Modifying Total Field of Material Table.
loop at t_mati into fs_mati.
w_total = fs_mati-kwmeng * fs_mati-kwert.
move w_total to fs_mati-total.
modify t_mati from fs_mati
index sy-tabix
transporting total.
clear w_total.
endloop. " LOOP AT T_KONV INTO FS_KONV
* Modifying IPI-Type Field of Material Table.
loop at t_konv into fs_konv where kschl eq 'IPI3'.
w_total = fs_konv-kbetr div 10.
move w_total to fs_mati-ipitype.
modify t_mati from fs_mati
transporting ipitype
where knumv eq fs_konv-knumv.
clear w_total.
endloop. " LOOP AT T_KONV INTO FS_KONV
* Modifying IPI-Value Field of Material Table.
loop at t_konv into fs_konv where kschl eq 'IPI3'.
move fs_konv-kwert to fs_mati-ipivalue.
modify t_mati from fs_mati
transporting ipivalue
where knumv eq fs_konv-knumv.
endloop. " LOOP AT T_KONV INTO FS_KONV
and then fill the fieldcatalog and display the ALV
Edited by: Richa Tripathi on Apr 15, 2009 3:28 PM -
File Receiver content conversion header line
Hi all,
I want to convert an XML to a CSV file using the file receiver adapter.
While doing this I want to add a header line.
The parameter for content conversion:
recordset: ET_AWB,item
parameter list:
item.addHeaderLine = 1
item.fieldSeparator = ;
ET_AWB.addHeaderLine = 1
ET_AWB.fieldSeparator = nl
The file is created successfully but the header line is not written. Could it be that the header line is only created if the recordset is a single type?
Can anybody help me?
Thanks a lot!
Florian
Hi all,
thanks for the information!
My scenario is a RFC-Message with more than 1 table which is mapped into several RFC-Messages (Message split).
Because I use RFC functions for mapping I do not want to change the XML, since then I would need my own DT/MT.
It probably does not work because the record type is not a single recordset.
Unfortunately the option from Prakash with explicitly defined header names does not work either. Has anybody one more idea? Or is what I want to do impossible?
Thanks a lot!
Florian -
Header and Item Data Extraction
Hi Gurus,
For reporting needs, I have to extract both Sales Billing Header (2LIS_13_VDHDR) and Item Data (2LIS_13_VDITM).
Can anyone please give me some idea about the Data Staging.
1. Do I need to create 2 DSO (1 for Header and 1 for Item) for loading the respective Data and then push them into 1 Cube ?
Or
2. I create 2 Cubes (1 for Header and 1 for Item) and load the Data into them.
Or
3. I extract both Data Sources into 1 DSO or 1 Cube.
I have to do reporting on MultiProvider so in all 3 cases (OR as you suggest) I have to link all the InfoProvider to 1 MultiProvider.
Please let me know the right way and guide me ASAP.
Points will be awarded!
Hi, as you are loading with the standard Business Content DataSources, it is better to load into their respective Business Content cubes. System performance is also better if you use Business Content objects.
Then, coming to your requirement: just load the data into the Business Content cubes for the respective DataSources, combine them with a MultiProvider, and generate the reports.
Hope it can help you -
Hi,
What is the difference between header data and item data? Please explain in detail.
In SAP we have various levels of data; take a sales document as an example.
The sales header data controls the overall business flow of the document: whether it behaves as a sales order, a quotation or a free-of-charge delivery.
The items you enter in the order (the materials) are controlled by the item data that you maintain in the item categories.
The sales order is then followed by a delivery, and that data goes to the schedule line level controls.
Hope this sheds some light on the above.
regards,
Amlan Sarkar -
Hi All,
I have loaded a DSO with header data and item data. The key is DOC_NUM and DOC_ITEM.
I have taken one PO number for testing, and the data in the DSO looks like this:
456000337 > > 0.00 $
456000337 > 10 > 1000 $
456000337 > 20 > 1000 $
The first row is the header record and the next two rows are the items.
I expected the BEx report to show the PO and its 2 items only, but it shows the header # record as well, as shown below:
456000337 > # > 0.00 $
456000337 > 10 > 1000 $
456000337 > 20 > 1000 $
Can anyone please help me restrict this at the data target level?
I can restrict the # at query level, but that is not the right solution, is it?
thanks
Hi,
The current set-up is wrong and is bound to give wrong results.
Right now one extra row is maintained separately for the header of the same PO document, and this record will not contain values in the item-specific fields.
Either create two DSOs, one for item and one for header, and use an InfoSet; or in one DSO add new fields specifically for the header data.
Not all fields are item-specific, and some header fields are common to all items, so you can write a routine at the transformation level to populate the newly added fields with the header values.
In the queries you can use these columns to see the header information.
This will make sure that the extra row is changed to new set of columns and you can see the report for desired records only.
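A hedged sketch of such a transformation end routine (the generated type name follows the usual BW transformation pattern; HDR_FIELD1 is a purely hypothetical header attribute added to the DSO):

```abap
" End routine: carry header values down to the item records
DATA: ls_hdr TYPE _ty_s_tg_1.
FIELD-SYMBOLS: <rp> TYPE _ty_s_tg_1.

SORT result_package BY doc_num doc_item.

LOOP AT result_package ASSIGNING <rp>.
  IF <rp>-doc_item IS INITIAL.        " header record (item shows as #)
    ls_hdr = <rp>.
  ELSE.
    <rp>-hdr_field1 = ls_hdr-hdr_field1.  " hypothetical header field
  ENDIF.
ENDLOOP.

" Optionally drop the pure header rows afterwards
DELETE result_package WHERE doc_item IS INITIAL.
```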
Thanks
Ajeet -
Sender File Adapter - Content Conversion
HI Friends,
I have a scenario where I need to convert a file to an XML document through the sender file adapter.
My file looks like this:
BATCH1234........
12DASER123142JMM
237DSAFDLKC839890
45SDFLASJ90011
BATCH3455...
132FGAR
SD21352525
BATCH998898...
123145DSRTW
12FSTS
So there is a header and a body for each recordset.
My XML Structure is as follows.
<TimeStructure>
<TimeRecord>
<ControlRec>
<Field1>BATCH </Field1>
<Field2> ...</Field2>
</ControlRec>
<DataRec>
<F1> ...... </F1>
<F2> ...... </F2>
</DataRec>
</TimeRecord>
<TimeRecord>
<ControlRec>
<Field1> BATCH </Field1>
<Field2> ADFAS </Field2>
</ControlRec>
<DataRec>
<F1> ...... </F1>
<F2> ...... </F2>
</DataRec>
</TimeRecord>
</TimeStructure>
The blog /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
is somewhat relevant to my requirement.
But the problem is that I have the key field "BATCH" for my header record but no key field in the data records of the input file.
Please help me out how to mention the configuration parameters.
Regards,
Kumar
Hi,
If you don't have a constant key value in your detail records, then you cannot directly get the required XML.
In this case you can read all the records in a common row model, i.e. each record is treated as one row with all its values, and then split the row with substring or Java functions in the mapping.
You can also do this in an adapter module.
If you do have a key value to identify each record, then you can use content conversion.
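The row-model approach can be sketched with content-conversion parameters like these (the structure name Row, the field name line and the separator are assumptions; the field separator just has to be a character that never occurs in the data, so the whole line lands in the single field):

```text
Recordset Structure: Row,*
Row.fieldNames     = line
Row.fieldSeparator = '0x09'
Row.endSeparator   = 'nl'
```

With a single substructure the parser has nothing to distinguish, so this Row,* pattern is commonly used without a key field; the mapping then splits each <line> value with substring logic, e.g. checking whether it starts with BATCH.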
Regards,
Moorthy -
Key field from content - Sender file adapter content conversion
I am reading a source CSV file that has this structure. All rows in the source file are the same structure: line items of a PO. But there will be multiple POs in a single file, identified by the PO number as one column in the file.
PONum,LineItemNum,Qty,Description
001,1,34,Carrots
001,2,17,Apples
001,3,22,Bananas
002,1,4,Mangos
002,2,9,Coconuts
003,1,44,Grapes
Goal is to generate 3 messages, one for each PO:
<po>
<num>001</num>
<line_items>
... 3 line items for PO # 001 ...
</line_items>
</po>
<po>
<num>002</num>
<line_items>
... 2 line items for PO # 002 ...
</line_items>
</po>
<po>
<num>003</num>
<line_items>
... 1 line item for PO # 003 ...
</line_items>
</po>
Is there any way to use the Content Conversion Key Field Name to group the line items into the correct 3 messages? "Key Field Name" expects a static identifier for each type of row; but mine varies by the PO number in the content.
Or do I need to do this in the mapping? If so, what is the easiest way to split 1 large message of all line items into multiple target messages based on the PO number? (I assume this is better than sending individual line item messages and aggregating them later, as long as the file size is OK.)
Thanks in advance!
RBL
Edited by: Robert Burfoot-Lobo on Apr 8, 2009 11:43 AM
Hi Robert,
If your goal is to split into 3 messages, one for each PO, you can go for a message split and achieve that with graphical mapping.
Within the message mapping go to the tab Messages.
Change the occurrence of the target message to 0..unbounded.
Also this link may help you.
/people/claus.wallacher/blog/2006/06/29/message-splitting-using-the-graphical-mapping-tool
Regards,
Madhu -
Sender file adapter - content conversion question
Hi all
We have a .csv file to be passed to XI that has column headings as well. Is there a way of stripping the header line using content conversion? We declared the Recordset Structure as "header,1,item,*", but then it needs a Key Field Name and identifier, which is obviously not available, since the first line only has column headings.
Hoping for a reply soon.
Thanks
Salil
Salil,
In the Recordset Structure define "header,1,item,999999999". If you have more than 999999999 records, the extra records after that will come as a second file.
If you expect more item records, increase the number of 9's. With a fixed number we don't need to give the key field.
---Satish -
Sender File Adapter content conversion problem
Hi all,
is it possible to do content conversion like this:
Key;Field1;Field2;Field3
PO00H;0482000000;20061102;PL61
PO01I;00010;0A720;Material 1;100.000
PO02D;20061102;100.000;
PO00H;0482000001;20061102;PL63
PO01I;00010;0A730;Material 2;40.000
PO02D;20061102;40.000;
PO01I;00010;0A740;Material 3;140.000
PO02D;20061102;30.000;
PO02D;20061103;110.000;
convert to
<?xml version="1.0" encoding="utf-8" ?>
<MT_PO>
<DT_PO>
<DocumentHeader>
<PONumber>0482000000</PONumber>
<PODate>20061102</PODate>
<CompanyCode>PL61</CompanyCode>
</DocumentHeader>
<Item>
<ItemHeader>
<ItemNumber>00010</ItemNumber>
<MaterialCode>0A720</MaterialCode>
<MaterialDescription>Material 1</MaterialDescription>
<Quantity>100.000</Quantity>
</ItemHeader>
<ItemDetail>
<DeliveryDate>20061102</DeliveryDate>
<Quantity>100.000</Quantity>
</ItemDetail>
</Item>
</DT_PO>
<DT_PO>
<DocumentHeader>
<PONumber>0482000001</PONumber>
<PODate>20061102</PODate>
<CompanyCode>PL63</CompanyCode>
</DocumentHeader>
<Item>
<ItemHeader>
<ItemNumber>00010</ItemNumber>
<MaterialCode>0A730</MaterialCode>
<MaterialDescription>Material 2</MaterialDescription>
<Quantity>40.000</Quantity>
</ItemHeader>
<ItemDetail>
<DeliveryDate>20061102</DeliveryDate>
<Quantity>40.000</Quantity>
</ItemDetail>
</Item>
<Item>
<ItemHeader>
<ItemNumber>00010</ItemNumber>
<MaterialCode>0A740</MaterialCode>
<MaterialDescription>Material 3</MaterialDescription>
<Quantity>140.000</Quantity>
</ItemHeader>
<ItemDetail>
<DeliveryDate>20061102</DeliveryDate>
<Quantity>30.000</Quantity>
</ItemDetail>
<ItemDetail>
<DeliveryDate>20061103</DeliveryDate>
<Quantity>110.000</Quantity>
</ItemDetail>
</Item>
</DT_PO>
</MT_PO>
Ivan,
I fear this is not possible.
You can have
<?xml version="1.0" encoding="utf-8" ?>
<MT_PO>
<DocumentHeader>
<PONumber>0482000000</PONumber>
<PODate>20061102</PODate>
<CompanyCode>PL61</CompanyCode>
</DocumentHeader>
<ItemHeader>
<ItemNumber>00010</ItemNumber>
<MaterialCode>0A720</MaterialCode>
<MaterialDescription>Material 1</MaterialDescription>
<Quantity>100.000</Quantity>
</ItemHeader>
<ItemDetail>
<DeliveryDate>20061102</DeliveryDate>
<Quantity>100.000</Quantity>
</ItemDetail>
</MT_PO>
You can have multiple <ItemHeader> and multiple <ItemDetail> elements.
Do the content conversion like this and then, in the mapping, convert it to whatever structure you need.
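A hedged sketch of that flat conversion, using the key values visible in the file (the first label line is skipped with Document Offset; the field names are taken from the target XML):

```text
Document Offset:     1
Recordset Structure: DocumentHeader,*,ItemHeader,*,ItemDetail,*
Key Field Name:      Key

DocumentHeader.fieldSeparator = ;
DocumentHeader.fieldNames     = Key,PONumber,PODate,CompanyCode
DocumentHeader.keyFieldValue  = PO00H
ItemHeader.fieldSeparator     = ;
ItemHeader.fieldNames         = Key,ItemNumber,MaterialCode,MaterialDescription,Quantity
ItemHeader.keyFieldValue      = PO01I
ItemDetail.fieldSeparator     = ;
ItemDetail.fieldNames         = Key,DeliveryDate,Quantity
ItemDetail.keyFieldValue      = PO02D
```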
Regards,
JaiShankar