How-to Scenario: Inbound IDoc - 2 Rows into JDBC Adapter?
Hi,
How can I achieve the following:
An inbound IDoc into XI has to be transformed, and two rows have to be inserted into a database table.
I am able to insert one row using the XMB2DB_XML mode of the JDBC adapter.
A solution could be to create an identical XML structure (duplicate "access" elements) through mapping, but I am unable to get two "access" structures from one inbound IDoc document.
Any ideas are most welcome.
thanks,
Manish
Hi Manish,
On the target side, right-click the "access" element and select "Duplicate Subtree". You should now have two "access" structures to map from the IDoc.
Regards,
Bill
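For reference, a sketch of the target message in XMB2DB_XML mode after the subtree has been duplicated (table and field names here are purely illustrative):

```xml
<StatementName>
  <dbTableName action="INSERT">
    <table>MY_TABLE</table>
    <access>
      <FIELD1>value mapped from IDoc for row 1</FIELD1>
    </access>
    <access>
      <FIELD1>value mapped from IDoc for row 2</FIELD1>
    </access>
  </dbTableName>
</StatementName>
```

Each "access" element becomes one inserted row, so two duplicated subtrees yield two rows from a single statement.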
Similar Messages
-
How to update row by row in the JDBC sender adapter?
Hi friends ,
Now I am reading data from a table using a select query, and I am placing the resulting data in an FTP folder as an XML file.
I want to:
1. Know how many rows I read.
2. Update the read-completed time in each row of the sender-side table.
(I am using select * from a table where tag='n'. I am giving this in the Query SQL Statement processing parameter of the JDBC sender adapter.
I am writing the update query as update table set tag='y' where tag='n'.
Will it perform row by row?)
3. Insert the rows which I read into another R/3 system table, as a log.
Can you please give the procedure to do that?
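On point 2 above, here is a rough sketch of what one poll cycle of the sender adapter does (plain Python with SQLite standing in for the real database; table and column names are illustrative). As far as I understand the adapter, the Query SQL Statement is executed first, and the Update SQL Statement then runs once per poll as a single set-based UPDATE over all matching rows, in the same transaction as the read, not row by row:

```python
import sqlite3

# Sketch only: SQLite stands in for the real sender-side database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE a_table (id INTEGER, payload TEXT, tag TEXT)")
conn.executemany("INSERT INTO a_table VALUES (?, ?, ?)",
                 [(1, "x", "n"), (2, "y", "n"), (3, "z", "y")])

# Query SQL Statement: the rows that become the XML message
rows = conn.execute("SELECT * FROM a_table WHERE tag = 'n'").fetchall()

# Update SQL Statement: one set-based UPDATE marking everything just read
conn.execute("UPDATE a_table SET tag = 'y' WHERE tag = 'n'")
conn.commit()

print(len(rows))  # number of rows read in this poll -> 2
remaining = conn.execute(
    "SELECT COUNT(*) FROM a_table WHERE tag = 'n'").fetchone()[0]
print(remaining)  # rows still flagged 'n' after the update -> 0
```

For point 1, the row count corresponds to len(rows) here; in XI itself you would count the row elements of the resulting message, e.g. in the mapping.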
Expecting your reply soon.
Thank you
Best Regards.,
V.Rangarajan
Hi Raj,
Thanks for your reply. I am new to XI and am just working through a scenario. I am able to read the MS SQL Server table data using the JDBC sender adapter.
Can I use the RFC adapter to insert the values into the R/3 table?
If I have mapped to the RFC fields, will the data be stored in the table once we read it from the MS SQL Server table using the select query of the JDBC sender adapter?
Best Regards
V.Rangarajan -
JDBC adapter access to iSeries DB files in libraries
Hello,
We are trying to connect to an AS/400 iSeries system via the JDBC adapter (not a typical scenario, I think!).
The files (as IBM calls them; in our understanding they are tables) are stored in libraries.
The question is how to add this library (like a path?) to the JDBC adapter and interface description.
Adapter:
JDBC Driver: <driver name>
Connection: <physical address of iSeries DB>
LogIn data: xxx
Interface description:
MESSAGE
STATEMENT = []
TABLENAME
ACTION = select
TABLE = DBNAME.TABLENAME
The interface description is from another successful implementation on an ORACLE DB.
I think the table definition should contain the library, like
<DBNAME>.<LIBRARY>.<TABLENAME>
separated by dots!
I am not sure, so can anyone correct or confirm this?
Currently we are not ready for a first try, because the driver installation is next week.
But it would help to know what we have to do, instead of possibly spending a lot of time on trial and error.
Kind Regards
Dirk
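One hedged suggestion until the driver is in place: with IBM's Toolbox/JTOpen JDBC driver, the library is usually not a third qualifier in the table name but is handled via the connection URL's naming mode, roughly like this (host, library and table names are placeholders):

```
JDBC Driver: com.ibm.as400.access.AS400JDBCDriver
Connection : jdbc:as400://<host>;naming=sql
TABLE      : <LIBRARY>.<TABLENAME>      (two-part name, "." separator)
-- or --
Connection : jdbc:as400://<host>;naming=system;libraries=<LIBRARY>
TABLE      : <TABLENAME>                (library taken from the library list)
```

So a three-part <DBNAME>.<LIBRARY>.<TABLENAME> name is probably not needed; please verify against the driver's documentation once it is installed.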
Of course, helpful answers are valued by points!
I don't know this high-availability mechanism with two IPs, but if you really have two hostnames, you should think about a method to switch from one node to the other in an easy way, deactivating one and activating the other (for instance with <a href="http://help.sap.com/saphelp_nw2004s/helpdata/en/45/0c86aab4d14dece10000000a11466f/frameset.htm">Controlling a Communication Channel Externally</a>).
I suggest you verify how it will be in the production instance: a real cluster? switchover? a virtual IP?
Regards,
Sandro -
JDBC Inbound to separate IDOCS per Row instead of one single IDOC?
Hello everyone,
I'm currently working on the migration of our old B1iSN 8.8 scenarios to B1iSN 9.0.
The scenario is the following: I have a SQL table with data and a separate control table where I store the date of the last run/data retrieval. My JDBC Inbound is triggered every day and the SQL Statement selects all new data between now and the last run. The process flow only consists of one transformation Atom which builds my custom idoc structure.
In B1iSN 8.8 I got one IDoc per row of the result set. In the B1iSN 9.0 outbound, all rows are now within one single IDoc. How can I trigger the creation of separate IDocs?
I've already tried the "Block Size" setting. When I set it to "1", I get two output IDocs for two data sets; however, all the data is in IDoc 1 and the second IDoc is empty. What is wrong here? How can I access the data in the current block? I currently use a "for each" on the normal JDBC result set rows.
I'm testing at the moment with flat-file output, as our target system is not accessible yet, but I suppose this is not the reason?
Thanks in advance for any help.
Hello André,
please use a two-step approach with two integration scenarios:
DB INBOUND -> VOID OUTBOUND
INTERNAL QUEUE INBOUND -> ECC OUTBOUND
In step one, add a condition including a for-each loop to your integration flow.
Within this for-each, please hand over each IDoc you want to create to a second integration step via "Internal queue".
The second step is triggered x times via the internal queue and does the field mapping into IDoc format.
Best regards
Bastian
P.S.: with this approach, you can keep the block-processing setting for DB INBOUND at a higher value. -
How to eliminate inserting Duplicate rows into database using JDBC Adapter
File->Xi->JDBC
In the above scenario, if the file has two rows whose values are identical, how can we avoid inserting duplicate rows into the database when using the JDBC adapter?
The database is a consumer of a service (SOA!).
The database plays a business-system role here.
Mapping is part of an ESB service.
An adapter is a technology adapted to the ESB framework to support a specific protocol.
The ESB carries out ESB duties such as transformation, translation and routing. Routing uses a protocol accepted by the consumer; for a JDBC consumer that is the JDBC protocol, and hence a JDBC adapter.
There is a clear separation of responsibilities between the business system and the ESB. The ESB does not participate in business decisions or reach into the business system's data layer.
So whoever asks people to do the duplicate check as part of the mapping (an ESB service) may not understand integration practice.
Please use an adapter module which executes the duplicate check with the business system in a plug-and-play approach, and keep it separate from the ESB service, so that integrations can be built in an agile way.
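Purely as an illustration of the idea (plain Python, not an actual XI adapter-module API, which would be written in Java): the core of such a duplicate check is a key-based filter over the incoming rows before they reach the INSERT. Alternatively, the JDBC receiver's UPDATE_INSERT action lets the database treat a repeated key as an update instead of a second insert.

```python
def drop_duplicate_rows(rows, key_fields):
    """Keep only the first occurrence of each key; later repeats are dropped."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

# Two identical rows from the file, as in the scenario above
rows = [
    {"id": "1", "name": "A"},
    {"id": "1", "name": "A"},   # duplicate
    {"id": "2", "name": "B"},
]
unique = drop_duplicate_rows(rows, ["id", "name"])
print(len(unique))  # -> 2: only one copy of the duplicate survives
```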
Thanks -
Pass system ack of JDBC adapter to ERP in IDoc - XI - JDBC scenario
Dear all,
I have an IDoc -> XI -> JDBC scenario (without using ccBPM). In the standard flow, the ERP system sending the IDoc waits for an application acknowledgement; however, the JDBC adapter is only capable of sending system acknowledgements.
Is there a way to pass these acknowledgements to the IDoc status record?
On help.sap.com (http://help.sap.com/saphelp_nwpi71/helpdata/en/ab/bdb13b00ae793be10000000a11402f/frameset.htm), under "IDoc Processing with the IDoc Adapter", there is a table that maps XI system/application acknowledgements to IDoc statuses. So in my opinion the status record of my IDoc should at least change to the status corresponding to the system acknowledgement.
Can anyone tell me whether this really works? What happens if I deactivate the acknowledgement request in the NOALE program?
In any case, can I transfer the system acknowledgement to the IDoc status without using ccBPM?
Many thanks and best regards
Florian
Is there a way to pass these acknowledgements to the IDoc status record?
Without BPM, no.
So in my opinion the status record of my IDoc should at least change to the corresponding status for the system ack.
The acknowledgement referred to there concerns the IDoc status, i.e. whether the IDoc properly reached XI or not; it is not related to the JDBC acknowledgement.
What if I deactivate the acknowledgement request in the NOALE program?
Then you won't have any ALEAUD message at the sender R/3.
In any case, can I transfer the system ack to the IDoc status without using ccBPM?
No.
Regards,
Prateek -
Item text not getting copied into sales order from inbound Idoc
Hi All,
I have a scenario in which the sales order is created through an inbound 850 IDoc.
The trading partner sends the sales order information, which goes through GENTRAN and is posted as an inbound IDoc in SAP.
The problem is that there are segments with values for the item text in the IDoc, but the item text is not carried over to the sales order that is created from this IDoc.
There is no problem with the header text; it is present both at the IDoc and at the sales order level.
I have checked the configuration for text determination and everything looks fine.
Please let me know if anyone has come across a scenario like this. It is important, because our EDI fails when an outbound 810 is sent from SAP, for the simple reason that there is no item text.
Thanks
Sridhar
Hello Sridhar,
Do you have access to the original 850 EDI message? Does it contain the item texts? If so, then there is probably a problem with the (XSLT) mapping EDI 850 -> ORDERS.ORDERS05 IDoc.
Are you able to test the mapping independently of the EDI transmission? That way you could nail down the problem.
Regards
Uwe -
Inserting Multiple Rows into Database Table using JDBC Adapter - Efficiency
I need to insert multiple rows into a database table using the JDBC adapter (receiver).
I understand the traditional way of repeating the statement multiple times, each with its own <access> element. However, I am wondering whether this might be inefficient, as it might insert the records one by one.
Is there a way to ensure that the records are inserted into the table as a block, rather than record by record?
Hi Bhavesh/Kanwaljit,
If we have multiple access tags, the connection to the database is made only once, but the data is inserted row by row.
Why am I saying this?
If we set logSQLStatement = true in the JDBC adapter, then in the case of multiple inserts we can see that there are multiple statements:
INSERT INTO tablename(EMP_NAME,EMP_ID) VALUES('J','1000')
INSERT INTO tablename(EMP_NAME,EMP_ID) VALUES('J','2000')
Doesn't this mean that the rows are inserted one by one?
Correct me if I am wrong.
This does not mean that the transaction is not guaranteed: either all the rows are inserted, or everything is rolled back.
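To illustrate the distinction (a generic sketch in Python with SQLite, not XI itself): even when every row produces its own INSERT statement, as logSQLStatement shows, the statements can share one connection and one transaction, which is what gives the all-or-nothing behaviour:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tablename (EMP_NAME TEXT, EMP_ID TEXT)")

rows = [("J", "1000"), ("J", "2000")]  # one tuple per <access> element
try:
    # Logically one INSERT per row, but a single connection and transaction
    for name, emp_id in rows:
        conn.execute(
            "INSERT INTO tablename (EMP_NAME, EMP_ID) VALUES (?, ?)",
            (name, emp_id))
    conn.commit()        # either all rows are committed together ...
except sqlite3.Error:
    conn.rollback()      # ... or none of them are

count = conn.execute("SELECT COUNT(*) FROM tablename").fetchone()[0]
print(count)  # -> 2
```

A true block insert (one statement carrying many rows, e.g. JDBC batching) would avoid the per-statement round trips, but that is a property of the driver/adapter configuration, not of the message structure.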
Regards,
Sumit -
EDI / IDoc: While posting inbound IDocs, IDocs go into status 56
While posting inbound IDocs, the IDocs go into status 56 with the message "EDI: Sender port in control record is invalid".
But if I reprocess the same IDoc without making any changes, using RBDINPUT with the radio button to process IDocs in status 56, it gets posted successfully.
So I do not understand why it gets stuck the first time.
Thanks in advance.
I fixed this myself.
-
hello all,
here I have a basic problem:
I am given a text file (i.e. Notepad) which has around 100 rows;
each row represents one employee's details.
Now I have to convert each row into one IDoc.
For this I created one data type whose structure coincides with the employee fields, and after creating the message interface I mapped it to the already-imported IDoc. Can anyone tell me how to proceed further, with a few details? Any help will be appreciated; it is very urgent. Thank you.
Hi Kutumba,
In ID you need to create at least one business service with a sender file communication channel to receive the file, and one business system for the R/3 system with a receiver IDoc communication channel to receive the IDoc XML.
You also need to create the receiver and the sender agreements.
You can also go through this link:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/6bd6f69a-0701-0010-a88b-adbb6ee89b34
Try doing this and let me know if you face any further problems.
Regards
Neetu
Collective inbound idocs into R/3 system using master_idoc_distribute FM
Hi All,
I want to process inbound IDocs (4 IDocs collectively) in R/3 using the MASTER_IDOC_DISTRIBUTE function module. The IDoc comes into the system as one IDoc with 4 segments (one segment per IDoc, each containing sub-segments). I want to call MASTER_IDOC_DISTRIBUTE for each IDoc to send it to the external system. I am using a custom FM that calls MASTER_IDOC_DISTRIBUTE, assigned to an inbound process code for processing the incoming IDoc.
The FM works perfectly for one IDoc, but it throws an error when more than one IDoc comes in collectively. Any help is appreciated.
Thanks
Use COMMIT WORK separately after each function module call.
-
Conversion of inbound IDOC into XML file
Hi experts,
We have an inbound IDoc (FIDCC2) that triggers the creation of an FI document. We would now like the inbound IDoc to create an XML file with the contents of the IDoc in a given directory, instead of creating the FI document. Can anybody explain how to configure this?
Many thanks in advance,
John De Coninck.
John,
Firstly, I need more information. If I understand you correctly, you are getting an inbound FI document that arrives as an IDoc and posts into FI (using an underlying FM/transaction etc.). Now you want to convert this IDoc into XML in a folder (which can be found in AL11, i.e. on the application server).
Question 1: Where are you getting this IDoc from? Is it through EDI/ALE? What is your interfacing server (the one that is sending you this IDoc)?
Question 2: In what format does this IDoc arrive? Is it flat, XML, etc.?
Question 3: Is this an internally created IDoc? Is there an attached partner profile in WE20? If a partner profile exists and you are using it to create the inbound IDoc, then there are process codes available to forward these IDocs (e.g. ED08).
There are many ways you can achieve this, but everything depends on how your system is configured and how this processing takes place. -
Error in Inbound Idoc trigering Workflow Scenario
I am getting the error below while posting a Z inbound IDoc in the production system.
Please tell me what is going wrong and how to correct it.
Work item 000000000000 cannot be read
Message no. WL803
Diagnosis
The work item with the specified ID can either not be read completely or partly (for example, its container).
System Response
The action was cancelled and the work item, if it still exists, set to error status.
Procedure
Check whether the work item has already been deleted or whether only parts of it can no longer be read.
In the latter case, the work item is inconsistent.
Refer to your workflow system administrator.
Points will surely be given to helpful answers.
Regards
Edited by: Tushar Mundlik on Feb 12, 2008 8:15 AM
Hi Krishna,
First we have to find out whether the error comes from processing in the application or from one of the checks performed before the application is called.
Go to function module IDOC_INPUT_ORDERS and put a breakpoint on "PERFORM edi_mode_to_mem.". Then reprocess the IDoc, and when the program stops at your breakpoint, set INPUT_METHOD to 'A'. This way your order will be processed online (using BDC), and you can find the error.
If the program doesn't stop at your breakpoint, the error comes from one of the earlier checks.
But please try this first, and come back if it doesn't solve your problem.
Kind regards,
John. -
Inbound IDoc FIDCCP02: populating the data into the segments
hello,
can anybody send me sample data for populating the inbound IDoc FIDCCP02?
thanks
Anant
Check the test data below, which I've used in my project.
E1FIKPF
BUKRS 1000
BLART DT
BLDAT 20080908
BUDAT 20080908
WWERT 20080908
BKTXT 2205/033/01W
WAERS GBP
GLVOR RFBU
E1FISEG
BUZEI 001
BSCHL 01
KOART D
SHKZG S
WRBTR 280.00
SGTXT Q8/Pace
E1FINBU
KUNNR KU0002
E1FISEG
BUZEI 002
BSCHL 50
KOART S
SHKZG H
MWSKZ A0
WRBTR 280.00
SGTXT Effluent Charge
KOSTL L6111
HKONT 0000321030
PRCTR L6111 -
Hi Friends,
Can anybody help me? I am facing a problem in IDoc inbound processing.
My scenario is SALES PER RECEIPT - POS: I have to read an XML file (from a PC) and post it.
I am using the IDoc type WPUBON01. After reading the XML file I used the function module MASTER_IDOC_DISTRIBUTE to distribute it.
In WE02 I checked that the IDoc is generated, but the status shows 29.
So I ran the report RSEOUT00, but I get the message 'No IDoc selected for processing'.
Please help me with what I have to do. I have also created the port.
Regards,
P. Kumar.
Hi,
Sarath has made a valid point: don't use MASTER_IDOC_DISTRIBUTE; instead use the BAPI BAPI_IDOC_INPUT1.
Please check the sample code below, which shows how to do this with an inbound IDoc. As you already have the data, you just need to call the BAPI and post it.
Check the code and proceed.
FUNCTION zdtsint052f_gpoms_to_sap_gm.
""Local interface:
*" IMPORTING
*" VALUE(INPUT_METHOD) LIKE BDWFAP_PAR-INPUTMETHD
*" VALUE(MASS_PROCESSING) LIKE BDWFAP_PAR-MASS_PROC
*" EXPORTING
*" VALUE(WORKFLOW_RESULT) LIKE BDWF_PARAM-RESULT
*" VALUE(APPLICATION_VARIABLE) LIKE BDWF_PARAM-APPL_VAR
*" VALUE(IN_UPDATE_TASK) LIKE BDWFAP_PAR-UPDATETASK
*" VALUE(CALL_TRANSACTION_DONE) LIKE BDWFAP_PAR-CALLTRANS
*" TABLES
*" IDOC_CONTRL STRUCTURE EDIDC
*" IDOC_DATA STRUCTURE EDIDD
*" IDOC_STATUS STRUCTURE BDIDOCSTAT
*" RETURN_VARIABLES STRUCTURE BDWFRETVAR
*" SERIALIZATION_INFO STRUCTURE BDI_SER
*" EXCEPTIONS
*" WRONG_FUNCTION_CALLED
* Purpose: On recording the material consumption from
* manufacturing tickets in the GPOMS system, GPOMS
* will report the data to the SAP system. Based
* on this message, the goods issue to the process order
* will be posted in SAP. This interface will be
* used for storage-location-managed materials
* in SAP.
* Program logic: Loop at the data records of the IDoc to get the
* order number and SAP movement type.
* If the material document already exists for this
* order number and movement type, then give an error;
* else call the standard FM 'BAPI_IDOC_INPUT1' to create
* material documents in SAP.
* Declaration of constants
CONSTANTS :lc_item_create(25) TYPE c VALUE 'E1BP2017_GM_ITEM_CREATE',
lc_mbgmcr(6) TYPE c VALUE 'MBGMCR'.
* Declaration of variables
DATA : lv_index TYPE sytabix,
lv_retcode type sy-subrc.
* Declaration of work areas
DATA: lwa_e1bp2017_gm_item_create TYPE e1bp2017_gm_item_create,
lwa_data TYPE edidd, " Work area for IDOC
lwa_control TYPE edidc. " Work Area for control rec
* Read the control data information of the IDoc.
loop at idoc_contrl INTO lwa_control Where mestyp = lc_mbgmcr.
* Extract the data from the segments.
LOOP AT idoc_data INTO lwa_data
WHERE docnum = lwa_control-docnum and
segnam = lc_item_create.
*->> Set the tabix of the internal table
lv_index = sy-tabix.
* Move the material document item segment data
MOVE lwa_data-sdata TO lwa_e1bp2017_gm_item_create.
* Modify the material document item data internal table
PERFORM sub_modify_idocdata changing lwa_e1bp2017_gm_item_create.
*->> set the changed values to the IDOC SDATA
MOVE lwa_e1bp2017_gm_item_create TO lwa_data-sdata.
*->> Modify the table
MODIFY idoc_data FROM lwa_data index lv_index.
* Clear the work areas
CLEAR : lwa_data,
lwa_e1bp2017_gm_item_create.
ENDLOOP. "LOOP AT t_idoc_data
* Call the BAPI function module to create the
* appropriate material document
CALL FUNCTION 'BAPI_IDOC_INPUT1'
EXPORTING
input_method = input_method
mass_processing = mass_processing
IMPORTING
workflow_result = workflow_result
application_variable = application_variable
in_update_task = in_update_task
call_transaction_done = call_transaction_done
TABLES
idoc_contrl = idoc_contrl
idoc_data = idoc_data
idoc_status = idoc_status
return_variables = return_variables
serialization_info = serialization_info
EXCEPTIONS
wrong_function_called = 1
OTHERS = 2.
IF sy-subrc = 1.
RAISE wrong_function_called.
ENDIF.
endloop.
ENDFUNCTION.
***INCLUDE LZDTSINT052F_GPOMS_GMF01 .
*& Form sub_modify_idocdata
* Modify the material document item data internal table
FORM sub_modify_idocdata
CHANGING pwa_e1bp2017_gm_item_create TYPE e1bp2017_gm_item_create.
* Constant declaration
CONSTANTS: lc_261(3) TYPE c VALUE '261'.
DATA : lv_aplzl LIKE resb-aplzl,
lv_aufpl LIKE resb-aufpl,
lv_subrc LIKE sy-subrc,
lv_charg LIKE resb-charg,
lv_uom LIKE pwa_e1bp2017_gm_item_create-entry_uom.
CLEAR: pwa_e1bp2017_gm_item_create-reserv_no,
pwa_e1bp2017_gm_item_create-res_item.
*->> Get SAP storage bin & Storage type from the Z table
SELECT lgtyp lgpla
INTO (pwa_e1bp2017_gm_item_create-stge_type,
pwa_e1bp2017_gm_item_create-stge_bin)
UP TO 1 ROWS
FROM zdtsint050_sttyp
WHERE zstorage_typ = pwa_e1bp2017_gm_item_create-stge_type
AND zstorage_bin = pwa_e1bp2017_gm_item_create-stge_bin.
ENDSELECT.
IF sy-subrc NE 0.
CLEAR: pwa_e1bp2017_gm_item_create-stge_type,
pwa_e1bp2017_gm_item_create-stge_bin.
ENDIF.
PERFORM get_oper CHANGING pwa_e1bp2017_gm_item_create.
* Get the reservation number and reservation item number
* based on the IDoc data.
SELECT rspos werks lgort
INTO (pwa_e1bp2017_gm_item_create-res_item,
pwa_e1bp2017_gm_item_create-plant,
pwa_e1bp2017_gm_item_create-stge_loc)
FROM resb
UP TO 1 ROWS
WHERE rsnum = pwa_e1bp2017_gm_item_create-reserv_no
AND matnr = pwa_e1bp2017_gm_item_create-material
AND charg = pwa_e1bp2017_gm_item_create-batch
AND aufnr = pwa_e1bp2017_gm_item_create-orderid
AND vornr = pwa_e1bp2017_gm_item_create-activity
AND bwart = lc_261.
ENDSELECT.
IF sy-subrc <> 0.
* Start of insertion for R31K993797
CLEAR lv_charg.
SELECT rspos werks lgort
INTO (pwa_e1bp2017_gm_item_create-res_item,
pwa_e1bp2017_gm_item_create-plant,
pwa_e1bp2017_gm_item_create-stge_loc)
FROM resb
UP TO 1 ROWS
WHERE rsnum = pwa_e1bp2017_gm_item_create-reserv_no
AND matnr = pwa_e1bp2017_gm_item_create-material
AND charg = lv_charg
AND aufnr = pwa_e1bp2017_gm_item_create-orderid
AND vornr = pwa_e1bp2017_gm_item_create-activity
AND ( splkz = 'X' or
splkz = space )
AND bwart = lc_261.
ENDSELECT.
IF sy-subrc <> 0.
* End of insertion for R31K993797
SELECT SINGLE werks lgort
INTO (pwa_e1bp2017_gm_item_create-plant,
pwa_e1bp2017_gm_item_create-stge_loc)
FROM resb
WHERE rsnum = pwa_e1bp2017_gm_item_create-reserv_no.
CLEAR : pwa_e1bp2017_gm_item_create-reserv_no,
pwa_e1bp2017_gm_item_create-res_item.
ENDIF.
ENDIF.
* Get SAP UOM
SELECT SINGLE zsap_uom
INTO lv_uom
FROM zca_uom_conv
WHERE zext_uom = pwa_e1bp2017_gm_item_create-entry_uom.
IF sy-subrc = 0.
pwa_e1bp2017_gm_item_create-entry_uom = lv_uom.
ENDIF.
ENDFORM. " sub_modify_idocdata
*& Form get_oper
* Get the operation
*      <--P_PWA_E1BP2017_GM_ITEM_CREATE_RE  Segment
FORM get_oper CHANGING p_pwa_e1bp2017_gm_item_create TYPE
e1bp2017_gm_item_create.
DATA : l_aufpl LIKE afko-aufpl,
l_aplzl LIKE afvc-aplzl.
REFRESH : i_op.
UNPACK p_pwa_e1bp2017_gm_item_create-orderid TO
p_pwa_e1bp2017_gm_item_create-orderid.
* Get the reservation and routing number for the order
SELECT SINGLE
rsnum
aufpl
FROM afko
INTO (p_pwa_e1bp2017_gm_item_create-reserv_no,
l_aufpl)
WHERE aufnr = p_pwa_e1bp2017_gm_item_create-orderid.
IF sy-subrc = 0.
CALL FUNCTION 'CONVERSION_EXIT_NUMCV_INPUT'
EXPORTING
input = p_pwa_e1bp2017_gm_item_create-activity
IMPORTING
output = p_pwa_e1bp2017_gm_item_create-activity.
ENDIF.
ENDFORM. " get_oper
I have modified the incoming data; in your case that is not needed. You only have to look at the first few lines, up to the BAPI call. If you have any queries, let me know.
Regards,
Nagaraj