Flat file vs. final constants - which is the better approach?
Hi,
I have a requirement where I have to set some state variables.
It goes like this:
state A, indicator A, indicator B, flag A : final state to be set
1, true, A, 0 : Declined
1, false, B, 0 : PreDeclined...
... like this I have some 20+ rows, below 50.
Since these are states, the list may increase, but not frequently.
I would like to know the best way to implement the above logic. My peer suggested capturing all that data as hard-coded constants. I was thinking of storing them in a CSV and then reading it, but my peer mentioned that I/O operations slow performance. Can you please suggest a better way (not a DB, since that requires lots of permissions etc.)?
That's interesting. Can you please let me know how to use enums to fit my above requirement?
I have used enums earlier where the requirement was just to return a description string for a code, i.e. just one constant. Moreover, those enums were generated, and they correspond to a DB table.
But here what I want is basically a lookup based on 3 elements that returns the 4th element.
I'm not sure how to use enums for that.
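A minimal sketch of the enum idea, assuming Java (the field names, types, and the two sample rows are illustrative, taken from the table fragment above): each enum constant carries the three lookup keys plus the resulting final state, and an index built once at class-load time gives O(1) lookups with no file I/O.

```java
import java.util.HashMap;
import java.util.Map;

// One constant per row of the state table; the remaining ~20 rows
// would each become one more constant.
enum StateRule {
    DECLINED(1, true, "A", "Declined"),
    PRE_DECLINED(1, false, "B", "PreDeclined");

    private final int state;
    private final boolean indicatorA;
    private final String indicatorB;
    private final String finalState;

    // Index built once when the class loads, so lookups never touch a file.
    private static final Map<String, StateRule> INDEX = new HashMap<>();
    static {
        for (StateRule r : values()) {
            INDEX.put(key(r.state, r.indicatorA, r.indicatorB), r);
        }
    }

    StateRule(int state, boolean indicatorA, String indicatorB, String finalState) {
        this.state = state;
        this.indicatorA = indicatorA;
        this.indicatorB = indicatorB;
        this.finalState = finalState;
    }

    private static String key(int state, boolean indA, String indB) {
        return state + "|" + indA + "|" + indB;
    }

    /** Returns the final state for the three inputs, or null if no rule matches. */
    public static String finalStateFor(int state, boolean indA, String indB) {
        StateRule r = INDEX.get(key(state, indA, indB));
        return r == null ? null : r.finalState;
    }
}
```

Because the table lives in code, adding a row when a new state appears is a one-line change, and the compiler rejects malformed rows; since the states change only rarely, that occasional redeploy seems an acceptable trade for avoiding file I/O.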
Similar Messages
-
Export table data in a flat file without using FL
Hi,
I am looking for options to export table data into a flat file without using FL (File Layout), i.e., by using an App Engine program only.
Please share your experience if you have done anything like this.
Thanks
A simple way to export any record (table/view) to a CSV file is to create a rowset and loop through all record fields, like the example code below:
Local Rowset &RS;
Local Record &Rec;
Local File &MYFILE;
Local string &FileName, &strRecName, &Line, &Seperator, &Value;
Local number &numRow, &numField;
&FileName = "c:\temp\test.csv";
&strRecName = "PSOPRDEFN";
&Seperator = ";";
&RS = CreateRowset(@("Record." | &strRecName));
&RS.Fill();
&MYFILE = GetFile(&FileName, "W", %FilePath_Absolute);
If &MYFILE.IsOpen Then
For &numRow = 1 To &RS.ActiveRowCount
&Rec = &RS(&numRow).GetRecord(@("RECORD." | &strRecName));
For &numField = 1 To &Rec.FieldCount
&Value = String(&Rec.GetField(&numField).Value);
If &numField = 1 Then
&Line = &Value;
Else
&Line = &Line | &Seperator | &Value;
End-If;
End-For;
&MYFILE.WriteLine(&Line);
End-For;
End-If;
&MYFILE.Close();
You can of course wrap this piece of code in an application class for generic calling.
Hope it helps.
Note:
Do not come complaining to me on performance issues ;) -
Process chains from the flat file by using filezilla client version in BI
Hi experts,
Please let me know how to create process chains from a flat file by using the FileZilla client in BI.
So far I haven't worked with the FileZilla FTP client. Can anybody give a detailed step-by-step procedure to find the flat files, download them, and create process chains from those flat files?
Thanks & Regards,
Babu
Hi,
Check these:
Process chain configuration for Flat file loading
http://wiki.sdn.sap.com/wiki/display/BI/Howtowriteroutinetofetchcurrentday%27sfilename
Regards,
Suman -
Structure of the flat file that uses bapi_po_create1 ?
Hi People,
I am going to create a purchase order using BAPI_PO_CREATE1 to upload the file from legacy to R/3. What will the structure of the flat file be, and what will be the key to differentiate different purchase orders? (For example, in the vendor master the vendor number is the key to differentiate records; as we all know, the purchase order is created only at the end of the transaction, so what will be the key to differentiate each PO record?)
Hi Siva,
Check the Code below. You can refer the fields to prepare the input File .
*& Report YDM_PO_CREATE1 *
REPORT ydm_po_create1.
*-- Input File Declaration
TYPES: BEGIN OF ty_input_file,
column1 TYPE char50,
column2 TYPE char50,
column3 TYPE char50,
column4 TYPE char50,
column5 TYPE char50,
column6 TYPE char50,
column7 TYPE char50,
column8 TYPE char50,
column9 TYPE char50,
column10 TYPE char50,
column11 TYPE char50,
column12 TYPE char50,
column13 TYPE char50,
column14 TYPE char50,
column15 TYPE char50,
column16 TYPE char50,
column17 TYPE char50,
column18 TYPE char50,
END OF ty_input_file.
DATA: i_input_file TYPE STANDARD TABLE OF ty_input_file,
wa_input_file TYPE ty_input_file.
CONSTANTS: c_path TYPE char20 VALUE 'C:\',
c_mask TYPE char9 VALUE ',*.*,*.*.',
c_mode TYPE char1 VALUE 'O',
c_filetype TYPE char10 VALUE 'ASC',
c_x TYPE char01 VALUE 'X'.
PARAMETERS : p_fname LIKE rlgrap-filename.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fname.
*-- Browse Presentation Server
PERFORM f4_presentation_file.
START-OF-SELECTION.
*-- Read presentation server file
PERFORM f1003_upload_file.
IF NOT i_input_file[] IS INITIAL.
PERFORM split_data.
ENDIF.
*& Form f4_presentation_file
*& F4 Help for presentation server
FORM f4_presentation_file .
CALL FUNCTION 'WS_FILENAME_GET'
EXPORTING
def_path = c_path
mask = c_mask
mode = c_mode
title = text-001
IMPORTING
filename = p_fname
EXCEPTIONS
inv_winsys = 1
no_batch = 2
selection_cancel = 3
selection_error = 4
OTHERS = 5.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
ENDFORM. " f4_presentation_file
*& Form f1003_upload_file
*& Upload File
FORM f1003_upload_file .
DATA: lcl_filename TYPE string.
lcl_filename = p_fname.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = lcl_filename
filetype = c_filetype
has_field_separator = c_x
TABLES
data_tab = i_input_file
EXCEPTIONS
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
OTHERS = 17.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
EXIT.
ENDIF.
ENDFORM. " f1003_upload_file
*& Form split_data
*& Collect data for creating Purchase Order
FORM split_data .
DATA: i_poitem TYPE STANDARD TABLE OF bapimepoitem,
i_poitemx TYPE STANDARD TABLE OF bapimepoitemx,
i_poitem_sch TYPE STANDARD TABLE OF bapimeposchedule,
i_poitem_schx TYPE STANDARD TABLE OF bapimeposchedulx,
i_acct_*** TYPE STANDARD TABLE OF bapimepoaccount,
i_acct_assx TYPE STANDARD TABLE OF bapimepoaccountx,
i_services TYPE STANDARD TABLE OF bapiesllc ,
i_srvacc TYPE STANDARD TABLE OF bapiesklc,
i_return TYPE STANDARD TABLE OF bapiret2,
wa_header TYPE bapimepoheader,
wa_headerx TYPE bapimepoheaderx,
wa_poitem TYPE bapimepoitem,
wa_poitemx TYPE bapimepoitemx,
wa_poitem_sch TYPE bapimeposchedule,
wa_poitem_schx TYPE bapimeposchedulx,
wa_acct_*** TYPE bapimepoaccount,
wa_acct_assx TYPE bapimepoaccountx,
wa_services TYPE bapiesllc,
wa_srvacc TYPE bapiesklc,
wa_return TYPE bapiret2,
ws_po TYPE bapimepoheader-po_number.
wa_services-pckg_no = 10.
wa_services-line_no = 1.
wa_services-outl_no = '0'.
wa_services-outl_ind = c_x.
wa_services-subpckg_no = 20.
APPEND wa_services TO i_services.
wa_srvacc-pckg_no = 10.
wa_srvacc-line_no = 1.
wa_srvacc-serno_line = 01.
wa_srvacc-serial_no = 01.
wa_srvacc-percentage = 100.
APPEND wa_srvacc TO i_srvacc.
LOOP AT i_input_file INTO wa_input_file.
IF wa_input_file-column2 EQ 'HD'.
wa_header-doc_type = wa_input_file-column3.
wa_header-creat_date = sy-datum.
wa_header-created_by = sy-uname.
wa_header-vendor = wa_input_file-column4.
PERFORM conversion_output USING wa_header-vendor
CHANGING wa_header-vendor.
wa_header-comp_code = 'DE03'.
wa_header-purch_org = 'DE03'.
wa_header-pur_group = 'DE1'.
wa_header-vper_start = wa_input_file-column9.
wa_header-vper_end = wa_input_file-column10.
wa_headerx-comp_code = c_x.
wa_headerx-doc_type = c_x.
wa_headerx-creat_date = c_x.
wa_headerx-created_by = c_x.
wa_headerx-vendor = c_x.
wa_headerx-purch_org = c_x.
wa_headerx-pur_group = c_x.
wa_headerx-vper_start = c_x.
wa_headerx-vper_end = c_x.
ENDIF.
IF wa_input_file-column2 EQ 'IT'.
wa_poitem-po_item = wa_input_file-column3.
wa_poitem-short_text = wa_input_file-column6.
wa_poitem-plant = wa_input_file-column8.
wa_poitem-quantity = '1'.
wa_poitem-tax_code = 'V0'.
wa_poitem-item_cat = 'D'.
wa_poitem-acctasscat = 'K'.
wa_poitem-matl_group = wa_input_file-column7.
wa_poitem-pckg_no = '10'.
APPEND wa_poitem TO i_poitem .
wa_poitemx-po_item = wa_input_file-column3.
wa_poitemx-po_itemx = c_x.
wa_poitemx-short_text = c_x.
wa_poitemx-plant = c_x.
wa_poitemx-quantity = c_x.
wa_poitemx-tax_code = c_x.
wa_poitemx-item_cat = c_x.
wa_poitemx-acctasscat = c_x.
wa_poitemx-matl_group = c_x.
wa_poitemx-pckg_no = c_x.
APPEND wa_poitemx TO i_poitemx.
wa_poitem_sch-po_item = wa_input_file-column3.
wa_poitem_sch-delivery_date = sy-datum.
APPEND wa_poitem_sch TO i_poitem_sch.
wa_poitem_schx-po_item = wa_input_file-column3.
wa_poitem_schx-po_itemx = c_x.
wa_poitem_schx-delivery_date = c_x.
APPEND wa_poitem_schx TO i_poitem_schx.
wa_acct_***-po_item = 10.
wa_acct_***-serial_no = 01.
wa_acct_***-gl_account = '0006360100'.
wa_acct_***-co_area = '1000'.
wa_acct_***-costcenter = 'KC010000'.
APPEND wa_acct_*** TO i_acct_***.
wa_acct_***-po_item = 10.
wa_acct_***-serial_no = 02.
wa_acct_***-gl_account = '0006360100'.
wa_acct_***-co_area = '1000'.
wa_acct_***-costcenter = 'KC010000'.
APPEND wa_acct_*** TO i_acct_***.
wa_acct_assx-po_item = 10.
wa_acct_assx-serial_no = 01.
wa_acct_assx-po_itemx = c_x.
wa_acct_assx-serial_nox = c_x.
wa_acct_assx-gl_account = c_x.
wa_acct_assx-co_area = c_x.
wa_acct_assx-costcenter = c_x.
APPEND wa_acct_assx TO i_acct_assx.
wa_acct_assx-po_item = 10.
wa_acct_assx-serial_no = 02.
wa_acct_assx-po_itemx = c_x.
wa_acct_assx-serial_nox = c_x.
wa_acct_assx-gl_account = c_x.
wa_acct_assx-co_area = c_x.
wa_acct_assx-costcenter = c_x.
APPEND wa_acct_assx TO i_acct_assx.
wa_services-pckg_no = 20.
wa_services-line_no = 2.
wa_services-service = wa_input_file-column9.
wa_services-quantity = '100'.
wa_services-gr_price = '100'.
wa_services-userf1_txt = wa_input_file-column13.
APPEND wa_services TO i_services.
wa_srvacc-pckg_no = 20.
wa_srvacc-line_no = 1.
wa_srvacc-serno_line = 02.
wa_srvacc-serial_no = 02.
wa_srvacc-percentage = 100.
APPEND wa_srvacc TO i_srvacc.
ENDIF.
ENDLOOP.
CALL FUNCTION 'BAPI_PO_CREATE1'
EXPORTING
poheader = wa_header
poheaderx = wa_headerx
* POADDRVENDOR =
* TESTRUN =
* MEMORY_UNCOMPLETE =
* MEMORY_COMPLETE =
* POEXPIMPHEADER =
* POEXPIMPHEADERX =
* VERSIONS =
* NO_MESSAGING =
* NO_MESSAGE_REQ =
* NO_AUTHORITY =
* NO_PRICE_FROM_PO =
IMPORTING
exppurchaseorder = ws_po
* EXPHEADER =
* EXPPOEXPIMPHEADER =
TABLES
return = i_return
poitem = i_poitem
poitemx = i_poitemx
* POADDRDELIVERY =
poschedule = i_poitem_sch
poschedulex = i_poitem_schx
poaccount = i_acct_***
* POACCOUNTPROFITSEGMENT =
poaccountx = i_acct_assx
* POCONDHEADER =
* POCONDHEADERX =
* POCOND =
* POCONDX =
* POLIMITS =
* POCONTRACTLIMITS =
poservices = i_services
posrvaccessvalues = i_srvacc
* POSERVICESTEXT =
* EXTENSIONIN =
* EXTENSIONOUT =
* POEXPIMPITEM =
* POEXPIMPITEMX =
* POTEXTHEADER =
* POTEXTITEM =
* ALLVERSIONS =
* POPARTNER =
.
* break gbpra8. "user-specific debug breakpoint, commented out
LOOP AT i_return INTO wa_return.
ENDLOOP.
ENDFORM. " split_data
*& Form conversion_output
*& Conversion exit input
FORM conversion_output USING p_ip
CHANGING p_op.
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
EXPORTING
input = p_ip
IMPORTING
output = p_op.
ENDFORM. " conversion_output
I also suggest you search SDN with the key BAPI_PO_CREATE1; you will get more useful links.
Hope this helps.
Manish -
Select flat file name using routine
Hi experts!
I am trying to write a routine in the infopackage for flat file extraction, which will select the flat file automatically according to the date. I need to load always the file of the previous week. Please help me correcting the code. The file name is: DatAuftragsbestandSeiten_W(number of week).fix
For example: DatAuftragsbestandSeiten_W16.fix
Thank you for your help!
program filename_routine.
* Global code
*$$ begin of global - insert your declaration only below this line -
* Enter here global variables and type declarations
* as well as additional form routines, which you may call from the
* main routine COMPUTE_FLAT_FILE_FILENAME below
*TABLES: ...
*DATA: ...
DATA: Str1 value '/strans/appl/anzeigen_bw/DatAuftragsbestandSeiten_W',
Str3 value '.fix'.
DATA: iweek(2).
call function 'WEEKNR_GET'
EXPORTING
DATE = sy-datum
IMPORTING
WEEK+4(2) = iweek.
iweek = iweek - 1.
*$$ end of global - insert your declaration only before this line -
form compute_flat_file_filename
using p_infopackage type rslogdpid
p_datasource type rsoltpsourcer
p_logsys type rsslogsys
changing p_filename type RSFILENM
p_subrc like sy-subrc.
*$$ begin of routine - insert your code only below this line -
* This routine will be called by the adapter,
* when the infopackage is executed.
p_filename =
*....Concatenate str1 iweek str3 into p_filename.
p_subrc = 0.
*$$ end of routine - insert your code only before this line -
endform.
Hi Doris,
try
in global routine
data : l_week type SCAL-WEEK,
i_week(2).
in form compute_...
data : Str1 type string,
Str3 type string.
str1 = '/strans/appl/anzeigen_bw/DatAuftragsbestandSeiten_W'.
str3 = '.fix'.
call function 'DATE_GET_WEEK'
exporting
date = sy-datum
importing
week = l_week.
i_week = l_week+4(2).
i_week = i_week - 1.
if strlen( i_week ) = 1.
concatenate str1 '0' i_week str3 into p_filename.
else.
concatenate str1 i_week str3 into p_filename.
endif.
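For comparison, here is the same previous-week file-name logic sketched in Java (the path and pattern are the ones from the question; the class name is made up). Note that stepping back a whole week, rather than subtracting 1 from the week number, also survives the year boundary where a plain `i_week - 1` would yield 0, and `%02d` handles the zero-padding branch.

```java
import java.time.LocalDate;
import java.time.temporal.WeekFields;

// Build the file name for the ISO week preceding the given date.
class WeekFileName {
    public static String forDate(LocalDate date) {
        // Step back one whole week so the year boundary is handled for us.
        LocalDate lastWeek = date.minusWeeks(1);
        int week = lastWeek.get(WeekFields.ISO.weekOfWeekBasedYear());
        // %02d zero-pads single-digit weeks, replacing the manual strlen check.
        return String.format(
            "/strans/appl/anzeigen_bw/DatAuftragsbestandSeiten_W%02d.fix", week);
    }
}
```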
hope this helps. -
Idoc to flat file mapping using XSLT
Hi,
I am using XSLT mapping. My requirement is mapping between an IDoc and a flat file (XML to text). As I do not want to use FCC, I have opted for XSLT mapping. Please let me know of any article that would be helpful for this.
regards,
Meenakshi
Hi Meenakshi,
Two things:
1. Achieving this functionality using XSLT is very difficult.
2. Secondly, you may not be able to find a document on SDN describing how to convert IDoc-XML to a flat file using XSLT. Try Google.
I found one link like that may be you can get some idea from there
http://www.stylusstudio.com/SSDN/default.asp?action=9&read=6453&fid=48
Also, if you have an XSLT editor like XMLSpy or Stylus Studio, creating your specific XSLT will be much simpler.
Regards
Suraj -
Flat file export with a Constant header record???
I have a mapping that exports the records of a table into a flat file.
But I need to add a constant value as the first line of the export file (a header record).
For example: AAABBBCCCDDDEEEFFFGGG
How can I put a constant header record?
Marcelo,
When you use flat file as a target you can set the header to be your field names. Now... it looks like this is not exactly what you are looking for.
Alternative is to use a pre mapping procedure to create your file with the one line in it, and have the mapping produce additional records that are appended to the first line. Whether you create a new file or append to an existing file is a property on the target file, if I am not mistaken (it would be a configuration property otherwise).
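The create-then-append idea can be sketched generically like this (hedged: the file name, header text, and the use of Java rather than an OWB procedure are all illustrative): write the constant header record first, then append the data records to the same file.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.List;

class HeaderThenData {
    public static void export(Path target, String header, List<String> records)
            throws IOException {
        // Create (or truncate) the file containing only the header line...
        Files.write(target, List.of(header));
        // ...then append the data records, mirroring the "append to an
        // existing file" target property described above.
        Files.write(target, records, StandardOpenOption.APPEND);
    }
}
```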
Thanks,
Mark. -
Find out duplicate rows in a flat file before using sqlldr
Hello, I want to import data via sqlldr from a flat file into a table in my database. Unfortunately my flat file contains some duplicates. So I defined my upload table with a two-column primary key: date and time (and sometimes there is more than one row with the same date and time in the flat file). These primary keys are important to me because I want to use them for later tables, and I can't use the direct path and parallel method while using primary keys.
Is there a tool which can find the duplicates before I use sqlldr? The special case here is that the rows are not exact duplicates; only the date and time columns appear twice. For my purposes it doesn't matter whether the second row with the same date and time has different values. The file contains data that is monitored every second, and that's enough.
It would be nice if someone could help me
cheers
I simply upload from sqlldr to staging tables first.
The staging tables allow duplicates then I do what I need to do in regards to duplicates or missing data during the transfer from the staging tables to the real tables.
The staging tables are also global temporary tables so I don't have to worry about cleaning them up or multiple sessions trampling each other.
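If you do want to screen the flat file before sqlldr runs, a small pre-check is also easy to script. Here is a sketch in Java (assumptions: the delimiter and the first two fields being date and time are made up for illustration) that reports the 1-based line numbers of rows repeating an earlier (date, time) key:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

class DuplicateKeyCheck {
    // Returns the 1-based line numbers whose first two fields repeat an
    // earlier line's (date, time) key.
    public static List<Integer> duplicateLineNumbers(List<String> lines, String sep) {
        Set<String> seen = new HashSet<>();
        List<Integer> dupes = new ArrayList<>();
        for (int i = 0; i < lines.size(); i++) {
            String[] f = lines.get(i).split(sep, -1);
            String key = f[0] + "|" + (f.length > 1 ? f[1] : "");
            if (!seen.add(key)) {
                dupes.add(i + 1);   // this line repeats an earlier key
            }
        }
        return dupes;
    }
}
```

For large files the staging-table route above scales better; this kind of scan is only worthwhile when you want to reject or inspect the file before any load happens.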
I have also used an external table on the datafile instead of sqlldr, this allows me to get rid of the staging table. But that is only good for very small datasets in the file being loaded. -
Flat File to Flat File Scenario using File Adapter
Hi Experts,
In my scenario, the requirement is flat file to flat file through the File Adapter in PI. At the sender end we need to use a sender File Adapter (NFS), and at the receiver end a receiver File Adapter (FTP). The file arrives in encrypted format, and it needs to be sent on in encrypted format.
Any idea how to proceed with this requirement would be helpful.
Thanks and Regards
Soumya
Hi Soumya,
You need to choose the FTP in the "transport protocol" option. Then in the FTP connection parameters choose the option Connection Security and assign the value "FTPS for control and Data Connection". Then the "Command Order" will show up below that option and you could leave the default value itself. If you are using Public private key certificates of X.509 then you could choose the option "Use x.509 certificate for client authentication" and assign the values to the "keystore" and "certificate & private key" fields.
Note: To use the FTP with SSL you need to add the CA certificate to the TrustedCAs keystore view.
Regards,
Vishnu. -
Loading FLat file data using FDMEE having 1 to many mapping
Hi All,
I need to load data from a flat file into a Hyperion Planning application using FDMEE, with a one-to-many mapping.
For example, the data file has 2 records:
Acc Actual Version1 Scene1 1000
Acc Actual Version1 Scene2 2000
Now the target application has 5 dimensions, and the data needs to be loaded as:
acc Actual Version1 entity1 Prod2 1000
Acc Actual Version1 Entity2 Prod2 2000
Please suggest
Regards
Anubhav
From your example I don't see the one-to-many mapping requirement. You have one source data line that maps to a single target intersection. Where is the one-to-many mapping requirement in your example?
-
Duplicate records in flat file extracted using openhub
Hi folks
I am extracting data from a cube through Open Hub into a flat file, and I see duplicate records in the file.
I am doing a full load to a flat file.
I cannot have a technical key because I am using a flat file.
Poonam
I am using aggregates (in the DTP there is an option to use aggregates), the aggregates are compressed, and I am still facing this issue.
Poonam -
Errors in Flat files while using BDC
Hey Experts,
I have made a BDC program which uploads an Excel file into the system. Check the code below.
I tried to run the same data but I received the following error in the system status:
"Excel file://c:\ cannot be processed." On checking the details I received this:
Message no. UX893
Diagnosis
An error occurred while attempting to process Excel file FILE://C:\Users\Administrator\Desktop\jack abaper\.
Check whether:
the file exists
you have authorization to process the file
Excel is installed and can be run
System Response
Excel file FILE://C:\Users\Administrator\Desktop\jack abaper\ is not processed. The activity is terminated.
Procedure
When the error has been corrected, restart the program.
Is there a way for the program to check the exact cause of the error? In my case it was the same data I was uploading.
It would help if it could tell me which row of data in the file is causing the error. Apart from that, I need to run this program in the background, not the foreground; please keep that in mind while forging a solution.
Does anyone have a solution, or a better way? I'd gladly accept it.
Thank you, my fellow ABAPers;
all help is appreciated.
regds
Ja
Below see the program:
REPORT z_uploadbdc NO STANDARD PAGE HEADING LINE-SIZE 255.
TYPE-POOLS: truxs.
SELECTION-SCREEN : BEGIN OF BLOCK blk1 WITH FRAME TITLE text-001.
PARAMETERS: p_file TYPE rlgrap-filename OBLIGATORY DEFAULT 'C:\'.
SELECTION-SCREEN : END OF BLOCK blk1.
*& Global Declarations
TYPES: BEGIN OF t_datatab,
col1(30) TYPE c,
col2(30) TYPE c,
col3(30) TYPE c,
END OF t_datatab.
DATA: it_datatab TYPE STANDARD TABLE OF t_datatab,
wa_datatab TYPE t_datatab.
DATA: it_raw TYPE truxs_t_text_data.
TYPES: BEGIN OF ty_material,
matnr TYPE matnr,
anzahl TYPE anzahl,
eqtyp TYPE eqtyp,
servon TYPE servon,
serbis TYPE serbis,
sernr TYPE serbis,
END OF ty_material.
DATA: it_material TYPE TABLE OF ty_material,
wa_material TYPE ty_material,
it_bdcdata TYPE TABLE OF bdcdata,
wa_bdcdata TYPE bdcdata.
* Table for messages from call transaction.
* The table is automatically filled with messages from call transaction.
DATA BEGIN OF messtab OCCURS 10.
INCLUDE STRUCTURE bdcmsgcoll.
DATA END OF messtab.
DATA : WF_MESSAGE(100).
REFRESH MESSTAB.
*& AT SELECTION SCREEN
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
CALL FUNCTION 'F4_FILENAME'
EXPORTING
field_name = 'P_FILE'
IMPORTING
file_name = p_file.
*include bdcrecx1.
*& START-OF-SELECTION
START-OF-SELECTION.
CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
EXPORTING
* I_FIELD_SEPERATOR =
i_line_header = 'X'
i_tab_raw_data = it_raw
i_filename = p_file
TABLES
i_tab_converted_data = it_material
EXCEPTIONS
conversion_failed = 1
OTHERS = 2.
IF sy-subrc NE 0.
MESSAGE ID sy-msgid
TYPE sy-msgty
NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
*perform open_group.
LOOP AT it_material INTO wa_material.
PERFORM bdc_dynpro USING 'SAPMIEQ0' '2000'.
PERFORM bdc_field USING 'BDC_CURSOR'
'RISA0-SERNR(01)'.
PERFORM bdc_field USING 'BDC_OKCODE'
'BU'.
PERFORM bdc_field USING 'RISA0-MATNR'
wa_material-matnr.
PERFORM bdc_field USING 'RISA0-ANZAHL'
wa_material-anzahl.
PERFORM bdc_field USING 'EQUI-EQTYP'
wa_material-eqtyp.
PERFORM bdc_field USING 'RISA0-SERVON'
wa_material-servon.
PERFORM bdc_field USING 'RISA0-SERBIS'
wa_material-serbis.
PERFORM bdc_field USING 'RISA0-SERNR(01)'
wa_material-sernr.
perform bdc_transaction using 'IQ04'.
CALL TRANSACTION 'IQ04' USING it_bdcdata
MODE 'A' UPDATE 'S'
MESSAGES into messtab.
REFRESH it_bdcdata.
ENDLOOP.
*& Form bdc_dynpro
* text
* -->PROGRAM text
* -->DYNPRO text
FORM bdc_dynpro USING program dynpro.
CLEAR wa_bdcdata.
wa_bdcdata-program = program.
wa_bdcdata-dynpro = dynpro.
wa_bdcdata-dynbegin = 'X'.
APPEND wa_bdcdata TO it_bdcdata.
ENDFORM. "bdc_dynpro
*& Form bdc_field
* text
* -->FNAM text
* -->FVAL text
FORM bdc_field USING fnam fval.
CLEAR wa_bdcdata.
wa_bdcdata-fnam = fnam.
wa_bdcdata-fval = fval.
APPEND wa_bdcdata TO it_bdcdata.
ENDFORM. "bdc_field
*perform close_group.
Edited by: JackAbaper on Feb 8, 2012 3:11 PM
Hi,
This is what you can do.
Upload the file from the presentation server to the application server through CG3Z.
Then you can fetch the file from the application server in your BDC program (which you want to run in the background).
Regards,
Chandan. -
XML to flat file conversion using file content conversion in receiver CC
Hi,
I am working on an IDoc-to-file scenario.
I am having a problem with the receiver communication channel.
I am using file content conversion in the receiver adapter.
My XML format is as follows:
- <Header>
<FILLER1>KTP</FILLER1>
<YEAR_IDOC>YEAR 2006</YEAR_IDOC>
<FILLER2>FIRSTWEEKNUMBER</FILLER2>
<WEEK_IDOC>51</WEEK_IDOC>
<FILLER3>NUMBER WEEKS 26</FILLER3>
<PLANT_CODE>FACTORYM019</PLANT_CODE>
</Header>
- <Record>
<First_material>731000</First_material>
<First_quantity>0000.0</First_quantity>
<First_quantity>0001.9</First_quantity>
<First_quantity>0000.0</First_quantity>
<First_quantity>0000.0</First_quantity>
<First_quantity>0020.0</First_quantity>
<First_quantity>0000.0</First_quantity>
<First_quantity>0000.0</First_quantity>
<First_quantity>0000.0</First_quantity>
<First_quantity>0018.0</First_quantity>
<First_quantity>0000.0</First_quantity>
<Second_material />
<Seond_quantity>000000</Seond_quantity>
<Second_quantity>0011.0</Second_quantity>
<Seond_quantity>000000</Seond_quantity>
<Seond_quantity>000000</Seond_quantity>
<Seond_quantity>000000</Seond_quantity>
<Seond_quantity>000000</Seond_quantity>
<Second_quantity>0049.0</Second_quantity>
<Seond_quantity>000000</Seond_quantity>
<Seond_quantity>000000</Seond_quantity>
<Second_quantity>0067.0</Second_quantity>
<Third_material />
<Third_quantity>000000</Third_quantity>
<Third_quantity>000000</Third_quantity>
<Third_quantity>000000</Third_quantity>
<Third_quantity>0008.0</Third_quantity>
<Third_quantity>000000</Third_quantity>
<Third_quantity>000000</Third_quantity>
</Record>
The file format should be as follows:
KTP YEAR 2006 FIRSTWEEKNUMBER 51 NUMBER WEEKS 26 FACTORYM019
731000 0000.0 0001.9 0000.0 0000.0 0020.0 0000.0 0000.0 0000.0 0018.0 0000.0
0000.0 0011.0 0000.0 0000.0 0000.0 0000.0 0049.0 0000.0 0000.0 0067.0
0000.0 0000.0 0000.0 0008.0 0000.0 0000.0
Could someone help me resolve this issue?
Regards
Praveen
Hi Praveen,
couldn't you simply modify your target DATA type so it will be easier to handle?
For example something like:
<Header>
</Header>
<Record>
<Material>
<Number>..</Number>
<quantity>..</quantity>
<quantity>..</quantity>
</Material>
In this way file content conversion will be easier (easy)!
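For what the flattening itself involves, here is a hedged sketch in Java using the JDK's DOM parser (the element names come from the payload above; the class name is made up). It emits one text line per top-level block by joining child-element text with spaces; it does not attempt the per-material line split, which would still need the restructuring suggested here.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

class XmlToLines {
    // One output line per top-level block (<Header>, <Record>, ...):
    // the text of its child elements joined with single spaces.
    public static List<String> flatten(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        List<String> lines = new ArrayList<>();
        NodeList blocks = doc.getDocumentElement().getChildNodes();
        for (int i = 0; i < blocks.getLength(); i++) {
            Node block = blocks.item(i);
            if (block.getNodeType() != Node.ELEMENT_NODE) {
                continue;   // skip whitespace text nodes between blocks
            }
            StringBuilder line = new StringBuilder();
            NodeList fields = block.getChildNodes();
            for (int j = 0; j < fields.getLength(); j++) {
                Node field = fields.item(j);
                if (field.getNodeType() == Node.ELEMENT_NODE) {
                    if (line.length() > 0) {
                        line.append(' ');
                    }
                    line.append(field.getTextContent().trim());
                }
            }
            lines.add(line.toString());
        }
        return lines;
    }
}
```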
Regards,
Sergio -
Flat File Processing using Batch Job
Hello,
I need some ABAP advice...
The scenario: There's a batch job that runs every hour, picks up all the files from the app server, and processes them to create IDocs. Sometimes a file is still being created on the app server while the batch job runs. Although the file is being written to and is not yet complete, the batch job picks it up to create an IDoc, which leads to errors. Is there a way to check that a file is complete, so that the batch job only processes finished files?
I am using SAP 4.6C and it doesn't allow me to use GET DATASET attributes...
Any suggestions on how to check?
Hi Shipra,
Check this code taken from another post to get the creation time of the file:
*& Report ZFILE_CREATE_DATE
REPORT zfile_create_date.
TABLES epsf.
PARAMETERS dir
LIKE epsf-epsdirnam DEFAULT 'directory here'.
PARAMETERS file LIKE epsf-epsfilnam DEFAULT 'file here'.
DATA : mtime TYPE p DECIMALS 0,
time(10),
date LIKE sy-datum.
CALL FUNCTION 'EPS_GET_FILE_ATTRIBUTES'
EXPORTING
file_name = file
dir_name = dir
IMPORTING
file_size = epsf-epsfilsiz
file_owner = epsf-epsfilown
file_mode = epsf-epsfilmod
file_type = epsf-epsfiltyp
file_mtime = mtime
EXCEPTIONS
read_directory_failed = 1
read_attributes_failed = 2
OTHERS = 3.
IF sy-subrc NE 0.
WRITE: / 'error:', sy-subrc.
ELSE.
*The subroutine p6_to_date_time_tz is sap std present in rstr0400.
PERFORM p6_to_date_time_tz(rstr0400) USING mtime
time
date.
WRITE: / 'mtime:',mtime.
WRITE: / 'date: ', date.
WRITE: / 'time:',time.
ENDIF.
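The mtime idea generalizes to a simple stability check: sample the file size twice, a short interval apart, and treat the file as complete only when it has stopped growing. A minimal sketch in Java (the interval is arbitrary, and the check is a heuristic, not a guarantee that the writer is finished):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

class FileStableCheck {
    // Returns true when the file's size did not change over the wait interval,
    // i.e. it is probably no longer being written to.
    public static boolean looksComplete(Path file, long waitMillis)
            throws IOException, InterruptedException {
        long before = Files.size(file);
        Thread.sleep(waitMillis);
        return Files.size(file) == before;
    }
}
```

A more robust convention, where you control the sender, is to have it write to a temporary name and rename the file only when finished, so the batch job never sees a half-written file at all.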
Regards,
Gilberto Li -
OIM 11g : Flat-File Reconciliation using GTC Connector
I am using OIM version 11.1.1.5
Below are the mappings used:
login -> User Login
firstName -> First Name
lastName -> Last Name
eMail -> Email
organization -> Organization
Employee Type -> Role
User Type -> User Type (note: OIM version 11.1.1.5 does not have Design Console access; it has User Type available for mapping).
I also followed all the steps mentioned in a few threads on this forum, but I am still not able to create a user using the flat file.
I am seeing the same error in the log as given below.
Caused by: oracle.iam.reconciliation.exception.ReconciliationException: Matching rule where clause is null
at oracle.iam.reconciliation.impl.ReconOperationsServiceImpl.getMatchingRule(ReconOperationsServiceImpl.java:476)
at oracle.iam.reconciliation.impl.ReconOperationsServiceImpl.ignoreEvent(ReconOperationsServiceImpl.java:376)
... 48 more
Please suggest what additional steps need to be done.
thanks
-Kathy
Make sure you marked one of your recon fields as 'Matching Only'. If you do not pick one field for matching, GTC will not create any reconciliation rule for you.
Thanks,
Krish