Error in VirtualProvider (is it possible from a flat file?)
hi friends,
In this step I am using flat file extraction.
I created a DataSource, and under the Extraction tab I set "direct access allowed"; I did not create an InfoPackage.
I created a VirtualProvider and applied transformations; I created a DTP and activated it.
But I did not execute it. I right-clicked my VirtualProvider and activated direct access. Now when I try to see data in the VirtualProvider, I am getting this error.
Please let me know whether what I did is correct or not; if it is correct, how can I solve this problem?
<b>errors:</b>
<b>error in bw
operation generate could not be carried out for request 8
object data transfer process dtp_diel1003lx0biivvkp is locked</b>
hi,
please answer whether this is possible or not.
Similar Messages
-
Error while loading table from flat file (.csv)
I have a flat file which I am loading into a target table in Oracle Warehouse Builder. It uses SQL*Loader internally to load the data from the flat file, and I am facing an issue. Please find the following error (this is an extract from the generated error log):
SQL*Loader-500: Unable to open file (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv)
SQL*Loader-552: insufficient privilege to open file
SQL*Loader-509: System error: The data is invalid.
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
I believe this is related to a SQL*Loader error.
Actually the flat file resides on my system (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv). I am connecting to an Oracle server.
Please suggest
Is it required that I place the flat file on the Oracle server system?
Regards,
Ashoka BL
Hi,
I am getting an error as well which is similar to that described above except that I get
SQL*Loader-500: Unable to open file (/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
The difference is that Ashoka was getting
SQL*Loader-552: insufficient privilege to open file
and I get
SQL*Loader-553: file not found
The initial thought is that the file does not exist in the directory specified or I have spelt the filename incorrectly, but this has been checked and double-checked. The unix directory also has read and write permission.
Also in the error message is
Control File: C:\u21\oracle\owb_staging\WHITEST\source_depot\INV_LOAD_LABEL_INVENTORY.ctl
Character Set WE8MSWIN1252 specified for all input.
Data File: /u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv
Bad File: C:\u21\oracle\owb_staging\WHITEST\source_depot\Durham_Inventory_Labels.bad
As can be seen from the above, it seems to be trying to create the .ctl and .bad files on my C drive instead of on the server in the same directory as the .csv file. The location is registered to the server directory /u21/oracle/owb_staging/WHITEST/source_depot
I am at a loss, as this works fine in development and I have just promoted all the development work to a systest environment using OMBPlus.
The directory structure in development is the same as in systest, except that the data file is /u21/oracle/owb_staging/WHITED/source_depot/Durham_Inventory_Labels.csv, and everything works fine - the .ctl and .bad files are created in the same directory and the data successfully loads into an Oracle table.
Have I missed a setting in OWB during the promotion to systest, or is there something wrong in the way the repository in the systest database is set up?
The systest and development databases are on the same box.
Any help would be much appreciated
Thanks
Edwin -
Error occurred in the data uploading from Flat File in BPC NW
Hi,
I am doing a migration project from BPC MS to NW.
In this I am loading data from a flat file into BPC NW. One error occurred in this process: record duplication.
Of 17000 records in total, 7000 records were rejected for the reason of duplication.
The Information about Package Log
/CPMB/MODIFY completed in 0 seconds
/CPMB/CONVERT completed in 2 seconds
/CPMB/LOAD completed in 7 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
FILE= DATAMANAGER\DATAFILES\Aprilmayjun_2011_Budget_V1.CSV
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\ZAPRMAYJUN_2011BUDGET.xls
CLEARDATA= No
RUNLOGIC= Yes
CHECKLCK= No
[Messages]
Task name CONVERT:
No 1 Round:
Record count: 17064
Accept count: 17064
Reject count: 0
Skip count: 0
Task name LOAD:
Reject count: 7230
Submit count: 9834
Application: CorpBudget Package status: WARNING
Could you help me with this at the earliest.
Thanks and Regards
Krishna
Hi,
You cannot load the duplicated records with the standard import package. You need to create a new package linked to the /CPMB/APPEND process chain and do the import. This will consider the duplicate entries as well. It looks like you need to somehow load all the records, including the duplicate ones.
So in Excel go to Manage Data -> Maintain Data Management -> Organize Package List, add a new package, and link it to the standard BPC process chain /CPMB/APPEND.
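If instead the duplicates should be collapsed so the standard import accepts them, they can be pre-aggregated in the CSV before the load. A minimal sketch of that alternative (Python; the column names ACCOUNT/TIME/SIGNEDDATA and the "key columns first, value column last" layout are assumptions, not your transformation file):

```python
import csv
from collections import defaultdict

def aggregate_duplicates(in_path, out_path, key_cols, value_col):
    """Sum the value column for records sharing the same dimension-member
    combination, writing one row per combination (assumes key_cols appear
    before value_col in the file, in file order)."""
    totals = defaultdict(float)
    with open(in_path, newline="") as f:
        reader = csv.DictReader(f)
        fields = reader.fieldnames
        for row in reader:
            key = tuple(row[c] for c in key_cols)
            totals[key] += float(row[value_col])
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(fields)
        for key, total in totals.items():
            writer.writerow(list(key) + [total])
    return len(totals)  # number of distinct combinations written
```

Whether summing duplicates is correct depends on the data (it matches APPEND-style accumulation, not overwrite), so check the business rule first.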
Thanks,
Sreeni -
Error when uploading data from Flat file
I am uploading data from a flat file.
When I upload it, I get this error:
An error occurred in record 1 during execution of conversion exit CONVERSION_EXIT_CUNIT_INPUT for field UNIT.
Can any one help.
Thanx in Advance.
Regards,
Pradeep.
Hi Pradeep,
Refer this link:
CONVERSION_EXIT_CUNIT_INPUT error in flat file load
Re: Error in Flat files upload
Also try this: in the transfer structure, check the checkbox for UNIT and retry the load. Hopefully it will then be fine.
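For context, CONVERSION_EXIT_CUNIT_INPUT typically fails when the unit value in the file is not a valid internal/ISO code (wrong case, or an external spelling). A hypothetical pre-processing sketch of cleaning the UNIT column before the load (the mapping entries below are assumptions for illustration, not taken from SAP's unit tables):

```python
# Assumed example mappings from external spellings to internal codes.
UNIT_MAP = {"PCS": "PC", "PIECE": "PC", "KGS": "KG"}

def normalize_unit(raw):
    """Trim, uppercase, and map a few assumed external unit spellings so the
    value reaching the conversion exit is a plausible internal code."""
    unit = raw.strip().upper()
    return UNIT_MAP.get(unit, unit)
```

The authoritative mapping lives in the SAP system's unit customizing, so any real cleanup list should be taken from there rather than hard-coded.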
Bye
Dinesh -
CUNIT error in data loading from flat file after r/3 extraction
Hi all
After R/3 Business Content extraction, when I load data from a flat file to an InfoCube, I get a conversion exit CUNIT error. What might be the reason? The data in the flat file's 0UNIT column is accurate, and the mapping rules are also correct, but I still get the CUNIT error.
Check your unit: if you are loading amounts or quantities, what mapping do you have and what are you loading from the flat files?
BK -
Error in loading transaction dat from flat file
hi everybody
While I try to load data from a flat file, at the time of scheduling it shows the error "check load from InfoSource", and while monitoring the data it shows the error "the corresponding data packets were not updated using the PSA". What can I do to rectify the problem?
thanks in advance
Hi,
Please check whether the data is in the correct format using the "Preview" option in the InfoPackage.
Check whether all the related AWB objects are active or not.
Regards,
K.Manikandan. -
Error in loading data from flat file....
Hi,
While loading the data from the flat file, only one column is populated in the ODS, and after that the sixth column is populated with null values. Also, when I save changes in the CSV file, they are not getting saved.
Could you please tell me what could be the probable reason for this.
Also what are the points we should keep in mind while loading from flat files.
Regards,
Jeetu
Hi,
You need to take care of -
1. Your flat file structure (left-to-right columns) should match the DataSource (top-to-bottom), column to column. If you don't wish to load some column, leave it as a blank column in the flat file.
2. Your file should not be open while loading data.
3. Make sure your transfer rules and update rules are defined properly.
4. Do the simulation in the InfoPackage first; that will give you a fair idea.
5. To start with, load data into the PSA only, check it, and then take it forward.
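Point 1 above can be partly automated before scheduling the load. A small illustrative sketch (Python; the function name and the sample column names are assumptions, and the expected list would come from the DataSource definition):

```python
import csv

def check_flat_file(path, expected_cols):
    """Compare the file's left-to-right header columns against the
    DataSource's top-to-bottom field list, returning any mismatches."""
    with open(path, newline="") as f:
        header = next(csv.reader(f))
    problems = []
    if len(header) != len(expected_cols):
        problems.append(f"column count {len(header)} != expected {len(expected_cols)}")
    for i, (got, want) in enumerate(zip(header, expected_cols)):
        if got.strip().upper() != want.upper():
            problems.append(f"position {i}: file has '{got}', DataSource expects '{want}'")
    return problems  # empty list means the structures line up
```

This only checks the header row; a blank column in the file (per point 1) still counts as a column and must appear in the expected list.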
hope it helps
regards
Vikash -
Hi,
I am moving data from a flat file to an Oracle table. While populating the Oracle table, if I get any errors in the flat file, those errors should be populated into the ODI error table.
Is this possible? If yes, could you please let me know the setup in ODI?
The CKM is dedicated to checking the constraints while doing the transformation. The constraints include PK, FK, conditions, etc.
There are two types/ways of checking the constraints.
Flow Control: the CKM is triggered after data is loaded into I$.
Static Control: the CKM is triggered after data is loaded into the target table.
If you opt for either of the above, ODI will create E$ and SNP_CHECK_TAB (a summary table for logging the errors) and load the error records.
ODI also provides an option of loading the corrected error records via the RECYCLE ERRORS feature. In this phase/run ODI will load ONLY the error records into the target table (assuming they have been corrected/cleaned).
**How do I set up flow control? Could you please provide the steps?**
**Appreciate your help** -
Loading Hierarchy from flat file giving dump
Hi gurus,
I am trying to load data from a flat file into a hierarchy. When I try to schedule the InfoPackage, it gives the message PLEASE SELECT THE VALID INFOOBJECT, and if I proceed anyway it gives a dump: ASSIGN LENGTH 0.
I tried using the IDoc method... then I am getting the records into BW in red, with the same error:
InfoObject INFOOBJECT is not available.
i followed the blog
/people/prakash.bagali/blog/2006/02/07/hierarchy-upload-from-flat-files
> points will be assigned for inputs.
Message was edited by:
ravi a
Hi guys,
My hierarchy is different from the one in the blog. Should I mark the InfoObjects in all nodes as hierarchy-relevant, or only the bottom one? I also brought in 0HIER_NODE from Business Content. I don't know what other valid InfoObject is missing... please help.
the following are more details about the dump
Program error: ASSIGN with length 0 in program "SAPLRRSV".
Error analysis
An ASSIGN statement in the program "SAPLRRSV" contained a field symbol with
length 0. This is not possible. -
Data is not updated to Infocube (BI 7.0) when i load from flat file
Hi Experts,
When I try to load data from a flat file to an InfoCube (BI 7.0) with full or delta, I am getting the following errors:
1) Error while updating to data target, message No: RSBK241
2) Processed with Errors, Message No: RSBK 257.
Everything is successful up to the transformations, but updating the data to the cube is not successful...
What do the message numbers indicate? Can I find a solution anywhere with respect to the message numbers?
Anybody please help me..
Regards
Anil
Hi Bhaskar,
There is no problem with cube activation. But when I try to activate the cube using the
function module "RSDG_CUBE_ACTIVATE", it goes to a short dump. Anyhow, I want to know
in what scenario we use this function module to activate the cube.
There are no special characters in the file. I deleted the requests in the PSA and cube and re-ran the
InfoPackage & DTP, but I am getting the same error.
The following errors are found in the Updating menu
of the Details tab (Display Messages option):
1) Error while updating to data target, message No: RSBK241
2) Processed with Errors, Message No: RSBK 257.
Thank you
Regards
Anil
Edited by: anilkumar.k on Aug 20, 2010 8:35 PM -
How often in real-time projects do we extract data from flat files and process it?
I am going through the BODS Data Integrator and trying to understand the demand for ETL services that extract data from a flat file. Is that really important in real-time jobs?
Thank you very much for the helpful info.
Hi,
As per the inputs given by you guys, I started loading data from flat files.
I tried to load 28 files, of which I was able to load 24 successfully. For the other 4 I got these error messages:
1) Error 'Enter period in the format __.YYYY...' at conversion exit CONVERSION_EXIT_PERI6_INPUT (field CALMONTH record 1, value DUMYTRA)
Message no. RSDS012
2) a) Error 'The argument '1,008.00' cannot be interpreted as a number' on assignment field QUANT_B record 11714 value 1,008.00
Message no. RSDS013
b) Error 'The argument '1,110.00' cannot be interpreted as a number' on assignment field QUANT_B record 15374 value 1,110.00
Message no. RSDS013
3) a) Error 'The argument '1,140.00' cannot be interpreted as a number' on assignment field QUANT_B record 1647 value 1,140.00
Message no. RSDS013
b) Error 'The argument '2,028.00' cannot be interpreted as a number' on assignment field QUANT_B record 4625 value 2,028.00
Message no. RSDS013
4) Error 'The argument '1,151.00' cannot be interpreted as a number' on assignment field QUANT_B record 7808 value 1,151.00
Message no. RSDS013
I'm unable to trace what exactly the error is.
I checked these values in the files and they are perfect.
can anybody please guide me on this issue.
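For reference, the "cannot be interpreted as a number" messages point at the thousands separators in the rejected QUANT_B values ('1,008.00' and so on). A minimal sketch of cleaning such values before the load (assuming ',' is only a grouping character and '.' the decimal point, as in the records above):

```python
def normalize_amount(value):
    """Strip thousands separators so values like '1,008.00' parse as numbers.
    Raises ValueError if the cleaned value still is not numeric."""
    cleaned = value.replace(",", "").strip()
    float(cleaned)  # validation only; raises ValueError for e.g. 'DUMYTRA'
    return cleaned
```

If the files come from a locale where ',' is the decimal point instead, the replacement would corrupt the values, so confirm the number format first.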
With Regards,
Pradeep.B -
Reading data from flat file Using TEXT_IO
Dear Gurus
I already posted this question, but this time I need some other changes... sorry for that.
I am using 10g Forms and TEXT_IO for reading data from a flat file.
My data is like this :-
0|BP-V1|20100928|01|1|2430962.89|27|2430962.89|MUR|20100928120106
9|2430962.89|000111111111|
1|61304.88|000014104113|
1|41961.73|000022096086|
1|38475.65|000023640081|
1|49749.34|000032133154|
1|35572.46|000033093377|
1|246671.01|000042148111|
Here each column is separated by |. I want to read all the columns and do some validation.
How can I do that?
Initially my requirement was to read only 2 or 3 columns, so I did it like this:
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle utl_file.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr( lfile_name , '.*\\' ),'\'); -- For the Path
v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
v_handle := UTL_FILE.fopen (v_path, v_file, 'r');
LOOP
UTL_FILE.get_line (v_handle, v_filebuffer);
IF SUBSTR (v_filebuffer, 0, 1) = '0' THEN
SELECT line_0 INTO line_0_date
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_0 INTO line_0_Purp
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 4;
SELECT line_0 INTO line_0_count
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 7;
SELECT line_0 INTO line_0_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 8;
SELECT line_0 INTO line_0_ccy
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 9;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '9' THEN
SELECT line_9 INTO line_9_Acc_no
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_9 INTO line_9_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 2;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '1' THEN
line_1_flag := line_1_flag+1;
SELECT line_1 INTO line_1_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_1, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
Line_1_tot := Line_1_tot + line_1_sum;
END IF;
END LOOP;
DBMS_OUTPUT.put_line (Line_1_tot);
DBMS_OUTPUT.PUT_LINE (Line_1_flag);
UTL_FILE.fclose (v_handle);
END;
But now how can I do it? Shall I use a SELECT statement like this for all the columns?
Sorry for that ..
As per our requirement ...
I need to read the flat file and it looks like like this .
*0|BP-V1|20100928|01|1|2430962.89|9|2430962.89|MUR|20100928120106*
*9|2430962.89|000111111111|*
*1|61304.88|000014104113|*
*1|41961.73|000022096086|*
*1|38475.65|000023640081|*
*1|49749.34|000032133154|*
*1|35572.46|000033093377|*
*1|246671.01|000042148111|*
*1|120737.25|000053101979|*
*1|151898.79|000082139768|*
*1|84182.34|000082485593|*
I have to check the file :-
The validations are: the 1st line should start with 0, else it should raise an error and insert that error into a table.
The same for the 2nd line: it should start with 9, else raise an error and insert it into the table.
Then the 3rd line onward should start with 1, else raise an error and insert it into the table.
After that I have to do a validation: I will read the 1st line's 2nd column. It should be BP-V1, else raise an error and insert it into a table. Then I will check the 3rd column, which is 20100928; it should be in YYYYMMDD format, else the same thing: ERROR.
Then like this, for all the columns I have different validations.
Then it will check the 2nd line's 3rd column. This is an account number. First I will check that it is 12 characters, else ERROR. Then I will check what the user has entered in the form. For example, if the user entered 111111111, I will compare it with the 000111111111 in the 2nd line; I have to add 000 before the account number the user entered.
Then, for the lines starting with 1, I have to take the 2nd column of all those lines and compute a sum. After that I have to compare that sum with the value in the 1st line's (starting with 0) 6th column. It should be the same, else ERROR.
Then in the same way I have to count all the lines starting with 1 and compare the count with the 7th column of the 1st line. It should be the same; here in this file it should be 9.
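The record-type, total, and count validations described above can be sketched compactly. This is a Python illustration of the intended logic only, not the Forms/TEXT_IO implementation; the column positions follow the description (6th column = header total, 7th column = detail count):

```python
def validate_payment_file(lines):
    """Check record-type prefixes, sum of detail amounts vs. header total,
    and detail line count vs. the header's count field."""
    errors = []
    rows = [ln.rstrip("\n").split("|") for ln in lines if ln.strip()]
    if rows[0][0] != "0":
        errors.append("first line must start with 0")
    if rows[1][0] != "9":
        errors.append("second line must start with 9")
    details = rows[2:]
    if any(r[0] != "1" for r in details):
        errors.append("third line onward must start with 1")
    if not errors:
        header = rows[0]
        total = sum(float(r[1]) for r in details)
        if abs(total - float(header[5])) > 0.005:  # 6th column: expected total
            errors.append("sum of detail amounts does not match header total")
        if len(details) != int(header[6]):          # 7th column: expected count
            errors.append("detail line count does not match header count")
    return errors
```

The remaining field-level checks (BP-V1 literal, YYYYMMDD date, 12-character account with 000 padding) would slot in the same way, one check per column.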
MY CODE IS :-
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle TEXT_IO.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
LC$String VARCHAR2(50) ;--:= 'one|two|three|four|five|six|seven' ;
LC$Token VARCHAR2(100) ;
i PLS_INTEGER := 2 ;
lfirst_char number;
lvalue Varchar2(100) ;
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr( lfile_name , '.*\\' ),'\'); -- For the Path
--v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
Message(lfile_name);
v_handle := TEXT_IO.fopen(lfile_name, 'r');
BEGIN
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
--Message('First Char '||lfirst_char);
IF lfirst_char = '0' Then
Loop
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
Message('VAL - '||LC$Token);
lvalue := LC$Token;
EXIT WHEN LC$Token IS NULL ;
i := i + 1 ;
End Loop;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (9999,'0002','First line should always start with 0');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if ;
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '9' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (8888,'0016','Second line should start with 9');
Forms_DDL('Commit');
raise form_Trigger_failure;
End IF;
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '1' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (7777,'0022','The third line onward should start with 1');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if;
END LOOP;
--END IF;
END LOOP;
EXCEPTION
When No_Data_Found Then
TEXT_IO.fclose (v_handle);
END;
Exception
When Others Then
Message('Other error');
END;
I am calling the function you gave, SPLIT, as mcb_simulator_pkg.Split. -
Vendor master data load from flat file
Hi Experts,
I am trying to load data from a flat file. I am confused about which InfoObject I should use. My requirement is for InfoCube 0BBP_C01.
As I found, for this InfoCube only two InfoObjects are available: 0BBP_VENDOR and 0VENDOR. But in my flat file I have the following fields: Vendor Code, Vendor Name, Ultimate Parent, City, Country, Minority Status, and Vendor Payment Terms, which are not all present together in either InfoObject.
Please suggest which InfoObject I should use, or whether I should create a Z InfoObject for my requirement.
Thanks in advance.
Regards,
Niranjan Chechani
Hi Kiran,
I am loading data through a flat file. I am not getting your point about how to get attributes present in the R/3 system, which I am not using in this case.
Hi rvc,
I have created two characteristic Z InfoObjects, i.e. ZU_PARENT for Ultimate Parent and ZVEN_PAY for Vendor Payment Terms.
When I try to add these two as attributes and mark them as navigational attributes, I get the error "Characteristic 0VENDOR: The attributes SID table(s) could not be filled" (Message no. R7586).
Please suggest: where am I going wrong?
Thanks and Regards,
Niranjan Chechani
Edited by: Niranjan Chechani on Nov 28, 2011 12:15 PM -
Upload PO data into SRM system from flat file
Hi all,
I need to create a conversion program to upload open purchase order data from a flat file into the system.
I am trying to create the PO using the BAPI BAPI_POEC_CREATE, but I am getting an error.
Could someone give the details of the parameters that need to be passed to the BAPI?
Thanks in advance
Sharad
Sharad,
Not the very best piece of code, but should be helpful.
REPORT zkb_po_create.
DATA: ls_po_header TYPE bapi_po_header_c.
DATA: ls_e_po_header TYPE bapi_po_header_d.
DATA: ls_po_items TYPE bapi_po_item_c.
DATA: ls_po_accass TYPE bapi_acc_c.
DATA: ls_po_partner TYPE bapi_bup_c.
DATA: ls_po_orgdata TYPE bapi_org_c.
DATA: ls_return TYPE bapiret2.
DATA: lt_po_items TYPE TABLE OF bapi_po_item_c.
DATA: lt_po_accass TYPE TABLE OF bapi_acc_c.
DATA: lt_po_partner TYPE TABLE OF bapi_bup_c.
DATA: lt_po_orgdata TYPE TABLE OF bapi_org_c.
DATA: lt_return TYPE TABLE OF bapiret2.
* Header Details
ls_po_header-businessprocess = 1.
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
EXPORTING
input = ls_po_header-businessprocess
IMPORTING
output = ls_po_header-businessprocess.
ls_po_header-process_type = 'EC'.
ls_po_header-doc_date = sy-datum.
ls_po_header-description = 'Test for BAPI_POEC_CREATE'.
ls_po_header-logsys_fi = 'Backend'.
ls_po_header-co_code = '1000'.
ls_po_header-currency = 'GBP'.
* Item Details
ls_po_items-item_guid = 2.
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
EXPORTING
input = ls_po_items-item_guid
IMPORTING
output = ls_po_items-item_guid.
ls_po_items-parent = ls_po_header-businessprocess.
ls_po_items-product_guid = '4678E74FFFC380AD000000000A8E035B'.
ls_po_items-product_id = '400030'.
ls_po_items-product_type = '01'.
ls_po_items-category_guid = '4627B461073F40FC000000000A8E035B'.
ls_po_items-category_id = '1.04.0500'.
ls_po_items-quantity = 10.
ls_po_items-deliv_date = sy-datum + 10.
ls_po_items-price = '25'.
APPEND ls_po_items TO lt_po_items.
* Account Assignment
ls_po_accass-guid = 3.
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
EXPORTING
input = ls_po_accass-guid
IMPORTING
output = ls_po_accass-guid.
ls_po_accass-parent_guid = ls_po_items-item_guid.
ls_po_accass-distr_perc = 100.
ls_po_accass-g_l_acct = '<gl acc>'.
ls_po_accass-cost_ctr = '<cost centre>'.
ls_po_accass-co_area = '<Ctrl area>'.
APPEND ls_po_accass TO lt_po_accass.
* Partner Functions
ls_po_partner-partner_fct = '00000019'.
ls_po_partner-partner = 'Vendor'.
ls_po_partner-parent_guid = ls_po_items-item_guid.
APPEND ls_po_partner TO lt_po_partner.
ls_po_partner-partner_fct = '00000016'.
ls_po_partner-partner = 'Requester'.
ls_po_partner-parent_guid = ls_po_items-item_guid.
APPEND ls_po_partner TO lt_po_partner.
ls_po_partner-partner_fct = '00000020'.
ls_po_partner-partner = 'Recipient'.
ls_po_partner-parent_guid = ls_po_items-item_guid.
APPEND ls_po_partner TO lt_po_partner.
ls_po_partner-partner_fct = '00000075'.
ls_po_partner-partner = 'Location'.
ls_po_partner-parent_guid = ls_po_items-item_guid.
APPEND ls_po_partner TO lt_po_partner.
ls_po_orgdata-proc_org_ot = 'O'.
ls_po_orgdata-proc_org_id = 'Pur Org'.
ls_po_orgdata-proc_group_ot = 'O'.
ls_po_orgdata-proc_group_id = 'Pur Group'.
ls_po_orgdata-parent_guid = ls_po_items-item_guid.
APPEND ls_po_orgdata TO lt_po_orgdata.
CALL FUNCTION 'BAPI_POEC_CREATE'
EXPORTING
i_po_header = ls_po_header
IMPORTING
e_po_header = ls_e_po_header
TABLES
i_po_items = lt_po_items
i_po_accass = lt_po_accass
i_po_partner = lt_po_partner
i_po_orgdata = lt_po_orgdata
return = lt_return.
READ TABLE lt_return INTO ls_return WITH KEY type = 'E'.
IF sy-subrc NE 0.
COMMIT WORK AND WAIT.
WRITE:/ ls_e_po_header-doc_number, ': created successfully'.
ELSE.
WRITE:/ 'The below errors occurs during PO creation.'.
LOOP AT lt_return INTO ls_return.
WRITE:/ ls_return-message.
ENDLOOP.
ENDIF.
Regards, Kathirvel -
Loading transaction data from flat file to SNP order series objects
Hi,
I am a BW developer and I need to provide data to my SNP team.
Can you please let me know more about <b>loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures</b>? There is a 3rd-party tool called WebConnect that gets data from external systems and can provide it as flat files or in database tables, in whatever format we require.
I know we can use BAPIs, but I don't know how. Could you please send any <b>sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects</b>?
Please let me know ASAP how to get data from a flat file into SNP order-based objects, with options, and I will be very grateful.
thanks in advance
Rahul
Hi,
Please go through the following links:
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
Hope this helps...
Regards,
Habeeb
Assign points if helpful..:)