Flat file writing
Hello All,
Below is the flat file output:
HEADER1
LINE ITEM1
LINE ITEM2
==
==
HEADER2
LINE ITEM3
LINE ITEM4
==
==
TRAILER
When PI reads the above flat file, the XML structure below is generated:
<HEADER1>
<LINEITEM1>
<LINEITEM2>
==
<HEADER2>
<LINEITEM3>
<LINEITEM4>
==
<TRAILER>
Up to this point everything works fine. In my message mapping I perform some validations, and afterwards I try to write the same input back out in the same structure. But I am getting the flat file in the format below:
HEADER
HEADER
LINE ITEM1
LINEITEM2
LINEITEM3
==
===
I am expecting the same input format, i.e. as below:
HEADER1
LINE ITEM1
LINE ITEM2
==
==
HEADER2
LINE ITEM3
LINE ITEM4
==
==
TRAILER
How can I get the above structure?
Hi,
you can convert the structure to a flat file in the File adapter using conversionType "StructXML2Plain" (the XML-to-plain direction, since you are writing the file). And as your line items come in a variable sequence, you need to use the parameter:
xml.recordsetStructureOrder = var
Regards,
Harish
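For illustration, the receiver-side content conversion for the structure above might look roughly like this (the record names and separator values are assumptions based on the structures shown; verify the exact parameter names against your PI release):

```
Recordset Structure: HEADER1,LINEITEM1,LINEITEM2,HEADER2,LINEITEM3,LINEITEM4,TRAILER

HEADER1.fieldSeparator     ,
HEADER1.endSeparator       'nl'
LINEITEM1.fieldSeparator   ,
LINEITEM1.endSeparator     'nl'
LINEITEM2.fieldSeparator   ,
LINEITEM2.endSeparator     'nl'
TRAILER.fieldSeparator     ,
TRAILER.endSeparator       'nl'
```

Each record type gets its own separator entries, which is what keeps the HEADER, LINE ITEM and TRAILER rows from collapsing into one block as in the faulty output above.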
Similar Messages
-
Writing to a flat file using UTL in a procedure
Hello All,
I am creating a procedure in which I am trying to write data to a flat file using UTL. Code is shown below:
CREATE OR REPLACE PROCEDURE wrt_lifungduty IS
sql_stmt varchar2(200);
sql_stmt1 varchar2(200);
sql_stmt2 varchar2(200);
v_cur_hdl integer;
v_rows_processed BINARY_INTEGER;
V_file UTL_FILE.FILE_TYPE;
V_flatfile_line VARCHAR2(98) := NULL;
V_OBLIGATION_KEY NUMBER(10);
V_OBLIGATION_LEVEL VARCHAR2(6);
V_KEY_VALUE_1 VARCHAR2(20);
V_KEY_VALUE_2 VARCHAR2(20);
V_PARTNER_TYPE VARCHAR2(6);
V_PARTNER_ID VARCHAR2(10);
V_EXT_INVC_NO VARCHAR2(30);
V_EXT_INVC_DATE DATE;
V_PAID_DATE DATE;
V_PAID_AMT NUMBER(20,4);
V_COMP_ID VARCHAR2(10);
V_RECEIPT_DATE DATE;
V_ORDER_QTY NUMBER(12,4);
V_RECEIPT_QTY NUMBER(12,4);
V_FRT NUMBER;
V_DUTY NUMBER;
V_SUPPLIER NUMBER;
V_FLAG VARCHAR2(1);
V_ELC_COST NUMBER;
V_ADJ_ELC NUMBER;
Cursor x is
select OBLIGATION_KEY,OBLIGATION_LEVEL,KEY_VALUE_1,KEY_VALUE_2,PARTNER_TYPE,PARTNER_ID,EXT_INVC_NO,EXT_INVC_DATE
,PAID_DATE,PAID_AMT,COMP_ID,RECEIPT_DATE,ORDER_QTY,RECEIPT_QTY,FRT,DUTY,SUPPLIER,FLAG,ELC_COST,ADJ_ELC
from RMSBIZ.CT_ANALYZE_LIFUNG_DUTY;
BEGIN
if not UTL_FILE.IS_OPEN(V_file) then
V_file := UTL_FILE.FOPEN('/exchange/biz/rms2lan','sb_duty.dat','w'); --need to give the location for testing
end if;
for x1 in x loop
V_OBLIGATION_KEY :=1;
V_OBLIGATION_LEVEL :='test';
V_OBLIGATION_KEY :=x1.OBLIGATION_KEY;
V_OBLIGATION_LEVEL :=x1.OBLIGATION_LEVEL;
V_KEY_VALUE_1 :=x1.KEY_VALUE_1;
V_KEY_VALUE_2 :=x1.KEY_VALUE_2;
V_PARTNER_TYPE :=x1.PARTNER_TYPE;
V_PARTNER_ID :=x1.PARTNER_ID;
V_EXT_INVC_NO :=x1.EXT_INVC_NO;
V_EXT_INVC_DATE :=x1.EXT_INVC_DATE;
V_PAID_DATE :=x1.PAID_DATE;
V_PAID_AMT :=x1.PAID_AMT;
V_COMP_ID :=x1.COMP_ID;
V_RECEIPT_DATE :=x1.RECEIPT_DATE;
V_ORDER_QTY :=x1.ORDER_QTY;
V_RECEIPT_QTY :=x1.RECEIPT_QTY;
V_FRT :=x1.FRT;
V_DUTY :=x1.DUTY;
V_SUPPLIER :=x1.SUPPLIER;
V_FLAG :=x1.FLAG;
V_ELC_COST :=x1.ELC_COST;
V_ADJ_ELC :=x1.ADJ_ELC;
V_flatfile_line:= V_OBLIGATION_KEY||','||
V_OBLIGATION_LEVEL ||','||
V_KEY_VALUE_1 ||','||
V_KEY_VALUE_2 ||','||
V_PARTNER_TYPE ||','||
V_PARTNER_ID ||','||
V_EXT_INVC_NO ||','||
V_EXT_INVC_DATE ||','||
V_PAID_DATE ||','||
V_PAID_AMT ||','||
V_COMP_ID ||','||
V_RECEIPT_DATE ||','||
V_ORDER_QTY ||','||
V_RECEIPT_QTY ||','||
V_FRT ||','||
V_DUTY ||','||
V_SUPPLIER ||','||
V_FLAG ||','||
V_ELC_COST ||','||
V_ADJ_ELC;
V_flatfile_line:= V_OBLIGATION_KEY||','|| V_OBLIGATION_LEVEL;
UTL_FILE.PUT_LINE(V_file, V_flatfile_line);
end loop;
commit;
UTL_FILE.fclose(V_file);
END;
Getting the following errors:
SQL>Welcome-->
ERROR at line 1:
ORA-06510: PL/SQL: unhandled user-defined exception
ORA-06512: at "SYS.UTL_FILE", line 98
ORA-06512: at "SYS.UTL_FILE", line 157
ORA-06512: at "RMSBIZ.WRT_LIFUNGDUTY", line 39
ORA-06512: at line 1
SQL>Welcome-->39
39* V_file := UTL_FILE.FOPEN('/exchange/biz/rms2lan','sb_duty.dat','w');
SQL>Welcome-->
Does any one know why it is erroring out?
Thanks,
Chiru
The code below works if I take out the last 3 columns while writing to the flat file line.
CREATE OR REPLACE PROCEDURE wrt_lifungduty IS
sql_stmt varchar2(200);
sql_stmt1 varchar2(200);
sql_stmt2 varchar2(200);
v_cur_hdl integer;
v_rows_processed BINARY_INTEGER;
V_file UTL_FILE.FILE_TYPE;
V_flatfile_line VARCHAR2(98) := NULL;
V_OBLIGATION_KEY NUMBER(10);
V_OBLIGATION_LEVEL VARCHAR2(6);
V_KEY_VALUE_1 VARCHAR2(20);
V_KEY_VALUE_2 VARCHAR2(20);
V_PARTNER_TYPE VARCHAR2(6);
V_PARTNER_ID VARCHAR2(10);
V_EXT_INVC_NO VARCHAR2(30);
V_EXT_INVC_DATE DATE;
V_PAID_DATE DATE;
V_PAID_AMT NUMBER(20,4);
V_COMP_ID VARCHAR2(10);
V_RECEIPT_DATE DATE;
V_ORDER_QTY NUMBER(12,4);
V_RECEIPT_QTY NUMBER(12,4);
V_FRT NUMBER;
V_DUTY NUMBER;
V_SUPPLIER NUMBER;
V_FLAG VARCHAR2(1);
V_ELC_COST VARCHAR2(20);
V_ADJ_ELC VARCHAR2(20);
Cursor x is
select OBLIGATION_KEY,OBLIGATION_LEVEL,KEY_VALUE_1,KEY_VALUE_2,PARTNER_TYPE,PARTNER_ID,EXT_INVC_NO,EXT_INVC_DATE
,PAID_DATE,PAID_AMT,COMP_ID,RECEIPT_DATE,ORDER_QTY,RECEIPT_QTY,FRT,DUTY,SUPPLIER,FLAG,ELC_COST,ADJ_ELC
from RMSBIZ.CT_ANALYZE_LIFUNG_DUTY;
BEGIN
if not UTL_FILE.IS_OPEN(V_file) then
V_file := UTL_FILE.FOPEN('/rmsapps/rms803/biz/data/utl_files','sb_duty.csv','w'); --need to give the location for testing
end if;
for x1 in x loop
V_OBLIGATION_KEY :=nvl(x1.OBLIGATION_KEY,null);
V_OBLIGATION_LEVEL :=nvl(x1.OBLIGATION_LEVEL,null);
V_KEY_VALUE_1 :=x1.KEY_VALUE_1;
V_KEY_VALUE_2 :=x1.KEY_VALUE_2;
V_PARTNER_TYPE :=x1.PARTNER_TYPE;
V_PARTNER_ID :=x1.PARTNER_ID;
V_EXT_INVC_NO :=x1.EXT_INVC_NO;
V_EXT_INVC_DATE :=x1.EXT_INVC_DATE;
V_PAID_DATE :=x1.PAID_DATE;
V_PAID_AMT :=x1.PAID_AMT;
V_COMP_ID :=x1.COMP_ID;
V_RECEIPT_DATE :=x1.RECEIPT_DATE;
V_ORDER_QTY :=x1.ORDER_QTY;
V_RECEIPT_QTY :=x1.RECEIPT_QTY;
V_FRT :=x1.FRT;
V_DUTY :=x1.DUTY;
V_SUPPLIER :=x1.SUPPLIER;
V_FLAG :=nvl(x1.FLAG,'0');
V_ELC_COST :=to_char(nvl(x1.ELC_COST,0));
V_ADJ_ELC :=to_char(nvl(x1.ADJ_ELC,0));
V_flatfile_line:= V_OBLIGATION_KEY||','||
V_OBLIGATION_LEVEL ||','||
V_KEY_VALUE_1 ||','||
V_KEY_VALUE_2 ||','||
V_PARTNER_TYPE ||','||
V_PARTNER_ID ||','||
V_EXT_INVC_NO ||','||
V_EXT_INVC_DATE ||','||
V_PAID_DATE ||','||
V_PAID_AMT ||','||
V_COMP_ID ||','||
V_RECEIPT_DATE ||','||
V_ORDER_QTY ||','||
V_RECEIPT_QTY ||','||
V_FRT ||','||
V_DUTY ||','||
V_SUPPLIER ||','||
-- V_FLAG ||','||
V_ELC_COST ||','||
V_ADJ_ELC;
UTL_FILE.PUT_LINE(V_file, V_flatfile_line);
end loop;
commit;
UTL_FILE.fclose(V_file);
EXCEPTION
WHEN UTL_FILE.INVALID_PATH
THEN
DBMS_OUTPUT.PUT_LINE ('error: INVALID_PATH');
WHEN UTL_FILE.INVALID_MODE
THEN
DBMS_OUTPUT.PUT_LINE ('error: INVALID_MODE');
WHEN UTL_FILE.INVALID_FILEHANDLE
THEN
DBMS_OUTPUT.PUT_LINE ('error: INVALID_FILEHANDLE');
WHEN UTL_FILE.INVALID_OPERATION
THEN
DBMS_OUTPUT.PUT_LINE ('error: INVALID_OPERATION');
WHEN UTL_FILE.READ_ERROR
THEN
DBMS_OUTPUT.PUT_LINE ('error: READ_ERROR');
WHEN UTL_FILE.WRITE_ERROR
THEN
DBMS_OUTPUT.PUT_LINE ('error: WRITE_ERROR');
WHEN UTL_FILE.INTERNAL_ERROR
THEN
DBMS_OUTPUT.PUT_LINE ('error: INTERNAL_ERROR');
WHEN OTHERS THEN
-- v_error_code := SQLCODE;
-- v_error_message := SQLERRM;
-- dbms_output.put_line('ERROR: '||v_error_code);
-- dbms_output.put_line('ERROR: '||v_error_message);
RAISE_APPLICATION_ERROR(-20003,'sbduty - aborted');
END;
/
It fails as soon as it encounters V_FLAG in V_flatfile_line. The data in the table has NULLs for FLAG and some negative numbers for ELC_COST and ADJ_ELC, but that shouldn't cause any problems, should it?
What's wrong with those 3 columns?
Errors that i get are:
SQL>Welcome-->BEGIN wrt_lifungduty; END;
ERROR at line 1:
ORA-20003: sbduty - aborted
ORA-06512: at "RMSBIZ.WRT_LIFUNGDUTY", line 117
ORA-06512: at line 1
SQL>Welcome-->117
117* RAISE_APPLICATION_ERROR(-20003,'sbduty - aborted');
SQL>Welcome-->
Thanks,
Chiru -
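For reference, the ORA-20003 above comes from the WHEN OTHERS handler, which masks the real error. A likely root cause is the line buffer: V_flatfile_line is declared VARCHAR2(98), which overflows once the last three columns are appended, raising VALUE_ERROR (ORA-06502). A minimal sketch of the failure (sizes are illustrative):

```sql
DECLARE
  v_short VARCHAR2(98);    -- same size as V_flatfile_line in the posted code
  v_wide  VARCHAR2(4000);  -- comfortably fits 20 comma-separated fields
BEGIN
  v_wide  := RPAD('x', 99, 'x');  -- building the long string is fine
  v_short := v_wide;              -- raises ORA-06502 once the data exceeds 98 bytes
EXCEPTION
  WHEN VALUE_ERROR THEN
    DBMS_OUTPUT.PUT_LINE('VALUE_ERROR: widen the line buffer');
END;
/
```

Widening V_flatfile_line and logging SQLERRM in the WHEN OTHERS branch (the commented-out lines in the posted procedure) would surface the underlying error instead of the generic ORA-20003.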
Problem while writing to fixed length flat file from xml
Hi,
I have a problem in writing data into a flat file of fixed length...
My input is a xml file and i want the output as a flat file. I am successful in converting the xml into flat file... But the main problem is, i am unable to insert spaces in between my fields in the flat file.
The data in the flat file comes without spaces... Any suggestions on writing the schema...
Regards
Surya.
Have a look at this doc:
http://otndnld.oracle.co.jp/document/products/as10g/101310/doc_cd/integrate.1013/b28994/nfb.htm#BGBBAJFD
your element should be something like this, it pads with a space using the paddedBy expression
<xsd:element name="C1" type="xsd:string" nxsd:style="fixedLength" nxsd:length="4" nxsd:paddedBy=" " nxsd:padStyle="tail" />
If you're having trouble, post what you want the file to look like and the XSD you are using.
cheers
James -
EFT Payment Program. Writing to a flat file using UTL Package
Hi guys
I wonder if someone can help me. I'm battling with something in pl/sql. I have a procedure that writes to a flat file. Each procedure I call writes a single section of the file, e.g "create_hdr_rec_line_fn" writes the header on top of the file and "create_std_trx_rec_line_fn" writes the body below the header. Then lastly the procedure that writes the trailer at the bottom of the file.
My problem comes here: I have a proc that calculates the hash total.
This is done by "utl_file.put_line(g_file_id, g_seed_number)". I want to write the hash total next to the trailer section of the file, not below it. How can I do this? Here's an example of the flat file produced. The very last line is the hash total, but I want it on the same line as the trailer, right next to it. I hope my question is clear. Please see my procedure below the flat file.
Thanking you in advance....
My flat file
FHSSVSDIM15000932008102810483220081028T (header)
SD0009300100001D19874200019873402211ACSA JOHANNESBURG INTERNATIONA (Line1-Detail)
SC00093001D14540500014540057261IS H/O MAIN ACCOUNT 0000000124959315207 (Line-Transaction)
ST00093000000700000000070000000000020000001806378410000000000000000000001806378 (Trailer)
58298239848772973764654319387982 (hash total)
My procedure
PROCEDURE eft(errbuf OUT NOCOPY VARCHAR2,
retcode OUT NOCOPY NUMBER,
p_payment_batch IN VARCHAR2) IS
v_eft_date VARCHAR2(100);
BEGIN
v_eft_date := TO_CHAR(SYSDATE, 'DD-MON-RRRR_HH24_MI_SS');
g_payment_batch := p_payment_batch;
g_file_name := 'EFT'||v_eft_date||'.txt';
g_file_id := utl_file.fopen(g_dir_name,
g_file_name,
'W');
utl_file.put_line(g_file_id,
create_hdr_rec_line_fn);
create_std_trx_rec_line_fn(g_file_id);
create_std_contra_rec_line_fn(g_file_id);
create_std_trailer_fn(g_file_id);
utl_file.put_line(g_file_id,
g_seed_number);
utl_file.fclose_all;
IF (update_tables != TRUE) THEN
Fnd_File.put_line(Fnd_File.LOG, 'Failed to update payment batch tables. Cancel the batch');
END IF;
EXCEPTION
WHEN OTHERS THEN
Fnd_File.put_line(Fnd_File.LOG, 'Print Error - eft' || SUBSTR(SQLERRM, 1, 250));
END eft;
user643734 wrote:
Hi cdkumar
I'm not quite sure I understand what you mean. Are you saying that I should use PUT to write 'create_std_trailer_fn' and then use PUT_LINE to write 'g_seed_number'? Could you please show me what you mean by changing the proc I posted in my question with your suggestions?
How about, rather than me coding it for you, you try changing the code and give it a go yourself.
Essentially PUT_LINE will write out the data and put a "new line" character on the end so the next output will appear on the following line; PUT will just output the data without terminating the line, so any subsequent output will just append to the end.
It's very simple. Give it a go. -
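Concretely, the suggestion is along these lines, assuming create_std_trailer_fn could be reworked to return the trailer string rather than write it itself (a sketch, not the actual function):

```sql
-- PUT writes the trailer without a line terminator, so the hash total
-- written immediately afterwards lands on the same line:
utl_file.put(g_file_id, create_std_trailer_fn);   -- trailer, no newline
utl_file.put_line(g_file_id, g_seed_number);      -- hash total, then newline
```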
Refreshing the data in a flat file
Hi
I am working on data integration.
I need to fetch data from Oracle data base and then write it to a flat file.
It is working fine now, but on the next fetch I don't want the old data to remain in the file. The data should be refreshed so that only the newly fetched data is present.
After the data is written to the flat file can i rename the file?
I need the format of the file to be 'File_yyyyMMDDHHmmss.txt'.
My final question is how should I FTP this to the target?
Please help me with this as soon as possible, since this is needed for an urgent part of the delivery.
All you ask is achievable:
1) The IKM SQL to file has a TRUNCATE option, which will if set to YES, will start from a clean file.
2) You could rename the file after writing it, but why not just write it with that name? If you set the resource name to be a variable, (e.g. #MyProj.MyFilename), and be sure to set the variable in the package before executing your interface, you should be able to get the file you want. Otherwise, you can use the OdiFileMove tool in your package to rename the file.
To set the value of the variable you can use a query on the database (if you are using Oracle, something like SELECT 'File_'||TO_CHAR(SYSDATE,'YYYYMMDDHH24MISS')||'.txt' FROM DUAL).
3) ODI has ftp classes built in- you can find the doc under doc\webhelp\en\ref_jython\jyt_ex_ftp.htm
Hope this helps
Message was edited by:
CTS -
Delete data from a flat file using PL/SQL -- Please help.. urgent
Hi All,
We are writing data to a flat file using Text_IO.Put_Line command. We want to delete data / record from that file after data is written onto it.
Please let us know if this is possible.
Thanks in advance.
Vaishali
There's nothing in UTL_FILE to do this, so your options are either to write a Java stored procedure to do it or to use whatever mechanism the host operating system supports for editing files.
Alternatively, you could write a PL/SQL procedure to read each line in from the file and then write them out to a second file, discarding the line you don't want along the way. -
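A sketch of that second suggestion, copying the file line by line and discarding the unwanted record (the directory object, file names and filter condition are hypothetical):

```sql
DECLARE
  v_in   UTL_FILE.FILE_TYPE := UTL_FILE.FOPEN('MY_DIR', 'data.txt',     'r');
  v_out  UTL_FILE.FILE_TYPE := UTL_FILE.FOPEN('MY_DIR', 'data_new.txt', 'w');
  v_line VARCHAR2(32767);
BEGIN
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(v_in, v_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;   -- GET_LINE raises this at end of file
    END;
    IF v_line NOT LIKE '%DELETE_ME%' THEN   -- keep everything except the unwanted record
      UTL_FILE.PUT_LINE(v_out, v_line);
    END IF;
  END LOOP;
  UTL_FILE.FCLOSE(v_in);
  UTL_FILE.FCLOSE(v_out);
END;
/
```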
APD - Query output on to a Flat file
All
We are on BI 7.0 with SP13 and NWS 12. I am trying to create a new APD with a query as the source (built on an InfoObject) and a flat file as the target. There is no requirement for transformation, so I am mapping directly from source to target with a direct link, without using any transformations in between.
I checked and found no errors in the APD definition, and when I run the APD the file is created both on the application server and the local workstation, but with 0 records.
The volume of records is expected to be around 4 million for this query, and I am just wondering why the APD is not writing any data to the CSV file.
Please let me know the following:
- What is wrong in the above process, and why is it not writing data to the flat file?
- Can APD handle a volume of 4 million rows even though the job runs in the background?
Your inputs are highly appreciated; thanks for your time in providing a solution.
Thanks - Sriram
Hi,
Thanks for your reply. I was able to solve this issue and found the root cause.
The very purpose of this APD is to provide a list of business partners to an upstream system with a few attributes, and in this process one of my consultants had deactivated the key figure in the query which is an input for the APD process.
When the key figure is inactive (it is the only key figure in the query), APD does not recognize the query as valid and sends out an error message.
When we reactivated the key figure in the query (I have since hidden this column using APD transformations and sent the output to a flat file on the Unix server), the APD runs fine and delivers the output to the flat file.
Thanks -
Extracting data from Oracle to a flat file
I'm looking for options to extract data a table or view at a time to a text file. Short of writing the SQL to do it, are there any other ideas? I'm dealing with large volumes of data, and would like a bulk copy routine.
Thanks in advance for your help.
Is there any script which I can use for pulling data from tables to a flat file and then import that data into other DBs?
A flat file is adequate only for VARCHAR2, NUMBER, and DATE datatypes.
Other datatypes present a challenge within a text file.
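For a quick table-at-a-time extract without writing a procedure, SQL*Plus spooling is often enough (table and column names are illustrative):

```sql
SET PAGESIZE 0 FEEDBACK OFF TRIMSPOOL ON LINESIZE 1000
SPOOL /tmp/emp.csv
SELECT empno || ',' || ename || ',' || TO_CHAR(hiredate, 'YYYY-MM-DD')
FROM   emp;
SPOOL OFF
```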
A more robust solution is to use export/import to move data between Oracle DBs. -
Saving query results to a flat file
Hello Experts!
We have a very specific issue on our current project and I would like to know if any of you have ever done something similar. We are taking query results from BW (after complex calculations, some based on SY-DATE) and saving them to flat files to transfer to a SQL database structure on the Enterprise Portal. From here, the portal team renders the information into more "static" dashboards that include mouse over features and even limited drilldown (i.e. no matter where a user clicks the report always drills down on profit center)
There are many reasons why the model is set up as such (mostly training of executive level end users), and even though it doesn't mesh with the idea that BW could do this work on its own, we have to work with what we have.
We have come up with 3 possible ways to save this data to flat files and hopefully someone can tell us which might be the most effective.
1.) Information Broadcasting. If we broadcast XML files to the portal, will the portals team be able to read that data into a SQL database? Is there another way to use broadcasting to create and send a flat file to specific location?
2.) RSCRM_BAPI. This transaction seems to not support texts.
3.) Open Hub. In order to get the open hub to work, we first have to save all of our query results to direct write data store objects using APD. (calculations based on current date, for example, would require daily full loads to the underlying data providers otherwise.)
What is the best way to accomplish this? Is information broadcasting capable of doing this?
Thanks!
Hi Adam,
Do you have to use flat files to load the information to a SQL database? (if so maybe someone else has a suggestion on which way would be the best).
I try to stay away from flat file uploads as there is a lot of manual work involved. May I suggest setting up a connection to your SQL table in the DBCON table and then writing a small ABAP program to push data into the SQL database (which you can automate):
1) Use APD to push data into a table within BW.
2) Go to transaction SM30 > table DBCON and setup a new entry specifying the conncection parameters to your SQL database.
3) In SE38, write an ABAP program along the following lines (assuming the connection you set up in DBCON is named conn1):
data: con_name like dbcon-con_name.
con_name = 'conn1'.
exec sql.
set connection :con_name
endexec.
* select the data that the APD saved into its table into an internal table
* then loop over the internal table and use an SQL INSERT to push each record into the SQL table:
exec sql.
insert into <SQL TABLE> (column names separated by ,)
values (:itab-column1, :itab-column2, ...)
endexec.
(If the internal table is named itab, its columns are referenced as :itab-column1.)
If you decide to take this approach you may find more info on DBCON and the process on SDN. Good luck!
-
Export InfoCube Contents by Request to a flat file
Hi SDN,
Is it possible, other than going into the Manage contents of an InfoCube, to export to a CSV file by request ID?
I had a look at the function module
RSDRI_INFOPROV_READ
and it nearly does what I require, although I wasn't able to get it to work.
Can anyone firstly provide me with the parameters to input to get it to write to a file, say C:\temp.csv?
We are not able to use the open hub service because of licensing issues.
Finally, if there is no easy solution, I will just go to the Manage contents and dump out the 3-4 million records identified by a request ID number.
Thank you.
Simon
Hi,
Hereunder is an example populating an internal table; writing the same to a flat file shouldn't be an issue (OPEN DATASET ...).
DATA:
BEGIN OF ls_sls_infoprov_read,
0PLANT LIKE /BIC/VZMC0332-0PLANT,
0RT_POSNO LIKE /BIC/VZMC0332-0RT_POSNO,
0RT_RECNUMB LIKE /BIC/VZMC0332-0RT_RECNUMB,
0PSTNG_DATE LIKE /BIC/VZMC0332-0PSTNG_DATE,
0TIME LIKE /BIC/VZMC0332-0TIME,
ZAIRLINE LIKE /BIC/VZMC0332-ZAIRLINE,
ZFLIGHTN LIKE /BIC/VZMC0332-ZFLIGHTN,
ZPAXDEST LIKE /BIC/VZMC0332-ZPAXDEST,
ZPAXDESTF LIKE /BIC/VZMC0332-ZPAXDESTF,
ZPAXCLASS LIKE /BIC/VZMC0332-ZPAXCLASS,
0CUSTOMER LIKE /BIC/VZMC0332-0CUSTOMER,
0MATERIAL LIKE /BIC/VZMC0332-0MATERIAL,
0RT_SALRESA LIKE /BIC/VZMC0332-0RT_SALRESA,
0RT_POSSAL LIKE /BIC/VZMC0332-0RT_POSSAL,
0RT_POSSALT LIKE /BIC/VZMC0332-0RT_POSSALT,
ZREASVAL LIKE /BIC/VZMC0332-ZREASVAL,
0RT_PRICRED LIKE /BIC/VZMC0332-0RT_PRICRED,
0RT_PRICDIF LIKE /BIC/VZMC0332-0RT_PRICDIF,
END OF ls_sls_infoprov_read.
DATA: ls_sls_item LIKE ls_sls_infoprov_read.
TYPES: ly_sls_infoprov_read LIKE ls_sls_infoprov_read,
ly_sls_item LIKE ls_sls_item.
DATA:
lt_sls_infoprov_read
TYPE STANDARD TABLE OF ly_sls_infoprov_read
WITH DEFAULT KEY INITIAL SIZE 10,
gt_sls_infoprov_read LIKE lt_sls_infoprov_read.
DATA:
ls_sls_sfc TYPE RSDRI_S_SFC,
lt_sls_sfc TYPE RSDRI_TH_SFC,
ls_sls_sfk TYPE RSDRI_S_SFK,
lt_sls_sfk TYPE RSDRI_TH_SFK,
ls_sls_range TYPE RSDRI_S_RANGE,
lt_sls_range TYPE RSDRI_T_RANGE,
lv_sls_end_of_data TYPE RS_BOOL,
lv_sls_first_call TYPE rs_bool,
lv_sls_icube TYPE RSINFOCUBE VALUE 'ZMC033'.
*filling internal tables containing the output characteristics / KeyFigs
* Shop ID
ls_sls_sfc-chanm = '0PLANT'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 1.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
** Till ID
ls_sls_sfc-chanm = '0RT_POSNO'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 2.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
** Receipt Number
ls_sls_sfc-chanm = '0RT_RECNUMB'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 4.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
** Posting Date
ls_sls_sfc-chanm = '0PSTNG_DATE'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 3.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
* Time
ls_sls_sfc-chanm = '0TIME'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 0.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
* Airline
ls_sls_sfc-chanm = 'ZAIRLINE'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 0.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
* Flight Number
ls_sls_sfc-chanm = 'ZFLIGHTN'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 0.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
* Destination
ls_sls_sfc-chanm = 'ZPAXDEST'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 0.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
* Final Destination
ls_sls_sfc-chanm = 'ZPAXDESTF'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 0.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
* Passenger Class
ls_sls_sfc-chanm = 'ZPAXCLASS'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 0.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
* EU, non EU code
ls_sls_sfc-chanm = '0CUSTOMER'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 0.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
** Item Characteristics
* Article
ls_sls_sfc-chanm = '0MATERIAL'.
ls_sls_sfc-chaalias = ls_sls_sfc-chanm.
ls_sls_sfc-orderby = 0.
INSERT ls_sls_sfc INTO TABLE lt_sls_sfc.
** Key Figures
* Sales Quantity in SuoM
ls_sls_sfk-kyfnm = '0RT_SALRESA'.
ls_sls_sfk-kyfalias = ls_sls_sfk-kyfnm.
ls_sls_sfk-aggr = 'SUM'.
INSERT ls_sls_sfk INTO TABLE lt_sls_sfk.
* Gross Sales Value
ls_sls_sfk-kyfnm = '0RT_POSSAL'.
ls_sls_sfk-kyfalias = ls_sls_sfk-kyfnm.
ls_sls_sfk-aggr = 'SUM'.
INSERT ls_sls_sfk INTO TABLE lt_sls_sfk.
* Tax Value
ls_sls_sfk-kyfnm = '0RT_POSSALT'.
ls_sls_sfk-kyfalias = ls_sls_sfk-kyfnm.
ls_sls_sfk-aggr = 'SUM'.
INSERT ls_sls_sfk INTO TABLE lt_sls_sfk.
* Discounts
ls_sls_sfk-kyfnm = 'ZREASVAL'.
ls_sls_sfk-kyfalias = ls_sls_sfk-kyfnm.
ls_sls_sfk-aggr = 'SUM'.
INSERT ls_sls_sfk INTO TABLE lt_sls_sfk.
* Reductions
ls_sls_sfk-kyfnm = '0RT_PRICRED'.
ls_sls_sfk-kyfalias = ls_sls_sfk-kyfnm.
ls_sls_sfk-aggr = 'SUM'.
INSERT ls_sls_sfk INTO TABLE lt_sls_sfk.
* Differences
ls_sls_sfk-kyfnm = '0RT_PRICDIF'.
ls_sls_sfk-kyfalias = ls_sls_sfk-kyfnm.
ls_sls_sfk-aggr = 'SUM'.
INSERT ls_sls_sfk INTO TABLE lt_sls_sfk.
** filters
FORM infoprov_sls_selections.
REFRESH lt_sls_range.
CLEAR ls_sls_range.
ls_sls_range-chanm = '0COMP_CODE'.
ls_sls_range-sign = 'I'.
ls_sls_range-compop = 'EQ'.
ls_sls_range-low = 'NL02'.
APPEND ls_sls_range TO lt_sls_range.
CLEAR ls_sls_range.
ls_sls_range-chanm = '0PLANT'.
ls_sls_range-sign = 'I'.
ls_sls_range-compop = 'EQ'.
ls_sls_range-low = 'NLAA'.
APPEND ls_sls_range TO lt_sls_range.
ls_sls_range-low = 'NLAB'.
APPEND ls_sls_range TO lt_sls_range.
ls_sls_range-low = 'NLAC'.
APPEND ls_sls_range TO lt_sls_range.
ls_sls_range-low = 'NLAM'.
APPEND ls_sls_range TO lt_sls_range.
** Test Only
CLEAR ls_sls_range.
ls_sls_range-chanm = '0CALDAY'.
ls_sls_range-sign = 'I'.
ls_sls_range-compop = 'BT'.
ls_sls_range-low = s_datefr.
ls_sls_range-high = s_dateto.
APPEND ls_sls_range TO lt_sls_range.
* CLEAR ls_sls_range.
* ls_sls_range-chanm = '0CALDAY'.
* ls_sls_range-sign = 'I'.
* ls_sls_range-compop = 'LE'.
* ls_sls_range-low = s_dateto.
* APPEND ls_sls_range TO lt_sls_range.
* infoprov_sls_read
DATA: lv_sls_records TYPE I, lv_sls_records_char(9) TYPE C.
lv_sls_end_of_data = ' '.
lv_sls_first_call = 'X'.
* read data in packages directly from the cube
WHILE lv_sls_end_of_data = ' '.
CALL FUNCTION 'RSDRI_INFOPROV_READ'
EXPORTING i_infoprov = lv_sls_icube
i_th_sfc = lt_sls_sfc
i_th_sfk = lt_sls_sfk
i_t_range = lt_sls_range
i_reference_date = sy-datum
i_save_in_table = ' '
i_save_in_file = ' '
I_USE_DB_AGGREGATION = 'X'
i_packagesize = 100000
i_authority_check = 'R'
IMPORTING e_t_data = lt_sls_infoprov_read
e_end_of_data = lv_sls_end_of_data
CHANGING c_first_call = lv_sls_first_call
EXCEPTIONS illegal_input = 1
illegal_input_sfc = 2
illegal_input_sfk = 3
illegal_input_range = 4
illegal_input_tablesel = 5
no_authorization = 6
ncum_not_supported = 7
illegal_download = 8
illegal_tablename = 9
OTHERS = 11.
IF sy-subrc <> 0.
BREAK-POINT. "#EC NOBREAK
EXIT.
ENDIF.
APPEND LINES OF lt_sls_infoprov_read TO gt_sls_infoprov_read.
ENDWHILE.
Other options for "adjusting" a cube:
- convert it to transactional and use BPS functionality (ABAP report SAP_CONVERT_TO_TRANSACTIONAL)
- loopback scenario: generate an export DataSource on your cube with update rules back into the cube itself; you extract the data to be corrected to the PSA, edit the PSA (manually or with mass updates in ABAP code), and then post it back into your cube.
hope this helps...
Olivier. -
Extract BW infocube data to a flat file
Hi Folks,
I have a requirement where I need to extract data from Infocubes to a flat file.
Please suggest me the ABAP programming methodology where i can use function module to extract data from cube and dump it to internal table.
Your immediate help is required.
Thanks,
Rahul
You can simply try a workaround for your problem:
right-click on the cube -> display data -> menu bar -> edit -> download -> save as spreadsheet (.xls) and extract the data.
If you want .txt only, try saving the Excel file as .csv; you can then open that via Notepad and save it as .txt.
You will be saved from writing ABAP code.
-laksh -
HELP!! - Need PL/SQL to write to a flat file!!
I'm trying to query information about a customer's salesrep, and append the results to a flat file. I'm a beginner, and the following pseudocode is the best I have so far. Any advice would be much appreciated.
Thanks in advance!!
Paul
CREATE OR REPLACE PROCEDURE paul IS
file_handle utl_file.file_type;
mgrname VARCHAR2(240);
mgrphone VARCHAR2(30);
mgrext VARCHAR2(10);
BEGIN
file_handle := utl_file.fopen('C:\WINNT\Profiles\pking\Desktop','outputfile.txt','w'); -- fopen takes the directory and file name separately and returns the handle
SELECT
name
,attribute7
,attribute8
INTO
mgrname
,mgrphone
,mgrext
FROM
ra_salesreps_all
rem WHERE
rem X-X-X-X-X
rem
rem EXCEPTION
rem WHEN no_data_found THEN
rem NULL;
utl_file.putf(file_handle, '%s %s %s\n', mgrname, mgrphone, mgrext); -- putf needs a format string
utl_file.fclose(file_handle);
END paul;
Below is a simple one....
Procedure WRITE2FILE (
id_h in integer,
matter in varchar2 default null
) IS
v_FileHandle utl_file.file_type;
root_dir varchar2(200);
file_h varchar2(100);
BEGIN
file_h := 'msg_'||id_h||'.txt'; -- you can give a dynamic file name
root_dir := 'unix_or_nt/home/file_dir';
v_FileHandle := utl_file.fopen(root_dir,file_h,'w');
if matter is not null then
utl_file.put_line(v_FileHandle,'Additional Information');
utl_file.put_line(v_FileHandle,'------------------------------------------------------------------');
utl_file.put(v_FileHandle,matter);
utl_file.new_line(v_FileHandle,1);
utl_file.put_line(v_FileHandle,'------------------------------------------------------------------');
else
utl_file.put(v_FileHandle,matter);
end if;
utl_file.fflush(v_FileHandle);
utl_file.fclose_all();
exception
when utl_file.invalid_path then
DBMS_OUTPUT.PUT_LINE('Invalid path:');
when utl_file.invalid_mode then
DBMS_OUTPUT.PUT_LINE('invalid_mode');
when utl_file.invalid_filehandle then
DBMS_OUTPUT.PUT_LINE('invalid_filehandle');
when utl_file.invalid_operation then
DBMS_OUTPUT.PUT_LINE('Invalid_operation. ');
DBMS_OUTPUT.PUT_LINE('The File is not available.');
when utl_file.read_error then
DBMS_OUTPUT.PUT_LINE('read_error');
when utl_file.write_error then
DBMS_OUTPUT.PUT_LINE('write_error');
when utl_file.internal_error then
DBMS_OUTPUT.PUT_LINE('internal_error');
when others then
DBMS_OUTPUT.PUT_LINE('A problem was encountered while writing the document.');
end ;
Calling procedure>>>>>>>>>
execute write2file(100,'Prints the matter in here.');
will result in a file with name msg_100.txt and the contents of the file will be...
Additional Information
Prints the matter in here.
1)Make sure the directory has write permissions
2) Set the initialization parameter UTL_FILE_DIR = 'unix_or_nt/home/file_dir' on the database server. If it is not set, put it in init.ora (ask your DBA) and restart the db.
3)check the syntax for the '/' and '\' depending on your OS
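As an alternative to the UTL_FILE_DIR parameter in point 2, later Oracle releases support directory objects, which need no database restart (the directory and grantee names are hypothetical):

```sql
CREATE OR REPLACE DIRECTORY file_dir AS '/unix_or_nt/home/file_dir';
GRANT READ, WRITE ON DIRECTORY file_dir TO app_user;
-- UTL_FILE.FOPEN then takes the directory object name in place of the path:
-- v_FileHandle := utl_file.fopen('FILE_DIR', file_h, 'w');
```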
-
CANNOT TRANSPORT INFOSPOKE FLAT FILE DESTINATION PATH
Hi,
I am writing from an InfoSpoke to a flat file on the application server. When I transported this to QA, the Application Server checkbox was unchecked and the path changed to the QA path.
Do I need to have a logical file name in order to transport? If I have 5 InfoSpokes, do I need 5 logical file names, or is there any way to pass a parameter?
Is there any other way to transport using the File Name option instead of Logical File Name?
Thanks
Krishna
Hi Krishna,
You would need an ABAP program to change the entry in the table RSBFILE.
First try this. Goto se11 -> RSBFILE -> give your Infohub name and check if you have authorization to change the path.
ELSE..
Write an ABAP program and update the field.
REPORT test.
tables: RSBFILE.
update RSBFILE set PATH = '<your path>' where OHDEST = '<your InfoHub>'.
Bye
Dinesh -
How to Create a Flat File using FTP/File Adapter
Has anybody done a workaround for creating a flat file using the FTP/File adapter?
I need to create a simple flat file, either delimited or fixed-length. Using the above adapters we can create an XML file; I tried concatenating all the values into a single string and writing that to a file, but it does not have the proper structure.
Can anybody help me out on this?
Thanks
Ram
You can create a text schema while creating a File Adapter. If a schema is specified for the File Adapter, it takes care of converting XML into fixed-length or delimited format.
Thanks,
-Ng. -
After working with flat files (especially text and CSV) for some time in my current role, I have decided to write up this little discussion, which I believe may benefit peers who have not yet come across some of the issues raised in this piece. It is a summary of what I thought would be useful hints for anyone working with CSV or text files. Everything here is purely from my experience, discussed at the SSIS level, though I believe most of the actions also apply in SSMS when dealing with flat files.
Flat files as destination file
Exporting data to a flat file destination is relatively easy and straightforward. There aren't as many issues here as when importing data from flat files. However, whatever you do here hugely impacts how easy importing data from the file you create will be later on. There are a few things to note. When you open your Flat File Destination component you will see a number of options; some of them are discussed below, along with the actions I have found useful to take.
Column names in the first row – If you want the column names to be in the first row of your file, make sure you always check the box for that option. SSIS does not do that for you by default.
Column delimiter – I always prefer Tab {t}. The default delimiter is Comma {,}. The problem I have found with comma delimiters is that if the values themselves contain commas, the system does not always determine correctly where a column ends. As a result, some rows may end up with values shifted out of their original columns owing to wrong column allocation.
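The embedded-comma problem is easy to reproduce outside SSIS. The small Python sketch below (the row values are made up for illustration) shows how a comma inside a value breaks naive comma-splitting, while a tab delimiter, or proper quoting, keeps the columns intact:

```python
import csv
import io

# A row whose second value contains an embedded comma.
row = ["1001", "Smith, John", "Accounts"]

# Naive comma-joined output: the embedded comma creates a fourth "column".
naive = ",".join(row)
assert len(naive.split(",")) == 4  # one column too many

# Tab-delimited output keeps the three values intact.
tabbed = "\t".join(row)
assert tabbed.split("\t") == row

# Alternatively, the csv module quotes values that contain the delimiter.
buf = io.StringIO()
csv.writer(buf).writerow(row)
parsed = next(csv.reader(io.StringIO(buf.getvalue())))
assert parsed == row
```

This is exactly the shifting described above: the naive split assigns "Smith" and " John" to two different columns.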
AlwaysCheckForRowDelimiters – this is a Flat File Connection Manager property. The default setting of this property is True. At one time I found myself having to use this after I faced a huge problem with breaking and misplacement of rows
in the dataset that I was dealing with. The problem emanated from the values in one of the columns. The offending column had values of varchar datatype which were presented in the form of paragraphs with all sorts of special characters within them, e.g.
This is ++, an example of what – I mean… the characters ;
in the dataset which gave me: nearly 100% ++ headaches – looked like {well}; this piece
OF: example??
You can see from the italicised dummy value above what I mean. Values like that cause the system to break the rows prematurely. I don't know why, but the somewhat painful experience I had with this led me to the conclusion that I should not leave the system to auto-decide where a row ends. When I changed the AlwaysCheckForRowDelimiters property from True to False, along with the recommendations in items 1 and 2 above, the breaking and misplacement of rows was solved. By breaking I mean that one row in a table ends up split into two or three separate rows in the flat file, and this is carried over to the new table into which that flat file is loaded.
Addendum
There is an additional option which I have found to work even better when rows break because values are in the 'paragraph' format illustrated above. The option I explained earlier works, but not always, as I have realised. If it does not work, do the following:
When you SELECT the data for export from your table, make your SELECT statement look something like this:
SELECT
ColumnName1,
ColumnName2,
ColumnName3,
REPLACE(REPLACE(OffendingColumnNameName,CHAR(10),''),CHAR(13),'')
AS OffendingColumnNameName,
ColumnName4
FROM
MyTableName
The REPLACE function gets rid of the breaks on your values. That is, it gets rid of the paragraphs in the values.
I would suggest use of double dagger column delimiters if using this approach.
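The same cleanup can be sketched outside SQL. The Python snippet below is only a hedged illustration of what the nested REPLACE does: it removes line-feed (CHAR(10)) and carriage-return (CHAR(13)) characters from a value so it stays on one flat-file row:

```python
def strip_row_breaks(value: str) -> str:
    """Mimic REPLACE(REPLACE(col, CHAR(10), ''), CHAR(13), '')."""
    return value.replace("\n", "").replace("\r", "")

# A dummy 'paragraphed' value of the kind described above.
offending = "This is ++, an example of what\r\nI mean; the characters\r\nOF: example??"
cleaned = strip_row_breaks(offending)

# No line breaks survive, so the value can no longer split a row.
assert "\n" not in cleaned and "\r" not in cleaned
```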
Text or CSV file? – In my experience, going with a text file is always more efficient. Besides, some of the things recommended above only work with text files (I suppose so; I stand to be corrected on this). An example is column delimiters: item 2 above recommends the Tab {t} column delimiter, whereas in CSV, as the name suggests, the delimiters are commas.
Flat files as source file
In my experience, many headaches of working with flat files are seen at importing data from flat files. A few examples of the headaches that I’m talking about are things such as,
Datatypes and datatype length, if using string
Shifting and misplacement of column values
Broken rows, with some pseudo-rows appearing in your import file
Double quotation marks in your values
Below I will address some of the common things which I have personally experienced and hope will be useful to other people. When you open your Flat File Source component, of the options that you will see – listed below is a discussion of some of them. I
have found it useful to take the actions which I have mentioned below.
Retain null values from the source as null values in the data flow – this option comes unchecked by default. Since I noticed its importance I always make sure that I check it; I started doing so after some rows in my destination table came up with shifted and misplaced column values. By shifted and misplaced column values I mean values appearing under columns where you do not expect them, showing that a value has been moved from its original column to another column where it does not belong.
Text qualifier – the default entry here is <none>. I have found that it is always handy to insert double quotes here (“). This will eliminate any double quotes which the system may have included at the time when the flat file was
created. This happens when the values in question have commas as part of the characters in them.
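The effect of a text qualifier can be sketched with Python's csv module (a hedged illustration; SSIS handles this internally). When the export wrapped comma-containing values in double quotes, telling the reader about the qualifier restores the original columns and strips the quotes:

```python
import csv
import io

# A flat-file line as an export tool might write it: the value containing
# a comma was wrapped in double quotes (the text qualifier).
line = '1001,"Smith, John",Accounts\n'

# Reading with quotechar='"' honours the qualifier and strips the quotes.
parsed = next(csv.reader(io.StringIO(line), quotechar='"'))
assert parsed == ["1001", "Smith, John", "Accounts"]
```

Without the qualifier, the same line would parse into four columns with stray quote characters left in the values.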
Column delimiter – this solely depends on the column delimiter which was specified at the time when the flat file was created. The system default is Comma {,}. Please note that if the delimiter specified here is different from the one in
your flat file the system will throw up an error with a message like “An error occurred while skipping data rows”.
Column names in the first data row – if you want the first row to be column names put a check mark on this option.
Datatypes and datatypes length
By default when you import a flat file your datatypes for all the columns come up as varchar (50) in SSIS. More often than not if you leave this default setup your package will fail when you run it. This is because some of the values in some of your columns
will be more than 50 characters, the default length. The resulting error will be a truncation error. I have found two ways of dealing with this.
Advanced – This is an option found on the Flat File Source Editor. Once this option is selected on your Flat File Source Editor you will be presented with a list of columns from your flat file. To determine your datatypes and length there
are two possible things that you can do at this stage.
Go column by column – going column by column you can manually input your desired datatypes and lengths on the Flat File Source Editor through the Advanced option.
Suggest types – this is another option under Advanced selection. What this option does is suggest datatypes and lengths for you based on the sample data amount that you mention in the pop-up dialog box. I have noticed that while this is
a handy functionality, the problem with it is that if some of the values from the non-sampled data have lengths bigger than what the system would have suggested the package will fail with a truncation error.
View code – this means viewing the package's XML code. If, for example, you want all your columns to be 255 characters long in your landing/staging table:
Go to your package name, right click on it and select the option View code from the list presented to you. XML code will then come up.
Hit Ctrl + F to get a “Find and Replace:” window. On “Find What” type in
DTS:MaximumWidth="50" and on “Replace with:” type in
DTS:MaximumWidth="255". Make sure that under “Look in” the selection is
Current Document.
Click “Replace All” and all your default column lengths of 50 characters will be changed to 255 characters.
Once done, save the changes and close the XML code page. Go back to your package GUI designer; the Flat File Source component will now be highlighted with a yellow warning triangle because the metadata definition has changed.
Double-click the Flat File Source component and click OK. The warning will disappear, and you will be set to pull and load your data into your staging database with all columns as varchar (255). If you need to change any columns to specific data types, you can use the Data Conversion component or the Derived Column component, or both, depending on the data types you will be converting to.
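The Find and Replace step above can also be scripted. The Python sketch below mimics it on a small stand-in XML fragment (in practice you would read and write the .dtsx file itself, after backing it up; the attribute names follow the pattern quoted above):

```python
# Rewrite every DTS:MaximumWidth="50" attribute to 255 in package XML.
# A small fragment stands in for the real .dtsx package source here.
package_xml = (
    '<DTS:FlatFileColumn DTS:MaximumWidth="50" DTS:DataType="129"/>'
    '<DTS:FlatFileColumn DTS:MaximumWidth="50" DTS:DataType="129"/>'
)

patched = package_xml.replace('DTS:MaximumWidth="50"', 'DTS:MaximumWidth="255"')

assert 'DTS:MaximumWidth="50"' not in patched
assert patched.count('DTS:MaximumWidth="255"') == 2
```

A plain string replace is deliberate: it reproduces exactly what the editor's "Replace All" does, no more.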
Dynamic Flat File Name and Date
Please see this blog
http://www.bidn.com/blogs/mikedavis/ssis/153/using-expression-ssis-to-save-a-file-with-file-name-and-date
There is far more to flat files than can be discussed in one piece.
Any comments plus additions (and subtractions too) to this piece are welcome.
Mpumelelo

You could create a WIKI about this subject. Also see
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/f46b14ab-59c4-46c0-b125-6534aa1133e9/ssis-guru-needed-apply-within?forum=sqlintegrationservices
Please mark the post as answered if it answers your question | My SSIS Blog:
http://microsoft-ssis.blogspot.com |