Export HRMS data to a flat file
Hi All!
Is there any way of exporting employee-related data to a flat file without using a client app (PeopleCode or Integration Broker), i.e. simply generating a CSV feed from the UI?
You can schedule a query and specify the output format as text. Note that when you select View Log/Trace in Process Monitor, you will see a file with a .csv extension. However, it will open in Excel by default, and even if you select Save instead of Open, Excel will try to change the extension to .xls. You will have to change it back to .csv.
Similar Messages
-
Export table data in a flat file without using FL
Hi,
I am looking for options to export table data into a flat file without using FL (File Layout), i.e., by using an App Engine only.
Please share your experience if you have done anything like this.
Thanks
A simple way to export any record (table/view) to a CSV file is to create a rowset and loop through all record fields, as in the example code below:
Local Rowset &RS;
Local Record &Rec;
Local File &MYFILE;
Local string &FileName, &strRecName, &Line, &Separator, &Value;
Local number &numRow, &numField;

&FileName = "c:\temp\test.csv";
&strRecName = "PSOPRDEFN";
&Separator = ";";

&RS = CreateRowset(@("Record." | &strRecName));
&RS.Fill();
&MYFILE = GetFile(&FileName, "W", %FilePath_Absolute);
If &MYFILE.IsOpen Then
   For &numRow = 1 To &RS.ActiveRowCount
      &Rec = &RS(&numRow).GetRecord(@("RECORD." | &strRecName));
      For &numField = 1 To &Rec.FieldCount
         &Value = String(&Rec.GetField(&numField).Value);
         If &numField = 1 Then
            &Line = &Value;
         Else
            &Line = &Line | &Separator | &Value;
         End-If;
      End-For;
      &MYFILE.WriteLine(&Line);
   End-For;
End-If;
&MYFILE.Close();

You can of course create an application class to call this piece of code generically.
Hope it helps.
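One caveat with the naive join above: any field value that itself contains the separator or a quote character will corrupt the line. Outside PeopleCode, a hedged Python sketch (the rows are made up, standing in for record data) shows what a quoting-aware CSV writer handles for you:

```python
import csv
import io

# Hypothetical rows standing in for field values fetched from a record.
rows = [
    ["PS", "Admin; West", 'He said "hi"'],
    ["GUEST", "Plain", "value"],
]

buf = io.StringIO()
writer = csv.writer(buf, delimiter=";", quoting=csv.QUOTE_MINIMAL)
for row in rows:
    # Fields containing ';' or '"' are quoted (and quotes doubled) automatically.
    writer.writerow(row)

print(buf.getvalue())
```

If you stick with the PeopleCode version, the equivalent safeguard is to wrap any value containing the separator in double quotes before concatenating.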
Note:
Do not come complaining to me about performance issues ;) -
Exporting table data to a flat file
Hi, I am trying to export data from a table to a text file using this:
CREATE TABLE bufurl
url, sno
ORGANIZATION EXTERNAL
TYPE oracle_datadump
DEFAULT DIRECTORY dataload
LOCATION ('external.txt')
AS
SELECT url, sno from table1 where sno > 10
but i get this error:
ERROR at line 1:
ORA-29829: implementation type does not exist
any clues what might be going on here?
user652125 wrote the question quoted above.
Well, it should work if you are using 10g or above. Note that the access driver type is oracle_datapump, not oracle_datadump; ORA-29829 ("implementation type does not exist") is raised precisely because the misspelled type does not exist. The column list and the external-table clause also need parentheses, as in this working example:
SQL> ed
Wrote file afiedt.buf
1 CREATE TABLE ext_emps
2 ORGANIZATION EXTERNAL
3 (TYPE oracle_datapump
4 DEFAULT DIRECTORY TEST_DIR
5 LOCATION ('emps.txt'))
6 AS
7 SELECT empno, ename
8* from emp
SQL> /
Table created.
SQL>
What version of SQL*Plus are you running? -
Export multiple tables into one flat file
I have data in multiple tables on a processing database that I need to move up to a production database. I want to export the data into a flat file, FTP it to the production server, and have another job pick up the file and process it. I am looking for design suggestions on how to get multiple tables into one flat file using SSIS.
Thank You.
Hey,
Without a bit more detail, as per Russel's response, it is difficult to give an exact recommendation.
Essentially, you would first add a data flow task to your control flow. Create a source per table, then direct the output of each into an union all task. The output from the union all task would then be directed to a flat file destination.
Within the union all task you can map the various input columns into the appropriate outputs.
If the sources are different, it would probably be easiest to add a derived column task in-between the source and the union all, adding columns as appropriate and setting a default value that can be easily identified later (again depending on your requirements).
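Conceptually (outside SSIS), the source-per-table, derived-column, union-all, flat-file-destination flow amounts to the following hedged Python sketch; the table names and rows are made up for illustration:

```python
import csv
import io

# Made-up source tables; in SSIS these would be separate data flow sources.
orders = [(1, "widget"), (2, "gadget")]
returns_ = [(7, "widget")]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["source", "id", "item"])  # the shared output schema

# "Derived column" step: tag each row with its origin so rows can be
# identified later, then "union all" everything into one destination.
for source_name, rows in (("orders", orders), ("returns", returns_)):
    for row in rows:
        writer.writerow([source_name, *row])

print(buf.getvalue())
```

The tagging column plays the same role as the derived-column default value suggested above: the downstream job can split the single file back into per-table sets.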
Hope that helps,
Jamie -
Hi all,
I need to export a table data to a flat file.
But the problem is that the data is huge: about 200 million rows occupying around 60GB of space.
If I use SQL*Loader in Toad, it takes a huge amount of time to export.
After a few months, I need to import the same data back into the table.
So please help me with the most efficient and least time-consuming method to do this.
I am very new to this field.
Can someone help me with this?
My Oracle database version is 10.2.
Thanks in advance
OK, so first of all I would ask the following questions:
1. Why must you export the data and then re-import it a few months later?
2. Have you read through the documentation for SQLLDR thoroughly? I know it reads like stereo instructions, but it has valuable information that will help you.
3. Does the table the data is being re-imported into have anything attached to it, e.g. triggers, indexes, or anything that the DB must do on each record? If so, re-read the sqlldr documentation, as you can turn all of that off during the import and rebuild indexes, etc. at your leisure.
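For reference, the knobs alluded to in point 3 look roughly like this; the table, file, and column names are hypothetical, so treat it as a sketch rather than a drop-in:

```
-- big_table.ctl (hypothetical SQL*Loader control file)
LOAD DATA
INFILE 'big_table.dat'
APPEND
INTO TABLE big_table
FIELDS TERMINATED BY ','
(id, payload)
```

A direct-path load that defers index maintenance until afterwards would then be invoked as: sqlldr userid=user/pass control=big_table.ctl direct=true skip_index_maintenance=true. With 200 million rows, direct path plus deferred index maintenance is usually the difference between hours and days.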
I ask these questions because:
1. I would find a way for whatever happens to this data to be accomplished while it was in the DB.
2. Pumping data over the wire is going to be slow when you are talking about that kind of volume.
3. If you insist that the data must be dumped, massaged, and re-imported, do it on the DB server. Disk I/O is an order of magnitude faster than over-the-wire transfer. -
Extract PSA Data to a Flat File
Hello,
I would like to download PSA data into a flat file, so I can load it into SQL Server and run SQL statements on it for analysis purposes. I have tried creating a PSA export datasource; however, I cannot find a way to cleanly get the data into a flat structure for analysis and/or download.
Can anyone suggest options for doing this?
Thanks in advance for your help.
Sincerely,
Sonya
Hi Sonya,
In the PSA screen, try pressing Shift+F8. If this does not bring up the file types, then you can try the following: Settings > Change Display Variants > View tab > Microsoft Excel > click SAP_OM.xls and then the green check mark. The data will be displayed in Excel format, which you can save to your PC.
Hope this helps... -
Uploading the data from a flat file into ztable
Hi,
I have a requirement where I have to upload data from 2 flat files into 2 Z tables (ZRB_HDR, ZRB_ITM). From the 1st flat file, only data for a few fields has to be uploaded into ztable ZRB_HDR. From the 2nd flat file, data for all the fields has to be uploaded into ztable ZRB_ITM. How can I do this?
Regards,
Hema
Hi,
Declare two internal tables with the structures of your tables.
Your flat files should be .txt files.
Now use the GUI_UPLOAD function module to upload your flat files into the internal tables.
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename            = 'c:\file1.txt'
    has_field_separator = 'X'
  TABLES
    data_tab            = itab1
  EXCEPTIONS
    OTHERS              = 1.
Use this function twice, once for each table.
Then loop over each internal table and use the INSERT command.
Problem in the BDC program to upload the data from a flat file.
Hi,
I am required to write a BDC program to upload the data from a flat file. The conditions are as mentioned below:-
1) Selection Screen will be prompted to user and user needs to provide:- File Path on presentation server (with F4 help for this obligatory parameter) and File Separator e.g. @,#,$,%,... etc(fields in the file will be separated by using this special character) or fields may be separated by tab(tab delimited).
2) Finally after the data is uploaded, following messages need to be displayed:-
a) Total Number of records successfully uploaded.
b) Session Name
c) Number of Sessions created.
The problem is that when each record is fetched from the flat file, it needs to be split into individual fields at the delimiter (or at tabs, in the tab-delimited case) before proceeding in the usual manner.
It would be great if you provide me either the logic, pseudocode, or sample code for this BDC program.
Thanks,
Here is an example program. If you require the delimiter to be a TAB, enter TAB on the selection screen; if you require the delimiter to be a comma, slash, pipe, or whatever, simply enter that value. This example is simply the uploading of the file, not the BDC; I assume that you know what to do once you have the data in the internal table.
REPORT zrich_0001.
TYPES: BEGIN OF ttab,
rec TYPE string,
END OF ttab.
TYPES: BEGIN OF tdat,
fld1(10) TYPE c,
fld2(10) TYPE c,
fld3(10) TYPE c,
fld4(10) TYPE c,
END OF tdat.
DATA: itab TYPE TABLE OF ttab.
data: xtab like line of itab.
DATA: idat TYPE TABLE OF tdat.
data: xdat like line of idat.
DATA: file_str TYPE string.
DATA: delimitor TYPE string.
PARAMETERS: p_file TYPE localfile.
PARAMETERS: p_del(5) TYPE c.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
DATA: ifiletab TYPE filetable.
DATA: xfiletab LIKE LINE OF ifiletab.
DATA: rc TYPE i.
CALL METHOD cl_gui_frontend_services=>file_open_dialog
CHANGING
file_table = ifiletab
rc = rc.
READ TABLE ifiletab INTO xfiletab INDEX 1.
IF sy-subrc = 0.
p_file = xfiletab-filename.
ENDIF.
START-OF-SELECTION.
TRANSLATE p_del TO UPPER CASE.
CASE p_del.
WHEN 'TAB'.
delimitor = cl_abap_char_utilities=>horizontal_tab.
WHEN others.
delimitor = p_del.
ENDCASE.
file_str = p_file.
CALL METHOD cl_gui_frontend_services=>gui_upload
EXPORTING
filename = file_str
CHANGING
data_tab = itab.
LOOP AT itab into xtab.
CLEAR xdat.
SPLIT xtab-rec AT delimitor INTO xdat-fld1
xdat-fld2
xdat-fld3
xdat-fld4.
APPEND xdat to idat.
ENDLOOP.
LOOP AT idat into xdat.
WRITE:/ xdat-fld1, xdat-fld2, xdat-fld3, xdat-fld4.
ENDLOOP.
Regards,
Rich Heilman -
Procurement Card data upload from flat file to database
Hi All,
I need to upload Procurement Card data from a flat file to the database in the table BBP_PCMAS.
I found a BAPI, BAPI_PCARD_CREATEMULTIPLE, which uploads the data perfectly; however, the structure PCMASTER that it takes as input does not contain the field for blocking reason (PCBLOCK - Reason for blocking procurement card). I need to upload this field as well from the flat file.
Any suggestions?
Thanks
Hi,
You are correct: the function module BAPI_PCARD_CREATEMULTIPLE does not contain the PCBLOCK field.
Alternatively, what you can do is read the PC data after it is created and modify it with PCBLOCK set appropriately. The necessary function modules are given below.
BBP_PCMAS_READ_PCMAS - Read Data
BBP_PCMAS_MODIFY_PCMAS - Modify Data
Note: BBP_PCMAS_MODIFY_PCMAS is an update-task FM, hence it should be called as shown below (refer to form write_data of the FM BAPI_PCARD_CREATEMULTIPLE):
CALL FUNCTION 'BBP_PCMAS_MODIFY_PCMAS' IN UPDATE TASK
  EXPORTING
    i_pcmas     = i_pcmas
*   i_pcmas_old =
*   i_delete    =
  TABLES
    t_pcacc     = i_pcacc
*   t_pcacc_old =
  EXCEPTIONS
    not_found   = 1
    OTHERS      = 2.
Regards
Kathirvel -
BAPI to upload data from a flat file to VA01
Hi guys,
I have a requirement wherein i need to upload data from a flat file to VA01.Please tell me how do i go about this.
Thanks and regards,
Frank.
Hi,
I previously posted this code as well:
* Include YCL_CREATE_SALES_DOCU
* Form salesdocu
* This subroutine is used to create a sales order
*   -->P_HEADER            Document Header Data
*   -->P_HEADERX           Checkbox for Header Data
*   -->P_ITEM              Item Data
*   -->P_ITEMX             Item Data Checkboxes
*   -->P_LT_SCHEDULES_IN   Schedule Line Data
*   -->P_LT_SCHEDULES_INX  Checkbox Schedule Line Data
*   -->P_PARTNER           Document Partner
*   <--P_W_VBELN           Sales Document Number
DATA:
lfs_return like line of t_return.
FORM create_sales_document changing P_HEADER like fs_header
P_HEADERX like fs_headerx
Pt_ITEM like t_item[]
Pt_ITEMX like t_itemx[]
P_LT_SCHEDULES_IN like t_schedules_in[]
P_LT_SCHEDULES_INX like t_schedules_inx[]
Pt_PARTNER like t_partner[]
P_w_vbeln like w_vbeln.
* This perform fills the required data for sales order creation
perform sales_fill_data changing p_header
p_headerx
pt_item
pt_itemx
p_lt_schedules_in
p_lt_schedules_inx
pt_partner.
* Function module to create the sales and distribution document
perform sales_order_creation using p_header
p_headerx
pt_item
pt_itemx
p_lt_schedules_in
p_lt_schedules_inx
pt_partner.
perform return_check using p_w_vbeln .
ENDFORM. " salesdocu
* Form commit_work: executes an external commit
FORM commit_work .
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = c_x.
ENDFORM.                    " commit_work
* Include ycl_sales_order_header: fills header data and item data
INCLUDE ycl_sales_order_header.
* Form return_check: validates the sales order creation
FORM return_check using pr_vbeln type vbeln.
if pr_vbeln is initial.
LOOP AT t_return into lfs_return .
WRITE / lfs_return-message.
clear lfs_return.
ENDLOOP. " Loop at return
else.
perform commit_work. " External Commit
Refresh t_return.
fs_disp-text = text-003.
fs_disp-number = pr_vbeln.
append fs_disp to it_disp.
if p_del eq c_x or p_torder eq c_x or
p_pgi eq c_x or p_bill eq c_x.
perform delivery_creation. " Delivery order creation
endif. " If p_del eq 'X'......
endif. " If p_w_vbeln is initial
ENDFORM. " Return_check
*&      Form  sales_order_creation
*   -->P_P_HEADER            text
*   -->P_P_HEADERX           text
*   -->P_PT_ITEM             text
*   -->P_PT_ITEMX            text
*   -->P_P_LT_SCHEDULES_IN   text
*   -->P_P_LT_SCHEDULES_INX  text
*   -->P_PT_PARTNER          text
FORM sales_order_creation USING P_P_HEADER like fs_header
P_P_HEADERX like fs_headerx
P_PT_ITEM like t_item[]
P_PT_ITEMX like t_itemx[]
P_P_LT_SCHEDULES_IN like t_schedules_in[]
P_P_LT_SCHEDULES_INX like t_schedules_inx[]
P_PT_PARTNER like t_partner[].
CALL FUNCTION 'BAPI_SALESDOCU_CREATEFROMDATA1'
EXPORTING
sales_header_in = p_p_header
sales_header_inx = p_p_headerx
IMPORTING
salesdocument_ex = w_vbeln
TABLES
return = t_return
sales_items_in = p_pt_item
sales_items_inx = p_pt_itemx
sales_schedules_in = p_p_lt_schedules_in
sales_schedules_inx = p_p_lt_schedules_inx
sales_partners = p_pt_partner.
ENDFORM. " sales_order_creation
Please reward points if this was useful. -
How to export a data as an XML file from oracle data base?
Could you please tell me the step-by-step procedure for the following: how do I export data as an XML file from an Oracle database? Is it possible? Please tell me, it is an urgent requirement...
Thanks in advance
Bala
SQL> SELECT * FROM v$version;
BANNER
Oracle DATABASE 11g Enterprise Edition Release 11.1.0.6.0 - Production
PL/SQL Release 11.1.0.6.0 - Production
CORE 11.1.0.6.0 Production
TNS FOR 32-bit Windows: Version 11.1.0.6.0 - Production
NLSRTL Version 11.1.0.6.0 - Production
5 rows selected.
SQL> CREATE OR REPLACE directory utldata AS 'C:\temp';
Directory created.
SQL> declare
2 doc DBMS_XMLDOM.DOMDocument;
3 xdata XMLTYPE;
4
5 CURSOR xmlcur IS
6 SELECT xmlelement("Employee",XMLAttributes('http://www.w3.org/2001/XMLSchema' AS "xmlns:xsi",
7 'http://www.oracle.com/Employee.xsd' AS "xsi:nonamespaceSchemaLocation")
8 ,xmlelement("EmployeeNumber",e.empno)
9 ,xmlelement("EmployeeName",e.ename)
10 ,xmlelement("Department",xmlelement("DepartmentName",d.dname)
11 ,xmlelement("Location",d.loc)
12 )
13 )
14 FROM emp e
15 , dept d
16 WHERE e.DEPTNO=d.DEPTNO;
17
18 begin
19 OPEN xmlcur;
20 FETCH xmlcur INTO xdata;
21 CLOSE xmlcur;
22 doc := DBMS_XMLDOM.NewDOMDocument(xdata);
23 DBMS_XMLDOM.WRITETOFILE(doc, 'UTLDATA/marco.xml');
24 end;
25 /
PL/SQL procedure successfully completed.
. -
Error while uploading data from a flat file to the hierarchy
Hi guys,
after I upload data from a flat file to the hierarchy, I get the error message "Please select a valid info object". I am loading data using the PSA and have activated all external chars, but I still get the problem. Some help on this please.
regards
Sri
There is no relation between the infoobject name in the flat file and the infoobject name on the BW side.
Please check the objects in BW, their lengths and types, and check whether your flat file has the same types.
Now check the sequence of the objects in the transfer rules and activate them.
There you go.
Delete data from a flat file using PL/SQL -- Please help.. urgent
Hi All,
We are writing data to a flat file using the Text_IO.Put_Line command. We want to delete data/records from that file after the data has been written to it.
Please let us know if this is possible.
Thanks in advance.
Vaishali
There's nothing in UTL_FILE to do this, so your options are either to write a Java stored procedure to do it or to use whatever mechanism the host operating system supports for editing files.
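On the host OS side, the edit usually amounts to copying the file while skipping the unwanted line. A hedged Python sketch of that idea, with hypothetical paths and match string:

```python
import os


def delete_matching_line(path: str, needle: str) -> None:
    """Rewrite the file at `path` without any line containing `needle`."""
    tmp = path + ".tmp"
    with open(path) as src, open(tmp, "w") as dst:
        for line in src:
            if needle not in line:
                dst.write(line)
    os.replace(tmp, path)  # atomic rename on POSIX systems


# Hypothetical usage:
# delete_matching_line("/u01/out/report.dat", "EMP_42")
```

The same copy-and-filter pattern works in any language available on the server.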
Alternatively, you could write a PL/SQL procedure to read each line in from the file and then write them out to a second file, discarding the line you don't want along the way. -
What is the best way to load and convert data from a flat file?
Hi,
I want to load data from a flat file, convert dates, numbers and some fields with custom logic (e.g. 0,1 into N,Y) to the correct format.
The rows where all to_number, to_date and custom conversions succeed should go into table STG_OK. If some conversion fails (due to an illegal format in the flat file), those rows (where the conversion raises some exception) should go into table STG_ERR.
What is the best and easiest way to achieve this?
Thanks,
Carsten.
Hi,
thanks for your answers so far!
I gave them a thought and came up with two different alternatives:
Alternative 1
I load the data from the flat file into a staging table using sqlldr. I convert the data to the target format using sqlldr expressions.
The columns of the staging table have the target format (date, number).
The rows that cannot be loaded go into a bad file. I manually load the data from the bad file (without any conversion) into the error table.
Alternative 2
The columns of the staging table are all of type varchar2 regardless of the target format.
I define data rules for all columns that require a later conversion.
I load the data from the flat file into the staging table using external table or sqlldr without any data conversion.
The rows that cannot be loaded go automatically into the error table.
When I read the data from the staging table, I can safely convert it since it is already checked by the rules.
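Whichever alternative wins, the underlying convert-and-route step is the same. A hedged Python sketch of that pattern, with made-up rows and conversions (date, number, and the custom 0/1 to N/Y mapping from the question):

```python
from datetime import datetime

# Made-up raw rows as read from a flat file: (date, amount, flag).
raw_rows = [
    ("2024-01-15", "12.5", "1"),
    ("not-a-date", "9.9", "0"),   # bad date: should land in the error table
    ("2024-02-01", "7.0", "1"),
]

stg_ok, stg_err = [], []
for row in raw_rows:
    try:
        converted = (
            datetime.strptime(row[0], "%Y-%m-%d").date(),  # like to_date
            float(row[1]),                                  # like to_number
            {"0": "N", "1": "Y"}[row[2]],                   # custom 0/1 -> N/Y
        )
        stg_ok.append(converted)
    except (ValueError, KeyError):
        stg_err.append(row)  # keep the raw row for later inspection

print(len(stg_ok), len(stg_err))
```

This is exactly what the bad file gives you for free in alternative 1, and what the data rules give you in alternative 2; the trade-off is only where the try/except lives.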
What I dislike in alternative 1 is that I manually have to create a second file and a second mapping (ok, I can automate this using OMB*Plus).
Further, I would prefer using expressions in the mapping for converting the data.
What I dislike in alternative 2 is that I have to create a data rule and a conversion expression and then keep the data rule and the conversion expression in sync (in case of changes of the file format).
I also would prefer to have the data in the staging table in the target format. Well, I might load it into a second staging table with columns having the target format. But that's another mapping and a lot of i/o.
As far as I know I need the data quality option for using data rules, is that true?
Is there another alternative without any of these drawbacks?
Otherwise I think I will go for alternative 1.
Thanks,
Carsten. -
Sqlplus – spool data to a flat file
Hi,
Does any Oracle expert here know why the sqlplus command cannot spool all the data into a flat file in one go?
I have tried the commands below. It seems like I get a different file size every time :(
a) sqlplus -s $dbUser/$dbPass@$dbName <<EOF|gzip -c > ${TEMP_FILE_PATH}/${extract_file_prefix}.dat.Z
b) sqlplus -s $dbUser/$dbPass@$dbName <<EOF>> spool.log
set feedback off
set trims on
set trim on
set feedback off
set linesize 4000
set pagesize 0
whenever sqlerror exit 173;
spool ${extract_file_prefix}.dat
For me, this is working. What exactly are you getting and what exactly are you expecting?
(t352104@svlipari[GEN]:/lem) $ cat test.ksh
#!/bin/ksh
TEMP_FILE_PATH=`pwd`
extract_file_prefix=emp
dbUser=t352104
dbPass=t352104
dbName=gen_dev
dataFile=${TEMP_FILE_PATH}/${extract_file_prefix}.dat
sqlplus -s $dbUser/$dbPass@$dbName <<EOF > $dataFile
set trims on
set trim on
set tab off
set linesize 7000
SET HEAD off AUTOTRACE OFF FEEDBACK off VERIFY off ECHO off SERVEROUTPUT off term off;
whenever sqlerror exit 173;
SELECT *
FROM emp ;
exit
EOF
(t352104@svlipari[GEN]:/lem) $ ./test.ksh
(t352104@svlipari[GEN]:/lem) $ echo $?
0
(t352104@svlipari[GEN]:/lem) $ cat emp.dat
7369 SMITH CLERK 7902 17-DEC-80 800 20
7499 ALLEN SALESMAN 7698 20-FEB-81 1600 300 30
7521 WARD SALESMAN 7698 22-FEB-81 1250 500 30
7566 JONES MANAGER 7839 02-APR-81 2975 20
7654 MARTIN SALESMAN 7698 28-SEP-81 1250 1400 30
7698 BLAKE MANAGER 7839 01-MAY-81 2850 30
7782 CLARK MANAGER 7839 09-JUN-81 2450 10
7788 SCOTT ANALYST 7566 09-DEC-82 3000 20
7839 KING PRESIDENT 17-NOV-81 5000 10
7844 TURNER SALESMAN 7698 08-SEP-81 1500 0 30
7876 ADAMS CLERK 7788 12-JAN-83 1100 20
7900 JAMES CLERK 7698 03-DEC-81 950 30
7902 FORD ANALYST 7566 03-DEC-81 3000 20
7934 MILLER CLERK 7782 23-JAN-82 1300 10
(t352104@svlipari[GEN]:/lem) $