Unload CLOB data to a flat file
Hi,
Does anybody know how to unload CLOB data to a flat file? Can you please give an example? Is there any utility in Oracle to unload data to a flat file? Do we have to use the concatenation operator for unloading the data to a flat file?
Any help is appreciated.
regards,
gopu
I don't know if there's an easy way, but you might want to try using the packages UTL_FILE and DBMS_LOB.
Hope that helps.
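A minimal sketch of that combination, assuming an Oracle directory object DATA_DIR (created beforehand with CREATE DIRECTORY) and a hypothetical table docs(id NUMBER, body CLOB) — both names are invented for illustration. The CLOB is read in chunks with DBMS_LOB.READ and written out with UTL_FILE:

```sql
-- Sketch only: DATA_DIR and docs(id, body) are assumed to exist.
DECLARE
  l_clob   CLOB;
  l_file   UTL_FILE.FILE_TYPE;
  l_buffer VARCHAR2(32767);
  l_amount PLS_INTEGER;
  l_pos    PLS_INTEGER := 1;
BEGIN
  SELECT body INTO l_clob FROM docs WHERE id = 1;
  -- Fourth parameter raises max_linesize from the default 1024.
  l_file := UTL_FILE.FOPEN('DATA_DIR', 'doc_1.txt', 'w', 32767);
  LOOP
    l_amount := 32767;  -- DBMS_LOB.READ updates this to the amount actually read
    BEGIN
      DBMS_LOB.READ(l_clob, l_amount, l_pos, l_buffer);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- raised once past the end of the LOB
    END;
    UTL_FILE.PUT(l_file, l_buffer);
    UTL_FILE.FFLUSH(l_file);
    l_pos := l_pos + l_amount;
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/
```

Note that UTL_FILE can only write to directories the database is allowed to access, and 32767 is its maximum line size.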
Similar Messages
-
Loading CLOB data to a flat file in owb 11.2.0.3
Hi,
Also, I have one more question. I want to load a table with CLOB data into a flat file using my OWB client 11.2.0.3. But when I run this simple mapping, I get this error:
ORA-06502: PL/SQL: numeric or value error: character string buffer too small
Please suggest what needs to be done. I was up all night, as I had a deadline for this...
Turned out my mapping was corrupted.
If you do a series of synchronizations by object name, position, or ID, the mapping can end up corrupted.
I had to delete the staging table operator within the mapping, drag, drop, and reconnect it, and now my loads work just fine.
Thanks -
Error while uploading data from a flat file to the hierarchy
Hi guys,
after I upload data from a flat file to the hierarchy, I get an error message "Please select a valid info object". I am loading data using the PSA and have activated all external characteristics, but I still get the problem. Some help on this, please.
regards
Sri
There is no relation between the InfoObject name in the flat file and the InfoObject name on the BW side.
Please check the objects in BW, their lengths and types, and check whether your flat file has the same types there.
Now check the sequence of the objects in the transfer rules and activate them.
There you go. -
Delete data from a flat file using PL/SQL -- Please help.. urgent
Hi All,
We are writing data to a flat file using the Text_IO.Put_Line command. We want to delete data / a record from that file after the data has been written to it.
Please let us know if this is possible.
Thanks in advance.
Vaishali
There's nothing in UTL_FILE to do this, so your options are either to write a Java stored procedure to do it or to use whatever mechanism the host operating system supports for editing files.
Alternatively, you could write a PL/SQL procedure to read each line in from the file and then write them out to a second file, discarding the line you don't want along the way. -
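The read-filter-write alternative described above can be sketched in PL/SQL, again assuming a directory object DATA_DIR; the file names and the DELETE_ME marker are invented, standing in for whatever identifies the record to drop:

```sql
-- Sketch: copy in.txt to out.txt, discarding lines that match a pattern.
DECLARE
  l_in   UTL_FILE.FILE_TYPE;
  l_out  UTL_FILE.FILE_TYPE;
  l_line VARCHAR2(32767);
BEGIN
  l_in  := UTL_FILE.FOPEN('DATA_DIR', 'in.txt',  'r', 32767);
  l_out := UTL_FILE.FOPEN('DATA_DIR', 'out.txt', 'w', 32767);
  BEGIN
    LOOP
      UTL_FILE.GET_LINE(l_in, l_line);
      IF l_line NOT LIKE '%DELETE_ME%' THEN  -- keep every other line
        UTL_FILE.PUT_LINE(l_out, l_line);
      END IF;
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN NULL;  -- GET_LINE raises this at end of file
  END;
  UTL_FILE.FCLOSE(l_in);
  UTL_FILE.FCLOSE(l_out);
END;
/
```

Afterwards you would rename out.txt over in.txt at the OS level, since UTL_FILE itself has no rename-with-overwrite guarantee across all versions.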
What is the best way to load and convert data from a flat file?
Hi,
I want to load data from a flat file, convert dates, numbers and some fields with custom logic (e.g. 0,1 into N,Y) to the correct format.
The rows where all to_number, to_date and custom conversions succeed should go into table STG_OK. If some conversion fails (due to an illegal format in the flat file), those rows (where the conversion raises some exception) should go into table STG_ERR.
What is the best and easiest way to achieve this?
Thanks,
Carsten.
Hi,
thanks for your answers so far!
I gave them a thought and came up with two different alternatives:
Alternative 1
I load the data from the flat file into a staging table using sqlldr. I convert the data to the target format using sqlldr expressions.
The columns of the staging table have the target format (date, number).
The rows that cannot be loaded go into a bad file. I manually load the data from the bad file (without any conversion) into the error table.
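Alternative 1's load step might look like the following SQL*Loader control-file sketch. All table, column, and format names are invented; the DECODE expression shows the 0,1 into N,Y conversion mentioned above, and rows failing a conversion are rejected into the bad file:

```sql
-- Hypothetical control file: conversions run at load time,
-- rejected rows land in stg_ok.bad for separate loading into STG_ERR.
LOAD DATA
INFILE 'input.dat'
BADFILE 'stg_ok.bad'
INTO TABLE stg_ok
FIELDS TERMINATED BY ','
( order_date  DATE "YYYY-MM-DD",
  amount      DECIMAL EXTERNAL,
  active_flag CHAR "DECODE(:active_flag, '0', 'N', '1', 'Y')"
)
```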
Alternative 2
The columns of the staging table are all of type varchar2 regardless of the target format.
I define data rules for all columns that require a later conversion.
I load the data from the flat file into the staging table using external table or sqlldr without any data conversion.
The rows that cannot be loaded go automatically into the error table.
When I read the data from the staging table, I can safely convert it since it is already checked by the rules.
What I dislike in alternative 1 is that I manually have to create a second file and a second mapping (ok, I can automate this using OMB*Plus).
Further, I would prefer using expressions in the mapping for converting the data.
What I dislike in alternative 2 is that I have to create a data rule and a conversion expression and then keep the data rule and the conversion expression in sync (in case of changes of the file format).
I also would prefer to have the data in the staging table in the target format. Well, I might load it into a second staging table with columns having the target format. But that's another mapping and a lot of i/o.
As far as I know I need the data quality option for using data rules, is that true?
Is there another alternative without any of these drawbacks?
Otherwise I think I will go for alternative 1.
Thanks,
Carsten. -
Sqlplus – spool data to a flat file
Hi,
Does any Oracle expert here know why SQL*Plus could not spool all the data into a flat file in one go?
I have tried the commands below. It seems like I get a different file size every time :(
a) sqlplus -s $dbUser/$dbPass@$dbName <<EOF|gzip -c > ${TEMP_FILE_PATH}/${extract_file_prefix}.dat.Z
b) sqlplus -s $dbUser/$dbPass@$dbName <<EOF>> spool.log
set feedback off
set trims on
set trim on
set feedback off
set linesize 4000
set pagesize 0
whenever sqlerror exit 173;
spool ${extract_file_prefix}.dat
For me, this is working. What exactly are you getting, and what exactly are you expecting?
(t352104@svlipari[GEN]:/lem) $ cat test.ksh
#!/bin/ksh
TEMP_FILE_PATH=`pwd`
extract_file_prefix=emp
dbUser=t352104
dbPass=t352104
dbName=gen_dev
dataFile=${TEMP_FILE_PATH}/${extract_file_prefix}.dat
sqlplus -s $dbUser/$dbPass@$dbName <<EOF > $dataFile
set trims on
set trim on
set tab off
set linesize 7000
SET HEAD off AUTOTRACE OFF FEEDBACK off VERIFY off ECHO off SERVEROUTPUT off term off;
whenever sqlerror exit 173;
SELECT *
FROM emp ;
exit
EOF
(t352104@svlipari[GEN]:/lem) $ ./test.ksh
(t352104@svlipari[GEN]:/lem) $ echo $?
0
(t352104@svlipari[GEN]:/lem) $ cat emp.dat
7369 SMITH CLERK 7902 17-DEC-80 800 20
7499 ALLEN SALESMAN 7698 20-FEB-81 1600 300 30
7521 WARD SALESMAN 7698 22-FEB-81 1250 500 30
7566 JONES MANAGER 7839 02-APR-81 2975 20
7654 MARTIN SALESMAN 7698 28-SEP-81 1250 1400 30
7698 BLAKE MANAGER 7839 01-MAY-81 2850 30
7782 CLARK MANAGER 7839 09-JUN-81 2450 10
7788 SCOTT ANALYST 7566 09-DEC-82 3000 20
7839 KING PRESIDENT 17-NOV-81 5000 10
7844 TURNER SALESMAN 7698 08-SEP-81 1500 0 30
7876 ADAMS CLERK 7788 12-JAN-83 1100 20
7900 JAMES CLERK 7698 03-DEC-81 950 30
7902 FORD ANALYST 7566 03-DEC-81 3000 20
7934 MILLER CLERK 7782 23-JAN-82 1300 10
(t352104@svlipari[GEN]:/lem) $ -
Export HRMS data to a flat file
Hi All!
Are there any ways of exporting employee-related data to a flat file without using a client app (PeopleCode or Integration Broker), that is, simply generating a CSV feed from the UI?
You can schedule a query and specify the output format as text. Note that when you select View Log/Trace in Process Monitor, you will see a file with a .csv extension. However, it will open by default in Excel, and even if you select Save instead of Open it will try to change the extension to .xls. You will have to change it back to .csv.
-
Hi all,
I need to export a table's data to a flat file.
But the problem is that the data is huge: about 200 million rows occupying around 60 GB of space.
If I use SQL*Loader in Toad, it takes a huge amount of time to export.
After few months, I need to import the same data again into the table
So please help me which is the efficient and less time taking method to do this
I am very new to this field.
Can some one help me with this?
My oracle database version is 10.2
Thanks in advance.
OK, so first of all I would ask the following questions:
1. Why must you export the data and then re-import it a few months later?
2. Have you read through the documentation for SQLLDR thoroughly? I know it is like stereo instructions, but it has valuable information that will help you.
3. Does the table the data is being re-imported into have anything attached to it, e.g. triggers, indices, or anything that the DB must do on each record? If so, then re-read the sqlldr documentation, as you can turn all of that off during the import and re-index, etc. at your leisure.
I ask these questions because:
1. I would find a way for whatever happens to this data to be accomplished while it was in the DB.
2. Pumping data over the wire is going to be slow when you are talking about that kind of volume.
3. If you insist that the data must be dumped, massaged, and re-imported, do it on the DB server. Disk I/O is an order of magnitude faster than over-the-wire transfer. -
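If the data really must be dumped and reloaded, point 3 above could be sketched with a direct-path SQL*Loader control file like this (all object names invented); run sqlldr locally on the DB server so nothing crosses the wire:

```sql
-- Hypothetical sketch: direct-path load run on the DB server itself.
-- ROWS here is the number of rows between direct-path data saves.
OPTIONS (DIRECT=TRUE, ROWS=250000)
LOAD DATA
INFILE '/u01/exports/big_table.dat'
APPEND
INTO TABLE big_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
```

On the sqlldr command line, direct path also allows skip_index_maintenance=true, so indexes can be rebuilt once after the load instead of being maintained per row.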
Extract PSA Data to a Flat File
Hello,
I would like to download PSA data into a flat file, so I can load it into SQL Server and run SQL statements on it for analysis purposes. I have tried creating a PSA export DataSource; however, I can find no way to cleanly get the data into a flat structure for analysis and/or download.
Can anyone suggest options for doing this?
Thanks in advance for your help.
Sincerely,
Sonya
Hi Sonya,
In the PSA screen, try pressing Shift and F8. If this does not bring up the file types, then you can try the following: Settings > Change Display Variants > View tab > Microsoft Excel > click SAP_OM.xls and then the green check mark. The data will be displayed in Excel format, which you can save to your PC.
Hope this helps... -
Uploading the data from a flat file into ztable
Hi,
I have a requirement where I have to upload the data from 2 flat files into 2 Z tables (ZRB_HDR, ZRB_ITM). From the 1st flat file, only data for a few fields has to be uploaded into the Z table (ZRB_HDR). From the 2nd flat file, data for all the fields has to be uploaded into the Z table (ZRB_ITM). How can I do this?
Regards,
Hema
Hi,
Declare two internal tables with the structure of your tables.
Your flat files should be .txt files.
Now make use of the GUI_UPLOAD function module to upload your flat files into the internal tables.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = 'c:\file1.txt'
has_field_separator = 'X'
TABLES
data_tab = itab1
EXCEPTIONS
OTHERS = 1.
Use this function twice, once for each table.
Then loop over them individually and make use of the INSERT command. -
How to load data from a flat file which is there in the application server
HI All,
how to load data from a flat file which is there in the application server?
Hi,
Firstly you will need to place the file(s) in the AL11 path. Then in your infopackage in "Extraction" tab you need to select "Application Server" option. Then you need to specify the path as well as the exact file you want to load by using the browsing button.
If your file name keeps changing on a daily basis, i.e. name_ddmmyyyy.csv, then in the Extraction tab you have the option to write an ABAP routine that generates the file name. Here you will need to append sy-datum to "name" and then append ".csv" to generate the complete filename.
Please let me know if this is helpful or if you need any more inputs.
Thanks & Regards,
Nishant Tatkar. -
Error Loading Data into a flat file
I am receiving the following error when loading data into a flat file from a flat file. SQL 2005 is my back-end DB. If I cut the file in half (approx. 500K rows) my ODI interface works fine. Not sure what to look at. I rebuilt the interface, which before was just dying giving no error, and now I am getting this.
Thanks
java.lang.Exception
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
Figured it out; found a similar post that suggested changing the heap size.
Increase the heap size in odiparams.bat in the bin folder and restart Designer.
For eg:
set ODI_INIT_HEAP=128m
set ODI_MAX_HEAP=1024m -
Loading Master Data from a Flat File
Hi gurus, I have flat files of different SAP tables (transactional data tables and master data). I want to upload these flat files to a DSO. My question is, in the case of master data files, for example SAP table USR03 (User Address Data) in flat-file format, which is better: to create a master data text-and-attributes DataSource, or to create it as a transactional data DataSource to upload it to a DSO?
Hi
A master data DataSource should be used to load master data like customer details. Customer data is master data which does not change frequently. A customer InfoObject may have some attributes like customer name, customer contact number, customer address, etc. We have to maintain these attributes and text data.
Transaction data is something which keeps changing very frequently, like the price of the material, sales, etc. For loading such data we go for transaction DataSources.
As mentioned above, changes to the address are very rare; therefore you should not use a transactional DataSource.
Regards, Rahul
Edited by: Rahul Pant on Feb 7, 2011 12:24 PM -
Data from a Flat file int a Cube
Hi experts!
Just a quick one: could we load data from a flat file directly to an InfoCube?
Or would we need to create an ODS, load the flat file data there, and later on move it from the ODS to the cube?
Thank you very much for your time!!
Hi,
You can directly load the flat file data into the InfoCube. This should not be a problem.
Create a flat file datasource according to the structure of the flat file. Create Transformations and DTP to the Cube.
Load the flat file data into the PSA and then DTP the request into the cube.
If the flat file load is a full, one-time load, and the data has to be deleted and reloaded on the next load, then the above approach is fine.
But if your load is an everyday delta load, then it would be appropriate to have a DSO in between in the data flow.
Hope it helps. -
Problem in the BDC program to upload the data from a flat file.
Hi,
I am required to write a BDC program to upload the data from a flat file. The conditions are as mentioned below:-
1) Selection Screen will be prompted to user and user needs to provide:- File Path on presentation server (with F4 help for this obligatory parameter) and File Separator e.g. @,#,$,%,... etc(fields in the file will be separated by using this special character) or fields may be separated by tab(tab delimited).
2) Finally after the data is uploaded, following messages need to be displayed:-
a) Total Number of records successfully uploaded.
b) Session Name
c) Number of Sessions created.
The problem is that when each record is fetched from the flat file, the record needs to be split into individual fields separated by the delimiter, or, in case it is tab-separated, handled in the usual manner.
It would be great if you could provide me either the logic, pseudocode, or sample code for this BDC program.
Thanks,
Here is an example program. If you require the delimiter to be a TAB, then enter TAB on the selection screen; if you require the delimiter to be a comma, slash, pipe, whatever, then simply enter that value. This example is simply the uploading of the file, not the BDC; I assume that you know what to do once you have the data in the internal table.
REPORT zrich_0001.
TYPES: BEGIN OF ttab,
rec TYPE string,
END OF ttab.
TYPES: BEGIN OF tdat,
fld1(10) TYPE c,
fld2(10) TYPE c,
fld3(10) TYPE c,
fld4(10) TYPE c,
END OF tdat.
DATA: itab TYPE TABLE OF ttab.
data: xtab like line of itab.
DATA: idat TYPE TABLE OF tdat.
data: xdat like line of idat.
DATA: file_str TYPE string.
DATA: delimitor TYPE string.
PARAMETERS: p_file TYPE localfile.
PARAMETERS: p_del(5) TYPE c.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
DATA: ifiletab TYPE filetable.
DATA: xfiletab LIKE LINE OF ifiletab.
DATA: rc TYPE i.
CALL METHOD cl_gui_frontend_services=>file_open_dialog
CHANGING
file_table = ifiletab
rc = rc.
READ TABLE ifiletab INTO xfiletab INDEX 1.
IF sy-subrc = 0.
p_file = xfiletab-filename.
ENDIF.
START-OF-SELECTION.
TRANSLATE p_del TO UPPER CASE.
CASE p_del.
WHEN 'TAB'.
delimitor = cl_abap_char_utilities=>horizontal_tab.
WHEN others.
delimitor = p_del.
ENDCASE.
file_str = p_file.
CALL METHOD cl_gui_frontend_services=>gui_upload
EXPORTING
filename = file_str
CHANGING
data_tab = itab.
LOOP AT itab into xtab.
CLEAR xdat.
SPLIT xtab-rec AT delimitor INTO xdat-fld1
xdat-fld2
xdat-fld3
xdat-fld4.
APPEND xdat to idat.
ENDLOOP.
LOOP AT idat into xdat.
WRITE:/ xdat-fld1, xdat-fld2, xdat-fld3, xdat-fld4.
ENDLOOP.
Regards,
Rich Heilman