Exporting data from a table with an object-type column produces a wrong file
Hi!
If I try to export data from a table that has a column of an object type such as MDSYS.SDO_GEOMETRY in SQL Developer (both 1.2.0 and 1.2.1.3213), the resulting file contains only the type name instead of the data. For the INSERT clause:
Insert into table_name (NUMB,GEOLOC) values (500949,'MDSYS.SDO_GEOMETRY');
Also, in the previous version (1.2.0), when this column was shown in the data window, the display was more informative:
MDSYS.SDO_GEOMETRY(2006, 262148, NULL, MDSYS.SDO_ELEM_INFO_ARRAY(1,2,1,95,2,1,109,2,1,133,2,1,157,2,1), MDSYS.SDO_ORDINATE_ARRAY(22847.57591,7216.21100000001,22842.04691,7217.2571,22841.44841,7218.00440000001,22843.39211,7228.31675000001,22844.13881,7232.35205000001,22845.63335,7239.52580000001,22845.63335,7240.27310000001,22845.03599,7240.72145000001,22826.05499,7244.15885000001,22814.39735,7246.10180000001,22809.01769,7246.84910000001,22807.67249,7246.40075000001,22802.44103,7222.33850000001,22799.19203,7213.03505000001,22795.8656525,7208.73815000001,22794.81386,7208.73200000001,22789.47752,7208.70080000001,22784.3570675,7209.03725000001,22758.6899675,7184.04095000001,22757.3447675,7183.59260000001,22751.9645375,7183.59245000001,22744.006055,7183.03205000001,22743.258785,7181.83640000001,22737.1684775,7181.35070000001,22736.7201725,7182.69575,22729.546295,7183.59245000001,22726.7066975,7186.58165000001,22725.9594275,7186.73105000001,22725.2121575,7186.43210000001,22723.11983,7184.56400000001,22722.29789,7184.48915000001,22721.55062,7186.28270000001,22721.326325,7186.80575000001,22717.515305,7191.36410000001,22715.7218,7193.68070000001,22710.1920875,7200.48080000001,22709.4448175,7206.90740000001,22709.370005,7214.15585000001,22709.74364,7214.52950000001,22711.6866275,7215.35150000001,22711.83611,7216.84610000001,22711.98545,7220.05925000001,22711.611815,7236.12560000001,22711.3876625,7247.63360000001,22711.4249975,7249.76345000001,22710.7523975,7250.95910000001,22710.0051275,7252.45355000001,22849.96763,7244.45780000001,22848.8559875,7243.04300000001,22848.32375,7242.36545000001,22849.51961,7243.41155000001,22848.8559875,7243.04300000001,22846.82921,7241.91710000001,22826.05499,7244.15885000001,22263.062285,7163.22935000001,22263.809555,7173.01865000001,22265.67773,7194.61475000001,22265.902025,7196.78180000001,22265.902025,7197.23015000001,22265.8272125,7197.37970000001,22265.304095,7197.97745000001,22217.9272625,7201.19075,22217.1799925,7201.56440000001,22216.8063575,7202.31170000001,22216.35791,7204.47875000001,22216.731545,7206.12275000001,22800.2381225,7220.28350000001,22798.3699475,7214.23070000001,22796.651255,7211.31620000001,22795.3061975,7209.82175000001,22794.9325625,7209.22385000001,22794.81386,7208.73200000001,22785.5170175,7170.21620000001,22777.3717175,7133.0768,22776.9234125,7130.76035000001,22775.727695,7125.90305000001,22774.6816025,7120.82150000001,22773.7100375,7115.81480000001,22774.53212,7109.98610000001,22774.4573075,7110.73340000001,22773.2617325,7111.70480000001,22773.1870625,7112.45210000001,22773.7100375,7115.81480000001,22773.11225,7113.87185000001,22767.95603,7108.93985000001))
whereas the new one shows only:
MDSYS.SDO_GEOMETRY
WBR,
Sergey
I'm a newbie here and not sure exactly what you want, but here goes.
First of all, I created a table on Oracle 10g (10.2.0.3) Enterprise Edition as follows:
CREATE TABLE tblnm (
"MI_PRINX" NUMBER(11,0),
"GEOLOC" MDSYS.SDO_GEOMETRY,
CONSTRAINT RP_MAP_PK PRIMARY KEY (MI_PRINX)
);
INSERT INTO USER_SDO_GEOM_METADATA (TABLE_NAME, COLUMN_NAME, DIMINFO, SRID)
VALUES ('tblnm','GEOLOC',MDSYS.SDO_DIM_ARRAY(mdsys.sdo_dim_element('X', -100000.0, 185000.0, 1.425E-5), mdsys.sdo_dim_element('Y', -100000.0, 200000.0, 1.5E-5)),262148);
CREATE INDEX tblnm_SX ON tblnm (GEOLOC)
INDEXTYPE IS MDSYS.SPATIAL_INDEX;
insert into tblnm (MI_PRINX,GEOLOC) VALUES
(1,MDSYS.SDO_GEOMETRY(2001, 262148, NULL, MDSYS.SDO_ELEM_INFO_ARRAY(1,1,1), MDSYS.SDO_ORDINATE_ARRAY(6946.74932,9604.25675000001)));
After that I exported the data from this table with SQL Developer.
As an INSERT clause, the result was:
-- INSERTING into TBLNM
Insert into TBLNM (MI_PRINX,GEOLOC) values (1,'MDSYS.SDO_GEOMETRY');
When I tried to import the data (after a delete) with this command, I got:
ERROR at line 1:
ORA-00932: inconsistent datatypes: expected MDSYS.SDO_GEOMETRY got CHAR
For the loader clause, the file looks like:
LOAD DATA
INFILE *
Truncate
INTO TABLE "TBLNM"
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(MI_PRINX,
GEOLOC)
begindata
"1","MDSYS.SDO_GEOMETRY"
and so on. The result file doesn't contain the data for the SDO_GEOMETRY column, only the name of the class.
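Until the exporter handles object types, the INSERT text can be generated by hand from the geometry's components. A rough sketch of the idea in Python (the helper names are made up; the sample values are the ones from the tblnm test case above):

```python
def sdo_geometry_literal(gtype, srid, elem_info, ordinates):
    """Render an MDSYS.SDO_GEOMETRY constructor call as SQL text."""
    def num(v):
        # repr() keeps full float precision (e.g. 9604.25675000001)
        return repr(v) if isinstance(v, float) else str(v)
    return ("MDSYS.SDO_GEOMETRY({g}, {s}, NULL, "
            "MDSYS.SDO_ELEM_INFO_ARRAY({e}), "
            "MDSYS.SDO_ORDINATE_ARRAY({o}))").format(
        g=gtype, s=srid,
        e=",".join(num(v) for v in elem_info),
        o=",".join(num(v) for v in ordinates))

def sdo_insert(table, key_col, key_val, geom_col, geom_literal):
    """Full INSERT statement: the geometry is a constructor, not a quoted string."""
    return "Insert into {t} ({k},{g}) values ({v},{geom});".format(
        t=table, k=key_col, g=geom_col, v=key_val, geom=geom_literal)
```

For example, `sdo_insert("TBLNM", "MI_PRINX", 1, "GEOLOC", sdo_geometry_literal(2001, 262148, [1, 1, 1], [6946.74932, 9604.25675000001]))` reproduces the original working INSERT; unlike the exporter's quoted-string version, a statement of this shape loads back without ORA-00932.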
Similar Messages
-
Exporting Data Sets as Files in CS5
Why would CS5 crash every time I try to export Data Sets as files? I have just been doing it in CS2 because of the problem. However, CS2 is acting up for some reason at the moment.
If CS2 were working, as it has in the past, then the exact same sets would export easily from there, but not from CS5. Is there any way to fix this problem?
Hello,
Yes, I am running the latest version, on a Mac. The Data Set is moderately sized: about 30 sets with 5 parameters (sorry if that is not exactly what they are usually called). It crashes while the data set is processing. After it saves a couple of files the system crashes. Sometimes it makes it a bit further, but never all the way through a whole set. CS2 is having all sorts of problems for me all of a sudden. I am going to look for my disk to uninstall and then reinstall. It can't open files from File/Open and crashes within minutes of opening, even if I'm not doing anything with it yet. The only reason I have it is to run the Data Sets, so I just hope I can find my install disk.
Thanks for any help! -
Does the tree have a standard function to export data to a local file?
Hi:
Does the tree have a standard function to export data to a local file? You know, ALV has a standard button to do it.
Thanks.
qimingxing.
I don't think there is a method built into the tree object for doing this.
[SAP Tree and Tree Model|http://help.sap.com/saphelp_47x200/helpdata/EN/b7/147a36c70d2354e10000009b38f839/frameset.htm]
I think one way you could do it is have an option on the application menu, then manually parse the data in the tree and build up an itab yourself from the expanded nodes (or all nodes or whatever). Then you could export the data from the itab. -
Data conversion while exporting data into flat files using the export wizard in SSIS
Hi ,
While exporting data to a flat file through the export wizard, the source table has NVARCHAR types.
Could you please help me with how to do the data conversion while using the export wizard?
Thanks.
Hi Avs sai,
By default, the columns in the destination flat file will be non-Unicode columns, e.g. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input column when outputting to the destination file, you can check
the Unicode option on the “Choose a Destination” page of the SQL Server Import and Export Wizard. Then, on the “Configure Flat File Destination” page, you can click the “Edit Mappings…” button to check the data types.
Regards,
Mike Yin
TechNet Community Support -
Error exporting data to Excel file
Hello,
I have written a macro to export some channels into an Excel file, skipping the Excel wizard by using the following code:
EXCELChnCount = intChannels - 1
For i = 1 To intChannels
ExcelExpChn(i) = ChannelList(i - 1)
Next
Call EXCELExport ("c:\test.xls","data",0,"")
No file is created, but when I output EXCELStat / EXCELStatTex it contains the error code 100 "a general error occured".
The strange thing is, if I run the wizard once (by changing the 0 to a 1), and just press "finish", the wizard-less export works flawlessly afterwards.
Has anybody an idea what is wrong? Do I have to set some more variables in order to make it work without using the wizard?
Using an STP configuration file, as I've read in some other postings, isn't an option, as the name and the number of exported channels change constantly.
Thanks for your help, and best regards!
Sam
Hi Sam,
I did not try your script out, but I often receive the mysterious "error 100" when Excel already has the file open - Excel is very protective of its open files. Sometimes Excel will crash too, but checking your Task Manager will reveal it is still running...
Julia -
SSIS 2008 R2 - Export data to Flat File in another server
Hello Everybody,
I'm trying to export data from a table on one database server to a flat file on another server; however, an Access Denied error occurs. I'm using SQL Server Integration Services 2008 R2 to do this.
I've checked everything that could be causing this at the security level and could not resolve it.
Please, could someone tell me what I'm missing in this case?
Following error log:
SSIS package "AtualizarDados.dtsx" starting.
Information: 0x4004300A at Gerar arquivo, SSIS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Gerar arquivo, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Gerar arquivo, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Gerar arquivo, Geração de Arquivo [63]: The processing of file "\\targetserver\Export_File\Export_File.txt" has started.
Warning: 0x80070005 at Gerar arquivo, Geração de Arquivo [63]: Access is denied.
Error: 0xC020200E at Gerar arquivo, Geração de Arquivo [63]: Cannot open the datafile "\\targetserver\Export_File\Export_File.txt".
Error: 0xC004701A at Gerar arquivo, SSIS.Pipeline: component "Geração de Arquivo" (63) failed the pre-execute phase and returned error code 0xC020200E.
Information: 0x40043008 at Gerar arquivo, SSIS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Gerar arquivo para DMZ, Geração de Arquivo [63]: The processing of file "\\targetserver\Export_File\Export_File.txt" has ended.
Information: 0x4004300B at Gerar arquivo, SSIS.Pipeline: "component "Geração de Arquivo" (63)" wrote 0 rows.
Information: 0x40043009 at Gerar arquivo, SSIS.Pipeline: Cleanup phase is beginning.
Task failed: Gerar arquivo
SSIS package "AtualizarDados.dtsx" finished: Success.
Regards,
Antonio Estima
I got it working after using a domain user.
Thanks to all -
Backup of MaxDB 7.6 with HP Data Protector; Wrong file type
Hello.
We are trying to back up a MaxDB database (Rel. 7.6) on SUSE Linux using HP Data Protector. The attempt to run the backup fails because the pipe apparently cannot be created on the MaxDB server. The following message is output:
From: BSM-hph3000.thebis.de "Content_sdb_sdba" Time: 26.04.2008 10:19:16
OB2BAR application on "dell2950.thebis.de" successfully started.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `user_logon'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `dbm_configset -raw set_variable_1 OB2OPTS="(null)"'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `dbm_configset -raw set_variable_2 OB2APPNAME=SDB'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `dbm_configset -raw set_variable_3 OB2BARHOSTNAME=dell2950.thebis.de'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `dbm_configset -raw set_variable_4 TimeoutWaitFiles=30'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `dbm_configset -raw BSI_ENV /usr/omni/tmp/SDB.bsi_env'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `medium_put BACKDP-Data[1]/1 /usr/omni/tmp/SDB.BACKDP-Data[1].1 PIPE DATA 0 8'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `util_connect'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:22
Executing the dbmcli command: `backup_start BACKDP-Data[1] DATA'.
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:23
Error: SAPDB responded with:
-24988,ERR_SQL: SQL error
-903,Host file I/O error
3,Data backup failed
1,Backupmedium #1 (/usr/omni/tmp/SDB.BACKDP-Data[1].1) Wrong file type
6,Backup error occured, Errorcode 3700 "hostfile_error"
17,Servertask Info: because Error in backup task occured
10,Job 1 (Backup / Restore Medium Task) WaitingT131 Result=3700
6,Error in backup task occured, Errorcode 3700 "hostfile_error"
From: OB2BAR_SAPDBBAR-dell2950.thebis.de "SDB" Time: 04/26/08 10:16:23
Executing the dbmcli command: `exit'.
From: BSM-hph3000.thebis.de "Content_sdb_sdba" Time: 26.04.2008 10:19:18
OB2BAR application on "dell2950.thebis.de" disconnected.
From: BSM-hph3000.thebis.de "Content_sdb_sdba" Time: 26.04.2008 10:19:18
None of the Disk Agents completed successfully.
Session has failed.
An attempt to create the pipe /usr/omni/tmp/SDB.BACKDP-Data[1].1 manually on the MaxDB server failed as well. A pipe /usr/omni/tmp/SDB.BACKDP-Data can be created.
Question 1) Does anyone have a similar problem, and a solution for it?
Question 2) Where is it stored which medium name the Data Protector should use for the "medium_put" command?
Thanks for your help!
Regards,
Gerhard Krauß
Heidelberg
Hi Marcus,
We use version 5.5.
The pipe name is generated by the Data Protector. When I try to create a pipe with this name manually, I get the message "No hit found".
Do you know a way to configure (in Data Protector or MaxDB) which medium name should be used for creating the pipe?
Thanks a lot!
Gerhard -
Hi!
I use this process in APEX to export data from a table into txt files with fixed length, so without any delimiter:
>
declare
v_file_name VARCHAR2 (2000) := 'test.txt';
id varchar2(9);
worker varchar2(30);
address varchar2(26);
begin
OWA_UTIL.mime_header ('application/txt', FALSE);
htp.p('Content-Disposition:attachment;filename="'|| v_file_name|| '"');
OWA_UTIL.http_header_close;
FOR x in (select id id from workers where col00 like '1000')
LOOP
select col01,col02,col03 into id, worker, address from un_web_prenosi where id = x.id;
htp.p(id||worker||address);
END LOOP;
apex_application.g_unrecoverable_error:=true;
exception when others then
null;
end;
and I have a problem: if I open the file with Notepad, everything is on one line, but if I open the file in Notepad++ or copy the contents of the file into Word, then I see the contents just as they should be (each record on its own line). Do you know how I can solve this problem?
I solved my problem. I forgot to put chr(13) into htp.p(id||worker||address); so this line must be htp.p(id||worker||address||chr(13));
-
SSIS - Exporting Data into flat files from Oracle Table as batchwise process
Hi All,
Thanks in advance.
I have a large table in an Oracle database with some 3 lakh (300,000) records. I need to fetch 10,000 records per iteration and export them to a flat file. This process should repeat until the whole table has been exported.
Hence, for every iteration a flat file is to be generated with 10,000 records.
Please help with how to proceed in SSIS.
Thanks
Pyarajan.S
Yes, it always helps if your question doesn't specify the actual requirements...
Use the FOR loop container to control the iterations of the data flow. For each run you read 10,000 rows from the table and dump them in a flat file. Either move the flat file, or use an expression on the flat file connection manager to give them dynamic
file names.
30 million rows is also not a problem by the way, it just takes a bit longer.
MCSE SQL Server 2012 - Please mark posts as answered where appropriate. -
Importing previously exported data (from local file)
After having searched to no end I still cannot find the answer to my question....
Our current environment is an SAP and FLM system (forms livecycle management) using livecycle designer for design and all users on at least reader 9.
I would like to know if it is possible for a user to fill in a form and save the entered data (to a local file) so that the following week they can open the latest version of the PDF (requested from the FLM portal to be completed either online (portal) or offline (email)); they would then import the data, make any minor updates and submit. The forms should have reader extensions applied, but as yet I'm unsure of the exact settings (if any), as this happens automatically within FLM. The form shouldn't change in future beyond some additional validation, but this would allow form numbers and versions within FLM to be better maintained, and ensure users do not keep or use outdated forms.
From my understanding there are 2 ways to import data: either folder-level JavaScript or via certifying the form. For our setup, folder-level scripts are a non-starter, as we would have no way to maintain these on every PC (far too many), so that leaves us with certifying forms. From the limited information I have found on this, it appears that the certification may break when importing data; I have very basically tested this and it appears to be true.
So the last thought was: if I can import data programmatically via JavaScript and fill fields in etc., would this still have the same effect of breaking certification?
Appreciate any advice
> Hi Srdjan.
>
> Are you familiar with the MDM Import Manager?
>
> Best regards,
> Nir
Hi
I haven't tried the MDM Import Manager, but I solved it with LSMW. I created a recording for one record entry, then I specified the rest of the data in a text file and passed it to LSMW. It worked (almost) perfectly!
Thanks for the suggestion though, I'll give it a try some other time
Best regards,
S. -
Please can you help an Oracle newcomer?
Below is a script (which I took from the forums) that I used to
run in Lite to export data to a csv file.
As you can see, I am having problems running the same file in 8i: it starts off OK, i.e. asking for the name of the table, but then, instead of completing the full script, it executes a few lines and keeps asking me for the name of the table. I suspect there is something wrong in my environment settings (Comments ON etc.) but I have checked everything with no success.
Any help would be much appreciated as this is a really useful
script which I use the whole time.
Thanks very much for your help.
Regards
Andrew
SQL>
1 SET ECHO OFF
2 SET FEEDBACK OFF
3 SET HEADING OFF
4 SET PAGESIZE 0
5 SET TIMING OFF
6 SET VERIFY OFF
7 ACCEPT name_of_table PROMPT 'Enter the name of the table
you wish to convert: '
8 SPOOL query.sql
9 SELECT 'SPOOL ' || UPPER ('&name_of_table') || '.txt'
10 FROM DUAL;
11 SELECT 'SELECT ' || column_name
12 FROM all_tab_columns
13 WHERE table_name = UPPER ('&name_of_table')
14 AND column_id = 1;
15 SELECT '|| '','' || ' || column_name
16 FROM all_tab_columns
17 WHERE table_name = UPPER ('&name_of_table')
18 AND column_id > 1;
19 SELECT 'FROM ' || UPPER ('&name_of_table') || ';'
20 FROM DUAL;
21 SELECT 'SPOOL OFF'
22 FROM DUAL;
23 SPOOL OFF
24* START query
25 /
Enter value for name_of_table: new_table
old 9: SELECT 'SPOOL ' || UPPER ('&name_of_table') || '.txt'
new 9: SELECT 'SPOOL ' || UPPER ('new_table') || '.txt'
Enter value for name_of_table: new_table
old 13: WHERE table_name = UPPER ('&name_of_table')
new 13: WHERE table_name = UPPER ('new_table')
Enter value for name_of_table: new_table
old 17: WHERE table_name = UPPER ('&name_of_table')
new 17: WHERE table_name = UPPER ('new_table')
Enter value for name_of_table: New_table
old 19: SELECT 'FROM ' || UPPER ('&name_of_table') || ';'
new 19: SELECT 'FROM ' || UPPER ('New_table') || ';'
SET ECHO OFF
ERROR at line 1:
ORA-00922: missing or invalid option
Thanks for your help - problem solved! I was using "Run SQL file" instead of "Start SQL file" - oops! -
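What the script generates (a SPOOL wrapper around a SELECT that concatenates every column with ',') can be sketched directly; a Python stand-in, with the column list passed in rather than read from ALL_TAB_COLUMNS:

```python
def csv_spool_script(table, columns):
    """Build the query.sql text the SQL*Plus script writes for a table."""
    select_expr = " || ',' || ".join(columns)
    return "\n".join([
        "SPOOL {}.txt".format(table.upper()),
        "SELECT " + select_expr,
        "FROM {};".format(table.upper()),
        "SPOOL OFF",
    ])
```

Seeing the generated text laid out this way also makes the script's two-step nature obvious: first build query.sql, then START it, which is why "Run SQL file" versus "Start SQL file" mattered.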
I am curious what the "best practice" is for exporting data programmatically from SQL Server to Excel. Is it best to do it straight from SQL Server, or should I do it within my C# code? My program is going to pull the data, put it in the Excel file, then email the file. So I could write an SP that gets the data and puts it in the file, then have the C# code run the SP and email the file; or I could have the code do everything: pull the data, export it & email it.
If it is considered better to have the SP do it, why, and what is the best way? ROWSET functions?
So I could write an SP that gets the data and puts it in the file, then have the C# code run the SP and email the file; or I could have the code do everything: pull the data, export it & email it.
A very important thing to consider is where the Excel file should reside and who should do the mailing: the client or the server?
I would definitely recommend having a C# program get the data and write to the Excel book. Writing to Excel books from SQL Server is possible, but there are a lot more problems with permissions, OLE DB providers etc. Plus, neither writing to Excel books
nor sending mail is part of the core business for SQL Server which is to work with data.
Erland Sommarskog, SQL Server MVP, [email protected] -
Help to export data into a file
Hi,
I have more than 200 columns in a table. The client wants to export the data into a file with the data from all 215 columns. When I spool "select * from tablename" to a file, each record is spread over multiple lines, but the client wants each record on a single line. I tried to_nclob(columnname1)...to_nclob(column215); this way each record is a single line, but it takes too long (the table has 10M records and the explain plan shows 3 days to export the file). Could you please help me with how to achieve this type of functionality?
Thanks in advance
Hardly surprising it's taking ages to export 10 million rows, if you're doing it through SQL*Plus, as that involves transporting all the data over the network to the client, which is then rendering all that data in the display (which can be slow to scroll, depending on the size of the window) etc.
Better to get the database to access the file system directly, using UTL_FILE (though this does mean that the file is produced on the server) and then transfer the resultant file to where it needs to go.
Example of a generic template to use as a starting point...
As sys user:
CREATE OR REPLACE DIRECTORY TEST_DIR AS '\tmp\myfiles';
GRANT READ, WRITE ON DIRECTORY TEST_DIR TO myuser;
As myuser:
CREATE OR REPLACE PROCEDURE run_query(p_sql IN VARCHAR2
,p_dir IN VARCHAR2
,p_header_file IN VARCHAR2
,p_data_file IN VARCHAR2 := NULL) IS
v_finaltxt VARCHAR2(4000);
v_v_val VARCHAR2(4000);
v_n_val NUMBER;
v_d_val DATE;
v_ret NUMBER;
c NUMBER;
d NUMBER;
col_cnt INTEGER;
f BOOLEAN;
rec_tab DBMS_SQL.DESC_TAB;
col_num NUMBER;
v_fh UTL_FILE.FILE_TYPE;
v_samefile BOOLEAN := (NVL(p_data_file,p_header_file) = p_header_file);
BEGIN
c := DBMS_SQL.OPEN_CURSOR;
DBMS_SQL.PARSE(c, p_sql, DBMS_SQL.NATIVE);
d := DBMS_SQL.EXECUTE(c);
DBMS_SQL.DESCRIBE_COLUMNS(c, col_cnt, rec_tab);
FOR j in 1..col_cnt
LOOP
CASE rec_tab(j).col_type
WHEN 1 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,2000);
WHEN 2 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_n_val);
WHEN 12 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_d_val);
ELSE
DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,2000);
END CASE;
END LOOP;
-- This part outputs the HEADER
v_fh := UTL_FILE.FOPEN(upper(p_dir),p_header_file,'w',32767);
FOR j in 1..col_cnt
LOOP
v_finaltxt := ltrim(v_finaltxt||','||lower(rec_tab(j).col_name),',');
END LOOP;
-- DBMS_OUTPUT.PUT_LINE(v_finaltxt);
UTL_FILE.PUT_LINE(v_fh, v_finaltxt);
IF NOT v_samefile THEN
UTL_FILE.FCLOSE(v_fh);
END IF;
-- This part outputs the DATA
IF NOT v_samefile THEN
v_fh := UTL_FILE.FOPEN(upper(p_dir),p_data_file,'w',32767);
END IF;
LOOP
v_ret := DBMS_SQL.FETCH_ROWS(c);
EXIT WHEN v_ret = 0;
v_finaltxt := NULL;
FOR j in 1..col_cnt
LOOP
CASE rec_tab(j).col_type
WHEN 1 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_v_val);
v_finaltxt := ltrim(v_finaltxt||',"'||v_v_val||'"',',');
WHEN 2 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_n_val);
v_finaltxt := ltrim(v_finaltxt||','||v_n_val,',');
WHEN 12 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_d_val);
v_finaltxt := ltrim(v_finaltxt||','||to_char(v_d_val,'DD/MM/YYYY HH24:MI:SS'),',');
ELSE
DBMS_SQL.COLUMN_VALUE(c,j,v_v_val);
v_finaltxt := ltrim(v_finaltxt||',"'||v_v_val||'"',',');
END CASE;
END LOOP;
-- DBMS_OUTPUT.PUT_LINE(v_finaltxt);
UTL_FILE.PUT_LINE(v_fh, v_finaltxt);
END LOOP;
UTL_FILE.FCLOSE(v_fh);
DBMS_SQL.CLOSE_CURSOR(c);
END;
This allows the header row and the data to be written to separate files if required.
e.g.
SQL> exec run_query('select * from emp','TEST_DIR','output.csv');
PL/SQL procedure successfully completed.
Output.csv file contains:
empno,ename,job,mgr,hiredate,sal,comm,deptno
7369,"SMITH","CLERK",7902,17/12/1980 00:00:00,800,,20
7499,"ALLEN","SALESMAN",7698,20/02/1981 00:00:00,1600,300,30
7521,"WARD","SALESMAN",7698,22/02/1981 00:00:00,1250,500,30
7566,"JONES","MANAGER",7839,02/04/1981 00:00:00,2975,,20
7654,"MARTIN","SALESMAN",7698,28/09/1981 00:00:00,1250,1400,30
7698,"BLAKE","MANAGER",7839,01/05/1981 00:00:00,2850,,30
7782,"CLARK","MANAGER",7839,09/06/1981 00:00:00,2450,,10
7788,"SCOTT","ANALYST",7566,19/04/1987 00:00:00,3000,,20
7839,"KING","PRESIDENT",,17/11/1981 00:00:00,5000,,10
7844,"TURNER","SALESMAN",7698,08/09/1981 00:00:00,1500,0,30
7876,"ADAMS","CLERK",7788,23/05/1987 00:00:00,1100,,20
7900,"JAMES","CLERK",7698,03/12/1981 00:00:00,950,,30
7902,"FORD","ANALYST",7566,03/12/1981 00:00:00,3000,,20
7934,"MILLER","CLERK",7782,23/01/1982 00:00:00,1300,,10
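The formatting rules the procedure applies (lower-case header, quoted strings, DD/MM/YYYY HH24:MI:SS dates, bare numbers, empty fields for NULL) can be restated compactly; a Python sketch over an in-memory row set instead of a DBMS_SQL cursor:

```python
import datetime

def rows_to_csv(colnames, rows):
    """Header line plus one line per row, using the same per-type
    rules as the PL/SQL procedure above."""
    def cell(v):
        if v is None:
            return ""                                   # NULL -> empty field
        if isinstance(v, datetime.datetime):
            return v.strftime("%d/%m/%Y %H:%M:%S")      # DATE formatting
        if isinstance(v, str):
            return '"' + v + '"'                        # strings are quoted
        return str(v)                                   # numbers stay bare
    lines = [",".join(c.lower() for c in colnames)]
    lines += [",".join(cell(v) for v in row) for row in rows]
    return "\n".join(lines)
```

Having the rules in one small function makes it easy to check them against the sample output above before adapting the PL/SQL to other datatypes.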
The procedure allows the header and data to go to separate files if required. Just specifying the "header" filename will put the header and data in the one file.
Adapt to output different datatypes and styles as required. -
Export data in Excel file - Debug and Break Point
In the program RFUMSV00 the data is displayed on the screen with the function REUSE_ALV_LIST_DISPLAY.
With "Ctrl + Shift + F9" I exported the list of data to an Excel file.
How can I see, with the debugger, where program RFUMSV00 passes the data to the Excel file? Can you indicate the point in the program where I need to set the break-point?
Thanks,
Serena
Hi,
This download functionality has nothing to do with program RFUMSV00. It is actually processed by the REUSE_ALV_LIST_DISPLAY fm, which calls fm LIST_COMMAND, which finally calls fm LIST_DOWNLOAD, where you will actually find the downloading part (in fact in another fm, DOWNLOAD_LIST)...
To summarize: REUSE_ALV_LIST_DISPLAY -> LIST_COMMAND -> LIST_DOWNLOAD -> DOWNLOAD_LIST
so I would put the break-point in the last one..
Kr,
Manu.
Correction: I made a mistake; the break-point should be set in LIST_DOWNLOAD... where you will find the call to fm LIST_CONVERT_TO_DAT, which is responsible for the download...
Edited by: Manu D'Haeyer on Dec 9, 2011 1:46 PM -
Hi,
Is it possible to export data from any database to flat files using SQL Developer, and also to automate the process with some sort of scheduler or script?
Thanks,
Nitesh
If the database is Oracle, you can export the contents of a table to a flat file by right-clicking on the table instance, selecting the "Export Data" menu item, and then selecting the export format.