Help Export TABLE Records into Flat File with INSERTs - error
Hi,
When I try to run this procedure I get this error:
ORA-00932: inconsistent datatypes: expected - got -
Can anybody tell me why?
Thanks
CREATE OR REPLACE PROCEDURE generate_stmt(prm_table_name    IN VARCHAR2,
                                          prm_where_clause  IN VARCHAR2,
                                          prm_output_folder IN VARCHAR2,
                                          prm_output_file   IN VARCHAR2) IS
  TYPE ref_cols IS REF CURSOR;
  mmy_ref_cols         ref_cols;
  mmy_column_name      VARCHAR2(100);
  mmy_column_data_type VARCHAR2(1);
  mmy_col_string       VARCHAR2(32767);
  mmy_query_col_string VARCHAR2(32767);
  v_file_hndl          UTL_FILE.file_type;
BEGIN
  OPEN mmy_ref_cols FOR
    SELECT LOWER(column_name) column_name
      FROM user_tab_columns
     WHERE table_name = UPPER(prm_table_name)
     ORDER BY column_id;
  LOOP
    FETCH mmy_ref_cols INTO mmy_column_name;
    EXIT WHEN mmy_ref_cols%NOTFOUND;
    mmy_col_string       := mmy_col_string || mmy_column_name || ', ';
    mmy_query_col_string := mmy_query_col_string || ' ' || mmy_column_name || ',';
  END LOOP;
  CLOSE mmy_ref_cols;
  v_file_hndl := UTL_FILE.FOPEN('TEST', 'TESST.TXT', 'W');
  mmy_col_string := 'INSERT INTO ' || LOWER(prm_table_name) || ' (' ||
                    CHR(10) || CHR(9) || CHR(9) || mmy_col_string;
  mmy_col_string := RTRIM(mmy_col_string, ', ');
  mmy_col_string := mmy_col_string || ')' || CHR(10) || 'VALUES ( ' || CHR(9);
  mmy_query_col_string := RTRIM(mmy_query_col_string,
                                ' || ' || '''' || ',' || '''' || ' || ');
  DBMS_OUTPUT.put_line(mmy_column_name);
  OPEN mmy_ref_cols FOR 'SELECT ' || mmy_query_col_string ||
                        ' FROM ' || prm_table_name ||
                        ' ' || prm_where_clause;
  LOOP
    FETCH mmy_ref_cols INTO mmy_query_col_string;
    EXIT WHEN mmy_ref_cols%NOTFOUND;
    mmy_query_col_string := mmy_query_col_string || ');';
    UTL_FILE.PUT_LINE(v_file_hndl, mmy_col_string);
    UTL_FILE.PUT_LINE(v_file_hndl, mmy_query_col_string);
  END LOOP;
END;
Buddy,
Try this..
CREATE OR REPLACE PROCEDURE generate_stmt
  (prm_table_name    IN VARCHAR2,
   prm_where_clause  IN VARCHAR2,
   prm_output_folder IN VARCHAR2,
   prm_output_file   IN VARCHAR2) IS
  TYPE ref_cols IS REF CURSOR;
  mmy_ref_cols         ref_cols;
  mmy_column_name      VARCHAR2(100);
  mmy_column_data_type VARCHAR2(1);
  mmy_col_string       VARCHAR2(32767);
  mmy_query_col_string VARCHAR2(32767);
  v_file_hndl          UTL_FILE.file_type;
BEGIN
  OPEN mmy_ref_cols FOR
    SELECT LOWER(column_name) column_name
      FROM user_tab_columns
     WHERE table_name = UPPER(prm_table_name)
     ORDER BY column_id;
  LOOP
    FETCH mmy_ref_cols INTO mmy_column_name;
    EXIT WHEN mmy_ref_cols%NOTFOUND;
    mmy_col_string       := mmy_col_string || mmy_column_name || ', ';
    mmy_query_col_string := mmy_query_col_string || ' ' || mmy_column_name || ',';
  END LOOP;
  CLOSE mmy_ref_cols;
  mmy_col_string := 'INSERT INTO ' || LOWER(prm_table_name) || ' (' ||
                    CHR(10) || CHR(9) || CHR(9) || mmy_col_string;
  mmy_col_string := RTRIM(mmy_col_string, ', ');
  mmy_col_string := mmy_col_string || ')' || CHR(10) || 'VALUES ( ' || CHR(9);
  mmy_query_col_string := RTRIM(mmy_query_col_string,
                                ' || ' || '''' || ',' || '''' || ' || ');
  v_file_hndl := UTL_FILE.FOPEN('TEST', 'TESST.TXT', 'W');
  OPEN mmy_ref_cols FOR 'SELECT ' || mmy_query_col_string ||
                        ' FROM ' || prm_table_name ||
                        ' ' || prm_where_clause;
  LOOP
    FETCH mmy_ref_cols INTO mmy_query_col_string;
    EXIT WHEN mmy_ref_cols%NOTFOUND;
    mmy_query_col_string := mmy_query_col_string || ');';
    UTL_FILE.PUT_LINE(v_file_hndl, mmy_col_string);
    UTL_FILE.PUT_LINE(v_file_hndl, mmy_query_col_string);
  END LOOP;
  UTL_FILE.FCLOSE(v_file_hndl);
END;
This would work only for a table with one and only one column.
Look at the line below:
FETCH mmy_ref_cols INTO mmy_query_col_string;
mmy_query_col_string has been declared as a string, so it can hold only a single value. That's why, when you try this block on a table with more than one column, mmy_query_col_string would have to hold a table-row type of data, which it cannot.
Good luck!!
Bhagat
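Bhagat's point can be restated outside PL/SQL: the usual fix is to have the dynamic SELECT itself concatenate every column into one string per row, so a single scalar fetch variable suffices regardless of column count. A minimal Python sketch of that query-building step (table and column names are hypothetical, and quoting of character columns is omitted):

```python
def build_export_query(table_name, columns):
    # Concatenate every column into ONE string expression per row, so the
    # dynamic cursor returns a single value that a lone fetch variable can
    # hold, no matter how many columns the table has.
    select_list = " || ',' || ".join(columns)
    return f"SELECT {select_list} FROM {table_name}"

query = build_export_query("emp", ["empno", "ename", "deptno"])
print(query)  # SELECT empno || ',' || ename || ',' || deptno FROM emp
```

In the original procedure that would mean building mmy_query_col_string as a single concatenated expression before the second OPEN ... FOR, rather than as a plain comma-separated column list.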
Similar Messages
-
Insert into Flat File with no data
Hello,
One of my clients is encountering a problem loading data from an Oracle table into a flat file.
We are using an IKM SQL to File,
a staging area different from the target and defined on the source logical schema,
and the options Generate Header and Insert Rows.
The problem is:
at the end of the interface execution, which finishes without error, we see in the file the correctly generated header, which leads me to think there is no permission problem with the file.
But there are no records...
And I don't know why.
The .bad and .error files exist, but they are empty.
The file datastore seems to be correctly declared, with the right delimiters.
We are using the Sunopsis Driver File :
com.sunopsis.jdbc.driver.FileDriver
jdbc:snps:file
because the files are not accepted with the new one...
Note that all our interfaces that read flat files work fine.
But none of the ones that write to a file work.
If somebody has ever had this problem, or just has an idea about it, please let me know.
Cordially,
BM
"Bonjour",
I usually use the snpsoutfile API to unload data into a flat file.
It's not the exact solution to your question, but it's a technical workaround. -
How to add the records of 2 internal table records into one file
hello experts,
My scenario is...
I am retrieving the data for the credit, debit and trailer records of the customer into 3 different internal tables, and finally I have to append all those records to one file: first the debit records, then the credit records, and finally the trailer record. How can I do that? Can anyone give me an idea, please?
It's a bit urgent.
Thanks a lot in anticipation.
SRI
Hello,
Do like this.
" Assume u have three itab.
"Itab1 - debit
"Itab2 - credit
"Itab3 - Credit.
REPORT ZV_TEST_SERVER .
PARAMETERS: P_FILE TYPE STRING. "RLGRAP-FILENAME
CALL FUNCTION 'GUI_DOWNLOAD'
EXPORTING
* BIN_FILESIZE =
FILENAME = P_FILE
* FILETYPE = 'ASC'
APPEND = 'X' " Check here
* WRITE_FIELD_SEPARATOR = ' '
* HEADER = '00'
* TRUNC_TRAILING_BLANKS = ' '
* WRITE_LF = 'X'
* COL_SELECT = ' '
* COL_SELECT_MASK = ' '
* DAT_MODE = ' '
* IMPORTING
* FILELENGTH =
TABLES
DATA_TAB = ITAB1
* EXCEPTIONS
* FILE_WRITE_ERROR = 1
* NO_BATCH = 2
* GUI_REFUSE_FILETRANSFER = 3
* INVALID_TYPE = 4
* NO_AUTHORITY = 5
* UNKNOWN_ERROR = 6
* HEADER_NOT_ALLOWED = 7
* SEPARATOR_NOT_ALLOWED = 8
* FILESIZE_NOT_ALLOWED = 9
* HEADER_TOO_LONG = 10
* DP_ERROR_CREATE = 11
* DP_ERROR_SEND = 12
* DP_ERROR_WRITE = 13
* UNKNOWN_DP_ERROR = 14
* ACCESS_DENIED = 15
* DP_OUT_OF_MEMORY = 16
* DISK_FULL = 17
* DP_TIMEOUT = 18
* FILE_NOT_FOUND = 19
* DATAPROVIDER_EXCEPTION = 20
* CONTROL_FLUSH_ERROR = 21
* OTHERS = 22
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
CALL FUNCTION 'GUI_DOWNLOAD'
EXPORTING
* BIN_FILESIZE =
FILENAME = P_FILE
* FILETYPE = 'ASC'
APPEND = 'X' " Check here
* WRITE_FIELD_SEPARATOR = ' '
* HEADER = '00'
* TRUNC_TRAILING_BLANKS = ' '
* WRITE_LF = 'X'
* COL_SELECT = ' '
* COL_SELECT_MASK = ' '
* DAT_MODE = ' '
* IMPORTING
* FILELENGTH =
TABLES
DATA_TAB = ITAB2 " Check here
* EXCEPTIONS
* FILE_WRITE_ERROR = 1
* NO_BATCH = 2
* GUI_REFUSE_FILETRANSFER = 3
* INVALID_TYPE = 4
* NO_AUTHORITY = 5
* UNKNOWN_ERROR = 6
* HEADER_NOT_ALLOWED = 7
* SEPARATOR_NOT_ALLOWED = 8
* FILESIZE_NOT_ALLOWED = 9
* HEADER_TOO_LONG = 10
* DP_ERROR_CREATE = 11
* DP_ERROR_SEND = 12
* DP_ERROR_WRITE = 13
* UNKNOWN_DP_ERROR = 14
* ACCESS_DENIED = 15
* DP_OUT_OF_MEMORY = 16
* DISK_FULL = 17
* DP_TIMEOUT = 18
* FILE_NOT_FOUND = 19
* DATAPROVIDER_EXCEPTION = 20
* CONTROL_FLUSH_ERROR = 21
* OTHERS = 22
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
CALL FUNCTION 'GUI_DOWNLOAD'
EXPORTING
* BIN_FILESIZE =
FILENAME = P_FILE
* FILETYPE = 'ASC'
APPEND = 'X' " Check here
* WRITE_FIELD_SEPARATOR = ' '
* HEADER = '00'
* TRUNC_TRAILING_BLANKS = ' '
* WRITE_LF = 'X'
* COL_SELECT = ' '
* COL_SELECT_MASK = ' '
* DAT_MODE = ' '
* IMPORTING
* FILELENGTH =
TABLES
DATA_TAB = ITAB3 " Check here
* EXCEPTIONS
* FILE_WRITE_ERROR = 1
* NO_BATCH = 2
* GUI_REFUSE_FILETRANSFER = 3
* INVALID_TYPE = 4
* NO_AUTHORITY = 5
* UNKNOWN_ERROR = 6
* HEADER_NOT_ALLOWED = 7
* SEPARATOR_NOT_ALLOWED = 8
* FILESIZE_NOT_ALLOWED = 9
* HEADER_TOO_LONG = 10
* DP_ERROR_CREATE = 11
* DP_ERROR_SEND = 12
* DP_ERROR_WRITE = 13
* UNKNOWN_DP_ERROR = 14
* ACCESS_DENIED = 15
* DP_OUT_OF_MEMORY = 16
* DISK_FULL = 17
* DP_TIMEOUT = 18
* FILE_NOT_FOUND = 19
* DATAPROVIDER_EXCEPTION = 20
* CONTROL_FLUSH_ERROR = 21
* OTHERS = 22
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
* WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
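The effect of APPEND = 'X' above (three successive GUI_DOWNLOAD calls landing in one file) can be sketched language-neutrally. A Python stand-in with made-up record contents:

```python
import os
import tempfile

# Hypothetical stand-ins for the debit, credit and trailer internal tables.
itab1 = ["D|100", "D|200"]   # debit records
itab2 = ["C|150"]            # credit records
itab3 = ["T|3"]              # trailer record

path = os.path.join(tempfile.mkdtemp(), "out.txt")
for itab in (itab1, itab2, itab3):
    # Opening in append mode plays the role of GUI_DOWNLOAD's APPEND = 'X':
    # each call adds its records after whatever is already in the file.
    with open(path, "a", encoding="utf-8") as f:
        f.writelines(line + "\n" for line in itab)

lines = open(path, encoding="utf-8").read().splitlines()
```

The order of the three calls is what puts debit before credit before trailer in the file.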
If useful reward.
Vasanth -
Exporting table data into text files
I got a request to export the data from about 85 tables into 85 text files. I assume that they will want a header row and some delimiter character. This is a process that must be run every night by a SQL job.
Obviously I want it to be flexible: tables could be added and removed, and columns could be added and removed. There will probably be a control table that has the list of tables to export.
Looked at bcp, but it seems to be command line only?
SSIS package?
What other options are there?
Wanted to post my solution. I ended up using bcp in a PowerShell script. I was unable to use bcp from SQL because it requires xp_cmdshell, and that is not allowed at many of our client sites. So I wrote a PowerShell script that executes bcp. The PS script loops through a table that contains a row for each table/view to export, along with many values and switches for the export: export path, include column headers, enclose each field in double quotes, double up embedded double quotes, how to format dates, what to return for NULL integers - basically the stuff that is not flexible in bcp. To get this flexibility I created a SQL proc that takes these values as input and creates a view. Then the PS bcp does a SELECT on the view. Many of my tables are very wide, and bcp has a limit of 4000/8000 characters. Some of my SELECT statements ended up being over 35k in length, so using a view got around this size limitation.
Anyway, the view creation and bcp download are really fast. It can download about 4 GB of data in 20 minutes - faster than 7z can zip it.
Below is the SQL proc that formats the SELECT statement and creates the view for bcp (or some other utility like SQLCMD or Invoke-Sqlcmd), or for a plain SQL query via "SELECT * FROM v_ExportTable". The proc can be used from SQL or PS, or anything that can call a SQL proc and then read a SQL view.
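A rough sketch of the driver loop described above, transposed to Python (server, database, paths and delimiter are all hypothetical; the flags -S, -T, -c and -t are real bcp switches):

```python
def bcp_command(server, database, out_path, delimiter=","):
    # Build (not run) a bcp invocation that exports the helper view the
    # stored procedure creates. In the real script this would follow an
    # EXEC ExportTablesCreateView '<table>', 1, 1, 1, '0', 121 call.
    return [
        "bcp", f"{database}.dbo.v_ExportTable", "out", out_path,
        "-S", server,        # server name
        "-T",                # trusted (Windows) connection
        "-c",                # character mode
        "-t", delimiter,     # field terminator
    ]

cmd = bcp_command("MYSERVER", "MyDb", r"C:\exports\custname.csv")
# A driver loop would call subprocess.run(cmd, check=True) once per
# control-table row, recreating v_ExportTable before each export.
```

Because the view always has the same name, the loop is strictly sequential: create the view, export it, move to the next control-table row.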
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[ExportTablesCreateView]') AND type in (N'P', N'PC'))
DROP PROCEDURE [dbo].[ExportTablesCreateView]
GO
CREATE PROCEDURE [dbo].[ExportTablesCreateView]
/*
ExportTablesCreateView
Description:
Read in a table name or view name with parameters and create a view v_ExportTable of the
data. The view will typically be read by bcp to download SQL table data to flat files.
bcp does not have options to include column headers, enclose fields in double quotes,
format dates or use '0' for integer NULLs. Also, bcp has a limit of varchar 4000, and
wider tables could not be directly called by bcp. So create v_ExportTable and have
bcp SELECT from v_ExportTable instead of the original table or view.
Parameters:
@pTableName VARCHAR(128) - table or view to create v_ExportTable from
@pColumnHeader INT = 1 - include column headers in the first row
@pDoubleQuoteFields INT = 1 - put double quotes " around all column values including column headers
@pDouble_EmbeddedDoubleQuotes INT = 1 - usually used with @pDoubleQuoteFields = 1. 'ab"c"d becomes 'ab""c""d.
@pNumNULLValue VARCHAR(1) = '0' - NULL number data types will export this value instead of the bcp default of ''
@pDateTimeFormat INT = 121 - DateTime data types will use this format value
Example:
EXEC ExportTablesCreateView 'custname', 1, 1, 1, '0', 121
*/
@pTableName VARCHAR(128),
@pColumnHeader INT = 1,
@pDoubleQuoteFields INT = 1,
@pDouble_EmbeddedDoubleQuotes INT = 1,
@pNumNULLValue VARCHAR(1) = '0',
@pDateTimeFormat INT = 121
AS
BEGIN
DECLARE @columnname varchar(128)
DECLARE @columnsize int
DECLARE @data_type varchar(128)
DECLARE @HeaderRow nvarchar(max)
DECLARE @ColumnSelect nvarchar(max)
DECLARE @SQLSelect nvarchar(max)
DECLARE @SQLCommand nvarchar(max)
DECLARE @ReturnCode INT
DECLARE @Note VARCHAR(500)
DECLARE db_cursor CURSOR FOR
SELECT COLUMN_NAME, ISNULL(Character_maximum_length,0), Data_type
FROM [INFORMATION_SCHEMA].[COLUMNS]
WHERE TABLE_NAME = @pTableName AND TABLE_SCHEMA='dbo'
OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @ColumnName, @ColumnSize, @Data_type
SET @HeaderRow = ''
SET @ColumnSelect = ''
-- Loop through each of the @pTableColumns to build the SELECT Statement
WHILE @@FETCH_STATUS = 0
BEGIN
BEGIN TRY
-- Put double quotes around each field - example "MARIA","SHARAPOVA"
IF @pDoubleQuoteFields = 1
BEGIN
-- Include column headers in the first row - example "FirstName","LastName"
IF @pColumnHeader = 1
SET @HeaderRow = @HeaderRow + '''"' + @ColumnName + '"'' as ''' + @columnname + ''','
-- Unsupported Export data type returns "" - example "",
IF @Data_Type in ('image', 'varbinary', 'binary', 'timestamp', 'cursor', 'hierarchyid', 'sql_variant', 'xml', 'table', 'spatial Types')
SET @ColumnSelect = @ColumnSelect + '''""'' as [' + @ColumnName + '],'
-- Format DateTime data types according to input parameter
ELSE IF @Data_Type in ('datetime', 'smalldatetime', 'datetime2', 'date', 'datetimeoffset')
-- example - CASE when [aaa] IS NULL THEN '""' ELSE QUOTENAME(CONVERT(VARCHAR,[aaa], 121), CHAR(34)) END AS [aaa],
SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN ''""'' ELSE QUOTENAME(CONVERT(VARCHAR,[' + @columnname + '],' + CONVERT(VARCHAR,@pDateTimeFormat) + '), CHAR(34)) END AS [' + @ColumnName + '],'
-- SET Numeric data types with NULL value according to input parameter
ELSE IF @Data_Type in ('bigint', 'numeric', 'bit', 'smallint', 'decimal', 'smallmoney', 'int', 'tinyint', 'money', 'float', 'real')
-- example - CASE when [aaa] IS NULL THEN '"0"' ELSE QUOTENAME([aaa], CHAR(34)) END AS [aaa],
SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN ''"' + @pNumNULLValue + '"'' ELSE QUOTENAME([' + @columnname + '], CHAR(34)) END AS [' + @ColumnName + '],'
ELSE
-- Double embedded double quotes - example "abc"d"ed" to "abc""d""ed". Only applicable for character data types.
IF @pDouble_EmbeddedDoubleQuotes = 1
BEGIN
-- example - CASE when [aaa] IS NULL THEN '""' ELSE '"' + REPLACE([aaa],'"','""') + '"' END AS [aaa],
SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @ColumnName + '] IS NULL THEN ''""'' ELSE ''"'' + REPLACE([' + @ColumnName + '],''"'',''""'') + ''"'' END AS [' + @ColumnName + '],'
END
-- DO NOT PUT Double embedded double quotes - example "abc"d"ed" unchanged to "abc"d"ed"
ELSE
BEGIN
-- example - CASE when [aaa] IS NULL THEN '""' ELSE '"' + [aaa] + '"' END AS [aaa],
SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @ColumnName + '] IS NULL THEN ''""'' ELSE ''"'' + [' + @ColumnName + '] + ''"'' END AS [' + @ColumnName + '],'
END
END
-- DO NOT PUT double quotes around each field - example MARIA,SHARAPOVA
ELSE
BEGIN
-- Include column headers in the first row - example "FirstName","LastName"
IF @pColumnHeader = 1
SET @HeaderRow = @HeaderRow + '''' + @ColumnName + ''' as ''' + @columnname + ''','
-- Unsupported Export data type returns '' - example '',
IF @Data_Type in ('image', 'varbinary', 'binary', 'timestamp', 'cursor', 'hierarchyid', 'sql_variant', 'xml', 'table', 'spatial Types')
SET @ColumnSelect = @ColumnSelect + ''''' as [' + @ColumnName + '],'
-- Format DateTime data types according to input parameter
ELSE IF @Data_Type in ('datetime', 'smalldatetime', 'datetime2','date', 'datetimeoffset')
-- example - CASE when [aaa] IS NULL THEN '''' ELSE CONVERT(VARCHAR,[aaa], 121) END AS [aaa],
SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN '''' ELSE CONVERT(VARCHAR,[' + @columnname + '],' + CONVERT(VARCHAR,@pDateTimeFormat) + ') END AS [' + @ColumnName + '],'
-- SET Numeric data types with NULL value according to input parameter
ELSE IF @Data_Type in ('bigint', 'numeric', 'bit', 'smallint', 'decimal', 'smallmoney', 'int', 'tinyint', 'money', 'float', 'real')
-- example - CASE when [aaa] IS NULL THEN '"0"' ELSE CONVERT(VARCHAR, [aaa]) END AS [aaa],
SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN ''' + @pNumNULLValue + ''' ELSE CONVERT(VARCHAR,[' + @columnname + ']) END AS [' + @ColumnName + '],'
ELSE
BEGIN
-- Double embedded double quotes - example "abc"d"ed" to "abc""d""ed". Only applicable for character data types.
IF @pDouble_EmbeddedDoubleQuotes = 1
-- example - CASE when [aaa] IS NULL THEN '' ELSE CONVERT(VARCHAR,REPLACE([aaa],'"','""')) END AS [aaa],
SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN '''' ELSE CONVERT(VARCHAR,REPLACE([' + @columnname + '],''"'',''""'')) END AS [' + @ColumnName + '],'
ELSE
-- example - CASE when [aaa] IS NULL THEN '' ELSE CONVERT(VARCHAR,[aaa]) END AS [aaa],
SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN '''' ELSE CONVERT(VARCHAR,[' + @columnname + ']) END AS [' + @ColumnName + '],'
END
END
FETCH NEXT FROM db_cursor INTO @ColumnName, @ColumnSize, @Data_Type
END TRY
BEGIN CATCH
RETURN (1)
END CATCH
END
CLOSE db_cursor
DEALLOCATE db_cursor
BEGIN TRY
-- remove last comma
IF @pColumnHeader = 1
SET @HeaderRow = SUBSTRING(@HeaderRow , 1, LEN(@HeaderRow ) - 1)
SET @ColumnSelect = SUBSTRING(@ColumnSelect, 1, LEN(@ColumnSelect) - 1)
-- Put on the finishing touches on the SELECT
IF @pColumnHeader = 1
SET @SQLSelect = 'SELECT ' + @HeaderRow + ' UNION ALL ' +
'SELECT ' + @ColumnSelect + ' FROM [' + @pTableName + ']'
ELSE
SET @SQLSelect = 'SELECT ' + @ColumnSelect + ' FROM [' + @pTableName + ']'
---- diagnostics
---- PRINT truncates at 4k or 8k, not sure; my tables have many columns
--PRINT @SQLSelect
--DECLARE @END varchar(max) = RIGHT(@SQLSelect, 3000)
--PRINT @end
--EXECUTE sp_executesql @SQLSelect
-- drop view if exists -- using view because some tables are very wide. one of my tables had a 33k select statement
SET @SQLCommand = '
IF EXISTS (SELECT * FROM SYS.views WHERE name = ''v_ExportTable'')
BEGIN
DROP VIEW v_ExportTable
END'
EXECUTE @ReturnCode = sp_executesql @SQLCommand
IF @returncode = 1
BEGIN
RETURN (1)
END
-- create the view
SET @SQLCommand = '
CREATE VIEW v_ExportTable AS ' + @SQLSelect
-- diagnostics
--print @sqlcommand
EXECUTE @ReturnCode = sp_executesql @SQLCommand
IF @returncode = 1
BEGIN
RETURN (1)
END
END TRY
BEGIN CATCH
RETURN (1)
END CATCH
RETURN (0)
END -- CREATE PROCEDURE [dbo].[ExportTablesCreateView]
GO
-
Unable to read E$ table records into excel file in linux machine
Hi
I am using the below code in an ODI procedure to read E$ table records and store them in an Excel file.
ODI procedure: Technology = Java BeanShell, Command on Target. I wrote the code below and placed it in the CKM Oracle KM.
<@
String OS = System.getProperty("os.name").toLowerCase();
String v_path="";
if((OS.indexOf("win") >= 0))
v_path="D:\\Unload_Dir\\<%=snpRef.getSession("SESS_NO")%>.xlsx";
else if (OS.indexOf("mac") >= 0)
v_path="path details";
else if (OS.indexOf("nix") >= 0 || OS.indexOf("nux") >= 0 || OS.indexOf("aix") > 0 )
v_path="/odi_a/oracle/Middleware/logs/wcds/odi_logs/<%=snpRef.getSession("SESS_NO")%>.xlsx";
else if (OS.indexOf("sunos") >= 0)
v_path="soliaris path";
@>
OdiSqlUnload "-FILE=<@=v_path@>" "-DRIVER=<%=odiRef.getInfo("DEST_JAVA_DRIVER")%>" "-URL=<%=odiRef.getInfo("DEST_JAVA_URL")%>" "-USER=<%=odiRef.getInfo("DEST_USER_NAME")%>" "-PASS=<%=odiRef.getInfo("DEST_ENCODED_PASS")%>" "-FILE_FORMAT=VARIABLE" "-ROW_SEP=\r\n" "-DATE_FORMAT=yyyy/MM/dd HH:mm:ss" "-CHARSET_ENCODING=ISO8859_1" "-XML_CHARSET_ENCODING=ISO-8859-1"
select * from <%=odiRef.getTable("L","ERR_NAME", "W")%>
But it is not reading the data into the .xlsx file.
Please help me; it is very urgent.
Can I use below code
String os = "";
if (System.getProperty("os.name").toLowerCase().indexOf("windows") > -1) {
os = "windows";
} else if (System.getProperty("os.name").toLowerCase().indexOf("linux") > -1) {
os = "linux";
} else if (System.getProperty("os.name").toLowerCase().indexOf("mac") > -1) {
os = "mac";
}
This is high priority, please help me urgently.
Regards,
Phanikanth
Edited by: Phanikanth on Feb 28, 2013 5:43 AM
Edited by: Phanikanth on Feb 28, 2013 6:00 AM
Edited by: Phanikanth on Feb 28, 2013 7:42 AM
Hi,
can you describe what happens when you run the ODI procedure described below:
- Does the procedure fail with an error? If yes, which error (full details)?
- Does the procedure pass, but no xlsx file is created?
- Does the procedure pass and an xlsx file is created, but Excel can't read it? If yes, what is the structure of the xlsx file when opened in an editor?
What I can see from your code below is that you have chosen -FILE_FORMAT=VARIABLE, but the XLSX format is supposed to be XML.
Regards,
Alex -
Invalid data format on EXPORTING table to SQL-FLAT FILE (Insert type)
Hi there!
Writing from Slovenia, Europe.
Working with Oracle9i (standard version) on Windows XP with SQL Developer 1.0.0.015.
First: SQL Developer is a good tool, with some minor errors.
1.) Declare and Insert data EXAMPLE
drop table tst_date;
create table tst_date (fld_date date);
insert into tst_date values (sysdate);
insert into tst_date values (sysdate);
2.) Retrieving the date with SQL*Plus
SQL> select to_char(fld_date,'DD.MM.YYYY HH24:MI:SS') FROM TST_DATE;
23.10.2006 11:25:23
23.10.2006 11:25:25
As you see, THE TIME DATA IS CORRECT.
When I EXPORT the data TO the SQL insert type, I get this result IN the TST_DATE.SQL file:
-- INSERTING into TST_DATE
Insert into "TST_DATE" ("FLD_DATE") values (to_date('2006-10-23','DD.MM.RR'));
Insert into "TST_DATE" ("FLD_DATE") values (to_date('2006-10-23','DD.MM.RR'));
As you see, I lost the TIME DATA.
QUESTION!
HOW CAN I SET THE PROPER DATE FORMAT IN SQL DEVELOPER BEFORE I EXPORT DATA TO A FLAT FILE?
Best regards, Iztok from SLOVENIA
Message was edited by: DEKUSA
"A DATE-Field is a DATE-Field and not a DATE-TIME-Field. The export tool identifies a DATE-Field and exports the data into date format."
This is not true. Oracle DATE fields include a time element.
To the original poster: I believe this is a bug in the current version.
See this thread for possible workarounds: Bad Export format --- BUG ???
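Until the bug is fixed, the workaround amounts to exporting a full date-time literal yourself. A sketch of the INSERT the export should have produced (Python only builds the SQL text here; the TO_DATE mask inside it is Oracle's own):

```python
from datetime import datetime

def insert_stmt(value):
    # Keep the HH24:MI:SS part of the mask so the time element of the
    # Oracle DATE survives the export instead of being truncated.
    literal = value.strftime("%Y-%m-%d %H:%M:%S")
    return ('Insert into "TST_DATE" ("FLD_DATE") values '
            f"(to_date('{literal}','YYYY-MM-DD HH24:MI:SS'));")

stmt = insert_stmt(datetime(2006, 10, 23, 11, 25, 23))
print(stmt)
```

Compare with the exported statements above, whose 'DD.MM.RR' mask carries no time component at all.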
Message was edited by: smitjb -
Export SQL View to Flat File with UTF-8 Encoding
I've setup a package in SSIS to export a SQL view to a flat file and it's working fine. I now need to make that flat file UTF-8 encoded. The package executes but still shows the files as ANSI encoded.
My package consists of a Source (SQL View) -> Derived Column (casts the fields to DT_WSTR) -> Destination Flat File (Set to output UTF-8 file).
I don't get any errors to help me troubleshoot further. I'm running SQL Server 2005 SP2.
Unless there is a Byte-Order-Marker (BOM - hex file prefix: EF BB BF) at the beginning of the file, and unless your data contains non-ASCII characters, I'm unsure there is a technical difference in the files, Paul.
That is, even if the file is "encoded" UTF-8, if your data contains only ASCII values (decimal values 0-127, hex 00-7F), UTF-8 doesn't really serve a purpose over ANSI encoding. And if you're looking for UTF-8 with specifically the BOM included while your data is all standard ASCII, the Flat File Connection Manager can't do that, it seems.
What the flat file connection manager is doing correctly, though, is encoding values that are over decimal 127/hex 7F in UTF-8 when the encoding of the connection manager is set to 65001 (UTF-8).
Example:
Input data built with a script component as a source (code at the bottom of this post) and with only one WSTR output column hooked to a flat file destination component:
a string containing only decimal value 225 (the character á)
Encoding set to ANSI 1252 looks like:
E1 0D 0A (which is the ANSI encoding of the decimal character value 225 (E1) and a CR-LF (0D 0A)
Encoding set to UTF-8 65001 looks like:
C3 A1 0D 0A (which is the UTF-8 encoding of the decimal character value 225 (C3 A1) and a CR-LF (0D 0A)
Note that for values over decimal 127, UTF-8 takes at least two bytes and up to four for the remaining values available.
So, I'm comfortable now, after sitting down and going through this, that the flat file connection manager is working correctly, unless you need a BOM.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()
        Output0Buffer.AddRow()
        Output0Buffer.col1 = ChrW(225)
    End Sub

End Class
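The byte sequences Phil lists can be verified directly. A small Python check, assuming the single character with code 225 followed by CR-LF, and cp1252 as the ANSI code page:

```python
# One character with decimal code 225, followed by CR-LF, as in the test above.
sample = chr(225) + "\r\n"

ansi_bytes = sample.encode("cp1252")    # ANSI 1252 encoding
utf8_bytes = sample.encode("utf-8")     # UTF-8 (code page 65001)
bom = "\ufeff".encode("utf-8")          # the UTF-8 byte-order mark

print(ansi_bytes.hex(" "))  # e1 0d 0a
print(utf8_bytes.hex(" "))  # c3 a1 0d 0a
print(bom.hex(" "))         # ef bb bf
```

This matches the analysis: the payload bytes differ only for characters above 127, and the BOM is a separate three-byte prefix the connection manager would have to emit on its own.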
Phil -
Function module for move internal table records into MSexcel file
Hi all,
Tell me the function module which is used to move internal table records into an MS Excel file.
Give a sample program, please.
Please search the forum for FM "GUI_DOWNLOAD". You will get tons of threads with sample code.
Also take a look into SAP_CONVERT_TO_XLS_FORMAT
Thanks -
How to export all tracks into audio files with the same duration
Hi everybody.
I am trying to export all the tracks into different audio files.
It's 10 soft. instruments and an audio track. I tried "File > Export > All Tracks as Audio Files", but the result is 11 audio files with different lengths. As the song is 3 minutes and 20 seconds, I would like every single track to be the same length.
How do I do it? In fact, each track is composed not only by played instruments but also by moments of silence.
Thanks!!
Alessandrosambino wrote:
I tried with "File>Export>All Tracks as Audio File but the result is 11 audio files with different lengths. As the song is 3 minutes and 20 seconds I would like every single track to be 3.22 sec long.
The files are different lengths because each instrument does not stop playing at the same moment. It would accomplish nothing other than waste disk space if Logic appended silence to the end of each file in order to make each track the same length. The different lengths you see mean nothing because the absence of appended silence means nothing.
each track is composed not only by played instruments but also by moments of silence.
Naturally. And that silence is accounted for in your exported files, when the silence appears at the beginning of the track, and when it appears during the track. It just isn't appended at the end because that would be pointless. Consider this project, consisting of four tracks:
When I use the command File > Export > All Tracks as Audio Files, Logic will produce four files that look like this:
The files are not equal in length, which is perfectly fine. What's important is that they all start at bar 1, and will align properly when imported into some other program. So all you need is the command you already used: File > Export > All Tracks as Audio Files. In one step, it will produce all the files you need, and it will produce them correctly.
By using Bounce, you would be gaining nothing except extra work, because you would have to do it separately for each file you wanted to produce. With Export > All Tracks as Audio Files, you get all your files with just one command.
iSchwartz wrote:
exporting tracks means that any panning or automation will not be rendered. So if you want the engineer to put up all of your tracks at unity gain (all faders at zero) and have your tracks & stems reproduce the mix you're hearing at your studio, use bounce, not export.
If you want to have your tracks & stems reproduce the mix you're hearing at your studio, there's no need to go to the extra effort of using Bounce instead of Export. You only need to enable (in the Export dialog) the checkbox "Include Volume/Pan Automation." This is explained on p. 1022. When you do this, any panning or automation will indeed be rendered. -
Export query results to flat file with dynamic filename
Hi
Can anybody point me to how to dynamically export query result sets to, for example, a txt file using process flows in OWB?
Let's say I have a simple select query:
select * from table1 where daterange >= sysdate -1 and daterange < sysdate
so the query results will be different every day because the date range will be different. I would also like to name the txt file dynamically,
e.g. results_20090601.txt, results_20090602.txt, results_20090603.txt.
I can't see any activity in the process editor to enter a custom SQL statement, like there is in MSSQL 2000 or 2005.
Thanks in advance.
You can call existing procedures from a process flow; the procedure can create the file with whatever name you desire. OWB maps with a file as target can also create a file with a dynamic name defined by an expression (see here).
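The dated filename itself is the easy part: inside the called procedure it is just the current date formatted as YYYYMMDD. As a sketch of that naming logic (Python stand-in):

```python
from datetime import date

def result_filename(d):
    # results_YYYYMMDD.txt, e.g. results_20090601.txt for 1 June 2009.
    return f"results_{d.strftime('%Y%m%d')}.txt"

name = result_filename(date(2009, 6, 1))
print(name)  # results_20090601.txt
```

In PL/SQL the equivalent expression would be 'results_' || TO_CHAR(SYSDATE, 'YYYYMMDD') || '.txt', passed to whatever writes the file.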
Cheers
David -
How to give path in plsql while exporting table data into .csv file
Hi,
I have code like this:
PROCEDURE dump_table_to_csv (
   p_tname    IN VARCHAR2,
   p_dir      IN VARCHAR2,
   p_filename IN VARCHAR2)
IS
l_output UTL_FILE.file_type;
l_thecursor INTEGER DEFAULT DBMS_SQL.open_cursor;
l_columnvalue VARCHAR2 (4000);
l_status INTEGER;
l_query VARCHAR2 (1000) DEFAULT 'select * from ' || p_tname;
l_colcnt NUMBER := 0;
l_separator VARCHAR2 (1);
l_desctbl DBMS_SQL.desc_tab;
BEGIN
l_output := UTL_FILE.fopen (p_dir, p_filename, 'w');
EXECUTE IMMEDIATE 'alter session set nls_date_format=''dd-mon-yyyy hh24:mi:ss''';
DBMS_SQL.parse (l_thecursor, l_query, DBMS_SQL.native);
DBMS_SQL.describe_columns (l_thecursor, l_colcnt, l_desctbl);
FOR i IN 1 .. l_colcnt
LOOP
UTL_FILE.put (l_output,
              l_separator || '"' || l_desctbl (i).col_name || '"');
DBMS_SQL.define_column (l_thecursor, i, l_columnvalue, 4000);
l_separator := ',';
END LOOP;
UTL_FILE.new_line (l_output);
l_status := DBMS_SQL.EXECUTE (l_thecursor);
WHILE (DBMS_SQL.fetch_rows (l_thecursor) > 0)
LOOP
l_separator := '';
FOR i IN 1 .. l_colcnt
LOOP
DBMS_SQL.column_value (l_thecursor, i, l_columnvalue);
UTL_FILE.put (l_output, l_separator || l_columnvalue);
l_separator := ',';
END LOOP;
UTL_FILE.new_line (l_output);
END LOOP;
DBMS_SQL.close_cursor (l_thecursor);
UTL_FILE.fclose (l_output);
EXECUTE IMMEDIATE 'alter session set nls_date_format=''dd-MON-yy'' ';
EXCEPTION
WHEN OTHERS
THEN
EXECUTE IMMEDIATE 'alter session set nls_date_format=''dd-MON-yy'' ';
RAISE;
END;
I am getting an error like this:
SQL> exec dump_table_to_csv('deptair','c:/csv','aa.deptair');
BEGIN dump_table_to_csv('deptair','c:/csv','aa.deptair'); END;
ERROR at line 1:
ORA-00604: error occurred at recursive SQL level 1
ORA-06502: PL/SQL: numeric or value error: character string buffer too small
ORA-06512: at line 8
ORA-29280: invalid directory path
ORA-06512: at "SCOTT.DUMP_TABLE_TO_CSV", line 58
ORA-06512: at line 1
Please help me out.
Thanks,
Vicky
Look at your other thread; the answer is already there.
-
How to create flat file with fixed lenght records
I need help exporting an Oracle table to a flat file with fixed-length records and without column separators.
The fixed length is the most important requirement.
My table has 50 columns of VARCHAR, DATE and NUMBER types.
Date and number columns may be empty, null or populated.
Thanks a lot for any help.
[email protected]
Hi,
You can use this trick:
SQL>desc t
Name Null? Type
NAME VARCHAR2(20)
SEX VARCHAR2(1)
SQL>SELECT LENGTH(LPAD(NAME,20,' ')||LPAD(SEX,1,' ')), LPAD(NAME,20,' ')||LPAD(SEX,1,' ') FROM T;
LENGTH(LPAD(NAME,20,'')||LPAD(SEX,1,'')) LPAD(NAME,20,'')||LPA
21 aF
21 BM
21 CF
21 DM
4 rows selected.
SQL>SELECT * FROM t;
NAME S
a F
B M
C F
D M
4 rows selected.
Regards -
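The LPAD trick above, restated outside SQL: pad every column to its declared width so each record comes out the same total length, with no separators. A hypothetical two-column sketch (NAME VARCHAR2(20), SEX VARCHAR2(1)); rjust mirrors LPAD, while ljust would mirror the more usual RPAD for text fields:

```python
def fixed_record(name, sex, widths=(20, 1)):
    # rjust pads on the left like LPAD; swap in ljust for RPAD-style
    # left-justified text fields. Either way each record is widths[0] +
    # widths[1] characters, so the file is truly fixed-length.
    return name.rjust(widths[0]) + sex.rjust(widths[1])

rec = fixed_record("a", "F")
print(len(rec))  # 21, the fixed record length
```

For the 50-column case the same idea applies per column, with NULL dates and numbers first converted to empty strings and then padded to their column widths.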
Store oracle table into flat file
Hello,
How do I store an Oracle table into a flat file with a comma (,) delimiter?
Thanx
Murali
set echo off feedback off termout off heading off pages 0
spool path/filename
select cola||','||colb||','||colc ...
from tablea;
spool off -
CREATE OR REPLACE PROCEDURE test IS
cursor cur is
select dname||';'||loc dnlo
from dept;
rec cur%ROWTYPE;
v_returncode number;
BEGIN
open cur;
loop
fetch cur into rec;
exit when cur%notfound;
v_returncode:= write_file(rec.dnlo);
dbms_output.put_line(rec.dnlo);
end loop;
close cur;
END test;
SQL> exec test
ACCOUNTING;NEW YORK
RESEARCH;DALLAS
SALES;CHICAGO
OPERATIONS;BOSTON
SALES;LAS VEGAS
PL/SQL procedure successfully completed.
SQL>
The program wrote 5 records into file.txt (5 calls). Normally I must write
about 2 million records into the file. Is it possible to write the records into
the file with one call, like an array or a ref cursor (as input parameter of the function write_file), instead of calling the function 2 million times in a loop?
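One way to reduce the per-row overhead in the loop above is to fetch in batches with BULK COLLECT. A sketch reusing the poster's write_file function; the LIMIT of 1000 and the VARCHAR2 width are example values:

```sql
CREATE OR REPLACE PROCEDURE test_bulk IS
  CURSOR cur IS
    SELECT dname || ';' || loc AS dnlo FROM dept;
  TYPE t_lines IS TABLE OF VARCHAR2(400);
  v_lines      t_lines;
  v_returncode NUMBER;
BEGIN
  OPEN cur;
  LOOP
    -- Fetch up to 1000 rows per round trip instead of one at a time.
    FETCH cur BULK COLLECT INTO v_lines LIMIT 1000;
    EXIT WHEN v_lines.COUNT = 0;
    FOR i IN 1 .. v_lines.COUNT LOOP
      v_returncode := write_file(v_lines(i));
    END LOOP;
  END LOOP;
  CLOSE cur;
END test_bulk;
```

This cuts the fetch overhead; write_file is still called per row, but the writes could be batched the same way by concatenating lines up to UTL_FILE's 32K line limit before each call.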
Message was edited by:
mad
Couldn't help commenting cos that has got to be one of the best drive/folder name combinations I've seen in a long time. :))
v_FileHandle := UTL_FILE.FOPEN('s:\hit', v_FileName, p_FileModus, C_MAXLINESIZE);
Seriously though, when working with UTL_FILE you should use directory objects to ensure proper security... note...
The UTL_FILE_DIR parameter has been deprecated by Oracle in favour of directory objects because of its security problems.
The correct thing to do is to create a directory object e.g.:
CREATE OR REPLACE DIRECTORY mydir AS 'c:\myfiles';
Note: This does not create the directory on the file system. You have to do that yourself and ensure that Oracle has permission to read/write to that file system directory.
Then, grant permission to the users who require access e.g....
GRANT READ,WRITE ON DIRECTORY mydir TO myuser;
Then use that directory object inside your FOPEN statement e.g.
fh := UTL_FILE.FOPEN('MYDIR', 'myfile.txt', 'r');
Note: You MUST specify the directory object name in quotes and in UPPER case for this to work, as it is a string referring to a database object name, which will have been stored in uppercase by default.
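Putting those pieces together, a minimal end-to-end sketch of the directory-object approach (directory name, file name and table are examples):

```sql
-- As a privileged user: create the directory object and grant access.
CREATE OR REPLACE DIRECTORY mydir AS 'c:\myfiles';
GRANT READ, WRITE ON DIRECTORY mydir TO scott;

-- As the application user: write one line per row and close the handle.
DECLARE
  fh UTL_FILE.file_type;
BEGIN
  fh := UTL_FILE.FOPEN('MYDIR', 'dept.txt', 'w');
  FOR r IN (SELECT dname || ';' || loc AS line FROM dept) LOOP
    UTL_FILE.PUT_LINE(fh, r.line);
  END LOOP;
  UTL_FILE.FCLOSE(fh);
EXCEPTION
  WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(fh) THEN
      UTL_FILE.FCLOSE(fh);
    END IF;
    RAISE;
END;
/
```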
;) -
Exporting R3 tables into Flat Files for BPC
Dear BPC experts,
I understand currently the way for BPC to extract data from R3 is via flat files. I have the following queries:
1) What exactly are the T codes and steps required to export R3 tables into flat files (without going through the OpenHub in BI7)? Can this process be automated?
2) Is Data Manager of BPC equivalent to SSIS (Integration Services) of SQL Server?
Please advise. Thanks!!
SJ
Hi Soong Jeng,
I would take a look at the existing BI Extractors for the answer to Q1; I am working on finishing up an HTG regarding this. Look for it very soon.
Here is the code to dump out data from a BI Extractor directly from ERP.
You need dev permissions in your ERP system and access to an app server folder. ...Good Luck
*& Report Z_EXTRACTOR_TO_FILE *
report z_extractor_to_file .
type-pools:
rsaot.
parameters:
p_osrce type roosource-oltpsource,
p_filnm type rlgrap-filename,
p_maxsz type rsiodynp4-maxsize default 100,
p_maxfc type rsiodynp4-calls default 10,
p_updmd type rsiodynp4-updmode default 'F'.
data:
l_lines_read type sy-tabix,
ls_select type rsselect,
lt_select type table of rsselect,
ls_field type rsfieldsel,
lt_field type table of rsfieldsel,
l_quiet type rois-genflag value 'X',
l_readonly type rsiodynp4-readonly value 'X',
l_desc_type type char1 value 'M',
lr_data type ref to data,
lt_oltpsource type rsaot_s_osource,
l_req_no type rsiodynp4-requnr,
l_debugmode type rsiodynp4-debugmode,
l_genmode type rois-genflag,
l_columns type i,
l_temp_char type char40,
l_filename like rlgrap-filename,
wa_x030l like x030l,
tb_dfies type standard table of dfies,
wa_dfies type dfies,
begin of tb_flditab occurs 0,
* field description
fldname(40) type c,
end of tb_flditab,
ls_flditab like line of tb_flditab,
l_file type string.
field-symbols:
<lt_data> type standard table,
<ls_data> type any,
<ls_field> type any.
call function 'RSA1_SINGLE_OLTPSOURCE_GET'
exporting
i_oltpsource = p_osrce
importing
e_s_oltpsource = lt_oltpsource
exceptions
no_authority = 1
not_exist = 2
inconsistent = 3
others = 4.
if sy-subrc <> 0.
* ERROR
endif.
create data lr_data type standard table of (lt_oltpsource-exstruct).
assign lr_data->* to <lt_data>.
call function 'RSFH_GET_DATA_SIMPLE'
exporting
i_requnr = l_req_no
i_osource = p_osrce
i_maxsize = p_maxsz
i_maxfetch = p_maxfc
i_updmode = p_updmd
i_debugmode = l_debugmode
i_abapmemory = l_genmode
i_quiet = l_quiet
i_read_only = l_readonly
importing
e_lines_read = l_lines_read
tables
i_t_select = lt_select
i_t_field = lt_field
e_t_data = <lt_data>
exceptions
generation_error = 1
interface_table_error = 2
metadata_error = 3
error_passed_to_mess_handler = 4
no_authority = 5
others = 6.
* get table/structure field info
call function 'GET_FIELDTAB'
exporting
langu = sy-langu
only = space
tabname = lt_oltpsource-exstruct
withtext = 'X'
importing
header = wa_x030l
tables
fieldtab = tb_dfies
exceptions
internal_error = 01
no_texts_found = 02
table_has_no_fields = 03
table_not_activ = 04.
* check result
case sy-subrc.
when 0.
* copy fieldnames
loop at tb_dfies into wa_dfies.
case l_desc_type.
when 'F'.
tb_flditab-fldname = wa_dfies-fieldname.
when 'S'.
tb_flditab-fldname = wa_dfies-scrtext_s.
when 'M'.
tb_flditab-fldname = wa_dfies-scrtext_m.
when 'L'.
tb_flditab-fldname = wa_dfies-scrtext_l.
when others.
* use fieldname
tb_flditab-fldname = wa_dfies-fieldname.
endcase.
append tb_flditab.
* clear variables
clear: wa_dfies.
endloop.
when others.
message id sy-msgid type sy-msgty number sy-msgno
with sy-subrc raising error_get_dictionary_info.
endcase.
describe table tb_flditab lines l_columns.
" MOVE DATA TO THE APPLICATION SERVER
open dataset p_filnm for output in text mode encoding utf-8
with windows linefeed.
data i type i.
loop at <lt_data> assigning <ls_data>.
loop at tb_flditab into ls_flditab.
i = sy-tabix.
assign component i of structure <ls_data> to <ls_field>.
l_temp_char = <ls_field>.
if i eq 1.
l_file = l_temp_char.
else.
concatenate l_file ',' l_temp_char into l_file.
endif.
endloop.
transfer l_file to p_filnm.
clear l_file.
endloop.
close dataset p_filnm.
Cheers,
Scott
Edited by: Jeffrey Holdeman on May 25, 2010 4:44 PM
Added markup to improve readability
Edited by: Jeffrey Holdeman on May 25, 2010 4:47 PM