Table data in FLAT file - Please help me

Hi,
Please have a look.
My database has 7 users, each with nearly 1500 tables.
My client wants the data in each table to be available in a text file in a
human-readable format, e.g. for the DEPT table:
Deptno,Dname,Location (this header line is optional)
10,FIN,LA
20,ACCOUNTING,NW
Please help me to solve this.
Can you suggest any easy way?
regards
Mathew

Can you suggest any easy way?

What about the answers given in your previous post: Tables data into text file

Use Oracle's SQL Developer. Bring up the table in a grid and export it to a .csv file. Then the human can open it in Excel.

If the OP's numbers are correct (7 users, each with 1500 tables), this does not sound like an easy way to export 10,500 tables...
Message was edited by:
thomas.kellerer
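
At that scale the practical route is to script the export rather than click through a GUI. Below is a minimal SQL*Plus sketch for one table; the file names, the COLSEP delimiter choice, and the idea of generating the per-table commands from USER_TABLES are assumptions about how it could be scaled across the 7 schemas, not a tested solution.

    -- run in SQL*Plus: spool one table to a comma-separated text file
    SET HEADING ON PAGESIZE 50000 LINESIZE 32767 TRIMSPOOL ON FEEDBACK OFF COLSEP ','
    SPOOL dept.txt
    SELECT deptno, dname, loc FROM dept;
    SPOOL OFF

    -- a data-dictionary query can generate the same three commands for every
    -- table a user owns, so the 10,500 spools need not be written by hand
    SELECT 'SPOOL '         || LOWER(table_name) || '.txt' || CHR(10) ||
           'SELECT * FROM ' || table_name        || ';'    || CHR(10) ||
           'SPOOL OFF'
    FROM   user_tables;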

Similar Messages

  • Problem Loading Microsoft SQL Server table data to flat file

    Hi Experts,
    I am trying to load data from a SQL Server table to a flat file, but it errors out.
    I have selected a staging area different from the target (I am using SQL Server as my staging area).
    Knowledge module used: IKM SQL to File Append
    I receive the following error:
    ODI-1217: Session table to file (124001) fails with return code 7000.
    ODI-1226: Step table to file fails after 1 attempt(s).
    ODI-1240: Flow table to file fails while performing a Integration operation. This flow loads target table test.
    ODI-1227: Task table to file (Integration) fails on the source MICROSOFT_SQL_SERVER connection POS_XSTORE.
    Caused By: java.sql.SQLException: Could not read heading rows from file
         at com.sunopsis.jdbc.driver.file.FileResultSet.<init>(FileResultSet.java:164)
    Please help!!!
    Thanks,
    Prateek

    Have you defined the File Datastore correctly, with the appropriate delimiter and file properties?
    Although the example is for Oracle to file, see if this helps you in any way - http://odiexperts.com/oracle-to-flat-file.

  • Loading data from xml file - please help

    Hi, I am new to loading data from an XML file in Flash. I followed a
    tutorial in a new fla and it seemed to work fine; I then adapted it to my
    own needs and that also worked fine. But when I try to implement it into my
    news section it doesn't seem to work.
    I am trying to make a news section that displays the date
    and the news article. This is within a movie clip; I even tried
    putting it on the timeline of the scene, but still nothing.
    - I have 2 dynamic text boxes, date_txt and news_txt
    - The xml file is named news.xml,
    - both the fla and the xml are in the same folder
    This is the actionscript I am using:
    function loadXML(loaded) {
        if (loaded) {
            _root.thedate = this.firstChild.childNodes[0].childNodes[0].firstChild.nodeValue;
            _root.thenews = this.firstChild.childNodes[0].childNodes[1].firstChild.nodeValue;
            date_txt.text = _root.thedate;
            news_txt.text = _root.thenews;
        } else {
            trace("file not loaded!");
        }
    }
    xmlData = new XML();
    xmlData.ignoreWhite = true;
    xmlData.onLoad = loadXML;
    xmlData.load("news.xml");
    This is what I have in the xml document:
    <?xml version="1.0"?>
    <news>
    <article>
    <date>date</date>
    <news>newentry.</news>
    </article>
    <article>
    <date>Doug Engelbart</date>
    <news>Invented the mouse at the Stanford Research
    Institute</news>
    </article>
    </news>
    Does anyone perhaps have any ideas of what the problem could
    be?
    Help much appreciated.

    Hi
    _root refers to the Main Timeline. If your date_txt and
    news_txt are in a movie clip, e.g. news_mc,
    then your paths should read _root.news_mc.date_txt.text and
    _root.news_mc.news_txt.text.
    You will also need to embed the characters you wish to use
    in your dynamic text fields for the text to show.
    Hope it helps

  • Exporting table data into text files

    I got a request to export the data from about 85 tables into 85 text files. I assume that they will want a header row and some delimiter character. This is a process that must run every night from a SQL job.
    Obviously I want it to be flexible. Tables could be added and removed, and columns could be added and removed. There will probably be a control table that holds the list of tables to export.
    I looked at bcp - but it seems that it is command line only?
    SSIS package?
    What other options are there?

    Wanted to post my solution. I ended up using bcp in a PowerShell script. I was unable to use bcp from SQL because that requires xp_cmdshell, which is not allowed at many of our client sites. So I wrote a PowerShell script that executes bcp. The PS script
    loops through a table that contains a row for each table/view to export, along with the values and switches for that export: export path, include column headers, enclose each field in double quotes, double up embedded double quotes, how to format
    dates, what to return for NULL integers - basically the stuff that is not flexible in bcp. To get this flexibility I created a SQL proc that takes these values as input and creates a view; the PS bcp then does a SELECT on the view. Many of my tables are
    very wide, and bcp has a limit of 4000/8000 characters. Some of my SELECT statements ended up being more than 35k in length, so using a view got around this size limitation.
    Anyway, the view creation and bcp download are really fast. It can download about 4 GB of data in 20 minutes - faster than 7z can zip it.
    Below is the SQL proc that formats the SELECT statement and creates the view, for bcp (or some other utility like SQLCMD or Invoke-Sqlcmd) or for a plain SQL query via "SELECT * FROM v_ExportTable". The proc can be used from SQL or PS, or anything that can call a SQL
    proc and then read a SQL view.
    IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[ExportTablesCreateView]') AND type in (N'P', N'PC'))
    DROP PROCEDURE [dbo].[ExportTablesCreateView]
    GO
    CREATE PROCEDURE [dbo].[ExportTablesCreateView]
    /*
    ExportTablesCreateView
    Description:
    Read in a table name or view name with parameters and create a view v_ExportTable of the
    data. The view will typically be read by bcp to download SQL table data to flat files.
    bcp does not have options to include column headers, enclose fields in double quotes,
    format dates or use '0' for integer NULLs. Also, bcp has a limit of varchar 4000, so
    wider tables cannot be called directly by bcp. So create v_ExportTable and have
    bcp SELECT from v_ExportTable instead of the original table or view.
    Parameters:
    @pTableName VARCHAR(128) - table or view to create v_ExportTable from
    @pColumnHeader INT = 1 - include column headers in the first row
    @pDoubleQuoteFields INT = 1 - put double quotes " around all column values including column headers
    @pDouble_EmbeddedDoubleQuotes INT = 1 - usually used with @pDoubleQuoteFields = 1. 'ab"c"d' becomes 'ab""c""d'.
    @pNumNULLValue VARCHAR(1) = '0' - NULL number data types will export this value instead of the bcp default of ''
    @pDateTimeFormat INT = 121 - DateTime data types will use this CONVERT style value
    Example:
    EXEC ExportTablesCreateView 'custname', 1, 1, 1, '0', 121
    */
    @pTableName VARCHAR(128),
    @pColumnHeader INT = 1,
    @pDoubleQuoteFields INT = 1,
    @pDouble_EmbeddedDoubleQuotes INT = 1,
    @pNumNULLValue VARCHAR(1) = '0',
    @pDateTimeFormat INT = 121
    AS
    BEGIN
    DECLARE @columnname varchar(128)
    DECLARE @columnsize int
    DECLARE @data_type varchar(128)
    DECLARE @HeaderRow nvarchar(max)
    DECLARE @ColumnSelect nvarchar(max)
    DECLARE @SQLSelect nvarchar(max)
    DECLARE @SQLCommand nvarchar(max)
    DECLARE @ReturnCode INT
    DECLARE @Note VARCHAR(500)
    DECLARE db_cursor CURSOR FOR
    SELECT COLUMN_NAME, ISNULL(Character_maximum_length,0), Data_type
    FROM [INFORMATION_SCHEMA].[COLUMNS]
    WHERE TABLE_NAME = @pTableName AND TABLE_SCHEMA='dbo'
    OPEN db_cursor
    FETCH NEXT FROM db_cursor INTO @ColumnName, @ColumnSize, @Data_type
    SET @HeaderRow = ''
    SET @ColumnSelect = ''
    -- Loop through each of the @pTableColumns to build the SELECT Statement
    WHILE @@FETCH_STATUS = 0
    BEGIN
    BEGIN TRY
    -- Put double quotes around each field - example "MARIA","SHARAPOVA"
    IF @pDoubleQuoteFields = 1
    BEGIN
    -- Include column headers in the first row - example "FirstName","LastName"
    IF @pColumnHeader = 1
    SET @HeaderRow = @HeaderRow + '''"' + @ColumnName + '"'' as ''' + @columnname + ''','
    -- Unsupported Export data type returns "" - example "",
    IF @Data_Type in ('image', 'varbinary', 'binary', 'timestamp', 'cursor', 'hierarchyid', 'sql_variant', 'xml', 'table', 'spatial Types')
    SET @ColumnSelect = @ColumnSelect + '''""'' as [' + @ColumnName + '],'
    -- Format DateTime data types according to input parameter
    ELSE IF @Data_Type in ('datetime', 'smalldatetime', 'datetime2', 'date', 'datetimeoffset')
    -- example - CASE when [aaa] IS NULL THEN '""' ELSE QUOTENAME(CONVERT(VARCHAR,[aaa], 121), CHAR(34)) END AS [aaa],
    SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN ''""'' ELSE QUOTENAME(CONVERT(VARCHAR,[' + @columnname + '],' + CONVERT(VARCHAR,@pDateTimeFormat) + '), CHAR(34)) END AS [' + @ColumnName + '],'
    -- SET Numeric data types with NULL value according to input parameter
    ELSE IF @Data_Type in ('bigint', 'numeric', 'bit', 'smallint', 'decimal', 'smallmoney', 'int', 'tinyint', 'money', 'float', 'real')
    -- example - CASE when [aaa] IS NULL THEN '"0"' ELSE QUOTENAME([aaa], CHAR(34)) END AS [aaa],
    SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN ''"' + @pNumNULLValue + '"'' ELSE QUOTENAME([' + @columnname + '], CHAR(34)) END AS [' + @ColumnName + '],'
    ELSE
    -- Double embedded double quotes - example "abc"d"ed" to "abc""d""ed". Only applicable for character data types.
    IF @pDouble_EmbeddedDoubleQuotes = 1
    BEGIN
    -- example - CASE when [aaa] IS NULL THEN '""' ELSE '"' + REPLACE([aaa],'"','""') + '"' END AS [aaa],
    SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @ColumnName + '] IS NULL THEN ''""'' ELSE ''"'' + REPLACE([' + @ColumnName + '],''"'',''""'') + ''"'' END AS [' + @ColumnName + '],'
    END
    -- DO NOT PUT Double embedded double quotes - example "abc"d"ed" unchanged to "abc"d"ed"
    ELSE
    BEGIN
    -- example - CASE when [aaa] IS NULL THEN '""' ELSE '"' + [aaa] + '"' END AS [aaa],
    SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @ColumnName + '] IS NULL THEN ''""'' ELSE ''"'' + [' + @ColumnName + '] + ''"'' END AS [' + @ColumnName + '],'
    END
    END
    -- DO NOT PUT double quotes around each field - example MARIA,SHARAPOVA
    ELSE
    BEGIN
    -- Include column headers in the first row - example "FirstName","LastName"
    IF @pColumnHeader = 1
    SET @HeaderRow = @HeaderRow + '''' + @ColumnName + ''' as ''' + @columnname + ''','
    -- Unsupported Export data type returns '' - example '',
    IF @Data_Type in ('image', 'varbinary', 'binary', 'timestamp', 'cursor', 'hierarchyid', 'sql_variant', 'xml', 'table', 'spatial Types')
    SET @ColumnSelect = @ColumnSelect + ''''' as [' + @ColumnName + '],'
    -- Format DateTime data types according to input parameter
    ELSE IF @Data_Type in ('datetime', 'smalldatetime', 'datetime2','date', 'datetimeoffset')
    -- example - CASE when [aaa] IS NULL THEN '''' ELSE CONVERT(VARCHAR,[aaa], 121) END AS [aaa],
    SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN '''' ELSE CONVERT(VARCHAR,[' + @columnname + '],' + CONVERT(VARCHAR,@pDateTimeFormat) + ') END AS [' + @ColumnName + '],'
    -- SET Numeric data types with NULL value according to input parameter
    ELSE IF @Data_Type in ('bigint', 'numeric', 'bit', 'smallint', 'decimal', 'smallmoney', 'int', 'tinyint', 'money', 'float', 'real')
    -- example - CASE when [aaa] IS NULL THEN '"0"' ELSE CONVERT(VARCHAR, [aaa]) END AS [aaa],
    SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN ''' + @pNumNULLValue + ''' ELSE CONVERT(VARCHAR,[' + @columnname + ']) END AS [' + @ColumnName + '],'
    ELSE
    BEGIN
    -- Double embedded double quotes - example "abc"d"ed" to "abc""d""ed". Only applicable for character data types.
    IF @pDouble_EmbeddedDoubleQuotes = 1
    -- example - CASE when [aaa] IS NULL THEN '' ELSE CONVERT(VARCHAR,REPLACE([aaa],'"','""')) END AS [aaa],
    SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN '''' ELSE CONVERT(VARCHAR,REPLACE([' + @columnname + '],''"'',''""'')) END AS [' + @ColumnName + '],'
    ELSE
    -- example - CASE when [aaa] IS NULL THEN '' ELSE CONVERT(VARCHAR,[aaa]) END AS [aaa],
    SET @ColumnSelect = @ColumnSelect + 'CASE WHEN [' + @columnname + '] IS NULL THEN '''' ELSE CONVERT(VARCHAR,[' + @columnname + ']) END AS [' + @ColumnName + '],'
    END
    END
    FETCH NEXT FROM db_cursor INTO @ColumnName, @ColumnSize, @Data_Type
    END TRY
    BEGIN CATCH
    RETURN (1)
    END CATCH
    END
    CLOSE db_cursor
    DEALLOCATE db_cursor
    BEGIN TRY
    -- remove last comma
    IF @pColumnHeader = 1
    SET @HeaderRow = SUBSTRING(@HeaderRow , 1, LEN(@HeaderRow ) - 1)
    SET @ColumnSelect = SUBSTRING(@ColumnSelect, 1, LEN(@ColumnSelect) - 1)
    -- Put on the finishing touches on the SELECT
    IF @pColumnHeader = 1
    SET @SQLSelect = 'SELECT ' + @HeaderRow + ' UNION ALL ' +
    'SELECT ' + @ColumnSelect + ' FROM [' + @pTableName + ']'
    ELSE
    SET @SQLSelect = 'SELECT ' + @ColumnSelect + ' FROM [' + @pTableName + ']'
    ---- diagnostics
    ---- PRINT truncates at 4k or 8k, not sure; my tables have many columns
    --PRINT @SQLSelect
    --DECLARE @END varchar(max) = RIGHT(@SQLSelect, 3000)
    --PRINT @end
    --EXECUTE sp_executesql @SQLSelect
    -- drop view if exists -- using view because some tables are very wide. one of my tables had a 33k select statement
    SET @SQLCommand = '
    IF EXISTS (SELECT * FROM SYS.views WHERE name = ''v_ExportTable'')
    BEGIN
    DROP VIEW v_ExportTable
    END'
    EXECUTE @ReturnCode = sp_executesql @SQLCommand
    IF @returncode = 1
    BEGIN
    RETURN (1)
    END
    -- create the view
    SET @SQLCommand = '
    CREATE VIEW v_ExportTable AS ' + @SQLSelect
    -- diagnostics
    --print @sqlcommand
    EXECUTE @ReturnCode = sp_executesql @SQLCommand
    IF @returncode = 1
    BEGIN
    RETURN (1)
    END
    END TRY
    BEGIN CATCH
    RETURN (1)
    END CATCH
    RETURN (0)
    END -- CREATE PROCEDURE [dbo].[ExportTablesCreateView]
    GO
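
    To make the intended workflow concrete, here is a minimal usage sketch (the table name custsample is illustrative, and the bcp command in the comment is just one way a caller might read the view, not something from the original post):

    -- build v_ExportTable with headers, quoted fields, doubled embedded quotes,
    -- '0' for NULL numbers, and ODBC canonical datetimes (style 121)
    EXEC [dbo].[ExportTablesCreateView]
         @pTableName = 'custsample',
         @pColumnHeader = 1,
         @pDoubleQuoteFields = 1,
         @pDouble_EmbeddedDoubleQuotes = 1,
         @pNumNULLValue = '0',
         @pDateTimeFormat = 121;

    -- anything that can run a query can now read the pre-formatted rows, e.g.
    -- bcp "SELECT * FROM MyDb.dbo.v_ExportTable" queryout custsample.csv -c -t, -T -S myserver
    SELECT * FROM dbo.v_ExportTable;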

  • Can we cleanse and transform data at flat file or external table level?

    Hi,
    I have some data that I want to cleanse and transform. I don't want to cleanse it after I populate the external table; I want to get it done at the flat file level, or while populating the external table. Can we cleanse and transform data at the flat file or external table level through Oracle or OWB 11.2? Is it possible to run a conditional load (i.e. with a where clause or if-else-then) for an external table? Can we call Oracle functions for an external table at the time of creation?
    Thanks in advance.
    Regards,
    Ann.

    Hi Oleg,
    Thanks a lot for the clarification. :)
    So is there a way that I can cleanse the data within the text file through Oracle or OWB? I have datatype mismatches in the data and most of my data is getting rejected because of that. The way I can think of to solve this problem is to create the external table with all fields as datatype VARCHAR2 and then cleanse the data. But it doesn't seem very efficient, plus it will get very complicated because I have almost 80-90 fields.
    Any help?
    Thanks and regards,
    Ann.
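
    For what it's worth, the all-VARCHAR2 approach can stay fairly compact, because the conversion, filtering and function calls happen in a single INSERT ... SELECT over the external table rather than field by field. A minimal sketch, assuming a hypothetical directory object EXT_DIR, a file emp.dat, and a three-column layout:

    -- every field is read as text, so no rows are rejected for datatype mismatches
    CREATE TABLE emp_ext (
      empno_txt    VARCHAR2(20),
      hiredate_txt VARCHAR2(30),
      sal_txt      VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('emp.dat')
    )
    REJECT LIMIT UNLIMITED;

    -- conditional, cleansed load: Oracle functions and a WHERE clause apply to an
    -- external table exactly as they would to an ordinary table
    INSERT INTO emp (empno, hiredate, sal)
    SELECT TO_NUMBER(TRIM(empno_txt)),
           TO_DATE(TRIM(hiredate_txt), 'DD.MM.YYYY'),
           NVL(TO_NUMBER(sal_txt), 0)
    FROM   emp_ext
    WHERE  REGEXP_LIKE(TRIM(empno_txt), '^[0-9]+$');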

  • SSIS - import data from flat file to table (SQL Server 2012)

    I have created an SSIS package for importing data from a flat file to a table in SQL Server 2012,
    but I get the error below for some columns in the data flow task:
    error: cannot be processed because more than one code page (950 and 1252) is specified for it
    Can anyone help?

    Hi,
    The issue occurs because the source flat file uses the ANSI/OEM - Traditional Chinese Big5 encoding. When processing the source file, the flat file connection manager uses code page 950 for the columns. Because SQL Server uses code pages to perform conversions
    between non-Unicode data and Unicode data, the data in the code page 950 based input columns cannot be loaded to code page 1252 based destination columns. To resolve the issue, you need to load the data into a SQL Server destination table that includes Unicode
    columns (nchar or nvarchar), and convert the input columns to Unicode columns via Data Conversion or the Advanced Editor for the Flat File Source at the same time.
    Another option that may not be that practical is to create a new database based on the Chinese_Taiwan_Stroke_BIN collation, and load the data to non-Unicode columns directly.
    Reference:
    http://social.technet.microsoft.com/Forums/windows/en-US/f939e3ba-a47e-43b9-88c3-c94bdfb7da58/forum-faq-how-to-fix-the-error-the-column-xx-cannot-be-processed-because-more-than-one-code-page?forum=sqlintegrationservices 
    Regards,
    Mike Yin
    TechNet Community Support
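
    To illustrate the first option, the destination table simply needs Unicode column types so the converted DT_WSTR data has somewhere to land (the table and column names below are assumptions for the sketch, not from the original thread):

    -- nvarchar columns accept the code page 950 input once the data flow has
    -- converted the source columns to Unicode (DT_WSTR)
    CREATE TABLE dbo.ImportedItems (
        ItemCode nvarchar(50)  NOT NULL,
        ItemName nvarchar(200) NULL
    );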

  • Pulling data from table to a flat file

    hi all,
    Good day to all,
    Is there any script which I can use for pulling data from tables to a flat file and then importing that data into other DBs? Usage of a DB link is restricted. The DB version is 10.2.0.4.
    thanks,
    baskar.l

    Is there any script which I can use for pulling data from tables to a flat file and then importing that data into other DBs?

    A flat file is adequate only for VARCHAR2, NUMBER, and DATE datatypes.
    Other datatypes present a challenge within a text file.
    A more robust solution is to use export/import to move data between Oracle DBs.

  • How can I get data into a flat file from pool tables and cluster tables?

    Hi,
    I am working on an archiving project. My requirement is to get data into a flat file from cluster tables and pool tables.
    Is there any tool available to download data into a flat file from pool tables and cluster tables?
    If the table name is given on the selection screen, the data should be downloaded into a flat file.
    Waiting for a quick response.
    Best Regards,
    Bansidhar

    Data cannot be retrieved directly from a cluster table,
    as the cluster results are stored under a cluster key (say, for example, the PCL key),
    and from that key we need to fetch the data.
    These clusters are not part of the PNP or PNPCE tables.
    For your information, kindly check

  • Loading data from flat file to table

    Hi,
    I am using OWB 10g R2. I have to develop a map which loads data from flat file to table.
    1. I have created a .csv file.
    2. Import that into OWB by Module Wizard.
    3. I have created one external table, which uses the .csv file.
    After that, what do I have to do to successfully load the data into the table?
    Regards,
    Sanjoy

    Hi Sanjoy
    You can build a mapping to move the data from somewhere to somewhere else. More specifically:
    1. create a mapping
    2. add external table to mapping
    3. add table to mapping for target (it can be unbound)
    4. map from external table ingroup to target table ingroup (all columns will be mapped)
    5. if target table was unbound right click on target table and click create and bind
    This is a basic mapping you can have for this scenario. There is a post here which does something like this, but it will create a File to Table mapping which will use SQLLoader.
    http://blogs.oracle.com/warehousebuilder/2009/01/expert_useful_scripts_from_simple_primitives.html
    Cheers
    David

  • Invalid data format on EXPORTING table  to SQL-FLAT FILE (Insert type)

    Hi there!
    Writing from Slovenia/Europe.
    Working with Oracle9i (standard version) on Windows XP with SQL Developer 1.0.0.015.
    First: SQL Developer is a good tool, with some minor errors.
    1.) Declare and Insert data EXAMPLE
    drop table tst_date;
    create table tst_date (fld_date date);
    insert into tst_date values (sysdate);
    insert into tst_date values (sysdate);
    2.) Retrieving the date with SQL*Plus
    SQL> select to_char(fld_date,'DD.MM.YYYY HH24:MI:SS') FROM TST_DATE;
    23.10.2006 11:25:23
    23.10.2006 11:25:25
    As you see, the time data is correct.
    When I export the data as SQL insert statements, I get this result in the TST_DATE.SQL file:
    -- INSERTING into TST_DATE
    Insert into "TST_DATE" ("FLD_DATE") values (to_date('2006-10-23','DD.MM.RR'));
    Insert into "TST_DATE" ("FLD_DATE") values (to_date('2006-10-23','DD.MM.RR'));
    As you see, I lost the time data.
    QUESTION!
    How can I set the proper date format in SQL Developer before I export data to a flat file?
    Best regards, Iztok from SLOVENIA
    Message was edited by:
    DEKUS

    A DATE field is a DATE field and not a DATE-TIME field. The export tool identifies a DATE field and exports the data in date format.

    This is not true. Oracle DATE fields include a time element.
    To the original poster - I believe this is a bug in the current version.
    See this thread for possible workarounds: Bad Export format --- BUG ???
    Message was edited by:
    smitjb
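
    Until the export bug is fixed, one workaround (a sketch only; the session NLS setting and the generated-INSERT query below are suggestions, not something from the thread) is to make the time portion explicit yourself, either by widening the session date format or by exporting the result of a TO_CHAR query instead of the raw table:

    -- make DATE values render with their time portion for the current session
    ALTER SESSION SET NLS_DATE_FORMAT = 'DD.MM.YYYY HH24:MI:SS';

    -- or generate the INSERT statements explicitly, so TO_DATE gets a mask
    -- that carries the full timestamp
    SELECT 'Insert into TST_DATE (FLD_DATE) values (to_date(''' ||
           TO_CHAR(fld_date, 'DD.MM.YYYY HH24:MI:SS') ||
           ''',''DD.MM.YYYY HH24:MI:SS''));' AS insert_stmt
    FROM   tst_date;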

  • Loading transaction data from flat file to SNP order series objects

    Hi,
    I am a BW developer and I need to provide data to my SNP team.
    Can you please let me know more about loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures? There is a 3rd-party tool called WebConnect that gets data from external systems and can deliver it as flat files or in database tables, in whatever format we want.
    I know we can use BAPIs, but I don't know how. Can you please send any sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects?
    Please let me know ASAP how to get data from a flat file into SNP order-based objects, with the available options, and I will be very grateful.
    thanks in advance
    Rahul

    Hi,
    Please go through the following links:
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
    Hope this helps...
    Regards,
    Habeeb
    Assign points if helpful..:)

  • Data conversion while exporting data into flat files using the export wizard in SSIS

    Hi ,
    While exporting data to a flat file through the export wizard, the source table has NVARCHAR columns.
    Could you please help me on how to do the data conversion while using the export wizard?
    Thanks.

    Hi Avs sai,
    By default, the columns in the destination flat file will be non-Unicode columns, e.g. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input columns when outputting to the destination file, you can check
    the Unicode option on the "Choose a Destination" page of the SQL Server Import and Export Wizard. Then, on the "Configure Flat File Destination" page, you can click the "Edit Mappings..." button to check the data types.
    Regards,
    Mike Yin
    TechNet Community Support

  • FM to transfer data in a flat file from presentation to application server

    Hi Experts,
    Please tell me the FM to transfer data in a flat file from the presentation server to the application server, or vice versa, in ECC 6.0.
    Thanks.

    Hi,
    This is how you can achieve it:
    1. Read the flat file from the presentation layer and store the file content in the internal table gt_inrec:
      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename                = gw_filename
          filetype                = 'ASC'
        IMPORTING
          filelength              = gw_length
          header                  = gw_header
        TABLES
          data_tab                = gt_inrec
        EXCEPTIONS
          file_open_error         = 1
          file_read_error         = 2
          no_batch                = 3
          gui_refuse_filetransfer = 4
          invalid_type            = 5
          no_authority            = 6
          unknown_error           = 7
          bad_data_format         = 8
          header_not_allowed      = 9
          separator_not_allowed   = 10
          header_too_long         = 11
          unknown_dp_error        = 12
          access_denied           = 13
          dp_out_of_memory        = 14
          disk_full               = 15
          dp_timeout              = 16
          OTHERS                  = 17.
    2. Create a new file at the application server:
      OPEN DATASET p_ofile FOR OUTPUT IN
      TEXT MODE ENCODING DEFAULT.
    3. Transfer the content from the internal table into the file at the application server, then close the dataset:
        LOOP AT gt_inrec.
          TRANSFER gt_inrec-record TO p_ofile.
        ENDLOOP.
        CLOSE DATASET p_ofile.
    Hope it helps,
    Lim....

  • Upload data from flat file to OVKK  using BDC

    Hi All,
    I want to upload data from a flat file to the OVKK tcode using BDC.
    OVKK is a maintenance view with a table control.
    So please send me code for uploading data to OVKK through BDC.
    Thanks & Regards,
    Siva.B

    Hi,
    Welcome to SDN!!!!!!!!!!
    Can you see this example for Table control.
    http://www.sap-img.com/abap/bdc-example-using-table-control-in-bdc.htm
    Today this is the second post on the same issue and the same transaction.
    First try it through SHDB and check the generated code.
    Thanks.
    If this helps you reward with points.

  • Need a script to import the data from flat file

    Hi Friends,
    Does anyone have any scripts to import data from flat files into an Oracle database (Linux OS)? I have to automate the script to run every 30 minutes, check for any flat files in the Incoming directory, and process them without user interaction.
    Thanks.
    Srini

    Here is my init.ora file
    # $Header: init.ora 06-aug-98.10:24:40 atsukerm Exp $
    # Copyright (c) 1991, 1997, 1998 by Oracle Corporation
    # NAME
    # init.ora
    # FUNCTION
    # NOTES
    # MODIFIED
    # atsukerm 08/06/98 - fix for 8.1.
    # hpiao 06/05/97 - fix for 803
    # glavash 05/12/97 - add oracle_trace_enable comment
    # hpiao 04/22/97 - remove ifile=, events=, etc.
    # alingelb 09/19/94 - remove vms-specific stuff
    # dpawson 07/07/93 - add more comments regarded archive start
    # maporter 10/29/92 - Add vms_sga_use_gblpagfile=TRUE
    # jloaiza 03/07/92 - change ALPHA to BETA
    # danderso 02/26/92 - change db_block_cache_protect to dbblock_cache_p
    # ghallmar 02/03/92 - db_directory -> db_domain
    # maporter 01/12/92 - merge changes from branch 1.8.308.1
    # maporter 12/21/91 - bug 76493: Add control_files parameter
    # wbridge 12/03/91 - use of %c in archive format is discouraged
    # ghallmar 12/02/91 - add global_names=true, db_directory=us.acme.com
    # thayes 11/27/91 - Change default for cache_clone
    # jloaiza 08/13/91 - merge changes from branch 1.7.100.1
    # jloaiza 07/31/91 - add debug stuff
    # rlim 04/29/91 - removal of char_is_varchar2
    # Bridge 03/12/91 - log_allocation no longer exists
    # Wijaya 02/05/91 - remove obsolete parameters
    # Example INIT.ORA file
    # This file is provided by Oracle Corporation to help you customize
    # your RDBMS installation for your site. Important system parameters
    # are discussed, and example settings given.
    # Some parameter settings are generic to any size installation.
    # For parameters that require different values in different size
    # installations, three scenarios have been provided: SMALL, MEDIUM
    # and LARGE. Any parameter that needs to be tuned according to
    # installation size will have three settings, each one commented
    # according to installation size.
    # Use the following table to approximate the SGA size needed for the
    # three scenarious provided in this file:
    # -------Installation/Database Size-------
    #              SMALL     MEDIUM     LARGE
    # Block  2K    4500K     6800K      17000K
    # Size   4K    5500K     8800K      21000K
    # To set up a database that multiple instances will be using, place
    # all instance-specific parameters in one file, and then have all
    # of these files point to a master file using the IFILE command.
    # This way, when you change a public
    # parameter, it will automatically change on all instances. This is
    # necessary, since all instances must run with the same value for many
    # parameters. For example, if you choose to use private rollback segments,
    # these must be specified in different files, but since all gc_*
    # parameters must be the same on all instances, they should be in one file.
    # INSTRUCTIONS: Edit this file and the other INIT files it calls for
    # your site, either by using the values provided here or by providing
    # your own. Then place an IFILE= line into each instance-specific
    # INIT file that points at this file.
    # NOTE: Parameter values suggested in this file are based on conservative
    # estimates for computer memory availability. You should adjust values upward
    # for modern machines.
    # You may also consider using Database Configuration Assistant tool (DBCA)
    # to create INIT file and to size your initial set of tablespaces based
    # on the user input.
    # replace DEFAULT with your database name
    db_name=DEFAULT
    db_files = 80 # SMALL
    # db_files = 400 # MEDIUM
    # db_files = 1500 # LARGE
    db_file_multiblock_read_count = 8 # SMALL
    # db_file_multiblock_read_count = 16 # MEDIUM
    # db_file_multiblock_read_count = 32 # LARGE
    db_block_buffers = 100 # SMALL
    # db_block_buffers = 550 # MEDIUM
    # db_block_buffers = 3200 # LARGE
    shared_pool_size = 3500000 # SMALL
    # shared_pool_size = 5000000 # MEDIUM
    # shared_pool_size = 9000000 # LARGE
    log_checkpoint_interval = 10000
    processes = 50 # SMALL
    # processes = 100 # MEDIUM
    # processes = 200 # LARGE
    parallel_max_servers = 5 # SMALL
    # parallel_max_servers = 4 x (number of CPUs) # MEDIUM
    # parallel_max_servers = 4 x (number of CPUs) # LARGE
    log_buffer = 32768 # SMALL
    # log_buffer = 32768 # MEDIUM
    # log_buffer = 163840 # LARGE
    # audit_trail = true # if you want auditing
    # timed_statistics = true # if you want timed statistics
    max_dump_file_size = 10240 # limit trace file size to 5 Meg each
    # Uncommenting the line below will cause automatic archiving if archiving has
    # been enabled using ALTER DATABASE ARCHIVELOG.
    # log_archive_start = true
    # log_archive_dest = disk$rdbms:[oracle.archive]
    # log_archive_format = "T%TS%S.ARC"
    # If using private rollback segments, place lines of the following
    # form in each of your instance-specific init.ora files:
    # rollback_segments = (name1, name2)
    # If using public rollback segments, define how many
    # rollback segments each instance will pick up, using the formula
    # # of rollback segments = transactions / transactions_per_rollback_segment
    # In this example each instance will grab 40/5 = 8:
    # transactions = 40
    # transactions_per_rollback_segment = 5
    # Global Naming -- enforce that a dblink has same name as the db it connects to
    global_names = TRUE
    # Edit and uncomment the following line to provide the suffix that will be
    # appended to the db_name parameter (separated with a dot) and stored as the
    # global database name when a database is created. If your site uses
    # Internet Domain names for e-mail, then the part of your e-mail address after
    # the '@' is a good candidate for this parameter value.
    # db_domain = us.acme.com      # global database name is db_name.db_domain
    # FOR DEVELOPMENT ONLY, ALWAYS TRY TO USE SYSTEM BACKING STORE
    # vms_sga_use_gblpagfil = TRUE
    # FOR BETA RELEASE ONLY. Enable debugging modes. Note that these can
    # adversely affect performance. On some non-VMS ports the db_block_cache_*
    # debugging modes have a severe effect on performance.
    #_db_block_cache_protect = true # memory protect buffers
    #event = "10210 trace name context forever, level 2" # data block checking
    #event = "10211 trace name context forever, level 2" # index block checking
    #event = "10235 trace name context forever, level 1" # memory heap checking
    #event = "10049 trace name context forever, level 2" # memory protect cursors
    # define parallel server (multi-instance) parameters
    #ifile = ora_system:initps.ora
    # define two control files by default
    control_files = (ora_control1, ora_control2)
    # Uncomment the following line if you wish to enable the Oracle Trace product
    # to trace server activity. This enables scheduling of server collections
    # from the Oracle Enterprise Manager Console.
    # Also, if the oracle_trace_collection_name parameter is non-null,
    # every session will write to the named collection, as well as enabling you
    # to schedule future collections from the console.
    # oracle_trace_enable = TRUE
    # Uncomment the following line, if you want to use some of the new 8.1
    # features. Please remember that using them may require some downgrade
    # actions if you later decide to move back to 8.0.
    #compatible = 8.1.0
    Thanks.
    Srini
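
    For the loading itself, a common low-maintenance pattern is an external table over the incoming file (or a SQL*Loader staging step) wrapped in a stored procedure, with DBMS_SCHEDULER running it every 30 minutes so no user interaction is needed. The sketch below only shows the scheduling half; LOAD_INCOMING_FILES is a hypothetical procedure that would read the staged file and insert into the target tables:

    -- schedule a hypothetical load procedure to run every 30 minutes
    BEGIN
      DBMS_SCHEDULER.create_job(
        job_name        => 'LOAD_INCOMING_FLAT_FILES',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'LOAD_INCOMING_FILES',
        repeat_interval => 'FREQ=MINUTELY;INTERVAL=30',
        enabled         => TRUE);
    END;
    /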
