Export table in SQL*Loader format using spool

Hi,
In my environment I only have SELECT access, so I want to export some tables by spooling the output of SELECT statements.
But I want the export in SQL*Loader format so that I can import it into another environment.
Is it possible to do this?

It depends on where the NULLs are.
SQL> SELECT * FROM t ORDER BY 1;
        ID DESCR      DT
         1 One
         2            30-MAY-2007
         4 Four       30-MAY-2007
           Three      30-MAY-2007
SQL> SELECT id||',"'||descr||'",'||TO_CHAR(dt,'dd-mon-yyyy') output
  2  FROM t
  3  ORDER BY 1;
OUTPUT
,"Three",30-may-2007
1,"One",
2,"",30-may-2007
4,"Four",30-may-2007
John
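If it helps, here is a minimal sketch of the whole round trip for the sample table above (the column names, the date mask and the sqlldr login are assumptions based on the output shown). First spool a delimited file, which needs nothing more than SELECT access:

SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 4000 TRIMSPOOL ON
SPOOL t.dat
SELECT id || ',"' || descr || '",' || TO_CHAR(dt, 'dd-mon-yyyy') FROM t ORDER BY 1;
SPOOL OFF

Then load t.dat in the other environment with a matching control file (sqlldr user/pass control=t.ctl data=t.dat):

LOAD DATA
INFILE 't.dat'
APPEND INTO TABLE t
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( id,
  descr,
  dt DATE "dd-mon-yyyy"
)

TRAILING NULLCOLS takes care of the rows where the trailing date (or a leading column) is null in the spooled file.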

Similar Messages

  • Is it possible to export tables from different schemas using expdp?

    Hi,
    We can export tables from different schemas using exp, e.g.: exp user/pass file=sample.dmp log=sample.log tables=scott.dept,system.sales
    But is it possible in expdp?
    Thanks in advance.
    Thanks,

    Hi,
    You have to use schemas=user1,user2 include=TABLE:"IN ('table1','table2')" in a parfile, e.g.: expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
    I am not able to perform it using a parfile either. Using a parfile it shows "UDE-00010: multiple job modes requested, schema and tables."
    When trying the below, I get an error:
    {code}
    bash-3.00$ expdp directory=EXP_DUMP dumpfile=test.dmp logfile=test.log SCHEMAS=(\'MM\',\'MMM\') include=TABLE:\"IN\(\'EA_EET_TMP\',\'WS_DT\'\)\"
    Export: Release 10.2.0.4.0 - 64bit Production on Friday, 15 October, 2010 18:34:32
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYS"."SYS_EXPORT_SCHEMA_01": /******** AS SYSDBA directory=EXP_DUMP dumpfile=test.dmp logfile=test.log SCHEMAS=('MM','MMM') include=TABLE:"IN('EA_EET_TMP','WS_DT')"
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "MM"."EA_EET_TMP" 0 KB 0 rows
    ORA-39165: Schema MMM was not found.
    Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
    /export/home/nucleus/dump/test.dmp
    Job "SYS"."SYS_EXPORT_SCHEMA_01" completed with 1 error(s) at 18:35:19
    {code}
    Checking expdp help=y shows:
    {code}TABLES Identifies a list of tables to export - one schema only.{code}
    Based on this testing, tables from different schemas cannot be exported with expdp in a single command.
    Anand
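    Since TABLE mode is one schema only in this release, one workaround sketch (reusing the MM/MMM schemas, table list and EXP_DUMP directory from the log above) is to run one schema-mode job per schema, each with its own parfile, which also avoids the shell escaping:
    {code}
    contents of mm.par:
    directory=EXP_DUMP
    schemas=MM
    include=TABLE:"IN ('EA_EET_TMP','WS_DT')"
    dumpfile=mm.dmp
    logfile=expdp_mm.log

    mmm.par is the same except schemas=MMM, dumpfile=mmm.dmp, logfile=expdp_mmm.log; then run:
    expdp parfile=mm.par
    expdp parfile=mmm.par
    {code}
    You end up with two dump files instead of one, but no UDE-00010 and no ORA-39165.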

  • Defaulting org id value into a table through SQL Loader program

    Hi ,
    We have a requirement to load some data from a flat file into a table, and we are using SQL*Loader to do that. So far no problem, but now the requirement is that we also need to populate the org_id from which we are running the program.
    I tried fnd_profile.value('ORG_ID') and it is populating the site-level org_id.
    Could anyone please help me with how to default the org_id or request_id into a table through a SQL*Loader program?
    Thanks,
    Y

    user12001627 wrote:
    Hi Srini,
    Thanks for looking into this!!
    We are on EBS 11.5.10 and OS is solaris.
    I tried fnd_profile,fnd_global but no luck.
    Here is the control file which we are using to load data.
    load data
    infile *
    replace into table XXXX_YYYY_STAG
    trailing nullcols
    (line POSITION(1:2000))
    I would like to populate the org_id when I load the data from the file. Unfortunately there is no identifier in the file that says which org_id the data belongs to; the only way to identify the file's org is from the file name.
    Where do you want to populate the ORG_ID? There is no column for it in your stage table above.
    Is there a way we can pass it through concurrent program parameters?
    Thanks,
    Y
    HTH
    Srini
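    If the org really can only be derived from the file name, one sketch (assuming you add an ORG_ID column to the staging table) is to have the wrapper that submits sqlldr generate the control file per data file and hard-code the value with CONSTANT, the same echo-a-control-file technique shown in the "Insert data file name into table from sql loader" thread further down:
    load data
    infile 'XXXX_204_YYYY.dat'
    replace into table XXXX_YYYY_STAG
    trailing nullcols
    ( line   POSITION(1:2000),
      org_id CONSTANT '204'
    )
    Here 'XXXX_204_YYYY.dat' and the value 204 are hypothetical; the wrapper would substitute whatever org it parses out of the real file name before running sqlldr.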

  • How can I load data into table with SQL*LOADER

    How can I load data into a table with SQL*Loader
    when the column data length is more than 255 bytes?
    When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
    CREATE TABLE A (
    A VARCHAR2 ( 10 ) ,
    B VARCHAR2 ( 10 ) ,
    C VARCHAR2 ( 10 ) ,
    E VARCHAR2 ( 2000 ) );
    control file:
    load data
    append into table A
    fields terminated by X'09'
    (A , B , C , E )
    SQL*LOADER command:
    sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
    datafile:
    column E is more than 255 bytes
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)
    1     1     1     1234567------(more than 255bytes)

    Check this out.
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961
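    The usual cause is that a delimited field defaults to CHAR(255) in SQL*Loader, so you only need to declare the length explicitly in the control file. A sketch based on your table A (the data file is still supplied on the command line with data= as before):
    load data
    append into table A
    fields terminated by X'09'
    ( A,
      B,
      C,
      E CHAR(2000)
    )
    Only column E needs the override; the other delimited fields stay within the 255-byte default.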

  • Taking snapshot of oracle tables to sql server using transactional replication is taking a long time

    Hi All,
    I am trying to replicate around 200 Oracle tables onto SQL Server using transactional replication and it is taking a long time, i.e. the initial snapshot is taking more than 24 hrs and is still going on.
    Is there any way to replicate these tables faster?
    Kindly help me out.
    Thanks

    Hi,
    According to the description, the replication is working, but it is very slow.
    1. Check the CPU usage on the Oracle publisher and on SQL Server. The issue may be due to slow client processing (Oracle performance) or network performance issues.
    2. Based on SQL Server 2008 Books Online ‘Performance Tuning for Oracle Publishers’ (http://msdn.microsoft.com/en-us/library/ms151179(SQL.100).aspx), you can enable the transaction
    job set and follow the instructions in
    http://msdn.microsoft.com/en-us/library/ms147884(v=sql.100).aspx.
    3. You can enable replication agent logging to check the replication behavior. To enable Distribution Agent verbose logging, follow these steps:
    a. Open SQL Server Agent on the distribution server.
    b. Under the Jobs folder, find the Distribution Agent job.
    c. Right-click the job and choose Properties.
    d. Select the Steps tab.
    e. Select the Run agent step, click the Edit button, and append the following to the end of the command box:
            -Output C:\Temp\OUTPUTFILE.txt -Outputverboselevel 2
    f. Exit the dialogs.
     For more information about the steps, please refer to:
    http://support.microsoft.com/kb/312292
    Hope the information helps.
    Tracy Cai
    TechNet Community Support

  • External Table vs SQL Loader.

    Hi,
    Please, can anybody tell me what the significant differences are between an external table and SQL*Loader?

    Both fall into the category of Oracle utilities.
    [SQL*Loader|http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/utility.htm#i10606] is the one that loads data into Oracle tables from operating system files, and an [external table|http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/utility.htm#i10611] is the one that
    provides functionality similar to SQL*Loader for accessing external data, but with different logic and rules: it lets you access data in external sources as if it were in a table in the database.
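    For example, a minimal external table sketch (assuming a directory object DAT_DIR and a comma-separated file emp.csv) is pure DDL, and the file becomes queryable immediately without ever running sqlldr:
    CREATE TABLE emp_ext (
      empno NUMBER,
      ename VARCHAR2(30),
      sal   NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY dat_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('emp.csv')
    )
    REJECT LIMIT UNLIMITED;

    SELECT * FROM emp_ext;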

  • SQL LOADER USING EXTERNAL TABLE

    I have a .csv file with around 70k records
    in which fields are delimited by tab and
    enclosed in double quotes, although double quotes may also be part of the data,
    and records are delimited by newline.
    After creating the external table, when I issue a SELECT statement
    select count(*) from proTxt;
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04020: found record longer than buffer size supported, 524288, in C:\Program Files\Apache Software Foundation\Tomcat
    5.5\webapps\tmTest\upload\product\Data\output09_1.txt
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    Following is the create table statement:
    CREATE TABLE proTxt (
      PRO_CODE VARCHAR2(30),
      PRO_DESC VARCHAR2(500),
      PUR_PRICE VARCHAR2(20),
      SALE_PRICE VARCHAR2(20)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY DAT_DIR
      ACCESS PARAMETERS (
        records delimited by NEWLINE SKIP 1
        badfile BAD_DIR:'proTxt%a_%p.bad'
        logfile LOG_DIR:'proTxt%a_%p.log'
        fields terminated by X'9' OPTIONALLY ENCLOSED BY '"' AND '"'
        missing field values are null
        ( PRO_CODE,
          PRO_DESC,
          PUR_PRICE,
          SALE_PRICE
        )
      )
      LOCATION ('output09_1.txt')
    )
    PARALLEL 4
    REJECT LIMIT UNLIMITED;
    The record size is not large.
    Log file :
    LOG file opened at 12/05/12 20:25:40
    KUP-04020: found record longer than buffer size supported, 524288, in C:\Program Files\Apache Software Foundation\Tomcat 5.5\webapps\tmTest\upload\product\Data\output09_1.txt
    KUP-04053: record number 2
    data file
    PRO_CODE     PRO_DESC     PUR_PRICE     SALE_PRICE
    "0000336658"     "BEARING"     "Rs.0.00"     "Rs.0.00"
    "0000790028"     "SEAL"     "Rs.76.00"     "Rs.90.00"
    "0000790118"     "SPRING"     "Rs.24.00"     "Rs.28.00"
    "0000792284"     "F.BRK.CAL.W/O PA"     "Rs.2,627.00"     "Rs.3,100.00"
    "0000792285"     "F.BRK.CAL.W/O PA"     "Rs.2,627.00"     "Rs.3,100.00"
    "0005896322"     "PISTON, RING"     "Rs.5,000.00"     "Rs.5,900.00"
    "0005896323"     "PISTONS, RINGS AND P"     "Rs.17,755.00"     "Rs.20,951.00"
    "0005896559"     "PISTON, RINGS AND PI"     "Rs.5,000.00"     "Rs.5,900.00"

    Hi,
    when I used
    records delimited by *'\r'*
    then 4226 records were written to the table,
    but the enclosing double-quote ["] characters were also written and
    there is some space between characters:
    " 0 0 0 0 3 3 6 6 5 8 " " B E A R I N G " " R s . 0 . 0 0 " " R s . 0 . 0 0 "
    " 0 0 0 0 8 5 6 7 0 7 " " P L U G " " R s . 0 . 0 0 " " R s . 0 . 0 0 "
    This is definitely an "External Table with Flatfile Moved Across Platforms" issue.
    When I opened the .csv file in Excel and saved it as tab delimited, it worked fine.
    But I do not know the platform of the data file.
    How can I find out the CHARACTERSET of the data file?
    Log file
    Field Definitions for table PROTXT
    Record format DELIMITED, delimited by
    Data in file has same endianness as the platform
    Rows with all null fields are accepted
    Fields in Data Source:
    PRO_CODE CHAR (255)
    Terminated by "9"
    Enclosed by """ and """
    Trim whitespace same as SQL Loader
    PRO_DESC CHAR (255)
    Terminated by "9"
    Enclosed by """ and """
    Trim whitespace same as SQL Loader
    PUR_PRICE CHAR (255)
    Terminated by "9"
    Enclosed by """ and """
    Trim whitespace same as SQL Loader
    SALE_PRICE CHAR (255)
    Terminated by "9"
    Enclosed by """ and """
    Trim whitespace same as SQL Loader
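    The spaces between every character usually mean the file is in a two-byte encoding (for example Excel's "Unicode Text" output, i.e. UTF-16), so the NEWLINE delimiter and the quote characters are not being recognised in the database character set. A sketch of access parameters to try, clearly an assumption until the file's real encoding is confirmed:
    ACCESS PARAMETERS (
      records delimited by NEWLINE
      CHARACTERSET 'UTF16'
      READSIZE 1048576
      SKIP 1
      badfile BAD_DIR:'proTxt%a_%p.bad'
      logfile LOG_DIR:'proTxt%a_%p.log'
      fields terminated by X'9' OPTIONALLY ENCLOSED BY '"' AND '"'
      missing field values are null
      ( PRO_CODE, PRO_DESC, PUR_PRICE, SALE_PRICE )
    )
    CHARACTERSET tells ORACLE_LOADER how to decode the file, and READSIZE raises the record buffer above the 524288-byte default reported in KUP-04020. Alternatively, re-save the file as plain tab-delimited text (which you already found works) and keep your original parameters.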

  • Export data for SQL Loader

    I have a table with the following 3 columns
    Help_number Number(8,0)
    Title       Varchar(100 Byte)
    Description Varchar (100 Byte)
    I would like to export all the data and import it into another table in another database. I'm using SQL Developer to export the data. I choose the "LOADER" option, but when the data is exported the format is wrong. Here is an example of the exported data:
    "1","Error","Error 5343 - Input not recognised"
    The problem I have is that the first column is being exported in double quotes even though it is of type NUMBER. When I try to load this using sqlldr it gets rejected because it is a string.
    The other problem I have is that SQL Developer does not export all the rows if a table is big. I tried to export a table with 23000 rows and it only exported the first 55 rows.
    Any help will be appreciated.

    I am able to replicate the quotes issue and have logged bug #6732587.
    I have also logged a bug for the number of rows; however, if you press Ctrl-End and then export, you'll get all the rows. Also, if you do not want to query back all the rows but still want to export them all, in the Export dialog just click the "Where" clause tab and then Apply. This will also bring back all the rows. This bug is not only for Loader, but for any export format.
    Sue
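    As a sqlldr-side workaround for the quoted NUMBER column, declaring the quotes as optional enclosure makes SQL*Loader strip them before the number conversion, so the file can be loaded exactly as exported. A sketch, with the table and file names assumed:
    LOAD DATA
    INFILE 'help.dat'
    APPEND INTO TABLE help_topics
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    ( help_number,
      title,
      description
    )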

  • Multiple table format through email using powershell

    Hi All,
    I have a PowerShell script which executes a SQL query on three SQL instances and emails the result in table format. The output email contains all the results of the query in a single table. Please help me; I have provided the code
    which I am using below.
    Sample output format which I am getting:
    ServerInstance  Databasename  EnabledStatus
    Instance1       Database1     Enable
    Instance1       Database2     Enable
    Instance1       Database3     Enable
    Instance2       Database1     Enable
    Instance2       Database2     Enable
    My requirement is that I should get two table-formatted emails like below:
    Database status of Instance 1
    ServerInstance  Databasename  EnabledStatus
    Instance1       Database1     Enable
    Instance1       Database2     Enable
    Instance1       Database3     Enable
    Database status of Instance 2
    ServerInstance  Databasename  EnabledStatus
    Instance2       Database1     Enable
    Instance2       Database2     Enable
    #This PowerShell script is compatible with PowerShell V3.0
    #import SQL Server module
    #Import-Module SQLPS -DisableNameChecking
    #get all the instances and temporarily store them in a variable
    $ServerInstances = Get-Content "C:\SQL_Servers.txt"
    $scriptFile = "C:\restoredetails_mountdrive.sql"
    $a = "Hi All, <BR> <BR>"
    $a = $a + "Below is the TESTING Environment. This is an auto-generated mail.<BR><BR>"
    $a = $a + "<style>"
    $a = $a + "BODY{background-color:white;}"
    $a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
    $a = $a + "TH{border-width: 0px;width:150%;cellspacing=0 ;padding: 10px;border-style: solid;border-color: black;background-color:#43B2B2;font-family: Verdana;font-size:13 }"
    $a = $a + "TD{border-width: 0px;width:150%;cellspacing=3 ;padding: 10px;border-style: solid;border-color: black;text-align: left;background-color:white;font-family: Verdana;font-size:11}"
    $a = $a + "</style>"
    #the database we want to execute it against, regardless of the instance
    $DBName = "master"
    #iterating through all instances.
    $ServerInstances |
    ForEach-Object {
    #For each instance, we create a new SMO server object
    $ServerObject = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $_
    #use the Invoke-Sqlcmd cmdlet to execute the query
    #we are passing in the pipeline is the instance name, which is $_
    $refresh_output1 = $refresh_output1 + (Invoke-Sqlcmd `
    -ServerInstance $_ `
    -Database $DBName `
    -InputFile $scriptFile)
    #-Query $SQLQuery
    }
    #after the loop, convert the accumulated rows to a single HTML table and mail it once
    [string]$tst = $refresh_output1 | ConvertTo-Html -Head $a -Property InstanceName, DatabaseName, OverallStatus | Out-String
    write-output " "
    [System.Net.Mail.MailMessage]$message = New-Object System.Net.Mail.MailMessage("emailid.com", "toemailid.com", "Subject", $tst)
    [System.Net.Mail.SmtpClient]$client = New-Object System.Net.Mail.SmtpClient("smtpserver",25)
    $message.IsBodyHtml = $true
    $client.Timeout = 100
    $client.Send($message)

    Generally it's best to post in the Hey Scripting Guy forum, they are scarily good in there. Someday I hope to give an answer so perfect that not even jrv can improve on it.
    Your approach might be possible but it's not the way I'd do it. ConvertTo-Html is pretty clever; it works well with arrays of objects. If you load each result into a custom PSObject and then add that to an array for later processing, you can
    get the table formatting almost for free.
    I haven't worked with SQL queries in a bit, but this might work; it seems OK when I put token results in for the SQL result.
    #This PowerShell script is compatible with PowerShell V3.0
    #import SQL Server module
    #Import-Module SQLPS -DisableNameChecking
    #get all the instances and temporarily store them in a variable
    $ServerInstances = Get-Content "C:\SQL_Servers.txt"
    $scriptFile = "C:\restoredetails_mountdrive.sql"
    $a = "Hi All, <BR> <BR>"
    $a = $a + "Below is the TESTING Environment. This is an auto-generated mail.<BR><BR>"
    $a = $a + "<style>"
    $a = $a + "BODY{background-color:white;}"
    $a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
    $a = $a + "TH{border-width: 0px;width:150%;cellspacing=0 ;padding: 10px;border-style: solid;border-color: black;background-color:#43B2B2;font-family: Verdana;font-size:13 }"
    $a = $a + "TD{border-width: 0px;width:150%;cellspacing=3 ;padding: 10px;border-style: solid;border-color: black;text-align: left;background-color:white;font-family: Verdana;font-size:11}"
    $a = $a + "</style>"
    #the database we want to execute it against, regardless of the instance
    $DBName = "master"
    #Create an empty object collection
    $objectCollection = @()
    #iterating through all instances.
    $ServerInstances |
    ForEach-Object {
    #For each instance, we create a new SMO server object
    $ServerObject = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $_
    #use the Invoke-Sqlcmd cmdlet to execute the query
    #we are passing in the pipeline is the instance name, which is $_
    $SQLResult = (Invoke-Sqlcmd `
    -ServerInstance $_ `
    -Database $DBName `
    -InputFile $scriptFile)
    #-Query $SQLQuery
    #load each result into a custom PSObject and add it to the collection
    $objectCollection += New-Object -TypeName PSObject -Property @{
    "InstanceName" = $_ ;
    "DatabaseName" = $DBName ;
    "OverallStatus" = $SQLResult["OverallStatus"]
    }
    }
    #after the loop, render the whole collection as one HTML table and mail it
    [string]$body = $objectCollection | ConvertTo-Html -Head $a -Property InstanceName, DatabaseName, OverallStatus | Out-String
    [System.Net.Mail.MailMessage]$message = New-Object System.Net.Mail.MailMessage("emailid.com", "toemailid.com", "Subject", $body)
    [System.Net.Mail.SmtpClient]$client = New-Object System.Net.Mail.SmtpClient("smtpserver",25)
    $message.IsBodyHtml = $true
    $client.Timeout = 100
    $client.Send($message)

  • Loading multiple tables with SQL Loader

    Hi,
    I want to load multiple tables from a single data file using SQL Loader.
    Here's the basic idea of what I want. Let's say I have two tables, table T1
    and table T2:
    SQL> desc T1;
    COL1 VARCHAR2(20)
    COL2 VARCHAR2(20)
    SQL> desc T2;
    COL1 VARCHAR2(20)
    COL2 VARCHAR2(20)
    COL3 VARCHAR2(20)
    My data file, test.dat, looks like this:
    AAA|KBA
    BBR|BBCC|CCC
    NNN|BBBN|NNA
    I want to load the first record into T1, and the second and third records into T2. How do I set up my control file to do that?
    Thanks!

    Tough Job
    LOAD DATA
    truncate
    INTO table t1
    when col3 = 'dummy'
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    (col1,col2,col3 filler char nullif col3='dummy')
    INTO table t2
    when col3 != 'dummy'
    FIELDS TERMINATED BY '|'
    (col1,col2,col3 nullif col3='dummy')
    This will load table t2 but not t1.
    The t1 FILLER col3 is not accepting NULLIF; it is difficult to compare columns that are null using a WHEN condition. If I find something I will let you know.
    Can you separate the records into 2 files? A UNIX command could separate the 2-column and 3-column record types for you, and then you can run 2 control files on them, as in the sketch below.
    Thanks,
    http://www.askyogesh.com
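    Following the separate-the-records suggestion, a sketch of the UNIX side (assuming the pipe delimiter and the test.dat name from the question; the scott/tiger login is a placeholder):
    awk -F'|' 'NF == 2' test.dat > t1.dat
    awk -F'|' 'NF == 3' test.dat > t2.dat

    sqlldr scott/tiger control=t1.ctl data=t1.dat log=t1.log
    sqlldr scott/tiger control=t2.ctl data=t2.dat log=t2.log
    Each control file then only needs a plain field list for its own table, with no WHEN clause or dummy comparison at all.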

  • Importing to an Oracle Table from SQL Loader Fails

    Hi ,
    When I try to upload an XML file from my server to my table in the Oracle server using SQL*Loader, it fails at times. Sometimes it works perfectly.
    This is a daily process which automatically loads data into Oracle.
    Please find the error log :
    SQL*Loader: Release 10.2.0.4.0 - Production on Thu Dec 5 04:07:32 2013
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    Control File:   xmlFeedDelta.ctl
    Data File:      xmlFileNames_Delta.txt
      Bad File:     xmlFileNames_Delta.bad
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 1000
    Bind array:     50000 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table XMLFEEDDELTA, loaded from every logical record.
    Insert option in effect for this table: APPEND
       Column Name                  Position   Len  Term Encl Datatype
    FILENAME                            FIRST  4000   ,       CHARACTER           
    FILECONTENT                       DERIVED     *  EOF      CHARACTER           
        Dynamic LOBFILE.  Filename in field FILENAME
    value used for ROWS parameter changed from 50000 to 63
    SQL*Loader-643: error executing INSERT statement for table XMLFEEDDELTA
    ORA-03113: end-of-file on communication channel
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    Table XMLFEEDDELTA:
      0 Rows successfully loaded.
      0 Rows not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    Space allocated for bind array:                 252378 bytes(63 rows)
    Read   buffer bytes: 1048576
    Total logical records skipped:          0
    Total logical records read:             1
    Total logical records rejected:         0
    Total logical records discarded:        0
    Run began on Thu Dec 05 04:07:32 2013
    Run ended on Thu Dec 05 04:08:42 2013
    Elapsed time was:     00:01:10.05
    CPU time was:         00:00:00.28
    My control file looks like this:
    LOAD DATA
    INFILE xmlFileNames_Delta.txt
    INTO TABLE xmlFeedDelta APPEND
    fields terminated by ','
    ( filename CHAR(4000),
      filecontent LOBFILE(filename) terminated by eof
    )
    My Database version :
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    PL/SQL Release 11.2.0.2.0 - Production
    "CORE 11.2.0.2.0 Production"
    TNS for Linux: Version 11.2.0.2.0 - Production
    NLSRTL Version 11.2.0.2.0 - Production
    I am not sure why this is happening at times. Any help would be appreciated.

    Hi,
    have you tried with the FILLER keyword, like
    LOAD DATA
    INFILE xmlFileNames_Delta.txt
    INTO TABLE xmlFeedDelta APPEND
    fields terminated by ','
    ( filename FILLER CHAR(4000),
      filecontent LOBFILE(filename) terminated by eof
    )

  • Sql loader using position and functions

    Hi all, I need help loading some data into my table using SQL*Loader. Consider the following:
    CREATE TABLE er (
      a1 NUMBER,
      a2 NUMBER,
      a3 VARCHAR2(100),
      a4 VARCHAR2(100),
      a5 VARCHAR2(100),
      a6 VARCHAR2(100),
      a7 VARCHAR2(100),
      a8 VARCHAR2(100)
    );
    OPTIONS (BINDSIZE=20548000, READSIZE=20548000, STREAMSIZE=20548000, DATE_CACHE=25000,  SKIP=0)
    LOAD DATA
    INTO TABLE er
    APPEND    
    TRAILING NULLCOLS
    ( a1            POSITION(0001:0021)               ,
      a2            POSITION(0022:0042)       "DECODE(SUBSTR(:a2,1,3),'***',NULL,:a2)"      ,
      a3            POSITION(0043:0053)       ,
      a4            POSITION(0054:0064)          ,
      a5            POSITION(0065:0075)           ,
      a6            POSITION(0076:0086)       ,
      a7            POSITION(0087:0093)      "DECODE(SUBSTR(:a7,1,3),'***',NULL,:a7)"
    )
    BEGIN
                     0.00 ******************** X          X          X          *X          ****
    END;
    If you look at the data, some fields have a lot of * and some have a few, such as ****. I want to load this data into a table and, when a field contains all * as its value, set it to null. If a field contains a * plus alphanumeric characters, that value should be loaded as it is.
    In the example above, ******************** should be set to null and **** should also be set to null. Notice that there is a field with X; since this field contains alphanumeric characters, it should be loaded into the table as is. The only time a field should be set to null is when the value contains all *.
    Somebody in this forum suggested using DECODE, but it does not seem to work and I get an error when it reads the second field and tries to insert into the a2 NUMBER column.
    Is there any way to use a regular expression to find out if a field contains all *? I also want to trim each field, since they might contain leading spaces.
    Can someone help with this using the SQL*Loader ctl and data above?

    You can include a regular expression in your SQL*Loader control file.
    An example can be found here:
    http://www.morganslibrary.org/reference/sqlloader.html
    Demos 7 and 8 use the UPPER and DECODE functions to illustrate how to do it.
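    For the all-asterisks case specifically, the field clause can carry an ordinary SQL expression, so REGEXP_LIKE plus TRIM covers both requests at once. A sketch against the a2 and a7 positions from the control file above (apply the same pattern to any other field that needs it):
    a2 POSITION(0022:0042) "CASE WHEN REGEXP_LIKE(TRIM(:a2), '^\*+$') THEN NULL ELSE TRIM(:a2) END",
    a7 POSITION(0087:0093) "CASE WHEN REGEXP_LIKE(TRIM(:a7), '^\*+$') THEN NULL ELSE TRIM(:a7) END"
    TRIM removes the leading and trailing spaces, and the expression returns NULL only when the trimmed value is nothing but asterisks, so mixed values such as *X are loaded as-is.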

  • Insert data file name into table from sql loader

    Hi All,
    I have a requirement to insert the data file name dynamically into a table using SQL*Loader.
    Example:
    sqlldr userid=username/password@host_string control=test_ctl.ctl data=test_data.dat
    test_ctl.ctl
    LOAD DATA
    INTO TABLE test
    FIELDS TERMINATED BY ','
    (empid number,
    ename varchar2(20),
    file_name varchar2(20) ---------- This should be the data file name which can be dynamic (coming from a parameter)
    )
    test_data.dat
    1,test
    2,hello
    3,world
    4,end
    Please help..
    Thanks in advance.
    Regards
    Anuj

    you'll probably have to write your control file on the fly, using a .bat or .sh file
    rem ===== file : test.bat ========
    rem
    rem ============== in pseudo speak =============
    rem
    rem
    echo LOAD DATA > test.ctl
    echo INTO TABLE test >> test.ctl
    echo FIELDS TERMINATED BY ',' >> test.ctl
    echo (empid number, >> test.ctl
    echo ename varchar2(20), >> test.ctl
    echo file_name constant '%1' >> test.ctl
    echo ) >> test.ctl
    rem
    rem
    rem
    sqlldr userid=username/password@host_string control=test.ctl data=test_data.dat
    rem =============== end of file test.bat =======================
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_field_list.htm#i1008664

  • Sql loader using sql statement

    Hi All ,
    May I know if there is any way for SQL*Loader to use a SQL statement, e.g. "select a from test",
    to import data into another table?
    If you have any sample or link, please let me know. Thanks a lot.

    If you are using the database steps, the answer is yes. I've done it by creating a data source that uses the Microsoft Text Driver in the ODBC administrator. The Open Database step is then configured like any other ODBC database. In the .csv file, I have the first line define the column names (i.e. column1, column2) and the SQL statement I use is "SELECT * FROM junk.txt". I haven't tried anything with a WHERE clause but that should work as well.

  • Sql*loader using JSP

    What is the way of using sqlldr in a JSP script to load data into an Oracle DB from a CSV file?
    I know how to run sqlldr from the command line.
    Can we issue the sqlldr command to the system from JSP, and if yes, how?

    SQL*Loader is a command-line utility. In theory, since Java can call out to the operating system, you could invoke SQL*Loader on the application server to load data into Oracle. It would probably be more appropriate, though, to copy the CSV to the database server and make use of an external table in the database to do the load.
    Justin
