SQL*Loader bad file

I am calling SQL*Loader from a Unix script. In the Unix script I set the bad file as
BADFILE="/web/local/orderlink/linkload/logs/full_prc_stg_bad.log"
and in the CTL file I have not mentioned any bad file path. I am loading 30 flat files in that CTL file, like:
LOAD DATA
INFILE '/shared/export/prc_stg1.dat'
INFILE '/shared/export/prc_stg2.dat'
INFILE '/shared/export/prc_stg30.dat'
INTO TABLE L_DRUG_ITEM_PRC_STG_SNGL

but when I run the script, it creates log files like:
Data File:      /shared/export/prfl1.dat
  Bad File:     /web/local/orderlink/linkload/logs/full_profile_load_bad.log
  Discard File:  none specified
(Allow all discards)
Data File:      /shared/export/prfl2.dat
  Bad File:     /web/local/orderlink/linkload/sql_ctl/prfl2.bad
  Discard File:  none specified
(Allow all discards)
Data File:      /shared/export/prfl3.dat
  Bad File:     /web/local/orderlink/linkload/sql_ctl/prfl3.bad
  Discard File:  none specified
(Allow all discards)
Data File:      /shared/export/prfl4.dat
  Bad File:     /web/local/orderlink/linkload/sql_ctl/prfl4.bad
  Discard File:  none specified

For the first file, the bad file is created at the location specified in the Unix script, but for the remaining 29 files it is created in the directory where the CTL file is kept. How can I change this so that every bad file goes to the folder mentioned in the Unix script, as it does for the first file?

For the bad file give the extension *.bad and for the log give *.log.
One thing I want to know:
Have you given privileges to all the files in Unix?
Thanks,
Neerav
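
Not from the replies above, but a possible fix: per the SQL*Loader documentation, a bad file named on the command line is associated with the first INFILE only (which matches the log output above), while each INFILE can carry its own BADFILE clause in the control file. A hedged sketch that generates such a control file from the calling script, reusing the paths from the question and leaving the field list and credentials as placeholders:

#!/bin/sh
# Sketch only: give every INFILE its own BADFILE so all 30 bad files land in
# the logs directory used by the Unix script (field list and userid are placeholders).
BADDIR=/web/local/orderlink/linkload/logs
CTL=full_prc_stg.ctl

echo "LOAD DATA" > $CTL
i=1
while [ $i -le 30 ]
do
    echo "INFILE  '/shared/export/prc_stg${i}.dat'"  >> $CTL
    echo "BADFILE '${BADDIR}/prc_stg${i}.bad'"       >> $CTL
    i=`expr $i + 1`
done
echo "INTO TABLE L_DRUG_ITEM_PRC_STG_SNGL"           >> $CTL
echo "-- field list as in the existing CTL file"     >> $CTL

sqlldr userid=user/pswd control=$CTL log=${BADDIR}/full_prc_stg.log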

Similar Messages

  • SQL Loader Bad File - Line Number

    Is there a way to get the line numbers of the records in the bad file or discard file?
    Thanks,

    Hi,
    I think you get the line numbers of the rows that failed to load in the log file. The failing rows are the ones you will find in the bad file.
    I don't know if the log contains the line numbers of discarded rows.
    Liron Amitzi
    Senior DBA consultant
    [www.dbsnaps.com]
    [www.obiumsoftware.com]

  • Passing file name dynamically to sql loader ctl file

    Hi,
    I am new to scripting and I have a complex requirement involving writing a script.
    Requierment:
    I need to upload a CSV file from an FTP server into an Oracle table using SQL*Loader. The file name looks like APF0912312359.csv, which represents 31-DEC-2009 23:59, and there will be multiple files on the same day, distinguished by the timestamp in the file name. Once I load a file using SQL*Loader, I need to move it to another directory for archival.
    Since the SQL*Loader CTL file has a fixed file name, I would like to know how I can pass the file name dynamically to the CTL file so that it loads a new file every time it runs.
    Any help is greatly appreciated..
    Bye
    Sudheer
    Edited by: user2138419 on Oct 28, 2009 4:08 PM

    I agree with Pradeep in regards to declaring the file names on the command line. However, I do have a concern regarding presenting the password on the command line as any user that can issue a ps (assuming Unix ~= Linux here) would be able to read it while the sqlldr command is running.
    Unfortunately, you may not always have the option of declaring the files on the command line. I was recently challenged with this in a Windows 2003 Server environment running SQL*Loader: Release 10.2.0.1.0. In this environment I found that I am able to set a variable file name in a calling batch file and use that value in the control file successfully. Just to demonstrate the approach:
    Batch file:
    set IN_FILE='c:\inbound\load_me.txt'
    sqlldr user/pswd@db PARFILE='c:\parameters.txt'
    Parameter file:
    errors=500000
    rows=50000
    control=%CTL_FILE%
    bad=%BAD_FILE%
    discard=%DSC_FILE%
    log=%LOG_FILE%
    Control file:
    LOAD DATA
    INFILE '%IN_FILE%'
    INSERT
    INTO TABLE table_to_be_loaded
    I'm really interested to see if this would work on Unix.
    -Luke
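
    For what it's worth, a hedged Unix sketch of the same idea (all paths and names here are hypothetical, not from the thread): pass the data, bad and log file names on the sqlldr command line from the shell, and keep the credentials in a parameter file so the password never shows up in ps output. The sketch assumes the control file leaves the data file to the command line (no hard-coded INFILE).

    #!/bin/sh
    # Sketch only: load one incoming CSV and archive it afterwards.
    IN_FILE=/inbound/APF0912312359.csv
    BASE=`basename "$IN_FILE" .csv`

    sqlldr parfile=/secure/apf_load.par \
           control=/scripts/apf_load.ctl \
           data="$IN_FILE" \
           bad="/logs/${BASE}.bad" \
           log="/logs/${BASE}.log"

    # move the processed file out of the inbound directory
    mv "$IN_FILE" /inbound/archive/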

  • Create sql loader data file dynamically

    Hi,
    I want a sample program/approach for creating a SQL*Loader data file.
    The program will read a table name as input and will use a SELECT statement
    whose column list is derived from USER_TAB_COLUMNS in the data dictionary,
    assuming multiple CLOB columns in the column list.
    Thanks
    Manoj

    I'm writing CLOB and other columns to a SQL*Loader dat file.
    The sample code below for writing a CLOB column is giving a file write error.
    How can I write multiple CLOBs to the dat file so that the control file will handle them correctly?
    -- fragment as posted (declarations and loop shown together); l_file_handle
    -- and l_rec_type are declared elsewhere in the program
    offset     NUMBER := 1;
    chunk      VARCHAR2(32000);
    chunk_size NUMBER := 32000;

    WHILE offset < dbms_lob.getlength(l_rec_type.narrative)
    LOOP
      chunk := dbms_lob.substr(l_rec_type.narrative, chunk_size, offset);
      utl_file.put(l_file_handle, chunk);
      utl_file.fflush(l_file_handle);
      offset := offset + chunk_size;
    END LOOP;
    utl_file.new_line(l_file_handle);
    -- note: UTL_FILE enforces MAX_LINESIZE (at most 32767) per line, so the handle
    -- must be opened with a MAX_LINESIZE large enough for 32000-byte chunks, and a
    -- long CLOB written with PUT and no intervening NEW_LINE can still exceed it,
    -- which is a likely cause of the write error described above

  • How to call sql loader ctrl file with in the pl/sql procedure

    Hi Friends,
    I am doing a project that transfers data using queues. From the queue I get tab-delimited data in the form of a CLOB variable/message, and I want to store that data in an Oracle table.
    When loading the data into the table:
    1. I don't want to write the data to a file (I want to use it directly after dequeueing it from the specific queue).
    2. As the data is tab-delimited, I want to use the SQL*Loader concept.
    How do I call the SQL*Loader control file from within my PL/SQL procedure? When I searched, most of the forums recommended an external procedure or a Java program.
    Please guide me on this issue. My preference is PL/SQL, but I don't know about external procedures. If there is no other way, I will try Java.
    I am using oracle 9.2.0.8.0.
    Thanks in advance,
    Vimal..

    Neither SQL*Loader nor external tables are designed to read data from a CLOB stored in the database. They both work on files stored on the file system. If you don't want the data to be written to a file, you're going to have to roll your own parsing code. This is certainly possible. But it is going to be far less efficient than either SQL*Loader or external tables. And it's likely to involve quite a bit more code.
    The simplest possible thing that could work would be to use something like Tom Kyte's string tokenization package to read a line from the CLOB, break it into the component pieces, and then store the different tokens in a meaningful collection (i.e. an object type or a record type that corresponds to the table definition). Of course, you'll need to handle things like converting strings to numbers or dates, rejecting rows, writing log files, etc.
    Justin

  • Parameters / Variables in SQL Loader Control File

    Here's my problem: I have created a SQL*Loader control file (ctl) to load a file of employees. This loader program is registered in Oracle Applications. What I want is to pass a parameter derived from the profile, such as the user name of whoever executed the loader program, so that the user name is included in the loaded data.
    example
    input data
    123456,John,Smith
    654321,Jane,Doe
    loaded data
    123456,John,Smith,03-JUL-2009,USER13
    654321,Jane,Doe,03-JUL-2009,USER13
    The sysdate is easy because it can be defined in the column definition, but what I really want to achieve is a parameter passed to the ctl file.
    Thanks in advance.

    Dear user!
    Please have a look at this thread.
    {thread:id=915277}
    In the last three posts I explain how to use a shell script with AWK and SED to pass parameters to a control file.
    Please feel free to post again if you have some questions regarding my explanatory notes.
    Yours sincerely
    Florian W.
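
    Not part of Florian's posts, but a minimal sketch of the shell/sed template approach he describes (the token, file names and column below are hypothetical): keep a control-file template containing a placeholder, substitute the real value in the calling script, and load with the generated control file.

    #!/bin/sh
    # Sketch only: employees_template.ctl is assumed to contain a line such as
    #   loaded_by CONSTANT "USER_NAME_TOKEN"
    LOAD_USER=$1        # e.g. USER13, passed in by the caller

    sed "s/USER_NAME_TOKEN/${LOAD_USER}/g" employees_template.ctl > employees.ctl
    sqlldr userid=user/pswd control=employees.ctl data=employees.csv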

  • Define variable in SQL Loader Control File

    Hi,
    I have an input file where the first line is a header record, followed by the detail records. I do not need to store the fields of the header record, but I need a date field from this header record and want to store it as part of each detail record in the Oracle table.
    Is it possible to define a variable within the SQL*Loader control file to hold the value I need in memory and then use it when SQL*Loader does the insert?
    Thanks for any advice.

    Not sure that you can. But if you're on Unix/Linux/Mac it's easy enough to write a shell script that populates the variables in a template file which you then use as the ctl file, as sketched below. The Perl Template Toolkit could be an option for that as well.
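
    A hedged sketch of that template idea for this particular case (input.dat, the token and the template contents are hypothetical): pull the date out of the header line, substitute it into a control-file template, and skip the header record itself with SKIP=1.

    #!/bin/sh
    # Sketch only: assume the header is the first line of input.dat and its date
    # is the second comma-separated field; detail_template.ctl is assumed to
    # contain a line such as
    #   header_date CONSTANT "HEADER_DATE_TOKEN"
    HDR_DATE=`head -1 input.dat | cut -d',' -f2`

    sed "s/HEADER_DATE_TOKEN/${HDR_DATE}/" detail_template.ctl > detail.ctl
    sqlldr userid=user/pswd control=detail.ctl data=input.dat skip=1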

  • Way to generate sql*loader ctl file from a table?

    I'm an Oracle newbie (Oracle 8i, HP-UX). Is there any way to take an existing table
    description and generate a SQL*Loader control file from it? If anyone has already written a procedure or knows where one can be found I'd really appreciate it. We're doing a mass conversion to Oracle and would like an easy way to re-write all our loads.
    Thanks! Eileen from Polaroid.

    Hi,
    I have a shell program which gets the column names from the data dictionary, creates a temporary control file, and calls the sqlldr executable to load the data automatically.
    You can customise this file, according to your needs.
    Shell Program
    if [ $# -ne 2 ]
    then
    echo " Usage : $0 table_name flat file name"
    exit
    fi
    #assigning envir. variable to unix variables
    table_name=$1
    flat_file=$2
    #creating the control file
    echo "LOAD DATA" > $table_name.ctl
    echo "INFILE '$flat_file'" >> $table_name.ctl
    echo "into table $table_name " >> $table_name.ctl
    echo "fields terminated by '\t'" >> $table_name.ctl
    #calling functions for making column name part
    #describing the table and spooling into file
    sqlplus -s $CUST_ORA_USER << sql_block
    spool $table_name.lst
    desc $table_name
    spool off
    sql_block
    # creating suitable file and add the fields into control file
    # cutting the first line (headings)
    tail +3 $table_name.lst > temp
    rm -f $table_name.lst
    k=`wc -l < temp`
    k1=`expr $k - 1`
    #cutting the last line
    head -$k1 temp > tempx
    record_count=`wc -l < tempx`
    counter=1
    echo "(" > wxyz.ctl
    # reading file line by line
    cat tempx | while in_line=`line`
    do
    #cutting the first field
    field_name=`echo $in_line | cut -f1 -d' '`
    #calculating the no of characters
    word_cnt=`echo $in_line | wc -w`
    #calculating count in a line
    if [ $word_cnt = 2 ]
    then
    data_type=`echo $in_line | cut -f2 -d' ' | cut -f1 -d'('`
    if [ "$data_type" = "DATE" ]
    then
    data_fmt="DECODE(LENGTH(LTRIM(RTRIM(:$field_name))),'11',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yyyy'),'9',to_date(ltrim(rtrim(:$field_name)),'dd-mm-yy'),'10',to_date(ltrim(rtr im(:$field_name)),'dd-mon-yy'),'yyyy/mm/dd hh24:mi:ss')"
    elif [ "$data_type" = "CHAR" ]
    then
    data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
    elif [ "$data_type" = "VARCHAR2" ]
    then
    data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
    else
    data_fmt="NVL(:$field_name,0) "
    fi
    else
    data_type=`echo $in_line | cut -f4 -d' ' | cut -f1 -d'('`
    if [ "$data_type" = "DATE" ]
    then
    data_fmt="DECODE(LENGHTH(LTRIM(RTRIM(:$field_name))),'11',to_date(ltrim(rtrim(:$field_name)),'dd-mon-yyyy'),'9',to_date(ltrim(rtrim(:$field_name)),'dd-mm-yy'),'10',to_date(ltrim(rt rim(:$field_name)),'dd-mon-yy'),'yyyy/mm/dd hh24:mi:ss')"
    elif [ "$data_type" = "CHAR" ]
    then
    data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
    elif [ "$data_type" = "VARCHAR2" ]
    then
    data_fmt="NVL(RTRIM(LTRIM(:$field_name)),' ')"
    else
    data_fmt="NVL(:$field_name,0) "
    fi
    fi
    #if last line put );
    #else ,
    if test $record_count -eq $counter
    then
    echo $field_name \"$data_fmt\"");" >> wxyz.ctl
    else
    echo $field_name \"$data_fmt\""," >> wxyz.ctl
    fi
    #counter incrementing for each record
    counter=`expr $counter + 1`
    done
    #removing the file
    rm -f temp tempx
    cat wxyz.ctl >> $table_name.ctl
    rm -f wxyz.ctl
    #calling the SQLLOADER
    sqlldr $CUST_ORA_USER control=$table_name.ctl errors=99999999
    #removing the control file
    rm -f $table_name.ctl

  • Add "Trailing Nullcolls" to sql loader control files generated by OWB

    Hi gurus,
    I've got a problem loading data with SQL*Loader. I need to add the parameter "trailing nullcols" to the SQL*Loader control file generated by OWB. I don't want to do this by saving the script to my hard disk and running it manually, so any ideas where I can put this parameter? I am using OWB 10.1 on a Windows 2000 machine.
    Thanks
    Stephan

    Hi,
    I found the solution to problem.
    1; Select the mapping where you map your source flat file to a table.
    2; Right click on the mapping and select "configure"
    3; Go to Sources and Targets -> YOUR_TABLE_NAME -> SQL*Loader Parameters
    4; Set Trailing Nullcols = true :-)
    Thank you to anyone looking at this problem.
    Greetings
    Stephan

  • Importing already-made SQL*Loader control files in OWB 10g

    Hi all,
    Suppose that I have a certain amount of already hand-made SQL*Loader control files, say 50 or so.
    Each of these control files have a variable quantity of fields.
    I feel really lazy, so I would like to know if it is possible to import directly these control files instead of re-typing them in the design center. That would save me a lot of time.
    Thanks !
    Burgy

    Hi Burgy
    One option is to use the sqlloader option to generate an external table from the SQLLoader control file (search for generate_only in the sqlloader documentation). If you define the external table in the database you can reverse engineer this into OWB.
    Cheers
    David
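
    Not from David's post, but a hedged sketch of the generate_only step he refers to (paths and credentials are placeholders): with EXTERNAL_TABLE=GENERATE_ONLY, SQL*Loader writes the CREATE TABLE ... ORGANIZATION EXTERNAL DDL into the log file instead of loading any data, and that DDL can then be run in the database and the resulting external table reverse engineered into OWB.

    #!/bin/sh
    # Sketch only: emit the external-table DDL for each hand-written control file.
    for ctl in /ctlfiles/*.ctl
    do
        name=`basename "$ctl" .ctl`
        sqlldr userid=user/pswd control="$ctl" \
               external_table=generate_only \
               log="/ctlfiles/${name}_ext.log"
        # the CREATE TABLE ... ORGANIZATION EXTERNAL statement is now in ${name}_ext.log
    done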

  • Problem specifying SQL Loader Log file destination using EM

    Good evening,
    I am following the example given in the 2 Day DBA document chapter 8 section 16.
    In step 5 of 7, EM does not allow me to specify the destination of the SQL Loader log file to be on a mapped network drive.
    The question: Does SQL Loader have a limitation that I am not aware of, that prevents placing the log file on a network share or am I getting this error because of something else I am inadvertently doing wrong ?
    Note: I have placed the DDL, load file data and steps I follow in EM at the bottom of this post to facilitate reproducing the problem (drive Z is a mapped drive).
    Thank you for your help,
    John.
    DDL (generated using SQL developer, you may want to change the space allocated to be less)
    CREATE TABLE "NICK"."PURCHASE_ORDERS"
        "PO_NUMBER"      NUMBER NOT NULL ENABLE,
        "PO_DESCRIPTION" VARCHAR2(200 BYTE),
        "PO_DATE" DATE NOT NULL ENABLE,
        "PO_VENDOR" NUMBER NOT NULL ENABLE,
        "PO_DATE_RECEIVED" DATE,
        PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      SEGMENT CREATION DEFERRED PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE
        INITIAL 67108864
      TABLESPACE "USERS" ;
    Load.dat file contents
    1, Office Equipment, 25-MAY-2006, 1201, 13-JUN-2006
    2, Computer System, 18-JUN-2006, 1201, 27-JUN-2006
    3, Travel Expense, 26-JUN-2006, 1340, 11-JUL-2006
    Steps I am carrying out in EM
    log in, select data movement -> Load Data from User Files
    Automatically generate control file
    (enter host credentials that work on your machine)
    continue
    Step 1 of 7 ->
      Data file is located on your browser machine
      "Z:\Documentation\Oracle\2DayDBA\Scripts\Load.dat"
       click next
    step 2 of 7 ->
      Table Name
      nick.purchase_orders
      click next
    step 3 of 7 ->
      click next
    step 4 of 7 ->
      click next
    step 5 of 7 ->
      Generate log file where logging information is to be stored
      Z:\Documentation\Oracle\2DayDBA\Scripts\Load.LOG
      Validation Error
      Examine and correct the following errors, then retry the operation:
      LogFile - The directory does not exist.

    Hi John,
    But I didn't get any error when I did the same as you did.
    My Oracle version is 10.2.0.1 on Windows XP. Here is what I did, and it worked:
    1. I created one table in the scott schema:
    SCOTT@orcl> CREATE TABLE "PURCHASE_ORDERS"
      2  (
      3      "PO_NUMBER"      NUMBER NOT NULL ENABLE,
      4      "PO_DESCRIPTION" VARCHAR2(200 BYTE),
      5      "PO_DATE" DATE NOT NULL ENABLE,
      6      "PO_VENDOR" NUMBER NOT NULL ENABLE,
      7      "PO_DATE_RECEIVED" DATE,
      8      PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      9  )
    10  TABLESPACE "USERS";
    Table created.
    I logged into EM: Maintenance --> Data Movement --> Load Data from User Files --> My Host Credentials
    Here there are a total of 3 text boxes:
    1.Server Data File : C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\USERS01.DBF
    2.Data File is Located on Your Browser Machine : z:\load.dat <--- here z:\ is another machine's shared documents folder; I selected this option (radio button) and created the same load.dat as you mentioned.
    3.Temporary File Location : C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\ <--- I didn't change anything here.
    Step 2 of 7 Table Name : scott.PURCHASE_ORDERS
    Step 3 of 7 I just clicked Next
    Step 4 of 7 I just clicked Next
    Step 5 of 7 I just clicked Next
    Step 6 of 7 I just clicked Next
    Step 7 of 7 Here it is Control File Contents:
    LOAD DATA
    APPEND
    INTO TABLE scott.PURCHASE_ORDERS
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (
    PO_NUMBER INTEGER EXTERNAL,
    PO_DESCRIPTION CHAR,
    PO_DATE DATE,
    PO_VENDOR INTEGER EXTERNAL,
    PO_DATE_RECEIVED DATE
    )
    And I just clicked on Submit Job.
    Now I got all 3 rows in purchase_orders:
    SCOTT@orcl> select count(*) from purchase_orders;
      COUNT(*)
             3
    So, there is no bug; it worked. Please retry if you still get any error or issue.
    HTH
    Girish Sharma

  • Skipping fields in SQL*LOADER data file

    I have a data file that has more fields than the target table does. How can I write a SQL*LOADER control file to skip some fields in the middle of the text line?

    If you don't want to define input fields by position, the simplest way I think is to use FILLER fields.
    Quoted from SQL*Loader doc:
    "Specifying Filler Fields
    Filler fields have names but they are not loaded into the table. However, filler fields can be used as arguments to init_specs (for example, NULLIF and DEFAULTIF) as well as to directives (for example, SID, OID, REF, BFILE). Also, filler fields can occur anyplace in the data file. They can be inside of the field list for an object or inside the definition of a VARRAY.
    See SQL*Loader DDL Behavior and Restrictions for more information on filler fields and their use.
    A sample filler field specification looks as follows:
    field_1_count FILLER char,
    Regards,
    Zoltan
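
    Not part of Zoltan's reply, but a minimal illustration of the FILLER approach (table, column and file names are hypothetical): the middle input field is declared as a FILLER so it is parsed but never loaded.

    #!/bin/sh
    # Sketch only: build and run a control file that skips the 2nd of 3 input fields.
    {
        echo "LOAD DATA"
        echo "INFILE 'input.dat'"
        echo "INTO TABLE target_table"
        echo "FIELDS TERMINATED BY ','"
        echo "( col_a       CHAR,"
        echo "  -- the next field is parsed from the data file but not loaded"
        echo "  skipped_col FILLER CHAR,"
        echo "  col_b       CHAR"
        echo ")"
    } > skip_middle.ctl
    sqlldr userid=user/pswd control=skip_middle.ctl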

  • Use of WHEN in SQL Loader Control File

    Does anyone know of a way, using the CTL file below, to NOT LOAD (i.e. discard) any record where field G is null -OR- substr(G,1,3)='CON' (the first 3 characters of field G are "CON")?
    The WHEN clause seems very limited in what it can test and all my attempts to use a compound condition, much less the SUBSTR function, have failed. The documentation indicates that only very simple conditions can be specified.
    (Oracle 10g, Unix)
    Thanks
    LOAD DATA
    TRUNCATE
    INTO TABLE TEST_TABLE
    WHEN (G <> '')
    FIELDS TERMINATED BY "^"
    TRAILING NULLCOLS
    (
    A ,
    B FILLER,
    C FILLER,
    D FILLER,
    E FILLER,
    F FILLER,
    G "LTRIM(:G,'0')",
    H FILLER,
    I FILLER,
    J FILLER,
    K SYSDATE,
    L CONSTANT "SQLLDR"
    )

    Hello MTM.
    The WHEN clause in SQL*Loader, despite what you might read, doesn't seem to evaluate the value of the column you specify after SQL clauses are processed. Therefore, the only way you can guarantee to NOT LOAD a record is by using a function to raise an application error when the column data meets your criteria. You'll need to do some basic comparisons between bad data and discarded data in the bad data file to isolate these.
    You can still discard the null data using WHEN (G <> '').
    Hope this helps,
    Luke
    Please mark the answer as helpful or answered if it is so. If not, provide additional details.
    Always try to provide create table and insert table statements to help the forum members help you better.
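
    Not part of Luke's reply, but a hedged sketch of the raise-an-error idea he describes (the function name, files and table are hypothetical): a small function rejects 'CON' values by raising an application error, so those rows end up in the bad file, while the WHEN clause still discards records where G is empty.

    #!/bin/sh
    # Sketch only: create the helper function once, then reference it from the
    # SQL string for column G in the control file.
    {
        echo "CREATE OR REPLACE FUNCTION reject_con (p_val VARCHAR2) RETURN VARCHAR2 IS"
        echo "BEGIN"
        echo "  IF SUBSTR(p_val, 1, 3) = 'CON' THEN"
        echo "    RAISE_APPLICATION_ERROR(-20001, 'CON records are not loaded');"
        echo "  END IF;"
        echo "  RETURN LTRIM(p_val, '0');"
        echo "END;"
        echo "/"
        echo "EXIT"
    } > reject_con.sql
    sqlplus -s user/pswd @reject_con.sql

    # the G field in the control file would then read (hypothetical):
    #   G "reject_con(:G)",
    sqlldr userid=user/pswd control=test_table.ctl data=test.dat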

  • SQL Loader control file script - INFILE

    Hi,
    The data is stored in the file "c:/test_<yyyy>_<mm>_<dd>". The <yyyy>, <mm> and <dd> have to come from sysdate. The input file is generated on a daily basis.
    LOAD DATA
    INFILE 'c:/test_'||to_char(sysdate,'yyyy')||'_'||to_char(sysdate,'mm')||'_'||to_char(sysdate,'dd').txt'
    BADFILE 'c:/test_' || to_char(sysdate,'yyyy') || '_' || to_char(sysdate,'mm') || '_' || to_char(sysdate,'dd').bad'
    I ran the above ctl from the command prompt and it gives an error: it does not recognize the || symbol as concatenation.
    How can I achieve this?
    Thanks in advance.

    Thanks Hussein.
    I placed the control script in GL_TOP directory in unix OS. I created an executable (for the control script) and scheduled this concurrent program.
    OS :- HP Tru64 UNIX - 5.1b
    Database : 9.2.0.5
    Application: Oracle General Ledger (Ver 11.5.9).
    The error I am getting if I run the command "sqlldr ...." from cmd prompt is:
    sql*Loader: Release 8.1.7.0.0
    Illegal combination of non-alphanumeric characters
    INFILE 'c:/test_' || to_char(sysdate,'yyyy'.......'
    Thanks again.
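
    Not from the replies above, but a common workaround consistent with the error shown (paths, table and columns are hypothetical): sqlldr cannot evaluate TO_CHAR(SYSDATE) or the || operator inside the control file, so build the dated names in the calling script and generate the control file from there.

    #!/bin/sh
    # Sketch only: compute today's stamp in the shell and write a fresh control file.
    STAMP=`date +%Y_%m_%d`            # e.g. 2009_12_31
    {
        echo "LOAD DATA"
        echo "INFILE '/data/test_${STAMP}.txt'"
        echo "BADFILE '/data/test_${STAMP}.bad'"
        echo "INTO TABLE test_table"
        echo "FIELDS TERMINATED BY ','"
        echo "(col1, col2)"
    } > /tmp/test_${STAMP}.ctl

    sqlldr userid=user/pswd control=/tmp/test_${STAMP}.ctl log=/data/test_${STAMP}.log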

  • SQL  LOADER CONTROL FILE

    Hi,
    Can one call procedures and functions inside a control file in SQL*Loader? Alternatively, how can one use CASE inside a control file?
    How do I implement the following code in a control file:
    SELECT CASE
    WHEN INSTR(UPPER(returnname),'SCHEDULE') > 0 THEN 'S'
    WHEN INSTR(UPPER(returnname),'RETURN') > 0 THEN 'R'
    WHEN INSTR(UPPER(RETURNNAME),'BREAKDOWN') > 0 THEN 'B'
    ELSE returnname
    END
    AS returns
    FROM tableexample
    Please let me know ..

    regarding your first question :
    data_file TT.dat :
    SCHEDULE
    RETURN
    BREAKDOWN
    HOLIDAY
    WEEKEND
    schedule
    return
    breakdown
    holiday
    weekend
    control_file TT.ctl :
    load data
    insert
    into table dummy_test
    (TEXT position(01:20) char
    ,TEXT_b position(01:20) char " decode(upper(:TEXT),'SCHEDULE','S','RETURN','R','BREAKDOWN','B',:TEXT) "
    )
    parameter_file TT.par :
    userid=user/password
    DATA=TT.dat
    CONTROL=TT.ctl
    ERRORS=99999
    load file into table :
    sqlldr parfile=TT.par
    log-file TT.log :
    SQL*Loader: Release 8.1.6.2.0 - Production on Thu Feb 26 16:01:23 2004
    (c) Copyright 1999 Oracle Corporation. All rights reserved.
    Control File: TT.ctl
    Data File: TT.dat
    Bad File: TT.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 99999
    Bind array: 64 rows, maximum of 65536 bytes
    Continuation: none specified
    Path used: Conventional
    Table DUMMY_TEST, loaded from every logical record.
    Insert option in effect for this table: INSERT
    Column Name Position Len Term Encl Datatype
    TEXT 1:20 20 CHARACTER
    TEXT_B 1:20 20 CHARACTER
    SQL string for column : " decode(upper(:TEXT),'SCHEDULE','S','RETURN','R','BREAKDOWN','B',:TEXT) "
    Table DUMMY_TEST:
    10 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 2816 bytes(64 rows)
    Space allocated for memory besides bind array: 0 bytes
    Total logical records skipped: 0
    Total logical records read: 10
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on Thu Feb 26 16:01:23 2004
    Run ended on Thu Feb 26 16:01:23 2004
    Elapsed time was: 00:00:00.58
    CPU time was: 00:00:00.10
    SQL > select * from dummy_test ;
    TEXT TEXT_B
    SCHEDULE S
    RETURN R
    BREAKDOWN B
    HOLIDAY HOLIDAY
    WEEKEND WEEKEND
    schedule S
    return R
    breakdown B
    holiday holiday
    weekend weekend
    10 rows selected.
    SQL>
    Does it solve your problem ?
    Regards,
    Rainer
