Running SQL Loader through PL/SQL

Hi All,
Is there a utility package that can be used to run SQL LOADER through PL/SQL?
Regards

External tables are new in 9i.
If you need to call SQL*Loader in 8i, you'd be stuck with the Java stored procedure / external procedure approach. Of course, this might also be an impetus to upgrade, since error correction support for 8.1.7 ends at the end of the year.
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC
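As an illustration of the external-table route Justin mentions, here is a minimal sketch for 9i and later; the directory, file, table, and column names are hypothetical placeholders:

```sql
-- All object names here (emp_dir, emp_ext, emp.csv) are hypothetical.
CREATE DIRECTORY emp_dir AS '/u01/app/data';

CREATE TABLE emp_ext (
  empno NUMBER(4),
  ename VARCHAR2(10),
  sal   NUMBER(7,2)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY emp_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('emp.csv')
);

-- A load is then plain SQL, callable from any PL/SQL block:
INSERT INTO emp SELECT * FROM emp_ext;
```

Unlike SQL*Loader, the data file has to sit on the database server, not on the client.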

Similar Messages

  • Execute SQL Loader through PL/SQL Block

I want to run SQL*Loader through a PL/SQL block. Please give me solutions.

    It's the same as running any other OS command from PL/SQL - you can only do it by using Java. Check out the AskTom site for a good tutorial.
    rgds, APC
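The Java route APC refers to is usually set up along these lines. This is only a sketch: the names OsCmd and os_cmd are made up, the sqlldr path and file names are placeholders, and the schema first needs the relevant java.io permissions granted via DBMS_JAVA.GRANT_PERMISSION before Runtime.exec will work.

```sql
-- 1) A Java stored procedure that shells out to the OS.
CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "OsCmd" AS
public class OsCmd {
  public static void run(String cmd)
      throws java.io.IOException, InterruptedException {
    Process p = Runtime.getRuntime().exec(cmd);
    p.waitFor();
  }
}
/

-- 2) A PL/SQL call specification wrapping it.
CREATE OR REPLACE PROCEDURE os_cmd (p_cmd IN VARCHAR2) AS
LANGUAGE JAVA NAME 'OsCmd.run(java.lang.String)';
/

-- 3) Invoke SQL*Loader (path, credentials and control file are placeholders).
BEGIN
  os_cmd('/u01/app/oracle/bin/sqlldr userid=scott/tiger control=emp.ctl');
END;
/
```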

  • SQL Loader through Forms 6i

    Hi,
    I need to know the steps to run sql loader through forms6i in windows environment.
    Please help me in this regard.

    1. Create a control file for Sql*Loader with all parameters you need
    2. Create a .bat file to run Sql*Loader, e.g. "sqlldr username/password@db control=<file above>"
    3. Use HOST command within Forms 6i to run the .bat file.
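Put together, the three steps might look like this sketch; the file names, paths, table, and connect string are all placeholders:

```sql
-- Step 1: emp.ctl, the SQL*Loader control file
LOAD DATA
INFILE 'emp.dat'
INTO TABLE emp
FIELDS TERMINATED BY ','
(empno, ename, sal)

-- Step 2: load_emp.bat, a single line such as:
--   sqlldr scott/tiger@db control=c:\loader\emp.ctl log=c:\loader\emp.log

-- Step 3: in a Forms 6i trigger, e.g. WHEN-BUTTON-PRESSED:
--   HOST('c:\loader\load_emp.bat', NO_SCREEN);
```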

  • Issues in calling Sql Loader through forms developer (10g)

    Hi,
I am developing a form (in 10g) in which I am calling SQL*Loader to load data into an Oracle database table from an external source (the data file is a .CSV file).
But somehow SQL*Loader is not getting executed.
Here are the environment settings and the approach I am taking:
This is a distributed system; the application server and the database server are mounted on two different machines.
The form is deployed on the application server.
The database table and SQL*Loader are configured on the database side.
I am using the host(<sql loader command>) command to invoke the operating-system command through Forms.
SQL*Loader works fine, and the data loads successfully into the required database table, when I execute the command at the Unix prompt; but through Forms it's not working.
Do I need to change some environment setting to make this work?
Any quick help in this regard is highly appreciated.
    Thanks.

    Hi Craig,
I have already tried the option of calling SQL*Loader through a stored procedure, but this is not working...
Could you please share any examples of how to do so?
Code snippet I am using in Forms:
declare
  usid    varchar2(80)  := get_application_property(username);
  pwd     varchar2(80)  := get_application_property(password);
  db      varchar2(80)  := get_application_property(connect_string);
  msqlldr varchar2(250);
begin
  msqlldr := '/u01/oracle/formss/bin/sqlldr userid=' || usid || '/' || pwd
          || '@' || db
          || ' control=<control file name> data=<data file name>'
          || ' log=<log file name>';
  host(msqlldr);
end;
Note that the sqlldr command, and for that matter any Unix shell command, is not working through the host() command...
Could you please suggest any way out...
    Thanks.

  • Change the mapping generation code from sql*loader to pl/sql

I want to use a mapping with a flat-file operator to generate PL/SQL code, even though such a mapping generates SQL*Loader code by default.
I tried to change the Language generation property of the mapping, but an API8548 error message is shown. The solution suggested by OWB is to change the language generation code in the property inspector of the mapping.
I can't use an external table because I have to work with a remote machine.
What do I have to do to change the generated code from SQL*Loader to PL/SQL?

How about breaking this out into two mappings? In the first mapping, map the flat-file operator to a table using SQL*Loader code. Then define a second mapping using that table as the source, which therefore generates PL/SQL. Then use a process flow to launch the second map after completion of the first.

  • Execute sqlldr command (Sql Loader) in PL/SQL

Hi guys... how can I execute this prompt command via PL/SQL?
"sqlldr *****/******@ericsson control=nrcs_atelecom.CTL rows=100000 direct=true"
I have a lot of CTL files to process for loading data, so I created a PL/SQL routine that loops through all the CTLs, but I don't know how to run SQL*Loader from PL/SQL.

    I can't post a link but search the documentation, you'll find it in the "Utilities" manual.
    Basically...you create tables that point to your files. Then you can SELECT from them like any other table.

  • Problems using SQL*Loader with Oracle SQL Developer

I have been using TOAD and able to import large files (millions of rows of data) in various formats into a table in an Oracle database. My company recently decided not to renew any more TOAD licenses and go with Oracle SQL Developer. The Oracle database is on a corporate server and I access the database via the Oracle client locally on my machine. Oracle SQL Developer and TOAD are local on my desktop and connect through TNSnames on the Windows XP platform. I have no issues using SQL*Loader via the import wizard in TOAD to import the data in these large files into an Oracle table and produce a log file. Loading the same files via SQL*Loader in SQL Developer freezes up my machine, and I cannot get it to produce a log file. Please help!

    I am using SQL Developer version 3.0.04. Yes, I have tried it with a smaller file with no success. What is odd is that the log file is not even created. What is created is a .bat file a control file and a .sh file but no log file. The steps that I take:
1. Right-click on the table I want to import to, or go to Actions
    2. Import Data
    3. Find file to import
    4. Data Preview - All fields entered according to file
    5. Import Method - SQL Loader utility
    6. Column Definitions - Mapped
    7. Options - Directory of files set
    8. Finish
    With the above steps I was not able to import 255 rows of data. No log file was produced so I don't know why it is failing.
    thanks.
    Edited by: user3261987 on Apr 16, 2012 1:23 PM

  • Invoke SQL Loader from PL/SQL Procedure

    hi
    Is there any way to invoke sqlloader from a PL/SQL Procedure ?
    Thanks
    Ashish

'HOST' is not a PL/SQL command, it's a SQL*Plus command.
If you think about it for a moment, being able to run an O.S. command, such as SQL*Loader, from within the database is pretty dangerous. I suspect this is why Oracle has no built-in way of doing it. Otherwise, you'd be able to execute an O.S. command as the user 'oracle' (on UNIX; probably 'SYSTEM' on NT, which is even worse!).
    All it takes is to change this :
    execute '/usr/oracle/bin/sqlldr user/password@DB file=input.txt' ;
    to this :
    execute 'rm -rf /usr/oracle' ;
    and you can start checking the job adverts...
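For what it's worth, from 10g onwards there is a supported way to run an external executable from inside the database: a DBMS_SCHEDULER job of type EXECUTABLE. A minimal sketch; the job name, sqlldr path, credentials, and control file below are placeholders, and the scheduler's external-job support must be configured on the host:

```sql
BEGIN
  DBMS_SCHEDULER.create_job(
    job_name            => 'RUN_SQLLDR',              -- hypothetical name
    job_type            => 'EXECUTABLE',
    job_action          => '/usr/oracle/bin/sqlldr',  -- placeholder path
    number_of_arguments => 2,
    enabled             => FALSE);
  DBMS_SCHEDULER.set_job_argument_value('RUN_SQLLDR', 1, 'userid=scott/tiger@DB');
  DBMS_SCHEDULER.set_job_argument_value('RUN_SQLLDR', 2, 'control=input.ctl');
  DBMS_SCHEDULER.enable('RUN_SQLLDR');
END;
/
```

The security concern above still applies: the executable runs with whatever OS privileges the external-job agent has.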

  • SQL loader and PL/SQL

A text file contains a single row with 450 comma-separated entries. I want to load all of the data into an Oracle table of 450 columns, splitting the comma-separated entries; e.g. "A1,2,3" from the text file goes into the Oracle table as
col1 col2 col3
A1 2 3
but SQL*Loader gives a problem while loading data into more than 250 fields. How can I tackle this problem?
    Please reply soon at [email protected] or [email protected]

That, or you will need to use UTL_FILE in read mode. You will probably want to write a function that parses the commas. There are tons of them floating around here or on AskTom. I am not exactly sure what the maximum line size is that UTL_FILE can read, but it should be a viable option.
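A rough UTL_FILE sketch of that approach; the directory object, file name, and the eventual 450-column INSERT are hypothetical, and production code would also need exception handling for NO_DATA_FOUND and the UTL_FILE exceptions:

```sql
DECLARE
  f    UTL_FILE.FILE_TYPE;
  line VARCHAR2(32767);   -- UTL_FILE can read lines up to 32767 bytes
  item VARCHAR2(4000);
  pos  PLS_INTEGER;
  i    PLS_INTEGER := 0;
BEGIN
  f := UTL_FILE.FOPEN('DATA_DIR', 'wide.txt', 'R', 32767);
  UTL_FILE.GET_LINE(f, line);
  LOOP
    pos := INSTR(line, ',');
    EXIT WHEN pos = 0;
    i    := i + 1;
    item := SUBSTR(line, 1, pos - 1);
    -- bind item as column i of the 450-column INSERT here
    line := SUBSTR(line, pos + 1);
  END LOOP;
  i    := i + 1;           -- the text after the last comma
  item := line;
  UTL_FILE.FCLOSE(f);
END;
/
```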

  • SQL Loader Truncate and SQL TRUNCATE difference

Could anyone let me know the difference between the TRUNCATE used in a SQL*Loader control file and the TRUNCATE command used in SQL? Is there any impact or difference between the two on the data files?
    Thanks

Mr Jens, I think TRUNCATE in a SQL*Loader control file reuses extents, unlike the SQL TRUNCATE command. In my opinion it is best to truncate these tables with the SQL command, so they show their normal storage usage rather than the elevated values.
Could you please comment further?
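The distinction can be spelled out in plain SQL; as far as I know, SQL*Loader's TRUNCATE corresponds to the REUSE STORAGE form:

```sql
-- Deallocates the table's extents (the default for SQL's TRUNCATE):
TRUNCATE TABLE emp DROP STORAGE;

-- Keeps the allocated extents for the reload, which is effectively
-- what the TRUNCATE option in a SQL*Loader control file does:
TRUNCATE TABLE emp REUSE STORAGE;
```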

  • Calling Sql-loader from PL/SQL

    What is the command(s) to call Sql-Loader from inside a PL/SQL procedure?
    Regards,
    Ahmad.

    I don't think it is possible ...

  • Calling SQL Loader through Forms using webutils

hi all,
Can anyone tell me how I can call sqlldr through Forms 9i using WebUtil? Let me know if anyone has a sample script.
    Thanks
    Best Regards,
    Abrar
    [email protected]

    If your middle tier and database server are both unix boxes you could use host(rsh...)
Are you suggesting WebUtil because the data to be loaded is on the PC? If so, you probably want to move it and then load it as two separate operations.
    If there isn't too much data you also have the option of using text_io to load it into a forms database block, which has the advantage that the user can see what is happening, and validation rules could be added if required.

  • Unable to Run SSIS Package Through SQL Agent Job

    Hi,
    I recently upgraded SQL server 2008 R2 to SQL Server 2012. I also upgraded all the packages on the server. The package runs fine from BIDS. However when I try to run the package through the SQL Agent Job it fails with the error below:
    Executed as user: A. Microsoft (R) SQL Server Execute Package Utility  Version 11.0.2100.60 for 64-bit  Copyright (C) Microsoft Corporation. All rights reserved.    Started:  11:43:04 PM  Could not load package "\FolderA\Package.dtsx"
    because of error 0xC00160AE.  Description: Connecting to the Integration Services service on the computer "S2345WE" failed with the following error: "Access is denied."    By default, only administrators have access to the
    Integration Services service. On Windows Vista and later, the process must be running with administrative privileges in order to connect to the Integration Services service. See the help topic for information on how to configure access to the service.  Source:
      Started:  11:43:04 PM  Finished: 11:43:04 PM  Elapsed:  0.016 seconds.  The package could not be loaded.
Using Windows Authentication I am able to log in to Integration Services through SSMS. In the SQL Agent job I am using the package store to execute the package. I have admin permission on the server. The Integration Services service currently runs under my credentials. I am not sure why I am getting this error.
Please advise.
    Thanks,
    EVA05

    Hi ,
    similar thread - http://social.technet.microsoft.com/Forums/en-US/sqlintegrationservices/thread/25e22c7e-bae0-42e4-b86d-2db7a4af519d
    Try this link -
    http://msdn.microsoft.com/en-us/library/dd440760%28v=sql.100%29.aspx
sathya

  • Loading data through pl/sql

Hi friends,
I want to load an .mdb table into Oracle 9i through a PL/SQL procedure.
I am currently doing this through SQL*Loader.
My control file is:
OPTIONS (SKIP=1)
LOAD DATA
INFILE '//empwrpub1/Data_Wrangler/Owens/Original Data/037 Refresh 10-06/SAP/SRM_INVOICE.txt'
INTO TABLE SRM_INVOICE_200610
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
Now I want to do this through a PL/SQL procedure.
    thanks in advance

    Hi,
Two different questions have arisen in this thread:
1. Loading data through SQL*Loader from PL/SQL.
2. Using many DDL and DML statements in the process of data loading.
I will start with the second question:
This is a great question; I myself have faced this problem: first create temporary tables, load the data, update spaces to NULL, apply date formatting, and so on.
This is easily achieved when you load the data using external tables. There you write a script specifying your data file, location, columns, line termination, etc. In that same script you run your DDL statements first, then the data-loading statements, and then any DML.
The whole thing looks much like a PL/SQL block.
So the first question has also been answered, with the condition that you use external tables and then load the data by writing PL/SQL. I have never tried to call SQL*Loader commands directly from PL/SQL, so that looks impossible. If you really want to carry out that activity from a PL/SQL block, write a SQL*Loader script and CTL file, wrap them in a shell program, and call that from the PL/SQL block.
Bye
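A sketch of the external-table flow described above, with hypothetical names throughout (emp_ext is the external table, stg_emp the target): DDL first, then the load as a plain INSERT ... SELECT, then any follow-up DML, all in one PL/SQL block.

```sql
BEGIN
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_emp';   -- DDL needs EXECUTE IMMEDIATE
  INSERT INTO stg_emp SELECT * FROM emp_ext;    -- the "load" step
  UPDATE stg_emp SET ename = TRIM(ename);       -- follow-up DML, e.g. cleanup
  COMMIT;
END;
/
```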

  • Sql loader - BLOB

I have used the OMWB "Generate SQL*Loader script" option and received the SQL*Loader error below.
A previous attempt to use OMWB online loading generated garbage data: the picture did not match the person ID.
    Table in Sql Server..................
    CREATE TABLE [nilesh] (
         [LargeObjectID] [int] NOT NULL ,
         [LargeObject] [image] NULL ,
         [ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectSize] [int] NULL ,
         [VersionControl] [bit] NULL ,
         [WhenLargeObjectLocked] [datetime] NULL ,
         [WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectTimeStamp] [timestamp] NOT NULL ,
         [LargeObjectOID] [uniqueidentifier] NOT NULL
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO
    Table in Oracle..............
CREATE TABLE LARGEOBJECT (
LARGEOBJECTID NUMBER(10) NOT NULL,
LARGEOBJECT BLOB,
CONTENTTYPE VARCHAR2(40 BYTE),
LARGEOBJECTNAME VARCHAR2(255 BYTE),
LARGEOBJECTEXTENSION VARCHAR2(10 BYTE),
LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
LARGEOBJECTSIZE NUMBER(10),
VERSIONCONTROL NUMBER(1),
WHENLARGEOBJECTLOCKED DATE,
WHOLARGEOBJECTLOCKED CHAR(11 BYTE),
LARGEOBJECTTIMESTAMP NUMBER(8) NOT NULL,
LARGEOBJECTOID RAW(16) NOT NULL
)
TABLESPACE USERS
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
LOB (LARGEOBJECT) STORE AS (
TABLESPACE USERS
ENABLE STORAGE IN ROW
CHUNK 8192
PCTVERSION 10
NOCACHE
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
)
NOCACHE
NOPARALLEL
MONITORING;
    Sql Loader script....
    SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
    REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
    REM SET NLS_LANGUAGE=AL32UTF8
    sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
    Sql loader control file......
    load data
    infile 'nilesh.dat' "str '<er>'"
    into table LARGEOBJECT
    fields terminated by '<ec>'
    trailing nullcols
    (LARGEOBJECTID,
    LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
    CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
    LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
    LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
    LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
    LARGEOBJECTSIZE,
    VERSIONCONTROL,
    WHENLARGEOBJECTLOCKED,
    WHOLARGEOBJECTLOCKED,
    LARGEOBJECTTIMESTAMP,
    LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
    Error Received...
    Column Name Position Len Term Encl Datatype
    LARGEOBJECTID FIRST * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECT NEXT ***** CHARACTER
    Maximum field length is 2000000
    Terminator string : '<ec>'
    SQL string for column : "HEXTORAW (:LARGEOBJECT)"
    CONTENTTYPE NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
    LARGEOBJECTNAME NEXT 255 CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
    LARGEOBJECTEXTENSION NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
    LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
    LARGEOBJECTSIZE NEXT * CHARACTER
    Terminator string : '<ec>'
    VERSIONCONTROL NEXT * CHARACTER
    Terminator string : '<ec>'
    WHENLARGEOBJECTLOCKED NEXT * CHARACTER
    Terminator string : '<ec>'
    WHOLARGEOBJECTLOCKED NEXT * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECTTIMESTAMP NEXT * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECTOID NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
    SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
What's the cause?

The previous attempt to use OMWB online loading generated garbage data; the picture was not matching the person ID. This is being worked on (bug 4119713). If you have a reproducible test case please send it in (small test cases seem to work OK).
    I have the following email about BLOBS I could forward to you if I have your email address:
    [The forum may cut the lines in the wrong places]
    Regards,
    Turloch
    Oracle Migration Workbench Team
    Hi,
    This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
    This email outlines a BLOB data move.
There are quite a few steps to complete the task of moving a large BLOB into the Oracle database.
Normally this wouldn't be a problem, but as far as we can tell SQL Server's (and possibly Sybase's) BCP does not reliably export binary data.
The only way to export binary data properly via BCP is to export it in a HEX format.
Once in HEX format it is difficult to get it back to binary during a data load into Oracle.
We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
We then convert the HEX values to binary values and insert them into the BLOB column.
The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
We overcame this problem by writing our own procedure that converts your HEX data to binary piece by piece.
    NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
    The task is split into 4 sub tasks
    1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
    --log into your system schema and create a tablespace
    --Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
    --You may resize this to fit your data ,
    --but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
    --Note: This script may take some time to execute as it has to create a tablespace of 1000Mb.
    -- Change this to suit your customer.
    -- You can change this if you want depending on the size of your data
    -- Remember that we save the data once as CLOB and then as BLOB
    create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
    LOG INTO YOUR TABLE SCHEMA IN ORACLE
    --Modify this script to fit your requirements
    2) START.SQL (this script will do the following tasks)
    a) Modify your current schema so that it can accept HEX data
    b) Modify your current schema so that it can hold that huge amount of data.
    The new tablespace is used; you may want to alter this to your requirements
    c) Disable triggers, indexes & primary keys on tblfiles
    3)DATA MOVE
    The data move now involves moving the HEX data in the .dat files to a CLOB.
    The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
    This is where the HEX values will be stored.
MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS:
load data
    infile '<tablename>.dat' "str '<er>'"
    into table <tablename>
    fields terminated by '<ec>'
    trailing nullcols
    <blob_column>_CLOB CHAR(200000000),
    The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
    RUN sql_loader_script.bat
Log into your schema to check if the data was loaded successfully -- now you can see that the hex values were sent to the CLOB column:
    SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
    LOG INTO YOUR SCHEMA
4) FINISH.SQL (this script will do the following tasks)
a) Creates the procedure needed to perform the CLOB to BLOB transformation
b) Executes the procedure (this may take some time, as 500MB has to be converted to BLOB)
c) Alters the table back to its original form (removes the <blob_column>_clob)
d) Enables the triggers, indexes and primary keys
    Regards,
    (NAME)
    -- START.SQL
    -- Modify this for your particular customer
    -- This should be executed in the user schema in Oracle that contains the table.
    -- DESCRIPTION:
    -- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
    -- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
-- 1) Add an extra column to hold the hex string
alter table <tablename> add (FILEBINARY_CLOB CLOB);
-- 2) Allow the BLOB column to accept NULLs
alter table <tablename> MODIFY FILEBINARY NULL;
-- 3) Disable triggers and sequences on tblfiles
alter trigger <triggername> disable;
alter table tblfiles drop primary key cascade;
drop index <indexname>;
-- 4) Allow the table to use the tablespace
alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
COMMIT;
    -- END OF FILE
    -- FINISH.SQL
    -- Modify this for your particular customer
    -- This should be executed in the table schema in Oracle.
    -- DESCRIPTION:
    -- MOVES THE DATA FROM CLOB TO BLOB
-- MODIFIES THE TABLE BACK TO ITS ORIGINAL SPEC (without a clob)
    -- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
    -- Currently we have the hex values saved as text in the <columnname>_CLOB column
    -- And we have NULL in all rows for the <columnname> column.
    -- We have to get BLOB locators for each row in the BLOB column
    -- put empty blobs in the blob column
    UPDATE <tablename> SET filebinary=EMPTY_BLOB();
    COMMIT;
    -- create the following procedure in your table schema
CREATE OR REPLACE PROCEDURE CLOBTOBLOB
AS
  inputLength  NUMBER;            -- size of the input CLOB
  pieceMaxSize NUMBER := 50;      -- max size of each piece (must be even: hex pairs)
  piece        VARCHAR2(50);      -- these pieces make up the entire CLOB
  currentPlace NUMBER := 1;       -- where we are up to in the CLOB
  blobLoc      BLOB;              -- BLOB locator in the table
  clobLoc      CLOB;              -- CLOB locator; this is the value from the .dat file
  -- THIS HAS TO BE CHANGED FOR SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
  CURSOR cur IS
    SELECT <blob_column>_clob clob_column, <blob_column> blob_column
    FROM <tablename> FOR UPDATE;
  cur_rec cur%ROWTYPE;
BEGIN
  OPEN cur;
  FETCH cur INTO cur_rec;
  WHILE cur%FOUND
  LOOP
    -- retrieve the clobLoc and blobLoc
    clobLoc := cur_rec.clob_column;
    blobLoc := cur_rec.blob_column;
    currentPlace := 1;  -- reset every time
    -- find the length of the CLOB
    inputLength := DBMS_LOB.getLength(clobLoc);
    -- loop through each piece
    LOOP
      -- get the next piece of the CLOB
      piece := DBMS_LOB.subStr(clobLoc, pieceMaxSize, currentPlace);
      -- append this piece to the BLOB (two hex chars become one byte)
      DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
      currentPlace := currentPlace + pieceMaxSize;
      EXIT WHEN inputLength < currentPlace;
    END LOOP;
    FETCH cur INTO cur_rec;
  END LOOP;
  CLOSE cur;
END CLOBtoBLOB;
/
-- now run the procedure
-- It will update the BLOB column with the correct binary representation of the CLOB column
EXEC CLOBtoBLOB;
-- drop the extra CLOB column
    alter table <tablename> drop column <blob_column>_clob;
    -- 2) apply the constraint we removed during the data load
    alter table <tablename> MODIFY FILEBINARY NOT NULL;
    -- Now re enable the triggers,indexs and primary keys
    alter trigger <triggername> enable;
    ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
    CREATE INDEX <index_name> ON TBLFILES ( <column> );
    COMMIT;
    -- END OF FILE
