SQLLDR

Hi,
We are using a 9i client for SQL*Loader to insert into a 10g database on a different machine. Until a few days ago the script that loads data into the remote 10g database through SQL*Loader took about 10 minutes, but since then that script, and the other load scripts as well, have been taking almost twice as long to load.
There have been no hardware or other changes on either node.
Please help.
Anand


Similar Messages

  • Can you use schema name in "INTO TABLE" in sqlldr?

    Hi All,
    I have a simple question.
    My Oracle userid is SBHAT.
    SBHAT has INSERT, DELETE, SELECT, and UPDATE privileges on a table in another schema, XYZ.
    I want to SQL*Load data into table EMPLOYEE in the XYZ schema, using my userid. Something like:
    sqlldr userid=SBHAT/password control=test.ctl data=test.txt
    I tried to use the following in my test.ctl file but it does not work.
    load data
    append
    into table "XYZ.EMPLOYEE"
    fields terminated by ',' optionally enclosed by '"'
    trailing nullcols
    Can someone give me the proper syntax for INTO TABLE that uses the schema.table_name construct.
    Thanks,
    Suresh

    Please post exact OS and database versions - what do you get when you execute SQL*Loader with the syntax you have identified so far?
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#i1005623
    HTH
    Srini
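    For reference, a minimal control file that loads into a table in another schema usually looks like the sketch below (assuming SBHAT has the INSERT privilege on XYZ.EMPLOYEE; the column names are placeholders). The schema-qualified name goes straight after INTO TABLE, either unquoted or with each part quoted separately - quoting the whole "XYZ.EMPLOYEE" as a single identifier may make the loader look for a table literally named that way.
    {code}
    load data
    infile 'test.txt'
    append
    into table xyz.employee
    fields terminated by ',' optionally enclosed by '"'
    trailing nullcols
    (empno, ename, deptno)
    {code}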

  • SQLLDR - how to set a default value for an empty string

    I'm using SQLLDR to load data from a flat file on Oracle 10g 10.2.0.4 EE, Linux RHEL 4 update 6. Seems like my question is simple, but darned if I can't find the solution.
    My source file (comma-delimited, extracted from a different RDBMS) has some (not all) rows where the output for a string is "", and not NULL. How do I check for the empty string and add a default value in the control file? The target column is non-nullable.
    Here's my current control file; however, the column attributeName, where the source record is sometimes "", does not get evaluated as NULL.
    {code}
    load data
    infile 'mydata.del'
    badfile 'mydata.bad'
    discardfile 'mydata.dsc'
    into table mydata
    fields terminated by "," optionally enclosed by '"'
       TRAILING NULLCOLS           
    (l_metadata,
    company,
    className,
    attributeName "decode(:vc_attributeName,null, 'none')",
    attributeDefault)
    {code}
    When I tried decode with "", I received the following error:
    {code}SQL*Loader-350: Syntax error at line 11.
    Expecting "," or ")", found ", 'none')".
    attributeName "decode(:vc_attributeName,"", 'none')", {code}
    This does not work either:
    {code}
    load data
    infile 'mydata.del'
    badfile 'mydata.bad'
    discardfile 'mydata.dsc'
    into table mydata
    fields terminated by "," optionally enclosed by '"'
       TRAILING NULLCOLS           
    (l_metadata,
    company,
    className,
    attributeName NULLIF attributeName = BLANKS ,"decode(:vc_attributeName,null, 'none')",
    attributeDefault)
    I get the following in the log:
    Table COMPANYMETADATA, loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
       Column Name                  Position   Len  Term Encl Datatype
    L_METADATA                          FIRST     *   ,  O(") CHARACTER
    COMPANY                            NEXT     *   ,  O(") CHARACTER
    CLASSNAME                         NEXT     *   ,  O(") CHARACTER
    ATTRIBUTENAME                     NEXT     *   ,  O(") CHARACTER
        NULL if ATTRIBUTENAME = BLANKS
        SQL string for column : "decode(:attributeName,null, 'none')"
    ATTRIBUTEDEFAULT                  NEXT     *   ,  O(") CHARACTER
    Record 1: Rejected - Error on table COMPANYMETADATA, column ATTRIBUTENAME.
    ORA-01400: cannot insert NULL into ("SRV5"."COMPANYMETADATA"."ATTRIBUTENAME")
    {code}
    Any help is appreciated -
    Edited by: kpw on Feb 9, 2009 11:10 AM

    Hello,
    I just loaded the following data successfully into the my_data table using the same control file I posted. Yes, it will evaluate "" as NULL and replace it with 'none'; remember that the control file has the option optionally enclosed by '"'.
    {code}
    #my_data.dat
    "myname1", "attribname1"
    "myname2", "attribname2"
    "myname3", "attribname3"
    "myname4", ""
    #my_data.ctl
    load data
    into table my_data
    fields terminated by "," optionally enclosed by '"'
    TRAILING NULLCOLS
    (my_name char(30),
    vc_attributeName "decode(:vc_attributeName,null, 'none', :vc_attributeName)"
    )
    {code}
    Here is the output log file
    {code}
    Control File:   my_data.ctl
    Data File:      mydata.dat
      Bad File:     mydata.bad
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table MY_DATA, loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
       Column Name                  Position   Len  Term Encl Datatype
    MY_NAME                             FIRST    30   ,  O(") CHARACTER           
    VC_ATTRIBUTENAME                     NEXT     *   ,  O(") CHARACTER           
        SQL string for column : "decode(:vc_attributeName,null, 'none', :vc_attributeName)"
    Table MY_DATA:
      4 Rows successfully loaded.
      0 Rows not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    Space allocated for bind array:                  18560 bytes(64 rows)
    Read   buffer bytes: 1048576
    Total logical records skipped:          0
    Total logical records read:             4
    Total logical records rejected:         0
    Total logical records discarded:        0
    Run began on Mon Feb 09 18:17:35 2009
    Run ended on Mon Feb 09 18:17:36 2009
    Elapsed time was:     00:00:00.98
    CPU time was:         00:00:00.10
    {code}
    Hope this helps clear your doubts.
    Regards
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   
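    For completeness, the same defaulting can also be written with NVL instead of DECODE (a sketch using the original poster's column list; note that the bind variable must match the field name declared in the control file). Because of optionally enclosed by '"', an empty "" arrives as NULL, and a two-argument decode(:col, null, 'none') with no final default maps every non-NULL value to NULL - which is why the working version above adds the last argument, and why NVL is a simpler alternative:
    {code}
    (l_metadata,
    company,
    className,
    attributeName CHAR "nvl(:attributeName, 'none')",
    attributeDefault)
    {code}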

  • Getting ORA-22805 when trying to load XML file using SQLLDR

    I'm trying to learn the basics of XML since we'll be getting XML files in the near future. I'm using one of the sample schemas that comes with XMLSPY. I loaded this schema into an 11g Oracle DB using XMLSPY:
    <?xml version="1.0" encoding="UTF-8"?>
    <!-- edited with XML Spy v4.0 NT beta 1 build Jun 13 2001 (http://www.xmlspy.com) by Alexander Falk (Altova, Inc.) -->
    <schema xmlns="http://www.w3.org/2001/XMLSchema" xmlns:ipo="http://www.altova.com/IPO" targetNamespace="http://www.altova.com/IPO" elementFormDefault="unqualified" attributeFormDefault="unqualified">
         <annotation>
              <documentation>
    International Purchase order schema for Example.com
    Copyright 2000 Example.com. All rights reserved.
    </documentation>
         </annotation>
         <!-- include address constructs -->
         <include schemaLocation="address.xsd"/>
         <element name="purchaseOrder" type="ipo:PurchaseOrderType"/>
         <element name="comment" type="string"/>
         <complexType name="PurchaseOrderType">
              <sequence>
                   <element name="shipTo" type="ipo:Address"/>
                   <element name="billTo" type="ipo:Address"/>
                   <element ref="ipo:comment" minOccurs="0"/>
                   <element name="Items" type="ipo:Items"/>
              </sequence>
              <attribute name="orderDate" type="date"/>
         </complexType>
         <complexType name="Items">
              <sequence>
                   <element name="item" minOccurs="0" maxOccurs="unbounded">
                        <complexType>
                             <sequence>
                                  <element name="productName" type="string"/>
                                  <element name="quantity">
                                       <simpleType>
                                            <restriction base="positiveInteger">
                                                 <maxExclusive value="100"/>
                                            </restriction>
                                       </simpleType>
                                  </element>
                                  <element name="price" type="decimal"/>
                                  <element ref="ipo:comment" minOccurs="0"/>
                                  <element name="shipDate" type="date" minOccurs="0"/>
                             </sequence>
                             <attribute name="partNum" type="ipo:Sku"/>
                        </complexType>
                   </element>
              </sequence>
         </complexType>
         <simpleType name="Sku">
              <restriction base="string">
                   <pattern value="\d{3}-[A-Z]{2}"/>
              </restriction>
         </simpleType>
    </schema>
    Then I created an XMLType table:
    CREATE TABLE purchaseOrder OF XMLType
    XMLSCHEMA "ipo.xsd" ELEMENT "purchaseOrder"
    I'm trying to load the sample XML file ipo.xml into purchaseOrder using SQLLDR. This is ipo.xml:
    <?xml version="1.0"?>
    <!-- edited with XMLSPY v2004 rel. 4 U (http://www.xmlspy.com) by Mr. Nobody (Altova GmbH) -->
    <ipo:purchaseOrder xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ipo="http://www.altova.com/IPO" orderDate="1999-12-01" xsi:schemaLocation="http://www.altova.com/IPO
    ipo.xsd">
         <shipTo export-code="1" xsi:type="ipo:EU-Address">
              <ipo:name>Helen Zoe</ipo:name>
              <ipo:street>47 Eden Street</ipo:street>
              <ipo:city>Cambridge</ipo:city>
              <ipo:postcode>126</ipo:postcode>
         </shipTo>
         <billTo xsi:type="ipo:US-Address">
              <ipo:name>Robert Smith</ipo:name>
              <ipo:street>8 Oak Avenue</ipo:street>
              <ipo:city>Old Town</ipo:city>
              <ipo:state>AK</ipo:state>
              <ipo:zip>95819</ipo:zip>
         </billTo>
         <Items>
              <item partNum="833-AA">
                   <productName>Lapis necklace</productName>
                   <quantity>2</quantity>
                   <price>99.95</price>
                   <ipo:comment>Need this for the holidays!</ipo:comment>
                   <shipDate>1999-12-05</shipDate>
              </item>
              <item partNum="748-OT">
                   <productName>Diamond heart</productName>
                   <quantity>1</quantity>
                   <price>248.90</price>
                   <ipo:comment>Valentine's day packaging.</ipo:comment>
                   <shipDate>2000-02-14</shipDate>
              </item>
              <item partNum="783-KL">
                   <productName>Uncut diamond</productName>
                   <quantity>7</quantity>
                   <price>79.90</price>
                   <shipDate>2000-01-07</shipDate>
              </item>
              <item partNum="238-KK">
                   <productName>Amber ring</productName>
                   <quantity>3</quantity>
                   <price>89.90</price>
                   <ipo:comment>With no inclusions, please.</ipo:comment>
                   <shipDate>2000-01-07</shipDate>
              </item>
              <item partNum="229-OB">
                   <productName>Pearl necklace</productName>
                   <quantity>1</quantity>
                   <price>4879.00</price>
                   <shipDate>1999-12-05</shipDate>
              </item>
              <item partNum="128-UL">
                   <productName>Jade earring</productName>
                   <quantity>5</quantity>
                   <price>179.90</price>
                   <shipDate>2000-02-14</shipDate>
              </item>
         </Items>
    </ipo:purchaseOrder>
    This is what's in the control file:
    LOAD DATA
    INFILE *
    INTO TABLE purchaseOrder TRUNCATE
    xmltype(xmldata)
    FIELDS
    xmldata LOBFILE (CONSTANT ipo.xml)
    BEGINDATA
    0
    The load fails with:
    Record 1: Rejected - Error on table PURCHASEORDER.
    ORA-22805: cannot insert NULL object into object tables or nested tables
    Another question I have is, how do we know how many records (0's) to specify in the control file? In this case there's only one but when real files are used we won't know how many are in the file.
    Thanks for your help!

    The concept was "Don't use SQL*Loader to parse XML".
    You can use SQL*Loader to load an entire XML document into the DB. That is fine. You can do the same via BFILENAME to read in files from disk as well.
    If you want to parse XML, do that from within Oracle via PL/SQL and/or SQL. The solution depends upon your version of Oracle and what is good enough for you in terms of performance.
    So the basics are
    a) How am I getting the information?
    b) How am I getting it into Oracle?
    c) How do I want to parse it?
    As I see the schema, it only allows for one ipo:purchaseOrder node in the document, since that is the root node. If you have multiple purchaseOrder nodes in the incoming file, you no longer have valid XML, both per the schema and because you have no single root node. You have an XML fragment, which must be treated differently.
    Just trying to understand the question since I now realize it does not agree with what the schema in your initial example shows.
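    As a sketch of the BFILENAME route mentioned above (the directory name, path and character set are assumptions, not from the thread), the whole document can be inserted from SQL without involving SQL*Loader at all:
    {code}
    CREATE DIRECTORY xml_dir AS '/path/to/xml/files';

    INSERT INTO purchaseOrder
    VALUES (XMLType(BFILENAME('XML_DIR', 'ipo.xml'),
                    NLS_CHARSET_ID('AL32UTF8')));
    COMMIT;
    {code}
    For a schema-based XMLType table like this one, the document still has to map onto the registered "purchaseOrder" element, so an ORA-22805 at load time usually points at the schema registration or the document's namespace/schemaLocation rather than at the loading tool.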

  • Sqlplus and sqlldr commands getting hanged and utilizing cpu to 100%

    Hi,
    We have a server on which only the 10.2.0 client has been installed, used to execute some scripts against other databases. Since this morning we have had a problem running sqlplus or sqlldr: whenever sqlplus or sqlplus /nolog is invoked, it just hangs.

    No abnormality was found in the system log.
    This system has only the Oracle client software installed.
    The main problem is that the user is not giving me downtime to restart the server... please help.

  • Problem with sqlldr and commit

    Hi,
    I have a problem with sqlldr and commit.
    I have a simple table with one column [ col_id number(6) not null ]. The column "col_id" is the primary key of the table. I have one file with 100,000 records (the numbers from 0 to 99,999).
    I want to load the file into the table with sqlldr (SQL*Loader), but I want to commit only if all records are loaded. If one record is discarded, I want the whole file discarded.
    The problem is that in conventional path the commit happens every bind-array batch (64 rows by default), so committing the whole file in one go isn't possible, and in direct path sqlldr disables the primary key :(
    Is there a solution?
    Thanks
    Sorry for the bad English

    This is my table:
    DROP TABLE TEST_SQLLOADER;
    CREATE TABLE TEST_SQLLOADER
    (     COL_ID NUMBER NOT NULL,
         CONSTRAINT TEST_SQLLOADER_PK PRIMARY KEY (COL_ID)
    );
    This is my ctlfile ( test_sql_loader.ctl )
    OPTIONS
    DIRECT=false
    ,DISCARDMAX=1
    ,ERRORS=0
    ,ROWS=100000
    )
    load data
    infile './test_sql_loader.csv'
    append
    into table TEST_SQLLOADER
    fields terminated by "," optionally enclosed by '"'
    ( col_id )
    test_sql_loader.csv
    0
    1
    2
    3
    99999
    I run SQL*Loader:
    sqlldr xxx/yyy@orcl control=test_sql_loader.ctl log=test_sql_loader.log
    output on the screen
    Commit point reached - logical record count 92256
    Commit point reached - logical record count 93248
    Commit point reached - logical record count 94240
    Commit point reached - logical record count 95232
    Commit point reached - logical record count 96224
    Commit point reached - logical record count 97216
    Commit point reached - logical record count 98208
    Commit point reached - logical record count 99200
    Commit point reached - logical record count 100000
    Logfile
    SQL*Loader: Release 11.2.0.1.0 - Production on Sat Oct 3 14:50:17 2009
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Control File: test_sql_loader.ctl
    Data File: ./test_sql_loader.csv
    Bad File: test_sql_loader.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 0
    Bind array: 100000 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table TEST_SQLLOADER, loaded from every logical record.
    Insert option in effect for this table: APPEND
    Column Name Position Len Term Encl Datatype
    COL_ID FIRST * , O(") CHARACTER
    value used for ROWS parameter changed from 100000 to 992
    Table TEST_SQLLOADER:
    100000 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 255936 bytes(992 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records read: 100000
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on Sat Oct 03 14:50:17 2009
    Run ended on Sat Oct 03 14:50:18 2009
    Elapsed time was: 00:00:01.09
    CPU time was: 00:00:00.06
    The commit is every 992 rows.
    If I get an error on record 993, the first 992 rows have already been committed :(
    Edited by: inter1908 on 3-ott-2009 15.00
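    Not an answer from this thread, but a common way to get true all-or-nothing behaviour is to skip SQL*Loader and read the file through an external table, so the load becomes a single INSERT ... SELECT that either completes or rolls back as one statement (a sketch, assuming a directory object pointing at the folder that holds the csv):
    {code}
    CREATE DIRECTORY data_dir AS '/path/to/files';

    CREATE TABLE test_sqlloader_ext (
      col_id NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      )
      LOCATION ('test_sql_loader.csv')
    );

    -- One statement, one transaction: a duplicate key (or, with the default
    -- REJECT LIMIT 0, any bad record) fails the whole insert and nothing is kept.
    INSERT INTO test_sqlloader (col_id)
    SELECT col_id FROM test_sqlloader_ext;
    COMMIT;
    {code}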

  • Unable to load comma separated number using sqlldr

    Hi
    See the example below.
    I have a problem loading “bonus” into my emp table.
    Appreciate your help.
    Here I have appended the following:
    -     demo04.ctl – control file
    -     demo04.dat – data file
    -     demo04.log – log file
    Error :
    ===============================
    Record 1: Rejected - Error on table EMP, column BONUS.
    ORA-01722: invalid number
    sqlldr userid=scott/tiger@orcl , control=demo04.ctl , log=demo04.log
    Create table emp
    (
    Empno      number,
    Ename     varchar2(80),
    Sal     number(15,2),
    bonus      number
    )
    cat demo04.ctl
    ==========================
    LOAD DATA
    INFILE 'demo04.dat'
    INTO TABLE emp
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (
    Empno INTEGER EXTERNAL,
    Ename ,
    Sal DECIMAL EXTERNAL,
    bonus DECIMAL EXTERNAL ????
    )
    cat demo04.dat
    ==============================================
    "1000", "*XXX,XXXX", "9820.760000","3,395"
    "1001", "*XXX,XXXX", "9821.760000","88,883,395"
    cat demo04.log
    ============================
    SQL*Loader: Release 8.1.7.4.0 - Production on Fri Oct 13 17:58:31 2006
    (c) Copyright 2000 Oracle Corporation. All rights reserved.
    Control File: demo04.ctl
    Data File: demo04.dat
    Bad File: demo04.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 65536 bytes
    Continuation: none specified
    Path used: Conventional
    Table EMP, loaded from every logical record.
    Insert option in effect for this table: INSERT
    Column Name Position Len Term Encl Datatype
    EMPNO FIRST * , O(") CHARACTER
    ENAME NEXT * , O(") CHARACTER
    SAL NEXT * , O(") CHARACTER
    BONUS NEXT * , O(") CHARACTER
    Record 1: Rejected - Error on table EMP, column BONUS.
    ORA-01722: invalid number
    Record 2: Rejected - Error on table EMP, column BONUS.
    ORA-01722: invalid number
    Table EMP:
    0 Rows successfully loaded.
    2 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 65016 bytes(63 rows)
    Space allocated for memory besides bind array: 0 bytes
    Total logical records skipped: 0
    Total logical records read: 2
    Total logical records rejected: 2
    Total logical records discarded: 0
    Run began on Fri Oct 13 17:58:31 2006
    Run ended on Fri Oct 13 17:58:32 2006
    Elapsed time was: 00:00:01.14
    CPU time was: 00:00:00.05

    Try
    bonus CHAR "to_number(:bonus, '999G999G999')"
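    In context, the column clause would look roughly like this (a sketch of the poster's control file with the suggested expression; G in the format mask stands for the session's NLS group separator, so the comma-grouped values in the quoted fields convert cleanly):
    {code}
    (Empno  INTEGER EXTERNAL,
     Ename,
     Sal    DECIMAL EXTERNAL,
     bonus  CHAR "to_number(:bonus, '999G999G999')"
    )
    {code}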

  • Error using LKM File to Oracle SQLLDR

    Hi,
    we are getting the following error when trying to use SQLLDR LKM
    The error file says the table does not exist although I can see it in the database.
    Any ideas?
    Oracle DB 9i
    SQLLDR 10.2
    Load.out
    SQL*Loader: Release 10.2.0.1.0 - Production on Thu Jan 20 16:30:58 2011
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Load.log
    SQL*Loader: Release 10.2.0.1.0 - Production on Thu Jan 20 16:30:58 2011
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Control File: D:\HFMBACKUP\/LOAD.ctl
    Data File: D:\HFMBACKUP\/PCONS/Loads/BK_FINST-2009_Thu-20-Jan-2011_14-52-24.TXT
    File processing option string: "str X'0D0A'"
    Bad File: D:\HFMBACKUP\/LOAD.bad
    Discard File: D:\HFMBACKUP\/LOAD.dsc
    (Allow 1 discards)
    Number to load: ALL
    Number to skip: 1
    Errors allowed: 0
    Continuation: none specified
    Path used: Direct
    Table ODISTAG."C$_12181010HFMData", loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    C1_SCENARIO FIRST * ; CHARACTER
    C2_YEAR NEXT * ; CHARACTER
    C3_VIEW NEXT * ; CHARACTER
    C4_ENTITY NEXT * ; CHARACTER
    C5_VALUE NEXT * ; CHARACTER
    C6_ACCOUNT NEXT * ; CHARACTER
    C7_ICP NEXT * ; CHARACTER
    C8_PERIOD NEXT * ; CHARACTER
    C10_C1 NEXT * ; CHARACTER
    C13_C2 NEXT * ; CHARACTER
    C9_C3 NEXT * ; CHARACTER
    C12_C4 NEXT * ; CHARACTER
    C11_VALUE1 NEXT * ; CHARACTER
    SQL*Loader-951: Error calling once/load initialization
    ORA-00942: table or view does not exist
    Load CTL
    OPTIONS (
         SKIP=1,
         ERRORS=0,
         DIRECT=TRUE
    )
    LOAD DATA
    INFILE "D:\HFMBACKUP\/PCONS/Loads/BK_FINST-2009_Thu-20-Jan-2011_14-52-24.TXT" "str X'0D0A'"
    BADFILE "D:\HFMBACKUP\/LOAD.bad"
    DISCARDFILE "D:\HFMBACKUP\/LOAD.dsc"
    DISCARDMAX 1
    INTO TABLE ODISTAG."C$_12181010HFMData"
    FIELDS TERMINATED BY X'3B'
    TRAILING NULLCOLS
    (
         C1_SCENARIO     ,
         C2_YEAR     ,
         C3_VIEW     ,
         C4_ENTITY     ,
         C5_VALUE     ,
         C6_ACCOUNT     ,
         C7_ICP     ,
         C8_PERIOD     ,
         C10_C1     ,
         C13_C2     ,
         C9_C3     ,
         C12_C4     ,
         C11_VALUE1     
    )

    Hi,
    Check the paths - it seems like you have a combination of forward slashes and backslashes in the pathnames.
    D:\HFMBACKUP\/LOAD.ctl and
    D:\HFMBACKUP\/PCONS/Loads/BK_FINST-2009_Thu-20-Jan-2011_14-52-24.TXT, for example, have the combination '\/' after HFMBACKUP.
    cheers
    Bos

  • Load data from a file with multiple record types to a single table-sqlldr

    We are using two datastores which refer to the same file. The file has two types of records: header and detail.
    h011234tyre
    d01rey5679jkj5679
    h011235tyrr
    d01rel5678jul5688
    d01reh5698jll5638
    Can someone help in loading these lines from one file with only two datastores (not two separate files) using the File to Oracle (SQLLDR) Knowledge Module?

    Hi,
    Unfortunately the SQLLDR KM doesn't have a "when" condition that can be written into the ctl file.
    If you wish a simple solution, just add an option (drop me an email if you want an LKM with this).
    The point is:
    With a single option, you can control the "when" ctl clause and, for instance, define:
    1) create 2 datastores (one for each record type, both on the same file)
    2) the first position will be a column in each datastore
    3) write the when condition for this first column in the LKM option in the interface.
    Does this help you?
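    Outside ODI, the raw SQL*Loader control file for this header/detail layout would rely on a WHEN clause per target table, keyed on the record-type prefix; a rough sketch with hypothetical table and column names (positions taken from the sample lines above):
    {code}
    load data
    infile 'mixed_records.dat'
    append
    into table header_table
    when (1:3) = 'h01'
    (rec_type    position(1:3)  char,
     header_key  position(4:7)  char,
     header_val  position(8:11) char)
    into table detail_table
    when (1:3) = 'd01'
    (rec_type    position(1:3)  char,
     detail_val  position(4:17) char)
    {code}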

  • How to use sqlldr for loading data in Oracle 10g xe.

    I want to load data using SQL*Loader into an Oracle DB from a .csv file, but it doesn't seem to be working. Can anyone please help me? I am not getting how to use sqlldr.
    In CMD I am giving this command:
    sqlldr hr/hr control='c:/data/record.ctr'
    record.ctr>>load data
    infile 'c:\data\record.csv'
              into table record
    fields terminated by "," optionally enclosed by '"'          
    ( Name,uday, hemant )
    sql table in data base>>CREATE TABLE "RECORD"
    (     "NAME" VARCHAR2(50),
         "UDAY" VARCHAR2(50),
         "HEMANT" VARCHAR2(50)
    )
    record.csv>>name,uday,hemant
    c1,45454,84894
    c2,489654,21322
    Can you please tell me how I can get this simple example to run?

    C:\>sqlldr scott/tiger
    control = data.ctl
    SQL*Loader: Release 10.1.0.2.0 - Production on Thu Sep 14 17:06:46 2006
    Copyright (c) 1982, 2004, Oracle.  All rights reserved.
    Commit point reached - logical record count 3
    SQL> conn scott/tiger
    Connected.
    SQL> create table data (col1 varchar2(40),col2 varchar2(40),col3 varchar2(40));
    Table created.
    SQL> select * from data;
    COL1                 COL2                 COL3
    name                 uday                 hemant
    c1                   45454                84894
    c2                   489654               21322
    SQL>
    data.csv
    name,uday,hemant
    c1,45454,84894
    c2,489654,21322
    data.ctl
    load data
    infile 'c:\data.csv'
    append
    into table data
    fields terminated by ','
    optionally enclosed by '"'
    (col1,col2,col3)
    I hope it will solve your problem.
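    One detail worth noting about the example above: the first line of the csv (name,uday,hemant) is a header and gets loaded as an ordinary row. Adding SKIP=1 - either as skip=1 on the sqlldr command line or in an OPTIONS clause at the top of the control file - leaves the header out; a sketch based on the original poster's files:
    {code}
    OPTIONS (SKIP=1)
    load data
    infile 'c:\data\record.csv'
    append
    into table record
    fields terminated by ',' optionally enclosed by '"'
    (name, uday, hemant)
    {code}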

  • How to change NLS_NUMERIC_CHARACTERS parameter for OWB SQLLDR mapping

    Hi,
    How do I change the NLS_NUMERIC_CHARACTERS database parameter for my SQLLDR mapping?
    I have an input flat file which has numeric data with ',' as the decimal separator, i.e. an NLS_NUMERIC_CHARACTERS setting of ',.'
    However, in my target Oracle schema the decimal separator is '.', i.e. NLS_NUMERIC_CHARACTERS='.,'
    My OWB version is 10.2.
    When I checked the configuration parameters of the SQL*Loader mapping and the flat file operator, there is a facility to change the language, but not the NLS_NUMERIC_CHARACTERS setting.
    I do not want to change the NLS_NUMERIC_CHARACTERS setting in my database, as there are many other projects which would be impacted.
    We found a workaround, below, using an external table and a premap procedure, but as I have many mappings already developed it is not possible to use this workaround:
    - I can use a premapping procedure with external tables to populate.
    - The NLS_NUMERIC_CHARACTERS setting can be changed using a procedure for that particular session.
    Is there a way to change NLS_NUMERIC_CHARACTERS setting only for that particular mapping/mapping session?
    Thanks,
    SriGP

    At this moment, this is not possible. You can see Metalink note ID 268906.1.
    It says:
    Currently, external tables always use the setting of NLS_NUMERIC_CHARACTERS
    at the database level.
    Cheers
    Marisol
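    For a hand-coded SQL*Loader control file (outside the OWB mapping configuration, so this does not solve the OWB question itself), the conversion can be done per column with to_number's third argument, which overrides NLS_NUMERIC_CHARACTERS for that one expression only; a sketch with a hypothetical column called amount, for a file that uses ',' as the decimal and '.' as the group separator:
    {code}
    amount CHAR "to_number(:amount, '999G999G990D99', 'NLS_NUMERIC_CHARACTERS='',.''')"
    {code}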

  • How to add file name by sqlldr

    Hi All,
    Can anyone kindly give me an approach to using a variable in a SQL*Loader ctl file? I am trying to add a value before each inserted row, and this value is the data file name. So the question is: how can I dynamically identify the input data file name? If that is not possible, is there a way I can make SQL*Loader insert a value (the file name) before each row into the table?
    my control file like below ,
    LOAD DATA
    INTO TABLE "user"."AAA_BILL"
    APPEND
    REENABLE DISABLED_CONSTRAINTS
    EXCEPTIONS "USER"."AAA_BILL"
    FIELDS TERMINATED BY '|'
    (Streamnumber ,
    MSID ,
    UserName,
    Domain,
    UserIP ,
    Correlation_ID)
    and the loading script is as below:
    for j in $(GetFileNames)
    do
    let i=$i+1
    sqlldr user/pass control="/u01/ctrlfile/loadBILLDtl.ctl" log=$WiMAXlog$j.log data=$WiMAXsource$j   bad=$WiMAXlog$j.bad
    done
    GetFileNames ()
    {
    sqlplus -s user/pass << EOF
    set echo off
    SET FEEDBACK OFF
    SET heading off
    set pagesize 50000
    select bi_file_name from dbm_bill_head where bi_file_name like 'WiMAX_%' and bi_auto_status=35  order by bi_file_name;
    EOF
    }
    What I am trying to accomplish here is that I want to insert the data file name, along with the other data in the data file, into table AAA_BILL; this table has a file name column, but the data file does not contain the file name as one of its contents.
    Note: my DB is 10g and the OS is RHEL.
    Any help please.
    Edited by: 876602 on 18/12/2011 05:44 ص

    Hi,
    Now it is working just fine.
    I kept 2 variables: one with the POSIX file path for my do shell script and another one (using the provided tip here) for the AppleScript function.
    The only problem I had left was that the file was written in, maybe, UTF-8, so I added "as text" to the end of the open for access call to treat it as a straight text file.
    I always found languages like AppleScript a little bit hard to learn. Strangely, I have less difficulty with Cocoa!
    Thanks to everyone here!
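    Back to the original question: SQL*Loader itself does not expose the data file name, but since the shell loop already knows it, one common approach (not given in this thread) is to keep a placeholder in a template control file and have the script substitute the real name before each run, feeding it to the table through the CONSTANT keyword. A sketch, where file_name is a hypothetical name for the file-name column and FILENAME_PLACEHOLDER is the token the script replaces (for example with sed) into a temporary copy of the control file before each sqlldr call:
    {code}
    LOAD DATA
    INTO TABLE "user"."AAA_BILL"
    APPEND
    FIELDS TERMINATED BY '|'
    (file_name   CONSTANT 'FILENAME_PLACEHOLDER',
     Streamnumber,
     MSID,
     UserName,
     Domain,
     UserIP,
     Correlation_ID)
    {code}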

  • Sqlldr - can you skip last row from a data file

    Hi
    I need to skip the last line from the data file when I load the data into the tables. Is it possible to do this using sqlldr, and if so, how?
    Also, the first row in the data file, which has a single column, needs to be loaded into all the rows of the table. How can I do that?
    Example
    020051213
    1088110 0047245 A 000000000000GB 00000496546700
    1088110 0719210 A 000000000000GB 00001643982200
    1088110 0727871 A 000000000000GB 00000083008900
    9010163
    The first line needs to go into all the rows, and the last row needs to be skipped. The remaining rows can be loaded normally. I think this would take multiple loads. Can anybody please help?
    TIA

    Here I am sending a sample control file which uses the WHEN clause in SQL*Loader:
    load data
    infile 'load_4.dat'
    discardfile 'load_4.dsc'
    insert
    into table sql_loader_4_a
    when field_2 = 'Fruit'
    (
    field_1 position(1) char(8),
    field_2 position(9) char(5)
    )
    into table sql_loader_4_b
    when field_2 = 'City'
    (
    field_1 position(1) char(8),
    field_2 position(9) char(5)
    )
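    Applied to the data in the original question (detail lines start with '1', the header with '0' and the trailer with '9'), the same WHEN idea effectively skips the last line by accepting only detail records. Carrying the header value into every row is not something SQL*Loader can do across records in one pass; it is usually substituted into the control file as a CONSTANT by the calling script, or applied afterwards with an UPDATE. A sketch with hypothetical table and column names:
    {code}
    load data
    infile 'datafile.dat'
    append
    into table detail_table
    when (1:1) = '1'
    (acct         position(1:7)  char,
     detail_rest  position(9:47) char)
    {code}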

  • Oracle error "ORA-01843: not a valid month" when trying to run sqlldr

    Hi all,
    I'm trying to load some data into a staging database via a CSV file using sqlldr, and am running into an issue where it doesn't like the date format I'm using.
    Here is my input data:
    2012-01-09 16:28:12 -05:00
    Here is the entry in the .ctl file:
    created TIMESTAMP WITH TIME ZONE 'yyyy-mm-dd HH24:MI:SS TZR'
    And finally, here is the entry in the .sql file:
    created TIMESTAMP WITH TIME ZONE
    After I try to load, I get greeted with the dreaded error message: Record 1: Rejected - Error on table WTPART, column CREATED. ORA-01843: not a valid month
    I'm really confused as to why it's blowing up on the date, because it seems to me that "01" is indeed a valid date in terms of the date format I'm using. Any ideas? Thanks!
    Edited by: Nick Tiberi on Jan 10, 2012 8:06 AM

    Hmmm, not sure exactly what the problem is. It works fine for me on my XE instance.
    Set up the control and data files....
    {code}
    tubby@Tubbz:~/test$ cat >> WTPart.csv <<EOF
    2012-01-09 16:28:12 -05:00
    EOF
    tubby@Tubbz:~/test$
    tubby@Tubbz:~/test$ cat >> load.ctl <<EOF
    LOAD DATA
    INFILE WTPart.csv
    APPEND INTO TABLE WTPart
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    created TIMESTAMP WITH TIME ZONE 'yyyy-mm-dd HH24:MI:SS TZR'
    )
    EOF
    tubby@Tubbz:~/test$
    tubby@Tubbz:~/test$ /usr/lib/oracle/xe/app/oracle/product/10.2.0/server/bin/sqlldr tubby/pswd@xe control=load.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Tue Jan 10 10:21:28 2012
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 1
    tubby@Tubbz:~/test$
    {code}
    Query the result from the database
    {code}
    ME_XE?select * from wtpart;
    CREATED
    09-JAN-12 04.28.12.000000 PM -05:00
    1 row selected.
    Elapsed: 00:00:00.01
    ME_XE?
    ME_XE?select * from v$version;
    BANNER
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
    PL/SQL Release 10.2.0.1.0 - Production
    CORE    10.2.0.1.0      Production
    TNS for Linux: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    5 rows selected.
    Elapsed: 00:00:00.01
    ME_XE?
    {code}
    Are you sure your CSV file doesn't have some "funky" data in it?

  • Error in using SQLLDR LKM

    I am trying to load a fixed-length flat file into an Oracle table in ODI by using LKM File to Oracle (SQLLDR).
    But I am getting the below error in the "Call sqlldr" step:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
      File "<string>", line 22, in <module>
    Load Error: See C:\Aditya\ODI\SQL_Loader/TARIFF_SERVICE.log for details
        at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
        at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
        at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
        at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
        at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
        at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
        at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
        at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
        at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
        at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
        at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
        at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
        at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
        at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
        at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
        at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
        at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
        at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
        at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
        at java.lang.Thread.run(Thread.java:662)
    Caused by: Traceback (most recent call last):
      File "<string>", line 22, in <module>
    Load Error: See C:\Aditya\ODI\SQL_Loader/TARIFF_SERVICE.log for details
        at org.python.core.PyException.fillInStackTrace(PyException.java:70)
        at java.lang.Throwable.<init>(Throwable.java:181)
        at java.lang.Exception.<init>(Exception.java:29)
        at java.lang.RuntimeException.<init>(RuntimeException.java:32)
        at org.python.core.PyException.<init>(PyException.java:46)
        at org.python.core.PyException.doRaise(PyException.java:219)
        at org.python.core.Py.makeException(Py.java:1166)
        at org.python.core.Py.makeException(Py.java:1170)
        at org.python.pycode._pyx13.f$0(<string>:59)
        at org.python.pycode._pyx13.call_function(<string>)
        at org.python.core.PyTableCode.call(PyTableCode.java:165)
        at org.python.core.PyCode.call(PyCode.java:18)
        at org.python.core.Py.runCode(Py.java:1204)
        at org.python.core.Py.exec(Py.java:1248)
        at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
        at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
        ... 19 more
    TARIFF_SERVICE.log is not getting created.
    The generated code is:
    import java.lang.String
    import java.lang.Runtime as Runtime
    from jarray import array
    import java.io.File
    import os
    import re
    ctlfile = r"""C:\Aditya\ODI\SQL_Loader/TARIFF_SERVICE.ctl"""
    logfile = r"""C:\Aditya\ODI\SQL_Loader/TARIFF_SERVICE.log"""
    outfile = r"""C:\Aditya\ODI\SQL_Loader/TARIFF_SERVICE.out"""
    oracle_sid=''
    if len('XE')>0: oracle_sid = '@'+'XE'
    loadcmd = r"""sqlldr 'HR/<@=snpRef.getInfo("DEST_PASS") @>%s' control='%s' log='%s' > "%s" """ % (oracle_sid,ctlfile, logfile, outfile)
    rc = os.system(loadcmd)
    if rc <> 0 and rc <> 2:
        raise "Load Error", "See %s for details" % logfile
    # Init Vars
    nbIns = 0
    nbRej = 0
    nbNull = 0
    strprt = ""
    maxAllowedError = r"""0"""
    c = 0
    flag = 0
    # Open log file
    f = open(logfile, "r")
    try:
        lines = f.readlines()
        for line in lines:
            if line.rstrip().upper().endswith(r"""HR.TC$_0SAS_TARIFF_SERVICEREPLACE:""".upper()):
                flag = 1
                c = 0
            if flag == 1:
                if c > 0 and c <= 4:
                    if c == 1 :
                        nbIns = int(re.findall("\d+", line)[0])
                    elif c == 2:
                        nbRej = int(re.findall("\d+", line)[0])
                    elif c == 4:
                        nbNull = int(re.findall("\d+", line)[0])
                        break
            c+=1
        strprt = "\n\tIns:\t%s\n\tReject:\t%s\n\tNullField:\t%s" % (nbIns, nbRej, nbNull)
    finally:
        f.close()
    # if some rows has been rejected due to invalide data, check KM option LOA_ERRORS
    if rc == 2:
        if nbRej > int(maxAllowedError):
            raise strprt
            break
    Can anyone suggest how to resolve this issue?
    Thanks,
    Aditya

    Hi,
    I am facing the same issue. Did anybody find a solution for it?
    Hi 1005380,
    Thanks for the quick response.
    Source (position-based file, so I can't change the physical length):
    name, type, physical length, logical length, -, -, recordCount
    pading, String, 1, 1, 50, null, 2
    journal_name, String, 2, 50, 50, null
    journal_line_number, String, 52, 6, 50, null
    segment1, String, 58, 6, 50, null
    segment2, String, 64, 8, 50, null
    segment3, String, 72, 5, 50, null
    segment4, String, 77, 3, 50, null
    segment5, String, 80, 1, 50, null
    segment6, String, 81, 4, 50, null
    segment7, String, 85, 3, 50, null
    segment8, String, 88, 5, 50, null
    line_desc, String, 93, 240, 50, null
    debit_amount, String, 333, 15, 50, null
    credit_amount, String, 348, 15, 50, null
    Target (DB)
    order, name, type, logicalLength, scale, Source mapping fields
    55, ENTERED_DR, NUMBER, 0, -127, Debit_amt(sorce)
    56, ENTERED_CR, NUMBER, 0, -127, credit_amt(source)
    60, REFERENCE1, VARCHAR2, 100, 0, journalname(source)
    142, CUSTOM_ATTRIBUTE1, VARCHAR2, 150, 0, segment1(s)
    143, CUSTOM_ATTRIBUTE2, VARCHAR2, 150, 0, segment2(s)
    144, CUSTOM_ATTRIBUTE3, VARCHAR2, 150, 0, segment3(s)
    145, CUSTOM_ATTRIBUTE4, VARCHAR2, 150, 0, segment4(s)
    146, CUSTOM_ATTRIBUTE5, VARCHAR2, 150, 0, segment5(s)
    147, CUSTOM_ATTRIBUTE6, VARCHAR2, 150, 0, segment6(s)
    148, CUSTOM_ATTRIBUTE7, VARCHAR2, 150, 0, segment7(s)
    149, CUSTOM_ATTRIBUTE8, VARCHAR2, 150, 0, segment8(s)
    I am omitting padding and line_desc from the source for the mapping.
    Note: 1) Even when I map only a single column, e.g. journal_name to reference1, I get the error (NumberFormatException) for more than 1000 records. For 999 records it picks them up and does not throw any error.
    2) For 999 records I mapped all the fields and it works fine; if I pass more than 999 records it throws the error.
    Knowledge Modules:
    LKM File to SQL and IKM SQL Control Append
    Execution Steps:
    Drop work table and Create work table succeed; the error occurs at the Load data step.
    Error code:
    Source Code:
    select
    journal_name
    C3_JOURNAL_NAME,
    segment1
    C4_SEGMENT1,
    segment2
    C5_SEGMENT2,
    segment3
    C6_SEGMENT3,
    segment4
    C7_SEGMENT4,
    segment5
    C8_SEGMENT5,
    segment6
    C9_SEGMENT6,
    segment7
    C10_SEGMENT7,
    segment8
    C11_SEGMENT8,
    debit_amount
    C1_DEBIT_AMOUNT,
    credit_amount
    C2_CREDIT_AMOUNT
    from
    TABLE
    /*$$SNPS_START_KEYSNP$CRDWG_TABLESNP$CRTABLE_NAME=ADA_ENTRY_STG_LINESSNP$CRLOAD_FILE=/C:/DavidNithin/GL_Abstraction/adaoasisla.dat.07012013_13h23.txtSNP$CRFILE_FORMAT=FSNP$CRFILE_SEP_FIELD=0x0009SNP$CRFILE_SEP_LINE=0ASNP$CRFILE_FIRST_ROW=0SNP$CRFILE_ENC_FIELD=SNP$CRFILE_DEC_SEP=SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=padingSNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=1SNP$CRLENGTH=1SNP$CRPRECISION=50SNP$CRREC_CODE_LIST=2SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=journal_nameSNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=2SNP$CRLENGTH=50SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=journal_line_numberSNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=52SNP$CRLENGTH=6SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=segment1SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=58SNP$CRLENGTH=6SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=segment2SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=64SNP$CRLENGTH=8SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=segment3SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=72SNP$CRLENGTH=5SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=segment4SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=77SNP$CRLENGTH=3SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=segment5SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=80SNP$CRLENGTH=1SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=segment6SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=81SNP$CRLENGTH=4SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=segment7SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=85SNP$CRLENGTH=3SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=segment8SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=88SNP$CRLENGTH=5SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=line_descSNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=93SNP$CRLENGTH=240SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=debit_amountSNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=333SNP$CRLENGTH=15SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=credit_amountSNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=348SNP$CRLENGTH=15SNP$CRPRECISION=50SNP$CR$$SNPS_END_KEY*/
    Target Code:
    insert into ODI.C$_0MNA_ENTERY_STG_LINES_ADA
    (
      C3_JOURNAL_NAME,
      C4_SEGMENT1,
      C5_SEGMENT2,
      C6_SEGMENT3,
    C7_SEGMENT4,
    C8_SEGMENT5,
    C9_SEGMENT6,
    C10_SEGMENT7,
    C11_SEGMENT8,
    C1_DEBIT_AMOUNT,
    C2_CREDIT_AMOUNT
    )
    values
    (
    :C3_JOURNAL_NAME,
    :C4_SEGMENT1,
    :C5_SEGMENT2,
    :C6_SEGMENT3,
    :C7_SEGMENT4,
    :C8_SEGMENT5,
    :C9_SEGMENT6,
    :C10_SEGMENT7,
    :C11_SEGMENT8,
    :C1_DEBIT_AMOUNT,
    :C2_CREDIT_AMOUNT
    )
    Error message:
    java.lang.NumberFormatException: For input string: "1,000"
      at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
      at java.lang.Integer.parseInt(Inte

  • Data error while loading data using SQLLDR

    Hi Gurus,
    Kindly let me know the possible reasons for getting the below error returned by SQLLDR after loading data:
    x number of rows not loaded due to data errors in SQLLDR
    Could it be due to issues in the control file?

    You'll find it well explained here:
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/part_ldr.htm#i436326
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_concepts.htm#i1004846
