Select Data from BSIS table taking too long

Hi
I have to develop a report that gives the details of Extended Withholding Tax (EWT) for a list of expense G/L accounts.
Each expense G/L account is linked to another G/L account, the EWT tax G/L account. This mapping is maintained in a Z-table.
I have written the following code, but it takes a lot of time to extract the data.
This gives me the G/L accounts I require:
SELECT * FROM ZSECCO_GL_EWT INTO CORRESPONDING FIELDS OF TABLE IT_GL
 WHERE BUKRS    = P_BUKRS
 AND   WT_QSCOD = P_QSCOD.
Then I select only the distinct document numbers from the BSIS table for the HKONTs in the above internal table:
SELECT DISTINCT BUKRS GJAHR BELNR  FROM BSIS INTO CORRESPONDING FIELDS OF
TABLE IT_BSIS_GL
 FOR ALL ENTRIES IN IT_GL
 WHERE BUKRS = P_BUKRS AND  HKONT = IT_GL-HKONT  AND GJAHR = P_GJAHR.
Here I once again select the document details based on the document numbers from the above internal table.
This query takes a lot of time:
SELECT * FROM BSIS INTO CORRESPONDING FIELDS OF TABLE IT_BSIS
 FOR ALL ENTRIES IN IT_BSIS_GL
 WHERE  BUKRS = P_BUKRS AND GJAHR = P_GJAHR
    AND BELNR = IT_BSIS_GL-BELNR.
Please help.

Hi,
Check note 992803; it could be that there is an insufficient or missing index on the BSIS table.
Regards,
Eli
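
As a rough illustration of the usual tuning for this pattern (an editor's sketch, not part of the thread): drive the BSIS read by HKONT, which is part of the BSIS primary key, guard every FOR ALL ENTRIES against an empty driver table, de-duplicate the driver, and fetch the remaining line items of the found documents from BSEG instead of a second, non-indexed read on BSIS by BELNR, since the BSEG key BUKRS/BELNR/GJAHR/BUZEI fits that access. IT_BSEG and the BSEG field list are assumptions; the other names are taken from the post.
* Sketch only - IT_BSEG and the selected BSEG fields are illustrative.
SORT IT_GL BY HKONT.
DELETE ADJACENT DUPLICATES FROM IT_GL COMPARING HKONT.

IF IT_GL[] IS NOT INITIAL.
* HKONT is part of the BSIS primary key, so this access is index-supported
  SELECT BUKRS GJAHR BELNR FROM BSIS
    INTO CORRESPONDING FIELDS OF TABLE IT_BSIS_GL
    FOR ALL ENTRIES IN IT_GL
    WHERE BUKRS = P_BUKRS
      AND HKONT = IT_GL-HKONT
      AND GJAHR = P_GJAHR.
ENDIF.

* Remove duplicate document keys before using the table as a FAE driver
SORT IT_BSIS_GL BY BUKRS GJAHR BELNR.
DELETE ADJACENT DUPLICATES FROM IT_BSIS_GL COMPARING BUKRS GJAHR BELNR.

IF IT_BSIS_GL[] IS NOT INITIAL.
* Read the other line items of those documents via the BSEG key fields
  SELECT BUKRS BELNR GJAHR BUZEI HKONT SHKZG DMBTR FROM BSEG
    INTO CORRESPONDING FIELDS OF TABLE IT_BSEG
    FOR ALL ENTRIES IN IT_BSIS_GL
    WHERE BUKRS = IT_BSIS_GL-BUKRS
      AND BELNR = IT_BSIS_GL-BELNR
      AND GJAHR = IT_BSIS_GL-GJAHR.
ENDIF.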

Similar Messages

  • Transferring data from BSIS table to file on application server

    Hi Gurus,
    In my program I am selecting data from the BSIS table and transferring it to a file on the application server.
    Code:
    Tables: BSIS.
    data: file type rlgrap-filename.
    Open dataset file for output in text mode encoding default.
    Select * from BSIS into BSIS.
      Transfer bsis to file.
    Endselect.
    Close dataset file.
    1. Issue: when transferring, I am getting a dump due to a type conflict.
    2. I can't create a structure like BSIS and define type C variables, as it is an upgrade project.
    3. I tried field symbols with type casting, but the data on the application server shows junk with ##### in the currency fields.
    4. In the earlier program there is no internal table defined for BSIS; the data is passed directly to a work area and transferred to the file on the application server.
    Please let me know a feasible solution or approach.
    If needed, I can post the complete code.
    Regards
    Bhaskar

    Hi Robert,
    Here is my code in ECC. While executing, the program dumps with a character type error.
    tables: bsis.
    data: begin of i_tab occurs 100.
            include structure bsis.
    data: end of i_tab.
    parameters: outfile(1000) lower case.
    open dataset outfile for output in text mode encoding default
    ignoring conversion errors.
    select * from bsis into table i_tab
    where bukrs = bukrs.
       loop at i_tab.
          transfer i_tab to outfile.
       endloop.
      close dataset outfile.
    Please let me know if there is any solution.
    regards
    Bhaskar
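
    A workaround often used for this kind of dump (not given in the thread): build a purely character-like output line per row before TRANSFER, so that no packed currency field reaches the text-mode dataset. A minimal sketch reusing I_TAB and OUTFILE from the post above; the field list and the '|' separator are illustrative assumptions.
    data: lv_line       type string,
          lv_amount(20) type c.
    open dataset outfile for output in text mode encoding default.
    loop at i_tab.
      " WRITE ... TO converts the packed amount into characters
      write i_tab-dmbtr to lv_amount currency i_tab-waers.
      concatenate i_tab-bukrs i_tab-hkont i_tab-gjahr i_tab-belnr
                  i_tab-buzei lv_amount
             into lv_line separated by '|'.
      transfer lv_line to outfile.
    endloop.
    close dataset outfile.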

  • Moving the 80 Million records from Conversion database to System Test database (Just for one transaction table) taking too long.

    Hello Friends,
    The background: I am working as conversion manager, and we move the data from Oracle to SQL Server using SSMA, then apply the conversion logic, and then move the data to System Test, UAT and Production.
    Scenario:
    Moving 80 million records from the Conversion database to the System Test database (for just one transaction table) is taking too long. Both databases are on the same server.
    Questions are…
    What is best option?
    If we use SSIS, it's very slow and takes 17 hours (sometimes it gets stuck and won't let us do anything else).
    I am using my own script (a stored procedure) and it takes only 1 hour 40 minutes. I would like to know whether there is a better process to speed this up, and why SSIS is taking so long.
    When we move the data using SSIS, does it commit after a particular row count, or does it commit all the records together after writing to the transaction log?
    Thanks
    Karthikeyan Jothi

    http://www.dfarber.com/computer-consulting-blog.aspx?filterby=Copy%20hundreds%20of%20millions%20records%20in%20ms%20sql
    Processing hundreds of millions of records can be done in less than an hour.
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • How to select data from cluster table

    hi experts,
    I have a report which picks data from BSEG (a cluster table). For a one-month report it takes around 4 minutes to process, and I feel this will not be acceptable when the report is run for several months.
    How do I select data from such a table? How should I declare an internal table for these cluster tables? Can we include any search condition or use any other kind of internal table?
    Please advise.
    mani

    Hi Manikandan,
    The following code may be helpful to understand how to select data from a cluster table.
    Types: Begin of ty_kna1,
    Kunnr type kna1-kunnr,
    adrnr type kna1-adrnr,
    end of ty_kna1,
    begin of ty_bseg,
    belnr type bseg-belnr,
    kunnr type bseg-kunnr,
    end of ty_bseg,
    begin of ty_final,
    belnr type bseg-belnr,
    kunnr type kna1-kunnr,
    adrnr type kna1-adrnr,
    end of ty_final.
    Data: it_kna1 type table of ty_kna1,
    wa_kna1 type ty_kna1,
    it_bseg type table of ty_bseg,
    wa_bseg type ty_bseg,
    it_final type table of ty_final,
    wa_final type ty_final.
    Select kunnr adrnr from kna1 into table it_kna1 where....
    if sy-subrc = 0.
    select belnr kunnr from bseg into table it_bseg for all entries in it_kna1 where kunnr = it_kna1-kunnr.
    endif.
    sort it_kna1 by kunnr.
    Loop at it_bseg into wa_bseg.
    move wa_bseg-belnr to wa_final-belnr.
    read table it_kna1 into wa_kna1 with key kunnr = wa_bseg-kunnr binary search.
    if sy-subrc = 0.
    move: wa_kna1-kunnr to wa_final-kunnr,
    wa_kna1-adrnr to wa_final-adrnr.
    endif.
    append wa_final to it_final.
    clear wa_final.
    endloop.
    Loop at it_final into wa_final.
    write: / wa_final-belnr, wa_final-kunnr, wa_final-adrnr.
    endloop.
    Reward if useful.
    Thankyou,
    Regards.

  • Error while selecting data from external table

    Hello all,
    I am getting the following error while selecting data from an external table. Any idea why?
    SQL> CREATE TABLE SE2_EXT (SE_REF_NO VARCHAR2(255),
      2        SE_CUST_ID NUMBER(38),
      3        SE_TRAN_AMT_LCY FLOAT(126),
      4        SE_REVERSAL_MARKER VARCHAR2(255))
      5  ORGANIZATION EXTERNAL (
      6    TYPE ORACLE_LOADER
      7    DEFAULT DIRECTORY ext_tables
      8    ACCESS PARAMETERS (
      9      RECORDS DELIMITED BY NEWLINE
    10      FIELDS TERMINATED BY ','
    11      MISSING FIELD VALUES ARE NULL
    12      (
    13        country_code      CHAR(5),
    14        country_name      CHAR(50),
    15        country_language  CHAR(50)
    16      )
    17    )
    18    LOCATION ('SE2.csv')
    19  )
    20  PARALLEL 5
    21  REJECT LIMIT UNLIMITED;
    Table created.
    SQL> select * from se2_ext;
    SQL> select count(*) from se2_ext;
    select count(*) from se2_ext
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04043: table column not found in external source: SE_REF_NO
    ORA-06512: at "SYS.ORACLE_LOADER", line 19

    It would appear that your external table definition and the external data file do not match up. Post a few input records so someone can duplicate the problem and determine the fix.
    HTH -- Mark D Powell --

  • How to select data from a table using a date field in the where condition?

    How to select data from a table using a date field in the where condition?
    For eg:
    data itab like equk occurs 0 with header line.
    select * from equk into table itab where werks = 'C001'
                                                      and bdatu = '31129999'.
    thanks.

    Hi Ramesh,
    Specify the date format as YYYYMMDD in where condition.
    Dates are internally stored in SAP as YYYYMMDD only.
    Change your date format in WHERE condition as follows.
    data itab like equk occurs 0 with header line.
    select * from equk into table itab where werks = 'C001'
    and bdatu = '99991231'.
    I doubt it, though; check your database table EQUK for the existence of data on this date.
    Otherwise, just change the condition on BDATU as below to see all entries prior to this date.
    data itab like equk occurs 0 with header line.
    select * from equk into table itab where werks = 'C001'
    and bdatu <= '99991231'.
    Thanks,
    Vinay

  • How to select data from a table by passing document number from another tab

    How to select data from a table by passing document number from another table.
    for eg:-
    I want to display the name, address and region from the ADRC table
    by using the delivery document number field.
    Kind Regards,
    Shanbagavalli.S

    Hi Shanbagavalli,
    There are multiple solutions to this question; I will try to answer a few, and you can take the one best suited to your requirements.
    ** Consider that you have an internal table containing the document numbers from the other table:
    SELECT NAME ADRES REGION FROM ADRC
           INTO TABLE IT_ADRC
           FOR ALL ENTRIES IN IT_DOC
           WHERE DOCUMENT_NO = IT_DOC-DOCUMENT_NO.
    ** Consider that you have one document number:
    SELECT NAME ADRES REGION FROM ADRC
         INTO TABLE IT_ADRC
         WHERE DOCUMENT_NO = W_DOCUMENT_NO.
    Hope this solves your problem.
    Regards,
    Kunjal
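    One caveat worth adding to the first variant (an editor's note, not part of Kunjal's reply): if IT_DOC is empty, FOR ALL ENTRIES ignores the WHERE clause entirely and the SELECT reads the whole table, so guard the call first:
    IF IT_DOC[] IS NOT INITIAL.
      SELECT NAME ADRES REGION FROM ADRC
             INTO TABLE IT_ADRC
             FOR ALL ENTRIES IN IT_DOC
             WHERE DOCUMENT_NO = IT_DOC-DOCUMENT_NO.
    ENDIF.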

  • Select data from two tables...!

    HI Experts...!
    I am a beginner and I want to select data from two tables, PROJ and PRPS, using joins and internal tables. I have written this code:
    SELECT prps~pspnr
           prps~objnr
           prps~psphi
           proj~ernam
           proj~erdat
           proj~pspnr
    INTO  table itab          "itab is an internal table
    FROM prps inner join proj
    WHERE pspnr in p_no and prps~psphi = proj~pspnr.
    but there is an error in the FROM clause. Please help me.
    Thanks in advance.

    Hi,
    check the sample code below. The two replies above will solve your problem, but there is one extra line in your code, pointed out below.
    TABLES: prps, proj.
    TYPES:  BEGIN OF ty_test,
            pspnr LIKE prps-pspnr,
            objnr LIKE prps-objnr,
            psphi LIKE prps-psphi,
            ernam LIKE proj-ernam,
            erdat LIKE proj-erdat,
            END OF ty_test.
    DATA: itab TYPE STANDARD TABLE OF ty_test WITH HEADER LINE.
    SELECT-OPTIONS: p_no FOR prps-pspnr.
    SELECT  prps~pspnr
            prps~objnr
            prps~psphi
            proj~ernam
            proj~erdat
    *        proj~pspnr   " No need for this: you have already selected it in
    *                       the first line, and since it is common to both
    *                       tables you only need to select it from one of them.
            INTO TABLE itab
    FROM prps INNER JOIN proj ON ( prps~pspnr = proj~pspnr  )
    WHERE prps~pspnr IN p_no.
    Best Regards,
    Faisal

  • Selecting data from two tables

    I am trying to select data from two tables. The only problem I am running into is that I am only seeing results from my 'uploads' table. There is also a record in documents where user = 1 that should show up. Here is my SQL:
    $userIDNum = 1;
    $sql="Select *
    from uploads, documents
    WHERE uploads.user = documents.user AND uploads.user = $userIDNum
    ORDER BY uploads.title ASC, documents.title ASC";

    You'll need to explain a little more about your data and what you are trying to accomplish. Your current sql will select all columns from both the uploads and documents tables but only rows where the user id in both tables match, AND the user id equals 1. Is that not what you are seeing? Where's the code that outputs the results?

  • 'Erase All Content and Settings': I selected it and it's taking too long; it's not even loading! Does it really take several hours? Is there a problem, and how can I fix it?

    I selected 'Erase All Content and Settings' and it's taking too long; it's not even loading! Does it really take several hours? Is there a problem, and how can I fix it? Please tell me.

    Likewise, bro... at least mine shows the Apple logo and the spinning circle. I saw this online and I'll be trying it ASAP: http://support.apple.com/kb/HT1808

  • Data Archive Script is taking too long to delete a large table

    Hi All,
    We have data archive scripts that move data for a date range to a different table. Each script has two parts: first it copies data from the original table to the archive table, then it deletes the copied rows from the original table. The first part executes very fast, but the deletion is taking too long, around 2-3 hours. The customer analysed the delete query and says the script is not using an index and is doing a full table scan, although the predicate is the primary key itself. Please help... More info below.
    CREATE TABLE "APP"."MON_TXNS"
       (    "ID_TXN" NUMBER(12,0) NOT NULL ENABLE,
        "BOL_IS_CANCELLED" VARCHAR2(1 BYTE) DEFAULT 'N' NOT NULL ENABLE,
        "ID_PAYER" NUMBER(12,0),
        "ID_PAYER_PI" NUMBER(12,0),
        "ID_PAYEE" NUMBER(12,0),
        "ID_PAYEE_PI" NUMBER(12,0),
        "ID_CURRENCY" CHAR(3 BYTE) NOT NULL ENABLE,
        "STR_TEXT" VARCHAR2(60 CHAR),
        "DAT_MERCHANT_TIMESTAMP" DATE,
        "STR_MERCHANT_ORDER_ID" VARCHAR2(30 BYTE),
        "DAT_EXPIRATION" DATE,
        "DAT_CREATION" DATE,
        "STR_USER_CREATION" VARCHAR2(30 CHAR),
        "DAT_LAST_UPDATE" DATE,
        "STR_USER_LAST_UPDATE" VARCHAR2(30 CHAR),
        "STR_OTP" CHAR(6 BYTE),
        "ID_AUTH_METHOD_PAYER" NUMBER(1,0),
        "AMNT_AMOUNT" NUMBER(23,0) DEFAULT 0,
        "BOL_IS_AUTOCAPTURE" VARCHAR2(1 BYTE) DEFAULT 'N' NOT NULL ENABLE,
        "ID_USE_CASE" NUMBER(4,0) NOT NULL ENABLE,
        "ID_AUTH_METHOD_PAYEE" NUMBER(2,0),
         CONSTRAINT "CKC_BOL_IS_CANCELLED_MON_TXNS" CHECK (BOL_IS_CANCELLED in ('Y','N')) ENABLE,
         CONSTRAINT "PK_MON_TXNS" PRIMARY KEY ("ID_TXN")
      USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "LARGE_INDEX"  ENABLE,
         CONSTRAINT "FK_MON_TXNS_CURRENCIES" FOREIGN KEY ("ID_CURRENCY")
          REFERENCES "APP"."CURRENCIES" ("ID_CURRENCY") ENABLE,
         CONSTRAINT "FK_MON_TXNS_TO_PAYER" FOREIGN KEY ("ID_PAYER")
          REFERENCES "APP"."CUSTOMERS" ("ID_CUSTOMER") ENABLE,
         CONSTRAINT "FK_MON_TXNS_TO_PAYEE" FOREIGN KEY ("ID_PAYEE")
          REFERENCES "APP"."CUSTOMERS" ("ID_CUSTOMER") ENABLE,
         CONSTRAINT "FK_MON_TXNS_REFERENCE_TXNS" FOREIGN KEY ("ID_TXN")
          REFERENCES "APP"."TXNS" ("ID_TXN") ENABLE,
         CONSTRAINT "FK_MON_TXNS_TO_PI_PAYER" FOREIGN KEY ("ID_PAYER_PI")
          REFERENCES "APP"."PIS" ("ID_PI") ENABLE,
         CONSTRAINT "FK_MON_TXNS_TO_PI_PAYEE" FOREIGN KEY ("ID_PAYEE_PI")
          REFERENCES "APP"."PIS" ("ID_PI") ENABLE,
         CONSTRAINT "FK_MON_TXNS_TO_AUTHMETHOD" FOREIGN KEY ("ID_AUTH_METHOD_PAYER")
          REFERENCES "APP"."AUTHENTICATION_METHODS" ("ID_AUTHENTICATION_METHOD") ENABLE,
         CONSTRAINT "FK_MON_TXNS_USE_CASE_ID" FOREIGN KEY ("ID_USE_CASE")
          REFERENCES "APP"."USE_CASES" ("ID_USE_CASE") ENABLE,
         CONSTRAINT "FK_MON_TXN_AUTH_PAYEE" FOREIGN KEY ("ID_AUTH_METHOD_PAYEE")
          REFERENCES "APP"."AUTHENTICATION_METHODS" ("ID_AUTHENTICATION_METHOD") ENABLE
      );
      CREATE INDEX "APP"."IDX_MON_TXNS" ON "APP"."MON_TXNS" ("ID_PAYER")
      PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "LARGE_INDEX" ;
      CREATE INDEX "APP"."IDX_PAYEE_MON_TXNS" ON "APP"."MON_TXNS" ("ID_PAYEE")
      PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "LARGE_DATA" ;
      CREATE INDEX "APP"."IDX_PYE_PI_MON_TXNS" ON "APP"."MON_TXNS" ("ID_PAYEE_PI")
      PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "LARGE_DATA" ;
      CREATE INDEX "APP"."IDX_PYR_PI_MON_TXNS" ON "APP"."MON_TXNS" ("ID_PAYER_PI")
      PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "LARGE_DATA" ;
      CREATE INDEX "APP"."IDX_USE_CASE_MON_TXNS" ON "APP"."MON_TXNS" ("ID_USE_CASE")
      PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "LARGE_DATA" ;
      CREATE UNIQUE INDEX "APP"."PK_MON_TXNS" ON "APP"."MON_TXNS" ("ID_TXN")
      PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "LARGE_INDEX" ;
    Data is first moved to the table schema3.OTW, and then we delete all the rows in OTW from the original table. Below is the explain plan for the delete:
    SQL> explain plan for
      2  delete from schema1.mon_txns where id_txn in (select id_txn from schema3.OTW);
    Explained.
    SQL> select * from table(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 2798378986
    | Id  | Operation              | Name       | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | DELETE STATEMENT       |            |  2520 |   233K|    87   (2)| 00:00:02 |
    |   1 |  DELETE                | MON_TXNS   |       |       |            |          |
    |*  2 |   HASH JOIN RIGHT SEMI |            |  2520 |   233K|    87   (2)| 00:00:02 |
    |   3 |    INDEX FAST FULL SCAN| OTW_ID_TXN |  2520 | 15120 |     3   (0)| 00:00:01 |
    |   4 |    TABLE ACCESS FULL   | MON_TXNS   | 14260 |  1239K|    83   (0)| 00:00:02 |
    PLAN_TABLE_OUTPUT
    Predicate Information (identified by operation id):
    Please help,
    thanks,
    Banka Ravi

    'Best practice' is just what Oracle is already doing as you have already been told: DELETE FROM myTable WHERE myDate between myStart and Myend.
    Your use case is why many orgs elect to use partitioning and use that DATE column as the partition key. Then it is VERY FAST and VERY EASY to truncate or drop partitions that contain old data when you no longer need them.
    The other solution used is to quit waiting so long to delete data and then you don't have to delete large amounts at the same time. So instead of deleting data once a month delete it once a week or even every night. Then the number of rows being deleted will be much smaller and, if the stats are kept current, Oracle may decide to use the index.

  • Selecting data from a table which has a LONG column

    hi all,
    In the table user_views there is a LONG column. Now I want to select rows from that table based on this LONG column, like:
    select * from user_views where text like '%account%';
    but I am getting the error "ORA-00932: inconsistent datatypes: expected NUMBER got LONG".
    How do I overcome this error.
    Please give me some suggestions.
    Thanks & Regards,
    Vijay

    > In the table user_views there is a LONG column. Now I want to select rows from that table based on this LONG column.
    LONG columns cannot appear in a WHERE or AND clause.

  • Selecting data from external table

    Hi there
    I was wondering if somebody could assist me. When I try to select data from an external table, no data is displayed, and in my log file I receive the following error:
    KUP-04026: field too long for datatype. Please find attached my external table script.
    CREATE TABLE DEMO_FILE_EXT
    (
    MACODE               NUMBER(7),
    MANO                    NUMBER(7),
    DEPNO                    VARCHAR2(2 BYTE),
    DEPTYPE                    NUMBER(5),
    STARTDATE               NUMBER(8),
    ENDDATE               NUMBER(8),
    OPTIONSTART               NUMBER(8),
    BENEFITSTART     NUMBER(8),
    STARTSUSPEND          NUMBER(8),
    ENDSUSPEND          NUMBER(8),
    INITIALS          VARCHAR2(5 BYTE),
    FIRSTNAME          VARCHAR2(20 BYTE),
    SURNAME                    VARCHAR2(25 BYTE),
    STR1                    VARCHAR2(30 BYTE),
    STR2                    VARCHAR2(30 BYTE),
    STR3                    VARCHAR2(30 BYTE),
    STR4                         VARCHAR2(30 BYTE),
    SCODE                    VARCHAR2(6 BYTE),
    POS1                    VARCHAR2(30 BYTE),
    POS2                    VARCHAR2(30 BYTE),
    POS3                    VARCHAR2(30 BYTE),
    POS4                    VARCHAR2(30 BYTE),
    PCODE                    VARCHAR2(6 BYTE),
    TELH                    VARCHAR2(10 BYTE),
    TELW                    VARCHAR2(10 BYTE),
    TELC                    VARCHAR2(10 BYTE),
    IDNUMBER          VARCHAR2(13 BYTE),
    DOB                NUMBER(8),
    GENDER               VARCHAR2(1 BYTE),
    EMPLOYER_CODE               VARCHAR2(10 BYTE),
    EMPLOYER_NAME               VARCHAR2(900 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
    DEFAULT DIRECTORY DEMO_FILES
    ACCESS PARAMETERS
    ( RECORDS DELIMITED BY newline
         BADFILE 'Tinusb.txt'
         DISCARDFILE 'Tinusd.txt'
         LOGFILE 'Tinusl.txt'
    SKIP 1
    FIELDS TERMINATED BY '|'
              MISSING FIELD VALUES ARE NULL
         (MACODE,
         MANO,
         DEPNO,
         DEPTYPE,
         STARTDATE,
         ENDDATE,
         OPTIONSTART,
         BENEFITSTART,
         STARTSUSPEND,
         ENDSUSPEND,
         INITIALS,
         FIRSTNAME,
         SURNAME,
         STR1,
         STR2,
         STR3,
         STR4,
         SCODE,
         POS1,
         POS2,
         POS3,
         POS4,
         PCODE,
         TELH,
         TELW,
         TELC,
         IDNUMBER,
         DOB,
         GENDER,
         EMPLOYER_CODE,
          EMPLOYER_NAME
          )
       )
    LOCATION (DEMO_FILES:'Test1.txt')
    )
    REJECT LIMIT UNLIMITED
    LOGGING
    NOCACHE
    NOPARALLEL;
    I have the correct privileges on the directory, but the error seems to be on the EMPLOYER_NAME field. The file I try to upload is in pipe-delimited format. The last field in the file does not have a pipe delimiter at the end. Can this be the problem? Must I go and look for any trailing spaces? Can I specify in the external table script how many characters I need for the EMPLOYER_NAME field? We receive this file from an external company.
    Thank you very much for the help
    Ferdie

    Common mistake: you gave the field sizes in the column listing of the table, but not in the file definition. Oracle does not apply one to the other. In the file definition section, give explicit field sizes.
    Hi shoblock,
    Sorry for only coming back to you now. Thank you for your help; I had to give an explicit field size for the last column (EMPLOYER_NAME).
    Thank you once again!!
    Ferdie

  • Procedure to insert data into table by selecting data from another table

    Hi all,
    I have to create a procedure that selects the data from one table and inserts it into another table. Any help on this? I also have to update the 2nd table whenever new records get inserted into the 1st table.
    Regards

    Hi, you can try something like:
    CREATE OR REPLACE PROCEDURE procedure_name
    IS
    BEGIN
      INSERT INTO TABLE1
      SELECT * FROM TABLE2;
    END;
    For the other part you may create a trigger on the first table AFTER INSERT to insert the values in the second table too.

  • Select data from a table and a view: field with same content and different type

    hi all,
    I need data from the table HRP1001 and the view V_USR_NAME. What they have in common is the content of the fields SOBID and BNAME.
    It would be easier if those fields had the same type, but they don't.
    Any suggestions for a workaround?
    Thanks
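
    One common workaround (a sketch, not from the thread): copy SOBID into a helper field that has the same type as BNAME before using FOR ALL ENTRIES, so the comparison runs on identical types. The selection values (PLVAR '01', OTYPE 'S', SCLAS 'US') and the V_USR_NAME fields used below are illustrative assumptions.
    TYPES: BEGIN OF ty_link,
             objid TYPE hrp1001-objid,
             bname TYPE v_usr_name-bname,   " same type as the view field
           END OF ty_link.
    DATA: ls_1001 TYPE hrp1001,
          ls_link TYPE ty_link,
          lt_link TYPE STANDARD TABLE OF ty_link,
          lt_user TYPE STANDARD TABLE OF v_usr_name.
    SELECT objid sobid FROM hrp1001
           INTO CORRESPONDING FIELDS OF ls_1001
           WHERE plvar = '01'
             AND otype = 'S'
             AND sclas = 'US'.             " relationships whose target is a user
      ls_link-objid = ls_1001-objid.
      ls_link-bname = ls_1001-sobid(12).   " CHAR 45 SOBID -> CHAR 12 BNAME
      APPEND ls_link TO lt_link.
    ENDSELECT.
    IF lt_link[] IS NOT INITIAL.
      SELECT bname name_first name_last FROM v_usr_name
             INTO CORRESPONDING FIELDS OF TABLE lt_user
             FOR ALL ENTRIES IN lt_link
             WHERE bname = lt_link-bname.
    ENDIF.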

