Wrong century for the dates from flat file

I am in the process of loading data from a flat file (a CSV) into Oracle via an external table.
All the dates are coming through with a year of 20xx. My understanding was that two-digit years 51-99 would be prefixed with 19 (such as 1951-1999) and 00-50
would be prefixed with 20 (such as 2004 ...).
The column below (startdate) is a TIMESTAMP.
Is my understanding wrong?
SQL> select startdate , to_date(startdate , 'dd/mm/yyyy' ) newdate from tab;
NEW_DATE STARTDATE
09/18/2090 18-SEP-90 12.00.00.000000000 AM
BANNER
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
PL/SQL Release 11.1.0.7.0 - Production
CORE 11.1.0.7.0 Production
TNS for 64-bit Windows: Version 11.1.0.7.0 - Production
NLSRTL Version 11.1.0.7.0 - Production
SQL> show parameter nls
NAME TYPE VALUE
nls_calendar string
nls_comp string BINARY
nls_currency string
nls_date_format string
nls_date_language string
nls_dual_currency string
nls_iso_currency string
nls_language string AMERICAN
nls_length_semantics string BYTE
nls_nchar_conv_excp string FALSE
nls_numeric_characters string
nls_sort string
nls_territory string AMERICA
nls_time_format string
nls_time_tz_format string
nls_timestamp_format string
nls_timestamp_tz_format string
SQL>
SQL>
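For reference, the 00-50 / 51-99 windowing described above is how the RR format element behaves; the YY element simply assumes the current century, which is what turns a two-digit 90 into 2090. A minimal sketch against DUAL (the output assumes the session is running in the 21st century):

select to_char(to_date('091890', 'mmddyy'), 'mm/dd/yyyy') as yy_mask,
       to_char(to_date('091890', 'mmddrr'), 'mm/dd/yyyy') as rr_mask
from   dual;

YY_MASK    RR_MASK
---------- ----------
09/18/2090 09/18/1990

So the understanding is right for RR, but whatever mask was applied while loading this file evidently behaved like YY.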

Offense? None taken ...
I literally typed the SQL (I did not copy & paste the SQL or the result sets). I did use to_char in my original SQL. I don't want to expose the real table; that's why I gave the made-up test case. If you carefully compare the SQL and the corresponding result sets, you would have noticed that.
Here is the SQL used (of course, I have changed the table name) ...
select to_char(startdate , 'mm/dd/yyyy') sdate, startdate from t
SDATE STARTDATE
09/18/2090 18-SEP-90 12.00.00.000000 AM
The flat file had the value 091890 as the data.
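Given that the flat file holds a bare 091890, one hedged fix (the directory, table, and file names here are made up, not from the thread) is to have the external table apply an RR-based mask explicitly in its access parameters, so the value lands as 18-SEP-1990 rather than 2090:

create table startdate_ext (
  startdate date
)
organization external (
  type oracle_loader
  default directory data_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
    ( startdate char(6) date_format date mask "mmddrr" )
  )
  location ('dates.csv')
);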

Similar Messages

  • Need a script to import the data from flat file

    Hi Friends,
    Does anyone have any scripts to import data from flat files into an Oracle database (Linux OS)? I have to automate the script to run every 30 minutes, check for any flat files in the Incoming directory, and process them without user interaction (see the scheduling sketch at the end of this thread).
    Thanks.
    Srini

    Here is my init.ora file
    # $Header: init.ora 06-aug-98.10:24:40 atsukerm Exp $
    # Copyright (c) 1991, 1997, 1998 by Oracle Corporation
    # NAME
    # init.ora
    # FUNCTION
    # NOTES
    # MODIFIED
    # atsukerm 08/06/98 - fix for 8.1.
    # hpiao 06/05/97 - fix for 803
    # glavash 05/12/97 - add oracle_trace_enable comment
    # hpiao 04/22/97 - remove ifile=, events=, etc.
    # alingelb 09/19/94 - remove vms-specific stuff
    # dpawson 07/07/93 - add more comments regarded archive start
    # maporter 10/29/92 - Add vms_sga_use_gblpagfile=TRUE
    # jloaiza 03/07/92 - change ALPHA to BETA
    # danderso 02/26/92 - change db_block_cache_protect to dbblock_cache_p
    # ghallmar 02/03/92 - db_directory -> db_domain
    # maporter 01/12/92 - merge changes from branch 1.8.308.1
    # maporter 12/21/91 - bug 76493: Add control_files parameter
    # wbridge 12/03/91 - use of %c in archive format is discouraged
    # ghallmar 12/02/91 - add global_names=true, db_directory=us.acme.com
    # thayes 11/27/91 - Change default for cache_clone
    # jloaiza 08/13/91 - merge changes from branch 1.7.100.1
    # jloaiza 07/31/91 - add debug stuff
    # rlim 04/29/91 - removal of char_is_varchar2
    # Bridge 03/12/91 - log_allocation no longer exists
    # Wijaya 02/05/91 - remove obsolete parameters
    # Example INIT.ORA file
    # This file is provided by Oracle Corporation to help you customize
    # your RDBMS installation for your site. Important system parameters
    # are discussed, and example settings given.
    # Some parameter settings are generic to any size installation.
    # For parameters that require different values in different size
    # installations, three scenarios have been provided: SMALL, MEDIUM
    # and LARGE. Any parameter that needs to be tuned according to
    # installation size will have three settings, each one commented
    # according to installation size.
    # Use the following table to approximate the SGA size needed for the
    # three scenarious provided in this file:
    # -------Installation/Database Size------
    #                    SMALL    MEDIUM    LARGE
    # Block Size 2K      4500K    6800K     17000K
    # Block Size 4K      5500K    8800K     21000K
    # To set up a database that multiple instances will be using, place
    # all instance-specific parameters in one file, and then have all
    # of these files point to a master file using the IFILE command.
    # This way, when you change a public
    # parameter, it will automatically change on all instances. This is
    # necessary, since all instances must run with the same value for many
    # parameters. For example, if you choose to use private rollback segments,
    # these must be specified in different files, but since all gc_*
    # parameters must be the same on all instances, they should be in one file.
    # INSTRUCTIONS: Edit this file and the other INIT files it calls for
    # your site, either by using the values provided here or by providing
    # your own. Then place an IFILE= line into each instance-specific
    # INIT file that points at this file.
    # NOTE: Parameter values suggested in this file are based on conservative
    # estimates for computer memory availability. You should adjust values upward
    # for modern machines.
    # You may also consider using Database Configuration Assistant tool (DBCA)
    # to create INIT file and to size your initial set of tablespaces based
    # on the user input.
    # replace DEFAULT with your database name
    db_name=DEFAULT
    db_files = 80 # SMALL
    # db_files = 400 # MEDIUM
    # db_files = 1500 # LARGE
    db_file_multiblock_read_count = 8 # SMALL
    # db_file_multiblock_read_count = 16 # MEDIUM
    # db_file_multiblock_read_count = 32 # LARGE
    db_block_buffers = 100 # SMALL
    # db_block_buffers = 550 # MEDIUM
    # db_block_buffers = 3200 # LARGE
    shared_pool_size = 3500000 # SMALL
    # shared_pool_size = 5000000 # MEDIUM
    # shared_pool_size = 9000000 # LARGE
    log_checkpoint_interval = 10000
    processes = 50 # SMALL
    # processes = 100 # MEDIUM
    # processes = 200 # LARGE
    parallel_max_servers = 5 # SMALL
    # parallel_max_servers = 4 x (number of CPUs) # MEDIUM
    # parallel_max_servers = 4 x (number of CPUs) # LARGE
    log_buffer = 32768 # SMALL
    # log_buffer = 32768 # MEDIUM
    # log_buffer = 163840 # LARGE
    # audit_trail = true # if you want auditing
    # timed_statistics = true # if you want timed statistics
    max_dump_file_size = 10240 # limit trace file size to 5 Meg each
    # Uncommenting the line below will cause automatic archiving if archiving has
    # been enabled using ALTER DATABASE ARCHIVELOG.
    # log_archive_start = true
    # log_archive_dest = disk$rdbms:[oracle.archive]
    # log_archive_format = "T%TS%S.ARC"
    # If using private rollback segments, place lines of the following
    # form in each of your instance-specific init.ora files:
    # rollback_segments = (name1, name2)
    # If using public rollback segments, define how many
    # rollback segments each instance will pick up, using the formula
    # # of rollback segments = transactions / transactions_per_rollback_segment
    # In this example each instance will grab 40/5 = 8:
    # transactions = 40
    # transactions_per_rollback_segment = 5
    # Global Naming -- enforce that a dblink has same name as the db it connects to
    global_names = TRUE
    # Edit and uncomment the following line to provide the suffix that will be
    # appended to the db_name parameter (separated with a dot) and stored as the
    # global database name when a database is created. If your site uses
    # Internet Domain names for e-mail, then the part of your e-mail address after
    # the '@' is a good candidate for this parameter value.
    # db_domain = us.acme.com      # global database name is db_name.db_domain
    # FOR DEVELOPMENT ONLY, ALWAYS TRY TO USE SYSTEM BACKING STORE
    # vms_sga_use_gblpagfil = TRUE
    # FOR BETA RELEASE ONLY. Enable debugging modes. Note that these can
    # adversely affect performance. On some non-VMS ports the db_block_cache_*
    # debugging modes have a severe effect on performance.
    #_db_block_cache_protect = true # memory protect buffers
    #event = "10210 trace name context forever, level 2" # data block checking
    #event = "10211 trace name context forever, level 2" # index block checking
    #event = "10235 trace name context forever, level 1" # memory heap checking
    #event = "10049 trace name context forever, level 2" # memory protect cursors
    # define parallel server (multi-instance) parameters
    #ifile = ora_system:initps.ora
    # define two control files by default
    control_files = (ora_control1, ora_control2)
    # Uncomment the following line if you wish to enable the Oracle Trace product
    # to trace server activity. This enables scheduling of server collections
    # from the Oracle Enterprise Manager Console.
    # Also, if the oracle_trace_collection_name parameter is non-null,
    # every session will write to the named collection, as well as enabling you
    # to schedule future collections from the console.
    # oracle_trace_enable = TRUE
    # Uncomment the following line, if you want to use some of the new 8.1
    # features. Please remember that using them may require some downgrade
    # actions if you later decide to move back to 8.0.
    #compatible = 8.1.0
    Thanks.
    Srini
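    As for the original question (checking an Incoming directory every 30 minutes with no user interaction), on a release that has DBMS_SCHEDULER (10g onwards) one hedged approach is a scheduler job that calls a loader routine. In this sketch load_incoming_pkg.process_files is a hypothetical procedure that would read the external table (or run the load) and archive the processed files:

    begin
      dbms_scheduler.create_job (
        job_name        => 'LOAD_INCOMING_FLAT_FILES',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'begin load_incoming_pkg.process_files; end;',
        start_date      => systimestamp,
        repeat_interval => 'FREQ=MINUTELY;INTERVAL=30',  -- poll every 30 minutes
        enabled         => true);
    end;
    /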

  • Data Source creation for Master Data from Flat File to BW

    Hi,
    I need to upload master data from a flat file. Can anybody tell me, step by step, right from the beginning, how to create the DataSource up to loading into the master data InfoObject?
    Does anybody have a document?
    Regards,
    Chakri.

    Hi,
    This is the procedure.
    1. Create m-data with or without attributes.
    2. Create infosource .
        a) with flexible update
             or
        b) with direct update
    3. Create transfer rules, assign the names of the master data and attributes in the "Transfer rules" tab, and transfer them to the communication structure.
    4. Create the flat file with the same structure as the communication structure.
    5. If you chose direct update, create an InfoPackage, assign the name of the flat file, and schedule it.
    6. If you chose flexible update, create update rules assigning the name of the InfoSource, and then schedule it by creating an InfoPackage.
    Hope this helps. If you still have problems, let me know.
    Follow this link also.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
    Assign points if helpful.
    Vinod.

  • How to load the data from a flat file (e.g. Excel) to the Planning area directly

    Hi all ,
    How can I load the data from a flat file directly into the Planning area?
    Please help me with this.
    Regards,
    Chandu .

    You downloaded one key figure's data from the planning book (interactive demand plan), made some changes, and need to upload the data back to the same planning book.
    But may I know why you are thinking of downloading, changing and uploading just to change the figures for a particular key figure? You can do it in the planning book itself.
    However, not all the key figures can be changed. But what type of key figure are you speaking of here? Is it like 'Forecast', for which the value is based on other key figures, or is it a key figure where some manual adjustments are to be done, so that it can be manually edited? In both cases, the data can be changed in the planning book only. In the first case, you can change the values of the dependent key figures, and in the second case, you can change the key figure directly.
    And please note that you can change the values of the key figures only at the detailed level. So, after loading the data in the book, use the drill-down option, maintain the data at the detailed level, change the figures, and this automatically gets reflected at the higher level.
    In case you are unable to change the values, go to the 'Design' mode of the book, right-click your key figure, and under "Selected Rows", uncheck the "Output Only" option. In case you are unable to see that option, you are not authorised to change it. See if you can change the authorisations by going to the "Data View" tab in the planning book configuration (/n/sapapo/sdp8b) and changing the value of Status to 3.
    Hope your query is answered with different solutions offered by many of the sdn colleagues here.
    Regards,
    Guru Charan.

  • Scheduling Problem for uploading Data from Flat file to SAP

    Hi guys,
    I am facing a weird problem in uploading some leave records into a Z table. The code works fine if we run it through SE38 after selecting the file from a shared location on the production server, which has all the access rights.
    This folder lies in the \usr folder of SAP Production.
    I have kept all the flat files in the shared path "\\Tis-mum-iz-s1\migration\SAP-INT\leave\" ...
    To give you the exact directory structure:
    Tis-mum-iz-s1 is the server name
    usr is the SAP system folder used for uploads and downloads
    usr
      -> Migration
           -> SAP-INT
                -> leave -> (Flat Files)
    The Migration folder is shared with all rights.
    Obviously, we cannot give the shared drive as the variant in the scheduler.
    So I use the system path, i.e. \usr\sap\tmp\migration\sap-int\leave\, as the variant.
    All my other download programs work fine with this path as a variant,
    but this particular upload program does not work with this path.
    I am giving you my code...
    * TATA INTERACTIVE SYSTEMS (A Division of TATA INDUSTRIES LIMITED)
    * REPORT      :  ZMIGRATE_ZLEAVE
    * DESCRIPTION :  To upload the leave data (ZLEAVE)
    * CREATED BY  :  Abhishek Bachhawat
    * CREATED ON  :  01.09.2005
    * CONSULTANT  :  ANAND
    REPORT  ZMIGRATE_ZLEAVE.
    TABLES: ZLEAVE.
    data: begin of wtab,
              MANDT(3),
              ZLVID(8),
              PERNR(8),
              ZSTDT(8),
              ZENDT(8),
              ZDAYS(4),
              AEDAT(8),
              ERDAT(8),
          end of wtab,
          itab like WTAB occurs 0 WITH HEADER LINE.
    data: temp like zleave occurs 0 WITH HEADER LINE.
    SELECTION-SCREEN BEGIN OF BLOCK file
                   WITH FRAME TITLE text-005.
    parameters: file like rlgrap-filename Obligatory.
    Concatenate File SY-DATUM '_Leave.txt' into File.
    SELECTION-SCREEN END OF BLOCK file.
    at SELECTION-SCREEN ON VALUE-REQUEST FOR file .
      CALL FUNCTION 'WS_FILENAME_GET'
        IMPORTING
          FILENAME = file.
      IF SY-SUBRC <> 0.
      ENDIF.
    start-of-selection.
      if file ne space.
        CALL FUNCTION 'WS_UPLOAD'
          EXPORTING
            FILENAME = FILE
            FILETYPE = 'DAT'
          TABLES
            DATA_TAB = ITAB.
      else.
        message e000(zps) with 'Specify a file'.
      endif.
      SORT ITAB BY ZLVID.
      LOOP AT ITAB.
        REFRESH TEMP.
        CLEAR TEMP.
        TEMP-MANDT = sy-mandt.
        TEMP-ERDAT = SY-DATUM.
        TEMP-ZLVID = ITAB-ZLVID.
        TEMP-PERNR = ITAB-PERNR.
        TEMP-ZSTDT = ITAB-ZSTDT.
        TEMP-ZENDT = ITAB-ZENDT.
        TEMP-ZDAYS = ITAB-ZDAYS.
        TEMP-AEDAT = ITAB-AEDAT.
        TEMP-ERDAT = ITAB-ERDAT.
        APPEND TEMP.
        SELECT SINGLE *
               FROM   ZLEAVE
               WHERE  ZLVID = TEMP-ZLVID
               AND    PERNR = TEMP-PERNR.
        IF SY-SUBRC = 0.
          UPDATE ZLEAVE SET ZSTDT = TEMP-ZSTDT
                            ZENDT = TEMP-ZENDT
                            ZDAYS = TEMP-ZDAYS
                            AEDAT = TEMP-AEDAT
                            ERDAT = TEMP-ERDAT
                 WHERE ZLVID = TEMP-ZLVID
                 AND   PERNR = TEMP-PERNR.
        ELSE.
          INSERT ZLEAVE FROM TABLE TEMP.
          COMMIT WORK.
        ENDIF.
      ENDLOOP.

    Hi,
    open dataset file for input in text mode.
    check sy-subrc = 0.
    while sy-subrc = 0.
      read dataset file into wa.
      if sy-subrc = 0.
      append wa to itab.
      else.
        exit.
      endif.
    endwhile.
    close dataset file.
    regards
    Siggi
    PS: check also the F1-help for open, read and close statements!

  • Help for loading data from flat file; it is important for me!!

    Hi All,
    I am working on loading data. I am getting the data from Informix as '|' (pipe) delimited.
    Each file consists of 20-25 columns, but I need only a few columns' information for this load. Thanks in advance for your reply.
    Thanks,
    RM

    Go for fixed-length loading; refer to some of the sample scripts
    (search for *.ctl in find files). An alternative using an external table is sketched below.
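    The alternative: expose the pipe-delimited export through an external table and pick only the columns you need in the SELECT. A hedged sketch with hypothetical names (list one entry per field in the file, in order; only a few are shown):

    create table informix_ext (
      col1 varchar2(100),
      col2 varchar2(100),
      col3 varchar2(100)
      -- ... continue, one column per '|'-separated field in the file
    )
    organization external (
      type oracle_loader
      default directory data_dir
      access parameters (
        records delimited by newline
        fields terminated by '|'
        missing field values are null
      )
      location ('informix_export.dat')
    );

    insert into target_table (a, b)
    select col1, col3 from informix_ext;   -- load only the columns you actually need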

  • How to pass parameters to the Control file while loading the data from flat file to staging table

    Thanks in advance

    Hi,
    The LOAD DATA statement is required at the beginning of the control file.
    INFILE: the INFILE keyword is used to specify the location of the data file or data files.
    INFILE * specifies that the data is found in the control file and not in an external file. INFILE '$FILE' can be used to pass the file path and file name as a parameter when the loader is registered as a concurrent program.
    INFILE '/home/vision/kap/import2.csv' specifies the file path and the file name.
    Hope this will help you.

  • How to load the data from flat file

    Hi ,
    I'm new to Oracle 10g. I have to upload a flat file (a notepad file with ',' as the separator) to a table in the database.
    Can you please tell me how to do it?
    Thanks
    vivi

    Hi vivi,
    Sure, read about the wonderful SQL*LOADER. This tool's purpose is exactly that.
    Here's a link: the Tahiti start page. Just type in sql*loader and off you go.
    Since you did not mention your database version, I could not point you to a more specific one.
    Regards,
    Guido

  • Given data like (10,20,.,30), how to load the data from a flat file to a base table, eliminating the dot from the data, using SQL*Loader

    pls send ans for this

    1b5595eb-fcfc-48cc-90d2-43ba913ea79f wrote:
    pls send ans for this
    Use any text editor to eliminate the dot before loading, or handle it during the load itself, as sketched below.
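    If editing the file by hand is not practical, a hedged alternative (all names hypothetical) is to stage the raw values first, for example through an external table staging_ext with a single VARCHAR2 column col1, and strip the lone dot while inserting into the base table:

    insert into base_table (val)
    select case
             when col1 = '.' then null        -- treat a lone dot as a missing value
             else to_number(col1)
           end
    from   staging_ext;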

  • How to upload data from flat file to DataStore object in BI 7.0

    Dear friends,
    Please tell me the
    step-by-step process for uploading data from a flat file to a DataStore object in BI 7.0.
    <removed by moderator>
    please help me
    Thanks,
    D.prabhu
    Edited by: Siegfried Szameitat on Aug 17, 2011 11:40 AM

    Create a transformation on the DataSource, keep the DSO as the target, and load.
    Ravi Thothadri

  • Reading data from flat file Using TEXT_IO

    Dear Gurus
    I already posted this question, but this time I need some other changes; sorry for that.
    I am using 10g Forms and using TEXT_IO for reading data from a flat file.
    My data is like this :-
    0|BP-V1|20100928|01|1|2430962.89|27|2430962.89|MUR|20100928120106
    9|2430962.89|000111111111|
    1|61304.88|000014104113|
    1|41961.73|000022096086|
    1|38475.65|000023640081|
    1|49749.34|000032133154|
    1|35572.46|000033093377|
    1|246671.01|000042148111|
    Here each column is separated by '|'. I want to read all the columns and do some validation.
    How can I do that?
    Initially my requirement was to read only 2 or 3 columns, so I did it like this:
    Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
    IS
    v_handle utl_file.file_type;
    v_filebuffer varchar2(500);
    line_0_date VARCHAR2 (10);
    line_0_Purp VARCHAR2 (10);
    line_0_count Number;
    line_0_sum number(12,2);
    line_0_ccy Varchar2(3);
    line_9_sum Number(12,2);
    line_9_Acc_no Varchar2(12);
    Line_1_Sum Number(12,2);
    Line_1_tot Number(15,2) := 0;
    Line_1_flag Number := 0;
    lval number;
    lacno varchar2(16);
    v_file varchar2(20);
    v_path varchar2(50);
    Begin
    v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
    v_path :=rtrim(regexp_substr( lfile_name , '.*\\' ),'\'); -- For the Path
    v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
    v_handle := UTL_FILE.fopen (v_path, v_file, 'r');
    LOOP
    UTL_FILE.get_line (v_handle, v_filebuffer);
    IF SUBSTR (v_filebuffer, 0, 1) = '0' THEN
    SELECT line_0 INTO line_0_date
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    SELECT line_0 INTO line_0_Purp
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 4;
    SELECT line_0 INTO line_0_count
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 7;
    SELECT line_0 INTO line_0_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 8;
    SELECT line_0 INTO line_0_ccy
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 9;
    ELSIF SUBSTR (v_filebuffer, 0, 1) = '9' THEN
    SELECT line_9 INTO line_9_Acc_no
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    SELECT line_9 INTO line_9_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 2;
    ELSIF SUBSTR (v_filebuffer, 0, 1) = '1' THEN
    line_1_flag := line_1_flag+1;
    SELECT line_1 INTO line_1_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_1, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    Line_1_tot := Line_1_tot + line_1_sum;
    END IF;
    END LOOP;
    DBMS_OUTPUT.put_line (Line_1_tot);
    DBMS_OUTPUT.PUT_LINE (Line_1_flag);
    UTL_FILE.fclose (v_handle);
    END;
    But now how can I do it? Shall I use a SELECT statement like this for all the columns?

    Sorry for that.
    As per our requirement:
    I need to read the flat file, and it looks like this:
    *0|BP-V1|20100928|01|1|2430962.89|9|2430962.89|MUR|20100928120106*
    *9|2430962.89|000111111111|*
    *1|61304.88|000014104113|*
    *1|41961.73|000022096086|*
    *1|38475.65|000023640081|*
    *1|49749.34|000032133154|*
    *1|35572.46|000033093377|*
    *1|246671.01|000042148111|*
    *1|120737.25|000053101979|*
    *1|151898.79|000082139768|*
    *1|84182.34|000082485593|*
    I have to check the file. The validations are:
    The 1st line should start with 0, else it should raise an error and insert that error into a table.
    For the 2nd line, the same thing: it should start with 9, else it should raise an error and insert that error into a table.
    Then the 3rd line onwards should start with 1, else it should raise an error and insert that error into a table.
    After that I have to do a validation where I read the 1st line, 2nd column: it should be BP-V1, else raise an error and insert that error into a table. Then I will check the 3rd column, which is 20100928; it should be in YYYYMMDD format, else the same thing, an error.
    Then, like this, I have different validations for all the columns.
    Then it will check the 2nd line, 3rd column, which is an account number. First I will check that it is 12 characters, else error. Then I will check what the user has entered in the form: for example, if the user entered 111111111, I will compare it with the 000111111111 in the 2nd line; I have to add 000 in front of the account number the user entered.
    Then, for the lines starting with 1, I have to take the 2nd column of all of them and compute a sum. After that I have to compare that sum with the value in the 6th column of the 1st line (the one starting with 0); it should be the same, else error.
    Then in the same way I have to count all the lines starting with 1 and compare the count with the 7th column of the 1st line; it should be the same. Here in this file it should be 9.
    MY CODE IS :-
    Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
    IS
    v_handle TEXT_IO.file_type;
    v_filebuffer varchar2(500);
    line_0_date VARCHAR2 (10);
    line_0_Purp VARCHAR2 (10);
    line_0_count Number;
    line_0_sum number(12,2);
    line_0_ccy Varchar2(3);
    line_9_sum Number(12,2);
    line_9_Acc_no Varchar2(12);
    Line_1_Sum Number(12,2);
    Line_1_tot Number(15,2) := 0;
    Line_1_flag Number := 0;
    lval number;
    lacno varchar2(16);
    v_file varchar2(20);
    v_path varchar2(50);
    LC$String VARCHAR2(50) ;--:= 'one|two|three|four|five|six|seven' ;
    LC$Token VARCHAR2(100) ;
    i PLS_INTEGER := 2 ;
    lfirst_char number;
    lvalue Varchar2(100) ;
    Begin
    v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
    v_path :=rtrim(regexp_substr( lfile_name , '.*\\' ),'\'); -- For the Path
    --v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
    Message(lfile_name);
    v_handle := TEXT_IO.fopen(lfile_name, 'r');
              BEGIN
                        LOOP
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        --Message('First Char '||lfirst_char); 
                                  IF lfirst_char = '0' Then
                                  Loop
                                  LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                                  Message('VAL - '||LC$Token);
                                  lvalue := LC$Token;
                                  EXIT WHEN LC$Token IS NULL ;
    i := i + 1 ;
    End Loop;
                                  Else
                                       Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (9999,'0002','First line should always start with 0');
                                       Forms_DDL('Commit');
                                       raise form_Trigger_failure;
                                  End if ;
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                        --Message('Row '||LC$Token);
                             IF lfirst_char = '9' Then
                                  Null;
                             Else
                                  Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (8888,'0016','Second line should start with 9');
                                  Forms_DDL('Commit');
                                  raise form_Trigger_failure;
                             End IF;
                        LOOP
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                        --Message('Row '||LC$Token);
                                  IF lfirst_char = '1' Then
                                  Null;
                                  Else
                                       Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (7777,'0022','The third line onward should start with 1');
                                       Forms_DDL('Commit');
                                       raise form_Trigger_failure;
                                  End if;
                        END LOOP;
                        --END IF;
                        END LOOP;
              EXCEPTION
                   When No_Data_Found Then
              TEXT_IO.fclose (v_handle);
              END;
    Exception
         When Others Then
         Message('Other error');
    END;
    I am calling the function you gave, Split, as mcb_simulator_pkg.Split (a sketch of such a helper follows below).
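    For completeness, a hedged sketch of what such a Split-style helper can look like in plain PL/SQL; the name, the default separator, and the lack of bounds checking are assumptions, not the actual function used in the thread. It relies only on INSTR/SUBSTR, so it also runs inside Forms:

    function get_token (p_line in varchar2,
                        p_pos  in pls_integer,
                        p_sep  in varchar2 default '|') return varchar2
    is
      l_start pls_integer;
      l_end   pls_integer;
    begin
      -- start just after the (p_pos - 1)th separator
      if p_pos = 1 then
        l_start := 1;
      else
        l_start := instr(p_line, p_sep, 1, p_pos - 1) + 1;
      end if;
      -- end at the p_pos-th separator, or at end of line if there is none
      l_end := instr(p_line, p_sep, 1, p_pos);
      if l_end = 0 then
        l_end := length(p_line) + 1;
      end if;
      return substr(p_line, l_start, l_end - l_start);
    end get_token;

    Each column of a line is then simply get_token(v_filebuffer, n), which avoids issuing a separate SELECT ... FROM DUAL per field.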

  • Error in loading data from flat file....

    Hi,
    While loading the data from a flat file, only one column is getting populated in the ODS, and after that the sixth column is getting populated with null values. Also, when I save the changes in the CSV file, they are not getting saved.
    Could you please tell me what the probable reason for this could be?
    Also, what are the points we should keep in mind while loading from flat files?
    Regards,
    Jeetu

    Hi,
    You need to take care of the following:
    1. Your flat file structure (left-to-right columns) should match the DataSource (top-to-bottom) column by column. If you don't wish to load some column, leave it as a blank column in the flat file.
    2. Your file should not be open while loading data.
    3. Make sure your transfer rules and update rules are defined properly.
    4. Do the simulation in the InfoPackage first; that will give you a fair idea.
    5. To start with, load the data into the PSA only, check it, and then take it forward.
    Hope it helps.
    regards
    Vikash

  • SSIS - import data from flat file to table (SQL Server 2012)

    I have created an SSIS package for importing data from a flat file to a table in SQL Server 2012,
    but I got the below error for some columns in the data flow task:
    Error: the column cannot be processed because more than one code page (950 and 1252) is specified for it.
    Can anyone help?

    Hi,
    The issue occurs because the source flat file uses the ANSI/OEM Traditional Chinese (Big5) encoding. When processing the source file, the flat file connection manager uses code page 950 for the columns. Because SQL Server uses code pages to perform conversions between non-Unicode data and Unicode data, the data in the code page 950 based input columns cannot be loaded into code page 1252 based destination columns. To resolve the issue, you need to load the data into a SQL Server destination table that includes Unicode columns (nchar or nvarchar), and convert the input columns to Unicode via Data Conversion or the Advanced Editor for the Flat File Source at the same time (a destination-table sketch follows this reply).
    Another option, which may not be that practical, is to create a new database based on the Chinese_Taiwan_Stroke_BIN collation and load the data into non-Unicode columns directly.
    Reference:
    http://social.technet.microsoft.com/Forums/windows/en-US/f939e3ba-a47e-43b9-88c3-c94bdfb7da58/forum-faq-how-to-fix-the-error-the-column-xx-cannot-be-processed-because-more-than-one-code-page?forum=sqlintegrationservices 
    Regards,
    Mike Yin
    TechNet Community Support
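    A hedged sketch (all names hypothetical) of such a Unicode destination table, to pair with a Data Conversion to Unicode string [DT_WSTR] on the affected input columns:

    CREATE TABLE dbo.ImportedRows (
        RowId        INT           NOT NULL,
        CustomerName NVARCHAR(100) NULL,  -- Unicode column; can hold the Big5-sourced Traditional Chinese text
        Address      NVARCHAR(200) NULL
    );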

  • How often in real-time projects do we extract data from flat files and process them

    I am going through the BODS Data Integrator and trying to understand the demand for ETL services that extract data from a flat file. Is that really important in real-time jobs?
    Thank you very much for the helpful info.

    Hi,
    As per the inputs given by you guys, I started loading data from flat files.
    I tried to load 28 files, of which I was able to load 24 successfully. For the other 4 I got these error messages:
    1) Error 'Enter period in the format __.YYYY...' at conversion exit CONVERSION_EXIT_PERI6_INPUT (field CALMONTH record 1, value DUMYTRA)
    Message no. RSDS012
    2)  a) Error 'The argument '1,008.00' cannot be interpreted as anumber' on assignment field QUANT_B record 11714 value 1,008.00
    Message no. RSDS013
       b) Error 'The argument '1,110.00' cannot be interpreted as anumber' on assignment field QUANT_B record 15374 value 1,110.00
    Message no. RSDS013
    3) a) Error 'The argument '1,140.00' cannot be interpreted as anumber' on assignment field QUANT_B record 1647 value 1,140.00
    Message no. RSDS013
       b) Error 'The argument '2,028.00' cannot be interpreted as anumber' on assignment field QUANT_B record 4625 value 2,028.00
    Message no. RSDS013
    4) Error 'The argument '1,151.00' cannot be interpreted as anumber' on assignment field QUANT_B record 7808 value 1,151.00
    Message no. RSDS013
    I am unable to trace what the error is exactly.
    I checked these values in the files and they are perfect.
    can anybody please guide me on this issue.
    With Regards,
    Pradeep.B

  • Use LINQ to extract the data from a file...

    Hi,
    I have created a subprocedure CreateEventList which populates an EventsComboBox with a current day's events (if any).
    I need to store the events in a generic List communityEvents, which is a collection of CommunityEvent objects. This List needs to be created and assigned to the instance variable communityEvents.
    This method should call helper method ExtractData, which will use LINQ to extract the data from my file.
    The specified day is the date selected on the calendar control. This method will be called from CreateEventList.
    This method should clear all data from List communityEvents. A LINQ query that creates CommunityEvent objects should select the events scheduled for the selected day from the file. The selected events should be added to List communityEvents.
    See code below.
    Thanks,
    Thanks,
    public class CommunityEvent
    {
        private int day;
        public int Day
        {
            get { return day; }
            set { day = value; }
        }
        private string time;
        public string Time
        {
            get { return time; }
            set { time = value; }
        }
        private decimal price;
        public decimal Price
        {
            get { return price; }
            set { price = value; }
        }
        private string name;
        public string Name
        {
            get { return name; }
            set { name = value; }
        }
        private string description;
        public string Description
        {
            get { return description; }
            set { description = value; }
        }
    }

    private void eventComboBox_SelectedIndexChanged(object sender, EventArgs e)
    {
        if (eventComboBox.SelectedIndex == 0)
            descriptionTextBox.Text = "2.30PM. Price 12.50. Take part in creating various types of Arts & Crafts at this fair.";
        if (eventComboBox.SelectedIndex == 1)
            descriptionTextBox.Text = "4.30PM. Price 00.00. Take part in cleaning the local Park.";
        if (eventComboBox.SelectedIndex == 2)
            descriptionTextBox.Text = "1.30PM. Price 10.00. Take part in selling goods.";
        if (eventComboBox.SelectedIndex == 3)
            descriptionTextBox.Text = "12.30PM. Price 10.00. Take part in a game of rounders in the local Park.";
        if (eventComboBox.SelectedIndex == 4)
            descriptionTextBox.Text = "11.30PM. Price 15.00. Take part in an Egg & Spoon Race in the local Park";
        if (eventComboBox.SelectedIndex == 5)
            descriptionTextBox.Text = "No Events today.";
    }

    Any help here would be great.
    Look, you have to make the file an XML file type -- Somefilename.xml.
    http://www.xmlfiles.com/xml/xml_intro.asp
    You can use NotePad XML to make the XML and save the text file.
    http://support.microsoft.com/kb/296560
    Or you can just use Notepad (standard), if you know the basics of how to create XML, which is just text data that can be created and saved in a text file and which represents data.
    http://www.codeproject.com/Tips/522456/Reading-XML-using-LINQ
    You can do a (select new CommunityEvent) just like the example is doing a (select new FileToWatch), and load the XML data into the CommunityEvent properties.
    So you need to learn how to make a manual XML text file with XML data in it, and you need to learn how to use LINQ to read the XML. LINQ is not going to work against some flat text file you created. There are plenty of examples out on Bing and Google on how to use LINQ-to-XML.
    http://en.wikipedia.org/wiki/Language_Integrated_Query
    <copied>
    LINQ extends the language by the addition of query expressions, which are akin to SQL statements, and can be used to conveniently extract and process data from arrays, enumerable classes, XML documents, relational databases, and third-party data sources. Other uses, which utilize query expressions as a general framework for readably composing arbitrary computations, include the construction of event handlers [2] or monadic parsers. [3]
    <end>
