How to tackle quoted fields (' ') in UTL_FILE

Hi,
We are in the process of loading data from .csv files into the database, but instead of using sqlldr we have opted for the UTL_FILE package.
In sqlldr there is an option where we can specify OPTIONALLY ENCLOSED BY ' ' to take care of data enclosed in quotes in the files.
How do I take care of the same in UTL_FILE?
What I want is an equivalent of sqlldr's ENCLOSED BY ' ' option in UTL_FILE.
regards
Sushant

UTL_FILE can only read the entire record into a variable; you have to write the code to parse it after you do a UTL_FILE.GET_LINE.
You could use the INSTR function along with SUBSTR to parse. UTL_FILE works well with fixed-format files. You should stick with sqlldr if your columns are enclosed by quotes.
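There is no built-in equivalent in UTL_FILE; the INSTR/SUBSTR parsing suggested above amounts to walking the line character by character and tracking whether you are inside quotes. A minimal sketch of that logic in Python rather than PL/SQL, assuming comma-delimited records, optionally quoted fields, and doubled quotes as escapes:

```python
def split_csv_line(line, delim=",", quote='"'):
    """Split one record into fields, honoring optionally quoted values
    (the logical equivalent of SQL*Loader's OPTIONALLY ENCLOSED BY)."""
    fields, current, in_quotes = [], [], False
    i = 0
    while i < len(line):
        ch = line[i]
        if in_quotes:
            if ch == quote and i + 1 < len(line) and line[i + 1] == quote:
                current.append(quote)   # doubled quote inside a quoted field
                i += 1
            elif ch == quote:
                in_quotes = False       # closing quote
            else:
                current.append(ch)
        elif ch == quote and not current:
            in_quotes = True            # opening quote at the start of a field
        elif ch == delim:
            fields.append("".join(current))
            current = []
        else:
            current.append(ch)
        i += 1
    fields.append("".join(current))
    return fields

print(split_csv_line('1,"Smith, John",plain'))   # → ['1', 'Smith, John', 'plain']
```

The same state machine translates directly to a PL/SQL loop over SUBSTR(line, i, 1). (Outside the database, Python's standard csv module already implements this.)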

Similar Messages

  • How to tackle long processing threads?

    Hi All,
    I have a requirement where it takes a couple of minutes to complete a process. I made that method "synchronized" so that none of the other threads can do anything (since all other activities are dependent on this thread, the others have to wait until this important thread completes its process).
    Now the problem is that the other threads run into trouble: all of them end up in MW (monitor wait) state, and this sometimes kills/hangs the WLS instance. We are using wls6.1+sp2.
    Please suggest, with some examples, how to tackle these kinds of problems.
    -sangita

    What is this process doing? This process gets some information from an external DB source. Basically, it does a search on a given field. We can't do anything with this DB resource as we don't have any hold on it; it's some legacy old crap.
    All we need to do is handle this long-running process through Java. Could you please suggest how to handle long processes intelligently?
    Thanks for your time on this.
    -sangita
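    The usual alternative to serializing everything on one synchronized method is to hand the slow call to a bounded worker pool and cap how long callers wait. A sketch of the pattern in Python (in the original WebLogic/Java setting a java.util.concurrent ExecutorService plays the same role); `slow_search` is a made-up stand-in for the legacy DB lookup:

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def slow_search(term):
    """Hypothetical stand-in for the slow legacy-DB lookup."""
    time.sleep(0.2)
    return "results for " + term

# A bounded pool limits concurrent access without blocking every caller on one lock.
executor = ThreadPoolExecutor(max_workers=4)

def search_with_timeout(term, timeout):
    future = executor.submit(slow_search, term)
    try:
        return future.result(timeout=timeout)   # wait at most `timeout` seconds
    except TimeoutError:
        return None   # caller can retry or report "still working" instead of hanging
```

    Callers that time out get control back immediately while the lookup finishes in the background, so other request threads never pile up in monitor wait.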

  • How to tackle 'Forward limit active' without stopping a running VI

    How to tackle 'Forward limit active' without stopping a running VI: our robot comprises 8 joints. To limit the robot's work area we use limit buttons, but the problem is that whenever a limit button is active the VI stops. I decomposed the error cluster, but that does not work. Is there any good method to solve this problem? Thanks!

    Please provide more information about the hard- and software you are using:
    - NI-Motion board type
    - NI-Motion software version
    - LabVIEW version
    - Simple example code that reproduces the problem
    Best regards,
    Jochen Klier
    Applications Engineering Group Leader
    National Instruments Germany GmbH

  • How do I use UTL_FILE to insert a large number of fields to a file?

    Hi
    I am trying to use UTL_FILE for the first time in a stored procedure. I need to run a complex query to select 50 fields from various tables, and I need these written as one line in the output file for every row. Is this possible? My procedure so far is like the following:
    CREATE OR REPLACE PROCEDURE PROC_TEST IS
      output_file UTL_FILE.FILE_TYPE;
    BEGIN
       -- 'MY_DIR' and the file name below are placeholders: the file must be
       -- opened with FOPEN before anything can be written to it
       output_file := UTL_FILE.FOPEN('MY_DIR', 'proc_test.out', 'w');
       FOR query IN (SELECT FIELD1, FIELD2, ..........FIELD50
                       FROM TABLE A, TABLE B
                      WHERE A.ID = B.ID
                      ETC)
       LOOP
          UTL_FILE.PUT_LINE(output_file,
             query.FIELD1 || '|' || query.FIELD2 || .......... || '|' || query.FIELD50);
       END LOOP;
       UTL_FILE.FCLOSE (output_file);
    EXCEPTION
       WHEN NO_DATA_FOUND THEN
          NULL;
       WHEN OTHERS THEN
          UTL_FILE.FCLOSE_ALL;
          RAISE;
    END PROC_TEST;
    Do I need to declare 'query' (after the FOR) anywhere? Also, please advise on how I put all of the fields into the file.
    Thanks
    GB

    Thanks Steve,
    I have the UTL_FILE working fine now.
    I have other queries to run and conditions to apply in the same procedure, and I need to schedule via Enterprise Manager, therefore using UTL_FILE in a procedure seemed the best option. I looked up Data-pump but this seems to be an 11g feature, and we are still on 10g therefore I will not be able to use it.
    Thanks for your help.
    GB
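    For the "all 50 fields on one line" part, the PUT_LINE argument is just the columns concatenated with a delimiter between them. The same idea sketched with Python's csv module; the sample values are taken from the table data elsewhere on this page, standing in for the 50-column query result:

```python
import csv
import io

# Sample rows standing in for the 50-column query result.
rows = [
    (37, "LLSH", "Eastern", "Longleaf Pine - Sandhills"),
    (24, "MABA", "Eastern", "Maple-Basswood"),
]

buf = io.StringIO()          # in a real run this would be an open file
writer = csv.writer(buf, delimiter="|")
for row in rows:
    writer.writerow(row)     # one delimited output line per row, like PUT_LINE

output = buf.getvalue()
print(output)
```

    Pick a delimiter that cannot occur inside the column values, or quote the values, so the file can be parsed back unambiguously.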

  • Advice / Help needed on how to tackle this mountain...

    First, I've scoured the forums looking for an answer and some clarity, but have come up empty. Any advice or help you can offer will be greatly appreciated...
    So I'm trying to tackle the feat of sorting through thousands of MP3s in several different folders, and getting them all stored in one folder with just the ones I want to keep (estimated to be about 3,000 when all is said and done). Having iTunes automatically copy them as I drag and drop is perfect, but I don't want the program to automatically sort them into hundreds of different folders. Is there a way to not completely turn this feature off, but have it just copy into the root folder?
    Assuming the answer is no, what's the best way for me to approach this? Right now I've turned off the copy feature, and have been adding 100 or so songs into a new playlist, editing the tags in iTunes, and copying into my finished root folder. However, I obviously lose my ratings when I move the files, and have to update my library by removing all previous files in the library and adding the root folder again.
    A way around this is for me to just not assign a rating to any files until I have my final root folder set for import into the iTunes library, but this also raises a conceptual problem: I use other programs for uploading songs from CDs and downloading, and have them all sent to an "Unsorted MP3s" folder. I would like to be able to go back to this folder, edit the tags and clean them up, then copy the files into the root folder and update my library without having to remove everything and reimport the whole folder, losing all my ratings and play counts.
    So I guess there are really 2 questions I have:
    First, is there a way to have iTunes just copy the file into a root folder without automatically sorting and creating folders itself?
    If no, what's the best way for me to go about updating, importing, and copying songs into a singular root folder? I realize I could just stay on top of it, and only add a few files at a time individually, but that solution isn't very feasible when I'd like to update and add large numbers at a time.
    I gotta imagine there's something I'm missing here and it's possible. I did have the idea of merging ALL of my MP3s into my root folder first, then delete and edit from iTunes, eventually ending with a perfect library and all files in one folder. However, I'm still posed with the problem of how to add files at a later date without losing my whole library...
    Hopefully I haven't been too confusing, and I'll be sure to offer more clarity if need be. Thanks in advance for all the help!
    -Mike

    I'm not 100% sure what you're trying to do. I think, and correct me if I'm wrong, that you want to copy all your songs that are scattered all over the PC into one folder in the root, but not have iTunes consolidate the files into folders by name etc.
    I'm not sure if this is of any help to you, but I had songs scattered all over my PC and wanted to put them in a central folder and then share that folder across multiple user accounts on the one PC (this could be the root if you wish; I just chose Shared Music). The following link explains what I did to put all my music in the one place. (I urge caution though, as I'm not 100% certain this is what you want to do; anyway, hope it helps.)
    http://discussions.apple.com/thread.jspa?threadID=608497&tstart=0

  • How to tackle BACKSPACE key behaviour?

    Hi,
    I noticed a nasty thing about pressing the BACKSPACE key in a form:
    it causes navigation to the previous page.
    How can I tackle it?
    JDev 11.1.2.3

    Hi
    It is not related to ADF; it is web browser behavior.
    See
    http://mcrusch.wordpress.com/2009/02/03/disabling-back-behavior-of-backspace-in-browser/

  • How to tackle the dataflow problem when Value Change event always triggers after another GUI event

    We know that a Value Change event always triggers after another GUI event. E.g., the user modifies a string control, then the user clicks on a boolean control: the boolean-clicked event is triggered before the string control's value-change event.
    Now suppose the GUI event that must happen to subsequently trigger the Value Change event can itself affect the data that the Value Change event is supposed to work on. How can we tackle this problem?
    For example, in a mockup application that the grand purpose is to have user entered values in a textbox logged to a file (no missing information is accepted, and there is a boolean to determine how the information is logged).
    There are 2 controls, boolean A when clicked (mouse down) will load random number in text box B. Text box B is designed with event structure VALUE change which saves whatever values user enters into text box B to a log file.
    There are 3 problems when, instead of clicking anywhere on the front panel after modifying text box B, the user ends up clicking on boolean control A.
    1. The mouse-down event on boolean control A will execute first, modifying text box B's content before the user-entered values in B get saved.
    2. The value of boolean A can potentially affect how text box B is logged.
    3. The value of boolean A affects how the file is logged, and this is indeterminate. When running this VI without highlighting, the text box B value-change event executes -before- boolean A's value is updated (F to T). When running this VI with highlighting, boolean A's value is updated (F to T) (because we click on it) -before- text box B's value-change event occurs. Why is it like this?
    Now the situation I made up may seem like nonsense, but I believe it resembles, one way or another, a problem that you might run into. How would you solve this problem elegantly?
     

    You can set the string control to "update while typing".
    Are you sure appending the log to itself is reasonable? Wouldn't it grow without bounds if the user keeps entering strings or pressing the ignore button?
    Why isn't the "constant" a diagram constant instead of a control? Is the user allowed to change it?
    To reset, just write empty strings or a false to local variables of the controls ("Reinit to defaults" seems a bit heavy-handed).
    All you probably need is a single event case for "Ignore: Value Change" and "String: Value Change"; no need for the local variable.
    Also add a stop button and an event for it.
    You don't need the timeout event.
     

  • LDAP_UNAVAILABLE: how to tackle it

    I am using LDAP version 3.0 for fetching data from Sun Directory Server 5.1,
    using the Microsoft LDAP API in Delphi 7.0.
    I use the following APIs. First, initialize the session:
    FActiveDirSession := ldap_init(PChar(Host), PortNumber)
    Then connect and bind to the server using:
    Result := ldap_Connect(FActiveDirSession, nil);
    Result := ldap_simple_bind_s(FActiveDirSession,PChar(User),PChar(Password));
    Then search records using:
    Result := ldap_search_s(FActiveDirSession, PChar(BaseDN),SearchScope ,PChar(Filter),nil,Cardinal(False),Messages);
    This fetches records correctly, but
    when the same search function is called after one day with the same settings and the same session, it returns error 0x34 = 52. This problem occurs on the second day when it is rescheduled. Can anyone help me figure out what the actual problem is? Is it a problem with Sun Directory Server, or does my program have the fault?
    If it is a problem with Sun Directory Server, please tell me what the possible solutions are and what can be done with the directory server so that this problem does not arise.
    Anxiously waiting for your reply.
    regards
    Imtiaz Khadim
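    Error 0x34 (52) is LDAP_UNAVAILABLE; when a session has sat idle for a day, a common cause is that the server or a firewall has dropped the connection, leaving the old session handle stale. A generic remedy is to bind again and retry the search when that specific code comes back. A sketch of the wrapper in Python, where `session_factory` and `do_search` are hypothetical stand-ins for the ldap_init + ldap_simple_bind_s sequence and for ldap_search_s:

```python
LDAP_UNAVAILABLE = 0x34   # the error code 52 reported above

class LdapError(Exception):
    """Hypothetical error carrying the LDAP result code."""
    def __init__(self, code):
        self.code = code

def search_with_rebind(session_factory, do_search, retries=1):
    """Run do_search(session); on LDAP_UNAVAILABLE, rebuild the session and retry."""
    session = session_factory()
    for attempt in range(retries + 1):
        try:
            return do_search(session)
        except LdapError as e:
            if e.code != LDAP_UNAVAILABLE or attempt == retries:
                raise                         # a different error, or out of retries
            session = session_factory()       # stale session: bind again, then retry
```

    In the Delphi program this corresponds to calling ldap_init/ldap_connect/ldap_simple_bind_s again before re-issuing ldap_search_s whenever the search returns 0x34, rather than reusing the day-old session handle.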

    Thank you for your helpful replies. I will report my status:
    1) I have found there is no ORACLE_SID at all in my system.
    2) $ORACLE_HOME/bin/emctl status dbconsole
    it shows : No ORACLE_SID is defined.
    3) I exported the ORACLE_SID=+ASM in the .bash_profile
    4) $ORACLE_HOME/bin/emctl status dbconsole
    it shows: TZ set to US/Central
    OC4J Configuration issue. /opt/oracle/product/10.1.0/db_1/oc4j/j2ee/OC4J_DBConsole_localdomain_+ASM not found.
    (*NOTE: ORACLE_HOME=/opt/oracle/product/10.1.0/db_1)
    5) I don't know how to check the EM version. But when I click Start Application-Oracle 10g Menu-DBA Tasks-Oracle Enterprise Manager 10g Database Control Start, it shows:
    TZ set to US/Central
    Oracle Enterprise Manager 10g Database Control Release 10.1.0.2.0
    Copyright(c) 1996, 2004 Oracle Corporation. All rights reserved.
    http://localhost.localdomain:5500/em/console/aboutApplication
    Starting Oracle Enterprise Manager 10g Database Control........started.
    Logs are generated in directory /opt/oracle/product/10.1.0/db_1/localhost.localdomain_orcl10g/sysman/log
    6) I can not find any *.dbf in my system, and I have only a raw partitions named oracleasm mounted in /dev/oracleasm. When I click Start Application-Oracle 10g Menu-DBA Tasks-Oracle 10g Database Start, it shows:
    SQL> Connected to an idle instance.
    SQL> ASM instance started
    Total System Global Area 100663296 bytes
    Fixed Size 777616 bytes
    Variable Size 99885680 bytes
    Database buffers 0 bytes
    Redo buffers 0 bytes
    ASM diskgroups mounted
    SQL>disconnected from Oracle Database 10g Release 10.1.0.2.0 -Production
    Database "+ASM" warm started.
    Database "+ASM" already started.
    Database "+ASM" already started.
    All these things make me guess the databases were built using ASM. Am I right?
    7) If it is EM configuration problem, what should I do next?
    Thanks a lot!

  • How to tackle corrupt OST File Problem ???

    In past days I faced a lot of problems due to an inaccessible and unreadable offline storage (.ost) file. With manual methods I was unsuccessful in tackling this problem, and it took a lot of my time. Then, after a few days, I found
    SysTools OST Converter; it is one of the best third-party OST file recovery applications, and it retrieved my OST file database in just a few seconds. It also provides a facility to split a large OST file into small PST files.
    You can search for this SysTools OST Converter at:
    https://www.google.com/search?q=systools+ost+recovery
    http://www.bing.com/search?q=systools+ost+recovery

    Hi,
    Actually this is not the right forum for this issue, but I'd like to share the following article with you; I hope it can help.
    Repair Outlook Data Files (.pst and .ost)
    http://office.microsoft.com/en-in/outlook-help/repair-outlook-data-files-pst-and-ost-HA010354964.aspx
    Meanwhile, I recommend you to go to the following forum for help:
    Microsoft Office forum
    http://social.technet.microsoft.com/Forums/office/en-US/home?category=officeitpro.
    Regards,
    Yolanda

  • How to tackle " . " or " , "Comma DECIMAL notations in text file uploading?

    Hi Experts,
    I have to upload a text file via LSMW to create Purchase Orders.
    The issue is with the QUANTITY and VALUE fields, because the text file may sometimes contain " . " as the decimal separator and sometimes " , " (comma, Europe) as the decimal separator.
    Sometimes this LSMW is run by US people, where " . " is the decimal separator, and sometimes by European people, where " , " is the decimal separator.
    So my request is: how do I handle all of these situations? Is there any FM (function module) for this?
    Thank you

    Hello Srinivas
    The easiest approach would be the organizational one:
    assuming that all relevant values have at most two decimal digits, export all values without a decimal sign and interpret the last two digits of any value as the decimal places (this approach is used in many EDI messages).
    The second approach would be to include a "marker" about the decimal notation in the file.
    However, I would always recommend using the simplest approach (= the 1st approach).
    Regards
      Uwe
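    Besides the two organizational approaches above, a third option is to detect the separator per value: whichever of '.' and ',' appears right-most in the string is the decimal separator, and the other character is grouping. A sketch of that heuristic in Python (the same string logic applies in ABAP); note the caveat that it cannot distinguish a European decimal "1,234" from a US thousands grouping "1,234", so it only works if values always carry a decimal part or never carry grouping:

```python
def parse_quantity(text):
    """Normalize '1.234,56' (European) and '1,234.56' (US) to a float.
    Heuristic: the right-most of '.' and ',' is the decimal separator."""
    text = text.strip()
    last_dot, last_comma = text.rfind("."), text.rfind(",")
    if last_dot == last_comma == -1:
        return float(text)                      # no separator at all
    if last_dot > last_comma:
        return float(text.replace(",", ""))     # '.' is decimal, ',' is grouping
    return float(text.replace(".", "").replace(",", "."))  # ',' is decimal

print(parse_quantity("1.234,56"))   # → 1234.56
print(parse_quantity("1,234.56"))   # → 1234.56
```

    Given the ambiguity, Uwe's first approach (no decimal sign at all, fixed two implied decimals) remains the most robust if you control the file format.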

  • How to tackle files other than jars in order to run client applications?

    Hi,
    I am moving my client application to run through JWS. In order to run the client application, I need to download files other than jars, like XML files, some bin files, etc. My JNLP file is something like the following:
    <security>
    <all-permissions/>
    </security>
    <resources>
    <j2se version="1.6"/>
    <jar href="client.jar" download="eager"/>
    <jar href="log4j.xml" download="eager"/>
    </resources>
    During downloading of the XML, JWS raises an exception for the XML file: "#### Could not verify signing in resource: http://testmachine:8080/classes/log4j.xml".
    #1. How can I avoid this exception?
    #2. Can only jar files be downloaded by JWS, or do I need to package non-jar files into jar file format?
    Appreciate if someone can help me on that matter?

    The JNLP specification only allows jar files as resources. Downloading anything else would not help anyway, as there would be no way to access it. Only the classloader has access to the downloaded resources, so it is required to bundle all other resources in jar files and access them with Thread.currentThread().getContextClassLoader().getResourceAsStream("myfile.xml");
    /Andy

  • How to tackle the Training appraisal form whenever manager has to appraise Trainees?

    Dear Team,
    I have a query on the creation of a Training Appraisal Template.
    As per the requirement, we need to create a template with which trainees are evaluated/appraised by their reporting manager.
    After a stipulated time, the reporting manager has to evaluate the employees who have attended the training program.
    As per standard TEM, there is an appraisal for the training program itself.
    How do we manage the appraisal of trainees by their respective managers?
    Kindly advise me on how to proceed.
    Regards,
    Sairam.

    Hello Tiberiu,
    Have you tried changing the element access for the appraisee for this particular element (tab 'element access' in the criteria maintenance)?
    Regards
    Nicole

  • Need a suggestion: have a task and not sure how to tackle it.

    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE     11.2.0.3.0     Production
    TNS for Solaris: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    I have a table that is being split up. On top of that, the structure of the columns being removed from the original table is being changed, so that the column names become part of the row data.
    Here is the original table build. All columns that start with A_ through E_ are being pulled out, and they are the only ones I'm concerned with.
    CREATE TABLE FS_NRIS_INFORMS.NRI_FRCC_REF
    (
      PNVG_ID                 NUMBER(10)            NOT NULL,
      PNVG                    VARCHAR2(10 BYTE)     NOT NULL,
      LOCATION                VARCHAR2(16 BYTE),
      DESCRIPTION             VARCHAR2(254 BYTE),
      VEG_GRP                 VARCHAR2(2 BYTE),
      A_OVRSTRY_AVG_DBH_LOW   NUMBER(10),
      A_OVRSTRY_AVG_DBH_HIGH  NUMBER(10),
      A_CC_LOW                NUMBER(10),
      A_CC_HIGH               NUMBER(10),
      B_OVRSTRY_AVG_DBH_LOW   NUMBER(10),
      B_OVRSTRY_AVG_DBH_HIGH  NUMBER(10),
      B_CC_LOW                NUMBER(10),
      B_CC_HIGH               NUMBER(10),
      C_OVRSTRY_AVG_DBH_LOW   NUMBER(10),
      C_OVRSTRY_AVG_DBH_HIGH  NUMBER(10),
      C_CC_LOW                NUMBER(10),
      C_CC_HIGH               NUMBER(10),
      D_OVRSTRY_AVG_DBH_LOW   NUMBER(10),
      D_OVRSTRY_AVG_DBH_HIGH  NUMBER(10),
      D_CC_LOW                NUMBER(10),
      D_CC_HIGH               NUMBER(10),
      E_OVRSTRY_AVG_DBH_LOW   NUMBER(10),
      E_OVRSTRY_AVG_DBH_HIGH  NUMBER(10),
      E_CC_LOW                NUMBER(10),
      E_CC_HIGH               NUMBER(10),
      A_PCT                   NUMBER(10),
      B_PCT                   NUMBER(10),
      C_PCT                   NUMBER(10),
      D_PCT                   NUMBER(10),
      E_PCT                   NUMBER(10),
      CREATED_BY              VARCHAR2(30 BYTE)     DEFAULT USER,
      CREATED_DATE            DATE                  DEFAULT SYSDATE,
      CREATED_IN_INSTANCE     NUMBER(6)             NOT NULL,
      MODIFIED_BY             VARCHAR2(30 BYTE),
      MODIFIED_DATE           DATE,
      MODIFIED_IN_INSTANCE    NUMBER(6)
    )
    TABLESPACE USERS_NRIS_INFORMS
    RESULT_CACHE (MODE DEFAULT)
    PCTUSED    0
    PCTFREE    10
    INITRANS   1
    MAXTRANS   255
    STORAGE    (
                INITIAL          80K
                NEXT             1M
                MINEXTENTS       1
                MAXEXTENTS       UNLIMITED
                PCTINCREASE      0
                BUFFER_POOL      DEFAULT
                FLASH_CACHE      DEFAULT
                CELL_FLASH_CACHE DEFAULT
               )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;
    COMMENT ON TABLE FS_NRIS_INFORMS.NRI_FRCC_REF IS 'Stores reference information for the FRCC tool.';
    CREATE UNIQUE INDEX FS_NRIS_INFORMS.NRI_FRCC_REF_PK ON FS_NRIS_INFORMS.NRI_FRCC_REF
    (PNVG_ID)
    LOGGING
    TABLESPACE INDEXES_NRIS_INFORMS
    PCTFREE    10
    INITRANS   2
    MAXTRANS   255
    STORAGE    (
                INITIAL          80K
                NEXT             1M
                MINEXTENTS       1
                MAXEXTENTS       UNLIMITED
                PCTINCREASE      0
                BUFFER_POOL      DEFAULT
                FLASH_CACHE      DEFAULT
                CELL_FLASH_CACHE DEFAULT
               )
    NOPARALLEL;
    CREATE OR REPLACE TRIGGER FS_NRIS_INFORMS.nri_frcc_ref_t
    BEFORE INSERT OR UPDATE
    ON fs_nris_informs.nri_frcc_ref
    FOR EACH ROW
    DECLARE
    BEGIN
       IF INSERTING
       THEN
          db_instance.insert_audit_columns(:new.created_by,
                                           :new.created_in_instance,
                                           :new.created_date);
       ELSE
          db_instance.update_audit_columns(:new.modified_by,
                                           :new.modified_in_instance,
                                           :new.modified_date);
       END IF;
    END;
    ALTER TABLE FS_NRIS_INFORMS.NRI_FRCC_REF ADD (
      CONSTRAINT NRI_FRCC_REF_PK
      PRIMARY KEY
      (PNVG_ID)
      USING INDEX FS_NRIS_INFORMS.NRI_FRCC_REF_PK
      ENABLE VALIDATE);

    INSERT SAMPLE:
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (37, 'LLSH', 'Eastern', 'Longleaf Pine - Sandhills', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:15', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (24, 'MABA', 'Eastern', 'Maple-Basswood', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:15', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (23, 'MBOA', 'Eastern', 'Maple-Basswood-Oak-Aspen Mosaic', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:15', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (26, 'MMPH', 'Eastern', 'Mixed Mesophytic Northeast', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:15', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (25, 'NESF', 'Eastern', 'Northeastern Spruce-Fir', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:15', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (22, 'NHDW1', 'Eastern', 'Northern Hardwoods #1', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:16', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (19, 'NHDW2', 'Eastern', 'Conifer Northern Hardwoods', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:16', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (18, 'NHDW3', 'Eastern', 'Northern Hardwoods #2', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:16', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (21, 'NHFI', 'Eastern', 'Northern Hardwoods-Fir', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:16', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);
    Insert into FS_NRIS_INFORMS.NRI_FRCC_REF
       (PNVG_ID, PNVG, LOCATION, DESCRIPTION, VEG_GRP,
        A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, B_OVRSTRY_AVG_DBH_LOW,
        B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH,
        C_CC_LOW, C_CC_HIGH, D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW,
        D_CC_HIGH, E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH,
        A_PCT, B_PCT, C_PCT, D_PCT, E_PCT,
        CREATED_BY, CREATED_DATE, CREATED_IN_INSTANCE, MODIFIED_BY, MODIFIED_DATE,
        MODIFIED_IN_INSTANCE)
    Values
       (20, 'NHSP', 'Eastern', 'Northern Hardwoods-Spruce', 'xx',
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        0, 0, 0, 0, 0,
        'INFORMS', TO_DATE('04/05/2005 14:10:16', 'MM/DD/YYYY HH24:MI:SS'), 10642, 'INFORMS', TO_DATE('07/14/2005 14:37:25', 'MM/DD/YYYY HH24:MI:SS'),
        10642);

    The new table build is this:
    CREATE TABLE FS_NRIS_ANALYZER.NRSA_FRCC_SERAL_REF
    (
      BPS_ID_FK  NUMBER(10)                         NOT NULL,
      SERAL      VARCHAR2(20 BYTE)                  NOT NULL,
      KEY_NAME   VARCHAR2(50 BYTE)                  NOT NULL,
      KEY_VALUE  NUMBER(10)                         NOT NULL
    )
    TABLESPACE USERS_NRIS_FSVEG
    RESULT_CACHE (MODE DEFAULT)
    PCTUSED    0
    PCTFREE    10
    INITRANS   1
    MAXTRANS   255
    STORAGE    (
                INITIAL          80K
                NEXT             1M
                MINEXTENTS       1
                MAXEXTENTS       UNLIMITED
                PCTINCREASE      0
                BUFFER_POOL      DEFAULT
                FLASH_CACHE      DEFAULT
                CELL_FLASH_CACHE DEFAULT
               )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;
    ALTER TABLE FS_NRIS_ANALYZER.NRSA_FRCC_SERAL_REF ADD (
      CONSTRAINT NRSA_FRCC_SERAL_REF_FK
      FOREIGN KEY (BPS_ID_FK)
      REFERENCES FS_NRIS_ANALYZER.NRSA_FRCC_REF (BPS_ID)
      ENABLE VALIDATE);

    Here is where the fun begins...
    This new table contains the data from the columns removed from NRI_FRCC_REF.
    SERAL is the prefix of the column. So for the existing data it will be A, B, C, D, or E.
    KEY_NAME is the rest of the column name. So for the existing data it will be OVRSTRY_AVG_DBH_LOW, OVRSTRY_AVG_DBH_HIGH,
    CC_LOW, CC_HIGH, or PCT
    KEY_VALUE is the value that was assigned to that record. So, if PNVG_ID = 1 and A_CC_HIGH = 2, then the new record would
    look like: BPS_ID_FK = 1, SERAL = 'A', KEY_NAME = 'CC_HIGH', KEY_VALUE = 2.
    Thinking I had this licked, here is what I constructed...
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all
    select pnvg_id as BPS_ID_FK,
            substr('B_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('B_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF
    union all;

    And it's far from correct...nor does it function. LOL. I had it working, but I changed something and now it doesn't.
    What is the best way to accomplish pulling the data out of the original table for an insert? I was trying to just get the data pulled correctly, then I was going to export it as insert statements via TOAD, and change the schema/table name.
    Thanks.....again. ;)
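
    On Oracle 11g and later, the column-to-row migration described above can be expressed far more compactly with the UNPIVOT clause. The following is an untested sketch against the table and column names from this thread, so verify it in your own schema first:

```sql
-- Sketch: unpivot the 25 seral columns into (BPS_ID_FK, SERAL, KEY_NAME, KEY_VALUE).
-- UNPIVOT places the source column name into col_name as an uppercase string,
-- so the seral letter and key name can be split off with SUBSTR.
-- Note: UNPIVOT skips NULL values by default (add INCLUDE NULLS to keep them).
INSERT INTO FS_NRIS_ANALYZER.NRSA_FRCC_SERAL_REF
       (BPS_ID_FK, SERAL, KEY_NAME, KEY_VALUE)
SELECT pnvg_id,
       SUBSTR(col_name, 1, 1),   -- 'A' .. 'E'
       SUBSTR(col_name, 3),      -- e.g. 'CC_HIGH'
       key_value
FROM   FS_NRIS_INFORMS.NRI_FRCC_REF
UNPIVOT (key_value FOR col_name IN
          (A_OVRSTRY_AVG_DBH_LOW, A_OVRSTRY_AVG_DBH_HIGH, A_CC_LOW, A_CC_HIGH, A_PCT,
           B_OVRSTRY_AVG_DBH_LOW, B_OVRSTRY_AVG_DBH_HIGH, B_CC_LOW, B_CC_HIGH, B_PCT,
           C_OVRSTRY_AVG_DBH_LOW, C_OVRSTRY_AVG_DBH_HIGH, C_CC_LOW, C_CC_HIGH, C_PCT,
           D_OVRSTRY_AVG_DBH_LOW, D_OVRSTRY_AVG_DBH_HIGH, D_CC_LOW, D_CC_HIGH, D_PCT,
           E_OVRSTRY_AVG_DBH_LOW, E_OVRSTRY_AVG_DBH_HIGH, E_CC_LOW, E_CC_HIGH, E_PCT));
```

    On older versions the 25-branch UNION ALL works too, as long as each branch names its own column in all three places (both SUBSTR literals and the value column).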
    Edited by: Willy_B on Jan 3, 2013 11:23 AM

    My apologies....I've edited the original posting to include the insert on the original table.
    Actually, there is no way to simplify it. That's my task.
    I gave the original table....can't change that.
    I gave the new table...can't change that.
    I now have included the data that has to be migrated....can't change that.
    I gave my script that I've been working on. That can be changed or blown away. But it actually does exactly what you say. I took the first column, and then literally copied it for all the rest of the columns.
    select pnvg_id as BPS_ID_FK,
            substr('A_OVRSTRY_AVG_DBH_LOW', 0,1) as SERAL,
            substr('A_OVRSTRY_AVG_DBH_LOW', 3) as KEY_NAME,
            A_OVRSTRY_AVG_DBH_LOW as KEY_VALUE       
    from FS_NRIS_INFORMS.NRI_FRCC_REF

    Next I went to B_....and so forth, utilizing union all.
    But it's not right.
    I'm reading through this static pivoting and cursor projection material and going cross-eyed. It's hard to make sense of it the way it's laid out.
    Edited by: Willy_B on Jan 3, 2013 11:35 AM

  • Advice on how to tackle this issue

    Hi All
    More than SQL code that will help me with my project, I need advice on what approach to take to building a customer's custom reporting.
    They need a daily file with various fields etc. (easy), but there's a twist.
    Within our system, transactions can have 5 different statuses (new, corrected, locked, extended, closed).
    Every time a transaction changes status, our client wants a new record in their report.
    The issue is that our system only tracks, or has indicators for, when transactions change certain statuses (but not all).
    So I tried building a log table where I check (for the transactions received) which ones had changed state.
    The problem is that I was having trouble getting the queries to run quickly, as I used a combination of remote and local tables.
    Now a colleague of mine suggested that I try a dbms_crypto.hash approach to build a hash key (using certain fields) to check for new records or changes.
    The problem is that he is on holiday for the next 2 weeks, so I can't ask him anything else.
    Has anyone used such an approach, and will it serve the purpose of identifying new records and changes?
    Thanks
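
    For reference, the hash-key approach usually means reducing each row's tracked fields to a single fingerprint and comparing it with the last fingerprint logged for that transaction. A rough, untested sketch, with hypothetical table and column names:

```sql
-- Hypothetical sketch: fingerprint each transaction's tracked fields.
-- DBMS_CRYPTO needs an EXECUTE grant; the '|' separator avoids ambiguity
-- between adjacent fields ('ab'||'c' vs 'a'||'bc').
SELECT txn_id,
       DBMS_CRYPTO.HASH(
         UTL_RAW.CAST_TO_RAW(status || '|' || amount || '|' || last_user),
         DBMS_CRYPTO.HASH_MD5) AS row_hash
FROM   txn_source;   -- hypothetical remote table
```

    A row whose row_hash differs from the stored one has changed; a txn_id with no stored hash is new. Hash collisions are theoretically possible, so this identifies changes probabilistically rather than with absolute certainty.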

    Hi,
    maybe you can use CDC (Change Data Capture) or maybe GoldenGate. These record all the changes on a table; you can read those changes and put them in your own table (maybe aggregated). CDC is described in the Data Warehousing Guide, see http://download.oracle.com/docs/cd/E11882_01/server.112/e16579/cdc.htm#i1028295.
    Another option is to use LogMiner, in which you query the redo logs or archived logs, see http://download.oracle.com/docs/cd/E11882_01/server.112/e16536/logminer.htm
    Herald ten Dam
    http://htendam.wordpress.com
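
    To add to the LogMiner pointer above: a minimal session, sketched loosely from the 11.2 documentation and untested here, looks roughly like this (the table name is hypothetical, and CONTINUOUS_MINE was desupported in later releases, so check your version):

```sql
-- Mine the last day of redo using the online catalog as the dictionary.
BEGIN
  DBMS_LOGMNR.START_LOGMNR(
    STARTTIME => SYSDATE - 1,
    ENDTIME   => SYSDATE,
    OPTIONS   => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG +
                 DBMS_LOGMNR.CONTINUOUS_MINE);
END;
/

SELECT scn, timestamp, operation, sql_redo
FROM   v$logmnr_contents
WHERE  seg_name = 'TXN_SOURCE';   -- hypothetical table name

EXEC DBMS_LOGMNR.END_LOGMNR;
```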

  • How to tackle the error - " The Lead Selection has not been set in view "?

    Hi Guys,
    I'm getting the error "The Lead Selection has not been set in view". If anyone has faced the same problem, please guide me. I am new to WD ABAP, so I am finding it difficult to track down the cause.
    TIA,
    Vishesh

    Hi Pradeep,
    I have already checked the "Initialisation Lead Selection" property. I faced the same problem in another view, where it was solved by changing the cardinality. In this case I have tried both the cardinality and the lead selection property, but nothing works.
    Thanx.
