Table in memory using PL/SQL

I have to convert a table, and the PL/SQL will check a small lookup table many times. I'd like to hold this small table in memory and do the check against variables.
I tried using a record type, but I need to look up a record by two columns and I was not able to use EXISTS.
In short, I'd like a table cache and a way to find a row in that cache using two columns as the key.
Any help is appreciated.

A multilevel associative array may help you. Here is an example.
set serveroutput on
declare
  -- record holding the cached row's payload
  type row_type is record ( col_1 number, col_2 number );
  -- inner array indexed by the second key column, outer array indexed by the first
  type inner_aa_type is table of row_type index by varchar2(10);
  type outer_aa_type is table of inner_aa_type index by varchar2(10);
  v_row row_type;
  v_aa  outer_aa_type;
begin
  v_row.col_1 := 10;
  v_row.col_2 := 100;
  v_aa('A')('1') := v_row;
  v_row.col_1 := 20;
  v_row.col_2 := 200;
  v_aa('A')('2') := v_row;
  dbms_output.put_line ( v_aa('A')('1').col_1 );
  dbms_output.put_line ( v_aa('A')('1').col_2 );
  dbms_output.put_line ( v_aa('A')('2').col_1 );
  dbms_output.put_line ( v_aa('A')('2').col_2 );
  -- membership tests on the two-part key
  if ( v_aa.exists('A') and v_aa('A').exists('1') )
  then dbms_output.put_line( 'First test is true' );
  end if;
  if not ( v_aa.exists('A') and v_aa('A').exists('5') )
  then dbms_output.put_line( 'Second test is false' );
  end if;
  if not ( v_aa.exists('X') and v_aa('X').exists('5') )
  then dbms_output.put_line( 'Third test is false' );
  end if;
end;
/
10
100
20
200
First test is true
Second test is false
Third test is false
PL/SQL procedure successfully completed.

Joe Fuda
SQL Snippets: http://www.sqlsnippets.com/
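If the two key columns can be combined safely, a single associative array indexed by a concatenated key is another option. This is only a minimal sketch (not from the original answer), assuming both key parts fit into the VARCHAR2 index and that the separator character '|' never appears in the key data:

set serveroutput on
declare
  type row_type is record ( col_1 number, col_2 number );
  -- one associative array, indexed by a composite "key1|key2" string
  type aa_type is table of row_type index by varchar2(30);
  v_row row_type;
  v_aa  aa_type;
begin
  v_row.col_1 := 10;
  v_row.col_2 := 100;
  v_aa('A' || '|' || '1') := v_row;
  -- membership test on the composite key
  if v_aa.exists('A' || '|' || '1') then
    dbms_output.put_line( 'Found, col_2 = ' || v_aa('A' || '|' || '1').col_2 );
  end if;
  if not v_aa.exists('A' || '|' || '5') then
    dbms_output.put_line( 'Not found' );
  end if;
end;
/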

Similar Messages

  • Help Required: Excel Upload Into Oracle Table Using PL/SQL Procedure/Package

    Please help, urgent help needed.
    The requirement is to upload an Excel file into an Oracle table using a PL/SQL procedure/package.
    The constraints are:
    1. The Excel file is on the user's/client PC.
    2. The application is on a remote server (Oracle Forms D2K).
    3. The user accesses the application through a Terminal Server login.
    4. So if the user calls the GET_FILE_NAME() function of D2K to pick the Excel file, D2K will try to pick the file from the remote server (because the user logged in through Terminal Server).
    5. We cannot use the UTL_FILE package or an Oracle directory to place the file on the server.
    6. We are using Oracle 8.7.
    So we need a PL/SQL package or function/procedure to upload an Excel file from the user's PC into an Oracle table.
    Please guide me with some code, a PL/SQL package, a hint, or a link.

    I also tried to use SQL*Loader for this.
    But how can I use the SQLLDR command in a stored procedure?
    It works when run from SQL*Plus, but in a stored procedure/package PL/SQL does not recognise OS commands.
    So my question is: how can I invoke the SQLLDR command from a stored procedure?

  • Help Required: How to Upload an Excel File Into an Oracle Table Using a PL/SQL Procedure

    (The question in this thread repeats the one above verbatim.)

    TEXT_IO is a PL/SQL package available only in Forms (you'll want to post in the Forms forum for more information). It is not available in a stored procedure in the database (where the equivalent package is UTL_FILE).
    If the Terminal Server machine and the database machine do not have access to the file system on the client machine, no application running on either machine will have access to the file. Barring exceptional setups (like an FTP server on the client machine), your applications are not going to have more access to the client machine than the operating system does.
    If you map the client drives from the Terminal Server box, there is the potential for your Forms application to access those files. If you want the files to be accessible to a stored procedure in the database, you'll need to move the files somewhere the database can access them.
    Justin

  • Using PL/SQL tables in the SELECT statement of a report query

    Hi
    Does anyone have experience using a PL/SQL table in the SELECT statement of a report query? In other words, how can I run a report using flat-file (xx.txt) data, e.g. 10 records in the flat file, pull those 10 records into the report, and produce the PDF output?
    Thanks in advance
    suresh

    Hi,
    You can use the UTL_FILE package for that, together with a ref cursor query in the data model. The following code reads a line of data from a flat file:
    declare
      ur_file   utl_file.file_type;
      my_result varchar2(250);
    begin
      ur_file := utl_file.fopen('&directory', '&filename', 'r');
      utl_file.get_line(ur_file, my_result);
      dbms_output.put_line(my_result);
      utl_file.fclose(ur_file);
    end;
    /
    Make sure you have an entry in your init.ora saying:
    utl_file_dir = '\your directory where your files reside'
    cheers!
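    To read the whole file rather than a single line, the usual pattern is a loop with a NO_DATA_FOUND handler. This is just a minimal sketch, assuming the same &directory and &filename substitution variables and the same utl_file_dir setting as above:
    declare
      ur_file   utl_file.file_type;
      my_result varchar2(250);
    begin
      ur_file := utl_file.fopen('&directory', '&filename', 'r');
      loop
        begin
          utl_file.get_line(ur_file, my_result);  -- raises NO_DATA_FOUND at end of file
        exception
          when no_data_found then exit;
        end;
        dbms_output.put_line(my_result);
      end loop;
      utl_file.fclose(ur_file);
    end;
    /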

  • Count files in a directory using PL/SQL

    Hi,
    Currently I am using a shell script that counts the number of files in a Unix directory containing XML files and loads them into a table as CLOBs using SQL*Loader.
    Now I want to use PL/SQL to count the files and then load them with an INSERT script. I am able to use the INSERT statement, but I have no idea how to count files at the Unix level using PL/SQL.
    Please give me some ideas.
    Thanks in advance.
    Rede

    LOL... neither does anyone else.
    I found this one quite some time ago...
    -- How to get a listing of files in a directory using PL/SQL has got to be one of the most
    -- popular questions asked by Oracle developers. Prior to 10g, DIY solutions included the
    -- sophisticated, using either Java (8i onwards) or External Procedures (8 onwards),
    -- or the clunky (pipe servers and shell scripts from 7 onwards)
    -- However in 10g there is a new procedure hidden away in the undocumented
    -- DBMS_BACKUP_RESTORE package. It's called searchfiles, which is a bit of a giveaway
    -- and appears to have been introduced for the new backup features in 10g, as RMAN
    -- now needs to know about files in the recovery destination.
    -- Calling this procedure populates an in memory table called X$KRBMSFT, which
    -- is one of those magic X$ tables, the only column which is of relevance to us
    -- is FNAME_KRBMSFT which is the fully qualified path and file name.
    -- This X$ table acts in a similar fashion to a global temporary table in that its contents
    -- can only be seen from the calling session. So two sessions can call searchfiles and each
    -- can see only the results of its own call (which is extremely useful).
    -- The code sample below will only really run as SYS, due to the select from X$KRBMSFT, it's
    -- just intended as a demo. The first two parameters in the call to searchfiles are
    -- IN OUT so must be defined as variables, even though the second parameter is of no consequence
    -- to us and should be left as NULL. Even though they are IN OUT, testing shows they don't
    -- appear to change.
    -- Updated 29/05/2007
    -- The first parameter is the string to search for, in much the same format as you would pass
    -- in a call to dir (Windows) or ls (Unix). However, on some platforms and versions
    -- passing in a wildcard or anything other than a directory name will result in an empty X$KRBMSFT.
    -- The trick here is to simply pass a valid OS directory name - the entire listing of that directory's
    -- contents will now appear in X$KRBMSFT, which can then be filtered using LIKE.
    -- This procedure appears to raise no exceptions; passing an invalid search string, such
    -- as a non-existent path or one with no permissions, simply results in an empty X$KRBMSFT.
    -- However, if the database parameter db_recovery_file_dest is not set, you will get ORA-19801.
    -- Interestingly, this procedure recursively searches sub directories found in the search string.
    -- So passing a string of 'C:\windows' (for example) populates X$KRBMSFT with not only the files found
    -- in that directory but all the files found in all directories beneath, such as C:\windows\system32.
    -- As X$KRBMSFT is an in memory table, you have been warned! Calling this procedure on a directory
    -- with thousands of sub directories and files has the potential to blow out memory
    -- The way forward is to wrap this functionality in a package, perhaps using
    -- directory objects instead and checking access privileges, creating a view based on X$KRBMSFT,
    -- maybe even allowing/disallowing subdirectory traversal and limiting memory usage and search size.
    -- I see another project coming, stay tuned for XUTL_FINDFILES!
    -- This code has been tested and works as intended on Oracle 10104 EE.
    -- Get this tip and others, along with useful PL/SQL utilities at
    -- www.chrispoole.co.uk
    DECLARE
      pattern VARCHAR2(1024) := '/u01/oracle/admin/SID/udump';
      ns      VARCHAR2(1024);
    BEGIN
      SYS.DBMS_BACKUP_RESTORE.searchFiles(pattern, ns);
      FOR each_file IN (SELECT FNAME_KRBMSFT AS name FROM X$KRBMSFT WHERE FNAME_KRBMSFT LIKE '%.trc') LOOP
        DBMS_OUTPUT.PUT_LINE(each_file.name);
      END LOOP;
    END;
    /
    DECLARE
      pattern VARCHAR2(1024) := 'C:\temp\*';
      ns      VARCHAR2(1024);
    BEGIN
      SYS.DBMS_BACKUP_RESTORE.searchFiles(pattern, ns);
      FOR each_file IN (SELECT FNAME_KRBMSFT AS name FROM X$KRBMSFT) LOOP
        DBMS_OUTPUT.PUT_LINE(each_file.name);
      END LOOP;
    END;
    /
    Thomas
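    Since the original question was about counting files, a minimal variation of the demo above gives the count directly. The same caveats apply (10g, run as SYS, db_recovery_file_dest set); the directory path and the '%.xml' filter are only placeholders:
    DECLARE
      pattern VARCHAR2(1024) := '/u01/xmlfiles';  -- placeholder directory
      ns      VARCHAR2(1024);
      v_count PLS_INTEGER;
    BEGIN
      SYS.DBMS_BACKUP_RESTORE.searchFiles(pattern, ns);
      SELECT COUNT(*) INTO v_count
        FROM X$KRBMSFT
       WHERE FNAME_KRBMSFT LIKE '%.xml';
      DBMS_OUTPUT.PUT_LINE('XML files found: ' || v_count);
    END;
    /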

  • CFMX 6.1's Virtual Memory Use problem!!

    I apologise for the long post in advance...
    Ok... so I have this script that, using cfdirectory, will
    check a directory for any files that may have been uploaded, if
    there are files, it loops through the results and reads the files
    one at a time, line by line, using the FileReader.cfc (Uses the
    Java FileInputStream, InputStreamReader, and BufferedReader to
    provide a way to incrementally read large files). The files are
    just pipe "|" delimited data, each line represents a record for a
    db table.
    Now as it's reading each line, it will perform some basic
    string parsing to clean up the file line to make sure the data is
    valid, blah blah blah and then it will write that "cleaned" line to
    another file using FileWriter.cfc (Java component once again). Once
    it's completely done reading the original file, it will close it
    and it will open the new "cleaned" version of the file, read it
    (FileReader.cfc), create an INSERT statement and then update the
    database table.
    Now... this all works GREAT... until it has to loop through
    more than a few files... 3 - 4 files are NO problem! works like a
    charm, but throw 6 - 8 files at it and it dies, not a timeout mind
    you but an actual "java.lang.OutOfMemoryError" (now, I've tried
    making all the files exactly the same (just changed the name) and
    the weird thing is, it takes longer and longer to process each as
    it goes through the loop... I have the script write some stats as
    it's looping:
    FILE 1 STATS
    Name: COA0607_Intranet1.DAT
    Status: Import Successfull
    Line Count: 32,971
    Processing Time: 74,237ms
    FILE 2 STATS
    Name: COA0607_Intranet2.DAT
    Status: Import Successfull
    Line Count: 32,971
    Processing Time: 82,018ms
    FILE 3 STATS
    Name: COA0607_Intranet3.DAT
    Status: Import Successfull
    Line Count: 32,971
    Processing Time: 94,476ms
    FILE 4 STATS
    Name: COA0607_Intranet4.DAT
    Status: Import Successfull
    Line Count: 32,971
    Processing Time: 145,108ms
    I know what you guys are probably thinking: "Whoa man... CF
    isn't really meant to do that kind of processing...", I know, trust
    me, I know... however, I really need it to, lol.
    Ok, so as the script is running, I watch the Virtual Memory
    use of jrun.exe, processing say 3 - 4 of these files brings up the
    usage to approx 300,000k which yes, is a LOT but that's fine...
    this process is meant to run at night via a Scheduled Task...
    When I run more than 4 files, things start to get ugly, keep
    in mind that these are EXACTLY the same files just re-named
    differently. The script will start lagging BIG time and on the last
    file (usually the last file) I'll see the memory usage spike from
    350,000K all the way up to 600,000K and that's when it throws the
    "java.lang.OutOfMemoryError" and dies... I've tried commenting out
    the part of the script that updates the db, but still get the same
    problem...
    So... what gives? How come CF Server does this??? I mean, it
    runs fine for the first few files... and then WAM, it dies... sorry
    for the long post... any insight here is VERY much appreciated...
    it would be AWESOME if the wonderful folks at Adobe could shed some
    light on this for me : )
    CFMX 6.1 version: 6,1,0,83762
    Windows XP Pro SP2
    Intel P4 2.8Ghz
    1Gb of Ram

    quote:
    Originally posted by:
    Mr Black
    300M memory usage while using "incremental" file reader??
    Looks like it is "incremental" only in the sense that it increments
    memory usage. Did you try non-Java C/C++ file reader tags?
    Well I did try cffile originally... and it didn't even run...
    lol

  • Export internal table to memory in User Exit FM

    Hi all,
    My scenario here is to export an internal table in one user exit FM and import it back in another user exit FM.
    I was trying to use
    Export lt_table to memory id 'LABEL'.
    then
    Import lt_table from memory id 'LABEL'.
    But then I hit an error on the IMPORT statement. How can I rectify this?
    Thanks. Answer will be rewarded.

    Refer to the below related threads
    Export an internal table to memory and import from memory into an internal
    http://help.sap.com/saphelp_erp2005/helpdata/en/fc/eb3bf8358411d1829f0000e829fbfe/frameset.htm
    Regards,
    Santosh

  • Read a text file from DB directly (don't use PLSQL).

    How can I read a text file from the database directly (without using PL/SQL)?

    If there is a known structure, you could use external table access and query that "file" as you would any table.
    Nicolas.
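    For example, a minimal external table sketch, assuming a directory object DATA_DIR pointing at the file's location and a comma-delimited file called emp.txt (all names are placeholders):
    CREATE TABLE emp_ext (
      empno NUMBER,
      ename VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('emp.txt')
    );
    -- the file can then be queried like any other table
    SELECT * FROM emp_ext;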

  • Export a table to memory ID and import it in the Workflow.

    Hi,
    For a particular requirement,
    We are triggering Workflow from the User Exit MV45AFZZ.
    Before triggering this WF, I am trying to export an internal table to a memory ID and then import it back from the same memory ID into an internal table in the workflow task. This is failing,
    even though the names of the internal tables and the memory IDs are the same.
    Can we do this like this? If yes please let me know how to do this.
    Thanks & Regards,
    Mallika Maktala

    Initially,
    I populated the values of the internal table into a database table, immediately called the workflow, and in a task tried to retrieve the data from the database table.
    But although there is data in the table, the task didn't retrieve any, as at that point the table may not yet have been updated.
    So now I am using this import/export concept.
    I think it is in the same session, but it is still not working.
    Any solution?
    Mallika Maktala

  • Exporting an internal table to memory

    Hi
      I want to call a program from another program and use the values stored in an internal table of the called program.
    How can I export an internal table to a memory ID and then import it?
    Regards
    Arun

    So to be clear, for your requirement it would be:
    * Towards the end of the called program
      EXPORT it_itab TO MEMORY ID 'ZZ_MEM_ID'.
    * And then from calling program after the submit zzzz and return statement
      IMPORT it_itab FROM MEMORY ID 'ZZ_MEM_ID'.
    Hope that helps.
    Brad

  • Save Internal table to memory - problem

    Hi everybody,
    I need to extract one internal table to memory every 1st of the month.
    Then every day I will read this table for my report (in order to avoid running the same database selections every day).
    My original idea was to use EXPORT TO MEMORY ID and IMPORT,
    but it doesn't work the next day... With EXPORT, the data is kept only until the end of the transaction...
    Is there any way other than storing the data in a Z database table?
    Thanks in advance

    Hi,
    You can use the following code to export the data. Note that it uses EXPORT ... TO DATABASE (the standard INDX table), so the data survives beyond the current session:
    * data variable required for background processing
    data: wa_indx type indx.
    * EXPORT internal table TO DATABASE indx, area xy, under ID 'XYZ'
    export tab = <your table> to database indx(xy) from wa_indx client sy-mandt id 'XYZ'.
    The following code will import the data back and then clear it:
    * data variable required for background processing
    data: wa_indx type indx.
    * imports from the database the table sent by the calling program
    import tab = <your table> from database indx(xy) to wa_indx client sy-mandt id 'XYZ'.
    * deletes the stored data to avoid wasting space
    delete from database indx(xy)
      client sy-mandt
      id 'XYZ'.
    Regards,
    Samson Rodrigues.

  • Pass Internal Table to Memory

    Dear Experts,
    I have the following problem.
    I am using a user exit for Travel that is called from a standard program via transaction TRIP. The standard program uses a structure (p_t_req_head) that is not available when I enter the user exit, and I need to use this structure in the user exit.
    I have read about SAP and ABAP memory, but how can I pass an internal table via SAP or ABAP memory from the standard program to the user exit?
    When I am debugging the standard program and it calls the user exit, I can use ()namestructure, for example ()p_t_req_head, and I see the data of the structure. But how can I do this in the ABAP source code, with statements?
    Regards

    Hi,
    You can try the EXPORT and IMPORT statements, as follows:
    DATA : t_itab LIKE TABLE OF spfli.
    EXPORT t_itab TO MEMORY ID 'ABCD'.
    After sending the internal table to ABAP memory, you retrieve it with IMPORT in the called session:
    IMPORT t_itab FROM MEMORY ID 'ABCD'.

  • Passing Dynamic Internal Tables to Memory

    I have a bit of a conundrum right now that I can't seem to correct. I am working on adding an ALV report to an existing report program. I was able to write a simple helper program that builds a custom object that I defined that translates my raw data into two separate dynamic tables, and then builds an ALV grid and outputs it. The reason I wrote this in a simple helper program was so that I could use SUBMIT ... EXPORTING LIST TO MEMORY from my primary report program and capture the output so I can later write it out under our company's standard ABAP list format as if I were using WRITE statements.
    The output of the report itself is working beautifully. We have included functionality to automatically take the output, produce an HTML file from it, and then FTP it directly to a webserver so our clients can get easy access to it. What I want to be able to do though is give the clients two tab-delimited files that contain the raw data that was used to build the report. We have an interface in place to do that, but I somehow need to be able to pass these two dynamic internal tables which I have field-symbols to reference back to my calling program.
    Here is what I am trying to do:
    CALL METHOD OBJ_ALV_MR_EST_REASONS->PRODUCE_ESTIMATION_REPORT
        IMPORTING
          RPT_DATA_BY_REASON   = ref_it_estreason
          RPT_DATA_BY_DISTRICT = ref_it_district.
    *   Assign the field symbols
      ASSIGN ref_it_estreason->* TO <it_estreason>.
      ASSIGN ref_it_district->* TO <it_district>.
    * Export the two internal tables to memory so they can be
    * retrieved by the calling program
      EXPORT reason = <it_estreason>[]
             district = <it_district>[]
      TO MEMORY ID 'ZCR_ESTIMATION_REASON_RPT'.
    As you can see, my method returns two references to dynamic internal tables. I have used the memory debugger to see that these tables are being correctly written to the ABAP memory.
    However, back in my parent program when I try to do the following,
    CREATE DATA ref_it_estreason TYPE REF TO DATA.
          CREATE DATA ref_it_district TYPE REF TO DATA.
          ASSIGN ref_it_estreason->* TO <it_estreason>.
          ASSIGN ref_it_district->* TO <it_district>.
          IMPORT reason = <it_estreason>
                 district = <it_district>
          FROM MEMORY ID 'ZCR_ESTIMATION_REASON_RPT'.
    I get the REFS_NOT_SUPPORTED_YET exception, which says that for the EXPORT/IMPORT statement, "object references, interface references, and data references are currently not supported".
    I have tried multiple other ways of defining my field-symbols or my reference pointers but they all result in exceptions of some sort. Is there any way for me to get this data passed back? It seems like there must be a way to get the data from memory since I know it's being correctly stored there.
    Thanks in advance.

    Shortly after posting this, I had an idea which I was able to implement to actually get this to work.
    I decided that I would simply pass the FIELDCAT tables for each of my dynamic tables into the same memory ID as the tables themselves.
      EXPORT reason_fcat = it_estreason_fcat
             district_fcat = it_district_fcat
             reason = <it_estreason>[]
             district = <it_district>[]
      TO MEMORY ID 'ZCR_ESTIMATION_REASON_RPT'.
    Then, back in my calling program I execute the following code. This retrieves the FIELDCAT tables, builds two empty dynamic table type reference variables and then lets me create field-symbols to reference those components.
    *     Retrieve the fieldcat internal tables first
          IMPORT reason_fcat = it_estreason_fcat
                 district_fcat = it_district_fcat
          FROM MEMORY ID 'ZCR_ESTIMATION_REASON_RPT'.
    *     Generate an internal table type assigned to each
    *     reference variable based on the fieldcat listings we
    *     retrieve
          CALL METHOD cl_alv_table_create=>create_dynamic_table
            EXPORTING
              it_fieldcatalog = it_estreason_fcat
            IMPORTING
              ep_table        = ref_it_estreason.
          CALL METHOD cl_alv_table_create=>create_dynamic_table
            EXPORTING
              it_fieldcatalog = it_district_fcat
            IMPORTING
              ep_table        = ref_it_district.
    *     Assign the field symbols
          ASSIGN ref_it_estreason->* TO <it_estreason>.
          ASSIGN ref_it_district->* TO <it_district>.
          CREATE DATA ref_wa_estreason LIKE LINE OF <it_estreason>.
          CREATE DATA ref_wa_district LIKE LINE OF <it_district>.
          ASSIGN ref_wa_estreason->* TO <wa_estreason>.
          ASSIGN ref_wa_district->* TO <wa_district>.
    *     Finally, we can retrieve the data from memory and assign
    *     to the internal tables referenced by our field-symbols
          IMPORT reason = <it_estreason>[]
                 district = <it_district>[]
          FROM MEMORY ID 'ZCR_ESTIMATION_REASON_RPT'.
    This worked beautifully and saved me from having to do a major redesign. I don't know how helpful it would be for ABAP Objects to be passed to memory (I believe some type of serialization would need to be in order there), but for dynamically typed internal tables it worked like a dream with little overhead.

  • How to find how much memory is used by a particular procedure or function

    Hi,
    How can we find out the memory used by a particular procedure or function?
    If a procedure or function is called many times within a particular interval, will it be cached in memory,
    and how will that affect performance?
    What type of PL/SQL statement will take more time than a normal SQL statement?

    Hi
    There are several different memory issues to consider:
    - the code itself (stored in the shared pool)
    - simple variables defined in the code
    - complex variables (eg VARRAY, TABLE etc)
    There's a helpful note on PL/SQL profiling here - http://www.oratechinfo.co.uk/tuning.html - which mentions how to measure memory use (session PGA and UGA - that's program and user global areas)
    You can find out more about shared pool memory structures here - http://download-east.oracle.com/oowsf2005/003wp.pdf.
    Calling a function many times - yes, the function code will be cached (if possible). Session state (for a package) will also be retained (ie global package variables).
    If many users call the same function, there will be one copy of the code but many copies of the private state.
    Finally: PL/SQL statements that can take a long time include:
    - anything that does heavy processing inside a tight loop;
    - anything that waits (select for update; read from a pipe or dequeue from AQ etc)
    Probably the most common mistake is to use PL/SQL for relational processing that can be done in SQL itself (eg writing nested PL/SQL loops to join data that could have been queried in a single SQL statement). Try to minimise context switches between PL/SQL and SQL:
    - use bulk collect where possible (see the sketch below)
    - use set operations in SQL
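    For example, a minimal bulk collect sketch (the employees table and the filter are only placeholders):
    DECLARE
      TYPE t_emp_tab IS TABLE OF employees%ROWTYPE;  -- employees is a placeholder table
      l_emps t_emp_tab;
    BEGIN
      -- a single context switch fetches all matching rows, instead of a row-by-row cursor loop
      SELECT * BULK COLLECT INTO l_emps
        FROM employees
       WHERE department_id = 10;
      FOR i IN 1 .. l_emps.COUNT LOOP
        DBMS_OUTPUT.PUT_LINE(l_emps(i).last_name);
      END LOOP;
    END;
    /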
    Good luck, HTH
    Regards Nigel
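    To put a number on the session PGA/UGA usage mentioned above, one simple approach is to snapshot v$mystat around the call. This is only a sketch: my_procedure is a placeholder, and the session needs SELECT access on v$mystat and v$statname:
    DECLARE
      v_before NUMBER;
      v_after  NUMBER;
    BEGIN
      SELECT ms.value INTO v_before
        FROM v$mystat ms JOIN v$statname sn ON sn.statistic# = ms.statistic#
       WHERE sn.name = 'session pga memory';
      my_procedure;  -- placeholder for the procedure being measured
      SELECT ms.value INTO v_after
        FROM v$mystat ms JOIN v$statname sn ON sn.statistic# = ms.statistic#
       WHERE sn.name = 'session pga memory';
      DBMS_OUTPUT.PUT_LINE('PGA delta: ' || (v_after - v_before) || ' bytes');
    END;
    /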

  • Passing pl/sql table to Java using Oracle JDBC and vice - versa

    A small article on the given topic with sample code and comments, to make code crystal:
    http://mukx.blogspot.com/2007/12/passing-plsql-table-to-java-using.html
    --Mukul

    Tapash,
    I have seen people using Rosetta in almost all projects in previous years, frequently in a couple of projects. I was not aware that Rosetta is an internal thing; anyway, if that is the case, why is Oracle shipping the Rosetta jar file to customers?
    --Mukul
