Capability of inserting a specific number of records ...

Hi,
Is there any way to permit the end user to enter only a specific number of records in a multi-record block, according to the number of records fetched in another block?
I assume the WHEN-CREATE-RECORD trigger can do that. Are there any other solutions?
Thanks,
Simon

..Or,
This is written for a single block, but I believe it also works for a multi-record block.
Use a parameter to define the limit on the number of records the user can enter.
1. Define a parameter, :max_record, which is the limit on the number of records
the user can enter. Make sure to define this parameter as numeric
and provide a default value.
2. For a form with a single block, create the following triggers at block level:
a. Attach the following PL/SQL block to a KEY-CREREC trigger to create a
record only when :system.cursor_record is less than :max_record.
DECLARE
  a NUMBER;
  b NUMBER;
BEGIN
  a := :system.cursor_record;
  LAST_RECORD;
  b := :system.cursor_record;
  IF b >= :parameter.max_record THEN
    GO_RECORD(a);
    MESSAGE('max record exceeded - create rec III');
    RAISE FORM_TRIGGER_FAILURE;
  END IF;
  GO_RECORD(a);
  IF :system.cursor_record < :parameter.max_record THEN
    CREATE_RECORD;
  ELSE
    MESSAGE('max record exceeded - create rec ');
    RAISE FORM_TRIGGER_FAILURE;
  END IF;
END;
b. To navigate to the next record only when :system.cursor_record is
less than :max_record, create a KEY-DOWN trigger.
IF :system.cursor_record < :parameter.max_record THEN
  DOWN;
ELSE
  MESSAGE('max records key-down');
END IF;
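To tie this back to the original question (limiting a multi-record block to the number of rows fetched in another block), the same idea can be combined with a fetched-row counter. This is only a sketch: the block names SRC_BLK and DTL_BLK and the global :global.src_count are hypothetical, and it reuses the simplified cursor-record test from the answer above rather than being tested code.
-- PRE-QUERY trigger on SRC_BLK: reset the fetched-row counter.
:global.src_count := '0';
-- POST-QUERY trigger on SRC_BLK: count each fetched row.
:global.src_count := TO_CHAR(TO_NUMBER(:global.src_count) + 1);
-- KEY-CREREC trigger on DTL_BLK: create a record only while the current
-- record number is below the number of rows fetched in SRC_BLK.
IF :system.cursor_record < NVL(TO_NUMBER(:global.src_count), 0) THEN
  CREATE_RECORD;
ELSE
  MESSAGE('You cannot enter more records than were fetched in the other block.');
  RAISE FORM_TRIGGER_FAILURE;
END IF;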

Similar Messages

  • Analyze table after inserting a large number of records?

    For performance purposes, is it good practice to execute an 'analyze table' command after inserting a large number of records into a table in Oracle 10g, if a complex query follows the insert?
    For example:
    Insert into foo ......                  -- Insert one million records into table foo.
    analyze table foo COMPUTE STATISTICS;   -- Analyze table foo.
    select * from foo, bar, car......       -- Execute a complex query without hints
                                            -- after 1 million records have been inserted into foo.
    Does this strategy help to improve the overall performance?
    Thanks.

    Different execution plans will most frequently occur when the ratio of the number of records in the various tables involved in the select has changed tremendously. This happens above all when 'fact' tables grow and 'lookup' tables stay constant.
    This is why you shouldn't test an application with a small number of 'fact' records.
    This can happen both with analyze table and with dbms_stats.
    The advantage of dbms_stats is that it can export the current statistics to a stats table, so you can always revert to them using the dbms_stats import procedures (e.g. import_table_stats).
    You can even override individual table and column statistics with artificial values.
    Hth
    Sybrand Bakker
    Senior Oracle DBA
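    For reference, the dbms_stats route after the bulk insert might look like the following sketch (the table name foo comes from the question; staging_foo and the gathering options are assumptions, not something stated in the thread):
    -- Load the data, then refresh optimizer statistics before running the complex query.
    INSERT /*+ APPEND */ INTO foo
    SELECT * FROM staging_foo;   -- staging_foo is a placeholder source table
    COMMIT;
    BEGIN
      -- Gather fresh table, column and index statistics for FOO in the current schema.
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'FOO', cascade => TRUE);
    END;
    /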

  • Selecting specific number of records

    Hello,
    How can we query a certain number of records from a table? For example, if a table has thousands of records and I wish to query:
    1. The first 500 records
    2. Records between 500 and 1000, or between <anynumber> and <any number>
    3. Records less than <some number>
    I cannot do this with the primary key as it is not sequential.
    I tried to use ROWNUM, but I cannot use it when I want to select rows less than 100 or rows between 100 and 200...
    How can I accomplish this?
    Regards
    Sunny

    1. First 500 records
    select *
    from ( YOUR_QUERY_GOES_HERE -- including the order by )
    where rownum <= 500
    Another way using analytical functions can be found in the documentation: Top-N Ranking.
    2. Records between 500 and 1000, or between <anynumber> and <any number>
    select *
      from ( select a.*, rownum rnum
               from ( YOUR_QUERY_GOES_HERE -- including the order by ) a
              where rownum <= MAX_ROWS )
     where rnum >= MIN_ROWS
    http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:127412348064
    3. Records less than <some number>
    select *
    from my_table
    where my_column < <some number>
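    The analytic-function alternative mentioned above can be sketched as follows; my_table and created_date are placeholder names rather than objects from the thread:
    -- Top-N / range paging with ROW_NUMBER() instead of nested ROWNUM filters.
    select *
      from ( select t.*,
                    row_number() over (order by created_date desc) rn
               from my_table t )
     where rn between 501 and 1000;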

  • Lock specific number of records using ENQUEUE & DEQUEUE

    Hi,
    Is it possible to lock a group of records in R/3?
    My requirement is to update a set of records in the VBAP table. I'm not using a BAPI here; instead, I use a direct UPDATE.
    In this case, I know I can lock individual records by passing VBELN and POSNR. But what if I have to lock 10 records?
    Is this possible in any way?
    Thanks in advance.
    The current solution is:
    1) LOOP at ITAB
    2) LOCK each entry
    3) UPDATE VBAP for that entry
    4) UNLOCK the entry
    5) Endloop
    I thought this solution might work (assume 10 records are present in ITAB):
    1) LOOP at ITAB (lock all 10 entries)
    2) LOCK that entry
    3) ENDLOOP
    4) UPDATE VBAP from ITAB (updates all 10 entries in one database access)
    5) LOOP at ITAB (unlock all 10 entries)
    6) UNLOCK that entry
    7) ENDLOOP
    Any help will be appreciated.
    Tabraiz.

    Hello,
    Both of your solutions will work.
    With solution 1 there will only ever be one enqueue entry at a time, because for each record you enqueue, perform the update and dequeue.
    This means that in SM12 you will only see one enqueue entry on your user ID at any given moment while your program runs.
    Solution 2 is also possible, but there you will create multiple enqueue entries, because you enqueue everything, then perform the updates and then dequeue everything.
    In SM12 (lock entries) this will result in more enqueue records on your user ID while your program runs.
    You have to pay attention that lock entries (SM12) are stored in a lock table of limited size, so make sure with solution 2 that you don't overflow the enqueue table!
    Via tcode RZ11 you can check the parameter enque/table_size (size of the lock table).
    Check the parameter value, but also read its documentation, and you will understand why you should limit the number of open lock records.
    Success.
    Wim Van den Wyngaert

  • Select specific number of rows

    I'm listing all data entries in a database.
    For that I want to select the first ten entries, then the next ten and so on, ordered
    by date_column.
    How can I do that? Is it possible with rownum?
    In MySQL/PHP I did it with LIMIT:
    $query = "select * from article where parent_id=0 order by date desc, time desc limit $select, 10";
    Please email me.
    Thanks for the help
    chris

    But what about rownum if I want the records from 10-20?
    Originally posted by Chen Zhao:
    If you are using Oracle 8i, you can use ROWNUM to specify the specific number of records. You can see the ROWNUM by querying:
    SELECT *
    FROM (SELECT column_name FROM table_name ORDER BY column_name)
    WHERE ROWNUM < 10
    CHEN
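    For the "records from 10-20" case asked above, the nested-ROWNUM pattern from the reply under "Selecting specific number of records" applies directly. This is only a sketch: the table name comes from the MySQL query above, and date_column/time_column stand in for the actual ORDER BY columns.
    -- Rows 11 to 20 of the ordered result (the Oracle equivalent of MySQL's LIMIT 10, 10).
    SELECT *
      FROM ( SELECT a.*, ROWNUM rnum
               FROM ( SELECT *
                        FROM article
                       WHERE parent_id = 0
                       ORDER BY date_column DESC, time_column DESC ) a
              WHERE ROWNUM <= 20 )
     WHERE rnum > 10;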

  • Optimal number of records to fetch from Forte Cursor

    Hello everybody:
    I'd like to ask a very important question.
    I opened a Forte cursor with approx. 1.2 million records, and now I am trying
    to figure out the number of records per fetch needed to obtain
    acceptable performance.
    To my surprise, fetching 100 records at once gave me only an approx. 15 percent
    performance gain in comparison
    with fetching records one by one.
    I haven't found a significant difference in performance when fetching 100, 500 or
    10,000 records at once. At the same time, fetching 20,000
    records at once made performance approx. 20% worse (this fact I cannot
    explain).
    Does anybody have any experience with how to improve performance when fetching from
    a Forte cursor with a large number of rows?
    Thank you in advance
    Genady Yoffe
    Software Engineer
    Descartes Systems Group Inc
    Waterloo, ON
    Canada

    You can do it by writing code in the start routine of your transformations.
    1. If you have any specific criteria for filtering, go with that and delete the unwanted records.
    2. If you want to load a specific number of records based on a count, then in the start routine of the transformation loop through the source package records, keeping a counter until you reach your desired count, and copy those records into an internal table.
    Then delete the records in the source package and assign the records stored in the internal table back to the source package.

  • Maximum number of records to 'BAPI_PIRSRVAPS_SAVEMULTI'

    Hi All,
    Could anybody tell me the maximum number of records that can be passed to the BAPI
    BAPI_PIRSRVAPS_SAVEMULTI?
    This BAPI is used for forecast upload to an SNP planning area (which can be seen in the product view: /sapapo/rrp3).
    Win full points for the resolution...
    Thanks in advance...
    Chandan Dubey

    Hi Chandan - There is no simple answer to this question.
    BAPI_PIRSRVAPS_SAVEMULTI has a built-in package counter (number of records to process) which sends packets of data to liveCache for creating the data. By default this BAPI will process all records at once, but there is a BADI in this BAPI that allows you to set the package size as well as many other things. The performance will depend upon things like your system, environment and volume of data. There are two limitations: 1) the prereading (retrieval of matlocids, matids, locids, pegids, etc.) which happens prior to the liveCache call, and 2) the liveCache call itself. The prereading can cause a memory overload, but that is less likely to happen compared to a liveCache problem. The procedures that call liveCache are more likely to run out of memory than the ABAP tables are; they can cause the program to dump as well, and the dump may be hard to understand.
    What I have done with many programs is to add a wrapper around the liveCache BAPI (or FM) call and use my own counter to send blocks or packets of data to the BAPI. For example, loop through the records in a program and call the BAPI for every 1000 records, accumulating the return info in an internal table. The number of records in each packet or block is driven by a parameter on a selection screen or a value in a Z-table, so the number can be tested and adjusted as needed. The reaction of liveCache BAPIs will differ from system to system due to things such as hardware configuration and volume of data.
    If you do not code the BAPI call as I have described above, or place code in the BADI to set the packet size, or limit the number of records being input some other way, then you are taking a risk that one day a specific number of records will cause a dump in this BAPI.
    I would think you would be safe with 500-1000 records, but you should really test in your system and consider the options for packaging the number of records.
    Andy

  • To get the number of records from CMP

    How can I get a specific number of records (25 records) from CMP using WebLogic 8?
    Anybody who knows, please tell.

    http://java.sun.com/j2se/1.4.1/docs/api/java/io/File.html

  • Using getPrevStepLog to get number of records inserted

    Hi-
    I need to get the number of records inserted by the ODI interface but the value is contained not within previous step, but within previous step's sub-step. e.g.
    Step 0. Load Data Interface (No. of Inserts field is 0, No. of Rows contains a number but it doesn't look like there is a valid parameter string for it...)
    SubStep 1
    SubStep 4 (This substep's No. of Inserts value I want)
    Step 1. Variable Assignment (select '<%=odiRef.getPrevStepLog("INSERT_COUNT")%>') -> this gets me 0, but I want the number from SubStep 4.
    Thank you in advance for any help.

    Use this in your current step to get the previous log information:
    SELECT MAX(NVL(nb_ins, 0))
    FROM <key_work_rep>.SNP_SESS_TASK_LOG
    WHERE sess_no = <%= odiRef.getPrevStepLog("SESS_NO") %>
    AND nno = <%= odiRef.getPrevStepLog("NNO") %>
    AND scen_task_no = <hard code your sub_step_no>
    Exceptions:
    i) If no insert log is available, it returns 0.
    ii) If no previous step log or sub-step is found, it returns 0.
    Regards,
    Himanshu

  • How to control number of records in batch insert by eclipselink

    Hi,
    We are using EclipseLink (2.2) to persist objects and we use the following configuration:
    <property name="eclipselink.jdbc.batch-writing" value="Oracle-JDBC" />
    <property name="eclipselink.jdbc.batch-writing.size" value="5" />
    However, the number of records inserted per batch is much more than 5 (I have seen 5000 records being inserted). How can we control the number of records inserted at once?
    Thanks.

    Binding can be configured using the "eclipselink.jdbc.bind-parameters" property, and is on by default - it should be on for JDBC batch writing.
    Batch writing defaults to 100 statements, so I am not sure why it would include all statements in one batch unless it is not batching at all. If you set the logging level to finest or all, it should print out the values it is using for each property, and also show the SQL statements it is executing. Can you turn on logging and post portions of the logs, particularly the part showing the transaction in question (though maybe only 6 lines of consecutive inserts)?
    Logging is controlled through the "eclipselink.logging.level" property.
    Best Regards,
    Chris

  • How to find out the number of records inserted?

    In a File-to-JDBC scenario, I am inserting n records into Oracle.
    How can I find out the number of records inserted?

    Hi,
    If you are using the statement "UPDATE_INSERT", get the response in the element <insert_count>count</insert_count>. It will give you the number of inserted rows.
    This link can be very helpful:
    [http://help.sap.com/saphelp_nw2004s/helpdata/en/2e/96fd3f2d14e869e10000000a155106/frameset.htm]
    regards.
    roberti

  • Number of records with a multi-table insert

    Hi,
    I need to insert into 3 different tables from one big source table, so I decided to use a multi-table insert. I could use three insert statements, but the source table is an external table and I would prefer not to read it three times.
    I wonder if there is a way to get the exact number of records inserted into each one of the tables?
    I tried using rowcount, but all I get is the number of rows inserted into all three tables combined. Is there a way to get this info without having to execute a "select count" on each table afterward?
    Thanks
    INSERT /*+ APPEND */
      WHEN RES_ENS = 'PU' THEN
        INTO TABLE1
        VALUES (CD_ORGNS, NO_ORGNS, DT_DEB, SYSDATE)
      WHEN RES_ENS = 'PR' THEN
        INTO TABLE2
        VALUES (CD_ORGNS, NO_ORGNS, DT_DEB, SYSDATE)
      ELSE
        INTO TABLE3
        VALUES (CD_ORGNS, NO_ORGNS, DT_DEB, SYSDATE)
    SELECT ES.CD_ORGNS CD_ORGNS, ES.RES_ENS RES_ENS, ES.DT_DEB DT_DEB, ES.NO_ORGNS NO_ORGNS
    FROM ETABL_ENSGN_SUP ES

    I have a large amount of data to load into those three tables. I do not want to use PL/SQL (with loops and lines of code) because I can do the insert directly from the source table and it saves a lot of time.
    INSERT /*+ APPEND */
      WHEN condition1 THEN
        INTO table1
      WHEN condition2 THEN
        INTO table2
    SELECT xx FROM my_table
    For example, if my_table has 750000000 rows, I only need to read it once, and the INSERT ... SELECT is a really fast way to load data.
    As I was saying, the only problem I've got is that I cannot get the number of rows in each table. Do you know a way to get it?
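    One workaround (not from this thread, and it costs one extra pass over the source) is to derive the expected per-table counts from the same conditions that drive the multi-table insert, since SQL%ROWCOUNT after the statement only gives the combined total:
    -- Counts per target table, using the same RES_ENS conditions as the INSERT above.
    SELECT COUNT(CASE WHEN RES_ENS = 'PU' THEN 1 END) AS cnt_table1,
           COUNT(CASE WHEN RES_ENS = 'PR' THEN 1 END) AS cnt_table2,
           COUNT(CASE WHEN RES_ENS NOT IN ('PU', 'PR')
                        OR RES_ENS IS NULL THEN 1 END) AS cnt_table3
      FROM ETABL_ENSGN_SUP;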

  • In Logic Pro or Logic Pro X, Meta Events don't work correctly; for example, after inserting Stop Playback (number 52) at a specific position, the playhead wrongly stops several ticks before it. Then the play button does not start playback.

    In Logic Pro or Logic Pro X, Meta Events don't work correctly; for example, after inserting Stop Playback (number 52) at a specific position, the playhead wrongly stops several ticks before it. Then the play button does not start playback.

    Curious if what you're describing is similar to issue 4, which starts around 9:15 in this video...
    https://youtu.be/q93jdOhi4Oc
    If so, this started for me, or at least I noticed it for the first time, in LPX 10.1.1. What version of Logic are you running?
    I've recently found that this issue also affects note timing on instrument tracks that use the "External Instrument" plugin.

  • How to restrict the user to inserting a certain number of records (urgent)

    I have a master-detail form. My requirement is as follows:
    1) In the master block the user enters other information and enters, say, 5 in a text item.
    2) Then 5 rows should be displayed in the detail block, and the user shouldn't be able to enter more than 5 records in the detail block.

    I hope I understand correctly.
    To close a query when the number of records retrieved reaches :parameter.max_record, and to keep count of the number of records retrieved, create a POST-QUERY trigger:
    IF :parameter.max_record = TO_NUMBER(:global.max_rec) THEN
      ABORT_QUERY;
    ELSE
      :global.max_rec := TO_CHAR(TO_NUMBER(:global.max_rec) + 1);
    END IF;
    :parameter.max_record is the number of records you want to retrieve; set it (and initialize :global.max_rec) in a PRE-FORM trigger.
    Hope this helps.
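    To limit data entry in the detail block to the number typed in the master item (the actual question above), the KEY-CREREC approach shown at the top of this page can read the master item directly. This is only a sketch; the names :MASTER.NUM_DETAILS and the detail block are hypothetical:
    -- KEY-CREREC trigger on the detail block.
    IF :system.cursor_record < NVL(TO_NUMBER(:MASTER.NUM_DETAILS), 0) THEN
      CREATE_RECORD;
    ELSE
      MESSAGE('You cannot enter more than ' || :MASTER.NUM_DETAILS || ' detail records.');
      RAISE FORM_TRIGGER_FAILURE;
    END IF;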

  • Add number of records to a field

    I need to load data into a BW ODS from a flat file and want to populate one field, ROWID, which will hold record numbers starting from 1 if the ODS is empty. This ROWID is used as a key field in the ODS. I want to populate this ROWID based on the number of records in the flat file.
    If there are already records in the ODS and, say, the last record number is 10, the next record number will be 11, and the rest will be populated one by one.
    So first I want to read the ODS table to see if there are any records. If it is empty, ROWID will be populated starting from 1; if not, get the last ROWID and populate from the next number.
    Help me with the code please?

    Hi,
    First select the count of existing records; if the table is empty the count is 0, and the ROWID then starts from 1. Add 1 to the counter for each record you load and move it to the ROWID column.
    * Get the number of records already in the ODS table.
    SELECT COUNT(*) INTO V_COUNT FROM TABLENAME.
    * If no records are found, V_COUNT is 0 and the first ROWID will be 1.
    * Assign the ROWID for the records to be loaded within a loop.
    LOOP AT ITAB.
    * Increment the counter.
      V_COUNT = V_COUNT + 1.
    * Set the row id of the current record.
      ITAB-ROWID = V_COUNT.
      MODIFY ITAB.
    ENDLOOP.
    Hope this helps.
    Thanks
    Naren
