Partial records fetched

Hello gurus,
My inputs are:
1. Code (from BSEG)
2. Date (from BKPF)
3. Region (from KNA1)
Below is my looping code. My SELECT statements are correct, so the selection criteria are satisfied up to that point. I select from BSEG, then BKPF, then KNA1.
The problem: if I enter a code and date that exist, but a region that does not belong to that code and date, the selects find rows in BKPF and BSEG but nothing in KNA1 - which is correct so far. But inside the loop, the code still builds output rows for whatever was satisfied in BKPF and BSEG, just without the KNA1 values. It should produce no output at all, because the region I specified is not satisfied. In other words, it partially shows the output, but actually no data should be displayed.
Please suggest.
LOOP AT it_bseg INTO wa_bseg WHERE kunnr <> ' '.
  READ TABLE it_kna1 INTO wa_kna1 WITH KEY kunnr = wa_bseg-kunnr.
  IF sy-subrc = 0.
    wa_final-name1 = wa_kna1-name1.
    wa_final-mcod3 = wa_kna1-mcod3.
    wa_final-belnr = wa_bseg-belnr.
    READ TABLE it_bkpf INTO wa_bkpf WITH KEY belnr = wa_bseg-belnr.
    IF sy-subrc = 0.
      wa_final-bldat = wa_bkpf-bldat.
      " ... and so on for the remaining fields ...
      APPEND wa_final TO it_final.
    ENDIF.
  ENDIF.
  CLEAR: wa_bseg, wa_bkpf, wa_kna1, wa_t005u, wa_vbrp, wa_konv, wa_final.
ENDLOOP.
Thanx

Hello,
Either check SY-SUBRC after each SELECT statement, or check whether the internal table filled by the previous SELECT is initial before running the next one.
For example:

SELECT ... FROM bseg ...
IF sy-subrc = 0.
  SELECT ... FROM bkpf ...
  IF sy-subrc = 0.
    SELECT ... FROM kna1 ...
  ENDIF.
ENDIF.

Or:

SELECT ... FROM bseg ...
IF it_bseg IS NOT INITIAL.
  SELECT ... FROM bkpf ...
  IF it_bkpf IS NOT INITIAL.
    SELECT ... FROM kna1 ...
  ENDIF.
ENDIF.

Also put the required validations on the selection-screen fields.
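What the poster wants is inner-join semantics across the three tables: a final row should be produced only when a BSEG entry has matches in both KNA1 and BKPF, so an empty KNA1 selection must yield empty output. A language-agnostic sketch of that logic in Python (field and table names are illustrative, mirroring the ABAP work areas):

```python
def build_final(bseg_rows, bkpf_rows, kna1_rows):
    """Inner-join semantics: emit a row only when all three tables match."""
    # Index the lookup tables by key, like READ TABLE ... WITH KEY.
    kna1_by_kunnr = {r["kunnr"]: r for r in kna1_rows}
    bkpf_by_belnr = {r["belnr"]: r for r in bkpf_rows}

    final = []
    for bseg in bseg_rows:
        if not bseg["kunnr"]:                # LOOP ... WHERE kunnr <> ' '
            continue
        kna1 = kna1_by_kunnr.get(bseg["kunnr"])
        bkpf = bkpf_by_belnr.get(bseg["belnr"])
        if kna1 is None or bkpf is None:     # skip partial matches entirely
            continue
        final.append({"belnr": bseg["belnr"],
                      "name1": kna1["name1"],
                      "bldat": bkpf["bldat"]})
    return final

# If the region filter left KNA1 empty, no partial rows are emitted:
bseg = [{"kunnr": "C1", "belnr": "100"}]
bkpf = [{"belnr": "100", "bldat": "20240101"}]
print(build_final(bseg, bkpf, []))   # -> []
```

Equivalently, in ABAP, selecting KNA1 and BKPF with FOR ALL ENTRIES and refusing to enter the loop when any of the three tables is empty gives the same all-or-nothing behavior.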

Similar Messages

  • HT2801 HOW DO YOU KNOW WHAT IS AVAILABLE CAPACITY OF CD THAT IS PARTIALLY RECORDED

    I have just installed the USB SuperDrive on my iMac.  It works fine.
    My question is: how do you find out the remaining capacity on a partially recorded CD?

    Another way is to insert the CD into the drive, let it mount on the desktop, click your mouse once on the CD/DVD icon and use the keyboard key combination Command-I.
    This will tell you the capacity of the disc, how much was used and what was left of the CD's total capacity.

  • To display records fetched from the DB through ALBPM, in a custom JSP

    I want to display the records fetched from the database through ALBPM in a custom JSP as a table.
    Currently I am fetching the records, putting them in a BPM object group, and sending it to the JSP as a BPM object. Now in the JSP I am trying to iterate and create a table to display the fetched data using the JSTL tag:
    <c:forEach ...>
    </c:forEach>
    But I am always getting the following error in the ALBPM workspace:
    The task could not be successfully executed. Reason: 'java.lang.RuntimeException: /webRoot/customJSP/CustomerSearch1.jsp(29,2) No such tag forEach var in the tag library imported with prefix c'
    But there is a forEach tag in the JSTL. How do I solve this error, or is there an alternative way to display the records in a JSP?
    Thanks,
    Suman.

    Hello. I use custom JSP and it works well. You have to make sure 2 things:
    a) You have imported the libs, including these lines in your JSP:
    <%@ page session="true" %>
    <%@ taglib uri="http://java.sun.com/jstl/core" prefix="c" %>
    <%@ taglib uri="http://fuego.com/jsp/ftl" prefix="f" %>
    b) The variable you are using in the "forEach" statement is stored in the BPM object you pass to the JSP in the call.
    Hope this helps.... Bye

  • Maximum record fetched property of block

    hi friends,
    i have a base-table block and i set the Maximum Records Fetched property for this block to 1, but when i execute_query i see all records. can anyone help me?
    i also set the property Query Array Size, but it does not work either!
    thanks
    regards,
    shoja

    This property takes effect only when the Query All Records property is set to Yes - and you may have to set the interaction mode to Non-Blocking at the form level.
    A simpler way to limit the number of rows would be to add
    where rownum < 2
    to the WHERE clause of the block.

  • Building a block based on the records fetched - Forms 10g

    Is there a way in Forms to dynamically build the block? I mean, in a multi-record block we usually set the Number of Records Displayed in the property palette. Instead of setting the property statically, is there a way to set it based on the records fetched by the query?
    select a.unit_id, substr(b.product_code,1,5)
    from [email protected] a,[email protected] b
    where substr(a.product_code,1,5) = b.product_code
    and substr(a.product_code,1,5) = 'E3088'
    and inv_product_type in ('PHER','LPHER','VPHER')
    and warehouse_id = 'A'
    This is the query.
    If the query fetched 5 units for product code 'E3088' of the inv_product_type values mentioned above, is it possible to build the multi-record block with 5 displayed records? For another product, E3077, it might give us 6 records.
    User wants me to see if I can do it? Is it possible?
    Thanks in advance.
    Anu

    Hi,
    I've not tried this and can't say for sure whether it is possible or not.
    But have a look at this link: Re: Automatic Number of records displayed.
    I'd thought of a solution but never tried it myself. Try it and see if it works for you.
    If it doesn't, I think it will not be possible in Oracle Forms.
    Navnit

  • Record fetch size property hint

    Hi all,
    I'm using Oracle Forms Builder 10g and Oracle DB 11g. I have an LOV which selects about 150k rows. In the database, select * from <table> returns the first 50 rows in about a second. My question is: what value should I set for the Record Fetch Size property so that the first records arrive faster? Right now the LOV takes 20-30 seconds to load, which is unacceptable for me :). I checked the docs for this property - if I increase it, I suppose it will display rows faster?
    Thanks in advance,
    Bahchevanov.

    The records are composed of two columns, a code (5 characters) and a description (average size 25 characters). The average size of the record is then approx. 30 characters. From the documentation available in Forms:
        Also, the way in which the actual value is computed when a value of 0 is
        specified has changed. The actual value in this case is now
        0.5 M / total_record_size (i.e. sum_of_column_sizes, not max_column_size),
        but no more than 100 and no less than 20.  The coefficients (0.5 M, 100, and
        20) can be changed by setting these environment variables: 
        FORMS_COMPUTED_RGFS_DIVIDEND, FORMS_MAX_COMPUTED_RGFS,
        and FORMS_MIN_COMPUTED_RGFS.
    I believe that we are running with the default. Is this OK for this data set? Do we need to set the environment variables listed above?
    Thanks,
    Thomas
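    The computed default can be sanity-checked by hand. A small sketch, assuming "0.5 M" means 512 * 1024 bytes (the exact constant is an assumption; the environment variables named in the quote override the coefficients):

```python
def computed_record_fetch_size(record_size,
                               dividend=512 * 1024,  # FORMS_COMPUTED_RGFS_DIVIDEND
                               max_rgfs=100,         # FORMS_MAX_COMPUTED_RGFS
                               min_rgfs=20):         # FORMS_MIN_COMPUTED_RGFS
    """Record fetch size when the property is 0: dividend / total_record_size,
    clamped to the [min_rgfs, max_rgfs] range described in the Forms docs."""
    value = dividend // record_size
    return max(min_rgfs, min(max_rgfs, value))

# A ~30-character record (5-char code + ~25-char description):
print(computed_record_fetch_size(30))   # -> 100
```

    So for this record size the default already sits at the 100-row ceiling; raising FORMS_MAX_COMPUTED_RGFS (or setting the property explicitly) is what would actually increase the fetch size.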

  • Records fetching & writing problem in Oracle DB

    Hi,
    Currently we are fetching 1 million records from an Oracle DB. We are using PreparedStatement, and via fetchSize() we fetch the records in chunks in an iterative manner rather than fetching all 1 million records at once. The same records are passed to a utility that writes them to another DB table using the addBatch() & batchUpdate() methods. Even with the above in place, the performance is not great. Hence, can we do the below:
    1) Cache the records and reuse them. The problem here is that the data is dynamic and there is no static data involved, so does caching even work here? If so, which caching mechanism should we go with?
    2) Even though PreparedStatement is a precompiled statement, is there any way to have the SQL queries located directly at the database/prepared statement for faster reading and writing of the data?
    Here the SQL queries are simple "select" and "insert" statements, so there is no need to fine-tune the queries. Please clarify how we can go about this.
    Thanks.
    Edited by: 797836 on May 23, 2013 7:59 PM

    797836 wrote:
    Currently we are fetching 1 million records from Oracle DB , where we are using PreparedStatement and using the fetchSize() we are fetching the records in a "chunk" in an iterative manner and not feching all 1 million records at once. The same amount of records are passed to an utility to write the records in another DB table using addBatch() & batchUpdate() method. If the above operations in place, still the performance is not that great. Hence can we do the below:-
    1) Caching the records and reusing it, the problem here is the data is dynamic and there is no static data involved here, hence whether caching works here? If so we can go with which caching mechanism?
    So what.. is Oracle's database buffer cache too small and incapable of caching data? Or is it so badly written that it is incapable of caching data effectively?
    And big powerful and wonderful application/application tier has a badass buffer cache?
    So why even bother with the database then? Why not fit the entire database in the badass app buffer cache, drop Oracle, and save money?
    If I seem to be foaming at the mouth a bit, it is because Oracle is technically the best RDBMS product on this planet. I process billions of rows (read and write) every single day on my largest database. Without any magic or potions, but by simply sticking to the fundamentals of data processing in Oracle. Then I run across a statement like yours that squarely places the blame for your performance woes on Oracle, and then want to reinvent database functionality, like a db buffer cache, in the app layer. Without a single shred of technical evidence that Oracle is actually the cause of the performance woes.
    And not a word about the latency it takes to get a million rows from spinning rust/disks (where it resides originally)?
    And not a word about the latency of pushing that million rows as IP packets (1500 bytes max size each) from the source database to the app, only to have the app sending that same data as IP packets to the target database? Effectively hitting the network twice with that data volume...
    2) Eventhough PreparedStatement is precompiled statement, is there any where we can have the SQL queries directly located to the database/prepared statement for the faster reading and writing of the data?
    How is the prepared statement now a problem? Have you actually timed it, and do you have metrics that show it to be the bottleneck?
    Here the SQL queries are pretty simple "select" and "Insert" query , hence there is no need to fine tune the queries. Please clarify how we can go about this?
    Send data directly from the source database to the target database - a single transit of the data volume over the network.
    Make sure that network speed and bandwidth suffices - as this, and I/O latency getting data off disks, are the biggest contributors to elapsed processing time.
    If there is spare bandwith and I/O capacity, use parallel processing. Instead of using a single data stream process to get data from source db to target db, use multiple streams.
    Consider compression of data during the network transit phase, especially if the data volume is large and compresses well.
    Of course, the infrastructure can also be tuned/upgraded. Bigger and faster network pipes. Faster disks. Shorter network routes with fewer hops. Use QoS/DSCP on network layer to prioritise the traffic from source to target db. Etc,
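    The chunked read + batched write pattern under discussion (fetchSize() on the read side, addBatch()/batchUpdate() on the write side) can be sketched generically. The Python below uses the stdlib sqlite3 module purely as a stand-in for the two databases; the table and column names are made up:

```python
import sqlite3

def copy_in_chunks(src_conn, dst_conn, chunk_size=10_000):
    """Stream rows from a source table to a target table in fixed-size
    batches, mirroring JDBC's fetchSize() plus batched inserts."""
    src = src_conn.execute("SELECT id, payload FROM src_table")
    copied = 0
    while True:
        rows = src.fetchmany(chunk_size)      # bounded memory per chunk
        if not rows:
            break
        dst_conn.executemany(
            "INSERT INTO dst_table (id, payload) VALUES (?, ?)", rows)
        copied += len(rows)
    dst_conn.commit()
    return copied

# Demo with in-memory SQLite standing in for the two databases:
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE src_table (id INTEGER, payload TEXT)")
dst.execute("CREATE TABLE dst_table (id INTEGER, payload TEXT)")
src.executemany("INSERT INTO src_table VALUES (?, ?)",
                [(i, f"row{i}") for i in range(25)])
print(copy_in_chunks(src, dst, chunk_size=10))   # -> 25
```

    The same shape applies with JDBC: a bounded fetch keeps memory flat and a batched insert amortizes round-trips - but, as the reply stresses, a direct db-to-db transfer that skips the app tier removes one full network transit of the data.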

  • Cursor -record fetch

    Hi,
    I'm getting the following error.
    Error ** READNEXT: Unable to fetch next array of records from cursor
    Kindly advise me on the same.
    Thanks
    KSG

    KSG wrote:
    Hi,
    I'm getting the following error.
    Error ** READNEXT: Unable to fetch next array of records from cursor
    Kindly advise me on the same.
    Only if you would post details - like the Oracle version, how you get this error, and the piece of code that suffers from this problem.

  • XML file not properly read from Appc Server. Partial records filled into IT

    Hi,
    I am reading an XML file from the application server. Only 2 records are coming into the internal table; the other records are not. SY-SUBRC becomes 4 after 2 iterations. Below is my code. Kindly suggest.
    open dataset p_unix in text mode message msg.
    if sy-subrc <> 0.
      message i001(38) with 'Error on Open of File: ' msg.
    else.
      while sy-subrc = 0.
        read dataset p_unix into infile.
        check sy-subrc = 0.
        append infile.
      endwhile.
      close dataset p_unix.
    endif.
    Thanks,
    Debi.

    Hi,
    this is probably because there's only one end-of-line character in the file (probably right after <?xml ... ?>). This is normal for an XML stream: you must read the second "line" into a string data object, which then receives the whole XML stream (except the XML header, which is stored in the first line). And that's done.
    BR
    Sandra
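    Sandra's point - that the file contains only one end-of-line, so line-based reads see just two "records" - is easy to reproduce outside ABAP. A Python illustration (the XML content is made up):

```python
import io

# A typical one-line XML stream: one newline after the header, none afterwards.
xml_file = io.StringIO(
    '<?xml version="1.0"?>\n<root><item>a</item><item>b</item></root>')

# Line-based reading (like READ DATASET in text mode) yields only 2 "records":
xml_file.seek(0)
lines = xml_file.readlines()
print(len(lines))            # -> 2: the header line and the whole payload

# Reading the remainder into a single string captures the full stream:
xml_file.seek(0)
header = xml_file.readline()
payload = xml_file.read()    # everything after the header, in one string
print(payload)
```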

  • Partial Records insertion using DB Adapter - SOA 11g

    Hi,
    We have a BPEL process in which we are inserting records into a table using the DB Adapter. Currently, if the input data has any data-type mismatch, all the records are rejected - none of the records are inserted in the table; the whole batch is rejected.
    We have a new requirement that when there is a problem with any record, only that problematic record is rejected and the rest of the records are inserted in the table. Is it possible to do so?
    Thanks,
    Sanjay

    In that case, it is better to move the insert statement into a procedure, do the insert there, and return a status value; if there are many insert statements, this will also improve the response time.
    How many rows will you insert in the very worst case - something like 100 lines?
    Thanks,
    Vijay

  • Results page only displays partial record

    Hello everyone: I am new to this forum and would like to request some help. I have searched and searched but cannot find the answer to my problem on this forum or others.
    My problem is as follows:
    I have a results page which displays the complete version of a single news story. I am pulling the story with a simple URL parameter. The PHP tag for the affected field contains a simple nl2br statement, and I am not using any complicated queries to pull the data.
    When I update the record from the update page I get no errors of any kind, but when I load the page that displays the updated record, the contents of the field in question are truncated.
    I cannot see any problems with any of the code, and I am quite baffled by it.
    Here is the code for the record display
    <?php echo nl2br ($row_rsNews['story']); ?>
    And this is the code for the update action
    <?php
    function GetSQLValueString($theValue, $theType, $theDefinedValue = "", $theNotDefinedValue = "")
    {
      $theValue = (!get_magic_quotes_gpc()) ? addslashes($theValue) : $theValue;

      switch ($theType) {
        case "text":
          $theValue = ($theValue != "") ? "'" . $theValue . "'" : "NULL";
          break;
        case "long":
        case "int":
          $theValue = ($theValue != "") ? intval($theValue) : "NULL";
          break;
        case "double":
          $theValue = ($theValue != "") ? "'" . doubleval($theValue) . "'" : "NULL";
          break;
        case "date":
          $theValue = ($theValue != "") ? "'" . $theValue . "'" : "NULL";
          break;
        case "defined":
          $theValue = ($theValue != "") ? $theDefinedValue : $theNotDefinedValue;
          break;
      }
      return $theValue;
    }

    $editFormAction = $_SERVER['PHP_SELF'];
    if (isset($_SERVER['QUERY_STRING'])) {
      $editFormAction .= "?" . htmlentities($_SERVER['QUERY_STRING']);
    }

    if ((isset($_POST["MM_update"])) && ($_POST["MM_update"] == "updatenews")) {
      $updateSQL = sprintf("UPDATE news SET `date`=%s, subtitle=%s, title=%s, story=%s, image=%s, thumb=%s WHERE id=%s",
                           GetSQLValueString($_POST['date'], "date"),
                           GetSQLValueString($_POST['subtitle'], "text"),
                           GetSQLValueString($_POST['headline'], "text"),
                           GetSQLValueString($_POST['story'], "text"),
                           GetSQLValueString($_POST['image'], "text"),
                           GetSQLValueString($_POST['thumb'], "text"),
                           GetSQLValueString($_POST['id'], "int"));

      mysql_select_db($database_ifbdb, $ifbdb);
      $Result1 = mysql_query($updateSQL, $ifbdb) or die(mysql_error());

      $updateGoTo = "news_list.php";
      if (isset($_SERVER['QUERY_STRING'])) {
        $updateGoTo .= (strpos($updateGoTo, '?')) ? "&" : "?";
        $updateGoTo .= $_SERVER['QUERY_STRING'];
      }
      header(sprintf("Location: %s", $updateGoTo));
    }

    $colname_rsNews = "-1";
    if (isset($_GET['id'])) {
      $colname_rsNews = (get_magic_quotes_gpc()) ? $_GET['id'] : addslashes($_GET['id']);
    }
    mysql_select_db($database_ifbdb, $ifbdb);
    $query_rsNews = sprintf("SELECT * FROM news WHERE id = %s", $colname_rsNews);
    $rsNews = mysql_query($query_rsNews, $ifbdb) or die(mysql_error());
    $row_rsNews = mysql_fetch_assoc($rsNews);
    $totalRows_rsNews = mysql_num_rows($rsNews);
    ?>
    I would truly appreciate any help this group can
    offer.

    Hi Gunther:
    I am really quite embarrassed about this, but I think I found the problem...
    I had a look at the table configuration in MySQL Administrator, and there was the problem staring me right in the face: the field 'story' was set to VARCHAR(255)! I reset it to TEXT and all is fine now.
    Please accept my sincerest thanks for your help. Your suggestion to check whether I was using a textarea is what led me to check the datatype again. I have no idea how it got changed, but I now know where to check first if it happens again.
    Thanks a ton!

  • Record fetch is not correct in oracle apps report

    Hi ALL,
    i developed a report in Oracle Apps in which i used a customized table that is populated in the AFTER PARAMETER FORM trigger and used in the main data model.
    when two users run this report at the same time, the data in the report gets mixed up.
    please help me with how to rectify this.
    Thanks,
    Mack

    Hi Mack,
    you can add one more column, request_id, to the custom table and populate it along with the report query output data. while displaying the data for any particular user/request, use request_id to fetch the correct data for the report.
    Regards
    Imran

  • Maximum number of records fetched by ABAP Query

    Hi Experts,
    Please tell me what is the specific maximum numbers of records that can be handled by an ABAP Query.
    Thanks in advance.
    Regards,
    Bilal

    Use a query similar to this:

    SELECT ebeln                 " Purchasing document number
           ernam                 " Name of person who created the object
           lifnr                 " Vendor's account number
           ekgrp                 " Purchasing group
           bedat                 " Purchasing document date
      FROM ekko
      APPENDING TABLE t_ebeln
      PACKAGE SIZE 10000
      WHERE ebeln IN s_ebeln.
    ENDSELECT.

    Don't forget to write the ENDSELECT.
    Regards,
    Pavan P.
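    PACKAGE SIZE simply hands the SELECT loop its result set in fixed-size blocks, so memory stays bounded no matter how many rows match. The generic shape of that iteration, sketched in Python:

```python
from itertools import islice

def packages(rows, size):
    """Yield fixed-size packages of rows, like SELECT ... PACKAGE SIZE:
    each pass of the loop sees at most `size` rows."""
    it = iter(rows)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

chunks = list(packages(range(25), 10))
print([len(c) for c in chunks])   # -> [10, 10, 5]
```

    So there is no hard maximum on the total number of records an ABAP query can deliver this way; the practical limit is the memory available for the internal table that accumulates them.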

  • Records fetch which not exist in both tables

    hi all,
    i have 2 tables, zlease and zcust. the common field is zempno, and i want to get the records which are not present in both tables. e.g. the records of the two tables:

    i_emp-zempno: VMX020, 3, 1, 10, 10522
    i_cust-zempno: 10522, 3, vmx020

    and i am supposed to get 2 records: 1 and 10.
    i have written the code below, but it doesn't give me that result - please see my code:

    loop at i_cust.
      read table i_cust with key zempno = i_emp-zempno.
      if not i_cust-zempno = i_emp-zempno.
        move i_cust to itab.
        append itab.
      endif.
    endloop.
    loop at itab.
      write : / itab-zempno, itab-zcustnumber, itab-zcustname, itab-zcustbd, itab-zno_chil.
    endloop.
    loop at itab.
        write : / itab-zempno, itab-zcustnumber, itab-zcustname, itab-zcustbd, itab-zno_chil.
    endloop.

    to get records from i_emp which are not in i_cust:

    loop at i_emp.
      read table i_cust with key zempno = i_emp-zempno.
      if sy-subrc <> 0.
        move i_emp to itab.
        append itab.
      endif.
    endloop.

    and if you also want the records from i_cust, do the same the other way round:

    loop at i_cust.
      read table i_emp with key zempno = i_cust-zempno.
      if sy-subrc <> 0.
        move i_cust to itab.
        append itab.
      endif.
    endloop.

    loop at itab.
      write : / itab-zempno, itab-zcustnumber, itab-zcustname, itab-zcustbd, itab-zno_chil.
    endloop.

    thanks
    anil
    Edited by: anil chaudhary on Sep 4, 2008 12:14 PM
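    What both snippets implement is an anti-join on zempno: keep the keys present in one table but missing from the other. The set logic, sketched in Python with the keys from the post (the case-insensitive comparison is an assumption, since the post mixes VMX020 and vmx020):

```python
def anti_join(left_keys, right_keys):
    """Keys present in left_keys but absent from right_keys."""
    right = {k.lower() for k in right_keys}
    return [k for k in left_keys if k.lower() not in right]

i_emp = ["VMX020", "3", "1", "10", "10522"]
i_cust = ["10522", "3", "vmx020"]

print(anti_join(i_emp, i_cust))   # -> ['1', '10']
```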

  • Max limit of record fetch in Parameter form

    I am working in Reports 6i.
    I am populating an LOV in the parameter form. The query used in the LOV returns more than 100,000 records.
    When I execute the report, it fails with the following error:
    REP-0066: Error executing CA Utility
    REP-3335: Unhandled Internal CA Error.
    calaa 1
    When I remove this parameter form query, my report runs smoothly without any error.
    Please tell me whether there is any solution to populate more than 100,000 rows in the parameter form.
    Thanks In Advance
    Dheeraj

    I would question why you're trying to generate an LOV with 100,000 rows in it. You possibly should consider another UI for presenting to the user, possibly have a Forms parameter form to drive the report where you have a lot more control (including tree controls).
    Another alternative is to create an HTML parameter form using Reports before/after parameter form escapes to generate the HTML controls you need. The constraints then come down to those of the browser rather than Reports.
