Update a large record set

I have to refresh the data in a table (say, table_reference_data) from a view (say, view_master_data) which is on a remote database accessible via a database link. The data for a few columns in "table_reference_data" has to be refreshed from 2 local tables (say, table_x and table_y).
By refresh I mean: insert new records, delete records that no longer exist, and update existing records. All primary keys are in "view_master_data". I have to refresh about half a million records on average. This has to be done once a day as a nightly job, and it is the only means by which "table_reference_data" is updated; everywhere else it is only read.
"table_reference_data" has 3 indexes and one unique key constraint. There are no referential integrity constraints.
At the moment this is done as follows using a PL/SQL procedure:
1. create synonym for "view_master_data" (the remote view).
2. bulk collect data for all columns by JOINing "view_master_data", "table_x" and "table_y".
3. delete all records in "table_reference_data" (I need to fall back to yesterday's data if today's refresh fails, hence DELETE rather than TRUNCATE, since TRUNCATE cannot be rolled back).
4. bulk insert data into "table_reference_data" from the collections of step 2.
5. on exception roll back, otherwise commit.
This job takes 10 minutes to run.
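For reference, here is a minimal sketch of the procedure as described - the object names come from the post, while the column names (pk_col, col1) and the join conditions are assumptions for illustration only:

CREATE OR REPLACE PROCEDURE refresh_reference_data IS
  -- 8i-style index-by collections for the bulk operations
  TYPE tbl_pk   IS TABLE OF table_reference_data.pk_col%TYPE INDEX BY BINARY_INTEGER;
  TYPE tbl_col1 IS TABLE OF table_reference_data.col1%TYPE INDEX BY BINARY_INTEGER;
  l_pk   tbl_pk;
  l_col1 tbl_col1;
BEGIN
  -- Step 2: bulk collect the joined data; the synonym view_master_data
  -- points at the remote view over the database link.
  SELECT m.pk_col, x.col1
    BULK COLLECT INTO l_pk, l_col1
    FROM view_master_data m, table_x x, table_y y
   WHERE x.pk_col = m.pk_col
     AND y.pk_col = m.pk_col;
  -- Step 3: DELETE rather than TRUNCATE so the whole job can be rolled
  -- back to yesterday's data if anything fails.
  DELETE FROM table_reference_data;
  -- Step 4: bulk insert from the collections.
  FORALL i IN 1 .. l_pk.COUNT
    INSERT INTO table_reference_data (pk_col, col1)
    VALUES (l_pk(i), l_col1(i));
  -- Step 5: commit on success, roll back on any error.
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;
END refresh_reference_data;
/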
I have 2 questions now...
1. Does deleting half a million records generate a huge amount of redo/undo in Oracle? Is that a downside? If so, what are the alternatives?
2. When the above stored procedure is invoked from a shell script, I randomly get the error below.
ERROR at line 1:
ORA-03113: end-of-file on communication channel
Is there any workaround for this? The database version is Oracle 8.1.6 Enterprise Edition. Does it have anything to do with using collections (i.e. type tbl_my_col is table of varchar2(6))? In the shell script it is invoked as below:
echo "set serveroutput on\n execute my_pkg.my_proc()\n" | sqlplus ora_user/ora_password >> $log_file
Thanks in advance.

> Is there any workaround for this? Database version is Oracle 8.1.6 Enterprise Ed.
ORA-3113 is one of those generic error messages which Oracle uses when something unexpected happens (see also ORA-600 and ORA-7445). If you're lucky there will be some additional information in the alert log or in a trace file in the background_dump_dest directory.
Sometimes there's a workaround, sometimes there's a patch. Given that you're on 8.1.6, which has been out of support for some time now, neither is likely to be forthcoming from Oracle.
Having re-read your description of the process, I am a bit puzzled as to why you're using this complicated hand-rolled process instead of a snapshot (materialized view).
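For illustration, a complete-refresh snapshot over the link might look something like this - a sketch only, where the link name (remote_db), the nightly schedule and the join columns are all assumptions:

CREATE SNAPSHOT mv_reference_data
  REFRESH COMPLETE
  START WITH SYSDATE
  NEXT TRUNC(SYSDATE) + 1 + 2/24   -- refresh nightly at 02:00
AS
SELECT m.pk_col, x.col1, y.col2
  FROM view_master_data@remote_db m, table_x x, table_y y
 WHERE x.pk_col = m.pk_col
   AND y.pk_col = m.pk_col;

The complete refresh does internally what the hand-rolled job does (delete and re-insert) inside one transaction, so a failed refresh leaves yesterday's data in place.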
Cheers, APC

Similar Messages

  • Displaying large result sets in Table View – request for patterns

    When providing a table of results from a large data set from SAP, care needs to be taken in order to not tax the R/3 database or the R/3 and WAS application servers.  Additionally, in terms of performance, results need to be displayed quickly in order to provide sub-second response times to users.
    This post is my thoughts on how to do this based on my findings that the Table UI element cannot send an event to retrieve more data when paging down through data in the table (hopefully a future feature of the Table UI Element).
    Approach:
    For data retrieval, we need an RFC with search parameters that retrieves a maximum number of records (say 200) and a flag indicating whether the maximum was reached.
    In terms of display, we use a table UI Element, and bind the result set to the table.
    For sorting, when they sort by a column, if we have less than the maximum search results, we sort the result set we already have (no need to go to SAP), but otherwise the RFC also needs to have sort information as parameters so that sorting can take place during the database retrieval.  We sort it during the SQL select so that we stop as soon as we hit 200 records.
    For filtering, again, if less than 200 results, we just filter the results internally, otherwise, we need to go to SAP, and the RFC needs to have this parameterized also.
    If the requirement is that the user must look at more than 200 results, we need to have a button on the screen to fetch the next 200 results.  This implies that the RFC will also need to have a start point to return results from.  Similarly, a previous 200 results button would need to be enabled once they move beyond the initial result set.
    Limitations of this are:
    1.     We need to use custom RFC functions, as BAPIs don't generally provide this type of sorting and limiting of data.
    2.     Functions need to directly access tables in order to do sorting at the database level (to reduce memory consumption).
    3.     It’s not a great interface to add buttons to “Get next/previous set of 200”.
    4.     Obviously, based on where you are getting the data from, it may be better to load the data completely into an internal table in SAP, and do sorting and filtering on this, rather than use the database to do it.
    Does anyone have a proven pattern for doing this or any improvements to the above design?  I’m sure SAP-CRM must have to do this, or did they just go with a BSP view when searching for customers?
    Note – I noticed there is a pattern for search results in some documentation, but it does not exist in the sneak preview edition of developer studio. Has anyone had any exposure to this?
    Update - I'm currently investigating whether we can create a new value node and use a supply function to fill the data.  It may be that when we bind this to the table UI element, that it will call this incrementally as it requires more data and hence could be a better solution.

    Hi Matt,
    I'm afraid the supplyFunction will not help you out of this, because it's only called if the node is invalid or gets invalidated again. The number of elements a node contains defines the number of elements the table uses to determine the overall number of table rows. Something quite similar to what you want already exists in the WD runtime for internal usage. As you've surely noticed, only "visibleRowCount" elements are initially transferred to the client. If you scroll down one or multiple lines, the following rows are transferred internally on demand. But this doesn't really help you, since:
    1. You don't get this event at all, and
    2. Even if you did get the event: since the number of node elements determines the table's overall row count, the event would never request elements with an index greater than the number of node elements - 1.
    You can mimic the desired behaviour by hiding the table footer and creating your own buttons for pagination and scrolling.
    Assume you have 10 displayed rows and 200 overall rows. What you need to implement the desired behaviour is:
    1. A context attribute "maxNumberOfExpectedRows" type int, which you would set to 200.
    2. A context attribute "visibleRowCount" type int, which you would set to 10 and bind to table's visibleRowCount property.
    3. A context attribute "firstVisibleRow" type int, which you would set to 0 and bind to table's firstVisibleRow property.
    4. The actions PageUp, PageDown, RowUp, RowDown, FirstRow and LastRow, which are used for scrolling and the corresponding buttons.
    The action handlers do the following:
    PageUp: firstVisibleRow -= visibleRowCount (must be >=0 of course)
    PageDown: firstVisibleRow += visibleRowCount (first + visible must be < maxNumberOfExpectedRows)
    RowDown/Up: firstVisibleRow++/-- with the same restrictions as in page "mode"
    FirstRow/LastRow is easy, isn't it?
    Since you know which sections of elements have already been "loaded" into the dataSource node, you can fill the necessary sections on demand when the corresponding action is triggered.
    For example, if you initially display elements 0..9 and go to the last row, you load entries from maxNumberOfExpectedRows (200) - visibleRowCount (10) onward, i.e. you would request entries 190 to 199 from the backend.
    A drawback is that the BAPIs/RFCs still have to be capable of processing such "section selecting".
    Best regards,
    Stefan
    PS: And this is meant as a workaround and does not really replace your pattern request.

  • How do I update all the records in a table from the contents of another table?

    Some background: I have a Pervasive database that runs our accounting software, from which I pull a product list. I have that list stored in a table in SQL Server. From time to time we update the records in the Pervasive database, and I want to be able to pull those changes into the SQL table. I don't want to drop the table and recreate it, because if a product is no longer active in the Pervasive database it will not be returned by the query (if I return all the products, even the inactive ones, I get thousands of records rather than just a few hundred), and I do not want to lose the products in the table that are now inactive.
    So what I want to do is pull the list from Pervasive, compare it to the list that exists in SQL, update any changed records, add any new records, and leave any now-missing records alone (missing from the Pervasive list but present in the SQL list). I have no trouble pulling the records from Pervasive (now), but I am a little stumped on how to do the rest. I am not sure whether this is a situation to use MERGE or not. I also do not really need this to be done on a regular basis, as the changes do not happen often enough for that; the ability to trigger it manually would be enough.
    Any help would be appreciated.
    David

    Hi David,
    Let's say you want to go with the lookup transformation, moving the data from server A, table A1, to server B, table B1.
    What you need to do is configure the Lookup options as follows:
    - In General -> "Specify how to handle rows with no matching entries" -> "Redirect rows to no match output".
    - In Connection -> set the OLE DB connection to server B and select the table where you want to apply the changes (table B1).
    - In Columns -> link the product column from table A1 to the product column in table B1, and do not select any columns to output.
    - Now the component is ready for input. When you connect your Lookup to the destination component, you will get an option to select which output to use - use "Lookup No Match Output".
    This will allow you to add only the new items to table B1.
    Teddy Bejjani - BI Specialist @ Netways
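    For completeness, the MERGE route David asked about would look roughly like this - a hedged sketch in T-SQL, with all table and column names assumed:
    -- Upsert the freshly pulled Pervasive rows into the existing table.
    MERGE INTO dbo.ProductList AS target
    USING dbo.PervasiveProducts AS source
        ON target.ProductCode = source.ProductCode
    WHEN MATCHED THEN
        UPDATE SET target.Description = source.Description,
                   target.Price       = source.Price
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (ProductCode, Description, Price)
        VALUES (source.ProductCode, source.Description, source.Price);
    -- Deliberately no WHEN NOT MATCHED BY SOURCE clause, so products
    -- that went inactive in Pervasive are left alone in the SQL table.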

  • How to update or delete records in a Complex View in Forms?

    Hi,
    I have a requirement to create a form based on a complex view. Insertion is possible, but update and delete are not working properly: I get error FRM-40501. How can I update or delete records in a complex view in Forms?
    Thanks & Regards,
    Hari Babu

    Depending on how complex your view is, Forms is not able to determine how to appropriately lock a record when you try to update or delete it.
    One approach to using complex views in forms:
    1. Set the Key-mode of the block to "Non-Updateable"
    2. Mark the column which can be used to build the WHERE-condition to uniquely identify a record with "Primary Key" = "Yes"
    3. For doing INSERT, UPDATE and DELETE, create an INSTEAD-OF-trigger on the view.
    4. Create your own ON-LOCK-trigger in forms which does the locking of the records to update.
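    A minimal sketch of the INSTEAD OF trigger from step 3 - the view, base table and column names are assumptions:
    CREATE OR REPLACE TRIGGER trg_complex_view_iud
      INSTEAD OF INSERT OR UPDATE OR DELETE ON my_complex_view
      FOR EACH ROW
    BEGIN
      -- Route each DML operation to the appropriate base table.
      IF INSERTING THEN
        INSERT INTO base_table (id, name) VALUES (:NEW.id, :NEW.name);
      ELSIF UPDATING THEN
        UPDATE base_table SET name = :NEW.name WHERE id = :OLD.id;
      ELSE
        DELETE FROM base_table WHERE id = :OLD.id;
      END IF;
    END;
    /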

  • File content conversion record set per message

    Dear All,
    Problem: the file is very large, because of which file content conversion takes a long time and fails.
    File format:-
    Header
    Detail
    Detail
    Header
    Detail
    Detail
    Detail
    Detail
    Header
    Detail
    Detail
    Trailer
    The trailer has the total count of all detail and header records, and there are a few checks with respect to other fields as well.
    We need to perform all the above validations on the file; if they succeed it should be processed, otherwise an alert should be raised.
    What I have done so far:
    I have used "Recordsets per Message" for splitting up the file because of its huge size. This functionality works fine, but because of the splitting I am not able to do the trailer validation, as XI creates multiple messages with different message IDs.
    Is there any other approach which will help achieve both splitting and validation?
    chirag

    Chirag,
    the simplest scenario I can think of is splitting the two requirements.
    1. Create 2 folders, one for "In Process" files and the other for "Validated" files.
    2. Create 2 scenarios:
    2.1. Your current sender system to the "In Process" folder (whatever to File).
           => In this one you just do the validation, without FCC. You could create a simple module for that, or even do it at mapping runtime, as you said (a mapping may be easier for error handling), and throw a runtime exception (which will eventually trigger an alert).
           => At the end, only files that went successfully through the validation will be located in the "In Process" folder.
    2.2. A simple file-to-file scenario (from the "In Process" folder to the "Validated" folder), this time executing FCC and splitting messages if necessary.
    Of course, this will only work if the module/mapping is able to process the large file at all (hopefully yes, since at that point it is still a flat file and not XML yet).
    BR,
    Henrique.

  • Read-only record sets?

    We're using a function to return a ref cursor back to VB.
    It works great until we try to update it. As soon as we try to update the record set, we get the error "Multi-step operation generated errors." It looks to us like the record set is read-only. We've tried using an IN OUT parameter to a procedure - same problem. However, it works fine as long as we pass the SQL instead of a call to a function or procedure.
    We don't really want to have to pass the SQL.
    Can anyone help with this ?
    Thanks,
    Tricia.

    Ref cursors that are returned to an application from a stored procedure are read-only if you're using ODBC or OLE DB. I believe the beta .NET native provider will allow you to update returned ref cursors, however.
    The reason it works when you issue straight SQL is that the driver actually modifies your SQL statement to fetch the ROWIDs for all the rows. It is then able to update rows by building its own SQL update statement.
    Justin
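    For context, a minimal sketch of the kind of function under discussion - a packaged function that opens and returns a ref cursor (package and table names are assumptions):
    CREATE OR REPLACE PACKAGE my_pkg AS
      TYPE t_cur IS REF CURSOR;
      FUNCTION get_rows RETURN t_cur;
    END my_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY my_pkg AS
      FUNCTION get_rows RETURN t_cur IS
        l_cur t_cur;
      BEGIN
        -- The cursor handed back here is what ODBC/OLE DB exposes to
        -- the client as a read-only result set.
        OPEN l_cur FOR SELECT * FROM some_table;
        RETURN l_cur;
      END get_rows;
    END my_pkg;
    /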

  • Update a record is updating the first record in the DB...HELP!

    I have gone over and over this and can't find the problem.
    I have a form that sends email to addresses stored in a PHP/MySQL DB; however, when I update certain records it always updates the first record in the DB. I have looked over this so many times and can't see what is going wrong.
    The userid is not auto_increment but is based on the username (these are all unique).
    I have included the code to see if I am missing something:
    <?php require_once('../Connections/hostprop.php'); ?>
    <?php
    if (!function_exists("GetSQLValueString")) {
    function GetSQLValueString($theValue, $theType, $theDefinedValue = "", $theNotDefinedValue = "")
    {
      if (PHP_VERSION < 6) {
        $theValue = get_magic_quotes_gpc() ? stripslashes($theValue) : $theValue;
      }
      $theValue = function_exists("mysql_real_escape_string") ? mysql_real_escape_string($theValue) : mysql_escape_string($theValue);
      switch ($theType) {
        case "text":
          $theValue = ($theValue != "") ? "'" . $theValue . "'" : "NULL";
          break;
        case "long":
        case "int":
          $theValue = ($theValue != "") ? intval($theValue) : "NULL";
          break;
        case "double":
          $theValue = ($theValue != "") ? doubleval($theValue) : "NULL";
          break;
        case "date":
          $theValue = ($theValue != "") ? "'" . $theValue . "'" : "NULL";
          break;
        case "defined":
          $theValue = ($theValue != "") ? $theDefinedValue : $theNotDefinedValue;
          break;
      }
      return $theValue;
    }
    }
    $editFormAction = $_SERVER['PHP_SELF'];
    if (isset($_SERVER['QUERY_STRING'])) {
      $editFormAction .= "?" . htmlentities($_SERVER['QUERY_STRING']);
    }
    if ((isset($_POST["MM_update"])) && ($_POST["MM_update"] == "form1")) {
      $updateSQL = sprintf("UPDATE plus_signup SET email=%s, emailerSubject=%s, emailerContent=%s WHERE userid=%s",
                           GetSQLValueString($_POST['email'], "text"),
                           GetSQLValueString($_POST['emailerSubject'], "text"),
                           GetSQLValueString($_POST['emailerContent'], "text"),
                           GetSQLValueString($_POST['userid'], "text"));
      mysql_select_db($database_hostprop, $hostprop);
      $Result1 = mysql_query($updateSQL, $hostprop) or die(mysql_error());
          // Email Guarantor
              $to = $_POST['email'];
              $subject = "Email From Host Student Property";
              $message = "
              <html>
                        <head>
                                  <title>Dear ".GetSQLValueString($_POST['userid'], "text")."</title>
                        </head>
                        <body>
                                  <img src=\"http://www.hoststudent.co.uk/beta/images/hostlogo.gif\" alt=\"www.HostStudent.co.uk\" />
                                  <h2>An Email From Host Students</h2>
                                  <br /><br />
                                  <table>
                                            <tr>
                                                      <td>Email Subject:</td>
                                            </tr>
                                            <tr>
                                                      <td>".GetSQLValueString($_POST['emailerSubject'], "text")."</td>
                                            </tr>
                                            <tr>
                                                      <td>Email Content</td>
                                            </tr>
                                            <tr>
                                                      <td>".GetSQLValueString($_POST['emailerContent'], "text")."</td>
                                            </tr>
                                  </table>
                        </body>
              </html>
              ";
              // Always set content-type when sending HTML email
              $headers = "MIME-Version: 1.0" . "\r\n";
              $headers .= "Content-type:text/html;charset=iso-8859-1" . "\r\n";
              $headers .= 'From: HostStudent.co.uk <[email protected]>' . "\r\n";
              $send = mail($to,$subject,$message,$headers);
      $updateGoTo = "TenantEmailSent.php";
      if (isset($_SERVER['QUERY_STRING'])) {
        $updateGoTo .= (strpos($updateGoTo, '?')) ? "&" : "?";
        $updateGoTo .= $_SERVER['QUERY_STRING'];
      }
      header(sprintf("Location: %s", $updateGoTo));
    }
    mysql_select_db($database_hostprop, $hostprop);
    $query_Recordset1 = "SELECT userid, email, emailerSubject, emailerContent FROM plus_signup";
    $Recordset1 = mysql_query($query_Recordset1, $hostprop) or die(mysql_error());
    $row_Recordset1 = mysql_fetch_assoc($Recordset1);
    $totalRows_Recordset1 = mysql_num_rows($Recordset1);
    ?>
    <?
              session_start();
              if(!$_SESSION['loggedIn']) // If the user IS NOT logged in, forward them back to the login page
                        header("location:Login.html");
    ?>
       <script type="text/javascript">
    function loadFields(Value) {
      var Guarantor = Value.split("|");
      var userid1 = Guarantor[0];
      var GuName = Guarantor[1];
      var GuPhoneEmail = Guarantor[2];
      document.getElementById('userid1').value = userid1;
      document.getElementById('GuName').value = GuName;
      document.getElementById('GuPhoneEmail').value = GuPhoneEmail;
    }
    </script>
    <form action="<?php echo $editFormAction; ?>" method="post" name="form2" id="form2">
                  <table align="center">
          <tr valign="baseline">
            <td nowrap="nowrap" align="right"> </td>
            <td><select name="userid" id="userid" onchange="loadFields(this.value)">
              <option value="Select Guarantor">Select Guarantor</option>
              <?php
    do {
    ?>
    <option value="<?php echo $row_Recordset1['userid'] . '|' . $row_Recordset1['GuName'] . '|' . $row_Recordset1['GuPhoneEmail'];?>"><?php echo $row_Recordset1['userid'] . " , " . $row_Recordset1['GuName'] . " , " . $row_Recordset1['GuPhoneEmail']; ?></option>
              <?php
    } while ($row_Recordset1 = mysql_fetch_assoc($Recordset1));
      $rows = mysql_num_rows($Recordset1);
      if ($rows > 0) {
          mysql_data_seek($Recordset1, 0);
          $row_Recordset1 = mysql_fetch_assoc($Recordset1);
      }
    ?>
            </select></td>
          </tr>
          <tr valign="baseline">
            <td nowrap="nowrap" align="right">Tenant Name</td>
            <td><input type="text" name="userid1" id="userid1" readonly="readonly" value="<?php echo htmlentities($row_Recordset1['userid'], ENT_COMPAT, 'utf-8'); ?>" size="32" /></td>
          </tr>
          <tr valign="baseline">
            <td nowrap="nowrap" align="right">GuName:</td>
            <td><input type="text" name="GuName" id="GuName" readonly="readonly" value="<?php echo htmlentities($row_Recordset1['GuName'], ENT_COMPAT, 'utf-8'); ?>" size="32" /></td>
          </tr>
          <tr valign="baseline">
            <td nowrap="nowrap" align="right">GuPhoneEmail:</td>
            <td><input type="text" name="GuPhoneEmail" id="GuPhoneEmail" readonly="readonly" value="<?php echo htmlentities($row_Recordset1['GuPhoneEmail'], ENT_COMPAT, 'utf-8'); ?>" size="32" /></td>
          </tr>
          <tr valign="baseline">
            <td nowrap="nowrap" align="right">GuEmailerSubject:</td>
            <td><input type="text" name="GuEmailerSubject" value="" size="32" /></td>
          </tr>
          <tr valign="baseline">
            <td nowrap="nowrap" align="right">GuEmailerContent:</td>
            <td><textarea name="GuEmailerContent" cols="45" rows="5"> </textarea></td>
          </tr>
          <tr valign="baseline">
            <td nowrap="nowrap" align="right"> </td>
            <td><input type="submit" value="Send email" /></td>
          </tr>
          <tr valign="baseline">
            <td nowrap="nowrap" align="right"></td>
            <td> </td>
          </tr>
                  </table>
        <input type="hidden" name="MM_update" value="form2" />
        <input type="hidden" name="userid" value="<?php echo $row_Recordset1['userid']; ?>" />
    </form>

    I have found the problem: there were two forms with the same name.
    Thanks

  • Update of the record in the dbtable

    Hi all,
    I want to update a particular record in a database table.
    For example: I want to change one field (the status field in table CATSDB).
    That field currently contains status 20; I want to change it from 20 to 30.
    How do I write the correct syntax for that?
    Thanking you.
    regards,
    giri.

    Hi,
    Check this code:
    TABLES:BKPF,J_1IPART2.
      PARAMETERS:P_FAWREF LIKE J_1IPART2-FAWREF,
                 P_FAWRE1 LIKE J_1IPART2-FAWREF.
      DATA: P_FAWREF2 LIKE J_1IPART2-FAWREF,
            P_FAWRE3 LIKE J_1IPART2-FAWREF.
       MOVE P_FAWREF TO P_FAWREF2.
       MOVE P_FAWRE1 TO P_FAWRE3.
      UPDATE J_1IPART2 SET FAWREF = P_FAWRE3
                        WHERE FAWREF = P_FAWREF2.
           IF SY-SUBRC = 0.
             MESSAGE S000.
           ELSE.
             MESSAGE E001.
           ENDIF.
    regards
    siva
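    In plain SQL terms, the statement the original question asks for is just the following - a sketch only, since CATSDB's key fields are omitted and should be added to restrict the update to the intended records:
    -- Change status 20 to status 30, as described in the question.
    UPDATE catsdb
       SET status = '30'
     WHERE status = '20';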

  • How can NI FBUS Monitor display very large recorded files

    NI FBUS Monitor version 3.0.1 reports the error "Out of memory" if I try to load a large recorded file, 272 MB in size. Is there any combination of operating system (possibly 32-bit or 64-bit Vista) and/or physical memory size with which NI FBUS Monitor can display such large recordings? Are there any patches, workarounds or tools for displaying very large recorded files?

    Hi,
    NI-FBUS Monitor does not impose a limit on the maximum record file size. The amount of physical memory in the system is one of the most important factors affecting the loading of a large record file, because Monitor tries to load the entire file into memory when opening it.
    272 MB is a really large file. To open it, your system must have sufficient physical memory available; otherwise an "Out of memory" error will occur.
    I would recommend not using Monitor to open a file larger than 100 MB. Loading too large a file will consume system memory quickly and degrade performance.
    Feilian (Vince) Shen

  • Updating and deleting records in access DB

    I'm trying to make multiple updates and deletes in an Access DB and it doesn't work.
    My program executes lots of update and delete statements when saving, but I only commit at the end, when I know all the statements finished OK.
    I can't make it a batch update, since I need to retrieve the auto-number that Access issues me on some of my tables.
    I know it's kinda fuzzy, but my program is SO big already, and I can't really put the code in here - it wouldn't do much good.
    What I will do is write what my log shows (I issue a log output whenever there's an executeQuery/executeUpdate and before commit/rollback):
    DBG : 29/04/03 : UPDATE CASES_T SET CASE_office_case_id = 1 ,CASE_year = 2 ,CASE_total_debt = 10009 ,CASE_is_limited = true ,CASE_monthly_payment = 400 ,CASE_hotzlap_case_id = ' - - - ' ,CASE_status = 26 WHERE CASE_tech_id = 5
    DBG : 29/04/03 : SELECT count(*) from CASES_OWEES_T where CAOW_Tech_Id = 23
    DBG : 29/04/03 : UPDATE CASES_OWEES_T SET CAOW_case_tech_id = 5 ,CAOW_id_number = '046259990' ,CAOW_first_name = '���' ,CAOW_last_name = '������' ,CAOW_work_place = 'jjjjjjjjjjjjj' ,CAOW_address = '1jjjjjjjjjjjj' ,CAOW_telephone = '09-8099999' ,CAOW_pelephone = '000-000000' ,CAOW_hotzlap_case_id = ' - - - ' ,CAOW_is_valid_address = false WHERE CAOW_TECH_ID = 23
    DBG : 29/04/03 : SELECT count(*) from CASES_OWEES_T where CAOW_Tech_Id = 24
    DBG : 29/04/03 : UPDATE CASES_OWEES_T SET CAOW_case_tech_id = 5 ,CAOW_id_number = ' ' ,CAOW_first_name = '�����' ,CAOW_last_name = '���' ,CAOW_work_place = '����' ,CAOW_address = '��?' ,CAOW_telephone = ' - ' ,CAOW_pelephone = ' - ' ,CAOW_hotzlap_case_id = ' - - - ' ,CAOW_is_valid_address = true WHERE CAOW_TECH_ID = 24
    DBG : 29/04/03 : SELECT count(*) from CASES_INNER_CASES_T where CICA_Tech_Id = 10
    DBG : 29/04/03 : UPDATE CASES_INNER_CASES_T SET CICA_case_tech_id = 5 ,CICA_debt = 5000897 ,CICA_winner_name = '���' ,CICA_lawyer_tech_id = '6' ,CICA_hotzlap_case_id= '02-22122-22-2' WHERE CICA_tech_id= 10
    DBG : 29/04/03 : SELECT count(*) from CASES_INNER_CASES_T where CICA_Tech_Id = 11
    DBG : 29/04/03 : UPDATE CASES_INNER_CASES_T SET CICA_case_tech_id = 5 ,CICA_debt = 20008 ,CICA_winner_name = '������' ,CICA_lawyer_tech_id = '2' ,CICA_hotzlap_case_id= '02-22222-22-2' WHERE CICA_tech_id= 11
    DBG : 29/04/03 : SELECT count(*) from CASES_INNER_CASES_T where CICA_Tech_Id = 14
    DBG : 29/04/03 : UPDATE CASES_INNER_CASES_T SET CICA_case_tech_id = 5 ,CICA_debt = 129 ,CICA_winner_name = '' ,CICA_lawyer_tech_id = '4' ,CICA_hotzlap_case_id= '02-22222-22-2' WHERE CICA_tech_id= 14
    DBG : 29/04/03 : SELECT count(*) from CASES_CUSTOMER_CARDS_T where CUCA_Tech_Id = 23
    DBG : 29/04/03 : UPDATE CASES_CUSTOMER_CARDS_T SET CUCA_case_tech_id = 5 , CUCA_sum = 324 , CUCA_fee_plus_maam = -38.0 , CUCA_for_division = 285.0 , CUCA_voucher = '3222 ' , CUCA_date = '2003-04-24' , CUCA_in_or_out = true WHERE CUCA_tech_id = 23
    DBG : 29/04/03 : SELECT count(*) from CASES_CUSTOMER_CARDS_T where CUCA_Tech_Id = 22
    DBG : 29/04/03 : UPDATE CASES_CUSTOMER_CARDS_T SET CUCA_case_tech_id = 5 , CUCA_sum = 324 , CUCA_fee_plus_maam = -38.0 , CUCA_for_division = 285.0 , CUCA_voucher = '2222 ' , CUCA_date = '2003-04-23' , CUCA_in_or_out = true WHERE CUCA_tech_id = 22
    DBG : 29/04/03 : SELECT count(*) from CASES_CUSTOMER_CARDS_T where CUCA_Tech_Id = 21
    DBG : 29/04/03 : UPDATE CASES_CUSTOMER_CARDS_T SET CUCA_case_tech_id = 5 , CUCA_sum = 100 , CUCA_fee_plus_maam = -11.0 , CUCA_for_division = 0.0 , CUCA_voucher = '2222 ' , CUCA_date = '2003-04-23' , CUCA_in_or_out = true WHERE CUCA_tech_id = 21
    DBG : 29/04/03 : SELECT count(*) from CASES_CUSTOMER_CARDS_T where CUCA_Tech_Id = 20
    DBG : 29/04/03 : UPDATE CASES_CUSTOMER_CARDS_T SET CUCA_case_tech_id = 5 , CUCA_sum = 1000 , CUCA_fee_plus_maam = -118.0 , CUCA_for_division = 882.0 , CUCA_voucher = '1111 ' , CUCA_date = '2003-04-23' , CUCA_in_or_out = true WHERE CUCA_tech_id = 20
    DBG : 29/04/03 : SELECT count(*) from CASES_INVESTIGATIONS_T where CAIN_Tech_Id = 16
    DBG : 29/04/03 : UPDATE CASES_INVESTIGATIONS_T SET CAIN_case_tech_id = 5 ,CAIN_text = '' ,CAIN_was_declared_limited = false ,CAIN_date = '2003-03-01' ,CAIN_rulling_effective_date = '2003-02-01' ,CAIN_payment_amount = 400 WHERE CAIN_TECH_ID = 16
    DBG : 29/04/03 : SELECT count(*) from CASES_INVESTIGATIONS_T where CAIN_Tech_Id = 12
    DBG : 29/04/03 : UPDATE CASES_INVESTIGATIONS_T SET CAIN_case_tech_id = 5 ,CAIN_text = '' ,CAIN_was_declared_limited = true ,CAIN_date = '1970-01-01' ,CAIN_rulling_effective_date = '1905-03-06' ,CAIN_payment_amount = 90 WHERE CAIN_TECH_ID = 12
    DBG : 29/04/03 : SELECT count(*) from CASES_INVESTIGATIONS_T where CAIN_Tech_Id = 10
    DBG : 29/04/03 : UPDATE CASES_INVESTIGATIONS_T SET CAIN_case_tech_id = 5 ,CAIN_text = '' ,CAIN_was_declared_limited = false ,CAIN_date = '1970-01-01' ,CAIN_rulling_effective_date = '1990-01-30' ,CAIN_payment_amount = 89 WHERE CAIN_TECH_ID = 10
    DBG : 29/04/03 : DELETE FROM CASES_INVESTIGATIONS_T where CAIN_tech_id = 9
    DBG : 29/04/03 : DELETE FROM CASES_INVESTIGATIONS_T where CAIN_tech_id = 9
    DBG : 29/04/03 : DELETE FROM CASES_INVESTIGATIONS_T where CAIN_tech_id = 14
    DBG : 29/04/03 : DELETE FROM CASES_INVESTIGATIONS_T where CAIN_tech_id = 14
    DBG : 29/04/03 : DELETE FROM CASES_INVESTIGATIONS_T where CAIN_tech_id = 13
    DBG : 29/04/03 : DELETE FROM CASES_INVESTIGATIONS_T where CAIN_tech_id = 13
    DBG : 29/04/03 : DELETE FROM CASES_INVESTIGATIONS_T where CAIN_tech_id = 11
    DBG : 29/04/03 : DELETE FROM CASES_INVESTIGATIONS_T where CAIN_tech_id = 11
    DBG : 29/04/03 : SELECT count(*) from CASES_PAYMENTS_T where CAPY_Tech_Id = 71
    DBG : 29/04/03 : UPDATE CASES_PAYMENTS_T SET CAPY_case_tech_id = 5 ,CAPY_sum = 400 ,CAPY_exception_text = '' ,CAPY_is_exception = true ,CAPY_voucher = '' ,CAPY_date = '2003-04-01' ,CAPY_is_paid_in_hotzlap = false WHERE CAPY_TECH_ID = 71
    DBG : 29/04/03 : SELECT count(*) from CASES_PAYMENTS_T where CAPY_Tech_Id = 70
    DBG : 29/04/03 : UPDATE CASES_PAYMENTS_T SET CAPY_case_tech_id = 5 ,CAPY_sum = 400 ,CAPY_exception_text = '' ,CAPY_is_exception = true ,CAPY_voucher = '' ,CAPY_date = '2003-03-01' ,CAPY_is_paid_in_hotzlap = false WHERE CAPY_TECH_ID = 70
    DBG : 29/04/03 : SELECT count(*) from CASES_PAYMENTS_T where CAPY_Tech_Id = 69
    DBG : 29/04/03 : UPDATE CASES_PAYMENTS_T SET CAPY_case_tech_id = 5 ,CAPY_sum = 400 ,CAPY_exception_text = '' ,CAPY_is_exception = true ,CAPY_voucher = '' ,CAPY_date = '2003-03-01' ,CAPY_is_paid_in_hotzlap = false WHERE CAPY_TECH_ID = 69
    DEV : 29/04/03 : commit

    Sure - I don't get any exception. The data just doesn't show in the DB (no updating/deletion of records).
    As I wrote before - when the program commits the changes and does the next select command, it seems as if the data was changed/deleted, but if I check the DB with Access or if I restart the program, then I see that the data didn't change.
    Access/JDBC-ODBC has a problem where modifications are not 'committed' when the statement completes. Instead one must do one of the following:
    1. Explicit commit.
    2. Simple select after statement.
    3. Close the connection.
    Presumably you are doing 1.
    Since other than this it does work, it suggests one of the following.
    1. Something is wrong with your environment. For instance you are looking at the wrong database. Or not refreshing. Or something else like that.
    2. You are using something besides a simple connection - like opening it with 'scroll insensitive'.
    3. The complexity is causing it to lose an error message. This can be tested by doing each statement individually and verify that none produce an error.
    4. Maybe you found a bug. You can turn on ODBC tracing via the ODBC applet in the Control Panel and see if digging through all of the detail provides any clues (you can also do this with 3 above).

  • Tax code update in info record?

    Hi, Gurus
    A question: does anybody know if it is possible to update the tax code in the info record via the info update indicator?
    The info record was created automatically and the tax code was stored as in the PO (by setting the info update indicator in the PO), but every time we enter the tax code in a PO, it is not updated in the info record.
    Thanks in advance
    RZ

    The tax code in the info record gets created/stored only when the PIR is created the very first time by the system. This field does not get updated even when "Update PIR" has been checked in a subsequent PO.
    The tax code field is set at the purchasing organization level, not at the plant level.
    Hence it will cause an issue if the plant considered in the very first order is in a different country (US) than the plant (country: Germany) considered for a subsequent PO for the same material, vendor and purchasing organization, as it will default the wrong tax code. Hence it is advisable NOT to maintain the tax code at PIR level.
    Regards,
    Samip Patil.

  • How to handle large result set of a SQL query

    Hi,
    I have a question about how to handle large result set of a SQL query.
    My query returns more than a million records. However, the query template has a "row count" parameter. If I don't specify it, by default only 100 records are returned in the query result. If I specify it, the result is limited to that specific number.
    Is there any way to get around this row count issue? I don't want any restriction on the number of records returned by a query.
    Thanks a lot!

    No human can manage that much data...in a grid, a chart, or a direct-connected link to the brain. 
    What you want to implement (much like other customers with similar requirements) is a drill-in and filtering model that helps the user identify and zoom in on data of relevance, not forcing them to scroll through thousands or millions of records.
    You can also use a time-based paging model so that you only deal with a time "slice" at one request (e.g. an hour, day, etc...) and provide a scrolling window.  This is commonly how large datasets are also dealt with in applications.
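    In plain SQL, the time-slice idea is just a windowed predicate; a hedged sketch, with table and column names assumed:
    -- Fetch one hour's "slice" per request instead of the whole set.
    SELECT id, event_time, payload
      FROM event_log
     WHERE event_time >= :slice_start
       AND event_time <  :slice_start + 1/24   -- one hour later (Oracle date arithmetic)
     ORDER BY event_time;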
    I would suggest describing your application in more detail, and we can offer design recommendations and ideas.
    - Rick

  • Large data sets and key terms

    Hello, I'm looking for some guidance on how BI can help me. I am a business analyst at a health solutions firm, but not proficient in SQL. However, I have to work with large data sets that just exceed the capabilities of Excel.
    Basically, I'm having to use Excel to manually search for key terms and apply values to the results. For instance, I have a medical claims file with Provider Names, Tax ID, Charges, etc. It's 300,000 records long and 15-25 columns wide. I need to search for key terms in the provider name like Ambulance, Fire Dept, Rescue, EMT, EMS, etc. - anything that resembles an ambulance service - and also include abbreviations such as AMB or FD, and variations like EMT, E M T, EMS, E M S, etc. Each time I do a search, I have to filter and apply an "N/A" flag.
    That's just one key term. I also have things like Dentists or DDS, Vision, Optometry and a dozen other provider types that need to be flagged as "N/A".
    Is this something that can be handled using BI? I have access to a BI group, but I need to understand more about the capabilities of what can be done. As an analyst, I'm having to deal with poor data integrity, so just cleaning up the file can be extremely taxing and cumbersome.
    Some insight would be very helpful. Thanks.

    I am not sure if you are looking for an explanation of different BI products; if so, maybe this forum is not the place to get a straight answer.
    That said, the Information Discovery product suite might be useful in your case. Regarding the "large data set" you mentioned: searching and analyzing 300,000 records may not be considered a large data set, at least by Endeca standards :).
    All your other requests could also be implemented very easily using Endeca's product suite. Please reach out to Oracle's Endeca product team and they can guide you on how this product suite would help you.
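    Even without a BI suite, the flagging itself can be expressed in a few lines of SQL - a hedged sketch, with the table name, column names and pattern list all assumed:
    -- Flag ambulance-like providers in one pass; extend the pattern
    -- list for the other provider types (DDS, Vision, Optometry, ...).
    -- NB: short patterns like '%EMS%' will also match e.g. 'SYSTEMS';
    -- refine the patterns as needed.
    UPDATE medical_claims
       SET provider_flag = 'N/A'
     WHERE UPPER(provider_name) LIKE '%AMBULANCE%'
        OR UPPER(provider_name) LIKE '%FIRE DEPT%'
        OR UPPER(provider_name) LIKE '%RESCUE%'
        OR UPPER(provider_name) LIKE '%EMT%'
        OR UPPER(provider_name) LIKE '%E M T%'
        OR UPPER(provider_name) LIKE '%EMS%'
        OR UPPER(provider_name) LIKE '%E M S%';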

  • Procedure to update the table records

    I have to write a procedure to update existing records.
    I have one staging table and one decode table.
    The staging table is obtained from an interface as a cross join of two tables, X1 and X2, with a lookup against the DECODE table.
    DECODE
    ITEM_ID  ITEM_CODE  STORE_ID  STORE_CODE
    101      C          7         77
    100      A
    100      B
    In this table there is a relation between ITEM_ID and ITEM_CODE, and between STORE_ID and STORE_CODE. Items and stores have no relationship here; we simply insert values for both side by side.
    Initial STAGING table:
    SNO  RNO  SCODE  RCODE  DECODE_STATUS
    101  8    C             NP
    101  6    C             NP
    101  7    C      77     P
    100  8    A             NP
    100  6    A             NP
    100  7    A      77     P
    100  8    B             NP
    100  6    B             NP
    100  7    B      77     P
    Joins:
    SNO = ITEM_ID
    RNO = STORE_ID
    Now I want to update this according to my new DECODE table, where the NP records get their new RCODE values.
    The DECODE table now is:
    ITEM_ID  ITEM_CODE  STORE_ID  STORE_CODE
    101      C          8         88
    100      A          7         77
    100      B          6         66
    Now I need my staging table with RCODE values corresponding to the DECODE table.
    My expected STAGING table:
    SNO  RNO  SCODE  RCODE  DECODE_STATUS
    101  8    C      88     NP
    101  6    C      66     NP
    101  7    C      77     P
    100  8    A      88     NP
    100  6    A      66     NP
    100  7    A      77     P
    100  8    B      88     NP
    100  6    B      66     NP
    100  7    B      77     P
    Please help me do this. I have tried with an interface where I kept the source and target as the same staging table and used DECODE for the lookup, but I didn't get the expected result.

    Hi Chattar ,
    It clearly seems that the issue is with the primary keys.
    As you have said about the decode table "In this table there is relation between item_id and item_code AND store_id and store_code. Items and stores have no relationship here. Simply we insert values of both side by side."
    It seems your final update query should be like this:
    update staging1 T
       set (T.rcode, T.scode) =
           ( select
               (select store_code from decode where store_id = T.RNO),
               (select item_code  from decode where item_id  = T.SNO)
             from dual )
     where T.decode_status = 'NP';
    If what I have said is right, then you can implement the same thing using an interface too. But first you should decide the key columns on the target table.
    Hope it solves the issue.

  • Updating a database record

    I am trying to update a database record using the following:
    update adv_order_progress set adv_action_out = to_char(sysdate, 'DD-MM-YYYY HH24:MI:SS') where adv_order_progress.adv_order_id = :adv_order.adv_order_id and adv_order_progress.adv_ord_number = :adv_order.adv_ord_number;
    in a WHEN-BUTTON-PRESSED trigger, but it doesn't update anything at all. When I do the same from iSQL*Plus it works fine!
    Would you please advise?

    Did you only have iSQL*Plus for testing it?
    Can you debug the form and check whether something happens - exceptions, ...?
    After the update you can use
    message(SQL%ROWCOUNT); pause;
    to see how many records were updated.
