2000 records to update to SAP

Hi guys,
I have a requirement where I have 2500 records in the database to update to SAP.
I am currently using an RFC adapter with a BAPI, but I have not tried it with 2500 records.
Is it possible with a BAPI, or do I have to use a proxy?
Kindly suggest.
Regards,
Teja

Ravindra,
"Is it possible with a BAPI, or do I have to use a proxy?"
It is possible, but avoid the BAPI. Proxies are preferred for voluminous data; moreover, using proxies gives you better monitoring options on the R/3 side.
Regards,

Similar Messages

  • Commit after every 2000 records in an UPDATE statement without using a loop

    Hi
    My Oracle version is 9i.
    I need to commit after every 2000 records. Currently I am using the statement below without a loop. How can I do this?
    Do I need to use ROWNUM?
    BEGIN
      UPDATE (SELECT A.SKU, M.TO_SKU, A.TO_STORE
              FROM RT_TEMP_IN_CARTON A,
                   CD_SKU_CONV M
              WHERE A.SKU = M.FROM_SKU
                AND A.SKU <> M.TO_SKU
                AND M.APPROVED_FLAG = 'Y')
      SET SKU = TO_SKU,
          TO_STORE = (SELECT DECODE(TO_STORE,
                                    5931, '931',
                                    5935, '935',
                                    5928, '928',
                                    5936, '936')
                      FROM RT_TEMP_IN_CARTON
                      WHERE TO_STORE IN ('5931', '5935', '5928', '5936'));
      COMMIT;
    END;
    Thanks for your help

    "I need to commit after every 2000 records"
    Why? Committing every n rows is not recommended.
    "Currently am using the below statement without using the loop. how to do this?"
    Use a loop? (not recommended)
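    The reply is right that committing every n rows is rarely necessary, but where it genuinely is required, the usual pattern is a loop over fixed-size batches with one commit per batch. A minimal sketch in Python, with SQLite standing in for Oracle; the items table and sku column are made up for illustration, not the poster's schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, sku TEXT)")
con.executemany("INSERT INTO items VALUES (?, ?)",
                [(i, "sku%d" % i) for i in range(5000)])
con.commit()

BATCH = 2000
ids = [r[0] for r in con.execute("SELECT id FROM items ORDER BY id")]
for start in range(0, len(ids), BATCH):
    batch = ids[start:start + BATCH]
    con.executemany("UPDATE items SET sku = upper(sku) WHERE id = ?",
                    [(i,) for i in batch])
    con.commit()  # one commit per batch of up to 2000 rows
```

    In Oracle PL/SQL the same shape is usually written with BULK COLLECT ... LIMIT 2000 into a collection and FORALL for the update, committing once per fetched batch.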

  • Error while updating the status record of IDOC in SAP

    Hi All,
    I am facing a problem. I have done the outbound processing, and the IDOC was sent successfully from SAP to the EDI system, where it was processed. Now the EDI system wants to send the status back to SAP with a status message; the status number we are using is '24'. We have mapped all the fields in the status table EDIDS and made sure the EDI system sends all of them. I think the counter field cannot be determined by the EDI system, so it used counter '1', since it cannot know how many counters already exist in SAP; unless we pass a value in this counter field we get an error. After passing all the values, the status of the IDOC is updated with status '00' instead of '24', and we do not know why. I would like to know how SAP converts the status record that is received from the EDI system.
    Please let me know how the status record is translated in SAP from the EDI system. I have basically followed the EDIDS structure, and the IDOC status is updated in SAP, but with the wrong status number; the alignment is also off in the SAP fields. For example, if I enter some text in EDI, it is split and stored in 2 fields.
    Please help me in this and I think I have explained the problem in a detail manner.
    Thanks,
    Ramesh.

    Hi Naresh,
    Thanks for the reply. My question is this: the EDI system is able to send the status back to SAP, and the only problem is that the wrong status is updated. I have checked the EDIDS table, and the entry is created with another counter. Please confirm that this is because EDI does not support the status record update, so that I can confirm to the client that there is nothing wrong with the way SAP functions and that the EDI system cannot update the status code in SAP.
    Thanks,
    Ramesh.

  • Delta records not updating from DSO to CUBE in BI 7

    Hi Experts,
    Delta records are not updating from the DSO to the cube.
    In the DSO the key figure value shows '0', but in the cube the same record shows '-1'.
    I checked the change log table of the DSO; it has 5 records:
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M -  -1
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M -   0
    ODSR_4LIF02ZV32F1M85DXHUCSH0DL -   0
    ODSR_4LIF02ZV32F1M85DXHUCSH0DL -   1
    ODSR_4LH8CXKUJPW2JDS0LC775N4MH -   0
    but the active data table has one record - 0.
    How can I correct the delta load?
    Regards,
    Jai

    Hi,
    I think initially the value was 0 (ODSR_4LH8CXKUJPW2JDS0LC775N4MH - 0, new image in changelog) and this got loaded to the cube.
    Then the value got changed to 1 (ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 0, before image & ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 1, after image). Now this record updates the cube with value 1. The cube has 2 records, one with 0 value and the other with 1.
    The value got changed again to 0 (ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - (-1), before image &
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - 0, after image). Now these records get aggregated and update the cube with (-1).
    The cube has 3 records, with 0, 1 and -1 values....the effective total is 0 which is correct.
    Is this not what you see in the cube? Were the earlier requests deleted from the cube?
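    The image arithmetic in this reply can be checked directly. A small sketch with the values copied from the change log above (each change writes a before image with the old value negated and an after image with the new value; the initial load writes only a new image):

```python
# Change log rows for the key figure, in the order the requests were loaded.
changelog = [
    ("ODSR_4LH8CXKUJPW2JDS0LC775N4MH",  0),  # new image: initial value 0
    ("ODSR_4LIF02ZV32F1M85DXHUCSH0DL",  0),  # before image of change 0 -> 1
    ("ODSR_4LIF02ZV32F1M85DXHUCSH0DL",  1),  # after image: value is now 1
    ("ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M", -1),  # before image of change 1 -> 0
    ("ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M",  0),  # after image: value is back to 0
]

# An additive cube simply sums the deltas. The last request contributes -1
# on its own, but the running total matches the active table's value of 0.
total = sum(value for _, value in changelog)
```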

  • Info record not updated from PO , PO updated from Info record

    Dear Gurus,
    I want to restrict the base price in the PO so that it is copied only from a valid info record. The user should not be permitted to change the base price once an info record is maintained.
    Kindly tell me the configuration steps to adopt this process.
    I also don't want the info record to be updated from the PO.
    I know that while creating a PO I can untick "info record update", but that is user-specific.
    Thanks in advance
    With regards
    SD

    Dear Sidi,
    Thanks . Problem solved.
    When the info record is maintained, the price condition is taken from there. For this, change condition type P001: set "D" in the manual entries column.
    When no info record is maintained, the system tries to get the price from the last PO. If none is present, it throws the error "Net price must be greater than 0" and sets the price condition PBXX (manual entry) so you can enter the price you want manually.
    What you can do:
    Make condition type P000 automatic only (option "D" as above) and do the same for condition type PBXX.
    Regards
    Soumen

  • POS DM: Stock updates in SAP Retail

    Hi!
    We are implementing SAP Retail (ERP 6.0 EHP4 stack 3), SAP PI (Netweaver PI 7.1 SP 8) and SAP BW (NetWeaver 7.0 ehp1 stack 4, incl BI content 704, SP 3 ) for a client. Their POS system is a 3rd party system, i.e. not SAP POS, and it does not contain any stock information.
    This issue concerns stock updates in SAP Retail using POS DM:
    On the busiest day of year, my client has 347 000 POS transactions in total from all stores, and in one year they have 30 mill POS transactions. My client requires that SAP will provide them with near-realtime information on stock availability, as this is an important way of providing good customer service. The stock information should be available in SAP Enterprise Portal and SAP Retail Store, as well as standard SAP gui transaction. In addition, the webshop (external system) should have as accurate information about stock as possible, at least it should be updated once every 30 minutes.
    My client also has a requirement to be able to create purchase orders during office hours (store opening hours), based on MRP runs. This means they must have as accurate stock levels as possible for running MRP. Stock levels are also used in ATP checks, when deciding whether to pick customer orders from store inventory/stock or to create purchase requisitions.
    All these requirements have led us to believe that the best way to solve this, is alternative A:
    - to update stock continuously during office hours, using WPUBON to create material documents, one for each POS transaction. Sales and tenders will be updated in SAP Retail using WPUUMS and WPUTAB.
    Some challenges this will create:
    a)     It will create 347 000 WPUBONs on the busiest day of year, all within 10-12 open hours.
    b)     If we have same amount of material docs and FI docs, they will use 30% of the available 8 digits within their number ranges per year.
    c)     The IDocs will fail if articles are locked at the same time. (However, this can be handled by updating products only at night, using batch jobs, and using a batch job to reprocess failed IDocs.)
    Edited by: Øystein Emhjellen on Jan 4, 2010 2:15 PM

    Alternative B:
    - we have considered to use WPUUMS and WPUTAB, updating stock levels only once a day. The solution we then must use for taking into account actual sales data is described in SAP Note 1088886 - Using the interface for online stock queries. If we choose this, we need to check during the day for each article any sales, to reduce stock in reporting/screens, ATP checks and in MRP runs. We must also create a program to check for articles sold before sending stock data to webshop (external system). (As we do not know which articles have been sold, we must create our own version of RFC - /POSDW/SALES_QUERY_RFC, to enable that the article number does not have to be specified. Rather, the article numbers (and sold quantities) have to be returned to ERP based on what was actually sold.)
    We consider this alternative more risky, due to the need for changing behaviour of standard programs for MRP and ATP check. It will also create a number of RFC calls between SAP ECC and SAP BW.
    My questions on this issue are:
    i)     Will alternative A be preferable, if the hardware requirements are handled?
    ii)     Do you see any additional challenges related to alternative A?
    iii)     Will alternative B be advisable, given the program changes/enhancements involved?
    iv)     Are there standard implementations of SAP ECC that include the use of RFC /POSDW/SALES_QUERY_RFC, i.e. standard programs in SAP ECC that use the information from this RFC call?
    Thanks for any advice !
    Best regards,
    Oeystein Emhjellen

  • How to get all records using Invoke-WebRequest? / Why does Invoke-WebRequest return only the first 2000 records?

    Invoke-WebRequest returns only 2000 records, although the web API has around 4000.
    With the same URL in Excel's OData data feed, I get all the records.
    See the script below:
    Script:
    $QueryResult = Invoke-WebRequest -Uri $ODataURI -UseDefaultCredentials
    [xml]$xmlResult = $QueryResult.Content
    foreach ($obj in $xmlResult.feed.entry.content.properties) {
        $Name    = $obj.Name
        $IsAvail = $obj.isAvail.'#text'
        $PGroup  = $obj.PGroup
    }
    I am exporting the above result as a CSV file, and my CSV file contains only 2000 records.
    But $xmlResult.feed.Count shows 4000 records.
    The same OData URL in Excel's OData data feed returns all 4000 records.
    So please help me: how can I get all the records using PowerShell?
    Thanks
    JoSwa

    Hi Jo Swa(K.P.Elayaraja)-MCP,
    Would you please also post the code that is used to export the records?
    In addition, to use the cmdlet Invoke-RestMethod to work with OData feeds, please refer to this article:
    Interacting with TechEd NA 2012 Schedule using PowerShell v3
    I hope this helps.
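    For what it's worth, the usual cause here is server-side paging: the OData service returns one page (here 2000 rows) plus a <link rel="next"> element, and Excel's feed reader follows those links while the raw script does not. A sketch of the follow-the-next-link loop, in Python for brevity (the same loop works in PowerShell with Invoke-WebRequest; fetch is a stand-in for whatever performs the HTTP GET):

```python
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

def fetch_all_entries(fetch, first_url):
    """fetch(url) -> Atom feed XML text; follows rel="next" links to the end."""
    entries, url = [], first_url
    while url:
        root = ET.fromstring(fetch(url))
        entries.extend(root.findall(ATOM + "entry"))
        # Server-side paging: each page carries a rel="next" link until the
        # last page, at which point the loop stops.
        nxt = [link.get("href") for link in root.findall(ATOM + "link")
               if link.get("rel") == "next"]
        url = nxt[0] if nxt else None
    return entries
```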

  • Sql query to Load only new records,or update old records

    Hi,
    I need a query (not a stored procedure) to insert only new records and otherwise update existing records.
    I tried creating a MERGE statement in SQL, but the query is wrong: it is not updating, and it always inserts records (replicating... 1, 2, 4, 8, 16, 32...).
    Below is my sample query.
    Here "FACT_mytbl" (alias FCT) is my fact table (which needs to be updated if records are already found, otherwise new records are inserted).
    Inside the SELECT, the tables E_tbl1 and E_tbl2 hold the business logic; "FACT_mytbl" was created from these two tables. So the "S" alias is the source
    table and "FCT" the target table. Based on this we have to insert or update records.
      ---Query starts-------------------------------------                    
     MERGE INTO [FACT_mytbl]  FCT
     USING  (
           SELECT 
           FCT1.ID
          ,FCT1.PKcol1
          ,FCT1.FKcol1
          ,FCT1.col1
          ,FCT1.col2
         , FCT1.col3     
       FROM  [FACT_mytbl] FCT1 WITH(NOLOCK)        
       LEFT JOIN dbo.E_tbl1 CT WITH(NOLOCK)
    ON CT.PKcol1=FCT1.PKcol1
       LEFT JOIN dbo.E_tbl2 CT1 WITH(NOLOCK)
    ON CT1.PKcol1=FCT1.PKcol1
       ) S  
       ON FCT.PKcol1 = S.PKcol1 
     WHEN MATCHED AND (FCT.PKcol1 ! = S.PKcol1 ) THEN
     UPDATE SET       
           FCT.col1
          ,FCT.col2
          ,FCT.col3     
     WHEN NOT MATCHED THEN 
     INSERT VALUES
           S.ID
          ,S.PKcol1
          ,S.FKcol1
          ,S.col1
          ,S.col2
         , S.col3     
    --query ends----------------
    any suggestions,
    Thanks,
    R.B

    Hi Bhupesh_Rajasekaran,
    According to your description, you want to insert only new records that do not exist in the destination and update existing records that do. We usually accomplish this with two statements:
    1. A join statement to update existing records.
    2. An insert statement for new records.
    Alternatively, we can use MERGE in SQL Server to insert and update at the same time. You specify a "Source" record set and a "Target" table, and the join between the two. You then specify the type of data modification that is to occur when the records between the two data sets are matched or not matched. For more information, there is a similar example about MERGE in SQL Server in the following article:
    http://www.codeproject.com/Tips/590085/Merge-in-SQL-Server
    Regards,
    Sofiya Li
    TechNet Community Support
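    As a sketch of the matched/not-matched behaviour described here (illustrative table and column names, with SQLite's UPSERT standing in for SQL Server's MERGE):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact (pk INTEGER PRIMARY KEY, col1 TEXT)")

# One statement covers both branches: insert when the key is new,
# update in place when the key already exists.
upsert = ("INSERT INTO fact (pk, col1) VALUES (?, ?) "
          "ON CONFLICT(pk) DO UPDATE SET col1 = excluded.col1")

con.executemany(upsert, [(1, "a"), (2, "b")])  # both keys new: inserted
con.executemany(upsert, [(2, "B"), (3, "c")])  # 2 exists: updated; 3: inserted
```

    One diagnostic note on the original MERGE: it joins on FCT.PKcol1 = S.PKcol1 but then guards the update with WHEN MATCHED AND (FCT.PKcol1 != S.PKcol1), a condition that can never be true once the rows matched on that same column, so the update branch never fires.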

  • Puzzle why query returned wrong value for last record in update.

    Hi all,
    10.2.0.4, Windows 32-bit.
    Apply a 15% discount to 5 related items (total amount 522), sorted by lowest amount first (amount = 1).
    The last record (amount = 200) gets the total discount amount (78.3) less the accumulated discount already applied (48.50).
    The query runs fine without the UPDATE clause, but gives the wrong result for the last record with the UPDATE.
    CREATE TABLE "T1" (
        "ID"       NUMBER NOT NULL ENABLE,
        "ITEM_ID"  VARCHAR2(20 BYTE) NOT NULL ENABLE,
        "AMOUNT"   NUMBER(10,2) NOT NULL ENABLE,
        "dizcount" NUMBER(10,2)
    );
    INSERT INTO T1 ( ID, ITEM_ID, AMOUNT, dizcount )  VALUES ( 65, '101', 1, NULL ) ;
    INSERT INTO T1 ( ID, ITEM_ID, AMOUNT, dizcount )  VALUES ( 65, '102', 1, NULL ) ;
    INSERT INTO T1 ( ID, ITEM_ID, AMOUNT, dizcount )  VALUES ( 65, '201', 200, NULL ) ;
    INSERT INTO T1 ( ID, ITEM_ID, AMOUNT, dizcount )  VALUES ( 65, '215', 155, NULL ) ;
    INSERT INTO T1 ( ID, ITEM_ID, AMOUNT, dizcount )  VALUES ( 65, '111', 165, NULL ) ;
    UPDATE t1 a
    SET a.dizcount =
      (SELECT
        CASE
          WHEN rec_count = row_count
          THEN (78.3 - NVL(lag(temp_total,1) over ( order by rec_count) ,0))-- 78.3 is total dizcount amount from 522 * .15
          ELSE disc_amt
        END amt
      FROM
        (SELECT id,
          item_id,
          disc_amt,
          rec_count,
          row_count,
          CASE
            WHEN rec_count != row_count -- accumulate dizcount amount except for last record
            THEN SUM(disc_amt) over (order by rec_count)
            ELSE 0
          END temp_total
        FROM
          (SELECT ID ,
            item_id,
            amount amt,
            ROUND(amount * .15,1) disc_amt, -- dizcount is 15%
            row_number () over (order by amount) rec_count,
            COUNT ( *) over () row_count
          FROM t1
          WHERE ID = 65
        GROUP BY id,
          item_id,
          disc_amt,
          rec_count,
          row_count
        )b
      WHERE a.item_id = b.item_id
      );Regards
    Zack
    Edited by: Zack.L on Jul 26, 2010 1:26 AM

    Zack.L wrote:
    "Query runs fine without update clause but wrong result for last record with update."
    Not sure why, but this looks like another case in favour of MERGE.
    MERGE INTO T1
    using (SELECT id, item_id,
        CASE
          WHEN rec_count = row_count
          THEN (78.3 - NVL(lag(temp_total,1) over ( order by rec_count) ,0))-- 78.3 is total dizcount amount from 522 * .15
          ELSE disc_amt
        END amt
      FROM
        (SELECT id,
          item_id,
          disc_amt,
          rec_count,
          row_count,
          CASE
            WHEN rec_count != row_count -- accumulate dizcount amount except for last record
            THEN SUM(disc_amt) over (order by rec_count)
            ELSE 0
          END temp_total
        FROM
          (SELECT ID ,
            item_id,
            amount amt,
            ROUND(amount * .15,1) disc_amt, -- dizcount is 15%
            row_number () over (order by amount) rec_count,
            COUNT ( *) over () row_count
          FROM t1
          WHERE ID = 65
        GROUP BY id,
          item_id,
          disc_amt,
          rec_count,
          row_count
        ))b
    on (t1.id = b.id and t1.item_id = b.item_id)
    when matched then update set dizcount = b.amt ;
    This worked for me. I tested on 11.2, but it should work on 10.2.0.4.
    SQL> select * from t1 order by amount ;
            ID ITEM_ID                  AMOUNT   DIZCOUNT
            65 101                           1         .2
            65 102                           1         .2
            65 215                         155       23.3
            65 111                         165       24.8
            65 201                         200       29.8
    P.S. BTW, not sure if you want the column names in different cases, but your script, as it is, gave me an error (on 11.2).
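    The allocation rule itself is easy to state outside SQL: round 15% per row, and let the last (largest) row absorb the rounding remainder so the discounts sum exactly to 15% of the total. A sketch in Python, using Decimal with half-up rounding to mirror Oracle's ROUND; the amounts are the thread's test data.

```python
from decimal import Decimal, ROUND_HALF_UP

def round1(x):
    """Oracle-style ROUND(x, 1): halves round away from zero."""
    return x.quantize(Decimal("0.1"), rounding=ROUND_HALF_UP)

def allocate_discount(amounts, rate=Decimal("0.15")):
    rows = sorted(amounts)                         # lowest amount first
    total_disc = round1(sum(rows) * rate)          # 522 * 0.15 -> 78.3
    discs = [round1(a * rate) for a in rows[:-1]]  # every row but the last
    discs.append(total_disc - sum(discs))          # last row takes the rest
    return list(zip(rows, discs))
```

    With the five amounts above this reproduces the discounts in the MERGE output: 0.2, 0.2, 23.3, 24.8, and 29.8 for the last row.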

  • Differences/updates across SAP BW versions 3.0B, 3.1C, and 3.5

    hello guys
    Can anyone brief me on the differences/updates across SAP BW versions 3.0B, 3.1C, and 3.5?
    Please let me know at least a few points.
    Many thanks
    balaji

    hi Balaji,
    3.1C
    SAP BW Release 3.1 Content - What's new?
    SAP provides a comprehensive set of business analytics to integrate and analyze process-oriented content from a variety of sources and across various applications and industries. It enables companies to improve business processes and quickly respond to changing market dynamics by providing business information to executives, analysts and information consumers, delivered in real-time via the Web.
    SAP BW's analytical foundation and wide-range of Business Content enables customers to create their own analytical applications that best fit their business requirements.
    Business Content and Business Analytics
    Business Content for Oracle Applications
    Standard Business Content for Oracle Applications 11i was delivered with BW 3.0B covering the Financials and Project Accounting areas. To enhance the value of these existing analytical applications, including Web cockpits for cost center and project monitoring, we provide predefined extractors to load the business data from the Oracle Applications source system into SAP BW.
    The integration of Oracle Applications into SAP BW is based on the new Oracle extract package of Ascential's DataStage product. Business data from Oracle Applications is extracted into the InfoCubes for the Oracle General ledger and for Oracle Project Accounting that were delivered with BW 3.0B. For more information please click here and also here.
    Analytics Supporting The Concept of Business Activity Monitoring (BAM)
    BAM is a Gartner term that defines the concept of providing real-time access to critical business performance indicators in order to improve the speed and effectiveness of business operations. As business transactions are increasingly automated, the speed of these transactions accelerates. SAP BW delivers packaged analytics, for example in the SCM area, that combine application and data integration using SCEM (Supply Chain Event Management).
    SCEM enables companies to monitor processes immediately and in great detail. This means that they are able to recognize and react to critical situations quickly, before they result in delayed deliveries and production standstills. In many cases, event information is essential not only during the processes, but also after completion, because it can be used both for measuring quality aspects and for cost saving. For this reason, it is advisable to integrate the Supply Chain Event Manager with SAP Business Information Warehouse (SAP BW).
    Business Analytics - SAP's Packaged Analytical Applications per Business Area / Industry
    Analytical applications measure and improve business processes across a company's entire value chain. Based on a solid foundation (Data Warehousing, BI Platform and BI Suite of tools), SAP BW provides integrated applications that understand each other's metadata, and offer a closed loop that incorporates all aspects: Operations, analytics, and personalization.
    SAP BW 3.1 Content delivers analytics for all business areas such as CRM, SCM, SRM, HR, etc. as well as for different industries, such as Retail, Banking, and Service Providers to name a few.
    bw 3.5
    What's new?
    SAP BW 3.5 is designed to deliver seamless integration capabilities into all of the SAP NetWeaver components, as well as offering new capabilities in the Business Intelligence platform and suite.
    Information Broadcasting via Business Explorer (BEx) Broadcaster
    Check out the Demo to learn more about the possibilities Information Broadcasting offers:
    Share and disseminate insights to support decision-making processes
    Access the complete BI information portfolio via the SAP Enterprise Portal (SAP EP 6.0)
    Single, web-based wizard to broadcast personalized BI information portfolios to various end-users (pre-calculated for optimized query response time)
    Leverages SAP NetWeaver Knowledge Management features such as subscription, feedback, discussion, collaboration, rating, enterprise search, etc.
    Offers broadcasting services such as different scheduling options (ad-hoc, based on data loads, time scheduling), pre-calculation of queries and workbooks, sending pre-calculated queries and web templates as email attachments
    Based on the Java Repository Manager, all SAP BW metadata, master data, and transactional documents, as well as pre-calculated queries/templates for KM Services are enabled.
    Universal Data Integration
    The new Universal Data Integration significantly extends SAP BW data access capabilities to diverse data sources.
    BI Java Connectors
    Several hundred connectors provide access to all data sources that support JDBC, XMLA, OLE DB for OLAP, and SAP Query
    UDConnect (Universal Data Connect)
    Out-of-the-box connectivity for additional data sources that can be accessed by the BI Java Connectors. UDConnect supports staging and remote scenarios to this data.
    For instance, extraction from /remote access to a relational database via JDBC or extraction from /remote access to an OLAP source using OLE DB for OLAP, and extraction from an OLAP source using XML for Analysis.
    BI Java Software Development Kit (BI Java SDK) for custom-built Java Applications accessing SAP BW or non-SAP BW data via the BI Java Connectors.
    Easy to use and learn
    Based on open and accepted standards for interoperability
    Embedded BI - Integration into SAP NetWeaver
    Web Application Server:
    Integration with new Internet Graphics Server (IGS) and WAS Alert Framework
    Connecting BI alert framework to the SAP NetWeaver alert repository to streamline alert message processing
    Platform independence for graphical rendering (charts, maps), improved usability, and a new chart designer in BEx Web Application Designer
    Inbound Message Processing
    Integration with SAP Exchange Infrastructure (SAP XI) to support real-time data acquisition
    The data warehouse and/or operational data store is simply another subscriber to the real-time data being distributed by the Integration Broker
    Data is active and event-driven, available to the Business Intelligence system in "real time"
    Reporting on harmonized master data:
    Integration with SAP Master Data Management (MDM) helps to improve the quality of decisions made
    Create consolidated views on customers, vendors and products
    Enhance master data with global attributes for company-wide analysis (i.e. spend analysis)
    BI Web Services:
    > The following BI web services can be accessed via open standards
    XML Data Load, XML for Analysis, XML Query Result Set
    Leveraging the Web Application Server 6.40 technology infrastructure
    Seamless deployment of BI web applications:
    into SAP Enterprise Portal roles for instant information delivery
    into SAP Enterprise Portal collaboration rooms
    into SAP Enterprise Portal KM folders
    Allows searching through BI applications in the context of unstructured information
    Improved query response times through cached application retrieval
    BI Platform Enhancements
    BI Web Services for data acquisition, reporting, and analysis (XMLA), and BW Query XML result sets as web services (enabling output of a BW query as XML for further processing)
    The Analysis Process Designer is a graphical tool for modelling multilevel analysis processes. This includes data selection, preparation (e.g. filtering, sorting, ...) and transformation (e.g. data mining, regression, ...) of selected data, with storage capabilities.
    Planning and Simulation (BW-BPS)
    With release SAP BW 3.5, SAP will deliver planning through the integrated capabilities of BW-BPS (formerly known as SEM-BPS). This brings together planning, budgeting, and forecasting with monitoring, reporting, and analysis, bundled into one software installation and one support package cycle.
    BW-BPS helps you to plan, budget, and forecast by providing functions such as:
    Top-down planning and bottom-up contribution with a rich set of planning functions
    A planning framework that lets you create and maintain planning models
    A user interface for manual planning and analysis
    Tools for process control (i.e. status tracking and monitoring)

  • How do we track how many records INSERT/UPDATE/DELETE per day

    Hi All,
    We have around 150 tables in the database. Many people can access these tables, and every one of them has rights to INSERT, DELETE, and UPDATE records. I want to know how many records are updated, inserted, and deleted per day. Is it possible to find out?
    I have some questions; please clarify my doubts.
    1> If we create a table, it is recorded in ALL_OBJECTS/USER_OBJECTS and tabs/tab/user_tables, so we can find the table name and columns of tables created in the database.
    2> If we insert, delete, or update records in a table, we can find the corresponding table. Apart from the corresponding table, is there any way to find out about the records themselves?
    Above I noted that any table I create is recorded in the object views.

    Schedule a periodic DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO and query USER_TAB_MODIFICATIONS. This view shows DML activity against each table and is updated when database monitoring info is flushed. The flush is automatic but if you need control over the frequency of updates, you can explicitly call the FLUSH.
    In 9i you have to "ALTER TABLE table_name MONITORING ;" before you can use this method.
    In 10g, this is enabled by default unless STATISTICS_LEVEL is set to BASIC.
    See http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/statviews_4465.htm#sthref2375
    and
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/tables.htm#ADMIN01508
    Another useful view is V$SEGMENT_STATISTICS (or the faster view V$SEGSTAT) which provides information in the same manner as in the V$SYSTAT / V$SESSTAT views.
    Hemant K Chitale
    http://hemantoracledba.blogspot.com

  • What is the best way to process a dataset of 2000 records...

    I have a problem where I get a collection of 2000 records. Using these rows, I need to look for matches amongst existing rows in a table. Currently the procedure takes each row and scans the table for a match. Is there a way I can scan the table once for all possible matches?
    Thanks

    Assuming you can't retrieve the 2000 rows in one SQL statement another approach might be to create an object collection and cast this to a table in a subsequent SQL statement.
    For example
    CREATE TABLE test (abc NUMBER, def NUMBER, ghi NUMBER);
    INSERT INTO test VALUES (1,2,3);
    CREATE TYPE test_typ AS OBJECT (abc NUMBER, def NUMBER, ghi NUMBER);
    CREATE TYPE test_coll_typ AS TABLE OF test_typ;
    SET SERVEROUTPUT ON
    DECLARE
      coll test_coll_typ := test_coll_typ();
      CURSOR cur_dupes IS
        SELECT abc, def, ghi
        FROM test
        INTERSECT
        SELECT abc, def, ghi
        FROM TABLE(CAST(coll AS test_coll_typ));
    BEGIN
      -- Create some rows in our collection
      coll.EXTEND(3);
      coll(1) := test_typ(2,3,4);
      coll(2) := test_typ(1,2,3);
      coll(3) := test_typ(3,4,5);
      -- Output the duplicates in table "test"
      FOR rec_dupes IN cur_dupes LOOP
        DBMS_OUTPUT.PUT_LINE(rec_dupes.abc||' '||rec_dupes.def||' '||rec_dupes.ghi);
      END LOOP;
    END;
    The disadvantage is that you now have two more objects in your schema. This might not be a problem if they're there for other reasons too, but it is a bit of overkill perhaps if this is their sole reason for being.
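    The collection-cast trick above amounts to one set-based comparison instead of 2000 single-row scans. The same idea in a Python sketch, with tuples standing in for the (abc, def, ghi) rows:

```python
def find_matches(batch, table_rows):
    """Return the table rows that also appear in the incoming batch.

    Builds a hash set from the ~2000-row batch once, then scans the table a
    single time with O(1) membership tests, instead of one scan per row.
    """
    wanted = set(batch)
    return [row for row in table_rows if row in wanted]
```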

  • In which table the condition records get stored in sap crm

    Hi everybody, can anyone help me with this?
    In which table are condition records stored in SAP CRM?
    Regards,
    Babu

    Hi Babu,
    The table name depends on the condition table you have chosen while adding a condition record. Like if it is SAP001, the database table will be CNCCRMPRSAP001.
    Regards,
    Shalini Chauhan
    Edited by: Shalini Chauhan on Jun 23, 2008 10:18 AM

  • "FRM-40501: Oracle error: Unable to Reserve Record For Update or Delete"

    "FRM-40501: Oracle error: Unable to Reserve Record For Update or Delete"
    How can I unblock a session in the Oracle Database 10g administration GUI?

    From experience with this problem: since the blocked customer form has been coded not to wait for the updating session to complete, there is likely no waiter on the system now, so you cannot find the blocking session.
    What you need to do is determine which row the form was going after, then issue an update on that row from SQL*Plus. If the blocking session has not yet committed, this update will wait. Now, if you look for blocking sessions, you will be able to find it and decide whether the session should be killed, or whether someone needs to call the user and ask him or her to flip through their screens and get out of the blocking screen.
    Application screens written not to wait on data currently being updated also need to be written to expose the information necessary to identify the row(s) in question.
    HTH -- Mark D Powell --

  • SLD auto-update of SAP release versions

    Hello,
    I heard that the administrator has to manually download a file with the current release version information and put it in the necessary place. Is this the normal way, or is there a possibility to download the files automatically?
    Thank you, Maximilian

    Hello again,
    sorry, now I got it.
    From time to time you need to update the data models in the SLD, so that the SLD can handle/recognize new software versions and software releases.
    You need to update the
    CIM data model
    content repository
    This is described in SAP Note 669669, "Updating the SAP Component Repository in the SLD" (version 24 from 24.06.2008, released for customers, available in EN and DE).
    kr,
    andreas
