Recording Batch in PA30

Hello ABAPers!
      I have a problem with recording transaction PA30. Specifically, I want to delete a person from a work center via "Organizational Assignment". When I did it manually everything was OK, but when I tried to do it through a recording run in SM35, one popup window did not appear, and I really need that popup to finish the activity.
If anyone knows how to record this popup, or knows another solution, that would be great.
Thanks in advance
BL

Hello!
If no one knows the answer to this question, maybe someone knows which FM or BAPI to use for that kind of operation (deleting the relationship between a person and a work center).
Best regards
BL
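
For reference, a popup usually shows up in a BDC recording as an extra screen of its own, so recording the delete step in SHDB should capture it and let you replay it via CALL TRANSACTION or SM35. Below is a minimal sketch of how the recorded screens end up in the BDCDATA table; the program names, screen numbers, field names and OK codes are placeholders and must be copied from your own SHDB recording of PA30:

DATA: gt_bdcdata TYPE TABLE OF bdcdata,
      gs_bdcdata TYPE bdcdata.

* PA30 initial screen (placeholder values - copy program/dynpro from SHDB)
CLEAR gs_bdcdata.
gs_bdcdata-program  = 'SAPMP50A'.
gs_bdcdata-dynpro   = '1000'.
gs_bdcdata-dynbegin = 'X'.
APPEND gs_bdcdata TO gt_bdcdata.

CLEAR gs_bdcdata.
gs_bdcdata-fnam = 'RP50G-PERNR'.      "placeholder field name
gs_bdcdata-fval = '00001234'.
APPEND gs_bdcdata TO gt_bdcdata.

* The confirmation popup is recorded as just another screen - append its
* screen header and OK code exactly as SHDB recorded them (placeholders below)
CLEAR gs_bdcdata.
gs_bdcdata-program  = 'SAPLSPO1'.     "placeholder popup program
gs_bdcdata-dynpro   = '0100'.         "placeholder popup screen
gs_bdcdata-dynbegin = 'X'.
APPEND gs_bdcdata TO gt_bdcdata.

CLEAR gs_bdcdata.
gs_bdcdata-fnam = 'BDC_OKCODE'.
gs_bdcdata-fval = '=YES'.             "placeholder - take the real OK code from SHDB
APPEND gs_bdcdata TO gt_bdcdata.

If the popup never appears when the recording is replayed, compare the screen sequence SHDB captured with what the transaction actually sends in batch input mode; the screen flow can differ between dialog and batch input.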

Similar Messages

  • Recording for transaction PA30

    Hi experts,
    I have created a recording for transaction PA30 to update infotypes 0000, 0001, 0007 and 0019.
    My problem is that when I execute my BDC report, the screen does not take the data from my program.
    Please tell me where I am going wrong.
    And please do not lock the thread this time, because I know very well how to use the debugger;
    the debugger is not solving my problem, because the screen takes some default value and I do not know where it comes from.
    I can use PA40, but my requirement is PA30.
    Regards,
    Rajesh Kumar

    Hi all,
    Thanks for your replies.
    My problem has been solved.
    I had to set certain parameters in CALL TRANSACTION, as shown below.
    DATA: gi_opts TYPE ctu_params.
    DATA: gc_zmode    TYPE c VALUE 'A',   "display mode: show all screens
          gc_zupd     TYPE c VALUE 'S',   "update mode: synchronous
          gc_znobinpt TYPE c VALUE 'X'.   "keep sy-binpt initial during the call
    gi_opts-dismode = gc_zmode.
    gi_opts-updmode = gc_zupd.
    gi_opts-defsize = 'X'.                "use the default screen size
    gi_opts-nobinpt = gc_znobinpt.
    CALL TRANSACTION 'PA30' USING gi_bdcdata
                            OPTIONS FROM gi_opts
                            MESSAGES INTO gi_messtab.
    Regards,
    Rajesh Kumar
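
    A note for anyone reusing this: the call above assumes gi_bdcdata and gi_messtab are declared in the report, roughly like this (a minimal sketch):
    DATA: gi_bdcdata TYPE TABLE OF bdcdata,     "screen/field rows built from the recording
          gi_messtab TYPE TABLE OF bdcmsgcoll.  "messages returned by CALL TRANSACTION
    The nobinpt = 'X' option is usually the decisive one for the default-value problem in PA30: it keeps sy-binpt initial during the call, so screen logic that behaves differently in batch input mode runs as it does in dialog.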

  • Problems Recording Batch details in Inbound Deliveries -SLED

    I am trying to record the batch details in an inbound delivery and then pass this through to the full goods receipt.
    In the inbound delivery, when I press the batch creation button for the item I get a dialogue box with the batch (vendor's) number and expiry date as open for entry but the production date is grayed out! When I then go into the classification system to enter the rest of the data, the production date is 00.00.0000 and this obviously causes problems! 
    If I switch off the confirmation control in the PO and go straight to MIGO, the batch handling acts as I would expect and a production AND expiry date can be added and all the classification data goes in "sweet as". This implies to me that there is nothing wrong with the site or the IM postings.
    I have tried changing the "Batch Installation in the Inbound Delivery" setting, but this does not fix the production date problem. Although if I switch off SLED, it does switch off the batch checking and creation in the delivery, as I would expect.
    I have scoured through OSS and not found anything relevant, we are on 4.7.
    Any thoughts or pointers as to what else I could have overlooked?
    Many thanks
    H

    Hi Howard,
    You should probably post this message in the ERP forum.
    Kind regards,
    Yann

  • Want to output an error message in a BDC recording (PA30)

    Hi all,
    I am currently working on an upload program, which is done with a BDC recording of transaction PA30. While loading the data into different infotypes, I check a condition
    in one particular infotype; if it fails, I need to issue an error message (type E), the data should not get uploaded for that infotype, and processing has to go on to the next infotype. I also want to log this error. How can I do this?

    An E (error) message will stop the processing of the program; the message needs to be either an I (information) or a W (warning) message for the program to continue processing.
    Log the message by writing it to an internal table.
    IF some_condition = 'X'.
      "W keeps the program running; an E message would stop it
      MESSAGE w001(00) WITH 'Hey, here is a message'.
      "log the same message in your own internal table
      itab-pernr = p_pernr.
      itab-msgid = '00'.
      itab-msgno = '001'.
      APPEND itab.
    ENDIF.
    CALL TRANSACTION 'PA30'........
    Regards,
    Rich Heilman
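
    If you also want the messages raised inside PA30 itself in the same log, CALL TRANSACTION can return them in a table of BDCMSGCOLL, and FORMAT_MESSAGE turns each entry into readable text. A minimal sketch, assuming lt_bdcdata has already been filled from the recording:
    DATA: lt_msg  TYPE TABLE OF bdcmsgcoll,
          ls_msg  TYPE bdcmsgcoll,
          lt_log  TYPE TABLE OF string,
          lv_text(255) TYPE c.
    CALL TRANSACTION 'PA30' USING lt_bdcdata
                            MODE 'N'
                            MESSAGES INTO lt_msg.
    LOOP AT lt_msg INTO ls_msg.
      "turn the raw message variables into a readable text line
      CALL FUNCTION 'FORMAT_MESSAGE'
        EXPORTING
          id        = ls_msg-msgid
          no        = ls_msg-msgnr
          v1        = ls_msg-msgv1
          v2        = ls_msg-msgv2
          v3        = ls_msg-msgv3
          v4        = ls_msg-msgv4
        IMPORTING
          msg       = lv_text
        EXCEPTIONS
          not_found = 1
          OTHERS    = 2.
      APPEND lv_text TO lt_log.
    ENDLOOP.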

  • Batch Input / Recording for EnjoySAP transaction

    Hi Experts,
    Is there any possibility for us to do recording/batch input processing for an EnjoySAP transaction?
    Currently I need to do a recording for ME29N, as ME28 has some problems releasing some of the PO documents, so is there any tool that I could use?
    Thanks
    Cheers,
    Isaac.

    Hi Isaac,
    I suggest you search in SDN with the key "eCATT".
    You will get a few more useful related posts.
    Check:
    eCATT - Extended Computer Aided Test Tool
    http://help.sap.com/saphelp_47x200/helpdata/en/20/e81c3b84e65e7be10000000a11402f/frameset.htm
    Part I - eCATT An Introduction
    /people/sapna.modi/blog/2006/04/10/ecatt--an-introduction-part-i
    Part II - eCATT Scripts Creation - TCD Mode
    /people/sapna.modi/blog/2006/04/10/ecatt-scripts-creation-150-tcd-mode-part-ii
    Part III - eCATT Scripts Creation - SAPGUI Mode
    /people/sapna.modi/blog/2006/04/10/ecatt-scripts-creation--sapgui-mode-part-iii
    Part IV - eCATT Chaining, Parameterization, Creation Of Test Data, Test Configuration, System Data
    /people/sapna.modi/blog/2006/04/18/ecatt-chaining-parameterization-creation-of-test-datatest-configuration-system-data-part-iv
    Part V - eCATT Scripts Management Via Test Workbench
    /people/sapna.modi/blog/2006/04/13/ecatt-scripts-management-via-test-workbench-part-v
    Part VI - eCATT Logs
    /people/sapna.modi/blog/2006/04/18/ecatt-logs-part-vi
    Reward points if this Helps.
    Manish

  • Mass update of email in the vendor master record

    Hello to you all,
    Does anyone know how I can run a mass update of the e-mail address in the vendor master record for a large number of entities?
    XK99 won't help, since you can't find the e-mail field there,
    and a batch input recording won't help either, since the e-mail field can't be found.
    I will appreciate your help,
    Amir

    Hello Amir,
    You can use LSMW for the update operation. Create a batch input recording for transaction FK02, but you must tick the "Use central address management" checkbox on the FK02 screen while recording the batch input. After that you will see the e-mail field on the screen.
    Regards,
    Burak

  • Flash builder 4.6 - send batch data to server (php connected)

    Hi, I've recently downloaded flashbuilder 4.6 and started to develop my first mobile app.
    Just to explain the sense of my problem/question: I'd like to manage a local database offline (insert data into a table) and manually synchronize it with a server database (push a button -> add a batch of records to a table on the server).
    Following tutorials I was able to manage a sqlite local database, I got the connection to server database (mySQL) with php and have Data/Services panel populated (services test works fine).
    Unfortunately I'm not able to send the local data as a batch to the server. A few lines of code are below (while debugging I get no errors and no results).
    Is something wrong in the code?
    Or did I take a wrong approach to solve my problem?
    thanks in advance for interest
    protected function button_clickHandler(event:MouseEvent):void
    {
        var LocalDatabase:ArrayCollection = ... // <loaded from sqlite database>
        var RecordToAdd:MydataTable;            // server data table row to fill
        for (var i:int = 0; i < LocalDatabase.length; i++)
        {
            RecordToAdd = new MydataTable();
            RecordToAdd.Field1 = LocalDatabase[i].Field1;
            RecordToAdd.Field2 = LocalDatabase[i].Field2;
            createMydataTableResult.token = MydataTableService.createMydataTable(RecordToAdd);
            createMydataTableResult.token = MydataTableService.commit();
        }
    }

    Hello
    you said "Following tutorials I was able to manage a sqlite local database, I got the connection to server database (mySQL) with php and have Data/Services panel populated (services test works fine)."
    Can you point me to some tutorials for that? I am trying to do the same thing but I'm having some trouble...
    thanks

  • Need help, Trouble in uploading records using sql loader in Forms 6i

    Hi,
    I am trying to develop a screen for uploading records to a table by using a ctl file, batch file and sql loader.
    Env: Forms 6i, Oracle 8
    Table to be updated is: shy_upload_table
    My TNS entry looks similar to this:
    TEST_AXA.CNB.COM =
      (DESCRIPTION =
        (ADDRESS_LIST =
          (ADDRESS = (PROTOCOL = TCP)(HOST = 11.23.11.123)(PORT = 1234))
        )
        (CONNECT_DATA =
          (SID = axdabc)
        )
      )
    My intention is that whenever I press the upload_button, the table should be truncated and loaded with the contents of the file.
    In the when-button-pressed trigger of the upload_button I have the following code. I am always able to truncate the table, but not able to load it with the contents of the file. Can any of you help me fix this problem?
    declare
         var_control varchar2(256);
         VAR_DATA VARCHAR2(256);
         VAR_OUTPUT VARCHAR2(500);
         var_filename varchar2(256);
         str varchar2(50);
         cnt number;
    begin
         FORMS_DDL('TRUNCATE TABLE shy_upload_table ');
         select count(*) into cnt from shy_upload_table;
         message('count '||cnt);
         MESSAGE('');
    If NOT form_success Then
         MESSAGE('Upload Failed');
         MESSAGE('Upload Failed');           
    else
         set_item_property('DISPLAY_PB',enabled,property_true);
    --when ever i run, i am able to see the display_pb enabled. it means form_success is true.
    end if;
         var_filename := :txt_filename;
    --I have tried with each of the below option,
    --sqlldr userid/[email protected] control=F:\ERP\file_upload.ctl
    --sqlldr userid/password@axdabc control=F:\ERP\file_upload.ctl
    --sqlldr userid/password@TEST_AXA.CNB.COM control=F:\ERP\file_upload.ctl
         VAR_DATA := 'data=' || var_filename;
         VAR_OUTPUT := var_control || ' ' || VAR_DATA;
         -- note: var_control is never assigned and VAR_OUTPUT is never passed to host(),
         -- so only the hard-coded batch file below is executed
         host('F:\a.bat');
    end;
    batch file contents...
    # I have tried with each of the below options
    sqlldr userid/[email protected] control=F:\ERP\file_upload.ctl data=F:\ERP\sample.txt log=F:\ERP\x.log bad=F:\ERP\x.bad
    #sqlldr userid/password@axdabc control=F:\ERP\file_upload.ctl data=F:\ERP\sample.txt log=F:\ERP\x.log bad=F:\ERP\x.bad
    #sqlldr userid/password@TEST_AXA.CNB.COM control=F:\ERP\file_upload.ctl data=F:\ERP\sample.txt log=F:\ERP\x.log bad=F:\ERP\x.bad
    pause
    Thanks
    vish

    Hi Francois,
    Thanks for responding, I am not very sure of what you want me to try out.
    When I double-click the batch file containing the line below, the record gets inserted into the table. Only when I use my form to trigger the upload does it fail to insert the record.
    batch file contents...
    #sqlldr userid/password@TEST_AXA.CNB.COM control=F:\ERP\file_upload.ctl data=F:\ERP\sample.txt log=F:\ERP\x.log bad=F:\ERP\x.bad
    pause
    Thanks
    Vish

  • How can I control errors in batch input?

    I run a program with a recording (batch input on F-03).
    I want to give the user the option to either:
    only fix the problem and continue the process, or
    end the batch input and go back to the main program.
    Is there an option to do so?
    In which mode can it be done ('E'?)?
    The reason I need this option is that
    the data changes quickly.
    Thanks.

    Thanks everyone,
    maybe I was not clear enough.
    I know what mode 'E' is.
    I want to let the user fix the field only;
    I don't want to allow him to go to another screen,
    just update the field and proceed.
    Is there an option to do so?
    Thanks again, and of course I will award points.
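
    For reference, a minimal sketch of the error-mode variant being discussed: mode 'E' displays a screen only when an error occurs there, so the user can correct the offending field and continue, but the mode itself does not stop the user from navigating elsewhere on that screen. Assuming lt_bdcdata was filled from the F-03 recording:
    DATA lt_msg TYPE TABLE OF bdcmsgcoll.
    "'E' = show a screen only if an error occurs on it
    CALL TRANSACTION 'F-03' USING lt_bdcdata
                            MODE 'E'
                            UPDATE 'S'
                            MESSAGES INTO lt_msg.
    IF sy-subrc <> 0.
      "the user cancelled or an error was left unresolved -
      "decide here whether to end the batch input or return to the main program
    ENDIF.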

  • LSMW - 2nd recording - urgent

    I want to do a recording of transaction PA30. Since I couldn't get all fields in the first recording, as certain fields can only be reached if we enter the infotypes specifically, I have to do a second recording of the same transaction. Kindly tell me how to proceed. I have only one source structure. How should I proceed?

    Hi Deepthi,
    While recording:
    Start from infotype 0000 (Actions) without giving any personnel number, by selecting the Actions option.
    After entering the mandatory values for IT0000, press SAVE; it will then automatically go to the Personal Data screen. If required, fill in the mandatory fields, or press the Next Record button to get to the next screen; it will then give you Organizational Assignment (IT0001).
    You can continue like that based on your requirement, or press BACK. But don't forget to save once you have entered the data, before pressing the BACK button.
    Thanks
    Eswar

  • BOM Explosion needed on PO and/or Delivery for Stock Transfer PO Process

    Good afternoon,
      I have been researching the SDN and web for a solution to my problem to no avail so far. 
    The business problem I need to solve is the requirement for our STPO process to explode certain BOMs that are transferred from one company/plant to another company/plant, in order to record batch/serial # component information by the delivering plant on the delivery and to verify the batch/serial # in the receiving plant during MIGO processing of the delivery.  This is needed by our Service Mgmt department in order to track serviceable items within the Finished Good (medical devices industry).
    It appears on the SDN that a BOM can not be exploded on a non-Subcontracting PO (which would be ideal for us), but barring this problem, we are trying to have the BOM explode in the Delivery created from the STPO via the VL10B transaction.
    The STPO generates a Delivery doc with type NLCC and item category NLC.  I have maintained the material master for the BOM material to have an Item Category Group of ERLA, and I have modified Item Category Determination in configuration in Logistics Execution for deliveries to include ERLA, usage V, and NLC as default item category.  But, the deliveries generated from the STPO still do not explode, like what normally happens if the delivery of the BOM was created from a standard Order.
    The components for the BOM need to be displayed on the Delivery (if not the STPO), so the clerk can record the Batch/Serial #'s for each component.  Then when the receiving plant posts the Goods Receipt for the delivery, a user exit we have (based on movement type and material document code) builds a Service Mgmt table that is then processed to de-install the BOM and components from the former plant and install the BOM and components at the receiving plant (and then subsequently at the ultimate customer's location when a normal sale is made), generating a new Functional Location.
    Has anyone here had to do anything like this before?  Would any of you have any ideas on how this BOM explosion might work on the PO or Delivery?
    Thanks in advance.
    Scott.

    No responses to this question.
    Subject dropped.

  • Performance problems with XMLTABLE and XMLQUERY involving relational data

    Hello-
    Is anyone out there using XMLTABLE or XMLQUERY with more than a toy set of data? I am running into serious performance problems trying to do basic things such as:
    * Combine records in 10 relational tables into a single table of XMLTYPE records using XMLTABLE. This hangs indefinitely for any more than 800 records. Oracle has confirmed that this is a problem and is working on a fix.
    * Combine a single XMLTYPE record with several relational code tables into a single XMLTYPE record using XMLQUERY and ora:view() to insert code descriptions after each code. Performance is 10 seconds for 10 records (terrible) when passing a batch of records, or 160 seconds for one record (unacceptable!). How can it take 10 times longer to process 1/10th the number of records? Ironically, the query plan says it will do a full table scan of records for the batch, but an index access for the one record passed to the XMLQUERY.
    I am rapidly losing faith in XML DB, and desperately need some hints on how to work around these performance problems, or at least some assurance that others have been able to get this thing to perform.

    <Note>Long post, sorry.</Note>
    First, thanks for the responses above. I'm impressed with the quality of thought put into them. (Do the forum rules allow me to offer rewards? :) One suggestion in particular made a big performance improvement, and I’m encouraged to hear of good performance in pure XML situations. Unfortunately, I think there is a real performance challenge in two use cases that are pertinent to the XML+relational subject of this post and probably increasingly common as XML DB usage increases:
    •     Converting legacy tabular data into XML records; and
    •     Performing code table lookups for coded values in XML records.
    There are three things I want to accomplish with this post:
    •     Clarify what we are trying to accomplish, which might expose completely different approaches than I have tried
    •     Let you know what I tried so far and the rationale for my approach to help expose flaws in my thinking and share what I have learned
    •     Highlight remaining performance issues in hopes that we can solve them
    What we are trying to accomplish:
    •     Receive a monthly feed of 10,000 XML records (batched together in text files), each containing information about an employee, including elements that repeat for every year of service. We may need to process an annual feed of 1,000,000 XML records in the future.
    •     Receive a one-time feed of 500,000 employee records stored in about 10 relational tables, with a maximum join depth of 2 or 3. This is inherently a relational-to-XML process. One record/second is minimally acceptable, but 10 records/sec would be better.
    •     Consolidate a few records (from different providers) for each employee into a single record. Given the data volume, we need to achieve a minimum rate of 10 records per second. This may be an XML-only process, or XML+relational if code lookups are done during consolidation.
    •     Allow the records to be viewed and edited, with codes resolved into user-friendly descriptions. Since a user is sitting there, code lookups done when a record is viewed (vs. during consolidation) should not take more than 3 seconds total. We have about 20 code tables averaging a few hundred rows each, though one has 450,000 rows.
    As requested earlier, I have included code at the end of this post for example tables and queries that accurately (but simply) replicate our real system.
    What we did and why:
    •     Stored the source XML records as CLOBS: We did this to preserve the records exactly as they were certified and sent from providers. In addition, we always access the entire XML record as a whole (e.g., when viewing a record or consolidating employee records), so this storage model seemed like a good fit. We can copy them into another format if necessary.
    •     Stored the consolidated XML employee records as “binary XML”. We did this because we almost always access a single, entire record as a whole (for view/edit), but might want to create some summary statistics at some point. Binary XML seemed the best fit.
    •     Used ora:view() for both tabular source records and lookup tables. We are not aware of any alternatives at this time. If it made sense, most code tables could be pre-converted into XML documents, but this seemed risky from a performance standpoint because the lookups use both code and date range constraints (the meaning of codes changes over time).
    •     Stored records as XMLTYPE columns in a table with other key columns on the table, plus an XMLTYPE metadata column. We thought this would facilitate pulling a single record (or a few records for a given employee) quickly. We knew this might be unnecessary given XML indexes and virtual columns, but were not experienced with those and wanted the comfort of traditional keys. We did not used XMLTYPE tables or the XML Repository for documents.
    •     Used XMLTABLE to consolidate XML records by looping over each distinct employee ID in the source batch. We also tried XMLQUERY and it seems to perform about the same. We can achieve 10 to 20 records/second if we do not do any code lookups during consolidation, just meeting our performance requirement, but still much slower than expected.
    •     Used PL/SQL with XMLFOREST to convert tabular source records to XML by looping over distinct employee IDs. We tried this outside PL/SQL both with XMLFOREST and XMLTABLE+ora:view(), but it hangs in both cases for more than 800 records (a known/open issue). We were able to get it to work by using an explicit cursor to loop over distinct employee IDs (rather than processing all records at once within the query). The performance is one record/second, which is minimally acceptable and interferes with other database activity.
    •     Used XMLQUERY plus ora:view() plus XPATH constraints to perform code lookups. When passing a single employee record, the response time ranges from 1 sec to 160 sec depending on the length of the record (i.e., number of years of service). We achieved a 5-fold speedup using an XMLINDEX (thank you Marco!!). The result may be minimally acceptable, but I’m baffled why the index would be needed when processing a single XML record. Other things we tried: joining code tables in the FOR...WHERE clauses, joining code tables using LET with XPATH constraints and LET with WHERE clause constraints, and looking up codes individually via JDBC from the application code at presentation time. All those approaches were slower. Note: the difference I mentioned above in equality/inequality constraint performance was due to data record variations not query plan variations.
    What issues remain?
    We have a minimally acceptable solution from a performance standpoint with one very awkward PL/SQL workaround. The performance of a mixed XML+relational data query is still marginal IMHO, until we properly utilize available optimizations, fix known problems, and perhaps get some new query optimizations. On the last point, I think the query plan for tabular lookups of codes in XML records is falling short right now. I’m reminded of data warehousing in the days before hash joins and star join optimization. I would be happy to be wrong, and just as happy for viable workarounds if I am right!
    Here are the details on our code lookup challenge. Additional suggestions would be greatly appreciated. I’ll try to post more detail on the legacy table conversion challenge later.
    -- The main record table:
    create table RECORDS (
      SSN varchar2(20),
      XMLREC sys.xmltype
    )
    xmltype column XMLREC store as binary xml;
    create index records_ssn on records(ssn);
    -- A dozen code tables represented by one like this:
    create table CODES (
      CODE varchar2(4),
      DESCRIPTION varchar2(500)
    );
    create index codes_code on codes(code);
    -- Some XML records with coded values (the real records are much more complex of course):
    -- I think this took about a minute or two
    DECLARE
    ssn varchar2(20);
    xmlrec xmltype;
    i integer;
    BEGIN
    xmlrec := xmltype('<?xml version="1.0"?>
    <Root>
    <Id>123456789</Id>
    <Element>
    <Subelement1><Code>11</Code></Subelement1>
    <Subelement2><Code>21</Code></Subelement2>
    <Subelement3><Code>31</Code></Subelement3>
    </Element>
    <Element>
    <Subelement1><Code>11</Code></Subelement1>
    <Subelement2><Code>21</Code></Subelement2>
    <Subelement3><Code>31</Code></Subelement3>
    </Element>
    <Element>
    <Subelement1><Code>11</Code></Subelement1>
    <Subelement2><Code>21</Code></Subelement2>
    <Subelement3><Code>31</Code></Subelement3>
    </Element>
    </Root>');
    for i IN 1..100000 loop
    insert into records(ssn, xmlrec) values (i, xmlrec);
    end loop;
    commit;
    END;
    -- Some code data like this (ignoring date ranges on codes):
    DECLARE
    description varchar2(100);
    i integer;
    BEGIN
    description := 'This is the code description ';
    for i IN 1..3000 loop
    insert into codes(code, description) values (to_char(i), description);
    end loop;
    commit;
    end;
    -- Retrieve one record while performing code lookups. Takes about 5-6 seconds...pretty slow.
    -- Each additional lookup (times 3 repeating elements in the data) adds about 1 second.
    -- A typical real record has 5 Elements and 20 Subelements, meaning more than 20 seconds to display the record
    -- Note we are accessing a single XML record based on SSN
    -- Note also we are reusing the one test code table multiple times for convenience of this test
    select xmlquery('
    for $r in Root
    return
    <Root>
    <Id>123456789</Id>
    {for $e in $r/Element
        return
        <Element>
          <Subelement1>
            {$e/Subelement1/Code}
    <Description>
    {ora:view("disaac","codes")/ROW[CODE=$e/Subelement1/Code]/DESCRIPTION/text() }
    </Description>
    </Subelement1>
    <Subelement2>
    {$e/Subelement2/Code}
    <Description>
    {ora:view("disaac","codes")/ROW[CODE=$e/Subelement2/Code]/DESCRIPTION/text()}
    </Description>
    </Subelement2>
    <Subelement3>
    {$e/Subelement3/Code}
    <Description>
    {ora:view("disaac","codes")/ROW[CODE=$e/Subelement3/Code]/DESCRIPTION/text() }
    </Description>
    </Subelement3>
    </Element>
    </Root>
    ' passing xmlrec returning content)
    from records
    where ssn = '10000';
    The plan shows the nested loop access that slows things down.
    By contrast, a functionally-similar SQL query on relational data will use a hash join and perform 10x to 100x faster, even for a single record. There seems to be no way for the optimizer to see the regularity in the XML structure and perform a corresponding optimization in joining the code tables. Not sure if registering a schema would help. Using structured storage probably would. But should that be necessary given we’re working with a single record?
    Operation Object
    |SELECT STATEMENT ()
    | SORT (AGGREGATE)
    | NESTED LOOPS (SEMI)
    | TABLE ACCESS (FULL) CODES
    | XPATH EVALUATION ()
    | SORT (AGGREGATE)
    | NESTED LOOPS (SEMI)
    | TABLE ACCESS (FULL) CODES
    | XPATH EVALUATION ()
    | SORT (AGGREGATE)
    | NESTED LOOPS (SEMI)
    | TABLE ACCESS (FULL) CODES
    | XPATH EVALUATION ()
    | SORT (AGGREGATE)
    | XPATH EVALUATION ()
    | SORT (AGGREGATE)
    | XPATH EVALUATION ()
    | TABLE ACCESS (BY INDEX ROWID) RECORDS
    | INDEX (RANGE SCAN) RECORDS_SSN
    With an xmlindex, the same query above runs in about 1 second, so is about 5x faster (0.2 sec/lookup), which is almost good enough. Is this the answer? Or is there a better way? I’m not sure why the optimizer wants to scan the code tables and index into the (one) XML record, rather than the other way around, but maybe that makes sense if the optimizer wants to use the same general plan as when the WHERE clause constraint is relaxed to multiple records.
    -- Add an xmlindex. Takes about 2.5 minutes
    create index records_record_xml ON records(xmlrec)
    indextype IS xdb.xmlindex;
    Operation Object
    |SELECT STATEMENT ()
    | SORT (GROUP BY)
    | FILTER ()
    | NESTED LOOPS ()
    | FAST DUAL ()
    | TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
    | INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
    | SORT (AGGREGATE)
    | FILTER ()
    | TABLE ACCESS (FULL) CODES
    | FILTER ()
    | NESTED LOOPS ()
    | FAST DUAL ()
    | TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
    | INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
    | SORT (GROUP BY)
    | FILTER ()
    | NESTED LOOPS ()
    | FAST DUAL ()
    | TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
    | INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
    | SORT (AGGREGATE)
    | FILTER ()
    | TABLE ACCESS (FULL) CODES
    | FILTER ()
    | NESTED LOOPS ()
    | FAST DUAL ()
    | TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
    | INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
    | SORT (GROUP BY)
    | FILTER ()
    | NESTED LOOPS ()
    | FAST DUAL ()
    | TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
    | INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
    | SORT (AGGREGATE)
    | FILTER ()
    | TABLE ACCESS (FULL) CODES
    | FILTER ()
    | NESTED LOOPS ()
    | FAST DUAL ()
    | TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
    | INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
    | SORT (AGGREGATE)
    | FILTER ()
    | NESTED LOOPS ()
    | FAST DUAL ()
    | TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
    | INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
    | SORT (AGGREGATE)
    | TABLE ACCESS (BY INDEX ROWID) SYS113473_RECORDS_R_PATH_TABLE
    | INDEX (RANGE SCAN) SYS113473_RECORDS_R_PATHID_IX
    | TABLE ACCESS (BY INDEX ROWID) RECORDS
    | INDEX (RANGE SCAN) RECORDS_SSN
    Am I on the right path, or am I totally using the wrong approach? I thought about using XSLT but was unsure how to reference the code tables.
    I’ve done the best I can constraining the main record to a single row passed to the XMLQUERY. Given Mark’s post (thanks!) should I be joining and constraining the code tables in the SQL WHERE clause too? That’s going to make the query much more complicated, but right now we’re more concerned about performance than complexity.

  • How to use LSMW to upload database table directly from flat file extract

    Hi Guru's,
    I am new to the LSMW tool. I have searched the forum before posting this thread, but I didn't find any good posts on my issue.
    My requirement is: I will have a flat (tab-delimited) or Excel file with a number of records downloaded using the Data Browser for some standard tables from one SAP system. I want to upload the records using LSMW to the same tables in another SAP system. Please help me with how to upload them using LSMW.
    Thanks & Regards,
    Praveen.

    Hi Praveen,
    There is a risk in trying to migrate data directly into standard tables, because this can generate database inconsistencies or incorrectly inserted data depending on what is customized in the target system. I do not recommend migrating like this.
    With LSMW you use objects like direct input programs, IDocs, BAPIs and recorded batch input. Try to create a project and use a standard object for your data. Also, check in transaction SXDA (Goto --> DX programs) whether there is a standard program for your data.
    Anyway, if you want to upload data directly to tables, see the threads "UPLOAD CSV FILE" and "how to upload .csv file into a custom table".
    Regards,
    Roger

  • Error in code inspector

    Hi Abapers,
    I have done a recording for transaction PA30 in my report and used the call transaction method. Because of this, I'm getting one error when I check the code with the Code Inspector.
    error message:
    CA CL_CI_TEST_CRITICAL_STATEMENTS0002
    Critical Statements
    Call of Transaction &1
    &1 = Name of Transaction
    For CALL TRANSACTION there must already be a suitable transaction authorization with the calling transaction.
    Kindly help me out to fix this error.
    Thanks in advance.

    Hi,
    Go to SE24,
    enter the class CL_CI_TEST_CRITICAL_STATEMENTS
    and see its documentation.
    It shows all the critical statements that are flagged in code.
    Regards
    Anji
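
    For reference, one common way to address this finding is to perform an explicit authorization check on the called transaction before the CALL TRANSACTION. A minimal sketch (the message handling is illustrative, and gi_bdcdata/gi_messtab are assumed to come from your recording):
    AUTHORITY-CHECK OBJECT 'S_TCODE'
             ID 'TCD' FIELD 'PA30'.
    IF sy-subrc <> 0.
      MESSAGE e001(00) WITH 'No authorization for transaction PA30'.
    ENDIF.
    CALL TRANSACTION 'PA30' USING gi_bdcdata
                            MODE 'N'
                            MESSAGES INTO gi_messtab.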
