MRP_ATP_PUB.call_atp sample

Hi,
We need sample code for calling the MRP_ATP_PUB.Call_ATP API. We got one from Metalink, but its example passes only a single item.
When we pass a group of items, it gives us an 'ATP Processing' error, whereas when we pass those items individually it works fine.
Any help in this regard is highly appreciated.
Thanks and Regards,
Amit

We are on R12 (12.1.1).
We are not getting the error any more, but here are the details.
As per the example files for the API, we are using BULK COLLECT to load all the items (model + options + items) into p_atp_rec (the input to the API).
However, the output variable x_atp_rec returns the result only for the first item we pass; the details (schedule ship date etc.) for the remaining items are not returned.
Below is the BULK COLLECT we are using, with all the parameters. Please note that we are not passing order info to the API, but as per the notes on Metalink it is not mandatory, so that should not be an issue.
select
a.Inventory_Item_Id ,
a.segment1 ,
100, --Quantity_Ordered          ,
'EA', --Quantity_UOM              ,
TO_DATE('02-MAY-2011','DD-MON-YYYY') , --Requested_Ship_Date       ,
--100, --Action                    ,
--NULL, --Instance_Id               ,
--NULL, --Source_Organization_id    ,
--'N', --OE_Flag                   ,
--0, --Insert_Flag               ,
--1, --Attribute_04          ,
--600027019, --Customer_Id ,             
--600042515, --Customer_Site_Id          ,
--660, --Calling_Module            ,
NULL, --Row_Id                    ,
NULL, --Organization_Id           ,
NULL, --order_number              ,
OE_ORDER_LINES_S.nextval, --identifier               ,
NULL, --destination_time_zone     ,
NULL, --demand_class              ,
NULL, --validation_org            ,
--1, --included_item_flag        , 
NULL, --old_source_organization_id,
NULL, --old_demand_class          ,
NULL --override_flag
--b.component_sequence_id
BULK COLLECT INTO
l_atp_rec.Inventory_Item_Id ,
l_atp_rec.Inventory_Item_Name ,
l_atp_rec.Quantity_Ordered ,
l_atp_rec.Quantity_UOM ,
l_atp_rec.Requested_Ship_Date ,
--l_atp_rec.Action                    ,
--l_atp_rec.Instance_Id               ,
--l_atp_rec.Source_Organization_id    ,
--l_atp_rec.OE_Flag                   ,
--l_atp_rec.Insert_Flag               ,
--l_atp_rec.Attribute_04,
--l_atp_rec.Customer_Id               ,
--l_atp_rec.Customer_Site_Id          ,
--l_atp_rec.Calling_Module            ,
l_atp_rec.Row_Id ,
l_atp_rec.Organization_Id ,
l_atp_rec.order_number ,
l_atp_rec.identifier ,
l_atp_rec.destination_time_zone ,
l_atp_rec.demand_class ,
l_atp_rec.validation_org ,
--l_atp_rec.included_item_flag        , 
l_atp_rec.old_source_organization_id,
l_atp_rec.old_demand_class ,
l_atp_rec.override_flag          
--l_atp_rec.component_sequence_id
FROM
mtl_system_items_b a
--BOM_INVENTORY_COMPONENTS b,
--BOM_BILL_OF_MATERIALS c
where a.segment1 in ('XCDE400-2-500SXA-K9','XCDE PWR OPT','XCAB-9K12A-NA','XCDE EXPAND','XCDAOND-MGT')
--and c.organization_id = a.organization_id
--and b.bill_sequence_id = c.bill_sequence_id
--and c.assembly_item_id = a.inventory_item_id
and a.organization_id = 1;
l_atp_rec.OE_Flag(1) := 'N';
l_atp_rec.Calling_Module(1) := 660;
l_atp_rec.Attribute_04(1) := 1;
l_atp_rec.action(1) := 100;
l_atp_rec.included_item_flag(1) := 1;
l_atp_rec.Customer_Id(1) := 600027024; --600027019;
l_atp_rec.Customer_Site_Id(1) := 600042524; --600042515;
l_atp_rec.Insert_Flag(1) := 0;
Please let me know if something is incorrect in this call.
Thanks and Regards,
Amit
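For reference, below is a minimal sketch of assigning the row-level attributes for every collected item rather than only for index (1), which is one plausible reason only the first item returns results. It assumes the same l_atp_rec variable of type MRP_ATP_PUB.ATP_Rec_Typ and reuses the literal values from the post above; treat it as an illustration to verify against Note 150908.1 and the MRP_ATP_PUB specification on your instance, not as confirmed working code.

-- Hypothetical sketch: populate the row-level attributes for EVERY element
-- filled by the BULK COLLECT, not only element (1). The values are the ones
-- used in the post above and are assumptions for illustration.
FOR i IN 1 .. l_atp_rec.Inventory_Item_Id.COUNT LOOP
  l_atp_rec.OE_Flag(i)            := 'N';
  l_atp_rec.Calling_Module(i)     := 660;
  l_atp_rec.Attribute_04(i)       := 1;
  l_atp_rec.Action(i)             := 100;        -- ATP inquiry
  l_atp_rec.Included_Item_Flag(i) := 1;
  l_atp_rec.Customer_Id(i)        := 600027024;
  l_atp_rec.Customer_Site_Id(i)   := 600042524;
  l_atp_rec.Insert_Flag(i)        := 0;
END LOOP;
-- Then call MRP_ATP_PUB.Call_ATP exactly as in the Metalink example, passing
-- l_atp_rec as p_atp_rec, and read every index of x_atp_rec
-- (x_atp_rec.Schedule_Ship_Date(i), x_atp_rec.Requested_Date_Quantity(i), ...)
-- rather than only index (1).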

Similar Messages

  • Error in Invoking MRP_ATP_PUB.Call_ATP

    Error in Invoking MRP_ATP_PUB.Call_ATP API
    We are trying to call the MRP_ATP_Pub.Call_ATP public API using the Apps Adapter. When we invoke the adapter, it throws an ORA-00902 error.
    file:/soadev/app/bpel/domains/default/tmp/.bpel_atp_v2008_07_22__470_fc381ba72b0fd2bab686f6be311c6154.tmp/test.wsdl [
    test_ptt::test(InputParameters,OutputParameters) ] - WSIF JCA Execute of
    operation 'test' failed due to: Error while trying to prepare and execute an
    API.
    An error occurred while preparing and executing the APPS.XX_BPEL_TESTXXX.MRP_ATP_PUB$CALL_ATP API. Cause: java.sql.SQLException:
    ORA-00902: invalid datatype
    [Caused by: ORA-00902: invalid datatype
    ; nested exception is:
    ORABPEL-11811
    Error while trying to prepare and execute an API.
    An error occurred while preparing and executing the APPS.XX_BPEL_TESTXXX.MRP_ATP_PUB$CALL_ATP API. Cause: java.sql.SQLException:
    ORA-00902: invalid datatype
    [Caused by: ORA-00902: invalid datatype
    Check to ensure that the API is defined in the database and that the parameters match
    the signature of the API. Contact Oracle Support if the error is not fixable.


  • MRP_ATP_PUB.Call_ATP api not retrieving requested_date_quantity value when executed in custom schema but when executed in apps its working fine, can anyone help on this..

    The MRP_ATP_PUB.Call_ATP API is not retrieving the requested_date_quantity value when executed from a custom schema, but it works fine when executed as APPS. Can anyone help with this?
    We are passing the required values to the ATP API.
    1) The x_return_status is 'S' (success), but x_atp_rec.Requested_Date_Quantity is not returning any value.
    2) If there is a grant issue, how do we identify it? (See the sketch below.)
    Regards,
        Vinod Annukaran
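
    As a starting point for question 2, here is a hedged sketch of how missing grants or synonyms could be checked from SQL*Plus. The schema name XXCUST is a placeholder for the custom schema and the package owner APPS is the usual default; adjust both to your environment.

    -- Run as a DBA (or as APPS): does the custom schema have EXECUTE on the package?
    SELECT grantee, privilege
      FROM dba_tab_privs
     WHERE owner      = 'APPS'
       AND table_name = 'MRP_ATP_PUB'
       AND grantee    = 'XXCUST';   -- placeholder custom schema name

    -- Run as the custom schema: is there a synonym resolving to APPS.MRP_ATP_PUB?
    SELECT owner, synonym_name, table_owner, table_name
      FROM all_synonyms
     WHERE synonym_name = 'MRP_ATP_PUB';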

    Please do not post duplicates - MRP_ATP_PUB.Call_ATP api not retrieving requested_date_quantity value when executed in custom schema but when executed in apps its working fine, can anyone help on this..

  • Allocated ATP derivation for bulk loads and stealing

    Gurus,
    We are designing a solution where the requirement is to derive ATP for more than a million items every day. For around 250,000 items, we need to know the maximum quantity available for an item, org and demand class (including stealing).
    We need to know the availability for future dates (like a horizontal plan), which includes stealing. We are using Allocated ATP and we are on R12.2.3.
    Questions:
    1) As per the document (Doc ID 150908.1) ATP API Description R12 (ATP API MRP_ATP_PUB.Call_ATP), the API can be used to derive ATP for multiple items in a single call. Is there a limit on the number of items that can be passed in a single call?
    2) Can we derive ATP values including stealing in a single call? If so, which of the output record types should be referred to: ATP_Rec_Typ, ATP_Period_Typ or ATP_Supply_Demand_Typ?
    3) Can we derive the future availability of items in a single call? We saw that the output of X_ATP_PERIOD seems to give the horizontal plan. Can we get availability with stealing in the horizontal plan in a single ATP call itself?
    4) If the API is not a suitable approach, can you please let us know the tables/logic that can be used for the derivation?
    Appreciate your help.
    Thanks,
    Ram.

    889367 wrote:
    What's the benefit of including "IF  :NEW.col1  IS NULL" statement? If I leave it out and someone tries to insert a value not using the trigger it changes the value for me. I can see that being good and bad, but it keeps them from not using the sequence.
    The benefit is that it allows you to manually assign the column value if you want to. Whether that is a 'benefit' for your use case or not is for you to decide.
    But I'm trying to decide when and if I should use this. I wouldn't consider it but we've got Informatica developers that insist on writing dynamic sql functions to pull values for sequences to use in inserts because they can't reference the nextval in the their workflows.
    Don't use Informatica to do something that can be done using Oracle. The strength and utility of an ETL tool is in doing things that the database either can NOT do or cannot do efficiently: for example pulling data from multiple databases and flat files for insert into a target database. The goal being to get ALL of the data into the target database as quickly and efficiently as possible. Then you can apply the full power of the target database to ALL of the data.

  • Restrict data flow to MRP_ATP_DETAILS_TEMP

    Hi,
    we have a custom program which pulls item availability details for SOs through ATP rules. We are using the API MRP_ATP_PUB.CALL_ATP to pull the ATP details for the items.
    But each time this program is run, it inserts about 70 million rows into the MRP_ATP_DETAILS_TEMP table, and we then have to run the Purge ATP Temp Tables program to truncate the temp table. As per our DBA this is a risk for the db storage space, so we have stopped running this program for now.
    Please answer my two questions below:
    1) Is it really a risk if the table reaches 70 million rows? Up to what maximum number of rows is it safe?
    2) Is there a way to stop data being inserted into the ATP temp tables when the API is called? Is there a profile option or debug level responsible for this? We don't want data to be inserted into the temp table.
    Regards,
    Samir Kumar Das

    1) Is it really a risk if the table reaches 70 million rows? Up to what maximum number of rows is it safe?
    There is no magic number. It depends on the size of your database. 70m is huge for a small company but peanuts for, say, Boeing.
    2) Is there a way to stop data being inserted into the ATP temp tables when the API is called? Is there a profile option or debug level responsible for this? We don't want data to be inserted into the temp table.
    Oracle uses this table to calculate ATP results, so if you use the API, it will insert records.
    Having said that, 70m for an ATP check is excessive.
    Are you doing an ATP check for ALL open sales orders?
    If it is just one or even a hundred order lines, 70m seems too much. You need to raise an SR with Oracle.
    If you are doing it for all open lines (and there are thousands and thousands), here are a few things you can try:
    1) Find how many sales order lines the custom program processes. See if you can reduce it by adding more conditions (such as ignoring orders more than a year old, or orders more than 2 months in the future).
    2) Try to process in chunks. Instead of calling the purge program after your custom program finishes, try calling it from your custom program every time, say, x order lines have been processed (see the sketch after this reply).
    Sandeep Gandhi
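
    As a rough illustration of point 2, here is a hedged sketch of chunked processing that submits the purge program periodically from PL/SQL. The concurrent program short name 'XX_ATP_PURGE', the application 'MSC', the chunk size and the driving query are all placeholders/assumptions; look up the real short name of the Purge ATP Temp Tables program on your instance and adapt the loop to however your custom program selects order lines.

    DECLARE
      l_req_id     NUMBER;
      l_lines_done PLS_INTEGER := 0;
      c_chunk_size CONSTANT PLS_INTEGER := 1000;   -- purge after every 1000 lines (assumption)
    BEGIN
      -- If running standalone, set an applications context first (ids are examples):
      -- fnd_global.apps_initialize(user_id => 1234, resp_id => 50559, resp_appl_id => 660);
      FOR r IN (SELECT line_id FROM oe_order_lines_all WHERE open_flag = 'Y') LOOP
        -- ... build p_atp_rec for this line and call MRP_ATP_PUB.Call_ATP here ...
        l_lines_done := l_lines_done + 1;
        IF MOD(l_lines_done, c_chunk_size) = 0 THEN
          -- Submit the purge concurrent program for the rows created so far
          l_req_id := fnd_request.submit_request(
                        application => 'MSC',           -- placeholder application short name
                        program     => 'XX_ATP_PURGE',  -- placeholder program short name
                        description => NULL,
                        start_time  => NULL,
                        sub_request => FALSE);
          COMMIT;   -- the request is only submitted once the session commits
        END IF;
      END LOOP;
    END;
    /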

  • ATP Check API before Order Import API

    Hi, All
    Does anyone have information of the following processes?
    ATP Check API before Order Import API
    I have never used the ATP check API before. It would be a great help if someone could provide any inputs on this.
    Thanks,

    The ATP API can be used for scheduling.
    API name: MRP_ATP_PUB.CALL_ATP
    MRP_ATP_PUB is a public API, so it should have all the validations etc. Explore the above API and you should be able to write a good package for scheduling or ATP.
    thanks
    Vikrant

  • How do I find, at-a-glance, the sample size used in several music files?

    How do I find, at-a-glance, the sample size used in several music files?
    Of all the fields available in a Finder search, "Sample Size" is not among them. Finder does offer a "Bits per Sample" field, but it only recognizes graphic files, not music files.
    Running 10.8.5 on an iMac i5.
    I did search through a couple of communities but came up empty.
    Thank you,
    Craig

    C-squared,
    There is no View Option to allow display of a column of sample size. 
    For WAV or Apple Lossless files, it is available on the Summary tab (one song at a time, as you know).  For MP3 and AAC it is not available at all.
    You can roughly infer it from the files that are larger than expected for their time.
    99% of the music we use is at the CD standard of 16-bit, so I can guess that displaying it has never been a priority.  However, if you want to make a suggestion to Apple, use this link:
    http://www.apple.com/feedback/itunesapp.html

  • COGS update on other GL Account at the time of Free goods or Sample goods

    Hello,
    In the sales process, at the time of delivery the material document is usually posted as
    DR COGS
    CR INVENTORY
    But in the case of free goods, bonus goods or samples, account determination should not post to COGS; instead it should determine another GL account, a "Free goods - COGS expense" account.
    How can we solve this issue?
    Regards,
    SK

    Hi Yasar,
    You need to create a new routine for the calculation type.
    Do as below:
    1. Go to VOFM > Formulas > Calc.rule Rebate InKd to create a new routine for the calculation type, for example 601.
    2. Add the following code to routine 601 and then save.
      USING L_FRM STRUCTURE KONDN_FRM.
    DATA: VORKOMMA  LIKE KONDN-KNRMM,
           NACHKOMMA LIKE KONDN-KNRMM.
      L_FRM-NRMENGE = 0.
      L_FRM-NRRUND  = 0.
      L_FRM-NRMENGE = ( L_FRM-MGLME / L_FRM-KNRNM * L_FRM-KNRZM ).
    * business rounding
        VORKOMMA = FLOOR( L_FRM-NRMENGE ).
      L_FRM-NRRUND  = L_FRM-NRMENGE - VORKOMMA.
      L_FRM-NRMENGE = VORKOMMA.
    3. Select routine 601 in field "Calc.Rule" when you create free goods condition record.
    Hope it helps.

  • Error in compiling Photoshop CC 2014 sample project

    Hi,
    I am trying to compile the SDK sample project "outbound", but it shows errors such as "Parse Issue: Unknown type name 'DialogPtr'" in DialogUtilities.h.
    DialogUtilities.h is in "Adobe Photoshop CC 2014:photoshopsdk:pluginsdk:samplecode:common:includes".
    Even if I add this path to the project settings, it still shows the errors.
    How can I make this work?
    I am using Photoshop CC 2014, and the Xcode version is 4.6.3 (4H1503).
    If you have any idea regarding project settings then please let me know.
    Thanks and regards,
    Priyanka.

    Are you using the CC 2014 release of the SDK?
    The DialogUtilities.h for Mac does not work; those are the old Carbon APIs. See the Dissolve example for an Objective-C UI.
    I would comment out the DialogUtilities.h include and the other headers associated with the Carbon UI. Some Carbon calls that are unrelated to UI still work.

  • How to log in to Service Desk to create sample message to SAP?

    Dear all,
    How do I log in to the Service Desk under my Solution Manager server to create a sample message to SAP under component SV-SMG-SUP?
    Regards
    GN

    Hi GN,
    You have different options to create messages in the test component SV-SMG-SUP-TST:
    1. If you are on 7.1, you can create a message using the CRM_UI.
    2. If you are on a lower version, you can create a message using the Work Center, transaction NOTIF_CREATE, or the menu path Help --> Create Support Message.
    Thanks
    Regards,
    Vikram

  • SSO java sample application problem

    Hi all,
    I am trying to run the SSO Java sample application, but am experiencing a problem:
    When I request the papp.jsp page I end up in an infinite loop, caught between papp.jsp and ssosignon.jsp.
    An earlier thread in this forum discussed the same problem, guessing that cookie handling was the issue. That thread recommended a particular servlet, ShowCookie, for inspecting the cookies for the current session.
    I have installed this servlet on the server, but don't see anything but one cookie, JSESSIONID.
    At present I am running the JSP sample app on a Tomcat server, while Oracle 9iAS with SSO and Portal is running on another machine on the LAN.
    The configuration of the SSO sample application is as follows:
    Cut from SSOEnablerJspBean.java:
    // Listener token for this partner application name
    private static String m_listenerToken = "wmli007251:8080";
    // Partner application session cookie name
    private static String m_cookieName = "SSO_PAPP_JSP_ID";
    // Partner application session domain
    private static String m_cookieDomain = "wmli007251:8080/";
    // Partner application session path scope
    private static String m_cookiePath = "/";
    // Host name of the database
    private static String m_dbHostName = "wmsi001370";
    // Port for database
    private static String m_dbPort = "1521";
    // Schema name
    private static String m_dbSchemaName = "testpartnerapp";
    // Schema password
    private static String m_dbSchemaPasswd = "testpartnerapp";
    // Database SID name
    private static String m_dbSID = "IASDB.WMDATA.DK";
    // Requested URL (User requested page)
    private static String m_requestUrl = "http://wmli007251:8080/testsso/papp.jsp";
    // Cancel URL (home page for this application, which doesn't require authentication)
    private static String m_cancelUrl = "http://wmli007251:8080/testsso/fejl.html";
    Values specified in the Oracle Portal partner app administration page:
         ID: 1326
         Token: O87JOE971326
         Encryption key: 67854625C8B9BE96
         Logon-URL: http://wmsi001370:7777/pls/orasso/orasso.wwsso_app_admin.ls_login
         single signoff-URL: http://wmsi001370:7777/pls/orasso/orasso.wwsso_app_admin.ls_logout
         Name: testsso
         Start-URL: http://wmli007251:8080/testsso/
         Succes-URL: http://wmli007251:8080/testsso/ssosignon.jsp
         Log off-URL: http://wmli007251:8080/testsso/papplogoff.jsp
    Finally, I specified the cookie version as v1.0 when running the regapp.sql script. The other parameters for this script were copied from the values specified above.
    Unfortunately, the discussion in the earlier thread did not go any further than recognizing the cookie problem, so I am now looking for help to move on from here.
    Any ideas will be greatly appreciated!
    /Mads

    Pierre - When you work on the sample application, you should test the pages in a separate browser instance. Don't use the Run Page links from the Builder. The sample app has a different authentication scheme from that used in the development environment so it'll work better for you to use a separate development browser from the application testing browser. In the testing browser, to request the page you just modified, login to the application, then change the page ID in the URL. Then put some navigation controls into the application so you can run your page more easily by clicking links from other pages.
    Scott

  • Sample source code for fields mapping in expert routine

    Hi All
    I am writing an expert routine from a DSO to a cube. For example, I have two fields FLD1 and FLD2 in the DSO,
    and the same fields in the InfoCube. Can anybody provide sample ABAP code to map the source fields to the target fields in an expert routine? Your help will be highly appreciated; it's urgent.
    regards
    eliaz

    The basic form would be:
    RESULT_FIELDS-xxx = <SOURCE_FIELDS>-xxx
    You have the source fields as the source, and the result fields as the target. In between you can check conditions, as in other transformation routines.
    BEGIN OF tys_SC_1 shows your source fields (in your case the DSO characteristics and key figures).
    BEGIN OF tys_TG_1 shows your result fields (in your case the cube characteristics).
    Hope this helps
    Derya

  • SAMPLE RECEIVING OPEN INTERFACE SCRIPT (a script for ROI users)

    Product: MFG_PO
    Written: 2006-05-11
    SAMPLE RECEIVING OPEN INTERFACE SCRIPT (a script for ROI users)
    ================================================================
    PURPOSE
    This script can be regarded as a tool that makes the Receiving Open Interface (ROI) easier to use. The user enters a PO number, user id and org id;
    the script then pulls the minimum required data from the PO and inserts it into the ROI to create a receiving transaction.
    The Receiving Transaction Processor then processes the inserted data.
    Explanation
    Instructions:
    1. Copy the script exroi.sql to your local computer, or cut & paste the script contents into a text editor in your SQL*Plus environment.
    2. Determine a usable PO number, user id and org id.
    3. At the SQL*Plus prompt, enter:
    SQL> @ezroi.sql
    4. You will be prompted for the PO number, user id and org id.
    5. The exroi.sql script pulls the related PO data and inserts it into the rcv_headers_interface and
    rcv_transactions_interface tables.
    If a PO shipment line is closed, cancelled or fully received, no data is inserted into the ROI tables.
    Note: This script does not validate the data; only the ROI API's own validation is performed.
    6. When the script finishes, run the Receiving Transaction Processor to process the inserted lines.
    The Transaction Status Summary screen shows whether the processed lines are in pending or error status.
    Notes:
    1. How to find the Org_id parameter value:
    a) Log in to the application and navigate to Help > Diagnostics > Examine.
    Select Block: $Profile$, Field: ORG_ID.
    b) Note the ORG_ID value and enter it at the ORG_ID prompt.
    2. How to find the User_Name parameter value:
    a) Log in to the application and navigate to Help > Diagnostics > Examine.
    Select Block: $Profile$, Field: USER_NAME.
    b) Note the USER_NAME value and enter it at the USER_NAME prompt.
    Example
    The 'eZROI.sql' script...
    --*** eZROI ***
    --*** by ***
    --*** Preston D. Davenport ***
    --*** Oracle Premium Applications Support ***
    --*** Oracle Worldwide Global Support Services ***
    --*** Date: 23-JUL-2003 - Beta release ***
    --*** Date: 09-SEP-2003 - Rev A Added multi- ***
    --*** shipment line capability ***
    --*** Parameters: ***
    --*** ORG_ID Organization ID ***
    --*** USER_NAME FND User Name ***
    --*** PO_NUMBER Purchase Order Number ***
    --*** This script intended for a standard Purchase ***
    --*** Order document to be inserted into the Oracle ***
    --*** Receiving Open Interface (ROI) via the standard ***
    --*** Oracle open interface api for a simple Receive ***
    --*** transaction. ***
    --*** Note: This script only considers open Purchase ***
    --*** Orders. This script will not allow over- ***
    --*** receipt, cancelled or closed PO's to be ***
    --*** inserted into the ROI and received ***
    CLEAR BUFFER
    SET VERIFY OFF
    SET LINESIZE 140
    SET PAGESIZE 60
    SET ARRAYSIZE 1
    SET SERVEROUTPUT ON SIZE 100000
    SET FEEDBACK OFF
    SET ECHO OFF
    DECLARE
    X_USER_ID NUMBER;
    X_PO_HEADER_ID NUMBER;
    X_VENDOR_ID NUMBER;
    X_SEGMENT1 NUMBER;
    X_ORG_ID NUMBER;
    X_LINE_NUM NUMBER;
    BEGIN
    DBMS_OUTPUT.PUT_LINE('***ezROI RCV API Insert Script***');
    SELECT PO_HEADER_ID , VENDOR_ID , SEGMENT1 , ORG_ID
    INTO X_PO_HEADER_ID , X_VENDOR_ID , X_SEGMENT1 , X_ORG_ID
    FROM PO_HEADERS_ALL
    WHERE SEGMENT1 = '&PO_NUMBER'
    AND ORG_ID = &ORG_ID;
    SELECT USER_ID INTO X_USER_ID
    FROM FND_USER
    WHERE USER_NAME = UPPER('&USER_NAME');
    INSERT INTO RCV_HEADERS_INTERFACE (
    HEADER_INTERFACE_ID ,
    GROUP_ID ,
    PROCESSING_STATUS_CODE ,
    RECEIPT_SOURCE_CODE ,
    TRANSACTION_TYPE ,
    LAST_UPDATE_DATE ,
    LAST_UPDATED_BY ,
    LAST_UPDATE_LOGIN ,
    VENDOR_ID ,
    EXPECTED_RECEIPT_DATE ,
    VALIDATION_FLAG )
    SELECT
    RCV_HEADERS_INTERFACE_S.NEXTVAL ,
    RCV_INTERFACE_GROUPS_S.NEXTVAL ,
    'PENDING' ,
    'VENDOR' ,
    'NEW' ,
    SYSDATE ,
    X_USER_ID ,
    0 ,
    X_VENDOR_ID ,
    SYSDATE ,
    'Y'
    FROM DUAL;
    DECLARE
    CURSOR PO_LINE IS
    SELECT PL.ITEM_ID , PL.PO_LINE_ID , PL.LINE_NUM ,
    PLL.QUANTITY , PL.UNIT_MEAS_LOOKUP_CODE ,
    MP.ORGANIZATION_CODE , PLL.LINE_LOCATION_ID ,
    PLL.CLOSED_CODE , PLL.QUANTITY_RECEIVED ,
    PLL.CANCEL_FLAG, PLL.SHIPMENT_NUM
    FROM PO_LINES_ALL PL ,
    PO_LINE_LOCATIONS_ALL PLL ,
    MTL_PARAMETERS MP
    WHERE PL.PO_HEADER_ID = X_PO_HEADER_ID
    AND PL.PO_LINE_ID = PLL.PO_LINE_ID
    AND PLL.SHIP_TO_ORGANIZATION_ID = MP.ORGANIZATION_ID;
    BEGIN
    FOR CURSOR1 IN PO_LINE LOOP
    IF CURSOR1.CLOSED_CODE IN ('APPROVED','OPEN')
    AND CURSOR1.QUANTITY_RECEIVED < CURSOR1.QUANTITY
    AND NVL(CURSOR1.CANCEL_FLAG,'N') = 'N'
    THEN
    INSERT INTO RCV_TRANSACTIONS_INTERFACE (
    INTERFACE_TRANSACTION_ID ,
    GROUP_ID ,
    LAST_UPDATE_DATE ,
    LAST_UPDATED_BY ,
    CREATION_DATE ,
    CREATED_BY ,
    LAST_UPDATE_LOGIN ,
    TRANSACTION_TYPE ,
    TRANSACTION_DATE ,
    PROCESSING_STATUS_CODE ,
    PROCESSING_MODE_CODE ,
    TRANSACTION_STATUS_CODE ,
    PO_LINE_ID ,
    ITEM_ID ,
    QUANTITY ,
    UNIT_OF_MEASURE ,
    PO_LINE_LOCATION_ID ,
    AUTO_TRANSACT_CODE ,
    RECEIPT_SOURCE_CODE ,
    TO_ORGANIZATION_CODE ,
    SOURCE_DOCUMENT_CODE ,
    DOCUMENT_NUM ,
    HEADER_INTERFACE_ID ,
    VALIDATION_FLAG )
    SELECT
    RCV_TRANSACTIONS_INTERFACE_S.NEXTVAL ,
    RCV_INTERFACE_GROUPS_S.CURRVAL ,
    SYSDATE ,
    X_USER_ID ,
    SYSDATE ,
    X_USER_ID ,
    0 ,
    'RECEIVE' ,
    SYSDATE ,
    'PENDING' ,
    'BATCH' ,
    'PENDING' ,
    CURSOR1.PO_LINE_ID ,
    CURSOR1.ITEM_ID ,
    CURSOR1.QUANTITY ,
    CURSOR1.UNIT_MEAS_LOOKUP_CODE ,
    CURSOR1.LINE_LOCATION_ID ,
    'RECEIVE' ,
    'VENDOR' ,
    CURSOR1.ORGANIZATION_CODE ,
    'PO' ,
    X_SEGMENT1 ,
    RCV_HEADERS_INTERFACE_S.CURRVAL ,
    'Y'
    FROM DUAL;
    DBMS_OUTPUT.PUT_LINE('PO line: '||CURSOR1.LINE_NUM||' Shipment: '||CURSOR1.SHIPMENT_NUM||' has been inserted into ROI.');
    ELSE
    DBMS_OUTPUT.PUT_LINE('PO line '||CURSOR1.LINE_NUM||' is either closed, cancelled or fully received.');
    END IF;
    END LOOP;
    DBMS_OUTPUT.PUT_LINE('*** ezROI COMPLETE - End ***');
    END;
    COMMIT;
    END;
    /
    SET VERIFY ON
    Reference Documents
    Note 245334.1

    I have had the same problem on ESXi 5.5 for over a month now. I tried the patches and tried the LTS kernel, which others say works immediately without patches; nothing seems to work and nobody seems to be able to offer a solution.
    Did you make any progress?
    Error! Build of vmblock.ko failed for: 3.10.25-1-lts (x86_64)
    Consult the make.log in the build directory
    /var/lib/dkms/open-vm-tools/2013.09.16/build/ for more information.
    make[2]: *** No rule to make target '/var/lib/dkms/open-vm-tools/2013.09.16/build/vmblock/linux/inode', needed by '/var/lib/dkms/open-vm-tools/2013.09.16/build/vmblock/vmblock.o'. Stop.
    Makefile:1224: recipe for target '_module_/var/lib/dkms/open-vm-tools/2013.09.16/build/vmblock' failed
    make[1]: *** [_module_/var/lib/dkms/open-vm-tools/2013.09.16/build/vmblock] Error 2
    make[1]: Leaving directory '/usr/src/linux-3.10.25-1-lts'
    Makefile:120: recipe for target 'vmblock.ko' failed

  • Can anybody recommend strain gauges that can be attached along a steel bar, say, 3 mm in diameter, and can respond at a sampling rate of 100 kHz?

    Or, if anybody has a similar application, I would be very grateful if you could share your experience.

    Many thanks for your information. We are about to investigate the dynamic performance of soil anchorage subject to dynamic impulse in a geotechnical centrifuge. Basically, 5 model anchorages (3 mm in diameter) are to be constructed. Each model anchorage is instrumented with 1 accelerometer, 1 load cell and 6 strain gauges. Therefore, the minimum instrumentation would be
    1) 30 strain gauges
    2) 5 accelerometers
    3) 5 load cells
    It is worth noting that all the instruments would eventually reside in a geotechnical centrifuge operating in a 10g gravitational field, whilst the DAQ system would experience much less g-force since it is to be placed close to the rotation axis of the machine.
    These model anchorages would be tested dynamically one by one on an individual basis, whilst the other four are preferably also monitored during the tests. For the active model anchorage under test, simultaneous sampling is preferred so that its 8 sensors are measured at the same instant in time (it is also understood that the natural frequency of the model anchorage is estimated to be in the range of several kHz). For the other four model anchorages, simultaneous sampling could be less demanding. It would therefore be ideal if simultaneous sampling could be switched among the actively tested model anchorages.
    I would greatly appreciate it if you could advise me further on this. Thanks in advance.

  • How do I find music I want by sample?

    There is a sample of music on YouTube, and I want to find more music like it. How do I find more?
    The sample is of laser harp playing, but I am interested in the sound of it. The sound is deep and metallic, vibrating, almost menacing, mysterious at least. It is a kind of space music, but I want that deep metallic vibrating sound, not just any space music. The music doesn't need to be as happy as the sample; I would like it deeper, meaner and more mysterious. The music can be ambient or have more detail to it. But it shouldn't be bass-booming, which I hate.
    Sample: http://uk.youtube.com/watch?v=sLVXmsbVwUs

    Well, he does all kinds of space music songs, like the one in the link below. That doesn't help much, because I want only music in which the metallic vibrating sound plays a major role. I want to be able to find music by the sample, not by searching through music I don't need. Also, I would like to get other artists' music too, not only this one's.
    Link: http://uk.youtube.com/watch?v=x8dqzTl0vUI
