Where in the icx tables are price breaks and price break quantities stored?

I am trying to build a query from the icx tables that will show me all the BPA line price breaks and quantities.
I cannot seem to find any documentation on what specifically is extracted from the BPA lines when an internal catalog build is performed. I am able to validate that in iP the price breaks are being taken into consideration when I create requisitions with various quantities, but I do not know where iP is storing this information. Perhaps the data isn't stored at all and is taken from the core app po_line_locations_all table at the time the requisition is being created?
I have a BPA, line 27, that has multiple price break lines.
select rt_item_id, price_type, contract_num, contract_line_num, allow_price_override_flag, not_to_exceed_price, value_basis
from apps.icx_cat_item_prices
where contract_num = 'xxxxxx' and contract_line_num = '27'
and price_type = 'BLANKET';
All I could find when running this query was the price break information for the first price break line; none of the other price break lines showed up.
If anybody knows where the documentation is that tells me exactly what is extracted from the core application BPA into the icx tables, I would greatly appreciate it. Even better, if someone already knows the answer, either 1) price break prices and quantities are not stored in the icx tables, or 2) they are stored and you have SQL that shows me how to find them, I would be very grateful for that information.
Edited by: user6287397 on Jan 24, 2009 6:40 AM

I got the answer. :-)
Price break details are not stored in any icx tables. iP retrieves the information based on the need-by date entered on the requisition.
Oracle support referred me to the Oracle® Purchasing Release 11i10 Open Interfaces and APIs.
The Java code - SourceDocHelper.java - is responsible for getting the price information by calling the procedure po_price_break_grp.get_price_break (POXPRBKB.pls, DefaultPricing). For a given source document (Quotation/Catalog), quantity and unit of measure, this procedure derives the best price for the calling routine.
The SQL used to get the price information, as sent to me, is below. Note that this SQL uses the need-by date to get the right price in case price breaks are used at the distribution level.
SELECT poll.price_override
     , round(poll.price_override * v_conversion_rate, l_base_curr_ext_precision)
     , poh.rate_date
     , poh.rate
     , poh.currency_code
     , poh.rate_type
     , poll.price_discount
     , poll.price_override
     , decode(poll.line_location_id,
              null, pol.unit_meas_lookup_code,
              poll.unit_meas_lookup_code)
     , poll.line_location_id          -- SERVICES FPJ
FROM   po_headers_all poh             -- FPI GA
     , po_lines_all pol               -- FPI GA
     , po_line_locations_all poll     -- FPI GA
WHERE  poh.po_header_id = p_source_document_header_id
and    poh.po_header_id = pol.po_header_id
and    pol.line_num = p_source_document_line_num
and    pol.po_line_id = poll.po_line_id
and    ( p_required_currency is null
         or poh.currency_code = p_required_currency )
and    ( p_required_rate_type is null
         or poh.rate_type = p_required_rate_type )
and    nvl(poll.unit_meas_lookup_code, nvl(p_unit_of_measure, pol.unit_meas_lookup_code))
         = nvl(p_unit_of_measure, pol.unit_meas_lookup_code)
-- Change sysdate to l_pricing_date in order to use the Need By Date to determine the price.
and    ( trunc(nvl(l_pricing_date, trunc(sysdate))) >= trunc(poll.start_date) -- FPJ Custom Price
         OR poll.start_date is null )
and    ( trunc(nvl(l_pricing_date, trunc(sysdate))) <= trunc(poll.end_date)   -- FPJ Custom Price
         OR poll.end_date is null )
-- Bug #2693408: added nvl clause to quantity check
and    nvl(poll.quantity, 0) <= nvl(p_in_quantity, 0)
-- Determine the price based on ship-to location and destination organization.
and    ( ( poll.ship_to_location_id = v_ship_to_location_id OR poll.ship_to_location_id is null )
         AND
         ( poll.ship_to_organization_id = p_destination_org_id OR poll.ship_to_organization_id is null ) )
and    poll.shipment_type in ('PRICE BREAK', 'QUOTATION')
-- <2721775 START>: Make sure Quotation Price Breaks are Approved.
AND    ( -- ( poll.shipment_type IS NULL )
         ( poll.shipment_type = 'PRICE BREAK' )
         OR ( ( poll.shipment_type = 'QUOTATION' )
              AND ( ( poh.approval_required_flag <> 'Y' )
                    OR ( EXISTS ( SELECT ('Price Break is Approved')
                                  FROM   po_quotation_approvals pqa
                                  WHERE  pqa.line_location_id = poll.line_location_id
                                  AND    pqa.approval_type IN ('ALL ORDERS', 'REQUISITIONS')
                                  AND    trunc(nvl(l_pricing_date, sysdate))  -- FPJ Custom Price
                                         BETWEEN trunc(nvl(start_date_active, sysdate-1))
                                             AND trunc(nvl(end_date_active, sysdate+1)) ) ) ) ) )
-- <2721775 END>
order by poll.ship_to_organization_id ASC, poll.ship_to_location_id ASC,
         NVL(poll.quantity, 0) DESC,
         trunc(poll.creation_date) DESC, poll.price_override ASC;
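For anyone who just wants to list every price break row and quantity for a BPA line directly from the core tables (rather than deriving the single best price), a query along the following lines should work. This is only a simplified sketch based on the statement above - 'xxxxxx' and line 27 stand in for your own BPA number and line number, and I have not verified it on every release:
SELECT pol.line_num
     , poll.shipment_num
     , poll.quantity        price_break_quantity
     , poll.price_override  price_break_price
     , poll.price_discount
     , poll.ship_to_organization_id
     , poll.ship_to_location_id
     , poll.start_date
     , poll.end_date
FROM   po_headers_all poh
     , po_lines_all pol
     , po_line_locations_all poll
WHERE  poh.segment1 = 'xxxxxx'          -- the BPA document number
and    poh.po_header_id = pol.po_header_id
and    pol.line_num = 27
and    pol.po_line_id = poll.po_line_id
and    poll.shipment_type = 'PRICE BREAK'  -- price break shipments only
order by nvl(poll.quantity, 0);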

Similar Messages

  • Content Tab: None of the fact tables are compatible with the query request

    Hi All,
One thing I have never been clear on, in all my years with OBIEE, is working with the Content tab in the BMM.
    I have made a rpd the joins in physical layer as shown below:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056545119428530
    And the BMM layer as:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056519553812930
The error I am getting when I run a request with 3 columns from the 3 selected tables is:
    Dim - Comment Code Details
    Fact - Complaint
    Dim - Service Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 14020] None of the fact tables are compatible with the query request Sr Num:[DAggr(Fact - Complaint.Sr Num by [ Dim - Service Details.Sr Cat Type Cd, Dim - Comment Code Details.Cmtcode name] )]. (HY000).
I get no error from the consistency check. I have read everywhere that I need to set the appropriate aggregation levels in the various dimension and fact LTS properties to help OBIEE understand our model, but how do I do that? How do I decide what the aggregation level and details should be?
When I click the More button I see different options: Copy, Copy From, Get Levels, Check Level - what do these mean?
Aggregation content, group by - Logical Level or Column: which one should I choose and how should I decide?
Can anyone explain the Content tab in detail and from scratch with some examples, and why we get these errors? I know many people who are well versed with many other things related to the RPD, but not this. A little effort at explaining from you guys will really be appreciated.
    Thanks in advance,
    Dev

    Hi Deepak,
    Option 1:
    My tables in physical layer are joined as below:
    D1--> F1 <--D2--> F2 <--D3
    Same way i model it in BMM
    D1--> F1 <-- D2--> F2 <--D3
Here D1 is a non-conformed dimension for F2 and D3 is a non-conformed dimension for F1. After creating the dimension hierarchies, I tried setting up the content levels.
In the Sources > Content tab of fact F1 I set:
    Dimensions----------- Logical level
    D1---------------------- D1 Detail
    D2---------------------- D2 Detail
    D3---------------------- D3 Total
Then, in the Sources > Content tab of fact F2 I set:
    Dimensions----------- Logical level
    D1---------------------- D1 Total
    D2---------------------- D2 Detail
    D3---------------------- D3 Detail
Then I also go into all the dimensions and set their content levels to Detail, but it still gives me errors; I am not sure where I am going wrong in setting the content levels.
I need to know whether the way I have modeled it in the BMM is right.
Option 2:
I can combine the two facts into a single logical fact, or the above design should also work:
(F1&F2) <-- D1, D2, D3 joined separately using complex logical joins.
What would the Content tab details be in that case?
    Thanks,
    Dev

  • "None of the fact tables are compatible with the query request " error

    I've got a situation where I have two facts(Fact_1, Fact_2) and three dimensions(dim_1,dim_2,dim_3) in 1 subject area. I've got dimension hierarchies setup for all the dimension tables.
    Dim_1 is one to many to Fact_1
    Dim_2 is one to many to Fact_2
    Dim_3 is one to many to both Fact_1 and Fact_2
    I've set up the content levels for the LTS for the Facts so that they are the lowest grain for dimensions they join to and the grand total grain for dimensions they do not join to.
    My rpd is consistent. When I run a report using an attribute from Dim_3 and Dim_1 or Dim_3 and Dim_2, the report comes back fine.
    But if I try to run a report using all three Dim tables, I get an error and the message "None of the fact tables are compatible with the query request ".
    First of all, is it possible to make a report using all three dimensions?
Second, what's the best way to troubleshoot this error? Why are none of the fact tables compatible? I thought that as long as the aggregation levels were set to grand total for non-shared dimensions, Answers would be able to create the report properly.
Any advice would be greatly appreciated.
    Thanks!
    -Joe

OBIEE is looking for a fact that can link ALL the dimensions together. This is also known as the implicit fact. You don't have a single fact that can relate all the dimensions - you have 2 facts that only together can. Perhaps you need to create a single logical fact that has both LTSs for your physical facts and try it that way.
    Then you'd have Dim1, Dim 2, Dim3 all being able to join to Fact1 (which is made of physical facts 1 & 2).

  • How do I find out what the charges on my credit card are for? Where is the iTunes account to log into and check?

How do I find out what the charges on my credit card are for? Where is the iTunes account to sign in and check?

Launch iTunes, then select iTunes Store from the source list on the left side of the window.
    Click Sign In on the right side of the window just above Quick Links.
    Now click your account name.
    In the next window click See All to the right of Purchase History.

  • Essbase answers - None of the fact tables are compatible with the query request "member"

    Hi,
    I have modelled an Essbase database into the repository.
If I pull in the measure, period and year dimensions, filter on the year (member), and display the year (member) along with the period (alias) and the measure, it errors with:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14020] None of the fact tables are compatible with the query request Fiscal Year.Fiscal Year Code. (HY000)
However, all other things being equal, if I change the year displayed to the alias then it works.
Can anyone tell me why?
    Is there a limitation that Essbase brings through that you cannot view what you filter on?
    thanks,
    Robert.


  • Error: None of the fact tables are compatible with the query request

    Hi experts,
I have one conformed dimension D1 and two fact tables F1 and F2 (F1 and F2 are both joined to D1).
When I create a report from D1 and F1 the report runs fine. But when I also pull a column from F2 into this report I get the following error:
None of the fact tables are compatible with the query request Code Combinations
Please advise on the same.
    Regards,
    S

Hi
I have done the content level setting in each of the tables, D1, F1 and F2 (LTS), and now I am getting the following error:
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table Gl Sets Of Books) does not contain mapping for [Code Combinations.Code Combinations.Affiliate, GL Balances.GL Balances.Currency Code, GL Balances.GL Balances.PTD_Balance, Gl Sets Of Books.Gl Sets Of Books .SoB Name]. (HY000)
Gl Balances : D1
Code Combination: F1
Gl Sets Of Books : F2
I have checked the joins in the physical and BMM layers; all are fine.

  • None of the fact tables are compatible

    hi,
I am developing a report from two fact table columns and one dimension table in OBIEE 11.1.1.5.0.
I am getting this error:
    Error
    View Display Error
    Odbc driver returned an error (SQLExecDirectW).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14020] None of the fact tables are compatible with the query request Fact - Retail.Retail. (HY000)
    SQL Issued: SELECT 0 s_0, "TM Vehicle Sales"."- Offtake Facts"."Offtake" s_1, "TM Vehicle Sales"."- Retail Facts"."Retail" s_2, "TM Vehicle Sales"."Distributor"."Country" s_3 FROM "TM Vehicle Sales"
    regards
    vcm

We need to see your design - is the dim shared between the facts?
I think you can work out the physical query based on your column selection.
Now pick one column from the 1st fact and a 2nd column from the dim, run a report, get the physical query and verify the joins between OBIEE and your own query.
Then add a column from the 2nd fact and see how it works.
    Edited by: svee on Jun 29, 2012 6:21 AM

  • None of the dimension tables are compatible with the query request

    Hi,
I am experiencing the error below while querying columns alone from the employee dimension (w_employee_d) in the Workforce Profile SA. There is only one column in my report, which is employee number, coming from the employee dimension. When I query other information like job, region, location etc. I do not get any error; the error below appears only when querying columns from the employee dimension. The Content tab level for the LTS of the employee dimension is set to employee detail.
         View Display Error
    Odbc driver returned an error (SQLExecDirectW).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14077] None of the dimension tables are compatible with the query request Dim - Employee.Employee Number. (HY000)
    SQL Issued: SELECT 0 s_0, "Human Resources - Workforce Profile"."Employee Attributes"."Employee Number" s_1 FROM "Human Resources - Workforce Profile" FETCH FIRST 65001 ROWS ONLY.
I couldn't work out the exact reason. Any suggestions would be highly appreciated.
    Regards.

Hi user582149,
It is difficult to answer your question with so little detail. Could you specify:
- how many facts/dimensions are you using in the query?
- what is the structure of your Business Model?
- which version of OBI are you using?
- what does your log say?
With the information above I hope to be able to tell you more.
    Cheers

  • None of the fact tables are compatible error

    Hi All,
I see this error (none of the fact tables are compatible) after setting the content level aggregation on the dimension tables and the fact table. I get the error only when I try to pull the calculated item, which is based on an attribute in the fact table. I have an attribute like year in the fact table that I need to display as 'CY'||'2013' in a calculated logical column, and when I pull this into Answers I get this error.
1) The joins are OK; there is only one fact table and 3 dimension tables.
2) The content levels on the fact table are specified at the detail level, and also for the dimensions.
Any suggestions? Thanks for your time.

Can anyone please provide some suggestions?
> I looked at the fact table LTS and specified the logical level for each dimension as the detail level
> I specified the LTS for each dimension table
> I have a column in my fact table which is calendar year, and I want a derived column like rep_cal_year with 'CY'||cal_year - when I pull this derived column into Answers I get the error "none of the fact tables are compatible with the query".
What could be missing?

  • A way to track where all the Sub Var are applied?

Hopefully this is the right section to post in.
    Is there a way of tracking where all the substitution variables are being used in the reports?
    Either in reports or calc scripts, or in a 3rd format, if there is one.
    We just don't want to open up each script individually and scan for use of the sub var, or open each report and see if it's used.

    Hi there,
    Is this HFM or Essbase?
    Regards,
    Thanos

  • When I attempt to open iphoto, I get the loading icon in the working space/picture area.  If I try to reinstall ilife '11, will I lose any pictures?  Where, on the hard drive, are all my pictures actually stored?

    When I attempt to open iphoto, I get the loading icon in the working space/picture area.  If I try to reinstall ilife '11, will I lose any pictures?  Where, on the hard drive, are all my pictures actually stored?

    You won't lose anything because you have a back up, right?
    Most Simple Back Up
    Drag the iPhoto Library from your Pictures Folder to another Disk. This will make a copy on that disk.
    Slightly more complex:
Use an app that will do incremental back ups. This is a very good way to work. The first time you run the back up the app will make a complete copy of the Library. Thereafter it will update the back up with the changes you have made. That makes subsequent back ups much faster. Many of these apps also have scheduling capabilities, so set it up and it will do the back up automatically. Examples of such apps: Chronosync or DejaVu. But there are many others. Search on MacUpdate.
    Option 1
    Back Up and try rebuild the library: hold down the command and option (or alt) keys while launching iPhoto. Use the resulting dialogue to rebuild. Choose to Rebuild iPhoto Library Database from automatic backup.
    If that fails:
    Option 2
    Download iPhoto Library Manager and use its rebuild function. This will create a new library based on data in the albumdata.xml file. Not everything will be brought over - no slideshows, books or calendars, for instance - but it should get all your albums and keywords back.
Because this process creates an entirely new library and leaves your old one untouched, it is non-destructive, and if you're not happy with the results you can simply return to your old one.
    Regards
    TD

  • I am trying to buy my album. I have an English ID but am operating in Switzerland in the German part. When I go to buy book the instructions are in French and price in  CHF. Can anyone let me know if I can change the language to English and order in £

I am trying to buy my album. I have an English ID but am operating in the German-speaking part of Switzerland. When I go to buy the book, the instructions are in French and the price is in CHF. Can anyone let me know if I can change the language to English and order in £, using my English ID from Switzerland?
All my other systems are set to English, as is iTunes.

See the instructions here - since they are in French and German I have no idea if they will solve your problem.
    LN

  • Call the database table on commit operation and fetch the rows

    Hi
I am new to BPEL. I have a task to call a database table on a commit operation, fetch the rows, and call a web service iteratively to write the data to a file.
Can anybody help me with this?
    Thanks and regards
    Richa

    HI,
update ZYBKP set MATNR = zcore_2-ZDUMMY where matnr = zcore_2-ZDUMMY.
update ZY310 set MATNR = zcore_2-ZDUMMY where matnr = zcore_2-ZDUMMY.
update ZYBCS_LOT set MATNR = zcore_2-ZDUMMY where matnr = zcore_2-ZDUMMY.
You are comparing the same material and updating it with the same value. Also, ZCORE_2 is a database table and may not contain any value at that point.
In these statements you are effectively saying:
update table set matnr = 'MATNR' WHERE MATNR = 'MATNR'.
Even though this statement is executed, there will not be any change in the table.
    Thanks,
    Mahesh

  • Is there any command/query/etc, which would allow to understand what database objects (for example tables) are consuming memory and how much of it?

    TimesTen Release 11.2.1.9.6 (64 bit Linux/x86_64)
    Command> dssize;
    PERM_ALLOCATED_SIZE:      51200000
      PERM_IN_USE_SIZE: 45996153
    PERM_IN_USE_HIGH_WATER:   50033464
    TEMP_ALLOCATED_SIZE:      2457600
    TEMP_IN_USE_SIZE:         19680
    TEMP_IN_USE_HIGH_WATER:   26760
    Is there any command/query/etc, which would allow to understand what database objects (for example tables) are consuming memory and how much of it?
I tried to use the ttSize function, but it gives some senseless results - for example, for the biggest table, tokens, it produces the following output (claiming that this table is 90 GB in size, which physically cannot be true):
    Command> call ttsize('tokens',null,null);
    < 90885669274.0000 >
    1 row found.

    Are you able to use the command line version of ttSize instead? This splits out how much space is being used by indexes (in the Temp section of the TT memory segment), which I think is being combined into one, whole figure in the procedure version of ttSize you're using. For example:
    ttSize -tbl ia my_ttdb
    Rows = 4
    Total in-line row bytes = 17524
    Total = 17524
    Command> create index i1 on ia(a);
    ttSize -tbl ia my_ttdb;
    Rows = 4
    Total in-line row bytes = 17524
    Indexes:
    Range index JSPALMER.I1 adds 5618 bytes
      Total index bytes = 5618
    Total = 23142
    Command> call ttsize ('ia',,);
    < 23142.0000000000 >
    1 row found.
    In 11.2.2 we added the procedure ttComputeTabSizes which populates system tables with detailed table size data, and was designed to be an alternative to ttSize. Unfortunately it still doesn't calculate index usage though, and it isn't in 11.2.1.
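If you can get to 11.2.2, a minimal sketch of that newer approach is below. The call syntax and the SYS.ALL_TAB_SIZES result view are quoted from memory rather than from this thread, so treat them as assumptions and check the 11.2.2 reference documentation before relying on them:
Command> call ttComputeTabSizes('TOKENS');   -- recompute detailed size data for one table (assumed syntax)
Command> select * from sys.all_tab_sizes;    -- per-table size breakdown populated by the call above (assumed view name)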

  • Hi. After I have loaded a picture into Photoshop some of the menu tools are greyed out and I cannot use the keyboard short cut, for instance I cannot use Ctrl   T to enter free transform mode. Also the toolbar is so small I cannot see the icons without ge

Hi. After loading a picture into Photoshop some of the menu tools are greyed out and I cannot use the keyboard shortcuts; for instance I cannot use Ctrl + T for free transform. How do I resolve this problem? In addition, the tool icons are so small I have to get close to the screen to see them; is there any way the icons can be made bigger without reducing the resolution of the screen?
    Thanks
    Derek

    Are you working on a background layer? If so, Ctrl + T will not work. Press Alt and double-click your background layer to convert to a normal layer (or right-click BG layer and select Layer from Background) and try again.
    If you're working in Photoshop CC 2014 you can go to Preferences > Experimental Features and tick "Scale UI 200% for high-density displays" to make your buttons and icons larger.
