Updating Physical Cube Tables and Hierarchies OBIEE 11.1.1.7

OBIEE 11.1.1.7
I have imported an MSAS cube, modeled it and created a presentation layer. I now have some new hierarchies I need to add to the physical layer.
How do you import new Cube Tables and/or Hierarchies?
When I select the Import option it creates a new OLAP database and connection pool, and I am unable to successfully move those changes into the existing OLAP connection.
I have been through the documentation and it recommends importing over a manual process. However, I am unable to figure out how to do this with the import process.
Oracle® Fusion Middleware Metadata Repository Builder's Guide for Oracle Business Intelligence Enterprise Edition
11g Release 1 (11.1.1) Part Number E10540-05
Chapter 8 Working with Physical Tables, Cubes, and Joins
There is a section called Working with Multidimensional Sources in the Physical Layer, which says:
"Each multidimensional catalog in the data source can contain multiple physical cubes. You can import the metadata for one or more of these cubes into your Oracle BI repository. Although it is possible to create a cube table manually, it is recommended that you import metadata for cube tables and their components"
The new Utilities features are really cool, but they don't work for OLAP connections.
Any help is greatly appreciated.

Hi Michael,
it works, but it's a bit tricky. I did it with Oracle OLAP and I think it must be similar with other OLAP databases.
Rename your existing database in the Physical layer to the name the metadata import dialog would create. With Oracle OLAP the "Data Source Name" is used. Just try it once.
Unfortunately you cannot merge dimensions and cubes as you can with relational objects. So you have to delete the cubes and dimensions you are about to import from your renamed database in the Physical layer. Of course, the mapping between the physical and business model will be lost.
Now you can import your modified OLAP cubes and dimensions, and they should be placed in your renamed database in the Physical layer.
Open the sources of your logical tables in the business model and add the dimensions or cubes in the "General" tab. Move to the "Column Mapping" tab and check whether the mapping is OK. If not (which will be the case if you customized the column names), you have to redo the mapping from scratch.
I recommend using the original names from the data source. If you need another name for a column, just create a new logical column and use "Derive from existing columns". Then you can re-import OLAP metadata more quickly because the mapping is done automatically.
Hope this helps
Regards Claus

Similar Messages

  • Expand the warehouse tables vs. adding physical tables and joins

    I was asked about these options:
    a) Add physical tables (from other DBs), modifying the physical model with new joins and the logical model to include the new columns.
    b) Expand the warehouse tables to include the new columns, doing a complementary ETL to feed the additional columns with data. The logical model would be updated, but the idea of this option is to avoid joins in the physical layer.
    My understanding is that option a), despite the joins in the physical layer, would be the better strategy.
    Please, I'd appreciate any comments on the performance side or on the amount of effort to create the complementary ETL...
    Txs.
    Antonio

    Hi Lombo,
    I am not sure what you mean by option a. In my understanding, you are asking for a comparison between:
    a) adding data from an additional source in the RPD:
    This means that you have to create an additional data source in the physical layer. I do not think you can create relations between physical tables sourced from different data sources in your physical layer, which means the data is related in the logical layer. By doing this, the BI Server has to join the data from both sources for each front-end request. It will also have to perform the final aggregations itself instead of shipping them to the database. This is a big performance hit in terms of response time.
    However, it can be much, much faster in terms of the development effort required to support this. If it's only a limited number of columns and tables being added, this may be a valid option. You can also use this approach to prototype option b.
    b) adding data from an additional source using ETL:
    Quite some work in terms of development effort. However, all complexity and performance hits are moved to the bottom of the stack and process: database and ETL. You will need to create an additional data source in the DAC, additional custom folder(s) in Informatica and the ETL logic to update the existing tables and/or load new tables.
    Additionally, you need to expand the RPD with the new columns/tables, just as you would in option a. However, now they will be sourced from the same data source: less work for the BI Server to deliver the dataset that satisfies the request.
    So basically, I would consider:
    - How does the system currently perform: response times in the front end ( bad -> go for option b )
    - How many columns will be added and how often will they be used in the front-end
    - Informatica / DAC / OBI expertise available
    - How long do you have to facilitate reporting on data from the 2nd source
    Good luck!

  • How to update the VBFA table without entries in VBRK and VBRP tables

    Hello,
    I have a requirement where the sales order and delivery happen in one SAP system, say system X, and the billing document is created in another SAP system, say Y.
    Now in this particular case, if the user wants to know the billing document number in system X, what would be the solution?
    Is there any way to update the VBFA table with the invoice number from system Y (without updating the VBRK and VBRP tables)?
    The user wants to see the billing document number from the sales order as document flow in system X.
    Thanks in advance
    Pradeep

    Hi,
    Updating VBFA without updating the VBRK/VBRP tables is not possible, so you have to use a workaround: create a Z table, and whenever system Y creates a billing document with reference to system X data, update the Z table with the invoice number created in system Y. The primary key of the Z table should be the delivery document number, so you can track it one-to-one with the billing document.
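    Schematically (a plain SQL sketch only; in practice the Z table would be defined in SE11 with Data Dictionary types, and the field names here are made up), the structure could look like this:
    CREATE TABLE ztsd_billing_xref
    ( vbeln_delivery CHAR(10) NOT NULL,  -- delivery document number from system X (primary key)
      vbeln_invoice  CHAR(10),           -- billing document number created in system Y
      erdat          DATE,               -- date the entry was written
      PRIMARY KEY (vbeln_delivery) );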
    Thanks and Regards
    Shambhu Sarkar

  • Z field addition in MM01 and updation of MARA table

    Hi ABAPers
    I have created a Z field on the Basic Data 1 tab of transaction MM01 as an input/output-enabled field of length 1 and type CHAR.
      Screen: SAPLZMGD1
        Number: 2313
    I also added the same field to the MARA table (append structure).
    When I enter a value for this field on the MM02 screen, it gets updated in the MARA table and shows on the MM02/MM03 screens as well.
    My requirement is to update this field through BAPI_MATERIAL_SAVEDATA in a Z program.
    To update using this BAPI, I have also added this field to the BAPI_TE_MARA and BAPI_TE_MARAX structures and am passing these values in the EXTENSIONIN and EXTENSIONINX tables.
    But it is not getting updated when I supply this value to the BAPI, whereas other Z fields are getting populated in the MARA table.
    So please let me know where I am going wrong, or whether there are any settings I missed while creating the screen field.
    Your quick response is highly appreciated
    Thanks
    Satya

    Hi Reddy
    Thanks for your reply
    I did as you mentioned .
    The field that I created is not getting populated and the other Z field on the same screen and  same tab is getting populated correctly.
    YYXPTY  CHAR(4)
    YYHISIND CHAR(1)
    Here the YYXPTY field is getting populated in the MARA table correctly, but not YYHISIND, which is the field I created.
    The code that I have written is below. [Please help me out]
    REPORT  ztest_yyxpty.
    DATA: wa_bapimara TYPE bapimathead,
          t_extn  TYPE STANDARD TABLE OF bapiparex,
          wa_extn TYPE bapiparex,
          t_extnx  TYPE STANDARD TABLE OF bapiparexx,
          wa_extnx TYPE bapiparexx,
          wa_return TYPE bapiret2,
          t_rettab  TYPE STANDARD TABLE OF bapi_matreturn2,
          wa_makt TYPE bapi_makt,
          t_makt TYPE STANDARD TABLE OF bapi_makt.
    wa_bapimara-material = '500000000000088999'.
    wa_bapimara-basic_view = 'X'.
    wa_extn-structure = 'BAPI_TE_MARA'.
    wa_extn-valuepart1+0(18) = '500000000000088999'.  "Material number
    wa_extn-valuepart1+65(4) = '0002'.       " field YYXPTY
    wa_extn-valuepart1+69(1) = 'A'.            " field YYHISIND
    wa_extnx-structure = 'BAPI_TE_MARAX'.
    wa_extnx-valuepart1+0(18) = '500000000000088999'.
    wa_extnx-valuepart1+27(1) = 'X'.         " update flag for YYXPTY
    wa_extnx-valuepart1+28(1) = 'X'.         " update flag for YYHISIND
    APPEND wa_extn TO t_extn.
    APPEND wa_extnx TO t_extnx.
    CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
      EXPORTING
        headdata            = wa_bapimara
      IMPORTING
        return              = wa_return
      TABLES
        materialdescription = t_makt
        returnmessages      = t_rettab
        extensionin         = t_extn
        extensioninx        = t_extnx.
    IF wa_return-type = 'S'.
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          wait = 'X'.
    ENDIF.
    WRITE: wa_return-type, wa_return-message, wa_return-message_v1,
    wa_return-message_v2, wa_return-message_v3, wa_return-message_v4.

  • Problem creating hierarchy based on 2 physical dimension tables

    I'm having a problem creating 1 logical dimension with a drill-down hierarchy, based on two separate physical dimension tables. The errors I receive when navigating the drill-down hierarchy are:
    "Cannot find logical table source coverage for logical columns" &
    "Missing join between logical tables".
    I'm using OBIEE 10.1.3.4
    Here are the details of what I have set up so far:
    Physical layer:
    Dimension table DIM_ORG with columns:
    -dimension_key
    -org_total_code
    -org_total_description
    -org_detail_code
    -org_detail_description
    Dimension table DIM_DEPT with columns:
    -dimension_key
    -dept_total_code
    -dept_total_description
    -dept_detail_code
    -dept_detail_description
    Fact table FACT_SALES with columns:
    -fk_org
    -fk_dept
    -sum_sales
    Physical Joins:
    FACT_SALES.fk_org = DIM_ORG.dimension_key
    FACT_SALES.fk_dept = DIM_DEPT.dimension_key
    Business Model & Mapping layer:
    I created a logical dimension ORG_DEPT. It contains two logical table sources (DIM_ORG & DIM_DEPT) and the following logical columns:
    - All Departments (mapped to dept_total_code)
    - Organisation (mapped to org_detail_description)
    - Organisation Number (mapped to org_detail_code)
    - Department (mapped to dept_detail_description)
    - Department Code (mapped to dept_detail_code)
    The business logical key is based on the combination of Organisation Number & Department Code
    The hierarchy I need is: All Departments -> Organisation -> Department so I created the following hierarchy for ORG_DEPT:
    - Total Level containing: All Departments
    - Organisation Level containing: Organisation Number (defined as the Logical level key) & Organisation (defined as the Drill level key)
    - Detail Department Level containing: Department Code (defined as the Logical level key) and Department (defined as the Drill level key).
    In the LTS of the dimension ORG_DEPT I've set the Content levels for the sources:
    DIM_ORG : Organisation Level
    DIM_DEPT: Detail Department Level
    In the LTS, no inner joins have been added against related physical tables.
    I created a logical fact table SALES (based on the physical fact table) and joined it against the logical dimension table ORG_DEPT.
    In the LTS, the Content level for ORG_DEPT is set to the Detail Department Level. No inner joins have been added against related physical tables.
    When I create a report in Answers to test the hierarchy and select only 'All Departments', I get the correct dimension value returned. When I try to drill to the next level I get the following ODBC error:
    "Cannot find logical table source coverage for logical columns: [All Departments]. Please check more detailed level keys are mapped correctly".
    When I create a report in Answers and select both 'All Departments' and 'Sales' I get the correct result. When I try to drill to the next level I get a different ODBC error:
    "Missing join between logical tables DIM_DEPT and DIM_DEPT: There must be at least one physical join link between the underlying physical tables".
    Any suggestions are welcome!
    Thanks!

    Hello Robert,
    Your suggestions were known to me, but I still wanted to combine the two physical dimension tables in one logical dimension. So I've played around a bit more and found the solution: in my original setup I had two separate logical table sources (one for each physical dimension table). The solution was to combine the two into one logical table source. I achieved that by joining DIM_DEPT to FACT_SALES and subsequently to DIM_ORG within the single LTS, using inner joins.
    Then I created the logical table key (a combination of org_detail_code & dept_detail_code). After that I could create the hierarchy with no problem.
    Edited by: The_Dutchman on Nov 4, 2011 9:43 PM

  • Setup event table cache clear for Essbase cube tables

    We have an event polling table for clearing the cache, and are now having issues setting it up for Essbase physical cube tables.
    1. We have the following Hyperion structure for the physical cube table "CONSOL", where the schema is empty:
    "Hyperion"."CONSOLX".."CONSOL"
    2. The event polling table is in a different database/connection.
    When the event table is populated without a schema, polling actually looks for the table in the database configured for the event polling connection pool, and not in "Hyperion":
    *[55001] The physical table Hyperion:CONSOLX:Event Polling:CONSOL in a cache polled row does not exist.*
    When I read some blogs, I understood it is working as expected (when something is null, it looks in the event polling database).
    Has anybody configured event polling to clear the cache for Hyperion cube tables?
    Thanks
    Ayaps

    What is your insert statement into the UET table?
    It looks like you are missing one of the four required values; that's why it is substituting the event polling database in between. As you have the event polling table in a different physical database, you have to fill DbName, CatalogName, SchemaName and TableName in the UET table.
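    Purely as an illustration (assuming the standard S_NQ_EPT event polling table layout; the name values below are placeholders and must match the database, catalog, schema and table names exactly as they appear in the Physical layer), the insert could look like this:
    INSERT INTO s_nq_ept
        (update_type, update_ts, database_name, catalog_name, schema_name, table_name, other_reserved)
    VALUES
        (1, CURRENT_TIMESTAMP, 'Hyperion', 'CONSOLX', 'MySchema', 'CONSOL', NULL);
    COMMIT;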

  • Updating A child Table As Administrator

    Hello everyone, I read a post here http://forums.adobe.com/thread/463635, and I first off want to commend you guys on your awesome contribution to novices like me. I understand the technique presented there, but I have a different problem. I have two tables related one-to-many: a Personals table and a Courses table. One person can have many courses. Only the admin can update and add new courses. The Courses table has a foreign key: personal_Id. The admin has a view page and can update the Personals table easily. How can I add and update courses for a selected person? I mean, how do I build the recordset in Dreamweaver CS5 so that I can pick just that person's ID and add courses just for his account? In the technical sense, how do I pass the FK personal_Id so that the add-course or update-course form only stores the course against that person's personal_Id? I hope you understand how I have explained it, and I would love to know this procedure. Thank you very much.

    You don't really want to create a recordset to update the tables. You don't mention what scripting language you are using. In ASP, you CAN  create updateable recordsets, but it's really not necessary in this circumstance. You simply need to create an update statement to update the parent table, and an insert statement to insert the record into the related table when the form is posted. You can use DW's server behaviors to accomplish one or the other if you want - but not both. If I am performing a CUD on multiple tables, I would think about writing a stored procedure. Again, it depends on your DBMS. I'd suggest looking into DW Insert record and update record server behaviors to start with.
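    Purely as an illustration (the table and column names below are hypothetical, and the parameter syntax depends on your scripting language and DBMS), the two statements the posted form would run could look like this, with the selected person's ID passed in from the page:
    -- update the parent (Personals) record for the selected person
    UPDATE personals
    SET    first_name = ?,
           last_name  = ?
    WHERE  personal_id = ?;
    -- insert a child (Courses) record keyed to that same person via the foreign key
    INSERT INTO courses (personal_id, course_name)
    VALUES (?, ?);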

  • Updating a mutating table

    I have an after-row trigger on a table which extracts data from another database through a database link. On insert/update of a row it updates the requisite tables in the database on which it is created. An exception is returned if the rows are not inserted for any reason, and this is to be recorded in a column of the triggering table. Can you write a stored procedure to update the triggering table and avoid the mutating-table problem?

    If you just want to update the triggering row with some specific value, then simply assign
    :new.columnname := 'some value';
    If you need to identify other rows in the same table, it's more complicated because the data is in a state of flux. Potentially there are other triggers still to fire, in an unknown order, that could change the results of any query.
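    To illustrate the simple case above with a minimal sketch (the table, column and remote object names are made up, and error handling is reduced to setting a flag), note that assigning to :new only works in a BEFORE each-row trigger, not an AFTER one:
    CREATE OR REPLACE TRIGGER trg_mytable_remote_sync
    BEFORE INSERT OR UPDATE ON mytable
    FOR EACH ROW
    BEGIN
      BEGIN
        -- push the row to the other database over the database link
        INSERT INTO remote_table@remote_db (id, val)
        VALUES (:new.id, :new.val);
      EXCEPTION
        WHEN OTHERS THEN
          -- record the failure in the triggering row itself; since we only touch
          -- :new and never query or update the triggering table, no mutating-table
          -- error can occur
          :new.error_flag := 'E';
      END;
    END;
    /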

  • Need to update a column in a table by joining two tables

    Hi
    I want to update a table by joining two tables, but I am getting the error below:
    single row subquery returns more than one value.
    I am using the following query:
    update table2
    set deposite_date=(select a.deposit_date from table1 a,table2 b
    where a.ban=b.ban and (a.deposit_date between b.cy_start_date and b.due_date))
    table2
    ban due_date cy_start_date deposite_date
    100 5/6/2011 22/05/2011
    101 7/11/2011 22/05/2011
    102 5/25/2011 22/05/2011
    table1
    ban      deposit_date
    100      04/12/2011
    100      04/15/2011
    102      03/25/2011
    101      04/26/2011
    101      05/06/2011
    101      07/26/2011
    Can I insert the latest deposit_date from table1 which lies between b.cy_start_date and b.due_date?
    Thanks

    Hi,
    Welcome to the forum!
    876466 wrote:
    "i want to update a table by joining two table but i m getting the below error: single row subquery returns more than one value."
    Exactly!
    (   SELECT  a.deposit_date
        FROM    ...
    returns all deposit_dates.
    (   SELECT  MAX (a.deposit_date)
        FROM    ...
    would return only the latest one.
    Also, you need to correlate the sub-query to the main query, so that it only looks for rows with the same ban.
    "can i insert latest deposit_date from table1 which lies between b.cy_start_date and b.due_date"
    Perhaps you need something like this:
    UPDATE  table2  b
    SET     deposite_date =
            (   SELECT  MAX (a.deposit_date)
                FROM    table1  a
                WHERE   a.ban = b.ban
                AND     a.deposit_date BETWEEN b.cy_start_date
                                           AND b.due_date
            );
    Notice that table1 is the only table in the FROM clause of the sub-query, but the sub-query is correlated to table2, so we can reference columns from table2 in the sub-query.
    Whenever you have a problem, post CREATE TABLE and INSERT statements for your sample data.
    Also post the results you want from that data, and an explanation of how you get those results from that data, with specific examples.
    If you're asking about a DML statement, such as UPDATE, the CREATE TABLE and INSERT statements should re-create the tables as they are before the DML, and the results will be the contents of the changed table(s) when everything is finished.
    Always say which version of Oracle you're using.
    Edited by: Frank Kulash on Aug 1, 2011 1:58 PM

  • OBIEE bypasses smaller aggregate table and queries largest aggregate table

    Hello,
    Currently we are experiencing something strange regarding queries that are generated.
    Scenario:
    We have 1 detail table and 3 aggregate tables in the RPD. For this scenario I will only refer to 2 of the Aggregates.
    Aggregate 1 (1 million rows):
    Contains data - Division, Sales Rep, Month, Sales
    Aggregate 2 (13 million rows):
    Contains data - Division, Product, Month, Sales
    Both tables are set at the appropriate dimension levels in the Business Model. Row counts have been updated in the physical layer in the RPD.
    When we create an Answers query that contains Division, Month and Sales, one would think that OBIEE would query the smaller and faster of the two tables. However, OBIEE wants to query the table with 13 million records, completely bypassing the smaller table. If we make the larger aggregate inactive, then OBIEE queries the smaller table. We can't figure out why OBIEE wants to immediately go to the larger table.
    Has anyone experienced something such as this? Any help would be greatly appreciated.
    Edited by: gwb on Aug 19, 2009 7:45 AM

    Have you tried changing the sort order of the logical table sources in your logical table?
    (See: http://gerardnico.com/wiki/_media/temp/obiee_logical_table_sources_sort.jpg)
    Set Aggregate 1 first.
    Cheers
    Nico

  • Automatically updating table column in OBIEE

    Hi folks,
    I have a requirement where I have to update a table column based on a condition, but this update should happen on a daily basis (I mean it should be scheduled once daily): check whether any records in that column meet the condition, then update them.
    Example: if column = 'Pending' then update it to 'Approved'. As transaction records come into the table daily, OBIEE should check and update the rows so that next time it doesn't show them as 'Pending'.
    I have thought of 2 approaches for this
    1) Through write-back it is possible to update, but only manually (the issue is that it can't be automated or scheduled on a daily basis, nor can it update multiple records in one go).
    2) Writing a stored procedure or function for the update; but how can I schedule this procedure to run once daily, and how would I use it?
    Any thoughts on the above two points, or any other method you guys can think of?
    Any help is appreciated !
    Cheers,
    KK

    Kranthi,
    This should be easy...
    - Create a Direct Database Request with an UPDATE statement,
    e.g.: UPDATE table SET col = 'Approved' WHERE col = 'Pending'
    - Create an iBot using the above DDR and schedule it on a daily basis.
    This will update the full table with the new value on a daily basis, based on the condition.

  • Are fact tables and cubes the same in OWB?

    Dear all
    A simple question. How can I create a star schema (that is, with a fact table and dimensions) using OWB?
    OWB has options to create cubes, but as per my understanding a cube is not a fact table.
    A cube contains pre-computed data, whereas a fact table contains normal data with references to dimensions.
    Please correct me if I am wrong.
    thanks in advance

    These are just different levels of abstraction.
    "Cube" is the highest level of abstraction referring to the overall package of data.
    "Star schema" is how cubes are modelled showing the relationships from a fact entity to the dimension entities.
    Relational and OLAP are different methods of physical implementation.
    In OWB, to promote sharing of dimensions across cubes to avoid inconsistency the idea is you define and build the dimensions independently. Then you define the "cubes" as measures and references to the dimensions. When you build the "cube" you pass in business identifiers from the source data which OWB will use to link the measures to the applicable dimension data. Due to the wonders of inner joins anyone reading the "cube" will only see dimension data related to the data in that cube!
    Using OWB you do not need to be concerned with the physical implementation when you use the dimension and cube operators as those operators know what to do.

  • Do dimension tables and fact tables physically contain data?

    Hi experts,
    Can anyone please tell me whether dimension tables and fact tables physically contain data or not?

    Hi Sudheer,
    SAP's BW is based on the "enhanced star schema" or "InfoCube" database design. This design has a central database table, known as the 'fact table', which is surrounded by associated dimension tables.
    The fact table is usually very large; it can contain millions to billions of records.
    The dimension tables themselves don't contain the master data; they contain references to pointer tables that point to the master data tables, which in turn contain master data objects such as customer, material and destination country, stored in BW as InfoObjects. An InfoObject can be a single field definition, such as transaction data, or complex customer master data holding attributes, hierarchies and texts that are stored in their own tables.
    An SID is a surrogate ID generated by the system. The SID tables are created when we create a master data InfoObject. In the SAP BW star schema, a distinction is made between two self-contained areas: the InfoCube, and the master data tables/SID tables.
    The master data doesn't reside in the star schema but in separate tables which are shared across all the star schemas in SAP BW. A numeric ID is generated which connects the dimension tables of the InfoCube to the master data tables.
    The dimension tables contain the DIM ID and the SID of a particular InfoObject. Using this SID, the attributes and texts of a master data InfoObject are accessed.
    The SID table is connected to the associated master data tables via the characteristic key.
    Fact table (transaction data, DIM ID) <-> Dimension table (SID and DIM ID) <-> Master data table (SID, InfoObject)
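    To make that join chain concrete, here is a purely illustrative SQL sketch; the table and column names are made-up stand-ins for the system-generated fact, dimension, SID and master data tables, not actual BW objects:
    SELECT f.sales_amount,
           m.customer_name
    FROM   fact_table   f
    JOIN   dim_table    d ON d.dim_id       = f.dim_id        -- DIM ID links the fact row to the dimension row
    JOIN   sid_table    s ON s.sid          = d.customer_sid  -- SID links the dimension row to the master data area
    JOIN   master_data  m ON m.customer_key = s.customer_key; -- characteristic key links the SID table to the master data table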
    Thanks,
    Abha

  • Update records from a table in correct sequence that look from 2 tables and loop

    Hi!
    My question title is kinda unclear but here how it goes.
    I created  2 tables for my BOM (Bill of Materials). I have BOM_Header and BOM_Detail. 
    Below are my sample data. 
    BOM_Header
    Product_ID Int
    Product_Name Varchar(50)
    Cost Numeric(18,2)
    Yield Varchar(50)
    Select * from BOM_Header
    1 Choco Cake 850.00 10
    2 Mixed Flour 700.00 30
    3 Choco Syrup 160.00 10
    4 Egg Formula 2150.00 20
    BOM_Detail
    Product_ID int
    ItemNo Int
    ItemName varchar(50)
    Quantity int
    Unit varchar(50)
    ProdCost numeric(18,2)
    Select * from BOM_Detail
    1 2 Mixed Flour 10 Grams 15.00
    1 3 Choco Syrup 20 ML 25.00
    1 4 Egg Formula 20 Grams 10.00
    2 101 Flour 5 packs 80.00
    2 4 Egg Formula 5 Grams 60.00
    3 201 Cocoa Powder 2 kg 20.00
    3 202 Sugar 2 kg 60.00
    4 301 Whole Egg 10 pcs 85.00
    4 302 EP12 Formula 25 ml 52.00
    My computation is below.
    BOM_Header = a
    BOM_Detail = b
    a.Cost = SUM(b.Quantity x b.ProdCost) where a.Product_ID = b.Product_ID
    My problem is how I can automatically compute the food cost in sequence, from raw materials up to finished products.
    Based on the data, I need to compute the Egg Formula first because it is used as a component of Mixed Flour, then compute Mixed Flour and the other components to get the cost of Choco Cake.
    How can I do this automatically in a query that first checks the detail to see whether there are ingredients within sub-ingredients, and computes those before computing the final cost of the product?
    This is because the cost of ingredients changes most of the time, and I need to recalculate to get the most up-to-date cost.
    Any suggestion is very much appreciated.
    Thank you very much,
    Regem

    >> My question title is kinda unclear but here how it goes. <<
    Then your answers will be unclear, too :(  
    You do not know data modeling, so your data types are wrong. What math are you doing with the product_id? None. This is why identifiers are char(n) and not numeric. They should be industry standards if possible. I like the GTIN.
    You do not even know that rows are not records. 
    Why is the product name fifty characters? Certainly not research! But if you are an ACCESS programmer using a default value, then you might do this.
    Besides violating ISO-11179 rules, “<vague>_field” makes no sense! It is a quantity that you put in a string.
    CREATE TABLE BOM_Header
    (gtin CHAR(15) NOT NULL PRIMARY KEY,
     product_name VARCHAR(25) NOT NULL,
     unit_cost DECIMAL(18,2) NOT NULL
       CHECK (unit_cost >= 0.00),
     yield_qty INTEGER NOT NULL
       CHECK (yield_qty >= 0));
    >> Any suggestion is very much appreciated. <<
    Get a copy of my book on TREES in SQL and read the chapter on BOM problems. I am not going to try to post a whole chapter and diagrams to answer this. You are doing the wrong things and have done them poorly.
    --CELKO-- Books in the Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice / Data, Measurements and Standards in SQL / SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking in Sets / Trees and Hierarchies in SQL
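    For what it's worth, here is one common approach sketched as a recursive CTE (SQL Server-style syntax is assumed, matching the data types in the question); it is illustrative only: it keeps the original table and column names, ignores the Yield column, and assumes ProdCost on a leaf line is the per-unit cost of a raw material. It explodes every product down to its raw-material lines, multiplying quantities along the way, and then sums the cost per product:
    WITH bom_explode (root_product_id, item_no, qty, prod_cost) AS
    (
        -- anchor member: the direct components of every product
        SELECT d.Product_ID,
               d.ItemNo,
               CAST(d.Quantity AS DECIMAL(18,4)),
               d.ProdCost
        FROM   BOM_Detail d
        UNION ALL
        -- recursive member: a component that is itself a product expands
        -- into that product's own components, multiplying quantities down
        SELECT e.root_product_id,
               d.ItemNo,
               CAST(e.qty * d.Quantity AS DECIMAL(18,4)),
               d.ProdCost
        FROM   bom_explode e
        JOIN   BOM_Detail  d ON d.Product_ID = e.item_no
    )
    SELECT   e.root_product_id,
             SUM(e.qty * e.prod_cost) AS rolled_up_cost
    FROM     bom_explode e
    WHERE    NOT EXISTS (SELECT 1 FROM BOM_Detail d2          -- keep only leaf
                         WHERE  d2.Product_ID = e.item_no)    -- (raw-material) lines
    GROUP BY e.root_product_id;
    Because ingredient costs change often, a query along these lines (or a variant that also scales sub-assembly costs by BOM_Header.Yield) can be re-run whenever BOM_Detail changes, or used to refresh BOM_Header.Cost with an UPDATE.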

  • How can I update data in okc_k_items and csi_item_instances tables?

    How can I update data in the okc_k_items and csi_item_instances tables via EBS?
    Thanks.

    For csi_item_instances table, you can use the following API:
    CSI_ITEM_INSTANCE_PUB.UPDATE_ITEM_INSTANCE
    For okc_k_items, try using the following API:
    OKC_CONTRACT_ITEM_PUB.UPDATE_CONTRACT_ITEM
    Hope this helps!
