Partition a table after loading data

Hi there,
Is it possible to partition a table after loading the data in OWB?
Thanks

Yes, it is. You would add partition information (type, key, storage) in the OWB Configuration menu of your table. There is more on this in the User's Guide, Chapter 5, "Creating Indexes and Partitions". Next, when you deploy the changed table, Warehouse Builder detects that the table was previously deployed and offers the default action called 'Upgrade'. That action performs all the necessary steps for you without any manual work: RENAME the existing table, CREATE the new table with partitions, INSERT the data from the old table into the new one, DROP the old table. Throughout this operation you have the added benefit of automatic recovery if anything goes wrong.
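As a rough sketch, the steps the 'Upgrade' action performs amount to something like the following (table, column and partition names here are made up for illustration, not what OWB generates):

```sql
-- Hypothetical illustration of the rename/create/insert/drop sequence.
ALTER TABLE sales RENAME TO sales_old;

CREATE TABLE sales (
  sale_id   NUMBER(10),
  sale_date DATE,
  amount    NUMBER(12,2)
)
PARTITION BY RANGE (sale_date) (
  PARTITION p2007 VALUES LESS THAN (TO_DATE('01-JAN-2008','DD-MON-YYYY')),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

INSERT /*+ APPEND */ INTO sales SELECT * FROM sales_old;
COMMIT;

DROP TABLE sales_old;
```

The advantage of letting OWB drive this is the automatic recovery mentioned above: if any step fails, it can roll the operation back rather than leaving you with a half-migrated table.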
Nikolai

Similar Messages

  • Why should we create index on the table after inserting data?

    Please tell me the reason why we should create an index on the table after inserting data,
    when we can also create the index on the table before inserting the data.

    The choice depends on a number of factors, the main one being how many rows are going to be inserted into the table as a percentage of the existing rows, i.e. the percentage growth.
    Creating the index after the table has been populated works better when the tables are large or the inserts are large, for the following reasons:
    1. The sort and creation of the index is more efficient when done in batch and written in bulk, so it works faster.
    2. When the index is being written, blocks get acquired as more data gets written. So when a large number of rows get inserted into a table that already has an index, the index data blocks start splitting/chaining. This increases the "depth" of the inverted b-tree and makes the index less efficient on I/O. Creating the index after the data has been inserted allows Oracle to create an optimal block distribution and reduce splitting/chaining.
    3. If an index exists, then it too is routed through the undo/redo processes. That's an overhead which is avoided when you create the index after populating the table.
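    The load-then-index pattern described above looks roughly like this (table and index names are hypothetical; NOLOGGING and PARALLEL are optional speed-ups, so check they suit your recovery and resource requirements first):

    ```sql
    -- Bulk-load first, then build the index in one batch operation.
    INSERT /*+ APPEND */ INTO orders SELECT * FROM orders_staging;
    COMMIT;

    CREATE INDEX orders_cust_idx ON orders (customer_id)
      NOLOGGING PARALLEL 4;

    -- Optionally reset the index to default behaviour afterwards:
    ALTER INDEX orders_cust_idx LOGGING NOPARALLEL;
    ```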
    Regards

  • Lock tables when loading data

    Is there any way to lock tables when I insert data with SQL*Loader, or does Oracle do it for me automatically?
    How can I do this?
    Thanks a lot for your help

    "Are there any problems if, in the middle of my load (and commits), a user updates or queries data?"
    The only problem that I see is that you may run short of undo space (rollback segment space) if your undo space is limited and the user is running a long SELECT query, for example: but this would only trigger an ORA-1555 for the SELECT query or (less likely, since you have several COMMITs) an ORA-16XX because the load transaction could not find enough undo space.
    Data is not visible to other sessions unless the session which is loading the data commits it. That's the way Oracle handles the read committed isolation level for transactions.
    See http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96524/c21cnsis.htm#2689
    "Or what happens if, when I want to insert data, someone has the table busy?"
    You will get blocked if you try to insert data that has the same primary key as a row being inserted by a concurrent transaction.
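    If you do want an explicit lock for the duration of a manual load (normally unnecessary, since Oracle's row-level locking and read consistency already handle concurrent readers), a minimal sketch would be (table name is hypothetical):

    ```sql
    -- Explicitly lock the table before a manual load; the lock is
    -- released at COMMIT or ROLLBACK.
    LOCK TABLE target_tab IN EXCLUSIVE MODE;
    -- ... perform inserts ...
    COMMIT;  -- releases the lock
    ```

    Note also that a SQL*Loader direct path load (direct=true) takes an exclusive table lock for you during the load, whereas a conventional path load uses ordinary row locking.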

  • Partitioning A table on a data column

    Hi,
    My requirement is to partition a table using a column with DATE datatype.
    Example:
    CREATE TABLE parttab (
    STATUS CHAR(1) DEFAULT 'a' NOT NULL,
    UPDATER VARCHAR2(55) NOT NULL,
    UPDTIME DATE DEFAULT SYSDATE NOT NULL
    ) TABLESPACE part_tbs
    Now i want to RANGE partition the above table on the "UPDTIME" column, so that it will look like:
    Partition Value from Value till
    P1 01-Jan 31-Mar
    P2 01-Apr 30-Jun
    P3 01-Jul 30-Sep
    P4 MAXVALUE
    I tried using the below syntax, but failed with error.
    CREATE TABLE parttab (
    STATUS CHAR(1) DEFAULT 'a' NOT NULL,
    UPDATER VARCHAR2(55) NOT NULL,
    UPDTIME DATE DEFAULT SYSDATE NOT NULL
    ) TABLESPACE part_tbs
    LOGGING
    PARTITION BY RANGE (UPDTIME) (
    PARTITION P1 VALUES LESS THAN (to_date('04','MM')) LOGGING COMPRESS TABLESPACE part_tbs_P1,
    PARTITION P2 VALUES LESS THAN (to_date('07','DD-MM')) LOGGING COMPRESS TABLESPACE part_tbs_P2,
    PARTITION P3 VALUES LESS THAN (to_date('10','DD-MM')) LOGGING COMPRESS TABLESPACE part_tbs_P3,
    PARTITION P4 VALUES LESS THAN (MAXVALUE) LOGGING COMPRESS TABLESPACE part_tbs_P4
    )
    COMPRESS
    NOCACHE
    PARALLEL ( DEGREE 2 INSTANCES DEFAULT )
    ROWDEPENDENCIES
    NOMONITORING
    ENABLE ROW MOVEMENT;
    Please suggest.
    Thanks in advance for the suggestions.

    What is the error message, and what is your Oracle version? This sample may assist, I guess:
    CREATE TABLE range_part (
    prof_history_id NUMBER(10),
    person_id       NUMBER(10) NOT NULL,
    organization_id NUMBER(10) NOT NULL,
    record_date     DATE NOT NULL,
    ph_comments     VARCHAR2(200))
    PARTITION BY RANGE (record_date) (
    PARTITION yr0 VALUES LESS THAN (TO_DATE('01-JAN-2000','DD-MON-YYYY'))
    TABLESPACE part1,
    PARTITION yr1 VALUES LESS THAN (TO_DATE('01-JAN-2001','DD-MON-YYYY'))
    TABLESPACE part2,
    PARTITION yr2 VALUES LESS THAN (TO_DATE('01-JAN-2002','DD-MON-YYYY'))
    TABLESPACE part3,
    PARTITION yr9 VALUES LESS THAN (MAXVALUE) TABLESPACE part4);
    For more examples you may visit http://psoug.org/reference/partitions.html
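    Applying the same pattern to the table from the question, a hedged guess at a working version would be the following (the year is chosen arbitrarily for illustration; the likely cause of the original error is that TO_DATE('04','MM') is not a fully specified date, which Oracle rejects in a partition bound):

    ```sql
    -- Rework of the original parttab DDL with fully specified bounds.
    CREATE TABLE parttab (
      status  CHAR(1) DEFAULT 'a' NOT NULL,
      updater VARCHAR2(55) NOT NULL,
      updtime DATE DEFAULT SYSDATE NOT NULL
    )
    TABLESPACE part_tbs
    PARTITION BY RANGE (updtime) (
      PARTITION p1 VALUES LESS THAN (TO_DATE('01-APR-2011','DD-MON-YYYY')) TABLESPACE part_tbs_p1,
      PARTITION p2 VALUES LESS THAN (TO_DATE('01-JUL-2011','DD-MON-YYYY')) TABLESPACE part_tbs_p2,
      PARTITION p3 VALUES LESS THAN (TO_DATE('01-OCT-2011','DD-MON-YYYY')) TABLESPACE part_tbs_p3,
      PARTITION p4 VALUES LESS THAN (MAXVALUE) TABLESPACE part_tbs_p4
    );
    ```

    Note that range partition bounds are absolute dates, so a "quarter regardless of year" scheme as sketched in the question is not possible with plain range partitioning; each year's quarters need their own partitions.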

  • R12 interface tables for loading data from a 3rd party payroll system

    Hi All,
    Can anyone help me with a list and detailed technical information of the available interface tables in Oracle R12 for importing/loading data from a third-party payroll system? And what would be the best way of importing the data? Should it be loaded first to AP and then to GL, or loaded directly to GL?
    Any help much appreciated. Thanks.
    Cyrus

    Hi Cyrus,
    Can you please let us know your business requirements for this integration, i.e. what the business wants to achieve out of it?
    It depends on your requirements: if you only need to send accounting information from the payroll system to Oracle GL, then you can integrate the payroll system to Oracle GL directly, by sending the accounting information from your payroll. If your requirement is to create payroll-related invoices in AP, make the payments in Oracle AP, and then pass the accounting information to GL, then integrate your payroll to AP.
    Regards,
    Madhav

  • Range partition the table ( containing huge data ) by month

    There is a table with a huge amount of data, around 9 GB. This needs to be range partitioned by month
    to improve performance.
    Can anyone suggest the best option to implement partitioning for this?

    I have a lot of tables like this. My main tip is to never assign 'MAXVALUE' for your last partition, because it will give you major headaches when you need to add a partition for a future month.
    Here is an example of one of my tables. Lots of columns are omitted, but this is enough to illustrate the partitioning.
    CREATE TABLE "TSER"."TERM_DEPOSITS"
    ( "IDENTITY_CODE" NUMBER(10),
    "ID_NUMBER" NUMBER(25),
    "GL_ACCOUNT_ID" NUMBER(14) NOT NULL,
    "ORG_UNIT_ID" NUMBER(14) NOT NULL,
    "COMMON_COA_ID" NUMBER(14) NOT NULL,
    "AS_OF_DATE" DATE,
    "ISO_CURRENCY_CD" VARCHAR2(15) DEFAULT 'USD' NOT NULL,
    "IDENTITY_CODE_CHG" NUMBER(10),
    CONSTRAINT "TERM_DEPOSITS"
    PRIMARY KEY ("IDENTITY_CODE", "ID_NUMBER", "AS_OF_DATE") VALIDATE )
    TABLESPACE "DATA_TS" PCTFREE 10 INITRANS 1 MAXTRANS 255
    STORAGE ( INITIAL 0K BUFFER_POOL DEFAULT)
    LOGGING
    PARTITION BY RANGE ("AS_OF_DATE")
    (PARTITION "P2008_06" VALUES LESS THAN (TO_DATE('2008-07-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN'))
    TABLESPACE "DATA_TS_PART1" PCTFREE 10 INITRANS 1 MAXTRANS 255
    STORAGE ( INITIAL 1024K BUFFER_POOL DEFAULT) LOGGING NOCOMPRESS,
    PARTITION "P2008_07" VALUES LESS THAN (TO_DATE('2008-08-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN'))
    TABLESPACE "DATA_TS_PART2" PCTFREE 10 INITRANS 1 MAXTRANS 255
    STORAGE ( INITIAL 1024K BUFFER_POOL DEFAULT) LOGGING NOCOMPRESS )
    PARALLEL 3
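    Following the tip above about avoiding MAXVALUE: with no MAXVALUE partition in the way, adding the next month when you need it is straightforward. A sketch, using names that follow the example (the new partition name and tablespace are hypothetical):

    ```sql
    -- Add a partition for the next month to the range-partitioned table.
    ALTER TABLE "TSER"."TERM_DEPOSITS"
      ADD PARTITION "P2008_08"
      VALUES LESS THAN (TO_DATE('2008-09-01', 'YYYY-MM-DD'))
      TABLESPACE "DATA_TS_PART3";
    ```

    If a MAXVALUE partition already existed, you would instead have to SPLIT it, which is a much heavier operation on a large table.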

  • Error after loading data

    Hello,
    After successfully loading data into a cube, if i go to manage, i get the following
    message
    Diagnosis
    There is an inconsistency between the load status of the data and the option of reporting on this data.
    There is data in the InfoProvider that is OK from a quality point of view, but is not yet displayed in reporting.
    Procedure
    Choose Refresh to remove the inconsistency.
    If i choose refresh, i can report on the cube. But if i load data again, i get the same error.
    Can you pls help?
    Regards,
    Suraj

    Hi Suraj,
    This is a message you get when you are in the manage screen at a certain point of the load process.
    If you wait a little longer before you go into manage, you will not see it.
    Udo

  • Semantic partitioning delta issue while loading data from DSO to cube - BW 7.3

    Hi All,
    We have created the semantic partition with the help of a BADI to perform the partitioning. Everything looks good:
    the first time I loaded the data it was a full update.
    The second time I initialized the data load and pulled the delta from the DSO to the cube. The DSO is standard, whereas the cube is partitioned with the help of semantic partitioning. What I can see is that only the records which were updated in the latest delta show up in the report; all the rest are ignored by the system.
    I tried compression, but it still did not work.
    Has anyone faced this kind of issue?
    Thanks

    Yaseen & Soujanya,
    It is very hard to guess the problem with the amount of information you have provided.
    - What do you mean by cube is not being loaded? No records are extracted from DSO? Is the load completing with 0 records?
    - How is data loaded from DSO to InfoCube? Full? Are you first loading to PSA or not?
    - Is there data already in the InfoCube?
    - Is there change log data for DSO or did someone delete all the PSA data?
    Since there are so many possible reasons for the behavior you are witnessing, your best option is to approach your resident expert.
    Good luck.
    Sudhi Karkada

  • [4.0] Bug - can't save changes in table after clearing date field with shift+delete

    Steps to reproduce:
    1) Open any table containing columns of DATE type, go to "Data" tab
    2) Find any non-null cell in said column
    3) Select said cell, press shift+delete. Cell's value becomes (null) and row is marked for update.
    4) Press Commit button or F11
    After that, this line appears in "Messages" window:
    One error saving changes to table "SCHEMA"."TABLE_NAME":
    If you clear the column by typing a space and deleting it, the cell's value becomes (null) as well, but the commit works. An update statement from the worksheet also works, so it's not a table issue.


  • Missing some records after Loading Data in Oracle 11g Release 1

    Hi All,
    I wrote a script to load one million records into the database using individual INSERT commands.
    I know there is a faster way using SQL*Loader, but I am in the process of learning and ran out of time.
    Anyhow, one million rows were inserted into the database, but only 999,998 records are there. Does anyone know why?
    Is there a way I can find out which data was rejected?
    thanks

    Hello,
    It would be a good idea to have your script spool a log.
    That way, you could more easily catch the error and the details of the rejected rows.
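    For example, a minimal SQL*Plus sketch (the file names are hypothetical):

    ```sql
    -- Run the load script with echo and feedback spooled to a log file,
    -- so any failed INSERTs can be found afterwards.
    SET ECHO ON
    SET FEEDBACK ON
    SPOOL load_run.log
    @load_million_rows.sql
    SPOOL OFF
    ```

    Afterwards, search load_run.log for "ORA-" to locate the statements that were rejected.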
    Best regards,
    Jean-Valentin

  • After loading data to data target

    When fetching the query, in what sequence does the data come from the tables (fact table, dimension table, SID table, master data table)? Which one is read first?

    Hi,
    First your query goes to the local cache, then to the OLAP cache; failing that, to the aggregates, and finally to the InfoCube itself.
    Assign points if it helps
    Regards
    Srinivas

  • Can anyone tell me how to load data into a matrix from a user table

    Hi,
    we need to load data into a matrix when the page loads. After loading the data we should be able to add a row to the matrix and update the values in the database without affecting the previous data. It should have functionality like a UDO default form. Can anyone help me out with this issue?
    Regards,

    Hi Souji,
    try this,
        // Declarations
        SAPbouiCOM.Form oForm;
        SAPbouiCOM.Matrix objMatrix;
        SAPbouiCOM.DataTable DBTable;
        SAPbouiCOM.DBDataSource dbDatasource;
        string strqry;

        // Get the form and the matrix item
        oForm = SBO_Application.Forms.Item("FormUID");
        SAPbouiCOM.Item oitem = oForm.Items.Item("ItemUID");
        objMatrix = (SAPbouiCOM.Matrix)oitem.Specific;

        // Bind the DB data source and fetch the data into a DataTable
        dbDatasource = (SAPbouiCOM.DBDataSource)oForm.DataSources.DBDataSources.Item("TableName");
        DBTable = oForm.DataSources.DataTables.Add("UniqueName");
        strqry = "paste Query that you want to fetch";
        DBTable.ExecuteQuery(strqry);

        // Copy each DataTable row into the data source, then reload the matrix
        objMatrix.FlushToDataSource();
        for (int i = 0; i <= DBTable.Rows.Count - 1; i++)
        {
            dbDatasource.SetValue("column1(name)", i, System.Convert.ToString(DBTable.GetValue(0, i)));
            dbDatasource.SetValue("column2(name)", i, System.Convert.ToString(DBTable.GetValue(1, i)));
            dbDatasource.SetValue("column3(name)", i, System.Convert.ToString(DBTable.GetValue(2, i)));
            dbDatasource.InsertRecord(i + 1);
        }
        objMatrix.LoadFromDataSource();
    If it helps you, give me the reward points.
    Thanks
    Sravan Kumar Pothu

  • Why these key figures roll up but others do not after the data load

    Experts,
      I have 7 key figures in the target cube, all of which are direct mappings from the source cube. After loading data from the source cube to the target cube, 4 of the KFs got automatically rolled up by themselves, but the rest did not. For example, in the source cube one student record, id 1002345, has the following 7 KF values: 3.417, 3.411, 7.346, 3.00, 131.00, 383.00, 90.00. When loading to the target cube, there is an update rule to fetch additional information for the target record as well. After the load finishes, each student has 5 records. In some of those 5 records, the first 4 KFs have been changed to 3.417xN, 3.411xN, 7.346xN, 3.00xN in the target cube, which is not what the requirement calls for.
    Here are my questions:
    1) Since all the KFs are direct mappings from the source, why did they get changed?
    2) Why do only some of them change, while the others do not?
    Thanks,
    Linda

    Thanks for pointing out the way to check. Yes, I checked under RSD1 and found out the difference between the first 4 and the last 3.
    The first 4 KFs' aggregation is SUM, which needs to change to MAX. But even in edit mode, I can't change the aggregation (it is disabled). So how can I change these 4 KFs' aggregation?
    Thanks,
    Linda

  • How to load data thru flat file in demantra?

    Hi,
    I am using the seeded data model for loading data for the DM module in Demantra.
    As per the data model there are 3 staging tables for item, location and sales data, i.e. t_src_item_tmpl, t_src_loc_tmpl and t_src_sales_tmpl respectively.
    I have looked at the final table for sales data, i.e. the sales_data table. There is only one item_id and location_id column against which sales_date and quantity are loaded.
    I have the same sales_data and quantity to be loaded for different location and item hierarchies.
    How is the data file created? If you have any sample data files, can you please share them with me? Do any changes need to be made in the data model for loading data in custom hierarchies?
    Regards

    Welcome user10979220 to the wonderful forum.
    "How many rows will be there in item flat file ?
    How many rows will be there in location flat file ?
    How many rows will be there in sales data flat file ?"
    The answers to these questions are based on the number of sales that have occurred, as well as on the item hierarchy and the location hierarchy.
    From the example
    Item_name | organization | ship to site | Sales date | quantity
    item 1 org1 site1 19-Jan-2008 80
    item1 org1 site2 20-Jan-2008 100
    I can say that the number of rows in sales data is 2 while that in item file is 1 and location file is 2.
    The answer to how the flat file will look is
    Item file can have
    Item_Id|Item_Name
    01     Item1
    Location file can have
    Org_Id|Org_Name|Site_Id|Site_name
    O_01     Organization 1     S_01     Site 1
    O_01     Organization 1     S_02     Site 2
    Sales file can have
    Item_Id|Item_Name|Org_Id|Org_Name|Site_Id|Site_name|sales_date|Quantity
    01     Item1     O_01     Organization 1     S_01     Site 1     19-Jan-2008 80
    01     Item1     O_01     Organization 1     S_02     Site 2     20-Jan-2008 100
    Once you define the data model and load the data, it is advisable not to change the data model, since the next time you load the incremental data it will error out due to the mismatched columns. So it's better to define the data model first and then create the files accordingly.
    Hence, after loading data, if you want to view sales data at the product_category1 level, then you should already have the product_category1 level defined in the data model. Otherwise you have to create the product_category1 level and associate it in the item hierarchy. The sales file should also have columns for product_category1.
    Then the item file will look like
    Item_Id|Item_Name|Product_category1_Id|Product_category1_desc
    01     Item1     PC_01     Product_category 1
    and the sales file will look like
    Item_Id|Item_Name|Org_Id|Org_Name|Site_Id|Site_name|sales_date|Quantity
    01     Item1     O_01     Organization 1     S_01     Site 1     19-Jan-2008 80
    01     Item1     O_01     Organization 1     S_02     Site 2     20-Jan-2008 100
    This is just a basic overview of loading the data into the staging tables. In short, the item file should have the item hierarchy in it, the location file should have the location hierarchy in it, and the sales file must contain the details about the sales at the lowest item-location level.
    Hope I answered your query.
    Thanks and Regards,
    Shekhar

  • SAP BPC 10 NW loading data from Infoprovider (data value issue in BPC)

    Hi Experts,
    We are on SAP BPC 10 - 801 NW - SP05
    We are seeing an error when loading data from an InfoProvider. We tested in DEV and QA and both behave correctly: the data values stored in the InfoProvider are loaded into the BPC model with no differences.
    However, after transporting into the PROD environment and then loading data into the BPC model, something happens: suppose 100,00 is a data value stored in a BW InfoProvider; after loading into BPC from this InfoProvider, a different value arrives in BPC. E.g. 100,00 is in BW, but 10000,00 is stored in BPC.
    Is there any idea to fix it or check it?
    Thanks for your answer

    Hi Giovanni,
    This case occurs with the Colombian currency and some others.
    Compare the table TCURX in DEV, QA and PRD.
    There must be an entry for currency COP only in PRD, like:
    CURRKEY CURRDEC
    COP 0
    With CURRDEC = 0, amounts in that currency are stored without the two implied decimal places, so 100,00 is read as 10000. Note 1240163 explains this behavior.
    If you need the value to be stored in the 100,00 format in BPC, you can create a routine to convert the value.
