Oracle ORDER BY clause on null and identical column values

Can anybody please help me understand the points below? Thanks in advance.
1. How will the rows be ordered when the ORDER BY column contains null values for all the records?
2. How will the rows be ordered if all the records contain the same value (in the column used in the ORDER BY clause)?
3. Does an index have any impact on the ordering (mainly in the above scenarios)?
4. Do any other database objects have an impact on the ordering?
Thanks again.
Regards,
Shine

Shine Haridasan wrote:
1. How will the rows be ordered when the ORDER BY column contains null values for all the records?
It will have the same effect as not ordering at all. See:
http://download.oracle.com/docs/cd/E11882_01/server.112/e26088/statements_10002.htm#i2171079
>
Without an order_by_clause, no guarantee exists that the same query executed more than once will retrieve rows in the same order.
>
2. How will the rows be ordered if all the records contain the same value in the column used in the ORDER BY clause?
Same answer as 1: ties may come back in any order.
3. Does an index have any impact on the ordering (mainly in the above scenarios)?
It might.
4. Do any other database objects have an impact on the ordering?
They might.
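To make the row order deterministic instead of leaving it to access-path side effects, NULL placement and ties can be handled explicitly in the ORDER BY itself. A minimal sketch (the emp table and its columns are illustrative):

```sql
-- In Oracle, NULLs sort last in an ascending ORDER BY by default.
-- State the NULL placement explicitly and add a unique tiebreaker
-- so repeated executions always return the same order.
SELECT empno, comm
  FROM emp
 ORDER BY comm NULLS FIRST,  -- or NULLS LAST (the default for ASC)
          empno;             -- unique column breaks ties deterministically
```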

Similar Messages

  • Popup Key LOV, NULL and "Invalid numeric value undefined for column"

    Hello.
    I've created an item based on database column of NUMBER type and set the following properties:
    Display As = Popup Key LOV (Displays description, returns key value)
    List of values definition=select 'display_value' d, 1 r from dual
    Null display value=%
    Null return value=
    Display Null=Yes
    When I select "%" in the LOV and try to apply changes to database I get error:
    ORA-20001: Error in DML: p_rowid=1781, p_alt_rowid=N1, p_rowid2=, p_alt_rowid2=. ORA-20001: Invalid numeric value undefined for column N2
    Error Unable to process row of table TTT.
    If I set Display As = Select List, all works fine. But I need Popup Key LOV.
    Could anybody help me?
    I use Application Express 2.2.1.00.04

    Hi all,
I did my homework and solved this issue. First, I would like to thank Patrick Wolf for the invaluable help he gives out on the thread Re: Null value handling in LOVs. The code presented here is just a minor edit to his code, but an essential one when dealing with Popup Key LOV items.
    Here's what I did:
    1. Create an Application Process.
    Name: RemoveNulls
    Sequence: 0
    Point: On Submit: After Page Submission - Before Computations and Validations
    Process Text:
    BEGIN
        FOR rItem IN
          ( SELECT ITEM_NAME
              FROM APEX_APPLICATION_PAGE_ITEMS
             WHERE APPLICATION_ID   = TO_NUMBER(:APP_ID)
               AND PAGE_ID          IN (TO_NUMBER(:APP_PAGE_ID), 0)
               AND LOV_DISPLAY_NULL = 'Yes'
               AND LOV_DEFINITION   IS NOT NULL
               AND LOV_NULL_VALUE   IS NULL
          )
    LOOP
            -- '%null'||'%' is written split so the literal %null% is not substituted by APEX
            IF (V(rItem.ITEM_NAME) = '%null' || '%' OR V(rItem.ITEM_NAME) = 'undefined')
            THEN
                Apex_Util.set_session_state(rItem.ITEM_NAME, NULL);
            END IF;
        END LOOP;
    END;
    Error Message: #SQLERRM#
    Condition: None
    2. You should be able to submit a Popup Key LOV with a NULL value now.
    Once again, THANKS, Patrick! You rock! I'm seriously thinking of trying ApexLib now :)
    Georger

  • Purchase order report with basic price and Excise duty values

    Hi All,
    Is there any standard report in SAP to get the purchase order basic price/qty and its excise duty values? Since we are including excise duties for non-codified items (without material master and with cost center 'K'), the ED values are added to the cost of the purchase order, and the requirement is to see the split-up (basic price + ED + ECS + HCS) for these purchase orders by vendor. If there is no standard report, then please let me know how to link the purchase order table with the condition value table in a query to get the desired report.
    Thanks in advance
    Benny

    Hi,
    There is no standard report to fulfill this requirement; you have to create the report with the help of an ABAPer.
    For that, which tax code you are using is an important input.
    For the PO, use table EKPO.
    1. A524 (Normal supply point / Material): returns the condition record number for a condition type.
    2. KONP (Conditions (Item)).
    3. J_1IMTCHID (Combination of Material Number and Chapter ID).
    4. J_1IEXCTAX (Tax calc. - Excise tax rates).
    5. Get the IR number from RBKP and RSEG, get the relevant FI document number, and go to table BSET.
    To get the IR number, apply the logic for document type RE: you will get the FI number, and in the reference field you will get the invoice number + fiscal year.
    Map this by referring to the PO for which the GRN and invoice happened.
    Regards
    Kailas Ugale

  • Reading lock and pk column values

    Hi!
    I was wondering whether it is possible to somehow read the value a
    persistent object has for its lock column (JDOLOCKX by default) and pk
    column (with datastore identity, JDOIDX by default).
    Especially reading the lock would be useful. My specific problem is this:
    in J2EE environment when a persistent object is serialized and given to the
    web (or any other ui-) layer to display, its identity is lost. There are
    nice examples for solving this problem in the documentation so this is not
    an issue. However, there is no easy way to determine whether someone else
    changed the same data while it was being watched (and modified) on the web
    page. So, in essence, this is an optimistic transaction. However, to use
    optimistic transactions one would have to use stateful session beans. And
    if I am correct, an optimistic transaction has to have a live persistence
    manager all the time, so this is not acceptable unless the
    PersistenceManager releases its SQL connection in between calls. I don't
    think it does, or does it? And anyhow, I'd prefer using stateless session
    beans.
    It would be quite easy to implement some sort of timestamp mechanism by
    inserting a persistent field for it and updating its value by using
    InstanceCallback interface. However, as the JDOLOCKX column already exists
    it would be nice not to reinvent the wheel.
    The pk value I would need in case the same db is used by many apps (not
    all in Java) and they need to communicate some information about a specific
    object stored in the db (in which case it would be easiest to give the pk
    value to identify an object). Again, I could use ApplicationIdentity or
    make some field where to store some unique value, but I'd prefer using the
    pk value since it is already there (and would not like to use Application
    Identity just because of this).
    Any other solutions to the above problems besides being able to read the
    JDOIDX and JDOLOCKX values are welcome, too.
    -Antti

    You can either parse the toString () generated from
    JDOHelper.getObjectId () or cast it to ObjectIds.Id (in 2.5.x) or Id (in
    3.0) to get the pk value.
    You can also retrieve the lock version in the same fashion.
    In Kodo 3:
    KodoHelper.getVersion (yourPC).
    In Kodo 2.5.x:
    ((PersistenceManagerImpl) pm).getState (JDOHelper.getObjectId
    (yourPC)).getVersion ();
    Antti Karanta wrote:
    On Tue, 14 Oct 2003 10:46:56 -0400, Patrick Linskey
    <[email protected]> wrote:
    Based on your description of your problem, you might want to look at the
    new attach/detach functionality in Kodo 3.0:
    http://solarmetric.com/Software/Documentation/3.0.0RC2/docs/ref_guide_detach.html
    This functionality is a preview of a proposed JDO 2.0 feature, and is
    targeted at exactly the use case that you've described.
    Yes, it definitely looks nice. One thing it does not mention, though:
    is the String stored in the detached-objectid-field just the (string
    version of the) primary key or is there something else there?
    And what about a case where I do not want a detachable instance, just
    want to be able to have the primary key value?
    And I take it there is no way to read these values (lock and pk) in the
    present version of Kodo?
    Steve Kim
    [email protected]
    SolarMetric Inc.
    http://www.solarmetric.com

  • Need to default text string values to null and leave numeric values as are

    Hi,
    This may be a simple question but I have the following query:
    select gis.province, ce.place_id
    from  cla_event ce
    Left join (select * from rbn_gis_area where version = 10) gis
    on ce.place_id = gis.sp_code
    Problem is place_id has text values since the data is dirty, and I receive the error 'ORA-01722: invalid number'. I want to default any text to null. Here is some sample data.
    CREATE TABLE Temp_1 (Place_ID varchar2(50));
    INSERT INTO Temp_1(Place_ID)  VALUES (77415018)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (77305000)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (77415000)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (null)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (77423034)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (null)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (77424011)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES ('Glebwood')     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (77603002)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (77409012)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (null)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (null)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES ('AVONDALE')     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (27516000)     ;
    INSERT INTO Temp_1(Place_ID)  VALUES (10509000)     ;
    Select * from temp_1
    Thanks in advance!!!
    Banner:
    Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
    PL/SQL Release 11.2.0.2.0 - Production
    "CORE 11.2.0.2.0 Production"
    TNS for Linux: Version 11.2.0.2.0 - Production
    NLSRTL Version 11.2.0.2.0 - Production

    Maybe this is helpful to you:
    select regexp_replace(Place_ID,'([[:alpha:]])','') from Temp_1;
    select gis.province, ce.place_id
    from  cla_event ce
    Left join (select * from rbn_gis_area where version = 10) gis
    on regexp_replace(ce.Place_ID,'([[:alpha:]])','') = gis.sp_code
    Regards,
    Dipali..
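Note that stripping letters turns a value like 'Glebwood' into an empty string (NULL in Oracle), which joins to nothing as intended, but a mixed value such as 'A1' would collapse to '1' and could match the wrong sp_code. A stricter sketch, assuming sp_code is numeric (table and column names taken from the thread):

```sql
-- Only treat place_id as a join key when it is entirely digits;
-- anything else (e.g. 'Glebwood', 'AVONDALE') becomes NULL and joins to nothing.
SELECT gis.province, ce.place_id
  FROM cla_event ce
  LEFT JOIN (SELECT * FROM rbn_gis_area WHERE version = 10) gis
    ON CASE WHEN REGEXP_LIKE(ce.place_id, '^[0-9]+$')
            THEN TO_NUMBER(ce.place_id)
       END = gis.sp_code;
```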

  • Get list of Not Null and PK Columns

    hi All,
    I need to get the list of all the columns which are part of the PK and not null; can you help me with this?
    With the below query I am getting the not-null ones, but I also need to get the PK columns.
    SqlServer Version is 2008R2
    SELECT TABLE_CATALOG AS Database_Name, TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = 'Express'
    AND IS_NULLABLE = 'NO'
    Thanks!
    Neil

    Give this a go. It includes my bitval solver function, as the status column is stored that way.
    CREATE FUNCTION dbo.solveBitVal(@val BIGINT)
    RETURNS @result TABLE (id INT, bitvalue BIGINT)
    AS
    BEGIN
    DECLARE @bitTranslate TABLE (id INT, bitvalue BIGINT)
    DECLARE @numbah TABLE (num INT)
    DECLARE @loopah INT
    SET @loopah = 1
    WHILE @loopah <= 63
    BEGIN
    INSERT INTO @bitTranslate (id, bitvalue) VALUES (@loopah, POWER(CAST(2 AS BIGINT),@loopah-1))
    INSERT INTO @numbah (num) VALUES (@loopah)
    SET @loopah = @loopah + 1
    END
    WHILE @val > 0
    BEGIN
    INSERT INTO @result
    SELECT MAX(id), MAX(bitvalue) FROM @bitTranslate WHERE bitvalue <= @val
    SELECT @val = @val - MAX(bitvalue) FROM @bitTranslate WHERE bitvalue <= @val
    END
    RETURN
    END
    GO
    ;WITH cons AS (
    SELECT o.name, c.*, b.id AS statusID, ROW_NUMBER() OVER (PARTITION BY c.id, colid ORDER BY b.id) AS seq
    FROM sys.sysconstraints c
    INNER JOIN sys.objects o
    ON c.id = o.object_id
    CROSS APPLY dbo.solveBitVal(c.status) b
    ), rCTE AS (
    SELECT c.name, c.constid, c.id, c.colid, c.spare1, c.status, c.actions, c.error, CAST(CASE WHEN statusID = 1 THEN 'PRIMARY KEY constraint '
    WHEN statusID = 2 THEN 'UNIQUE KEY constraint '
    WHEN statusID = 3 THEN 'FOREIGN KEY constraint '
    WHEN statusID = 4 THEN 'CHECK constraint '
    WHEN statusID = 5 THEN 'DEFAULT constraint '
    WHEN statusID = 16 THEN 'Column-level constraint'
    WHEN statusID = 32 THEN 'Table-level constraint '
    END AS NVARCHAR(MAX)) AS statusName, c.seq, c.statusID
    FROM cons c
    WHERE seq = 1
    UNION ALL
    SELECT c.name, c.constid, c.id, c.colid, c.spare1, c.status, c.actions, c.error, CAST(r.statusName + ', ' + CASE WHEN c.statusID = 1 THEN 'PRIMARY KEY constraint '
    WHEN c.statusID = 2 THEN 'UNIQUE KEY constraint '
    WHEN c.statusID = 3 THEN 'FOREIGN KEY constraint '
    WHEN c.statusID = 4 THEN 'CHECK constraint '
    WHEN c.statusID = 5 THEN 'DEFAULT constraint '
    WHEN c.statusID = 16 THEN 'Column-level constraint'
    WHEN c.statusID = 32 THEN 'Table-level constraint '
    END AS NVARCHAR(MAX)), c.seq, c.statusID
    FROM cons c
    INNER JOIN rCTE r
    ON c.id = r.id
    AND c.colid = r.colid
    AND c.seq - 1 = r.seq
    )
    SELECT *
    FROM rCTE
    Don't forget to mark helpful posts, and answers. It helps others to find relevant posts to the same question.
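If you only need PK membership rather than the full constraint decode above, a simpler sketch sticks to the INFORMATION_SCHEMA views the original query already uses (schema name from the thread; the IS_PK column alias is made up here):

```sql
SELECT c.TABLE_CATALOG AS Database_Name, c.TABLE_SCHEMA, c.TABLE_NAME,
       c.COLUMN_NAME, c.IS_NULLABLE,
       CASE WHEN kcu.COLUMN_NAME IS NOT NULL THEN 'YES' ELSE 'NO' END AS IS_PK
FROM INFORMATION_SCHEMA.COLUMNS c
LEFT JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc
       ON  tc.TABLE_SCHEMA    = c.TABLE_SCHEMA
       AND tc.TABLE_NAME      = c.TABLE_NAME
       AND tc.CONSTRAINT_TYPE = 'PRIMARY KEY'
LEFT JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu
       ON  kcu.CONSTRAINT_NAME = tc.CONSTRAINT_NAME
       AND kcu.TABLE_SCHEMA    = c.TABLE_SCHEMA
       AND kcu.TABLE_NAME      = c.TABLE_NAME
       AND kcu.COLUMN_NAME     = c.COLUMN_NAME
WHERE c.TABLE_SCHEMA = 'Express'
  AND c.IS_NULLABLE  = 'NO';
```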

  • User defined function to check the NULL and 0.00 value in a Amount field

    Hi Experts,
    I have one scenario in which i have amount field in Data type.
    I have to create one UDF which will do the following thing.
    1. If the amount field is Null it should throw an exception in mapping.
    2.If the amount field is 0.00 it should throw an exception in mapping.
    3. For all other cases it should pass the amount value to the target.
    Can anyone provide me the code For this UDF.
    Thanks in advance.

    Hi,
      You can add this logic to the UDF. Here var1 is the input string.
    if (var1 == null || var1.trim().length() == 0)            // case 1: null/empty input
        throw new RuntimeException("Amount is missing");
    java.math.BigDecimal b = new java.math.BigDecimal(var1);
    if (b.compareTo(java.math.BigDecimal.ZERO) == 0)          // case 2: amount is 0.00
        throw new RuntimeException("Amount is 0.00");
    return var1;                                              // case 3: pass the value through
    One small request: if you think your question has been answered properly and correctly, could you please close this thread? It helps people who look for solutions to similar problems.
    regards
    Anupam

  • Query with nulls and multiple columns

    I need help with structuring a query. I have the following table:
    REGION | AREA
    1      | A
    1      | B
    1      | C
    1      | D
    1      | (null)
    (null) | A
    (null) | B
    (null) | C
    (null) | A
    2      | A
    2      | (null)
    I need to get the results:
    REGION | AREA
    1      | (null)
    (null) | A
    (null) | B
    (null) | C
    2      | (null)
    So really - all the regions that have a null "AREA" and all the Areas with a null "REGION" that are unique. I can do this with two separate statements but I'd like to be able to do it in one query if possible.

    user9179751 wrote:
    So really - all the regions that have a null "AREA" and all the Areas with a null "REGION" that are unique. I can do this with two separate statements but I'd like to be able to do it in one query if possible.
    SELECT REGION, AREA FROM TABLE1 WHERE REGION IS NULL OR AREA IS NULL;

  • Get Min value and another column value from the same row

    hi all - for each customer and product, I need to grab the row that has the min value for field Value1. I could've just done a group by on customer and product with min on Value1, but I also need to grab Value2 from the same row where the minimum Value1 lies.
    DECLARE @Temp TABLE (CustomerID INT, ProductID VARCHAR(10), Trans_Date Date, Value1 INT, Value2 INT)
    INSERT INTO @Temp VALUES (123, 'ABC', '1/1/2013', 10, 100)
    INSERT INTO @Temp VALUES (456, 'ASD', '1/1/2013', 40, 500)
    INSERT INTO @Temp VALUES (456, 'ASD', '2/1/2013', 90, 700)
    INSERT INTO @Temp VALUES (123, 'ABC', '2/1/2013', 20, 700)
    SELECT * FROM @Temp
    The output should be
    123, ABC, 10, 100
    456, ASD, 40, 500
    I know that I can just join the table to itself and get the desired output but I am dealing with a table that has millions of rows and I was hoping there is more efficient way to do this. any help would be highly appreciated...

    Here is a correction on your DDL to make it into a nearly valid table.
    CREATE TABLE Sales
    (customer_id INTEGER NOT NULL,
    product_id CHAR(10) NOT NULL,
    transaction_date DATE NOT NULL,
    PRIMARY KEY (customer_id, product_id, transaction_date),
    value1 INTEGER NOT NULL,
    value2 INTEGER NOT NULL);
    Here is the current syntax for insertion:
    INSERT INTO Sales
    VALUES (123, 'abc', '2013-01-01', 10, 100),
    (456, 'asd', '2013-01-01', 40, 500),
    (456, 'asd', '2013-02-01', 90, 700),
    (123, 'abc', '2013-02-01', 20, 700);
    WITH
    X
    AS
    (SELECT customer_id, product_id, transaction_date, value1, value2,
    MIN(value1) OVER (PARTITION BY customer_id, product_id) AS value1_min
    FROM Sales)
    SELECT X.*
    FROM X
    WHERE X.value1_min = X.value1;
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL

  • Reset Identity Column Value in SQL Server depending on what's being inserted

    I have a SQL Server 2008 database with a table that has an ID field that has an identity increment and seed of 1.
    This work great as each new record created automatically gets a new and distinct number.
    Now when I insert certain new records I need to be able to make that value start at 10001 and then the next would be 10002 etc.
    But this is only for certain inserts.
    Is there a way to control the identity seed & increment on the fly via a query maybe?
    So some inserts would follow 1, 2, 3, 4, 5, 6, 7 and others would follow 10001, 10002, 10003, depending on what's being inserted.
    Is this possible?

    Make it a different field.  Don't mess with your PK.
    Make sure you have a plan for when you get 10000 records in this table.
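One way to sketch the "different field" suggestion on SQL Server 2008 (which has no SEQUENCE objects) is a helper table whose own identity is seeded at 10001; all table and column names here are illustrative:

```sql
-- Helper table: its identity supplies the 10001, 10002, ... series.
CREATE TABLE dbo.SpecialNumbers (SpecialID INT IDENTITY(10001, 1) PRIMARY KEY);

-- Second, nullable number column alongside the untouched identity PK.
ALTER TABLE dbo.MyTable ADD SpecialID INT NULL;

-- For a "special" insert: draw the next number, then store it with the row.
DECLARE @next TABLE (SpecialID INT);
INSERT INTO dbo.SpecialNumbers OUTPUT inserted.SpecialID INTO @next DEFAULT VALUES;
INSERT INTO dbo.MyTable (SomeCol, SpecialID)
SELECT 'abc', SpecialID FROM @next;
```

Ordinary inserts simply leave SpecialID NULL and keep following 1, 2, 3, ... from the untouched identity PK.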

  • Expand and collapse column values of Pivot table in OBIEE 11g?

    Hi experts,
    I have a requirement like the below: a pivot table whose measure columns can be expanded and collapsed. Here (-value3) is expanded into its sub-columns x, y, z, while (+value1), (+value2) and (+value4) are collapsed:
    col1   col2   (+value1)   (+value2)   (-value3: x | y | z)   (+value4)
    abc    sys                            1 | 3  | 9
    xyz    inc                            7 | 8  | 77
    xyz2   inc2                           7 | 8  | 32
    xyz3   inc3                           5 | 08 | 13
    xyz4   inc4                           6 | 0  | 72
    xyz5   inc5                           4 | 2  | 3
    like that i want ...
    any suggestions / help..
    I hope you understand..
    Thanks
    Raam
    Edited by: Tallapaneni on Jun 16, 2012 5:10 PM

    I couldn't understand what you mean; can you please describe it further?

  • Avoiding null and duplicate values using model clause

    Hi,
    I am trying to use the model clause to get a comma-separated list of data; following is the scenario:
    testuser>select * from test1;
    ID VALUE
    1 Value1
    2 Value2
    3 Value3
    4 Value4
    5 Value4
    6
    7 value5
    8
    8 rows selected.
    the query I have is:
    testuser>with src as (
    2 select distinct id,value
    3 from test1
    4 ),
    5 t as (
    6 select distinct substr(value,2) value
    7 from src
    8 model
    9 ignore nav
    10 dimension by (id)
    11 measures (cast(value as varchar2(100)) value)
    12 rules
    13 (
    14 value[any] order by id =
    15 value[cv()-1] || ',' || value[cv()]
    16 )
    17 )
    18 select max(value) oneline
    19 from t;
    ONELINE
    Value1,Value2,Value3,Value4,Value4,,value5,
    What I find is that the output has a duplicate value and a null (',,') because the data contains nulls and duplicate values. Is there a way I can avoid the nulls and the duplicate values in the query output?
    thanks,
    Edited by: orausern on Feb 19, 2010 5:05 AM

    Hi,
    Try this code.
    with
    t as ( select substr(value,2)value,ind
            from test1
            model
            ignore nav
            dimension by (id)
            measures (cast(value as varchar2(100)) value, 0 ind)
            rules
            ( ind[any]=  instr(value[cv()-1],value[cv()]),
            value[any] order by id = value[cv()-1] || CASE WHEN value[cv()] IS NOT NULL
                                               and ind[cv()]=0     THEN ',' || value[cv()] END
            )
          )
    select max(value) oneline
    from t;
    SQL> select * from test1;
            ID VALUE
             1 Value1
             2 Value2
             3 Value3
             4 Value4
             5 Value4
             6
             7 value5
             8
    8 ligne(s) sélectionnée(s).
    SQL> with
      2   t as ( select substr(value,2)value,ind
      3          from test1
      4          model
      5          ignore nav
      6          dimension by (id)
      7          measures (cast(value as varchar2(100)) value, 0 ind)
      8          rules
      9          ( ind[any]=  instr(value[cv()-1],value[cv()]),
    10          value[any] order by id = value[cv()-1] || CASE WHEN value[cv()] IS NOT NULL
    11                                             and ind[cv()]=0     THEN ',' || value[cv()] END 
    12          )
    13        )
    14   select max(value) oneline
    15   from t;
    ONELINE
    Value1,Value2,Value3,Value4,value5
    SQL>
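If you are on 11.2 or later, the same result can be had without the model clause by deduplicating first and then string-aggregating with LISTAGG; a sketch against the test1 table above:

```sql
-- Drop NULLs, keep one row per distinct value, then aggregate in id order.
SELECT LISTAGG(value, ',') WITHIN GROUP (ORDER BY min_id) AS oneline
  FROM (SELECT value, MIN(id) AS min_id
          FROM test1
         WHERE value IS NOT NULL
         GROUP BY value);
```

For the sample data this yields Value1,Value2,Value3,Value4,value5.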

  • Nulls and Empty Strings

    I have an ODBC application whose backend has been changed from Access to Oracle. There is one particular query that is causing problems. The query contains a where clause with something like "field = ''" in it. This query is intended to pick up records where field is null. If the query was modified to "field is null," it would work. However, I cannot change the application.
    Is there any way to configure the ODBC driver or something so that this query would work properly? I've tried defaulting the field to '', but Oracle is still treating it as null.
    (What is really frustrating is that the application NEVER populates this field!)

    Actually, Oracle is very consistent in that it requires you to use ternary logic when dealing with null values. NULL is never equal to a value, NULL is never not equal to a value.
    select * from someTable where column <> 'value'
    will show you only those rows in someTable where column is non-NULL and where its value is not 'value'
    In other words:
    select count(*) from someTable
    is NOT equal to
    select count(*) from someTable where column <> 'value'
    + select count(*) from someTable where column = 'value'
    but it IS equal to
    select count(*) from someTable where column <> 'value'
    + select count(*) from someTable where column = 'value'
    + select count(*) from someTable where column IS NULL
    There are plenty of discussions in Oracle texts and in Oracle newsgroups that go into much greater detail about the how's & why's of this.
    Justin
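A quick demonstration of why defaulting the column to '' cannot help: Oracle treats a zero-length VARCHAR2 as NULL, so "field = ''" can never be true.

```sql
-- Returns 1: the empty string is NULL in Oracle.
SELECT COUNT(*) FROM dual WHERE '' IS NULL;
```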

  • OPM/Oracle Order Management

    What are the pro's and con's of switching from OPM Order Fulfillment to Oracle Order Management?
    We use Order Fulfillment now and are concerned we will lose the profile functionality when and if we upgrade.

    OPM Order Fulfillment is a simple to use Order Entry and Shipping application that meets the needs of many process manufacturers.
    Oracle Order Management is more flexible and configurable to suit a number of different industries. Oracle Process Manufacturing is integrated
    to Order Management and has functionality to support entry of order quantities in two units of measure, as well as designation
    of a preferred quality grade and automatic allocation of process inventory.
    In place of the Order Profile (or order template) functionality available in Order Fulfillment, we recommend using the Copy Order
    functionality in Order Management. You can establish a special order type in Order Management (you can call it Template or
    Profile) and assign a workflow to the order type that makes it "non-transactable" (the order stays in an "entered" state so that
    it can't be booked or shipped). You can query on these special order types and copy them to create standard orders. When you
    copy an order in Order Management, you can change the order type, so the Template order can be copied and the new order
    type can be "Standard".

  • Identity Column in SQL Server 2000

    I am working on SBO Version 2004a.
    I created <b>Identity Column</b> in User Defined Table and made the entries for this Table and its columns in Sap Business One OUTB and CUFD tables respectivily.
    I use this identity column to populate the Code and name fields automatically using Insert triggers.
    I add 10 records to the table and now delete the 5th record. The Identity column and the code field will contain values as (1,2,3,4,6,7,8,9,10)
    Now on Upgrading the patch level( SAP Business One 2004A (6.70.190)  SP: 00  PL: 36), the Identity column is converted to simple INT column and no Identity column property is attached to it.
    There is no T-SQL in SQL 2000 to convert an existing column to Identity Column hence I can not convert it back to Identity Column.
    The only solution is to drop the column and re-create it as an identity column.
    If I do so, the identity column is populated for all the rows starting from 1 to 9 (seed is 1).
    Now on adding the next record, a primary key violation occurs on the Code field, since the next available identity value is 10 but this value is already used in the 9th record.
    <b>Is there any way so that the patch do not change the definition of the User defined tables and their columns.</b>
    Thanks
    Vishal Nigam
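For the immediate primary-key violation after re-creating the column, SQL Server 2000 can reseed the identity so the next generated value skips past the codes already in use (table name illustrative); note this does not stop the patch from altering the table definition again:

```sql
-- The max code in use is 10, so reseed to 10: the next insert generates 11.
DBCC CHECKIDENT ('dbo.MyUserTable', RESEED, 10);
```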

    Hi Vishal,
    I agree with Harold. It's against SAP's support policy to allow anyone to change SBO table definitions in any way. Other SQL objects such as stored procedures or triggers will probably be unaffected by an upgrade but it is highly likely that changes to tables will be reverted and may even cause the upgrade to fail.
    There is one other possible solution, depending on how you are using the table. It is to create your own table in SQL, i.e. a table in the SBO company database but which is not a user-defined table. SBO itself will be totally unaware of the table but you could read and write to it via code (if you have written an add-on).
    Kind Regards,
    Owen
