Calculate Effective values for a set of records

Hi, we have a set of rows in a database table. Only MONTH_KEY and LOAN_VALUE_AT_MONTH are available; we need to calculate EFFECTIVE_LOAN_VALUE_MONTH, as illustrated below.
The logic for the calculation is:
•     If MONTH_KEY = 1, then EFFECTIVE_LOAN_VALUE_MONTH should be 0.
•     If MONTH_KEY > 1 and MONTH_KEY <= 12 (less than or equal to a year), then:
     if LOAN_VALUE_AT_MONTH is 0, set EFFECTIVE_LOAN_VALUE_MONTH = 0;
     otherwise, set EFFECTIVE_LOAN_VALUE_MONTH =
     MAX(LOAN_VALUE_AT_MONTH (current time band), EFFECTIVE_LOAN_VALUE_MONTH (previous time band))
We tried using the LAG function to calculate this... but in vain. Do we have any analytic functions to calculate this logic using SQL? Kindly help.
MONTH_KEY     LOAN_VALUE_AT_MONTH     EFFECTIVE_LOAN_VALUE_MONTH
1     6,912,007     0
2     8,305,709     8,305,709
3     9,116,882     9,116,882
4     9,617,609     9,617,609
5     11,198,658     11,198,658
6     12,151,381     12,151,381
7     13,511,137     13,511,137
8     14,126,983     14,126,983
9     12,975,124     14,126,983
10     13,501,073     14,126,983
11     11,588,826     14,126,983
12     3,461,046     14,126,983
Edited by: Selvaganapathy on Mar 19, 2013 12:24 AM

Selvaganapathy wrote:
[same question as quoted above - snipped]
We tried using the LAG function to calculate this... but in vain. Do we have any analytic functions to calculate this logic using SQL? Kindly help.
What have you tried? From your description LAG sounds like a good idea, but doing this in a single SQL statement will be very hard - too many rules!
Your rules are complex enough to invite consideration of a user-written function to compute the value instead of a single SQL statement. You could pass the current and lagged values to the function and get the computed value back.
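For the record, here is one way this can be written in a single statement on Oracle 11g Release 2 or later, using a recursive WITH clause. This is an untested sketch: the table name LOANS is made up (the post never names the table), and the column names are taken from the post.

```sql
-- Untested sketch; LOANS is a hypothetical table name.
WITH eff (month_key, loan_value_at_month, effective_loan_value_month) AS (
  -- Anchor: MONTH_KEY = 1 always yields 0
  SELECT month_key, loan_value_at_month, 0
  FROM   loans
  WHERE  month_key = 1
  UNION ALL
  -- Recurse month by month, carrying the previous effective value forward
  SELECT l.month_key,
         l.loan_value_at_month,
         CASE
           WHEN l.loan_value_at_month = 0 THEN 0
           ELSE GREATEST (l.loan_value_at_month,
                          e.effective_loan_value_month)
         END
  FROM   loans l
  JOIN   eff  e ON l.month_key = e.month_key + 1
  WHERE  l.month_key <= 12
)
SELECT month_key, loan_value_at_month, effective_loan_value_month
FROM   eff
ORDER  BY month_key;
```

Checked by hand against the sample rows, this carries 14,126,983 forward from MONTH_KEY 8 through 12, matching the expected output. On 10g, the MODEL clause could play the same role.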

Similar Messages

  • How to calculate acquisition value for specified day

    Hi,
    in my Z program I have a problem: how to calculate the acquisition value of my asset for a specified day.
    Example:
    I have asset created 8.7.2008 with TTYPE 104 (External asset acquisition) with value 5950.
    30.11.2008 there is another TTYPE 272 (Retirement of current-yr acquis., w/o revenue) with value 950.
    So BEFORE 30.11.2008 the acquisition value is 5950. After, it is 5000. Is there any function module (or something else) in the SAP system to which I can send an asset number and a date, and it returns the acquisition value for that day?
    Many thanks for any answer!

    Hi,
    your suggestion means that I have to compute the acquisition value by myself (sum of all TTYPE 1** minus sum of all TTYPE 2**). So SAP doesn't provide such functionality (LDB ADA has it, because it can compute the acquisition value for a specific day)?
    Many thanks for the answer

  • Find the last value of one set of records

    Hi friends,
    I have a set of records
    Customer_old_id customer_new_id start_date
    1001 1010 01-jan-07
    1010 1051 15-feb-08
    1051 1070 01-jan-09
    5001 5020 01-jan-05
    5020 5100 01-jun-06
    I would like to create a new set of records with values
    Customer_old_id customer_new_id start_date customer_last_id last_date
    1001 1010 01-jan-07 1070 01-jan-09
    1010 1051 15-feb-08 1070 01-jan-09
    1051 1070 01-jan-09 1070 01-jan-09
    5001 5020 01-jan-05 5100 01-jun-06
    5020 5100 01-jun-06 5100 01-jun-06
    Thanks in Advance
    Mark

    Hi, Mark,
    INSERT INTO table_x
    (     customer_old_id
    ,     customer_new_id
    ,     start_date
    ,     customer_last_id
    ,     last_date
    )
    SELECT     customer_old_id
    ,     customer_new_id
    ,     start_date
    ,     CONNECT_BY_ROOT customer_new_id     -- customer_last_id
    ,     CONNECT_BY_ROOT start_date          -- last_date
    FROM     table_x
    START WITH     customer_new_id     NOT IN     (
                             SELECT     customer_old_id
                             FROM     table_x
                             )
    CONNECT BY     customer_new_id     = PRIOR customer_old_id;
    (Untested)
    You have a parent-child relationship. If a row has customer_new_id = x, then the row with customer_old_id = x is its parent. That relationship is spelled out in the CONNECT BY clause. (Perhaps you normally think of the parent-child roles as being the opposite of what I described. That's fine, but there is no CONNECT_BY_LEAF operator, only CONNECT_BY_ROOT, so for purposes of this query, the rows with the values that need to be copied to all their relatives are the roots.)
    If a row has customer_new_id = x, but there is no row where customer_old_id = x, then that row has no parent: it is a root. Those rows are found in the START WITH clause.
    The customer_new_id and start_date become the customer_last_id and last_date of each root and all its descendants. CONNECT_BY_ROOT retains these values as the query traverses the tree, from parent to child.

  • T.code/Report for calculate received values for a subcontract

    Hi Experts,
    Can you any one help on this issue.
    "Is there a T-code/report that will calculate the received values for a subcontract broken out by code - or remaining values per code - even when the contract has non-valuated GR lines?
    MATDOCS does not capture the dollar value of non-VAL GR lines - only the qty. It captures the dollar values of valuated GRs only.
    ME2N captures the total remaining value at the Item level only - not at the code level; the codes are all listed, but as in the initial account assignment tab in ME23N with the original percent assignment. It does not determine the remaining value per code."
    Thanks in advance.
    Mohammad Jeelan

    hi
    try MBLB

  • How to create count() measure for certain set of records in BMM Layer

    Hello all.
    I have a logical table like this one (Table AAA)
    Table AAA
    Key            Name
    ---------      ----
    1-2EMHS9       AAA
    1-2EMWMO       BBB
    NULL           CCC
    I need to calculate the count() of records where Key is not NULL. In this example, the measure must return count() = 2.
    I suppose, CASE operator may help me with that, but I do not know for sure how to do this.
    Thank you for help,
    Alex.

    Thank you.
    But I must make my issue more concrete.
    I need to count the number of records (e.g. order_id) that satisfy an appropriate condition (one of the columns, e.g. loy_member_id, is set to NULL).
    I created a logical column that returns order_id if the condition (loy_member_id is null) is met.
    Look at my logical column:
    Q Orders (LPM) - must return number of orders where loyalty_member_id is null.
    It has an expression:
    CASE  WHEN  NOT "Foxtrot (my) replica"."Sales Orders".LOY_MEMBER_ID IS NULL  THEN "Foxtrot (my) replica"."Sales Orders".ORDER_ID ELSE  NULL  END
    So, this returns order_id I need.
    But the question is how to count the number of order_id's returned by this column in the BMM layer.
    When I define my column with the following expression:
    COUNT(CASE WHEN NOT "Foxtrot (my) replica"."Sales Orders".LOY_MEMBER_ID IS NULL THEN "Foxtrot (my) replica"."Sales Orders".ORDER_ID ELSE NULL END)
    I receive the error:
    [38083] The Attribute 'Q Orders LPM' defines a measure using an obsolete method.
    Thank you,
    Alex.
    Edited by: Alex B on 3/3/2009 19:59
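    The thread never shows the resolution, but error [38083] normally means an aggregate function was typed directly into a logical column expression. The usual workaround (a sketch, assuming the standard Admin Tool workflow) is to keep the CASE expression unaggregated and set the aggregation rule instead:

```sql
-- Logical column expression: the bare CASE, with no COUNT(...) wrapper
CASE
  WHEN NOT "Foxtrot (my) replica"."Sales Orders".LOY_MEMBER_ID IS NULL
  THEN "Foxtrot (my) replica"."Sales Orders".ORDER_ID
END
-- Then, on the column's Aggregation tab, set the default aggregation
-- rule to Count instead of writing the COUNT in the expression.
```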

  • Costing run with values for material from info record.

    Hi All
    there is a requirement with my client that when I run my costing run, the values of the components in the BOM should come from the latest purchase order.
    For this I maintained the valuation criterion as price from info record, and the sub-strategy as gross price from PO.
    When I create a PO, the info record is created automatically in the background. So when I run the costing run the value is picked correctly.
    Now when I create a PO for the same material from a different vendor, the info record is generated, but the costing run picks the value from the info record of the earlier vendor. To work around this I go and check the conditions tab in the info record, ensure it is blank, and save it. Then I generate a new PO and run the costing run again. This time the system reads the latest info record.
    My problem is that I want to ensure that during the costing run the system reads the latest info record without manual intervention. System behaviour is not consistent when there are multiple vendors and info records for the same material. Sometimes it picks up the value and sometimes it refers to the old info record.
    Can anyone help me out in resolving this issue? It is critical.
    Regards
    Rakesh

    Hi!
    Generally, info records are first created with a vendor; then the PO is created, and the price from the info record is used in the PO.
    If you have more than one vendor for the same material, you have to create an info record for each vendor. Then you need to maintain a source list, where you can fix the vendor. If there is no fixed vendor, you can create scheduling agreements with the vendors, where you give proportionate percentages against each vendor.
    Now for the costing run:
    1. If you have only one info record, the price will be taken from that info record.
    2. If you have more than one info record, you need to maintain a source list, where you can tick the fixed vendor; the system will then take the price from that vendor's info record.
    3. If there is no fixed vendor and you have maintained a scheduling agreement with weightings per vendor, the info record price of the vendor with the higher percentage will be picked up.
    4. If you have several info records and have maintained neither a source list nor a scheduling agreement, the price will be picked up from the first info record (based on info record creation date).
    5. If you have a scheduling agreement with equal weightings, for example 50% and 50%, the system will also pick up the first info record.
    I think it is clear now.
    regards,
    ramesh b

  • Set flag for a set of records

    Hi
    I have a table that stores requests raised by employees as below
    name Request_ID date status
    ABC 112 05-Dec-11 rejected
    ABC 786 06-Dec-11 approved
    ABC 987 07-Dec-11 rejected
    ABC 119 08-Dec-11 approved
    MNP 221 09-Nov-11 rejected
    MNP 666 10-Nov-11 approved
    MNP 221 11-Nov-11 rejected
    MNP 999 12-Nov-11 approved
    RST 99 23-Dec-11 rejected
    RST 101 24-Dec-11 approved
    RST 876 25-Dec-11 rejected
    RST 127 26-Dec-11 approved
    I need to check the status of the latest request of each employee and set the flag of all his requests as -
    ‘A’ if the latest request status is ‘approved’
    ‘B’ if the latest request status is ‘rejected’
    The resultset should be something like -
    name Request_ID date status Flag
    ABC 112 05-Dec-11 rejected A
    ABC 786 06-Dec-11 approved A
    ABC 987 07-Dec-11 rejected A
    ABC 119 08-Dec-11 approved A
    MNP 221 09-Nov-11 rejected B
    MNP 666 10-Nov-11 approved B
    MNP 221 11-Nov-11 approved B
    MNP 999 12-Nov-11 rejected B
    RST 99 23-Dec-11 approved A
    RST 101 24-Dec-11 approved A
    RST 876 25-Dec-11 rejected A
    RST 127 26-Dec-11 approved A
    Could you please help me in framing this SQL? Use of any analytic function would be helpful.
    Regards
    -pankaj

    Hi, Pankaj,
    Welcome to the forum!
    910543 wrote:
    Hi
    I have a table that stores requests raised by employees as below
    Whenever you have a problem, please post CREATE TABLE and INSERT statements for your sample data, so that the people who want to help you can re-create the problem accurately and test their ideas. Since this is your first message, I'll do it for you:
    CREATE TABLE     table_x
    (     name           VARCHAR2 (5)
    ,     request_id     NUMBER
    ,     dt          DATE          -- DATE is not a good column name
    ,     status          VARCHAR2 (10)
    ,     CONSTRAINT  x_uk     UNIQUE (name, dt)
    );
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('ABC', 112, TO_DATE ('05-Dec-2011', 'DD-Mon-YYYY'), 'rejected');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('ABC', 786, TO_DATE ('06-Dec-2011', 'DD-Mon-YYYY'), 'approved');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('ABC', 987, TO_DATE ('07-Dec-2011', 'DD-Mon-YYYY'), 'rejected');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('ABC', 119, TO_DATE ('08-Dec-2011', 'DD-Mon-YYYY'), 'approved');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('MNP', 221, TO_DATE ('09-Nov-2011', 'DD-Mon-YYYY'), 'rejected');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('MNP', 666, TO_DATE ('10-Nov-2011', 'DD-Mon-YYYY'), 'approved');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('MNP', 221, TO_DATE ('11-Nov-2011', 'DD-Mon-YYYY'), 'rejected');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('MNP', 999, TO_DATE ('12-Nov-2011', 'DD-Mon-YYYY'), 'approved');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('RST',  99, TO_DATE ('23-Dec-2011', 'DD-Mon-YYYY'), 'rejected');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('RST', 101, TO_DATE ('24-Dec-2011', 'DD-Mon-YYYY'), 'approved');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('RST', 876, TO_DATE ('25-Dec-2011', 'DD-Mon-YYYY'), 'rejected');
    INSERT INTO table_x (name, request_id, dt, status) VALUES ('RST', 127, TO_DATE ('26-Dec-2011', 'DD-Mon-YYYY'), 'approved');
    COMMIT;
    name Request_ID date status ...
    If those are all the columns in the table, and you want to display a flag column, you can do something like this:
    SELECT     name
    ,     request_id
    ,     dt
    ,     status
    ,     CASE  FIRST_VALUE (status) OVER ( PARTITION BY  name
                                            ORDER BY      dt     DESC
                                          )
              WHEN  'approved'     THEN  'A'
              WHEN  'rejected'     THEN  'B'
         END          AS flag
    FROM     table_x
    ;
    If the table has a flag column, and you need to populate it, you can use the query above in a MERGE statement. (You may be better off not actually storing the flag, however, if it's hard to keep up to date.)
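    A sketch of the MERGE statement mentioned above (untested, and assuming table_x really does have a FLAG column to populate):

```sql
MERGE INTO table_x t
USING (
        SELECT name, dt,
               CASE FIRST_VALUE (status)
                      OVER (PARTITION BY name ORDER BY dt DESC)
                    WHEN 'approved' THEN 'A'
                    WHEN 'rejected' THEN 'B'
               END AS flag
        FROM   table_x
      ) s
ON    (t.name = s.name AND t.dt = s.dt)
WHEN MATCHED THEN UPDATE SET t.flag = s.flag;
```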
    MNP 221 09-Nov-11 rejected
    MNP 666 10-Nov-11 approved
    MNP 221 11-Nov-11 rejected
    MNP 999 12-Nov-11 approved ...
    I need to check the status of the latest request of each employee and set the flag of all his requests as -
    ‘A’ if the latest request status is ‘approved’
    ‘B’ if the latest request status is ‘rejected’
    The resultset should be something like -
    name Request_ID date status Flag
    MNP 221 09-Nov-11 rejected B
    MNP 666 10-Nov-11 approved B
    MNP 221 11-Nov-11 approved B
    MNP 999 12-Nov-11 rejected B
    In the original data, the status for the row with request_id=999 was 'approved'. Is there a typo somewhere?
    Edited by: Frank Kulash on Jan 26, 2012 10:57 PM

  • How to use the first row function value for the rest of records

    Hi all,
    I have a select statement like this. We are calling the function 'enabled' for each row, so it is taking time. As per our business logic, we can reuse the value that 'enabled' returns for the first row for the other rows as well, so that we can avoid calling 'enabled' from the next row onwards.
    Before
    SELECT
    decode(enabled(col1, col2, col3),'TRUE','xxx', col4),
    decode(enabled(col1, col2, col3),'TRUE','xxx', col5),
    decode(enabled(col1, col2, col3),'TRUE','xxx', col6),
    decode(enabled(col1, col2, col3),'TRUE','xxx', col7),
    decode(enabled(col1, col2, col3),'TRUE','xxx', col8)
    from
    table1 t1, table2 t2 where t1.col1 = t2.col1
    After
    SELECT
    decode(enabled(col1, col2, col3),'TRUE','xxx', col4),
    decode('true/false','TRUE','xxx', col5), --Here 'true/false' is the first row value after calling the function 'enabled'
    decode('true/false','TRUE','xxx', col6),
    decode('true/false','TRUE','xxx', col7),
    decode('true/false','TRUE','xxx', col8)
    from
    table1 t1, table2 t2 where t1.col1 = t2.col1
    Any thoughts on this, how to implement this logic
    Thanks,
    Pal

    It's not clear where col1, col2, col3... etc. come from, so I've assumed they're in table1.
    with table1_X as (select enabled(col1, col2, col3) as enbl
                            ,col1, col2, col3, col4, col5, col6, col7, col8
                      from   table1)
    SELECT decode(t1.enbl,'TRUE','xxx', col4)
          ,decode(t1.enbl,'TRUE','xxx', col5)
          ,decode(t1.enbl,'TRUE','xxx', col6)
          ,decode(t1.enbl,'TRUE','xxx', col7)
          ,decode(t1.enbl,'TRUE','xxx', col8)
    from   table1_x t1 join table2 t2 on (t1.col1 = t2.col1)
    p.s. you'd be better off leaving out the enabled function altogether and incorporating its logic directly in the SQL; then you would avoid any context switching, which will clearly impact performance on any large amount of data.
    Edited by: BluShadow on 14-May-2013 08:26
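    Another trick sometimes used for this kind of problem (not suggested in the thread, so benchmark before relying on it): wrapping the function call in a scalar subquery lets Oracle's scalar subquery caching skip re-executing the function when the argument values repeat:

```sql
SELECT decode ((SELECT enabled (t1.col1, t1.col2, t1.col3) FROM dual),
               'TRUE', 'xxx', t1.col4)
,      decode ((SELECT enabled (t1.col1, t1.col2, t1.col3) FROM dual),
               'TRUE', 'xxx', t1.col5)
FROM   table1 t1
JOIN   table2 t2 ON (t1.col1 = t2.col1);
```

    As with the rewrite above, this assumes col1..col5 live in table1.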

  • ESB Process reads a Batch file for a limited set of records only..?

    Dear All,
    I am having an ESB process that reads a Batch file (csv) that has around 10,000 Products information.
    I have another BPEL process that will create the Product in Oracle Apps through standard API (Main BPEL process that calls a child process for creating product).
    I am invoking the BPEL process from ESB, and that works fine.
    Now, the issue is: I am able to create only around 10 products at a time (the main process calls the child process in a loop). The main process instance is not getting created, but the child process instance is created for a subset of the records; after that, no more child process instances are created. The main process instance cannot be found in the console. I don't understand why the process is not visible and why it does not complete.
    What could be the problem here? Do I need to change any environment configuration?
    Please update...
    Many Thanks in advance...

    Does this apply to you?
    https://social.technet.microsoft.com/Forums/en-US/9ccd8e50-b0af-4f58-9787-6435834e4c52/dfsr-not-working-on-new-windows-2008r2-server
    Thanks for the link, but no - not the same scenario although the error is the same.   The RGs I'm working with are all in sync and communication is working, it's just getting the backlog reported correctly.
    To reiterate, I can paste two versions of the exact command into the DOS window buffer; one copied from OneNote and one copied from my batch file.  Executing the one from OneNote succeeds and reports the RG in sync and the one copied from the batch
    file fails.
    I can repeat the results by pressing up arrow once to the command pasted into the buffer from the batch file and seeing it fail. Then up arrow twice to retrieve the command pasted from OneNote into the buffer, and it will report correctly (illustrated in the graphic).
    Let me add that the command in the batch file was originally copied from OneNote and pasted in to the batch file; as if going into the batch file somehow corrupts it.
    - a -

  • Need to concatonate multiple values for same key columns into one string

    Hi... I'm attempting to use PL/SQL to read through a table containing more than one column value for a set of keys, and concatenate those values together into one string. For example, in the STUDENT table, for a STUDENT_ID there are multiple MAJORS_ACCOMPLISHED values such as History, Biology and Mathematics. Each of the majors is stored in a different row with the same STUDENT_ID due to different faculty DEPARTMENT values.
    I want to read through the table and write out a single row for the STUDENT_ID and concatenate the values for MAJORS_ACCOMPLISHED. The next row should be another STUDENT_ID and the MAJORS_ACCOMPLISHED for that student, etc.
    I've hit a wall trying to use cursors and WHILE loops to get the results I want. I'm hoping someone may have run across this before and has a suggestion. Thanks!!

    I think you are looking for string aggregation.
    The following are the replies posted by me in the forum recently on the same case.
    they might help you.
    Re: Concatenating multiple rows in a table - Very urgent - pls help
    Re: Doubt in a query ( Urgent )
    Re: How to remove words which consist of max 2 letters?
    Re: output like Name1,Name2,Name3...
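    The linked threads cover the details, but for reference: on Oracle 11g Release 2 and later, this kind of string aggregation can be done in plain SQL with LISTAGG. An untested sketch using the table and column names from the question:

```sql
SELECT student_id
,      LISTAGG (majors_accomplished, ', ')
         WITHIN GROUP (ORDER BY majors_accomplished) AS majors
FROM   student
GROUP BY student_id;
```

    On older versions, common substitutes include SYS_CONNECT_BY_PATH or a user-defined aggregate function, along the lines discussed in those threads.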

  • Comma separated values for input and return multiple values

    Hello everyone,
    I have this simple package. Can someone suggest a way to accept multiple empno values as input (comma separated) and return a set of salary values for that set of employee numbers (compatible with lower Oracle versions)? Thanks much!
    CREATE OR REPLACE PACKAGE test_multi IS
    FUNCTION GET_sal(P_empno IN emp.empno%TYPE) RETURN NUMBER;
    END test_multi;
    CREATE OR REPLACE PACKAGE BODY test_multi IS
    FUNCTION GET_sal(P_empno IN emp.empno%TYPE) RETURN NUMBER IS
    V_sal NUMBER(10,2);
    MSG VARCHAR2(200);
    BEGIN
    SELECT sal
    INTO V_sal
    FROM emp
    WHERE empno = p_empno;
    RETURN V_sal;
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    DBMS_OUTPUT.PUT_LINE('No data found.');
    IF (V_sal IS NULL OR V_sal = 0) THEN
    V_sal := 0;
    END IF;
    RETURN V_sal;
    WHEN OTHERS THEN
    MSG := SUBSTR(SQLERRM, 1, 70);
    DBMS_OUTPUT.PUT_LINE(MSG);
    END GET_sal;
    END test_multi; -- End package

    A way to do this in 10g or above...
    SQL> ed
    Wrote file afiedt.buf
      1  with e as (select '7499,7698,7654,7902' as enos from dual)
      2  --
      3  select empno, sal
      4  from emp
      5  where empno in (select regexp_substr(enos,'[^,]+',1,rownum)
      6                  from   e
      7*                 connect by rownum <= length(regexp_replace(enos,'[^,]'))+1)
    SQL> /
         EMPNO        SAL
          7902       3000
          7698       2850
          7654       1250
          7499       1600
    SQL>
    As for Oracle 8... well... like Oracle, I no longer use unsupported versions, so I'd recommend you upgrade to something that is supported.
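    If the database really is too old for the REGEXP functions, the splitting can also be done in PL/SQL with INSTR and SUBSTR. A hypothetical helper (untested sketch; the name GET_SAL_CSV is made up) in the spirit of the original package:

```sql
CREATE OR REPLACE FUNCTION get_sal_csv (p_empnos IN VARCHAR2)
   RETURN VARCHAR2
IS
   v_list   VARCHAR2 (4000) := p_empnos || ',';  -- trailing comma simplifies the loop
   v_pos    PLS_INTEGER;
   v_empno  emp.empno%TYPE;
   v_sal    emp.sal%TYPE;
   v_out    VARCHAR2 (4000);
BEGIN
   WHILE v_list IS NOT NULL
   LOOP
      -- Peel off the next empno up to the next comma
      v_pos   := INSTR (v_list, ',');
      v_empno := TO_NUMBER (TRIM (SUBSTR (v_list, 1, v_pos - 1)));
      v_list  := SUBSTR (v_list, v_pos + 1);
      BEGIN
         SELECT sal INTO v_sal FROM emp WHERE empno = v_empno;
      EXCEPTION
         WHEN NO_DATA_FOUND THEN v_sal := 0;
      END;
      v_out := v_out || v_empno || '=' || v_sal || ' ';
   END LOOP;
   RETURN RTRIM (v_out);
END get_sal_csv;
```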

  • Load values into a value set assigned to a segment of the accounting flexfield

    Hi folks:
    I have defined an accounting flexfield with several segments. One of them is the natural account, qualified as natural.
    I have to enter up to 30000 values for that value set, and I want to know if there is a script to load those values directly into the table with SQL*Plus.
    I know this is not supported, but the problem is that I have to do this five times.
    Any help will be useful

    Hi Jose Luis!
    Check out the following website: http://www.comstar.co.uk/dataload/
    Dataload is a tool used by many Oracle Consultants for exactly the purpose you require. You can download the tool and several templates from this site. Enjoy!
    Pat Henry

  • Forcing iTunes to re-calculate normalization values?

    hi,
    I recently changed the volume level of many of my songs using an external program (mp3gain). The problem is that after I did that, iTunes never recalculated the values it uses for normalization. Since it's using the values it calculated about 2 months ago, but the volume of my songs has changed, normalization doesn't work as it should (obviously).
    So what I need to do: I need to somehow force iTunes to recalculate the normalization values for each song, but I don't really know how, except for deleting all songs and re-adding them (which would cause a loss of ratings/playcounts etc., so I don't want to do it this way).
    Does anyone know how to force iTunes to do this?
    thanks

    Thanks. I've seen Otto42 around the foobar2k / HA forums.
    From what I've seen in his posts here, I've got to re-import. But I've exported the podcasts as an .opml file. Hopefully this will retain the podcasts as actual podcasts (I love the bookmarking feature). I'll report back here after I'm done. Right now I've got MP3Tag stripping the "comment" fields from my .mp3s and the "iTunNORM" fields from my .m4a files.
    :fingers crossed:

  • Setting up maximum order value for vendor

    Hi experts,
    I want to set a maximum price/value limit per vendor for placing orders.
    For example, I want to set a maximum limit of 1 lakh for vendor A.
    Whenever I create a PO for vendor A, the system should check the maximum value for vendor A.
    And if there are many open orders for vendor A, then the total of the open POs should be counted, and whenever I am creating a new PO, if the total exceeds the value of 1 lakh, the system should not allow the new PO to be created.
    Vendor A     PO1     value - 50000     status - open
    Vendor A     PO2     value - 30000     status - open
    Vendor A     PO3     value - 40000     status - creating now
    At the time of creating PO3, the total PO value exceeds the vendor limit, so the system should not allow PO3 to be created/saved.
    Please let me know how to handle this.

    Hi,
    This can get really complex unless you have a clear vision of what you want to control.
    For example, the maximum limit that you want to set: is it for the current month / current year etc.?
    If it is the sum of all open PO values, would you consider POs that were created in the same month and delivered, but where payment is pending even though the invoice is posted?
    Answers to such questions provide the clarity for the controls that need to be set in place.
    For this you can create a Z table and maintain the limits you would like to set, based on criteria such as monthly or yearly.
    Then use a user exit in the PO to calculate the net value of all open POs for the vendor, validate it against the Z table, and throw an error if it exceeds the limit.
    However, performance may be affected by such calculations on PO save.
    In that case you may have to consider storing the cumulated PO value on every PO creation and PO completion.
    Consider all this and take a decision.
    Hope this helps.

  • What are some best practices for Effective Sequences on the PS job record?

    Hello all,
    I am currently working on an implementation of PeopleSoft 9.0, and our team has come up against a debate about how to handle effective sequences on the job record. We want to fully grasp and figure out what is the best way to leverage this feature from a functional point of view. I consider it to be a process-related topic, and that we should establish rules for the sequence in which multiple actions are inserted into the job record with a same effective date. I think we then have to train our HR and Payroll staff on how to correctly sequence these transactions.
    My questions therefore are as follows:
    1. Do you agree with how I see it? If not, why, and what is a better way to look at it?
    2. Is there any way PeopleSoft can be leveraged to automate the sequencing of actions if we establish a rule base?
    3. Are there best practice examples or default behavior in PeopleSoft for how we ought to set up our rules about effective sequencing?
    All input is appreciated. Thanks!

    As you probably know by now, many PeopleSoft configuration/data (not transaction) tables are effective dated. This allows you to associate a dated transaction on one day with a specific configuration description, etc. for that date, and a different configuration description, etc. on a different transaction with a different date. Effective dates are part of the key structure of effective dated configuration data.
    Because effective date is usually the last part of the key structure, it is not possible to maintain history for effective dated values when data for those configuration values changes multiple times in the same day. This is where effective sequences enter the scene. Effective sequences allow you to maintain history regarding changes in configuration data when there are multiple changes in a single day.
    You don't really choose how to handle effective sequencing. If you have multiple changes to a single setup/configuration record on a single day and that record has an effective sequence, then your only decision is whether or not to maintain that history by adding a new effective sequenced row or updating the existing row.
    Logic within the PeopleSoft delivered application will either use the last effective sequence for a given day, or the sequence that is stored on the transaction. The value used by the transaction depends on whether the transaction also stores the effective sequence. You don't have to make any implementation design decisions to make this happen.
    You also don't determine what values or how to sequence transactions. Sequencing is automatic. Each new row for a given effective date gets the next available sequence number. If there is only one row for an effective date, then that transaction will have a sequence number of 0 (zero).
