Increment Date

Hi,
I have a function that returns the current date, which is displayed on an OAF page in the format mm/dd/yyyy.
I have to change this current date to tomorrow's date by incrementing it at the point where the returned value is stored in the AM, as I can't make changes to the function itself.
The date in the AM is stored in the variable returnValue:
returnValue = new oracle.jbo.domain.Date(date2);
How do I operate on returnValue in the AM so that the date changes to 12/15/2009 if the current date held in returnValue is 12/14/2009?

Hi, here is the code:
import java.util.Calendar;

returnValue = new oracle.jbo.domain.Date(date2);
// oracle.jbo.domain.Date already wraps a java.sql.Date, so there is no need
// to parse the displayed string (beware: in SimpleDateFormat "mm" means
// minutes; the month pattern is "MM")
java.sql.Date date = returnValue.dateValue();
Calendar caldt = Calendar.getInstance();
caldt.setTime(date);
caldt.add(Calendar.DATE, 1); // increment to tomorrow
// store the incremented date back into the domain type
returnValue = new oracle.jbo.domain.Date(new java.sql.Date(caldt.getTimeInMillis()));
Thanks,
Pratap

Similar Messages

  • How to extract incremental data from SQL server to oracle tables in ODI

    Hi All,
    My source is in SQL Server and my target is in Oracle, with ODI installed in between.
    I need to create an interface mapping where I extract incremental data from SQL Server into Oracle.
    There is a datetime (with timestamp) field in SQL Server, and I need to pull the incremental data based on that datetime.
    Example: tablename.DateTime > (select '1-jan-11' from dual) ..... I am using this query but it's not working; the error is Invalid object name "dual" (SQL Server has no dual table, so the filter fails on the source).
    We are not going to use the incremental IKM and LKM.
    Request you to please provide any suggestion ASAP.
    Thanks,
    Lony

    You can do that via a variable.
    In the interface mapping, create a filter on Tablename.DateTime
    and put the condition like this:
    Tablename.DateTime BETWEEN #VAR
    In the variable, use this query on the refreshing tab against the Oracle schema:
    SELECT max(start_time)||' AND '||max(END_TIME)+1 from audit_table where ETL_JOB_CODE = '20'
    In the package, call the above variable in refresh mode and then the interface.
    This way the variable supplies both ends of the BETWEEN condition to the interface, so SQL Server fetches only the data in that range.
    Note: you might need to tweak the date format so that SQL Server can understand it.
    Hope this helps.
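    As a rough illustration only (audit_table, etl_job_code and the variable name are assumptions for this sketch, not objects from this thread), the refreshing query and the filter could look like this:
    -- Refreshing query for the ODI variable, run against the Oracle schema.
    -- It returns a string of the form <start>' AND '<end>, so the filter
    -- below expands into a complete BETWEEN ... AND ... condition.
    SELECT TO_CHAR(MAX(start_time), 'YYYY-MM-DD HH24:MI:SS')
           || ''' AND ''' ||
           TO_CHAR(MAX(end_time) + 1, 'YYYY-MM-DD HH24:MI:SS')
      FROM audit_table
     WHERE etl_job_code = '20';
    -- Filter on the interface mapping (executed on the SQL Server source);
    -- VAR_LOAD_WINDOW is a hypothetical variable name:
    -- TABLENAME.DATETIME BETWEEN '#MY_PROJECT.VAR_LOAD_WINDOW'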

  • Is there an auto-increment data type in Oracle

    Is there an auto-increment data type in Oracle?
    How do you achieve this if there is no auto-increment data type in Oracle?

    jackie (guest) wrote:
    : Is there an auto-increment data type in Oracle?
    : How do you achieve this if there is no auto-increment data type in Oracle
    Hi,
    I think you need unique IDs; for this purpose you use sequences
    in Oracle. Example:
    create table xy (
    id number,
    name varchar2(100)
    );
    alter table xy
    add constraint xy_pk primary key(id);
    create sequence xy_seq start with 1 maxvalue 99999999999;
    (there are many other options for create sequence)
    create or replace trigger xy_ins_trg
    before insert on xy
    for each row
    begin
    select xy_seq.nextval
    into :new.id
    from dual;
    end;
    /
    This produces a unique value for the column id at each insert.
    Hope this will help.
    peter
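    (For readers on a newer release: since Oracle 12c an identity column gives the same behaviour without a sequence or trigger. A minimal sketch:)
    create table xy (
    id number generated always as identity primary key,
    name varchar2(100)
    );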

  • Increment date parameters on each run

    Hi,
    while scheduling concurrent requests, we have an option called
    "increment date parameters on each run".
    What exactly does this option mean?

    Check the following thread:
    schedule purge program
    Also, have a look at:
    Note: 151849.1 - Scheduled Periodic Concurrent Program Runs One Time Only
    https://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=151849.1

  • Incremental Data loading in ASO 7.1

    Hi,
    As per the 7.1 Essbase DBAG:
    "Data values are cleared each time the outline is changed structurally. Therefore, incremental data loads are supported
    only for outlines that do not change (for example, logistics analysis applications)."
    That means we can have incremental loading for ASO in 7.1 for an outline which doesn't change structurally. Now what does it mean for an outline to change structurally? If we add a level 0 member to any dimension, does that count as a structural change to the outline?
    It also says that adding an Accounts/Time member doesn't clear out the data; only adding/deleting/moving a standard dimension member will clear out the data. I'm totally confused here. Can anyone please explain?
    The following actions cause Analytic Services to restructure the outline and clear all data:
    ● Add, delete, or move a standard dimension member
    ● Add, delete, or move a standard dimension
    ● Add, delete, or move an attribute dimension
    ● Add a formula to a level 0 member
    ● Delete a formula from a level 0 member
    Edited by: user3934567 on Jan 14, 2009 10:47 PM

    Adding a Level 0 member is generally, if not always, considered to be a structural change to the outline. I'm not sure if I've tried to add a member to Accounts and see if the data is retained. This may be true because by definition, the Accounts dimension in an ASO cube is a dynamic (versus Stored) hierarchy. And perhaps since the Time dimension in ASO databases in 7.x is the "compression" dimension, there is some sort of special rule about being able to add to it -- although I can't say that I ever need to edit the Time dimension (I have a separate Years dimension). I have been able to modify formulas on ASO outlines without losing the data -- which seems consistent with your bullet points below. I have also been able to move around and change Attribute dimension members (which I would guess is generally considered a non-structural change), and change aliases without losing all my data.
    In general I just assume that I'm going to lose my ASO data. However, all of my ASO outlines are generated through EIS and I load to a test server first. If you're in doubt about losing the data -- try it in test/dev. And if you don't have test/dev, maybe that should be a priority. :) Hope this helps -- Jason.

  • Exporting and importing only the Incremental datas

    Dear all,
    we are facing a very time-consuming process of moving Oracle schema data from live servers to our testing servers.
    The scenario is as follows:
    1. .dmp files of size 1 GB to 3 GB are taken from the live server and loaded on the test servers to check live issues
    2. same-sized dumps are taken for simulation to rectify issues
    This process consumes all the available time in exporting and importing.
    Can anyone suggest how the incremental data alone can be exported and imported to the test server?
    Hoping I can get a valuable solution for this.
    Thanks in advance
    S.Aravind

    Hi Aravind,
    as Nicolas specified, RMAN would be the best option.
    Incremental data refresh is not possible through exp/imp, but it is possible to write a script which automates this refresh process.
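    (A minimal sketch of the RMAN route, using standard RMAN commands; adapt it to your own backup strategy:)
    RMAN> BACKUP INCREMENTAL LEVEL 0 DATABASE;  -- one-time baseline backup
    RMAN> BACKUP INCREMENTAL LEVEL 1 DATABASE;  -- later runs back up changed blocks only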

  • Increment date paramters each run is not working

    hi all,
    I have a scheduled concurrent request that takes a date parameter. I scheduled the request to run daily with the option "increment date parameters each run", but it runs daily without incrementing the date. What could be wrong?
    fadi

    You mean it was working earlier? Yes, and it still works for all the old requests, but not for the new ones.
    Do you by any chance use SQL to define the default parameter (like select sysdate from dual)? Yes I do; I use select sysdate-1 from dual because I need it to run for the previous date.
    Try to use parameter type = date, default = Current Date, and check the increment date option. I will try, but I don't need the current date, I need the current date -1.
    thanks dgood
    fadi

  • Incremental Data load in SSM 7.0

    Hello all,
    I once raised a thread on SDN describing how to automate data loads into SSM 7.0:
    Periodic data load for a model in SSM
    Now my new requirement is not to upload all the data again, but only the new data (data arriving after the previous load). Is there a way to do an incremental data load in SSM 7.0? Loading all of the fact data again and again takes a toll on the performance of the SSM system. Is there a workaround in case there is no direct solution?
    Thanks
    Vijay

    Vijay,
    In your PAS model you can build a procedure to remove data and then load that data to the correct time period.
    In PAS, to remove data but not the variable definitions from the database:
    Removing data for a particular variable:
    REMOVE DATA SALES
    or, if only particular areas within it:
    SELECT product P1
    SELECT customer C1
    REMOVE SELECTED SALES
    or remove all data:
    REMOVE DATA * SURE
    or just a time period:
    REMOVE DATA SALES BEFORE Jan 2008
    Then you would construct or modify your Load Procedure to load the data for the new period:
    SET PERIOD {date range}
    Regards,
    Bpb

  • How to use incremental data load in OWB? can CDC be used?

    hi,
    I am using Oracle 10g Release 2 and OWB 10g Release 1.
    I want to know how I can implement an incremental data load in OWB.
    Does OWB have such a feature built in, like Informatica does?
    Can I use the CDC concept for this? Is it viable and compatible with my environment?
    What could the other possible ways be?

    Hi,
    As such, the current version of OWB does not provide the functionality to use the CDC feature directly. You have to come up with your own strategy for incremental loading: for example, use the update dates if available on your source systems, or use CDC packages to pick up the changed data from your source systems.
    rgds
    mahesh
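    (For the update-date strategy mentioned above, here is a rough sketch of the kind of source filter involved; src_orders, last_update_date and etl_control are hypothetical names, not OWB objects:)
    -- Pull only the rows changed since the last successful load.
    SELECT *
      FROM src_orders s
     WHERE s.last_update_date >
           (SELECT NVL(MAX(c.last_load_date), DATE '1900-01-01')
              FROM etl_control c
             WHERE c.job_name = 'ORDERS_LOAD');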

  • Want to perform an incremental data replication

    I have five vdisks in the DC SAN (EVA 8400) which contain a large amount of data. This data has to be replicated to a newly deployed DR SAN. That is why we took a tape backup from the DC SAN and want to restore it on the DR side before starting replication. After that we want to start a replication which replicates only the incremental data written after the restore from tape. But in HP CA (Continuous Access) I have not found any option to start an incremental data replication; it starts from the zero block of data with a newly auto-created vdisk. So please advise me about the possibility of incremental data replication.

    Actually, I have got it to work...
    I used the methods ACTIVATE_TAB_PAGE & TRANSFER_DATA_TO_SUBSCREEN in the BADI LE_SHP_TAB_CUST_ITEM.
    It's working fine now.
    Thanks for the response.

  • Essbase 7.1 - Incremental data load in ASO

    Hi,
    Is there an incremental data loading feature in ASO version 7.1? Let's say I have the following data in an ASO cube:
    P1 G1 A1 100
    Now, I get the following 2 rows as the incremental data from the relational source:
    P1 G1 A1 200
    P2 G1 A1 300
    So, once I load these rows using a rule file with the override-existing-values option, will I have the following dataset in ASO?
    P1 G1 A1 200
    P2 G1 A1 300
    I know there is a data load buffer concept in ASO 7.1, and that this is the only way to improve data load performance. But I just wanted to check whether we can implement incremental loading in ASO or not.
    And one more thing: can 2 load rules run in parallel to load data into ASO cubes? As per my understanding, when we start loading data, the cube is locked against any other insert/update. Please correct me if I'm wrong!
    Thanks!

    Hi,
    I think features such as incremental data loads only became available from version 9.3.1.
    The "What's New" for Essbase 9.3.1 contains:
    Incrementally Loading Data into Aggregate Storage Databases
    The aggregate storage database model has been enhanced with the following features:
    ● An aggregate storage database can contain multiple slices of data.
    ● Incremental data loads complete in a length of time that is proportional to the size of the incremental data.
    ● You can merge all incremental data slices into the main database slice or merge all incremental data slices into a single data slice while leaving the main database slice unchanged.
    ● Multiple data load buffers can exist on a single aggregate storage database. To save time, you can load data into multiple data load buffers at the same time.
    ● You can atomically replace the contents of a database or the contents of all incremental data slices.
    ● You can control the share of resources that a load buffer is allowed to use and set properties that determine how missing and zero values, and duplicate values, in the data sources are processed.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Incrementing Date field in Variable based on a specific calendar

    Hello All,
    I am currently setting up a schedule that includes mass activities. I am running into a problem: I want a particular date field of an activity to increment by one day based on a specific calendar, i.e. a factory calendar (M-F open).
    EX)
    On any given day the mass activity runs, the due date field should be the system date plus 2 business days,
    while the posting date field on the same transaction should increment by 2 calendar days.
    Result would be:
    On System date = 02/17/11
    Due Date Field = 02/21/11
    Posting Date = 02/19/11
    I am currently using an expression similar to this: =Time.expression(Time.now('GMT'), 'add 2 days') but I cannot find documentation on how to include calendars in it, as you would with =Time.isTimeWindowOpen(Time.now('GMT'), 'Factory Calendar').
    Can someone please point me in right direction?
    Thanks
    Dan

    Hi Gerben,
    I tried this and it yielded unacceptable (but expected) results. Here is why I think this is the case:
    =Time.expression(Time.nextTimeWindowOpening(Time.now('America/Regina'), 'System_Week_WorkDays'), 'add 7 days') -- the window is supposed to be open only M-F
    Time.now() should evaluate to today, let's say 02/21/11;
    then Time.nextTimeWindowOpening should return the next open day, on or after today; it checks, finds today is open, and so returns 02/21/11;
    and the time expression adds n = 7 days to that: 02/21/11 + 7 = 02/28/11. This is wrong.
    According to this calendar it should be 03/01/11.
    The problem is that the add function just adds n days regardless of the calendar, since the calendar is not part of the addition argument.
    Edited by: Daniel Austin on Feb 21, 2011 9:31 PM

  • Loading incremental data

    Hi,
    I am using an 11.1.0.7 DB with 11.1.0.7B AWM. I would like to load data incrementally.
    Sometimes my fact data contains tuples which were loaded before (corrections/restatements), and when I load incrementally the load just replaces the existing tuple in the cube.
    Is there a better way to do an incremental load other than fetching all the relevant data (from the historic tables), applying a GROUP BY, and then loading the result?
    Thanks,

    The term "incremental" has two common meanings in the context of loading data into a cube. First, it could refer to loading a subset of records from source tables. For example, a fact table has data for years 2005 - 2010 and data is added daily. The goal of an incremental load might be to load only those records that we added to or updated in the fact table yesterday (e.g., '29-MAR-2010'). Solutions (1), (2) and (3) apply to that situation.
    "Incremental" might also be used to describe a situation where data read during a load changes, rather than replaces, data in the cube. For example, data such as the following already exists in the cube:
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 100.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 150.00
    and the following data is added to the fact table (and these are the only records for these time, product and customer values):
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 15.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 -25.00
    And the intent is to have data appear as follows in the cube:
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 115.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 125.00
    What you need to know is that data read from a table always replaces the data that exists in the cube. So, if you just load from the fact table into the cube the data will be:
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 15.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 -25.00
    There are two things that you could do that would yield the following data in the cube:
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 115.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 125.00
    A) You could load the following records from the fact table directly into the cube.
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 115.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 125.00
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 15.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 -25.00
    The SQL used to load the cube can do a SUM .... GROUP BY. The net result will be:
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 115.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 125.00
    (I think you might need to map the cube with joins to get the sum ... group by. Be sure to check the SQL in the cube_build_log to make sure you are getting the SQL you expect.)
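    (For illustration, a sketch of the kind of SUM ... GROUP BY source query meant above; sales_fact and its columns are assumed names, not objects from this thread:)
    -- Collapse multiple fact rows per (time, product, customer) into one
    -- value before the cube reads them.
    SELECT time_id, product_id, customer_id, SUM(sales) AS sales
      FROM sales_fact
     GROUP BY time_id, product_id, customer_id;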
    B) You could load the following records into a separate cube (let's call this SALES_CUBE_UPDATE, while your main cube is named SALES_CUBE).
    28-MAR-2010 PRODUCT_1 CUSTOMER_1 15.00
    28-MAR-2010 PRODUCT_3 CUSTOMER_2 -25.00
    As a post-load task, you can update SALES in the SALES_CUBE to be the sum of the current value of SALES plus the value in the SALES_CUBE_UPDATE cube. You would do this with OLAP DML code such as:
    sales_cube_sales_stored(sales_cube_measure_dim 'SALES') = sales_cube_sales_stored + sales_cube_update_sales
    If you use this method, you would ideally:
    - Filter (in OLAP DML terms, LIMIT) the dimensions of the cube to only those values that have data in the SALES_CUBE_UPDATE cube so you don't spent time looping over dimension values that don't have data.
    - Loop over the composite dimension. E.g.,
    SET sales_cube_sales_stored(sales_cube_measure_dim 'SALES') = sales_cube_sales_stored + sales_cube_update_sales ACROSS sales_cube_composite
    Or, if the cube is partitioned (almost all cubes benefit from partitioning) you will loop the partition template. E.g.,
    SET sales_cube_sales_stored(sales_cube_measure_dim 'SALES') = sales_cube_sales_stored + sales_cube_update_sales ACROSS sales_cube_prt_template
    In most cases, you will do this only at the leaf levels and then aggregate, so the entire process will look something like this:
    1) Load data into the sales_cube_update cube using the LOAD command (create this using a cube script in AWM). Don't bother to aggregate as part of the load.
    2) Run an OLAP DML program such as:
    " Limit to lowest levels
    LIMIT time TO time_levelrel 'DAY'
    LIMIT product TO product_levelrel 'ITEM'
    LIMIT customer TO customer_levelrel 'CUSTOMER'
    " Keep only those values where data exists in the SALES_CUBE_UPDATE cube.
    LIMIT time KEEP sales_cube_update_sales NE na
    LIMIT product KEEP sales_cube_update_sales NE na
    LIMIT customer KEEP sales_cube_update_sales NE na
    " Add the values of the sales_cube_update cube to the values in the sales_cube.
    " Loops the partition template for better performance.
    SET sales_cube_sales_stored(sales_cube_measure_dim 'SALES') = sales_cube_sales_stored + sales_cube_update_sales ACROSS sales_cube_prt_template
    " Save the data.
    UPDATE
    COMMIT
    3) Aggregate the cube (create an AGGREGATE command in an AWM cube script).
    Notes:
    - Be sure to clear data from the SALES_CUBE_UPDATE cube before or after you load new data into it. (E.g., use the OLAP DML CLEAR command.)
    - If you will be running OLAP DML commands on data that exists in multiple partitions you can parallize the execution of the OLAP DML code. See the following post: http://oracleolap.blogspot.com/2010/03/parallel-execution-of-olap-dml.html
    Well, a bit of a lengthy explanation, but I hope it helps. Good luck.

  • Insert record incrementing date - low cost

    Hi,
    I have the following record structure:
    PLAN_LEVEL     PERIOD_NUM     DAYS     OPERATIONSEQ     GROUP_ID     ALLOY     PLANNER_CODE     DEPARTMENT_CLASS     DEPARTMENT     LOAD_HOUR     OFFSET_DATE     OFFSET
    0     1     20     150     525150     INCONEL alloy 625     WIRE ROD     CD     WP     8.45402793     3/19/2007     14
    0     1     20     140     525150     INCONEL alloy 625     WIRE ROD     CD     BA     4.586287974     3/17/2007     30
    0     1     20     90     525150     INCONEL alloy 625     WIRE ROD     MM     MW2     0.750050215     3/12/2007     51
    0     1     20     70     525150     INCONEL alloy 625     WIRE ROD     MM     M2M     0.027779638     3/12/2007     72
    1     1     20     160     525150     INCONEL alloy 625     WIRE ROD     PMD     DPP     1.453799581     3/4/2007     101
    1     1     20     140     525150     INCONEL alloy 625     WIRE ROD     CHP     CHP     0.97472359     3/2/2007     132
    1     1     20     130     525150     INCONEL alloy 625     WIRE ROD     CHP     SG     4.873617025     2/25/2007     168
    1     1     20     120     525150     INCONEL alloy 625     WIRE ROD     CHP     CHP     1.091515669     2/24/2007     205
    1     1     20     110     525150     INCONEL alloy 625     WIRE ROD     PMD     PM     0.210131401     2/24/2007     242
    1     1     20     90     525150     INCONEL alloy 625     WIRE ROD     PMD     PM     0.262927179     2/18/2007     285
    1     1     20     60     525150     INCONEL alloy 625     WIRE ROD     B30     ESR     15.18807308     2/15/2007     331
    1     1     20     20     525150     INCONEL alloy 625     WIRE ROD     VIM     VIM     2.191847409     2/12/2007     380
    There is a days column in the table; I just need to replicate/insert the above data into the same table, incrementing offset_date.
    e.g.
    as the days value is 20, I have to insert the above record group 20 times, incrementing offset_date by 1 each time, so:
    for i in 1..days (here 20)
    insert into the table with
    offset_date+1 for each record; in this case there are 12 records, so the total number of inserts for this record group will be 12*20, and each pass should increase offset_date by 1.
    Is there any low-cost (good performance) solution other than looping through it?
    thanks in advance

    This may be better:
    Connected to Oracle Database 10g Enterprise Edition Release 10.2.0.1.0
    Connected as SYS
    SQL>
    SQL> drop table t;
    Table dropped
    SQL> create table t(i number, d1 date, offset number);
    Table created
    SQL> insert into t values(1, sysdate - 37, 3 );
    1 row inserted
    SQL> insert into t values(2, sysdate - 34, 7 );
    1 row inserted
    SQL> select * from t;
             I D1              OFFSET
             1 26.12.2006           3
             2 29.12.2006           7
    SQL> SELECT t.i, t.d1 + t2.i - 1, t.offset
      2    FROM t
      3        ,(SELECT rownum i
      4            FROM dual
      5          CONNECT BY LEVEL <= (SELECT MAX(offset) FROM t)) t2
      6   WHERE t.offset + 1 >= t2.i
      7   ORDER BY t.i, t.d1 + t2.i - 1;
             I T.D1+T2.I-1     OFFSET
             1 26.12.2006           3
             1 27.12.2006           3
             1 28.12.2006           3
             1 29.12.2006           3
             2 29.12.2006           7
             2 30.12.2006           7
             2 31.12.2006           7
             2 01.01.2007           7
             2 02.01.2007           7
             2 03.01.2007           7
             2 04.01.2007           7
    11 rows selected
    SQL> drop table t_temp;
    drop table t_temp
    ORA-00942: table or view does not exist
    SQL> CREATE TABLE t_temp AS
      2    SELECT t.i, t.d1 + t2.i - 1 d1, t.offset
      3      FROM t
      4          ,(SELECT rownum i
      5              FROM dual
      6            CONNECT BY LEVEL <= (SELECT MAX(offset) FROM t)) t2
      7     WHERE t.offset + 1 >= t2.i;
    Table created
    SQL> SELECT * FROM t_temp;
             I D1              OFFSET
             1 26.12.2006           3
             2 29.12.2006           7
             1 27.12.2006           3
             2 30.12.2006           7
             1 28.12.2006           3
             2 31.12.2006           7
             1 29.12.2006           3
             2 01.01.2007           7
             2 02.01.2007           7
             2 03.01.2007           7
             2 04.01.2007           7
    11 rows selected
    SQL>

  • Incremental Data Backup

    Dear sir/Madam
    I have one server running Windows NT with an Oracle 8 database loaded,
    and one database which has many users. I am taking a backup of
    one user, x, which is very important to me. The data is exported and
    imported to another server, running Windows 2000 with an Oracle 8i database.
    On this server I import the data of user x.
    Every time, I export the data from the Windows NT server database, delete the user on the Windows 2000 server, and import the latest updated dmp file.
    What happens is that this consumes a lot of time and imports the data of all the tables.
    So what I need is for the system to automatically export the updated tables and
    import them on this server with the help of an incremental command, but I don't want to take a full DBA backup.
    Please guide us how to do the same.
    regards
    Mathpati.

    Post your question on the Database - General forum.
    And then you'll get an answer like:
    1- Create and save a script in the file system that runs a command line to export the user you want. You may use the very same script you're already using;
    2- Schedule this script in the OS scheduler (cron on unix-based systems, or whatever scheduler you want);
    Regards,
    Marcos
