Triggers for moving data

Hi all,
I have defined several tiers (each with its own tablespaces) and set a retention time for each one, for example one month for a tier. My understanding is that once that month has passed, the datafiles (or partitions) in that tier will be moved into the next tier defined. But is there any trigger or condition that prevents moving data that is still active (still being used), even if it belongs to a tier whose retention time has expired?
If yes, where can I set those conditions?

The ILM Assistant (ILMA) is not designed to automatically move a partition from one tier to another. The ILMA will generate the scripts to move a partition between tiers, but it does not then execute those scripts, even though the calendar on the main page might make you think it is building a schedule of when to move a partition. Once you have generated the SQL scripts from the ILMA, you can hand them to your DBA for scheduling alongside all the other database operations, or you can schedule the execution of the scripts yourself.
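For example, once the ILMA has generated a move script, one way to schedule it yourself is with a DBMS_SCHEDULER job. This is only a minimal sketch with a hypothetical table, partition, tablespace and run date; the actual ALTER TABLE ... MOVE PARTITION statements come from the script the ILMA generates:

BEGIN
  -- Hypothetical example: run the generated tier move once, out of hours.
  -- Replace the job action with the statements from the ILMA-generated script.
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'MOVE_ORDERS_2011Q4_TO_TIER2',
    job_type   => 'PLSQL_BLOCK',
    job_action => 'BEGIN
                     EXECUTE IMMEDIATE
                       ''ALTER TABLE orders MOVE PARTITION orders_2011q4
                         TABLESPACE tier2_ts UPDATE INDEXES'';
                   END;',
    start_date => TIMESTAMP '2012-06-02 01:00:00 +00:00',
    enabled    => TRUE);
END;
/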
If you could provide some more information about your specific use case, it might be possible to suggest a workaround.
Hope this helps
Keith

Similar Messages

  • Which CKM is used for moving data from Oracle to delimited file ?

    Hi All,
    Please let me know which CKM is used for moving data from Oracle to a delimited file.
    Also, is there a need to define each column beforehand in the target datastore? Can't ODI take it from the Oracle table itself?

    Addy,
    A CKM is a Check KM, which is used to validate data and log errors. It is not going to assist you in data movement. You will need an LKM SQL to File Append, as answered in another thread.
    Assuming that you have a one-to-one mapping, to make things simpler you can duplicate the Oracle-based model and create a file-based model. This will take all the column definitions from the Oracle-based model.
    Alternatively, you can also use the ODI tool OdiSqlUnload to dump the data to a file.
    HTH

  • What are required Oracle products for moving data from IBM IMS/DB(mainframe) to Oracle environment?

    I am a z/OS system programmer; our company uses IMS as its main OLTP database. We are investigating moving data off the mainframe for data warehousing and online fraud detection. One option is using IBM InfoSphere CDC and DB2; another is using IMS Connect and writing our own program. I am wondering what the Oracle solution is for this kind of requirement.
    I am not an Oracle technician, but I googled and found that Oracle has products such as Oracle Legacy Adapters, OracleAS CDC Adapter and Oracle Connect on z/OS. However, I didn't find them on the Oracle software delivery site (https://edelivery.oracle.com/), so I don't know whether these products are deprecated or not.
    I would very much appreciate any help or guidance you are able to give me.


  • Database Copy Doesn't find tables for moving data

    I'm running SQL Dev 3.1.7 on a Win7-64 PC against Oracle 11g. Using the Database Copy function to move and update tables from one instance to another. The application seems to work fine, creating all of the tables, but in the script when it tries to move the data, for some tables it fails as follows:
    Moving Data for object MY_TABLE_NAME
    Unable to perform batch insert.
    MY_TABLE_NAME ORA-00942: table or view does not exist
    Earlier in the script it dropped and then recreated the table in the target instance. The table does exist in the source instance. This occurred for 24 tables out of 105.
    Any ideas what is going on?

    According to the log, the table got created. Here are the log entries as they pertain to our APPOINTMENT table ...
    DROP TABLE "APPOINTMENT" cascade constraints;
    table "APPOINTMENT" dropped.
    DROP SEQUENCE "APPOINTMENT_SEQ";
    sequence "APPOINTMENT_SEQ" dropped.
    -- DDL for Sequence APPOINTMENT_SEQ
    CREATE SEQUENCE "APPOINTMENT_SEQ" MINVALUE 1 MAXVALUE 9999999999999999999999999999 INCREMENT BY 1 START WITH 17721 CACHE 20 NOORDER NOCYCLE ;
    sequence "APPOINTMENT_SEQ" created.
    -- DDL for Table APPOINTMENT
    CREATE TABLE "APPOINTMENT" ("APPOINTMENT_ID" NUMBER(19,0), "PRACTICE_PATIENT_ID" NUMBER(19,0), "PRACTICE_PET_OWNER_ID" NUMBER(19,0), "STAFF_MEMBER_ID" NUMBER(19,0), "APPT_STATUS_ENUM_NAME" VARCHAR2(32 BYTE), "EXT_PRACTICE_APPOINTMENT_ID" VARCHAR2(255 BYTE), "CONFIRMATION_KEY" VARCHAR2(255 BYTE), "REASON" VARCHAR2(255 BYTE), "EXT_REASON" VARCHAR2(255 BYTE), "EXT_SECONDARY_REASON" VARCHAR2(255 BYTE), "NOTE" BLOB, "APPOINTMENT_DATE" TIMESTAMP (6), "CREATED_DATE" TIMESTAMP (6), "CREATED_BY" VARCHAR2(255 BYTE), "LAST_MODIFIED_DATE" TIMESTAMP (6), "LAST_MODIFIED_BY" VARCHAR2(255 BYTE), "CONFIRMATION_TIME" TIMESTAMP (6)) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "VET2PET_T" LOB ("NOTE") STORE AS BASICFILE ( TABLESPACE "VET2PET_T" ENABLE STORAGE IN ROW CHUNK 8192 RETENTION NOCACHE LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)) ;
    table "APPOINTMENT" created.
    It even truncates the table before attempting to load the data ...
    TRUNCATE TABLE "APPOINTMENT";
    table "APPOINTMENT" truncated.
    But when it goes to move the data ...
    --- START --------------------------------------------------------------------
    Moving Data for object APPOINTMENT
    Unable to perform batch insert.
    APPOINTMENT ORA-00942: table or view does not exist
    --- END --------------------------------------------------------------------
    There is a lookup table involved but at the time of moving the data none of the foreign keys have been enabled.
    All the indices for the table get created with no issue and here is the log entry for when the constraints are created and enabled ...
    -- Constraints for Table APPOINTMENT
    ALTER TABLE "APPOINTMENT" ADD CONSTRAINT "PK_APPOINTMENT" PRIMARY KEY ("APPOINTMENT_ID") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "VET2PET_I" ENABLE;
    ALTER TABLE "APPOINTMENT" MODIFY ("CREATED_BY" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("CREATED_DATE" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("CONFIRMATION_KEY" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("EXT_PRACTICE_APPOINTMENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPT_STATUS_ENUM_NAME" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("PRACTICE_PET_OWNER_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("PRACTICE_PATIENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPOINTMENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPOINTMENT_DATE" NOT NULL ENABLE);
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    Any ideas?
    thanks.
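    One thing worth checking when the DDL reports success but the batch insert then raises ORA-00942 is whether the insert runs in the same schema (and connection) that just created the table; a different default schema on the destination connection would produce exactly this error. A quick sanity check on the target, using the APPOINTMENT table from the log above:

    -- Confirm the table exists on the target and which schema owns it
    SELECT owner, table_name
      FROM all_tables
     WHERE table_name = 'APPOINTMENT';

    -- Compare with the schema the copy connection is actually using
    SELECT sys_context('USERENV', 'CURRENT_SCHEMA') AS current_schema
      FROM dual;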

  • Options for moving data

    I'm moving some rather large tables between two Oracle 10 instances.
    In simple words I would like to copy a schema from one database to another. Simple, you say? Not so!
    I cannot use Oracle data pump on the target host as I lack the required privileges (Oracle Data Pump is essentially a server side tool unlike the old 'imp' and 'exp' tools which are client-side tools).
    The first option I try is good old-fashioned exp and imp. The export is pretty fast, only 1 hour. Then comes the import, which is painfully slow (we are talking an estimated 2-3 days). The reason is that Oracle does a commit per row because the tables contain TIMESTAMP columns. I'm wondering what is so special about TIMESTAMP columns that forces Oracle not to use array inserts. Anyway, the restriction is fully documented by Oracle.
    (Utilities Guide for Oracle 10.2 database, see description on import parameter BUFFER).
    The tables I'm loading into have no indexes and there are no indexes as part of the dump file. Furthermore the target tables have been pre-allocated in size so they do not need to extend as part of the import. Lastly logging is turned off on these tables. So all should be in place for fast 'imp' performance. However not so because of the TIMESTAMP restriction. Hmmm.
    Then I try other options. How about the COPY command in SQL*Plus? With this you can control the frequency of the commit. Unfortunately the COPY command only works on tables with columns of type CHAR, DATE, LONG, NUMBER and VARCHAR2. In other words: I cannot use this solution either.
    What are my options? I simply want to move data from one Oracle 10 to another with reasonable performance. All together we are talking about 5-6 Gb data. Not a lot by modern standards. Why is it so difficult? Am I overlooking something obvious?
    Thanks.

    Thank you for responding.
    I've considered dumping the data into a flat file and using SQL*Loader to import as you suggest but this would require some scripting on a per-table basis. Again: all I want to do is to copy the contents of a table from one database to another. I do not think I should have to resort to creating my own dump and load scripts in order to do that. However I agree with you that this type of solution may be my final solution.
    I've tried the db link solution. It was just as slow as the 'imp' solution for some reason; I don't know why. The tables are rather large (3 tables of a few GB each) and therefore require intermediate commits when loaded, otherwise the rollback segment will run out of space. So the 'db link solution' is really a PL/SQL script with a commit every x records.
    I think Oracle is making it a bit difficult for me to copy the contents of a table from one database to another and to do it efficiently. Perhaps I'm missing something here?
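    For what it's worth, the "PL/SQL script with a commit every x records" over a database link usually looks something like the sketch below. The table name, db link name and batch size are hypothetical, and this is only a sketch of that approach, not a recommendation over the other options discussed here:

    DECLARE
      CURSOR c IS SELECT * FROM big_table@source_link;  -- hypothetical source over a db link
      TYPE t_rows IS TABLE OF big_table%ROWTYPE;
      l_rows  t_rows;
      c_batch CONSTANT PLS_INTEGER := 10000;            -- commit every 10,000 rows
    BEGIN
      OPEN c;
      LOOP
        FETCH c BULK COLLECT INTO l_rows LIMIT c_batch;
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT
          INSERT INTO big_table VALUES l_rows(i);
        COMMIT;  -- keeps undo/rollback segment usage bounded
      END LOOP;
      CLOSE c;
    END;
    /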

  • Sample pgm for moving data from table control to internal table

    Hi Experts,
    I am a newbie to ABAP and don't have good material on table controls. I'd appreciate it if you could point me to a good source of knowledge on table controls.
    The problem at hand: I am trying to move data from a table control (in Screen Painter, input/output mode) into an internal table (ITAB) but couldn't. A sample program would help if possible.
    MODIFY itab INDEX tc-current_line.
    The above statement is not inserting new lines into the ITAB. Please help!
    Thanks for your time

    hi,
    do like this...
    PROCESS AFTER INPUT.
    *&SPWIZARD: PAI FLOW LOGIC FOR TABLECONTROL 'TAB1'
      LOOP AT itab_det.
        CHAIN.
          FIELD itab_det-comp_code.
          FIELD itab_det-bill_no.
          FIELD itab_det-bill_date.
          FIELD itab_det-vend_cust_code.
          FIELD itab_det-bill_amt.
          MODULE tab1_modify ON CHAIN-REQUEST.
        ENDCHAIN.
        FIELD itab_det-mark
          MODULE tab1_mark ON REQUEST.
      ENDLOOP.
    * PAI module: append the row from the table control as a new line in the internal table
    MODULE tab1_modify INPUT.
      APPEND itab_det.
    ENDMODULE.                    "TAB1_MODIFY INPUT

  • Looking for best design approach for moving data from one db to another.

    We have a very simple requirement to keep 2 tables synched up that live in 2 different databases. There can be up to 20K rows of data we need to synch up (nightly).
    The current design:
    BPEL process queries Source DB, puts results into memory and inserts into Target DB. Out of memory exception occurs. (no surprise).
    I am proposing a design change to get the data in 1000 row chunks, something like this:
    1. Get next 1000 records from Source DB. (managed through query)
    2. Put into memory (OR save to file).
    3. Read from memory (OR from a file).
    4. Save into Target DB.
    Question is:
    1. Is this a good approach, and if so, does SOA have any built-in mechanisms to handle this? I would think so, since I believe this is a common problem and we don't want to reinvent the wheel.
    2. Is it better to put records into memory or writing to a file before inserting into the Target DB?
    The implementation team told me this would have to be done with Java code, but I would think this would be out of the box functionality. Is that correct?
    I am a SOA newby, so please let me know if there is a better approach.
    Thank you very much for your valued input.
    wildeman

    Hi,
    After going through your question, the first thing that came to my mind is what would be the size of the 20K records.
    If this is going to be huge then even the 1000 row logic might take significant time to do the transfer. And I think even writing it to a file will not be efficient enough.
    If the size is not huge then your solution will probably work. But I think you will need to decide on the chunk size based on how well your BPEL process handles it. Possibly you can try different sizes and test the performance to arrive at an optimal value.
    But in case the size is going to be huge, then you might want to consider using ETL implementations. Oracle ODI does provide such features out of the box with high performance.
    On the other hand, implementing the logic using the DBAdapter should be more efficient than java code.
    Hope this helps. Please do share your thoughts/suggestions.
    Thanks,
    Patrick
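    For comparison with the chunking approaches above: if the source and target databases can reach each other over a database link, 20K rows is small enough that a single set-based statement run on the target can usually handle the nightly sync without holding any rows in BPEL process memory. This is a minimal sketch with hypothetical table, column and link names, offered only as an alternative to consider:

    -- Nightly synchronization of the target table from the source over a db link
    MERGE INTO target_tab t
    USING (SELECT id, col1, col2 FROM source_tab@source_link) s
       ON (t.id = s.id)
     WHEN MATCHED THEN UPDATE SET t.col1 = s.col1, t.col2 = s.col2
     WHEN NOT MATCHED THEN INSERT (id, col1, col2) VALUES (s.id, s.col1, s.col2);
    COMMIT;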

  • "searching for movie data"

    I'm having a difficult time. I upgraded to iLife '08, then read all this stuff about Final Cut Express having a similar interface to iMovie, so I bought it. I don't like either one compared to iMovie HD '06, which I still have. When I launch '08 it starts "searching for movie data" in each clip. When I go to the external HD and look in the folder, the files are listed in order as Clip 1.dv, but the folder icon is a blank page with a bent corner. I have no idea what I have done or how to make it right. PS: I have 318 clips in this folder.

    I guess what I want is to be able to use the same video in each of the apps.
    Since iMovie '06 employs destructive editing, I would leave the original files alone in the project package. Options for moving data from iMovie '06 to '08 depend on what you want to do. Basically you have three options:
    1) You can use the built-in iMovie '08 import to copy your current iMovie '06 source files into iMovie '08. That generally means you lose the edits already performed.
    2) You can manually copy source files, transition clips, title clips, special effect clips, etc. from your iMovie '06 project package to a "visible" area on a hard drive, import them into iMovie '08, and manually put the individual clips back together in their proper order. You can retain the transitions, special effects and titles this way, but you won't be able to modify them.
    3) You can copy the current state of your iMovie '06 timeline to a "visible" hard drive location and then import it as a single clip into iMovie '08 for further editing. You will not, however, be able to change your previous transitions, titles, or special effects except by cutting them out.
    Once the file(s) are in iMovie '08, you have the option of either physically copying the same data again to FCE or simply exporting an XML file which FCE can open. Unfortunately, iMovie v7.1 now exports an XML that can still be opened in FCE v3.5, but then you will have to reset pointers to re-link to the source media. (FCE v4 seems to accept them properly.) Since both iMovie '08 and FCE were designed to edit in a non-destructive manner, theoretically you shouldn't have to physically copy media unless you want to do so. Moving or deleting files at the Finder level could, however, cause problems with either or both applications. Each can be used independently using the same source files, or you can send XML updates from iMovie '08 to FCE any time you wish.
    Must say however, that such a multi-pronged approach could easily lead to problems and while I might move between any of these applications, I would not choose to edit in all three at the same time.
    I don't think I fully understand what happens to the files in each when I start messing with them.
    Files imported to iMovie '06 are normally stored as DV compressed files along with transition/title/special effect clips in the iMovie '06 "Project" package file in the "Media" folder and your current "timeline" can be found in the "project" package in the "Shared Movies > iDVD" folder in the form of an MOV "reference" file.
    Source video files for iMovie '08 are stored in their "native" imported format in the appropriate "iMovie Events" folder within the respective "events" while other media directly moved to projects are stored in the "iMovie Projects" in their respective "project" packages.
    FCE storage in the "FCE Documents" folder depends on your method of importing/transferring data to the application.

  • PXI 5114 - Separate Triggers for Each Channel

    We are using a PXI-5114 High Speed Digitizer.  We were wondering if it was possible to set different triggers for each of the two channels.  We are analyzing a NTSC composite video signal and would like to observe two different lines of video in a LabVIEW VI.

    I would like to clarify a little bit about what you are trying to accomplish.  Do you have one or two separate video sources you wish to hook up to your digitizer?  When you refer to "lines" are you referring to the individual horizontal lines that make up a single video frame?
    The amount of time it takes to reconfigure the digitizer for a new acquisition is not deterministic and will depend upon the speed of your machine as well as any other activity you have going on over the PXI bus. As a result, reconfiguring between acquisitions may not be fast enough, depending on how quickly you expect the image to be changing. I am also not quite sure how setting up a trigger for the top horizontal line and then another for the bottom horizontal line will get the measurement you are looking for, because each line will be refreshed at a periodic rate by the video source. Triggering on a specific line will give you a record with the digitized data for that line, but it seems like you will still need some kind of processing on that data to tell whether that line of the digitized image contains the object you were looking for.
    Please let me know if it does not sound like I am envisioning the application correctly.
    -Matt

  • Moving data from flat file

    Hi,
    I am moving data from a flat file to an Oracle table. While populating the Oracle table, if any records in the flat file cause errors, those errors should be captured in the ODI error table.
    Is this possible? If yes, could you please let me know how to set this up in ODI?

    The CKM is dedicated to checking constraints while doing the transformation. The constraints include PK, FK, conditions, etc.
    There are two types/ways of checking the constraints:
    Flow Control: the CKM is triggered after data is loaded into the I$ table.
    Static Control: the CKM is triggered after data is loaded into the target table.
    If you opt for either of the above, ODI will create the E$ table and SNP_CHECK_TAB (a summary table for logging the errors) and load the error records there.
    ODI also provides an option to reload corrected error records via the RECYCLE_ERRORS feature. In that phase/run ODI will load ONLY the error records into the target table (assuming they have been corrected/cleaned).
    How do I set up flow control? Could you please provide the steps?
    Appreciate your help.

  • Address Book field names changing for "imported" data

    I'm migrating from a thousand year old Palm Pilot that has served me so well to an iPhone. So it's time to get my contacts into OSX Address Book.
    I have tried moving the data a couple of ways - exporting vCards and text files from the Palm Desktop software.
    I did extensive editing of the text file in MS Excel to get the data into good order.
    I have set up a template in the Address Book preferences to create the fields that I want/need. Some are generic default field names, some have custom names, e.g. I want to see a field called "Bus. Phone", not one called "work". Call me picky.
    I get the same basic result whether I import the vCard file or a text file: the fields just show up with very generic names in the imported data. I should note that creating a NEW record does seem to use the field names that I have specified in the preferences.
    Also, when importing the text file, Address Book asks me to assign the field names for certain fields (while it seems smart enough to know what labels to use for other fields all on its own). However, the field names available in the drop-down menu ARE NOT the ones that I have set up in the prefs. I seem limited to the generic default field names only.
    Can anybody tell me how to get the field names to be what I want them to be for address data that is being imported like this?
    Thanks for any suggestions - this has been driving me crazy as it would seem to me to be a pretty basic process. (I mean, the Address Book has this import functionality and custom field name functionality built in so it should work right?!)
    Allen

    No sooner do I ask than I notice that other people have asked the same question. In one of the replies I saw mention of a product called "Abee".
    I tracked it down and it did the job for me! Custom Field names live on!
    It's at:
    http://www.sillybit.com/abee/

  • Action needs to be triggered in future date

    Hi All,
    We have a requirement to trigger an action on a contract's end date. How can we define the action so that it is triggered on a future date automatically, without any user intervention? I have checked date profiles; they wouldn't help in this scenario.
    Can we do this with events? How can we schedule an event so that it is triggered in the future (e.g. after 2 years)?
    Thanks in Advance!
    Regards,
    Rajesh.

    Hi Rajesh,
    A lot depends on how you are planning to schedule the action in your case. With actions, many different combinations of the 'schedule condition', 'start condition', 'action merging' and 'processing time' options can be used to meet the requirements.
    If you want to trigger the action immediately when the contract end date is reached, one option is to use the condition contract end date = current date as the schedule condition and then use the following:
    Processing Time = Immediate Processing
    Action Merging = Max. 1 Action for Each Action Definition
    Do not set a start condition.
    Thanks & regards,
    Ahmad

  • Job not getting triggered for Multiple Scheduler Events

    hi,
    I would like a job to be triggered for multiple scheduler events. Subscribing to a single event works fine, but when I set a multiple-event condition, nothing works.
    My objective is to run a job whenever another job starts, restarts or exceeds its max run duration.
    Note: Is it possible to trigger a job when a job RESTARTS, by subscribing to JOB_START?
    create or replace procedure sniffer_proc(p_message in sys.scheduler$_event_info)
    is
    begin
      null;  --Code
    end sniffer_proc;
    /
    begin
      dbms_scheduler.create_program(program_name => 'PROG',
                                    program_action => 'sniffer_proc',
                                    program_type => 'stored_procedure',
                                    number_of_arguments => 1,
                                    enabled => false);
      -- Define the metadata argument so the scheduler event message is passed in.
      dbms_scheduler.define_metadata_argument('PROG', 'event_message', 1);
      dbms_scheduler.enable('PROG');
      dbms_scheduler.create_job('JOB',
                                program_name => 'PROG',
                                event_condition => 'tab.user_data.event_type = ''JOB_OVER_MAX_DUR''' ||
                                                   ' or tab.user_data.event_type = ''JOB_START''',
                                queue_spec => 'sys.scheduler$_event_queue,auagent',
                                enabled => true);
    end;
    /
    I tried this too...
    begin
      dbms_scheduler.create_job('JOB',
                                program_name => 'PROG',
                                event_condition => 'tab.user_data.event_type = ''JOB_OVER_MAX_DUR''' ||
                                                   ' and tab.user_data.event_type = ''JOB_START''',
                                queue_spec => 'sys.scheduler$_event_queue,auagent',
                                enabled => true);
    end;
    /
    Need help.
    Thanks.

    Hi,
    Here is complete code, which I tested on 10.2.0.4, showing a second job that runs after a first job starts and also when it has exceeded its max run duration. It doesn't have the event condition but just runs on every event raised; however, the first job only raises those 2 events.
    Hope this helps,
    Ravi.
    -- run a job when another starts and exceeds its max_run_duration
    set pagesize 200
    -- create a user just for this test
    drop user test_user cascade;
    grant connect, create job, create session, resource,
      create table to test_user identified by test_user ;
    connect test_user/test_user
    -- create a table for output
    create table job_output (log_date timestamp with time zone,
            output varchar2(4000));
    -- add an event queue subscriber for this user's messages
    exec dbms_scheduler.add_event_queue_subscriber('myagent')
    -- create the first job and have it raise an event whenever it completes
    -- (succeeds, fails or stops)
    begin
    dbms_scheduler.create_job
       ( 'first_job', job_action =>
         'insert into job_output values(systimestamp, ''first job runs'');'||
         'commit; dbms_lock.sleep(70);',
        job_type => 'plsql_block',
        enabled => false, repeat_interval=>'freq=secondly;interval=90' ) ;
    dbms_scheduler.set_attribute ( 'first_job' , 'max_runs' , 2);
    dbms_scheduler.set_attribute
        ( 'first_job' , 'raise_events' , dbms_scheduler.job_started);
    dbms_scheduler.set_attribute ( 'first_job' , 'max_run_duration' ,
        interval '60' second);
    end;
    -- create a simple second job that runs when the first starts and after
    -- it has exceeded its max_run_duration
    begin
      dbms_scheduler.create_job('second_job',
                                job_type=>'plsql_block',
                                job_action=>
        'insert into job_output values(systimestamp, ''second job runs'');',
                                event_condition =>
       'tab.user_data.object_name = ''FIRST_JOB''',
                                queue_spec =>'sys.scheduler$_event_queue,myagent',
                                enabled=>true);
    end;
    -- this allows multiple simultaneous runs of the second job on 11g and up
    begin
      $IF DBMS_DB_VERSION.VER_LE_10 $THEN
        null;
      $ELSE
        dbms_scheduler.set_attribute('second_job', 'parallel_instances',true);
      $END
    end;
    -- enable the first job so it starts running
    exec dbms_scheduler.enable('first_job')
    -- wait until the first job has run twice
    exec dbms_lock.sleep(180)
    select * from job_output;
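    If you later want the second job to react only to the two event types from the original question (rather than to every event FIRST_JOB raises), the filter goes into event_condition, and the two checks must be combined with OR, since an AND of two different event_type values can never be true for a single event message. A hedged variation of the create_job call above; double-check the exact event_type literals (e.g. JOB_STARTED vs JOB_START, JOB_OVER_MAX_DUR) against the Scheduler documentation for your release:

    begin
      dbms_scheduler.create_job('second_job',
                                job_type => 'plsql_block',
                                job_action =>
        'insert into job_output values(systimestamp, ''second job runs'');',
                                event_condition =>
        'tab.user_data.object_name = ''FIRST_JOB'' and ' ||
        '(tab.user_data.event_type = ''JOB_STARTED'' or ' ||
        'tab.user_data.event_type = ''JOB_OVER_MAX_DUR'')',
                                queue_spec => 'sys.scheduler$_event_queue,myagent',
                                enabled => true);
    end;
    /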

  • Best method for passing data between nested components

    I have a fairly good sized Flex application (if it was
    stuffed all into one file--which it used to be--it would be about
    3-4k lines of code). I have since started breaking it up into
    components and abstracting logic to make it easier to write,
    manage, and develop.
    The biggest thing that I'm running into is figuring out a way
    to pass data between components. Now, I know how to write and use
    custom events, so that you dispatch events up the chain of
    components, but it seems like that only works one way (bottom-up).
    I also know how to make public variables/functions inside the
    component and then the caller can just assign that variable or call
    that function.
    Let's say that I have the following chain of components:
    Component A
    --Component B
    -- -- Component C
    -- -- -- Component D
    What is the best way to pass data between A and D (in both
    directions)?
    If I use an event to pass from D to A, it seems as though I
    have to write event code in each of the components and do the
    bubbling up manually. What I'm really stuck on though, is how to
    get data from A to D.
    I have a remote object in Component A that goes out and gets
    some data from the server, and most all of the other components all
    rely on whatever was returned -- so what is the best way to be able
    to "share" data between all components? I don't want to have to
    pass a variable through B and C just so that D can get it, but I
    also don't want to make D go and request the information itself. B
    and C might not need the data, so it seems stupid to have to make
    it be aware of it.
    Any ideas? I hope that my explanation is clear enough...
    Thanks.
    -Jake

    Peter (or anyone else)...
    To take this example to the next (albeit parallel) level, how
    would you go about creating a class that will let you just
    capture/dispatch local data changes? Following along my original
    example (Components A-D), let's say that we have this component
    architecture:
    Component A
    --Component B
    -- -- Component C
    -- -- -- Component D
    -- -- Component E
    -- -- Component F
    How would we go about creating a dispatch scheme for getting
    data between Component C and E/F? Maybe in Component C the user
    picks a username from a combo box. That selection will drive some
    changes in Component E (like triggering a new screen to appear
    based on the user). There are no remote methods at play with this
    example, just a simple update of a username that's all contained
    within the Flex app.
    I tried mimicking the technique that we used for the
    RemoteObject methods, but things are a bit different this time
    around because we're not making a trip to the server. I just want
    to be able to register Component E to listen for an event that
    would indicate that some data has changed.
    Now, once again, I know that I can bubble that information up
    to A and then back down to E, but that's sloppy... There has to be
    a similar approach to broadcasting events across the entire
    application, right?
    Here's what I started to come up with so far:
    [Event(name="selectUsername", type="CustomEvent")]
    public class LocalData extends EventDispatcher {
        private static var _self:LocalData;

        // Constructor
        public function LocalData() {
            // ?? does anything go here ??
        }

        // Returns the singleton instance of this class.
        public static function getInstance():LocalData {
            if (_self == null) {
                _self = new LocalData();
            }
            return _self;
        }

        // public method that can be called to dispatch the event.
        public static function selectUsername(userObj:Object):void {
            dispatchEvent(new CustomEvent(userObj, "selectUsername"));
        }
    }
    Then, in the component that wants to dispatch the event, we
    do this:
    LocalData.selectUsername([some object]);
    And in the component that wants to listen for the event:
    LocalData.getInstance().addEventListener("selectUsername", selectUsername_Result);

    public function selectUsername_Result(e:CustomEvent):void {
        // handle results here
    }
    The problem with this is that when I go to compile it, it
    doesn't like my use of "dispatchEvent" inside that public static
    method. It tells me, "Call to possibly undefined method
    dispatchEvent". Huh? Why would it be undefined?
    Does it make sense with where I'm going?
    Any help is greatly appreciated.
    Thanks!
    -Jacob

  • EDI output triggered for old PO

    Hello All,
    On 09.09.2014, the EDI output type (NEU, transmission medium 6) was triggered for more than 700 POs that were created in 2012 for several vendors, and the POs were sent to the suppliers again as duplicate copies. Suppliers are contacting the client about this, and some have started sending goods the client has not ordered. There are no change records in the POs and users have not edited them. There is no batch job scheduled to trigger the output type. How else could the output type have been triggered for these old POs? Please share your ideas.
    Regards,
    Prasath J

    Ask your Basis team when the last system copy was done. If the copy was done between the creation date and now, then the PO in the copy system will show how the PO looked before the incident happened.
    This way you can see whether there were unprocessed messages.
    Message records do not create themselves in old POs, and 700 cases do not suggest a manual activity either, especially as you are saying that you have no change records.
    Maybe something was done programmatically to solve an earlier problem of yours:
    PO Output Message Disappears
