Inserting historical data for flashback queries

We would like to start using this feature, but we have 10+ years of historical data changes in our tables. Is there a way to "load" these changes prior to enabling Flashback? Is it really a "point forward" only option?
_mike

Well, thanks for the responses. It's a shame, because I would really like to adopt Flashback, but we can't lose the historical auditing we already have in the system.
Thanks for all of the replies.
_mike
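For reference, the Flashback Data Archive flavour of this feature is typically switched on per table along these lines; this is a minimal sketch, and the archive name, tablespace, retention, and table are illustrative assumptions rather than anything from the thread. Because row versions are captured by the database only as transactions commit after tracking is enabled, previously accumulated history cannot be back-loaded into it, which matches the "point forward only" behaviour described above.

-- Create an archive with a retention period, then enable it for a table.
CREATE FLASHBACK ARCHIVE audit_fda
    TABLESPACE fda_ts
    RETENTION 10 YEAR;

ALTER TABLE orders FLASHBACK ARCHIVE audit_fda;

-- Only from this point on can past row versions be queried, e.g.:
SELECT * FROM orders
    AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '7' DAY);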

Similar Messages

  • HISTORICAL DATA FOR HR

    Hi All,
    I am working on an HR design in SAP.
    My client wants historical information for every Infotype, and I have around 30 of them.
    Is creating a DSO for each Infotype the only option to capture historical data?
    I am not interested in creating InfoSets.
    Please advise!
    Nani

    Praveen, thanks for your quick response.
    For my requirement I need the historical data for employees for headcount trends. How can that be achieved?
    Thanks,
    Aashish

  • Is there a way to get historical data for an AP that is no longer online via Prime?

    I am running a wireless network with about 130 APs connected to an 8510 controller, using Prime for management. I am trying to get AP utilization for an AP that is no longer connected, and when I try to run the report, Prime reports that it cannot pull it because the AP is offline.
    Is there some way to get historical data for this AP? Does Prime store this type of data?
    Seth

    I believe that all devices need to be up in order to pull any information, current or historical. If the AP is down, I don't think a report in Prime will show info for that device even if you set the date/time to when it was up. I've never really looked into that, since I've never tried to pull info for a downed access point. If you run a report on channel changes, does the AP show up?
    -Scott

  • Hello, is there any calendar application which keeps historical data for iPad 2?

    Hello, is there any calendar application which keeps historical data for iPad 2?

    The one built-in keeps whatever historical data you sync to it.

  • Reg: Loading historic data for the enhanced field

    Hello All,
    We need to add a new field, 0VENDOR, to our datasource 0FI_GL_4. This field is available in our BSEG table, so we are planning to go ahead with a datasource enhancement.
    Now, please advise on how to update the historical data for this newly added field. I have heard there is a BW functionality/program to do so without deleting the entire data. Kindly advise on the possible solutions.
    Thanks & Regards
    Sneha Santhanakrishnan

    Hi Sneha,
    Using the remodeling option you will be able to do that, i.e. load historical data for new attributes without deleting existing data. The limitation is that in remodeling you can only assign a constant, another attribute's value, or values determined through an EXIT.
    When you are loading data from the source system and need the historical data as well, the best practice is to delete the existing data and reload it from the source system.
    If you don't want to do that, there is one trick, though I'm not sure it will work in every case. The idea is to populate the historical values for 0VENDOR using the customer-exit option of remodeling. To get the historical values inside the customer exit you will need that data in a BW table, so you could create a generic extractor that stores every document with its respective vendor; when you load it from the source system you get the historical values as well.
    Then read that table in the customer exit and populate the vendor value. This is a one-time process to populate the historical values.
    Regards,
    Durgesh.

  • 0HR_PT_2 How to get back historical data for new report time type

    Hi All Experts,
    We have implemented and have been using the 0HR_PT_2 extractor for the past year, and the delta is working. Recently there was a requirement to read more data from the ZL cluster table, and new BW report time types were added to extract that data.
    The delta doesn't pick up the past year's data for the new report time types we added.
    Do we need to re-initialize the load to get back the historical data every time we add a new report time type?
    Please advise, and thanks.
    Ken
    Edited by: Ken Hong on Feb 27, 2008 9:24 PM
    Edited by: Ken Hong on Feb 27, 2008 9:25 PM


  • How to insert the same data for different dates

    Please help.
    I need to insert this data into abc using one single query or PL/SQL block.
    Please assume tab has more than 10 years of data and millions of records.
    insert into abc
    select count(distinct id) from tab where date = sysdate - 45;
    insert into abc
    select count(distinct id) from tab where date = sysdate - 105;
    insert into abc
    select count(distinct id) from tab where date = sysdate - 195;
    insert into abc
    select count(distinct id) from tab where date = sysdate - 380;
    insert into abc
    select count(distinct id) from tab where date = sysdate - 745;
    Thanks in advance.

    Hi,
    I understand you want to scan the table only once. There is a way to accomplish that (a simpler and more readable version than the one below probably also exists), but one thing is sure: tab is scanned only once here:
    with counts as (select
                    count(distinct case when t.date>=sysdate - 45 then t.id else null end) c1,
                    count(distinct case when t.date>=sysdate - 105 then t.id else null end) c2,
                    count(distinct case when t.date>=sysdate - 195 then t.id else null end) c3,
                    count(distinct case when t.date>=sysdate - 380 then t.id else null end) c4,
                    count(distinct case when t.date>=sysdate - 745 then t.id else null end) c5
                    from tab t)
    select case when l.rn=1 then cnt.c1
                when l.rn=2 then cnt.c2
                when l.rn=3 then cnt.c3
                when l.rn=4 then cnt.c4
                when l.rn=5 then cnt.c5
                end sums
    from  counts cnt,
         (select rownum rn from dual connect by level <=5) l;
    What's happening here is:
    1) we get all counts as 5 columns in one row
    2) we turn the value of each column into a new row
    I'm not sure the complexity is worth the result in your case.
    Maris
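    Building on Maris' idea, a single statement that both inserts and scans tab only once could look like the sketch below. This is an illustration only: it assumes abc has a single numeric column, and the date column is written as dt because DATE is a reserved word in Oracle, so substitute the real column name. Note that the question compared with = while Maris' version used >=; keep whichever predicate matches the intent, adding TRUNC if the column carries a time component.
    insert into abc
    with counts as (
        select count(distinct case when t.dt = sysdate - 45  then t.id end) c1,
               count(distinct case when t.dt = sysdate - 105 then t.id end) c2,
               count(distinct case when t.dt = sysdate - 195 then t.id end) c3,
               count(distinct case when t.dt = sysdate - 380 then t.id end) c4,
               count(distinct case when t.dt = sysdate - 745 then t.id end) c5
        from   tab t
    )
    select case l.rn
               when 1 then cnt.c1
               when 2 then cnt.c2
               when 3 then cnt.c3
               when 4 then cnt.c4
               when 5 then cnt.c5
           end
    from   counts cnt
           cross join (select level rn from dual connect by level <= 5) l;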

  • Obtaining QM Historical Data for SPC Analysis

    Can someone please advise me whether ECC 700 has a standard BAPI/RFC to return historical QM data. I have searched through the BAPI Explorer and did not see anything which would fill this requirement.
    What I did find was the following document, [SAP QM Productivity Pack for SAP xMII (Starter Kit)|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f0b0cf92-af8b-2a10-d8a5-f1e115fa5f7e], which provides detailed code to create a custom RFC. Before proceeding down this path, I wanted to confirm that I did not miss something.
    Thanks,
    Michael Teti
    Senior Solutions Consultant
    SeeIT Solutions, LLC

    Michael,
    Here is the link. This is considered old tech, but I have not gotten anyone's specific input on why it should not be used; there may be reasons. I have used it twice (once just playing around), and with only one caveat it was pretty easy to implement.
    [SQ01 Queries|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/390b889b-0e01-0010-e4af-c9a062b9b880]
    The document covers 5 or so RFC calls. You will need to check the field names, since they are defined internally and do not use the table column names (of course); that is probably the only thing left out of Sam's document. One of the RFC calls will give you the internal field names.
    You will also need to create your own query in SQ01 in SAP prior to doing the other stuff. I think there is a way to do it via the BAPI calls, but I could never get it to work the one time I spent half a day trying. It might have been a permissions issue, so if you want, go ahead and play around.
    Good luck,
    Mike

  • Insert several dates for the same event

    Hey guys,
    I have two tables joined: one with dates and venue, the other with speakers.
    How can I add more dates for the same speaker during his programme of appearances? I.e. Speaker A will be appearing on the following dates: 20/02/2007, 27/02/2007 & 5/03/2007...
    I was thinking of a date field and a dynamic dropdown field, but I'm having problems figuring it out.
    Any help would be very much appreciated.
    Thanks

    Thanks Sabaidee,
    Please find below the whole code:
    <cfset CurrentPage = GetFileFromPath(GetTemplatePath())>
    <cfif IsDefined("FORM.MM_InsertRecord") AND FORM.MM_InsertRecord EQ "insert">
        <cfquery datasource="utcg302">
            INSERT INTO speaker (speakerName, venue, eventDate)
            VALUES (
            <cfif IsDefined("FORM.speaker") AND #FORM.speaker# NEQ "">
                <cfqueryparam value="#FORM.speaker#" cfsqltype="cf_sql_clob" maxlength="150">
            <cfelse>
                NULL
            </cfif>
            ,
            <cfif IsDefined("FORM.Venue") AND #FORM.Venue# NEQ "">
                <cfqueryparam value="#FORM.Venue#" cfsqltype="cf_sql_clob" maxlength="50">
            <cfelse>
                NULL
            </cfif>
            ,
            <cfif IsDefined("FORM.date") AND #FORM.date# NEQ "">
                <cfqueryparam value="#FORM.date#" cfsqltype="cf_sql_timestamp">
            <cfelse>
                NULL
            </cfif>
            )
        </cfquery>
    </cfif>
    <cfquery name="rsInsert" datasource="utcg300">
        SELECT speaker.speakerID, speaker.speakerName, speaker.venue,
               speaker.eventDate, review.reviewID, review.review
        FROM speaker INNER JOIN review ON speaker.actID = review.actID
        ORDER BY speaker.eventDate
    </cfquery>
    <form action="<cfoutput>#CurrentPage#</cfoutput>" id="insert" name="insert" method="POST">
        <label for="date">Date</label>
        <p>
            <input name="date" type="text" id="date" />
        </p>
        <table width="200">
            <tr>
                <td><label>
                    <input type="radio" name="Venue" value="Stoney" />
                    Stoney</label></td>
            </tr>
            <tr>
                <td><label>
                    <input type="radio" name="Venue" value="William Edwards" />
                    William Edwards</label></td>
            </tr>
        </table>
        <p> </p>
        <p>
            <label for="Speaker">Speaker</label>
            <select name="speaker" id="Speaker">
                <option value="">Choose One</option>
                <option value=""></option>
                <cfoutput query="rsInsert">
                    <option value="#rsInsert.speakerID#">#rsInsert.speakerName#</option>
                </cfoutput>
            </select>
        </p>
        <p>
            <label for="Submit"></label>
            <input type="submit" name="Submit" value="Submit" id="Submit" />
        </p>
        <input type="hidden" name="MM_InsertRecord" value="insert">
    </form>
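    For the underlying question (several appearance dates for one speaker), the usual relational approach is a separate child table with one row per appearance rather than extra date fields on the speaker record. Here is a sketch in generic SQL; the table and column names are illustrative assumptions, not taken from the thread:
    -- One row per speaker, one row per (speaker, venue, date) appearance.
    CREATE TABLE speaker (
        speakerID   INT PRIMARY KEY,
        speakerName VARCHAR(150) NOT NULL
    );
    CREATE TABLE appearance (
        appearanceID INT PRIMARY KEY,
        speakerID    INT NOT NULL REFERENCES speaker (speakerID),
        venue        VARCHAR(50),
        eventDate    DATE NOT NULL
    );
    -- Adding more dates for the same speaker is just more appearance rows
    -- (ANSI DATE literals; adjust to your database's date syntax).
    INSERT INTO appearance (appearanceID, speakerID, venue, eventDate)
    VALUES (1, 42, 'Stoney', DATE '2007-02-20');
    INSERT INTO appearance (appearanceID, speakerID, venue, eventDate)
    VALUES (2, 42, 'Stoney', DATE '2007-02-27');
    The insert page would then write one appearance row per submitted date and reuse the existing speaker row, and the SELECT for the dropdown would join the two tables.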

  • How to find out which table stores historical data for V_T001B?

    Dear all,
    How can I trace and find out which table stores the maintenance history for view V_T001B?
    Thanks.

    I already checked both tables, but the data is not stored there...

  • Ability to correct or insert location data for photos

    I have some photos that either do not have location data or have wrong data because the camera was not accurate.
    I can get the location map displayed but cannot drop a pin or change existing pins to locate the photo.
    I find this really frustrating.
    Please can something be done about this?
    Thanks.

    To those of you who want to add a location to your photos using Google Earth: there is a plug-in that lets you simply drag and drop photos from iPhoto onto the dock icon of the Geotagger plug-in.
    http://craig.stanton.net.nz/code/geotagger/
    The photos are tagged with the data and you can use them with Places. It is a two-step process that lets you geotag old photos or new ones you rescan. I have even used Geotagger to tag photos that I scanned in from vacations long ago.
    It is FREE.
    You need the latest version of Google Earth. I also recommend the crosshairs Google Earth plug-in.
    I plan to get a Nikon GP-1 as soon as I can to eliminate this step.

  • Set up of historical data for 2lis_08 datasources

    Hello All,
    I would like to know how to fill the setup tables for the 2LIS_08 (shipment) datasources.
    Thanks,
    Amit

    Hi Amit,
    Transaction VTBW for transport and VIFBW for shipment costs!
    Hope it helps!
    Bye,
    Roberto

  • How to get the historical data for a newly added field in the cube?

    Hi Experts,
    I have a small doubt on remodeling the InfoCube.
    After adding a characteristic or key figure to a cube using the remodeling concept, how can I get the historical data for that particular field?
    I have searched SDN but didn't find proper information.
    Please excuse me if this is a repeated question.
    Helpful answers will be awarded with points.
    Thanks & regards,
    Venkat.

    Hi,
    Depending on your customer's needs you can use the remodeling functionality, but sometimes there is no way to retrieve what you want that way, so another option to consider is the following.
    Advantages: it costs less effort and guarantees the result.
    Drawbacks: data is redundant for a while, and it takes extra space (depending on the volume of historical data).
    So here are the steps:
    Step 1: Adjust your extraction process according to the fields you need to add to populate the cube.
    Step 2: Create a DSO (or even a cube) next to it and feed the DSO with a full load using the extractor enhanced with the new fields from step 1; this should be a one-shot load.
    Step 3: Copy the query to a MultiProvider built on top of the new historical-data DSO and the running live delta cube. Adjust the queries if necessary.
    Optional: if you then want to get rid of the DSO or new cube holding the historical data, you can empty the current one, push the data across from the new data provider, and that's all.
    Bye,
    Boujema

  • How to extract the historical data from R/3

    Hi,
    I am extracting data from R/3 through LO extraction. The client asked me to enhance the datasource by adding a field. I have enhanced the field and wrote an exit to populate the data for it.
    How do I extract the historical data into BI for the enhanced field? A delta load is already running in BI.
    Regards

    Hi Satish,
    As per the SAP standard, the best way is to delete the whole data from the cube and then reload it from the setup tables, since you have enhanced the datasource.
    A normal load after the datasource enhancement won't help, because you won't get any historical data for the new field that way.
    The best approach is to take downtime from the users; normally we do this on weekends or in non-business hours.
    Then fill the setup tables; if the data volume is huge you can adopt a parallel mechanism, like:
    1. Fill the setup tables year by year as a background job.
    2. Fill the setup tables year by year, with posting periods from Jan 1st to Dec 31st of each year, as a background job.
    This makes filling the setup tables easier and faster. After the setup tables are filled, you can unlock all users, as there are no worries about postings.
    After that you can load all the data into BI, first into the PSA and then into the cube.
    Regards,
    Ravi Kanth.

  • Is the size limit of the OCI LOB array insert 64K per field?

    I have a table with 4 fields, one of which is a BLOB field. I want to insert 16 rows in one OCIStmtExecute; I know I can set the iter parameter to 16 to execute the SQL 16 times.
    I found an example in the "Application Developer's Guide - Large Objects", on the page "Data Interface for Persistent LOBs 13-17", in a function called "array_insert". It shows the usage of OCIBindArrayOfStruct, but it can only insert LOBs of the same size: the LOB field of each row is filled with data of the same size.
    But I have to insert LOBs with different sizes, for example 8K for row 1, 16K for row 2, and 128K for row 3. Then I found the alenp parameter of OCIBindByName/OCIBindByPos, which is a "pointer to array of actual lengths of array elements" (OCI documentation). So I thought I had found the solution to my problem. But the type of the alenp parameter is ub2*; does that mean I can only insert 64K of data per row in my array insert? That is too small; I would like to array-insert BLOBs of up to 16M per row.
    Or is there any other solution to my problem? Thanks, everyone!

    It is called the Data Interface: working with LOB datatypes through the APIs designed for the legacy datatypes. I can specify SQLT_BIN to bind in-memory binary data to a BLOB column and INSERT or UPDATE directly. It works without a LOB locator and saves round-trips to the server, which fits my needs very well, because I have to insert a large number of BLOBs to the server as quickly as possible.
    I have made a test program: multiple rows with different-sized BLOBs (less than 65536 bytes) can be inserted at one time, without locators. Multiple rows with same-sized BLOBs (more than 64K) can also be inserted at one time, since the alenp parameter is not used in that case. The only thing I cannot do is insert multiple rows with different-sized BLOBs of more than 64K, because the type of alenp is ub2*.
    Thank you for your reply!
