SCEP Historical Data

 
1. Is SCEP historical data (threat history) stored in a log on the server, or does SCCM 2012 read the data from the SCEP logs on the client?
2. If I'm looking to keep/pull SCEP historical data for 1 year, do I have to edit the "Delete Aged Threat Data" maintenance task from the default 30 days to 365?
It just seems strange that the default is 30 days. I know auditors require tracking historical data for a year or so.
Andrew Marcos

Yes, I know this is an old post, but I'm trying to clean them up. Did you solve this problem? If so, what was the solution?
No, there will be no problem doing this, except that there will be an increase in the database size.
Garth Jones | My blogs: Enhansoft and
Old Blog site | Twitter:
@GarthMJ
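
If you would rather leave the task at 30 days and still keep a year of history for the auditors, another option is to copy the threat rows into your own archive table on a schedule, before the maintenance task ages them out. Below is a minimal T-SQL sketch of that idea; the view name v_EP_ThreatHistory and the columns DetectionTime, ThreatID and ResourceID are placeholders, so substitute whatever Endpoint Protection threat view and columns your site database actually exposes, and ideally keep the archive in a separate reporting database rather than the site database itself.

-- Hypothetical archive job: copy Endpoint Protection threat rows into a custom
-- table before the "Delete Aged Threat Data" task removes them.
-- v_EP_ThreatHistory, DetectionTime, ThreatID and ResourceID are placeholder names.
IF OBJECT_ID('dbo.EP_ThreatArchive') IS NULL
    SELECT *
    INTO   dbo.EP_ThreatArchive
    FROM   v_EP_ThreatHistory
    WHERE  1 = 0;   -- create an empty table with the same structure

INSERT INTO dbo.EP_ThreatArchive
SELECT t.*
FROM   v_EP_ThreatHistory AS t
WHERE  t.DetectionTime < DATEADD(DAY, -25, GETDATE())   -- archive a little before the 30-day cutoff
  AND  NOT EXISTS (SELECT 1
                   FROM   dbo.EP_ThreatArchive AS a
                   WHERE  a.ThreatID      = t.ThreatID
                     AND  a.ResourceID    = t.ResourceID
                     AND  a.DetectionTime = t.DetectionTime);

Run it as a scheduled SQL Agent job (weekly, for example) and point the auditors at the archive table; the console and the maintenance task are left untouched.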

Similar Messages

  • Upload Sales Historical Data in R/3?

    Hi Guys:
    We are planning to use the forecast-based planning MRP procedure (MRP type VV) for material planning, and we need historical sales data to execute the forecast.
    Can anybody tell me, step by step, how to upload 12 months of historical data into SAP R/3, so that we can execute forecast-based planning for a material?
    Thanks
    Sweth
    Edited by: Csaba Szommer on May 12, 2011 9:57 AM

    I need to pull data into BW from R/3 for freight rates.
    For example, the rate is from source to destination (in rows), and on that basis we have condition types (in columns); based on those condition types we get the cumulated value of the master data rate, like:
    source  dest   rate (based on condition types)
    Now in R/3 some condition types have been changed to new ones, or we can say merged into new ones. For the new ones I can pull the data, no problem, but what do I do for the historical master data, where the condition types in 2006 are different from those in 2007, even though they are mapped to each other?
    thx
    rubane

  • Upload of historical data from legacy system

    Dear forum
    We are running standard SOP with transaction MC88 in ECC6.0.
    We are doing forecasting on material/plant level, based on historical data (we create the sales plan from transaction MC88).
    Now, the problem is that we are introducing new products into SOP, but there is no historical data in SAP for those materials.
    Is there any way we can import historical data into SAP from the legacy system, so that SOP would take this data into account when calculating the forecast?
    Note: we do not want to build anything in flexible planning. Just want to check whether or not it is possible to import historical values from a legacy system, to use as historical data in SAP.
    Thanks in advance
    Lars

    Thanks,
    But that will not help us.
    Any other suggestions out there?

  • HOW TO TRANSFER HISTORICAL DATA FROM ONE ACCOUNT TO ANOTHER

    Product: FIN_GL
    Date written: 2006-05-29
    HOW TO TRANSFER HISTORICAL DATA FROM ONE ACCOUNT TO ANOTHER
    =============================================================
    PURPOSE
    This note explains how to transfer balances for a specific period from one account to another.
    Explanation
    Using the Mass Maintenance feature in GL, you can move balances from one account to another, or from multiple accounts into a single account.
    1. From the GL responsibility, navigate to Other > Mass Maintenance.
    2. Enter a request name and description for the Move/Merge operation.
    3. Select Move or Merge as the request type.
    4. Enter a line number for the source-to-target account pair.
    5. Select the source account from the LOV and enter it.
    All accounts must be enabled.
    6. Select the target account from the LOV and enter it as well.
    7. You can verify the request before running the specified operation.
    8. Save your work.
    9. Run the Move/Merge request.
    Example
    N/A
    Reference Documents
    Note. 146050.1 - How to Transfer Historical Data from One Account to Another
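    As a quick sanity check before and after a Move/Merge, a query along these lines can compare period activity on the source and target accounts. This is a minimal sketch assuming the standard EBS tables GL_BALANCES and GL_CODE_COMBINATIONS, actual balances in a single currency, and SEGMENT3 as the natural-account segment; adjust the segment, the ledger/set-of-books filter and the period name to your setup.
    -- Compare period activity on the source and target accounts (sketch only)
    SELECT cc.segment3                                   AS account,
           bal.period_name,
           SUM (bal.period_net_dr - bal.period_net_cr)   AS net_activity
    FROM   gl_balances           bal,
           gl_code_combinations  cc
    WHERE  bal.code_combination_id = cc.code_combination_id
    AND    bal.actual_flag         = 'A'
    AND    bal.currency_code       = 'USD'                -- assumption: functional currency
    AND    bal.period_name         = 'MAY-06'             -- period being moved/merged
    AND    cc.segment3 IN ('1110', '1120')                -- placeholder source/target accounts
    GROUP  BY cc.segment3, bal.period_name;
    Running it once before and once after the request makes it easy to confirm that the activity really moved from the source account to the target account.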


  • How to get Historic Data in Oracle

    Hi,
    The following query might be useful to generate consecutive dates from a given date to SYSDATE.
    SELECT dt
    FROM   ( SELECT to_date('02/19/1981','mm/dd/yyyy') + rownum - 1 AS dt
             FROM   user_objects )
    WHERE  TRUNC(dt) <= TRUNC(SYSDATE);
    Now let's consider the basic scott.emp table.
    The above requirement can be achieved using this query, though we may get only 14 records, as there are only 14 records in the scott.emp table:
    select ( to_date('02/19/1981','mm/dd/yyyy') + rownum - 1 ) as dt
    from   scott.emp;
    If we look at the scott.emp table, there are 2 records with empno 7499 and 7521, with hiredates 20-Feb-81 and 22-Feb-81 respectively. On pulling the historic data from 19-Feb-81 to 04-Mar-81 (as there are only 14 records in the table), we need to get the sum of sal or comm as it is, along with those dates in the table.
    Assuming there is more than one record for the same date within the range mentioned above, I have used the following query but could not get the desired output:
    Select To_Date('02/19/1981','mm/dd/yyyy')+Rownum-1,
    sum(Case When Sal>0 And Trunc(Hiredate)= (To_Date('02/19/1981','mm/dd/yyyy')+Rownum-1)
    Then (Sal)
    Else 0 End ) As Hist_Sal,
    sum(Case When Nvl(Comm,0)>= 0 And Trunc(Hiredate)= (To_Date('02/19/1981','mm/dd/yyyy')+Rownum-1)
    Then (Nvl(Comm,0))
    Else 0 End) As hist_comm
    From Scott.Emp
    Group By To_Date('02/19/1981','mm/dd/yyyy')+Rownum-1
    order by 1;
    I have tried the other option also, and I could get the desired output. The query goes like this:
    --Success Statement
    Select To_Char(Rt.Business_Date , 'Day,Mon DD,yyyy') As Business_Date ,To_Date(Rt.Business_Date) As Hist_date,
    ( Select sum(sal)
    From Scott.Emp  Slm
    Where Trunc (Hiredate) = Trunc(Rt.Business_Date)
    ) as hist_sal
    FROM
    (select ((TO_DATE('02/19/1981','mm/dd/yyyy')-1)+rnm) as business_date from (select rownum rnm from user_objects)) rt
    Where
    Trunc(Rt.Business_Date) Between To_Date('02/19/1981','mm/dd/yyyy') And To_Date('10/31/2012','mm/dd/yyyy')
    order by Hist_date;
    But I want the historic dates/data to be generated from the scott.emp table instead of using this logic *(select ((TO_DATE('02/19/1981','mm/dd/yyyy')-1)+rnm) as business_date from (select rownum rnm from user_objects)) rt* as written in the Success Statement, as that would be helpful for my requirement; otherwise I need to write subqueries for all the columns I need, as I have written in the above Success Statement.
    please advise.
    Regards,

    sri wrote:
    Hi,
    The following query might be useful to generate consecutive dates from a given date to SYSDATE.
    SELECT dt
    FROM   ( SELECT to_date('02/19/1981','mm/dd/yyyy') + rownum - 1 AS dt
             FROM   user_objects )
    WHERE  TRUNC(dt) <= TRUNC(SYSDATE);
    Unless you're using Oracle 8 (or older), it's more efficient to say:
    SELECT  start_dt + LEVEL - 1  AS dt
    FROM    (
              SELECT  TO_DATE ('02/19/1981', 'MM/DD/YYYY')  AS start_dt
              ,       TRUNC (SYSDATE)                       AS end_dt
              FROM    dual
            )
    CONNECT BY  LEVEL <= 1 + (end_dt - start_dt)
    ;
    and it doesn't depend on how many rows happen to be in user_objects.
    Sorry, I'm not sure what you're asking.
    Do you want to know if there's a simpler and/or more efficient way to get the same results as the 2nd query you posted above?
    Here's one way:
    WITH  all_dates  AS
    (
        SELECT  start_dt + LEVEL - 1  AS dt
        FROM    (
                  SELECT  TO_DATE ('02/19/1981', 'MM/DD/YYYY')  AS start_dt
                  ,       TO_DATE ('10/31/2012', 'MM/DD/YYYY')  AS end_dt
                  FROM    dual
                )
        CONNECT BY  LEVEL <= 1 + (end_dt - start_dt)
    )
    SELECT    TO_CHAR (a.dt, 'Day, Mon DD, YYYY')  AS business_date
    ,         SUM (e.sal)                          AS hist_sal
    FROM             all_dates  a
    LEFT OUTER JOIN  scott.emp  e  ON  e.hiredate = a.dt
    GROUP BY  a.dt
    ORDER BY  a.dt
    ;
    For testing purposes, it would be a lot clearer if you made the end_dt something like April 5, 1981 rather than October 31, 2012.
    I hope this answers your question.
    If not, post the results you want from the data in scott.emp, given some reasonable date range. (I suggest Nov. 16, 1981 through Jan. 24, 1982; that includes December 3, 1981 which has 2 rows with the same hiredate.)
    Explain, using specific examples, how you get those results from that data.
    Always say which version of Oracle you're using (e.g., 11.2.0.2.0).
    See the forum FAQ {message:id=9360002}
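    If the comm column also needs to be reported per day (as in the original attempt), the same join extends naturally. This is only a sketch under the same assumptions (the standard scott.emp table, hiredate with no time portion), using a short test range:
    WITH  all_dates  AS
    (
        SELECT  start_dt + LEVEL - 1  AS dt
        FROM    (
                  SELECT  TO_DATE ('02/19/1981', 'MM/DD/YYYY')  AS start_dt
                  ,       TO_DATE ('03/04/1981', 'MM/DD/YYYY')  AS end_dt
                  FROM    dual
                )
        CONNECT BY  LEVEL <= 1 + (end_dt - start_dt)
    )
    SELECT    TO_CHAR (a.dt, 'Day, Mon DD, YYYY')  AS business_date
    ,         NVL (SUM (e.sal), 0)                 AS hist_sal    -- 0 on days with no hires
    ,         NVL (SUM (NVL (e.comm, 0)), 0)       AS hist_comm
    FROM             all_dates  a
    LEFT OUTER JOIN  scott.emp  e  ON  TRUNC (e.hiredate) = a.dt
    GROUP BY  a.dt
    ORDER BY  a.dt;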

  • Remote historical data is not retrieved completely viewing it in MAX4

    Hi,
    since I installed LabVIEW 8 I have had some problems retrieving historical data from another computer. Sometimes not all data is retrieved (if I zoom in or out or move back in time), and this missing data won't ever be retrieved.
    I already deleted the Citadel cache once, but after this even less data was retrieved... What's really weird is that for channels which weren't retrieved correctly, the data doesn't get updated anymore!
    On the remote computer I have a LabVIEW DSC Runtime 7.1 running (MAX 3.1.1.3003); on my local computer, MAX 4.0.0.3010 and LabVIEW 8 DSC (development system) are installed in parallel with LV DSC 7.1.1 (dev system). LV 8 is installed for testing purposes (I doubt we'll switch soon) and overall I like MAX 4. The HyperTrend.dll on my local computer is version 3.2.1017.
    This is really a quite annoying bug!
    So long,
        Carsten
    Message Edited by cs42 on 02-02-2006 09:18 AM

    Hi,
    > We've been unable to reproduce this issue. If you could provide some additional information, it might help us out.
    I did fear this, as even on my computer it is happening just sometimes...
    > 1) How many traces are you viewing?
    The views I observed this in had 2 to 13 traces.
    > 2) How often are the traces being updated?
    For some it's pretty often (about once a second); for some it's very infrequent (no change in data, meaning they are updated only because of the maximum time between logs). I see this more often for traces that are updated very infrequently, but I think I've seen it for frequently updated traces as well (for those it currently works).
    > 3) Are the traces being updated by a tag value change, or by the "maximum time between logs" setting in the engine?
    It happened for both types.
    > 4) What is the frequency of the "maximum time between logs" setting?
    Max time between logs is 10 minutes.
    > 5) Is the Hypertrend running in live mode when you zoom out/pan?
    I think it happened in both modes, but it definitely did in live mode.
    > 6) If you disable/re-enable live mode in the Hypertrend, does the data re-appear?
    I couldn't trigger the loading of the data. All I did was wait and work with MAX (zooming, panning, looking at data), and after quite a while (some hours) the data appeared.
    I just tested this on a view where data is missing (for some days now!), and it didn't trigger data reloading. Zooming and panning don't either. There's a gap of up to 3 days now for some traces. 7 of the 13 traces of this view are shown incompletely, all stopping at the same time but reappearing at different times.
    AFAIR from the laboratory computer (these are temperatures and it's very plausible that they didn't change), there wasn't any change in these traces, so they all got logged because of the max time...
    I just created a new view and added these traces: the gap is there as well.
    (Sorry to put this all in this entry even though it is related to your other questions, but I started this live test with disable/re-enable live mode.)
    > 7) Are the clocks on the client and server computers synchronized? If not synchronized, how far apart are the times on the two computers?
    They should be (Windows 2000 Domain synchronized to ADS), but are 5 seconds apart.
    One thing I remember now: I have installed DIAdem 10 beta 2 (10.0.0b2530, USI + DataFinder 1.3.0.2526). There I had (and reported) some problems with data loading from a Citadel database on a remote machine as well. This was attributed to some cache problem. Maybe a component is interfering?
    Thanks for investigating.
    Cheers,
        Carsten

  • How to populate historical data in the appraisal

    Hello,
    I am working in Oracle PMS module. Our requirement is to display historical data of an employee in the next appraisal cycle.
    E.g. if there are 3 appraisal cycles in a year: in the first appraisal the manager will add objectives and competencies for an employee, let's say Competency 1, Competency 2, Objective 1 and Objective 2.
    After completion of the first cycle, when the manager initiates the second appraisal, the two values (Objective 1 and Objective 2) should be auto-populated in the objectives block. Likewise, the two values (Competency 1 and Competency 2) should be auto-populated in the competency block.
    Here the manager can add more objectives or competencies.
    Please suggest how this can be achieved.
    Thanks in advance,
    sheetal

    Hi Shunhui
    Answers :-
    If the required fields are not available in the Maintenance link, does it mean that I have to go to SE11 and create an append structure for the extract structure MC02M_0ITM, which belongs to the DataSource 2LIS_02_ITM?
    Then I have to write code in the user exit for transactional DataSources in RSAP0001?
    Ans: That's correct. Once you have done this on the R/3 side, replicate the DataSource on the BW side. Activate the transfer rules with appropriate mapping for the 3 new fields, and also adjust the update rules so that these 3 new fields are available in the InfoProviders.
    Are you able to provide some information on the errors that might arise during transports with the delta being active in the production system? Will data be corrupted? Will I need to clear the delta queue before the transport goes over?
    Ans: I assume that deltas are active in the Production system. There is a risk to your active delta into BW.
    Before your R/3 transports reach the Production system, you need to clear the delta queue, which means bringing the number of LUWs for 2LIS_02_ITM to zero. Also stop the delta collector job in R/3 for application 02 (purchasing). Make sure that there are no postings (transactions/users are locked), so that there will be no changes in the purchasing base tables.
    Then send your R/3 transport into the Production system (append structure, new fields + user exit code). Replicate on the BW side. Now send the BW transports to the BW production system.
    I had done this earlier for 2LIS_12_VCITM and had a step-by-step procedure written up... I will have to search for that document. Let me know your email ID; I will try to send it once I find the document.
    Regards
    Pradip
    (When you edit the question, there are option buttons with which you can assign points; for that you need to log on to SDN.)

  • How to extract the historical data from R/3

    hi
    I am extracting data from R/3 through LO extraction. The client asked me to enhance the DataSource by adding a field. I have enhanced the field and wrote an exit to populate the data for that field.
    How do I extract the historical data into BI for the enhanced field? A delta load is already running in BI.
    regards

    Hi Satish,
    As per SAP standard, the best way is to delete the whole data from the cube and then load the data from the setup tables, since you have enhanced the DataSource.
    After a DataSource enhancement you can also simply continue loading normally, but then you won't get any historical data for that field.
    The best way is to take downtime from the users; normally we do this on weekends/non-business hours.
    Then fill the setup tables; if the data is of huge volume, you can adopt a parallel mechanism, for example:
    1. Load the setup tables on a yearly basis as a background job.
    2. Load the setup tables on a yearly basis, with posting periods from Jan 1st to Dec 31st of each year, as a background job.
    This can make the load of the setup tables easier and faster. After filling the setup tables you can unlock all users, as there are no worries about postings.
    After that, you can load all the data into BI, first into the PSA and then into the cube.
    Regards,
    Ravi Kanth.

  • How to see the historic data of CAT2

    Dear All,
    I have a question related to CATS.
    In which table can I find the historical data of an employee which is stored via the CAT2 transaction? I tried CATSDB but was not able to get the number of working hours as stored in CAT2.
    I clicked on a field in CAT2 and checked the technical details; there I found the table name CATD, but when I run SE11 and enter this table name, it displays it as a structure, not as a table.
    Please provide your help in this regard.
    Regards,
    -Neha

    The maximum number of columns allowed is 1023, so if there are more data than that, I am afraid there is no way to see them all on one screen.
    The better thing to do is to use the Settings > Format List > Choose Fields option from the selection screen of SE16 to choose just the fields which you want in the output.
    It is highly unlikely that you are using all 100 fields, so you can very well hide a few of them with no impact on your output.
    As someone else has suggested, using a report such as CATSXT_DA will definitely be a much more useful way of viewing all relevant fields from CATSDB.

  • 0HR_PT_2 How to get back historical data for new report time type

    Hi All Expert,
    We have implemented and have been using the 0HR_PT_2 extractor for the past whole year. The delta is working. Recently, there was a requirement to read more data from the ZL cluster table, and more new BW report time types were added to extract such data.
    The delta doesn't pick up the past whole year of data for the new report time types that we added.
    Do we need to re-initialize the load to get back the historical data every time we add a new report time type?
    Please advise, and thanks
    Ken
    Edited by: Ken Hong on Feb 27, 2008 9:24 PM
    Edited by: Ken Hong on Feb 27, 2008 9:25 PM


  • How to fill a new single field in a Infocube with historical data

    Hello Everybody,
    We have an SD infocube with historical data since 1997.
    Some of the infoobjects (fields) of the infocube were empty during all this time.
    Now we require to fill a single field of the infocube with historical data from R/3.
    We were thinking that an option could be to upload data from the PSA in order to fill only the required field (infoobject).
    Is it possible? Is there any problem with uploading from the PSA requests directly to the InfoCube?
    Some people on our team think that the data may be duplicated... are they right?
    Which other solutions can we adopt to solve this issue?
    We will appreciate all your valuable help.
    Thanks in advance.
    Regards.
    Julio Cordero.

    Remodeling in BI 7:
    /people/mallikarjuna.reddy7/blog/2007/02/06/remodeling-in-nw-bi-2004s
    http://www.bridgeport.edu/sed/projects/cs597/Fall_2003/vijaykse/step_by_step.htm
    Hope it helps..

  • Reloading Historical Data into Cube: 0IC_C03...

    Dear All,
    I'm working on SAP BW 7.3. I have five years of historical data loaded into InfoCube 0IC_C03 (Material Stocks/Movements (as of 3.0B)), and the deltas are also scheduled (DataSources 2LIS_03_BF & 2LIS_03_UM) for the InfoCube. I have a new business requirement to reload the entire historical data into the InfoCube with the addition of "Fiscal Quarter", and I do not have a DSO in the data flow. I know it is a tricky InfoCube (Inventory Management) to work on, and reloading the entire historical data can be challenging.
    1. How should I approach this task; what steps should I take?
    2. Drop/delete the entire data from the InfoCube, then delete the setup tables, refill them, and then run a "Repair Full Load"; is this the way?
    3. What key points should I keep in mind, as different BO reports are already running and using this InfoCube?
    4. Should I run "Repair Full Load" requests for all DataSources (2LIS_03_BX, 2LIS_03_BF & 2LIS_03_UM)?
    I will appreciate any input.
    Many thanks!
    Tariq Ashraf

    Hi Tariq,
    Unfortunately, you will need some downtime to execute a stock initialization in a live production system; otherwise you cannot guarantee data integrity. There are, however, certainly ways to minimize it. Please see SAP Note 753654 - How can downtime be reduced for setup table update. You can dramatically reduce the downtime by distinguishing between closed and open periods. The closed periods can be updated retrospectively.
    To make it more concrete, you could consider e.g. the periods prior to 2014 as "closed". You can then do the initialization starting from January 1, 2014. This will save you a lot of downtime. Closed periods can be uploaded retrospectively and do not require downtime.
    Re. the marker update, please have a look at the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x; steps 10, 11, 12 and 13 contain important information.
    Re. the steps, it looks OK for me but please double check with the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x.
    Please have a look at the following document for more background information around inventory management scenarios:
    How to Handle Inventory Management Scenarios in BW (NW2004)
    Last but not least, you might want to have a look at the following SAP Notes:
    SAP Note 436393 - Performance improvement for filling the setup tables;
    SAP Note 602260 - Procedure for reconstructing data for BW.
    Best regards,
    Sander

  • Non-cumulative initialization is not possible with historical data

    Hi all,
    While loading the inventory cube 0IC_C03: after loading, the 2LIS_03_BX request was successful, but while compressing that request with "No Marker Update" I get the following error message:
    Non-cumulative initialization is not possible with historical data
    Regards
    siva.

    Are you sure you didn't use BF instead of BX? This message indicates that you have historical data in your cube, whereas BX only loads a fixed state at a specific point in time... Or maybe you initialized BX twice in R/3 without deleting the previous init; I don't know if that could cause this error, but it's another possibility.

  • Historic data migration (forms 6i to forms 11g)

    Hello,
    We have done a migration from Forms 6i to Forms 11g. We are facing a problem with the historic data for a download/upload file utility. In Forms 6i the upload/download was done using an OLE container, which has become obsolete, the new technology being WebUtil.
    We converted the historic data from LONG RAW to BLOB (by export/import and by TO_LOB), and when opening the documents it either throws a message or is not able to open them. This issue exists for all types of documents, such as .doc, .docx, .html and .pdf. We are unable to open the documents after downloading them to local client machines.
    One option which works is to manually download the documents (pdf, doc etc.) from the older version of Forms 6i (OLE) and upload them to Forms 11g (WebUtil). Is there any way this can be automated?
    Thanks
    Ram
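    For reference, the LONG RAW to BLOB conversion described above can be done in SQL with TO_LOB in an INSERT ... SELECT. This is only a sketch with hypothetical table and column names (doc_store_old, doc_store_new, doc_content); it is not the poster's actual schema.
    -- Hypothetical source table: doc_store_old(doc_id NUMBER, doc_content LONG RAW)
    CREATE TABLE doc_store_new
    (
        doc_id      NUMBER PRIMARY KEY,
        doc_content BLOB
    );
    INSERT INTO doc_store_new (doc_id, doc_content)
    SELECT doc_id,
           TO_LOB (doc_content)   -- TO_LOB converts LONG RAW to BLOB inside INSERT ... SELECT
    FROM   doc_store_old;
    COMMIT;
    If the documents open fine when downloaded through the old Forms 6i OLE container but not after such a conversion, the stored bytes were probably wrapped by the OLE container's own storage format, so the raw file content may need to be extracted as part of the migration rather than copied as-is.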

    Are you colleagues?
    OLE Containers in Oracle Forms 6i

  • How to recreate EBS user and keep all his historical data.

    Hi all
    We have a user who is having an issue seeing any of his scheduled Discoverer reports within the Schedule Manager window of Discoverer Plus; Discoverer Desktop works fine.
    The proposed solution is to recreate the EBS user. The problem with this is that, if we recreate the EBS user, he will lose all historical data connected to that user, including the results of the scheduled Discoverer reports as well as all of the EBS created/last-updated information.
    Is there a way to recreate an EBS user and preserve the historical references?
    Thanks

    Why do you need to recreate the user?
    Are you saying you are going to create a new username for the same user and end-date the old one?
    I believe there is no such way to find all records/tables with the old user_id. Even if you find the list and update them manually, I believe this approach is not supported.
    Please log a SR to confirm the same with Oracle support.
    Thanks,
    Hussein
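    If you do end up having to trace where the old user_id appears, the data dictionary is a reasonable starting point. The sketch below only lists candidate tables that carry the standard WHO columns; it does not change anything, and, as noted above, updating those columns by hand is not a supported approach, so confirm with Oracle Support first.
    -- List tables that carry the standard WHO columns, i.e. candidates that
    -- may reference the old FND user_id in CREATED_BY / LAST_UPDATED_BY.
    SELECT owner, table_name, column_name
    FROM   dba_tab_columns
    WHERE  column_name IN ('CREATED_BY', 'LAST_UPDATED_BY')
    ORDER  BY owner, table_name, column_name;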
