Sample Data Scrambling Scripts

Hi,
I am looking for a generalised data scrambling script, with its algorithm, for scrambling columns containing amounts, SSNs, text, telephone numbers, addresses, etc.
Thanks
AT

Amit, we do this:
1. Scrub the emails. At a minimum, replace the '@' with a '#' in each email in case you have any auto-notification jobs set up. This is a double-edged sword: you might swamp your own mail server with failed emails, but that is better than being tagged as a spammer.
2. SSNs and phone numbers are set to a default dummy value.
3. We have sequences for first/last names and use 'first'||firstname_seq.nextval to generate first names, and the same for last names.
4. We keep the street address, ZIP and state at their real values, as we need authentic data for search purposes.
As far as scripting is concerned, I have a table with four columns: owner, table_name, column_name and scrub_value.
Say the values are:
owner='SCOTT', TABLE_NAME='EMP', COLUMN_NAME='SSN', SCRUB_VALUE='111-11-1111'
owner='SCOTT', TABLE_NAME='EMP', COLUMN_NAME='FIRST_NAME', SCRUB_VALUE='FirstName'||firstname_seq.nextval
Then I generate statements like this on the fly with a cursor:
update scott.emp set ssn='111-11-1111', first_name='FirstName'||firstname_seq.nextval
I just loop through the table and build the scrub statements dynamically.
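A minimal sketch of that generator loop, assuming a config table named scrub_config with the four columns described above (the names are assumptions for illustration, not an actual implementation):

```sql
-- Assumed config table: scrub_config(owner, table_name, column_name, scrub_value)
-- scrub_value holds the literal or expression to assign to the column.
DECLARE
  l_sql VARCHAR2(4000);
BEGIN
  -- One UPDATE per table, concatenating all of its scrub columns
  FOR t IN (SELECT owner, table_name
              FROM scrub_config
             GROUP BY owner, table_name)
  LOOP
    l_sql := 'UPDATE ' || t.owner || '.' || t.table_name || ' SET ';
    FOR c IN (SELECT column_name, scrub_value
                FROM scrub_config
               WHERE owner = t.owner
                 AND table_name = t.table_name)
    LOOP
      l_sql := l_sql || c.column_name || ' = ' || c.scrub_value || ', ';
    END LOOP;
    l_sql := RTRIM(l_sql, ', ');   -- drop the trailing comma
    EXECUTE IMMEDIATE l_sql;
    COMMIT;
  END LOOP;
END;
/
```

Because scrub_value is concatenated straight into the statement, it can carry expressions like sequence calls as well as plain literals.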

Similar Messages

  • Navteq sample data

    hi,
    I have problems during the NAVTEQ sample data import (http://www.oracle.com/technology/software/products/mapviewer/htdocs/navsoft.html). I'm following exactly the instructions from the readme file.
    Import: Release 10.2.0.1.0 - Production on Thursday, 25 September, 2008 10:35:28
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/********@ORCL REMAP_SCHEMA=world_sample:world_sample dumpfile=world_sample.dmp directory=sample_world_data_dir
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "WORLD_SAMPLE"."M_ADMIN_AREA4" 394.5 MB 118201 rows
    After the last log message the system hangs at 50% CPU and nothing happens. What could be the problem? Only the table M_ADMIN_AREA4, with its 118201 rows, was imported.
    Regards, Arnd
    Edited by: aspie on Sep 25, 2008 11:45 AM

    Hi, I followed the instructions from the readme file:
    1. Create a work directory on your machine. Change directory to your work
    directory and download the world_sample.zip file into it. Unzip the
    world_sample.zip file:
    C:\> mkdir world_sample
    C:\> cd world_sample
    C:\world_sample> unzip world_sample.zip
    2. In your work directory, open a Sqlplus session and connect as the SYSTEM
    user. Create a user for the World Sample data (if you have not previously
    done so):
    C:\world_sample> sqlplus SYSTEM/password_for_system
    SQL> CREATE USER world_sample IDENTIFIED BY world_sample;
    SQL> GRANT CONNECT, RESOURCE TO world_sample;
    3. This step loads the World Sample data into the specified user, and creates
    the metadata required to view the sample data as a Map in MapViewer. Before
    you begin, make a note of the full directory path of your work directory,
    since you will be prompted for it. You will also be prompted for the user
    (created in step 2), and the SYSTEM-user password. This step assumes that
    you are already connected as the SYSTEM user, and that you are in your work
    directory. To begin, run the load_sample_data.sql script in your Sqlplus
    session. Exit the Sqlplus session after the script has successfully
    concluded:
    SQL> @load_sample_data.sql
              Enter value for directory_name: C:\world_sample
              Enter value for user_name: world_sample
              Password: password_for_system
    SQL> exit;
    In step 2 I also tried creating a dedicated tablespace for the user; the same problem occurs.
    Regards Arnd.

PER_ADDRESSES Scrambling Script Running for 13 Hours in EBS 12.0.6

    Hi All,
    We scramble addresses in the PER_ADDRESSES table in our development instance (cloned from PRODUCTION) to hide the actual data. We run the following script after every clone, but it runs for more than 13 hours. We run the script as the APPS user.
    Application: EBS 12.0.6
    Database: 10.2.0.4 (Size 500GB)
    OS: RHEL 5.2
    =================================================================
    set serveroutput on size 20000000;
    DECLARE
      CURSOR cur_address_dtls IS
        SELECT pa.address_id address_id, pa.object_version_number object_version_number
          FROM per_addresses pa
         WHERE (   pa.address_line1 IS NOT NULL
                OR pa.address_line2 IS NOT NULL
                OR pa.address_line3 IS NOT NULL)
         ORDER BY pa.address_id;
      TYPE ty_address_list IS TABLE OF cur_address_dtls%ROWTYPE;
      rec_address_list        ty_address_list;
      l_object_version_number NUMBER;
      l_prev_address_id       NUMBER := 0;
    BEGIN
      OPEN cur_address_dtls;
      FETCH cur_address_dtls BULK COLLECT INTO rec_address_list;
      FOR i IN rec_address_list.FIRST .. rec_address_list.LAST
      LOOP
        -- only process each address_id once
        IF nvl(l_prev_address_id, 0) <> rec_address_list(i).address_id THEN
          l_object_version_number := rec_address_list(i).object_version_number;
          BEGIN
            hr_person_address_api.update_person_address(
               p_validate              => FALSE
              ,p_effective_date        => SYSDATE
              ,p_address_id            => rec_address_list(i).address_id
              ,p_object_version_number => l_object_version_number
              ,p_address_line1         => 'ABCDE'
              ,p_address_line2         => 'FGHIJ'
              ,p_address_line3         => NULL);
            COMMIT;
          EXCEPTION
            WHEN OTHERS THEN
              dbms_output.put_line('Error in scrambling address for address id: '
                                   || rec_address_list(i).address_id);
              dbms_output.put_line(SQLERRM);
              ROLLBACK;
          END;
        END IF;
        l_prev_address_id := rec_address_list(i).address_id;
      END LOOP;
      CLOSE cur_address_dtls;
    END;
    /
    ==========================================================
    Oracle suggested the followings
    ==================================================================================
    Issue is not with the update being done on PER_ADDRESSES table or any Core
    Code. APPS performance team reviewed the issue and also the latest trace and
    finds that there is a huge wait event happening with 'TCP Socket (KGAS)'.
    This wait event occurs when trying to send a mail message using utl_tcp or
    utl_smtp, and is having trouble connecting.
    This is not a Core HR code issue. There may be some customization or alerts
    which send mail and are hitting this issue. The customer needs to check the
    custom code or any alerts they have defined, and then check the mail server
    configuration.
    "Can you please check with your internal DBA team to understand the source of the wait on 'TCP Socket (KGAS)'? The raw trace file provided points to it.
    There has to be some custom code which is waiting on this."
    ======================================================================================
    Is it possible to tune this script so it completes in a shorter time span?
    Please help me out.
    Thanks,
    Gohappy

    Hi Srini,
    I found the following triggers on PER_ADDRESSES table.
      #  OWNER  TRIGGER_NAME                TRIGGER_TYPE     TABLE_OWNER  STATUS
     =========================================================================
      1  APPS   PER_ADDRESSES_WHO           BEFORE EACH ROW  HR           ENABLED
      2  APPS   GHR_PER_ADDRESSES_AFIUD     AFTER EACH ROW   HR           ENABLED
      3  HR     DR$PER_ADDRESSES_N4TC       AFTER EACH ROW   HR           ENABLED
      4  APPS   PER_ADDRESSES_OVN           BEFORE EACH ROW  HR           ENABLED
      5  APPS   BEN_EXT_ADD_EVT             AFTER EACH ROW   HR           ENABLED
      6  APPS   HR_PA_MAINTN_ADDR_HIST_BRU  BEFORE EACH ROW  HR           ENABLED
      7  APPS   PERADDRESSES_133I_DYT       AFTER EACH ROW   HR           ENABLED
      8  APPS   PERADDRESSES_10234U_DYT     AFTER EACH ROW   HR           DISABLED
      9  APPS   PERADDRESSES_10242I_DYT     AFTER EACH ROW   HR           DISABLED
     10  APPS   HR_PA_MAINTN_ADDR_HIST_BRI  BEFORE EACH ROW  HR           ENABLED
     11  APPS   PERADDRESSES_56U_DYT        AFTER EACH ROW   HR           ENABLED
     12  APPS   PERADDRESSES_10244U_DYT     AFTER EACH ROW   HR           DISABLED
     13  APPS   PERADDRESSES_10239I_DYT     AFTER EACH ROW   HR           DISABLED
    Can I disable all the currently enabled triggers before running the scrambling script?
    Cheers,
    Gohappy
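    If it is confirmed (with your functional and support teams) that the trigger logic can safely be skipped during scrambling, the disable/re-enable step being asked about could be scripted roughly like this; the dictionary view and filter values below are taken from the trigger listing above, but treat this as a sketch, not a vetted procedure:

    ```sql
    BEGIN
      -- Disable the currently ENABLED triggers on HR.PER_ADDRESSES.
      -- Record this list first, so that afterwards only these are
      -- re-enabled (several triggers above are disabled on purpose).
      FOR t IN (SELECT owner, trigger_name
                  FROM dba_triggers
                 WHERE table_owner = 'HR'
                   AND table_name  = 'PER_ADDRESSES'
                   AND status      = 'ENABLED')
      LOOP
        EXECUTE IMMEDIATE 'ALTER TRIGGER ' || t.owner || '.'
                          || t.trigger_name || ' DISABLE';
      END LOOP;
    END;
    /
    -- ...run the scrambling script, then re-enable the recorded
    -- triggers with the same loop, using ENABLE instead of DISABLE.
    ```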

  • Sample data for learning

    I have installed Oracle 10g Express and am looking for some sample data to learn Oracle. I see HR sample schema but am missing OE, OC, PM, IX, SH schemas that I read about on several documents. Where do I get these? Is there a place I can download script to create these schemas?
    Thanks,
    Sanjeev

    One thing you should be aware of: if you installed Oracle XE you won't be able to install the SH demo schema, since it requires partitioning, which is only available in Enterprise Edition. So even if you can download or extract the demo files from the companion disk, if you plan to use all the features of the demo schemas you must install Oracle Enterprise Edition instead of Express Edition.
    ~ Madrid

  • Oracle Reports Developer 6i - sample data tables

    I downloaded and installed all the components of Oracle Reports 6i on my PC. When I look at the Oracle Reports Developer manual (building reports.pdf) it refers to database tables like stocks, stock_history etc. I am trying to learn how to use the tool with this manual. Aren't the SQL scripts that create those tables and load the sample data part of the installation? Am I missing anything? I will appreciate any input. Again, I did a full install by selecting all the components, not just a typical install.
    Thanks in advance.

    Hi Suresh
    Try locating these tables in the SCOTT schema. If you don't find them there, look for the scripts that create these tables in the Oracle Database installation folder.
    Regards
    Sripathy

  • Sales schema and data load script

    Hi,
    I am trying to learn OBIEE on my own. I managed to install Oracle XE and OBIEE 10g. I need a script to create a new DB schema and load sales tables with sample data so I can create reports. Can anyone help me with the scripts?

    As I mentioned, I don't think these scripts come with an XE install, primarily because XE may not be certified with the sample schemas. This is why I would recommend you install the full DB instead of trying to run these scripts on XE.
    Did you try installing the ACTUAL Oracle DB Enterprise Edition? I believe there is an examples part of the install that will have these scripts. Please try this and let me know.
    If this helped, please mark as helpful or correct.

  • Tool for creating sample data

    Greetings all,
    I have an application with some lookup tables used for select lists etc. When I export the application I can get it to create the tables on installation, but they are empty, and I was wondering if there are instructions out there on how to have the data inserted into the tables (like the sample data is inserted in the OEHR database). I know I can import the data from CSV files after the application is installed, but I want the lookup tables populated when the application is deployed.
    I have been taking my application and installing it on other APEX instances, but then have to go in and import the table data. I looked at the OEHR sample data build and see that it does some INSERT INTO statements, but I wanted to find out if there is an automated way to do this, or instructions on how to add this to the application SQL file. There is a lot in the OEHR SQL file, and I don't know what is required to make the inserts work.
    Thanks in advance
    Wally
    Example table:
    CategoryTable
      CategoryID    NUMBER        PK
      CategoryName  VARCHAR2(50)
    Data:
      CategoryID  CategoryName
      1           Cat1
      2           Cat2
      3           Cat3
      4           Cat4
      5           Cat5

    This is what I do when I want to bundle supporting (lookup) table data: I use SQL Developer to export my table data as insert statements. I save those insert statements to a script and then add them to the application install file via the Supporting Objects tab in APEX.
    See this site for more information: http://www.yocoya.com/downloads/apexsuppaper.pdf
    Thank you,
    Tony Miller
    Webster, TX
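    For the CategoryTable example above, the script bundled under Supporting Objects would just be ordinary DDL and insert statements, something like this (the table and column names here are illustrative, mapped from the example layout):

    ```sql
    -- Hypothetical script for the CategoryTable example above
    CREATE TABLE category_table (
      category_id   NUMBER PRIMARY KEY,
      category_name VARCHAR2(50)
    );

    INSERT INTO category_table (category_id, category_name) VALUES (1, 'Cat1');
    INSERT INTO category_table (category_id, category_name) VALUES (2, 'Cat2');
    INSERT INTO category_table (category_id, category_name) VALUES (3, 'Cat3');
    INSERT INTO category_table (category_id, category_name) VALUES (4, 'Cat4');
    INSERT INTO category_table (category_id, category_name) VALUES (5, 'Cat5');
    COMMIT;
    ```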

  • Time doesn't match sampled data?

    Hallo all experts,
    I wrote LV code which reads data from a USB-6211 and saves it with time instants to a text file, but the time instants don't correspond to the sampled data. The time values are generated by Elapsed Time; after Build Array with the data read from the DAQ, they are fed to Write to Text File. The test signal is 10 Hz, but the text file shows a 0.2 Hz signal. How can I synchronize them?
    Any tips are highly appreciated.
    win2

    Don't use the "Elapsed Time" express VI for precision timing. It seems to have limited resolution (internally, it converts a timestamp to a DBL).
    You can use e.g. the Tick Count to keep track of the time. See the attached comparison (there will still be some subtle differences due to software timing).
    Attachments:
    usb6211_forumMOD.vi ‏42 KB

  • How to renormalize number of flows in Netflow Sampled data

    Hi,
    I am working on extrapolation (renormalization) of bytes/packets/flows from randomly sampled (1 out of N packets) collected data. I believe bytes/packets can be renormalized by multiplying the bytes/packets value in each exported flow record by N.
    Now I am trying to extrapolate the number of flows. So far I have not found any information on it. Do you have any idea how flow counts can be renormalized from sampled data?
    At the same time, I have some doubts about this concept altogether:
    1. In packet sampling, we do not know how many flows got dropped. Even the router cache will not have entries for dropped flows.
    2. In flow sampling, the router cache maintains entries for all the flows, so there may be some way to know how many actual flows there were. But again, there is no way to know the values of individual attributes of missed flows, like srcip/dstip/srcport/dstport etc. (though they are in the flow cache).
    3. In the case of sampling 1 out of N packets, we multiply #packets and #bytes by N to arrive at estimates for total packets and bytes. Multiplying by N means we have also accounted for the packets that were NOT sampled, so all the packets that flowed between source and destination are covered. Then there are no missed flows, are there? And if some missed flows do exist, then multiplying by N to extrapolate the number of packets/bytes is not correct either.
    4. What is the use of the flow count anyway? The number of flows varies with configuration such as the active timeout, so unlike packet and byte counts it says little about the actual traffic between source and destination.
    Please share your thoughts.
    Thanks,
    Deepak
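    The byte/packet renormalization described in point 3 is simple arithmetic; over a hypothetical table of exported flow records it could be expressed like this (the table and column names are made up for illustration):

    ```sql
    -- sampled_flows: hypothetical table of flow records exported by a
    -- 1-in-N packet sampler. Packet and byte counts are renormalized by
    -- multiplying by the sampling rate N; as discussed above, there is
    -- no equally simple multiplier for the flow count itself.
    SELECT srcip,
           dstip,
           packets * :sampling_rate_n AS est_packets,
           bytes   * :sampling_rate_n AS est_bytes
      FROM sampled_flows;
    ```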

    The simplest way is to call GetTableCellRangeValues with VAL_ENTIRE_TABLE as the range, then sum the array elements.
    But I don't understand your comment about the checksum, so this may not be the most correct method for your actual needs: can you explain what you mean?

  • Plotting data sampled 360 degrees around a line

    I'm doing a scan around a line by sampling data 360 degrees for every value of z (z is the position on the line). So that means I have a double for-loop where I collect the data. The problem comes when I try to plot the data. How should I do it?

    Jonas,
    I think what you want is a 3D plot of a cylinder. I have attached an example using a parametric 3D plot.
    You will probably want to duplicate the points for the first theta value to close the cylinder. I'm not sure which properties of the graph can be manipulated to make it easier to see.
    Bruce
    Bruce Ammons
    Ammons Engineering
    Attachments:
    Cylinder_Plot_3D.vi ‏76 KB

  • For this sample data how to fulfill my requirement ?

    For this sample data how to fulfill my requirement ?
    with temp as (
    select 'MON' WEEKDAY,'9-10' TIMING,'I' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'9-10' TIMING,'II' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'9-10' TIMING,'III' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'10-11' TIMING,'I' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'10-11' TIMING,'II' CLASS FROM DUAL UNION
    select 'TUE' WEEKDAY,'9-10' TIMING,'I' CLASS FROM DUAL UNION
    select 'TUE' WEEKDAY,'9-10' TIMING,'II' CLASS FROM DUAL
    )
    select ?? (what will be the query ??)
    How can I get the output data in this way:
    WEEKDAY TIMING CLASS
    MON 9-10 I,II,III
    MON 10-11 I,II
    TUE 9-10 I,II

    If you are on 11g, you can use LISTAGG:
    with temp as (
    select 'MON' WEEKDAY,'9-10' TIMING,'I' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'9-10' TIMING,'II' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'9-10' TIMING,'III' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'10-11' TIMING,'I' CLASS FROM DUAL UNION
    select 'MON' WEEKDAY,'10-11' TIMING,'II' CLASS FROM DUAL UNION
    select 'TUE' WEEKDAY,'9-10' TIMING,'I' CLASS FROM DUAL UNION
    select 'TUE' WEEKDAY,'9-10' TIMING,'II' CLASS FROM DUAL
    )
    select
    WEEKDAY,
    TIMING,
    LISTAGG(CLASS,',') WITHIN GROUP (ORDER BY CLASS) as class_aggregate
    from temp
    GROUP by WEEKDAY,TIMING;

    WEEKDAY       TIMING     CLASS_AGGREGATE
    MON           9-10       I,II,III
    MON           10-11      I,II
    TUE           9-10       I,II

    Other techniques for different versions are also mentioned here:
    http://www.oracle-base.com/articles/misc/StringAggregationTechniques.php#listagg

  • ChaRM- Is data scrambling required for Implementing ChaRm in SM?

    Hello
    Is data scrambling required for the implementation of ChaRM in Solution Manager? I've heard that data scrambling is a requirement for ChaRM; if so, is there documentation from SAP with the specifics? I do not understand why data needs to be scrambled for transport management.
    Thanks!

    We have been using ChaRM since 2007, and no data scrambling is required for ChaRM itself. ChaRM just takes control over TMS. It is a process to ensure quality and to facilitate audits.
    Regards,
    Holger

  • Need help in framing an SQL query - Sample data and output required is mentioned.

    Sample data :
    ID Region State
    1 a A1
    2 b A1
    3 c B1
    4 d B1
    Result should be :
    State Region1 Region2
    A1 a b
    B1 c d

    create table #t (id int, region char(1),state char(2))
    insert into #t values (1,'a','a1'),(2,'b','a1'),(3,'c','b1'),(4,'d','b1')
    select state,
     max(case when region in ('a','c') then region end) region1,
      max(case when region in ('b','d') then region end) region2
     from #t
     group by state
    Best Regards, Uri Dimant, SQL Server MVP

  • BPC 5 - Best practices - Sample data file for Legal Consolidation

    Hi,
    we are following the steps indicated in the SAP BPC Business Practice: http://help.sap.com/bp_bpcv151/html/bpc.htm
    A Legal Consolidation prerequisite is the sample data file, which we do not have: "Consolidation Finance Data.xls".
    Does anybody have this file or know where to find it?
    Thanks for your time!
    Regards,
    Santiago

    Hi,
    From this address [https://websmp230.sap-ag.de/sap/bc/bsp/spn/download_basket/download.htm?objid=012002523100012218702007E&action=DL_DIRECT] you can obtain a .zip file for the Best Practice, including all scenarios and the CSV files used in these scenarios under the misc directory.
    Consolidation Finance Data.txt is in there as well.
    Regards,
    ergin ozturk

  • AP/PO Sample Data

    I have a requirement to develop a PO/AP environment. Does anyone have sample data I can use to test requisition and PO approval?
    Can anyone send sample PO/AP data so I can load it with Data Loader?

    Hi VJ,
    The system will place a hold only when there is a mismatch between the purchase data and the invoice data. If you get invoice details with all the required information and it matches the PO (even if the receiving is entered on a different PO line), the system will pass the invoice.
    Hope this answers your query, please let me know if you are looking for any specific information.
