Data verification

Hi,
A customer is planning to upgrade their Oracle database from 8i to 10g. The database is in a validated state and contains critical (pharmaceutical) data that must stay correct. After the upgrade they want to check that the data is still the same as before, e.g. text strings must not differ and no data may have been changed during the upgrade.
Is there a good practice to confirm that the data is still the same after the upgrade?

One way is to use the MINUS operator. I remember doing this once.
Objects comparison:
select *
  from (select object_name, object_type from user_objects
        minus
        select object_name, object_type from user_objects@newdatabase)
 order by object_type, object_name;
Column comparison:
select table_name, column_name from user_tab_columns
minus
select table_name, column_name from user_tab_columns@newdatabase;
You can do the same for the data of important tables. Note that MINUS only shows rows present on one side, so run it in both directions to catch differences on either side.
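If pulling every row across a database link is too slow for the big tables, an alternative (sketched below in Python, not part of the original advice; the table contents here are invented) is to export the rows from each database and compare order-independent checksums:

```python
import hashlib

def table_checksum(rows):
    """Order-independent checksum of a table's rows.

    Each row is a tuple of values; rows are serialized and sorted so that
    physical row order (which may change during an upgrade or export)
    does not affect the result.
    """
    serialized = sorted(repr(row) for row in rows)
    digest = hashlib.sha256()
    for line in serialized:
        digest.update(line.encode("utf-8"))
        digest.update(b"\n")
    return digest.hexdigest()

# Hypothetical example: rows as fetched before and after the upgrade.
before = [(1, "Aspirin", "100mg"), (2, "Ibuprofen", "200mg")]
after  = [(2, "Ibuprofen", "200mg"), (1, "Aspirin", "100mg")]  # same data, new order
assert table_checksum(before) == table_checksum(after)

tampered = [(1, "Aspirin", "100mg"), (2, "Ibuprofen", "400mg")]
assert table_checksum(before) != table_checksum(tampered)
```

For a validated pharmaceutical system you would of course document whatever method you choose as part of the upgrade qualification.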

Similar Messages

  • Reconciliation and data verification approaches for data in SAP R/3 and BW

    Hi
    Can anybody suggest the different reconciliation and data verification approaches to ensure that BW and R/3 source data are in sync?
    Thanks in advance.
    Regards,
    Nisha.

    Hi
      What you can do is go to R/3 transaction RSA3 and run the extractor; it gives you the number of records extracted. Then go to the BW monitor and check the number of records in the PSA.
    If the two counts match, there you go.
    There is a HOW-TO document on the Service Marketplace, "Reconcile Data Between SAP Source Systems and SAP BW". Check this document as well; it is very helpful.
    There is another HOW-TO document, "Validate InfoCube Data By Comparing It With PSA Data". This is also a good document.
    Hope it helps.
    Hari Immadi
    http://immadi.com
    SEM BW Analyst

  • Data Warehouse and ETL tools for data verification ?

    Data Warehouse and ETL tools for data verification ?
    How do we do data verification using an ETL tool? And how does this relate to the data warehouse?
    Thanks in Advance

    Hi  Shyamal Kumar,
    1) BW itself facilitates the ETL (Extraction, Transformation, Loading) steps, for example:
                    Extraction - from SAP or other databases
                    Transformation - using transfer rules and update rules
                    Loading - loading into ODS objects, cubes, and master data
    2) Typical ETL tools used in the industry are:
         a)   DataStage from Ascential (owned by IBM)
         b)   Informatica
         c)   Mercator
    Regards, BB

  • ix12-300r data verification in the middle of the day and can't access shares

    The RAID verification has kicked off in the middle of the business day and has degraded performance to almost unusable. This is a big volume at 18 TB, so we might as well send the users home.
    There is no way to turn this off once it has started, and there appears to be no way to schedule when it will run.
    Has anyone found a work-around for this? We really have not hit this issue before with older firmware, so I am not sure whether it is a new firmware issue or we have just been lucky. Currently on 4.0.4.146.
    Any basic RAID controller lets you schedule data verification; that appears not to be the case with LenovoEMC solutions, which makes them unusable in a corporate environment.

    It is unfortunate to hear that, as the device should automatically detect the new disk and rebuild the RAID array. If the replacement hard drive is not a Lenovo-approved drive or was not received from an authorized reseller, the device may not detect the hard drive.
    For a list of approved drives, see: Which hard drive brands and models are approved for use with the StorCenter ix12-300r?
    If the replacement is not from one of our resellers, it is recommended to clean the device before use: Cleaning a hard disk for use with an ix or px network storage device
    Also, is this device in warranty? You may want to call in to technical support to make sure that the device is functioning properly and the firmware is not corrupted.

  • Changes to data verification between 10g and 11g Clients

    The 10g and 11.1.0.7 Databases are currently set to AL32UTF8.
    In each database there is a VARCHAR2 field used to store data, not AL32UTF8 text but encrypted data.
    Using the 10g client to connect to either the 10g or the 11g database works fine.
    Using the 11.1.0.7 client against either the 10g or the 11g database produces the error: ORA-29275: partial multibyte character
    What has changed?
    Was it considered a Bug in 10g because it allowed this behavior and now 11g is operating correctly?

    29275, 00000, "partial multibyte character"
    // *Cause:  The requested read operation could not complete because a partial
    //          multibyte character was found at the end of the input.
    // *Action: Ensure that the complete multibyte character is sent from the
    //          remote server and retry the operation. Or read the partial
    //          multibyte character as RAW.
    It appears to me that a bug got fixed.
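As an aside, the failure mode is easy to reproduce outside Oracle. The sketch below (Python, purely illustrative and not part of the original answer) shows why arbitrary encrypted bytes stored in a UTF-8 column can end in a partial multibyte character and be rejected on read:

```python
# Arbitrary byte sequences (such as ciphertext) are often not valid
# UTF-8: in particular they can end in the middle of a multibyte
# character, which is exactly what ORA-29275 complains about.
data = "é".encode("utf-8")        # b'\xc3\xa9' - one 2-byte UTF-8 character
partial = data[:1]                # b'\xc3'     - the first byte only

try:
    partial.decode("utf-8")       # analogous to the client's charset check
    ok = True
except UnicodeDecodeError:
    ok = False

assert not ok  # a partial multibyte sequence is rejected
```

This is why the error text suggests reading such data as RAW instead of as character data.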

  • File time & Date verification

    How can I verify the time and date stamp on a file after the system clock time has been changed and the BIOS chip is trashed? I am trying to verify when a trashed computer was last accessed. Other than looking at the modified dates of files within the system32 folder, I cannot tell for certain whether the modified date and time are genuine because of the change of BIOS settings. Is there any off-the-shelf software I can use that would give me a reliable answer? Any help would be really appreciated.

    I don't think there is a simple way to detect that. Perhaps an IT forensics expert could.

  • How to turn off data verification?

    Is there a way to turn off data verification after burning a disc? I burn a lot of discs with photos, but the verification costs way too much time, and I don't have that time because my clients are waiting to get their discs asap.
    I can turn it off in Roxio Toast 8, but since burning in the Finder is much easier for me when using Aperture, it would be great to turn it off in the Finder as well.
    Thanks!

    This doesn't address your question but the new Toast 9 accesses the Aperture library using the Media Browser, which may make using it easier for you. One thing to note is that any changes made to the library after Toast is open aren't recognized unless Toast is closed and reopened. I do a lot of photo work with clients and very much like Toast's Photo Disc feature because it gives them a full resolution Mac & PC slide show as well as the image files. You have that on Toast 8, too.

  • Workflow : Material Data Verification

    Hi,
    I have created a workflow to review material master data. It works fine in test mode but does not work in the real-time environment.
    Please guide.
    Urvang Vakharia

    Hi,
    Yes, my workflow is triggered, with the following status:
    Receiver Data:
    Receiver FM = SWW_WI_CREATE_VIA_EVENT_IBF
    RFC Destination = WORKFLOW_LOCAL_190
    Trace Data
    Action = Receiver started correctly
    RFC Status = Password logon no longer possible - too many faile

  • Date Verification

    In Labview 7.1, how can you
    -display the current date on the front panel
    -allow user to edit the date
    -verify that it is a valid date

    The simplest solution is to use a time stamp control, since the time/date format is managed automatically. You can then add an "In-range" comparison node to enforce the upper and lower acceptable values.
    See the attached vi.
    Chilly Charly    (aka CC)
    Attachments:
    Check time.vi (21 KB)
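Outside LabVIEW, the same idea (accept the edited date only if it parses as a valid calendar date and falls inside the acceptable range) can be sketched as follows; the format string and bounds here are arbitrary assumptions, not from the original post:

```python
from datetime import date, datetime

def parse_date_in_range(text, lower, upper):
    """Return a date if `text` is a valid date within [lower, upper], else None."""
    try:
        d = datetime.strptime(text, "%Y-%m-%d").date()
    except ValueError:
        return None                      # not a valid calendar date
    return d if lower <= d <= upper else None

lo, hi = date(2000, 1, 1), date(2030, 12, 31)
assert parse_date_in_range("2024-02-29", lo, hi) == date(2024, 2, 29)  # leap day ok
assert parse_date_in_range("2023-02-29", lo, hi) is None               # invalid date
assert parse_date_in_range("1999-12-31", lo, hi) is None               # out of range
```

The time stamp control plus In-range node in the attached VI bundles exactly these two checks.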

  • DVD studio pro 4 vs Eclipse Data Verification

    Having a few headaches from the duplicator with an otherwise seemingly perfect DVD.
    DVD Studio Pro 4 reports no errors; the disc burns fine, encoded great, and plays on six DVD players here in the studio, plus computers.
    I sent the image off for duplication and the company reported a ton of errors; here is a link to the report:
    http://lightrhythmvisuals.com/downloads/report.pdf
    I've got no idea what these error messages mean, and I asked for more information about what specifically is wrong with the DVD. The duplicator just exclaims that "we are getting these error reports - there's a problem with the authoring." I've tried calling Eclipse and nobody seems to be able to help.
    Upon checking Eclipse's site, which I have been assured is a professional industry leader in optical data and technology, it seems their software scans for thousands of errors. Is there a chance this software is a little oversensitive and my DVD is actually fine?
    Has anybody had this issue before? Any ideas, anyone? The deadline has now passed, so any news is a bonus.

    Thanks for your response. Yep, 16 tracks on the disc, as opposed to chaptering one timeline. It's more like a music video album than a film.
    OK, well, getting somewhere with this: the duplicator team have got back to me saying that contents on the data side of the disc were confusing the software and triggering errors. I have shortened the names of a lot of files on the data side and re-authored just in case.
    I'm surprised Eclipse Data is the industry standard; there's virtually no information about it online and support is minimal.
    More as it happens.

  • Inventory data verification BW vs R/3

    Hi Experts,
    I'm working in support projects, and my user raised an issue that the inventory report does not tie out with the R/3 report.
    Please note, my inventory report is based on the standard inventory InfoCube 0IC_C03.
    When I generate a report by plant for the valuated stock quantity and valuated stock value and compare it with the R/3 transaction MC.5, the final figures do not tie out with the R/3 system.
    Can anybody suggest where the problem could be and how to investigate?
    SRI

    Hello Guys,
    Is there a way to find standard BI reports that are similar to standard SAP reports?
    E.g.: for the standard SAP reports MC.9 and MB5W,
    what would be the equivalent SAP BI reports?
    Thanks
    Mohan

  • Turn Burn "verification" off before burning a CD/ DVD

    What I love about OS X is that you can easily burn any CD/DVD without having to jump through all the hoops found in Windows.
    HOWEVER, there are times when I just want to turn off the data verification before a CD/DVD starts burning. I could simply hit the "X" and deal with all the other error and warning messages after it finishes... OR I could turn the data verification off beforehand and have a smooth process.
    I know that programs like Dragon Burn and Toast allow you to turn the verification off; however, I am not planning on spending $49-$99 just for that feature.
    So, does anybody know how to do this without having to mess around in the Terminal too much?
    Thanks

    Sonicray wrote:
    Thank you for the reply, I am not burning disk images, I am burning different documents, presentations, videos and audio.
    I have used hdiutil before but there is no real point for me to build disk images out of my files if I am only going to use it once...
    I was hoping there would be a check box or a shortcut key somewhere for turning the verification off...
    I'm not aware of an option like this; it's probably not exposed in the GUI at all.

  • MDM for Managing Master data for PP & QM Modules

    We have studied the various offerings of SAP MDM for materials, suppliers, and financials.
    These are definitely areas we would like to automate using MDM, thereby improving quality and turnaround time and eliminating errors, redundancy, etc.
    In these areas we have also succeeded to some extent in controlling the maintenance process by centralizing it and building checks and controls into the configuration of the different modules.
    The main pain areas in terms of master data have been in the PP and QM modules.
    PP: maintenance of BoMs and recipes, which are based on the process package finalized by the R&D section.
    R&D scientists are not so familiar with SAP and SAP settings.
    A process package is usually a simple document defining the specific quantities of the different components required to produce a specific quantity of a product, and it also describes the procedure of the entire production process; but when it is translated into a recipe and BoM, different tables in SAP come into play.
    Eventually the BoMs and recipes are created by manual input of information from the R&D process package.
    Similarly, the QM master data (the inspection plan in SAP) is based on the specification finalized by the quality wing of R&D, whose scientists are likewise not so familiar with SAP and SAP settings.
    A specification is usually a simple document defining the different tests to be performed for inspecting a material and the expected range in which the results should fall for the material to be accepted; but when it is translated into an inspection plan, different tables in SAP come into play.
    Eventually the inspection plans are created by manual input of information from the R&D specification.
    There is no direct relation between the original R&D document and the SAP master data; verification is a laborious manual process.
    We are wondering whether SAP MDM has the scope to cover these two areas.
    Tks

    Hi Hari,
    If I understand correctly, you are doing a feasibility study and trying to find out whether the standard offering in SAP MDM supports your requirement of linking master data to R&D documents.
    If that's the case, I guess you can design a solution based on your business requirement. For this you can either create a custom repository or extend the currently offered standard business content. You can link the documents to the master data, and also store them as PDFs, images, links, etc.
    Thanks,
    Ravi

  • Refresh cube data to compare prod vs. uat data

    Hello Guru's
    I am working on a project where I need to certify the regression of the cube data and the reports built on top of the cube.
    I have suggested extracting the prod data into UAT, rebuilding the cube, then extracting the data from the UAT cube and the prod cube and comparing. Any difference reflects a change in the cube logic etc., and the check fails.
    Similarly, once the cubes are rebuilt, run the reports/dashboards and compare them against the prod dashboards using binary comparison: if both are the same it passes; if there is any difference it fails.
    I would like feedback/suggestions on whether the above sounds good and workable, or whether there is a better approach. There are no SQL scripts (written by the test team) to compare the results, and no one wants to invest in this. Kindly help me with this.
    Thanks

    A couple of suggestions that might help your testing: save the cubes, dimensions, and/or AWs to templates using the Save to Template context menu item on the cubes and dimensions in the AWM navigator. This will enable you to compare the XML templates to see whether the definitions of the cubes, dimensions, and/or AWs in your various test environments are the same.
    Also, after building each cube, you might compare the views associated with each cube's data for data verification. This might be how you are extracting the data anyway.
    --Ken Chin
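The template comparison suggested above is easy to automate. A minimal sketch (in Python; the element names below are invented and real AWM templates are far larger) that normalizes insignificant whitespace before comparing, so reformatting alone does not flag a difference:

```python
import xml.etree.ElementTree as ET

def canonical(xml_text):
    """Serialize an XML document with whitespace-only text nodes removed,
    so that indentation differences do not cause false mismatches."""
    root = ET.fromstring(xml_text)
    for el in root.iter():
        if el.text and not el.text.strip():
            el.text = None
        if el.tail and not el.tail.strip():
            el.tail = None
    return ET.tostring(root)

# Hypothetical template fragments from two environments.
prod_template = '<Cube name="SALES"><Dimension name="TIME"/></Cube>'
uat_template  = '<Cube name="SALES">  <Dimension name="TIME"/>  </Cube>'
assert canonical(prod_template) == canonical(uat_template)   # same definition

changed = '<Cube name="SALES"><Dimension name="TIME_DAY"/></Cube>'
assert canonical(prod_template) != canonical(changed)        # real difference
```

For large templates you would diff the canonical forms with a normal text diff tool rather than a single equality check.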

  • Can I use an OLE DB Command Task to call a parameterized stored procedure, perform some data editing and pass variables back to the SSIS for handling?

    I am using a Data Flow and an OLE DB Source to read my staged third-party external data. I need to do various Lookups to try to determine whether I can find the external person in our database: by SSN, by name and DOB, etc.
    Now I need to do some more data verification based on whichever Lookup succeeds. Can I do those data edits against our SQL Server application database using an OLE DB Command? Should I use a stored procedure, or can I use straight SQL to perform my edits against every staging row with a parameter-driven query? I'm thinking a stored procedure is the way to go here, since I have multiple edits against the database. Can I pass back the results of those edits via a variable and then continue my SSIS Data Flow by analyzing the result of my stored procedure? And how would I do that?
    I am new to the SSIS game here, so please be kind and as explicit as possible. If you know of any good web sites that walk through how to perform SQL Server database edits against external data in SSIS, or even a YouTube video, please let me know.
    Thanks!

    Thanks for that... but can I do multiple edits in my stored procedure, Vaibhav, and pass back something that I can then utilize in my SSIS? For example:
    One and only one member span: I'd be doing a SELECT COUNT(*) based on my match criteria, handling the count accordingly in my stored procedure, passing something back via the OLE DB Command, and handling it appropriately in SSIS.
    Are there "Diabetes" claims? Again, probably by analyzing a SELECT COUNT(*).
    Am I expecting too much from SSIS? Should I be doing all of this in a stored procedure? I was hoping to use the SSIS GUI for everything, but maybe that's just not possible. Perhaps I should rather use the stored procedure to analyze my staged data, edit accordingly, and do the data stores accordingly (especially the data anomalies), and then use SSIS to control navigation.
    Your thoughts?
    Could you maybe clarify the difference between an OLE DB Command on the Data Flow and the Execute SQL Task on the Control Flow...
    You can get return values from the OLE DB Command if you want to stay in the pipeline.
    See this link for more details:
    http://josef-richberg.squarespace.com/journal/2011/6/30/ssis-oledb-command-and-procedure-output-params.html
    The procedure should have an output parameter defined for that.
    I believe that if you have the flexibility of using a stored procedure, you may be better off doing this in an Execute SQL Task in the control flow. Calling the SP in the data flow will execute it once for each row in the dataset, whereas in the control flow it will do set-based processing.
    Visakh
