Oracle Data Cleansing

Hi experts
I hope you can help me. Do you know of an Oracle data cleansing tool? I have to apply data cleansing to Oracle tables.
Kind regards

Oracle Warehouse Builder has some data quality profiling and can be programmed for cleansing, but OWB is a tool more for ETL programmers than for business people.
However, a cleansing "tool" does not need to be Oracle-specific.
Data is data, whether it is stored in Oracle or not.
Try searching for generic data cleansing tools.
What are your requirements? Is it only deduplication, or anomalies, or typing errors, or normalization of addresses, or something else?
It also depends on how the business understands cleansing and on its definition of "bad" data.
PS: I remember Oracle previously (in v6, 7, 8) had an address cleansing product; it later became a third-party product.
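For the simple deduplication case, a lot can be done directly in SQL before reaching for a tool. A minimal sketch, assuming a hypothetical CUSTOMERS table with an EMAIL column used as the matching key:

-- Hypothetical example: list duplicate customers that share an email address,
-- keeping the lowest CUSTOMER_ID per group as the survivor.
SELECT customer_id, email
FROM  (SELECT customer_id,
              email,
              ROW_NUMBER() OVER (PARTITION BY LOWER(TRIM(email))
                                 ORDER BY customer_id) AS rn
       FROM   customers)
WHERE  rn > 1;

The inner ROW_NUMBER() ranks rows within each normalized email value; everything ranked above 1 is a candidate duplicate to review, merge, or delete.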

Similar Messages

  • Missing Data Cleanse package option in Data services Repository Manager

    Hello, I am trying to create a repository for the data cleansing package in the database using Repository Manager, but the drop-down list box does not offer a "Data Cleansing" option for creating the repository. Only three values are available (Local, Central, and Profile).
    We are using Data Services 3.0. Any help or workarounds for creating this repository with Repository Manager would be appreciated.
    Thanks,
    Kiran

    We searched log files and found no errors.
    We then decided to attempt re-creating the local repository.  We checked the Show Details box.
    The local repository was reported as having been successfully created. This was the result we had before.
    We then checked for canned functions, and the list matched those of our Oracle and SQL Server local repositories.
    So our best guess is that something went wrong during the initial local repository creation that was either not reported, or was reported but we missed it.

  • ERP Data Cleansing

    Oracle has ERP databases. The current trend is to extract data from an ERP system and then, through data cleansing, migrate it to e-commerce data. From a content management point of view, we want to know what the input data source will be and what the expected output is. Since an ERP data source is very large, how will a manufacturer trim the data and hand it over?
    Please contact me at [email protected]

    I think this is a very valid question and needs an answer from an expert. Has anyone attempted to use DQS and MDS along with SSIS to ensure data quality in a migration project?

  • How to export data as an XML file from an Oracle database?

    Could you please tell me the step-by-step procedure for the following: how do I export data as an XML file from an Oracle database? Is it possible? Please tell me, it is an urgent requirement.
    Thanks in advance
    Bala

    SQL> SELECT * FROM v$version;
    BANNER
    Oracle DATABASE 11g Enterprise Edition Release 11.1.0.6.0 - Production
    PL/SQL Release 11.1.0.6.0 - Production
    CORE    11.1.0.6.0      Production
    TNS FOR 32-bit Windows: Version 11.1.0.6.0 - Production
    NLSRTL Version 11.1.0.6.0 - Production
    5 rows selected.
    SQL> CREATE OR REPLACE directory utldata AS 'C:\temp';
    Directory created.
    SQL> declare
           doc    DBMS_XMLDOM.DOMDocument;
           xdata  XMLTYPE;

           CURSOR xmlcur IS
             SELECT xmlelement("Employee",
                               XMLAttributes('http://www.w3.org/2001/XMLSchema' AS "xmlns:xsi",
                                             'http://www.oracle.com/Employee.xsd' AS "xsi:nonamespaceSchemaLocation"),
                               xmlelement("EmployeeNumber", e.empno),
                               xmlelement("EmployeeName", e.ename),
                               xmlelement("Department",
                                          xmlelement("DepartmentName", d.dname),
                                          xmlelement("Location", d.loc)))
             FROM   emp e,
                    dept d
             WHERE  e.deptno = d.deptno;
         begin
           OPEN xmlcur;
           FETCH xmlcur INTO xdata;
           CLOSE xmlcur;
           doc := DBMS_XMLDOM.NewDOMDocument(xdata);
           DBMS_XMLDOM.WRITETOFILE(doc, 'UTLDATA/marco.xml');
         end;
         /
    PL/SQL procedure successfully completed.
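    If a full DOM is not needed, a simpler route (my own sketch, not from the original reply) is DBMS_XMLGEN, which turns any query into canonical ROWSET/ROW XML; the resulting CLOB can then be written to the same directory object, for example with DBMS_XSLPROCESSOR.CLOB2FILE:

    -- Hedged alternative sketch: generate <ROWSET>/<ROW> XML for a query
    -- and write the CLOB to the UTLDATA directory object created above.
    DECLARE
      l_xml CLOB;
    BEGIN
      l_xml := DBMS_XMLGEN.getXML('SELECT e.empno, e.ename, d.dname, d.loc
                                   FROM emp e JOIN dept d ON e.deptno = d.deptno');
      DBMS_XSLPROCESSOR.clob2file(l_xml, 'UTLDATA', 'emp_rowset.xml');
    END;
    /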

  • Data Cleansing in CRM Web UI

    Dear Experts,
    I have activated Data Cleansing in CRM 7.0 and it is working.
    Can anyone advise on the settings or configuration required to get the CRM data cleansing functionality to copy communication data stored at the relationship level from the source record to the master record?
    Example:
    Company A has a relationship with Contact person A - this relationship has a mobile number and email address maintained.
    When merging the account of Company A (source) with a duplicate company A1(master), we need to copy the relationship (already occurs) between Contact Person A and Company A and the relationship level mobile number and email address to the master account which will be company A1 (this does not occur).
    Currently the relationship level communication data is not copied.
    Please provide step-by-step configuration settings to enable this. I already have the SAP Notes on data cleansing and have covered all the wikis on the web. What I need are specific steps to overcome this issue.
    Thanks!!!

    In order to transfer communication data maintained at the contact person relationship level, you have to maintain the required nodes in transaction BUSWU02: BUP115 (Contact's Work Address) and BUP120 (Contact's Communication Data).
    If the data is still not transferred, SAP has released Notes 1243559, 1491950, and 1493240 to resolve this. Check their relevance before implementing.
    FK

  • Powerpivot Data Refresh Not working with Oracle Data Source in sharePoint 2013

    I am using SQL Server 2012 PowerPivot for Excel 2010. Getting the following error in SharePoint 2013 environment, when using Oracle data source within a workbook -
    EXCEPTION: Microsoft.AnalysisServices.SPAddin.DataRefreshException: Engine error during processing of OLE DB or ODBC error: The specified module could not be found..:
    <Site\PPIV workbook>---> Microsoft.AnalysisServices.SPAddin.DataRefreshException: OLE DB or ODBC error:
    The specified module could not be found..   
     at Microsoft.AnalysisServices.SPAddin.DataRefresh.ASEngineInstance.ProcessDataSource(String server, String databaseName, String datasourceName, SecureStoreCredentialsWrapper
    runAsCredentials, SecureStoreCredentialsWrapper specificConfigurationCredentials, DataRefreshService dataRefreshService, String fileUrlForTracing)     -
    -- End of inner exception stack trace ---   
     at Microsoft.AnalysisServices.SPAddin.DataRefresh.ASEngineInstance.ProcessDataSource(String server, String databaseName, String datasourceName, SecureStoreCredentialsWrapper
    runAsCredentials, SecureStoreCredentialsWrapper specificConfigurationCredentials, DataRefreshService dataRefreshService, String fileUrlForTracing)   
     at Microsoft.AnalysisServices.SPAddin.DataRefresh.DataRefreshService.ProcessingJob(Object parameters)
    I created a simple Excel 2013 PowerPivot workbook with an Oracle data source and uploaded it to SharePoint 2013, but there was no change in the results - I am still getting the above error.
    What is this error? We have installed Oracle client (64-bit, since we use 64-bit Excel and Sp is also 64-bit) on SSAS PPIV Server and SharePoint Content DB Server. Do we need
    to install it anywhere else?
    Thanks,
    Sonal

    Hi Sonal,
    To use PowerPivot for SharePoint on SharePoint 2013, it is required to install PowerPivot for SharePoint with the Slipstream version of SQL Server 2012 SP1. If you install SQL Server 2012 and then use the upgrade version of SQL Server 2012 SP1 to upgrade,
    the environment will not support SharePoint 2013.
    I would suggest you refer to the following articles:
    Install SQL Server BI Features with SharePoint 2013 (SQL Server 2012 SP1):
    http://technet.microsoft.com/en-us/library/jj218795.aspx
    Upgrade SQL Server BI Features to SQL Server 2012 SP1:
    http://technet.microsoft.com/en-us/library/jj870987.aspx
    Regards,
    Elvis Long
    TechNet Community Support

  • Oracle Data Profiling and Data Quality

    Hi,
    How do I create a metabase for Oracle Data Profiling and Data Quality? Are a metabase and a repository the same thing?

    Hi,
    You can create a metabase in the Metabase Manager:
    - Expand Control Admin
    - Click on Metabases
    - In the Metabases window, right-click on the white area and select Add...
    - Go through the wizard to create your metabase
    This is documented in the ODQ/ODP tutorial (http://www.oracle.com/technology/products/oracle-data-quality/pdf/oracledq_tutorial.pdf) and in the Documentation (in Metabase Manager or Oracle Data Quality go to Help and then Manuals).
    Thanks,
    Julien

  • Performance issue with Oracle data source

    Hi all,
    I have a rather strange problem that I'm stuck on and need some assistance with.
    I have a rules file which pulls data in via a SQL data source that is an Oracle server. If I cut/paste the three sections of "select", "from" and "where" into SQL Developer and run the query, it takes less than one second to complete. When I run the "load data" with this rules file, or even use "Retrieve" within the rules file editor, it takes up to an hour to complete/retrieve the data.
    The table in question has millions of rows and I'm using one of the indexed fields to retrieve the data. It's as if Essbase/the rules file is ignoring the index, or I have a configuration issue with the ODBC settings on the server that is causing the problem.
    ODBC.INI file entry for the Oracle server as follows (changed any sensitive info to xxx or 999).
    [XXX]
    Driver=/opt/data01/hyperion/common/ODBC-64/Merant/5.2/lib/ARora22.so
    Description=DataDirect 5.2 Oracle Wire Protocol
    AlternateServers=
    ApplicationUsingThreads=1
    ArraySize=60000
    CachedCursorLimit=32
    CachedDescLimit=0
    CatalogIncludesSynonyms=1
    CatalogOptions=0
    ConnectionRetryCount=0
    ConnectionRetryDelay=3
    DefaultLongDataBuffLen=1024
    DescribeAtPrepare=0
    EnableDescribeParam=0
    EnableNcharSupport=0
    EnableScrollableCursors=1
    EnableStaticCursorsForLongData=0
    EnableTimestampWithTimeZone=0
    HostName=999.999.999.999
    LoadBalancing=0
    LocalTimeZoneOffset=
    LockTimeOut=-1
    LogonID=xxx
    Password=xxx
    PortNumber=1521
    ProcedureRetResults=0
    ReportCodePageConversionErrors=0
    ServiceType=0
    ServiceName=xxx
    SID=
    TimeEscapeMapping=0
    UseCurrentSchema=1
    Can anyone please advise on this lack of performance?
    Thanks in advance
    Bagpuss

    One other thing I've seen is that if your Oracle data source and Essbase server are in different geographic locations, you can get some delay when retrieving data over the WAN. I guess there is some handshaking going on when passing the data from Oracle to Essbase (either per record or per group of records) that is slowed WAY down over the WAN.
    Our solution was to remove the query from the load rule, run it via SQL*Plus on a command line at the geographic location where the Oracle database is, then FTP the resulting file to where the Essbase server is.
    With upwards of 6 million records being retrieved, it took around 4 hours in the load rule, but running the query via the command line took 10 minutes, and the FTP took less than 5.
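    A minimal sketch of that command-line extract, assuming hypothetical column and table names (the real query would be the one taken out of the load rule):

    -- extract.sql: run as "sqlplus user/pwd@db @extract.sql" close to the Oracle server,
    -- then FTP/scp the spool file to the Essbase host for the data load.
    SET PAGESIZE 0 LINESIZE 500 FEEDBACK OFF TRIMSPOOL ON COLSEP ','
    SPOOL /tmp/essbase_load.txt
    SELECT account, period, entity, amount   -- hypothetical columns
    FROM   fact_table                        -- hypothetical indexed source table
    WHERE  fiscal_year = 2024;
    SPOOL OFF
    EXIT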

  • Unable to replicate oracle data into timesten

    I have created CACHE GROUP COMPANY_MASTER
    Cache group :-
    Cache Group TSLALGO.COMPANY_MASTER_TT:
      Cache Group Type: Read Only
      Autorefresh: Yes
      Autorefresh Mode: Incremental
      Autorefresh State: On
      Autorefresh Interval: 1 Minute
      Autorefresh Status: ok
      Aging: No aging defined
      Root Table: TSLALGO.COMPANY_MASTER
      Table Type: Read Only
    But whenever I start the TimesTen server, the following lock is seen in ttXactAdmin <dsn_name>:
    Program File Name: timestenorad
    30443   0x7fab902c02f0        7.22     Active      Database  0x01312d0001312d00   IX    0
                                                       Table     1733200              S     4221354128           TSLALGO.COMPANY_MASTER
                                                       Row       BMUFVUAAAAaAAAAFBy   S     4221354128           SYS.TABLES
                                                       Row       BMUFVUAAACkAAAALAF   Sn    4221354128           SYS.CACHE_GROUP
    Because of this lock, the Oracle data is not replicated into TimesTen.
    When we check the sqlcmdid, it shows the following output:
    Query Optimizer Plan:
    Query Text: CALL ttCacheLockCacheGp(4, '10752336#10751968#10751104#10749360#', 'S', '1111')
      STEP:             1
      LEVEL:            1
      OPERATION:        Procedure Call
      TABLENAME:
      TABLEOWNERNAME:
      INDEXNAME:
      INDEXEDPRED:
      NONINDEXEDPRED:
    Please suggest why TimesTen takes a lock on these tables.
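    For reference, a read-only autorefresh cache group like the one described is normally created along these lines; the column list and key below are assumptions, not taken from the post:

    -- Sketch of a read-only cache group with 1-minute incremental autorefresh.
    CREATE READONLY CACHE GROUP tslalgo.company_master_tt
      AUTOREFRESH MODE INCREMENTAL INTERVAL 1 MINUTES
      FROM tslalgo.company_master (
        company_id    NUMBER        NOT NULL,   -- hypothetical key column
        company_name  VARCHAR2(100),
        PRIMARY KEY (company_id));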

  • Unable to download Oracle Data Integrator-Version 11.1.1.6(Important)

    Unable to download Oracle Data Integrator version 11.1.1.6. I hope this can be resolved ASAP.

    966234 wrote:
    Unable to download Oracle Data Integrator version 11.1.1.6. I hope this can be resolved ASAP.
    Which file are you trying to download? Is it for Windows, Linux, or All Platforms?
    Thanks,
    Hussein

  • Can SQL*Plus connect via ODBC to NON-Oracle data source?????

    I am struggling to understand something. I downloaded the Oracle Instant Client, SQL*Plus, and ODBC components in the hope of being able to connect via SQL*Plus to a non-Oracle, ODBC-compliant database.
    Is this possible? Or is SQL*Plus only able to connect to an Oracle data source?
    Thanks...

    SQL*Plus only connects to Oracle. You can use the ODBC driver from Instant Client to allow other applications (e.g. Excel) to access Oracle via ODBC. If you need to connect to a non-Oracle ODBC database (MS Access, FoxPro, etc.), you need the ODBC driver for those sources.
    You can use SQL Developer to connect to Oracle and non-Oracle databases. Check the OTN product information for SQL Developer for more details.
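    One hedged addition: if the goal is to query the non-Oracle source from inside an Oracle session (and therefore from SQL*Plus), the usual route is Database Gateway for ODBC (DG4ODBC) plus a database link. A minimal SQL sketch, assuming the gateway, listener, and tnsnames entries are already configured and using placeholder names:

    -- Assumes a DG4ODBC alias 'dg4odbc_access' is already defined in tnsnames.ora.
    CREATE DATABASE LINK access_link
      CONNECT TO "odbc_user" IDENTIFIED BY "odbc_password"
      USING 'dg4odbc_access';

    -- The remote ODBC table can then be queried from SQL*Plus through the link.
    SELECT * FROM "Customers"@access_link;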

  • Error while installing Oracle Data miner 10G Release 2

    Hello,
    I am a student involved in research in Data mining. I am new to Oracle Database and data miner.
    I installed Oracle Enterprise Manager 10g Grid Control Release 2 (10.2.0.1). Now I am trying to install Oracle Data Miner (10.2.0.1). However, at installation time ODM gives the following error:
    "specified data mining server is not compatible. 10.1.0.4.0."
    I have installed Oracle 10.2.0.1, but when I log in using SQL*Plus I get the following information:
    SQL*Plus: Release 10.1.0.4.0 - Production on Sun Jul 23 09:52:41 2006
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.1.0.4.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL>
    I would be really obliged if someone can help me with this.
    Thanks in advance
    Pooja

    Hi ,
    Download and install the product version (10.2.0.1) of Oracle Data Mining.
    Simon

  • Oracle Data Miner 10.1.0.2 Interoperate with Database 10g Release 2

    Hi all,
    I cannot connect from Oracle Data Miner to a newly upgraded Database 10g Release 2 with Data Mining option. This database was 10.1.0.2 before upgrade, and I could connect via Oracle Data Miner before the upgrade (though it needs to be upgraded to 10.1.0.3+ for data mining to function).
    I have a similar problem with a new installation on another computer. The error message in either case is "Cannot connect to specified Data Mining Server. Check connection information and try again."
    I can use SQL*Plus to login as the data mining user using the net service corresponding to the connect string. I check the v$option and DBA_REGISTRY as per the Data Mining Admin. documentation to verify that the data mining option exists and is valid. I am able to use the same connect string "host:port:SID" to connect from Analytical Workspace Manager to verify that the connectivity is OK.
    Furthermore, some Oracle by Example seems not valid for a DB of version 10.2. For example, at the URL http://www.oracle.com/technology/obe/obe10gdb/bidw/odm/odm.htm#p, the point 6 <ORACLE_HOME>\dm\lib\odmapi.jar is not applicable, because the path <ORACLE_HOME>\dm no longer exists.
    Therefore, my question is whether Oracle Data Miner 10.1.0.2 can work with DB 10.2. What procedure should I follow? Please advise.
    Thanks and regards,
    lawman
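    (The v$option and DBA_REGISTRY checks mentioned above can be scripted as below; this is a sketch based on the standard 10g dictionary views, not taken from the original post.)

    -- Confirm that the Data Mining option is installed and its component is VALID.
    SELECT value FROM v$option WHERE parameter = 'Data Mining';
    SELECT comp_id, version, status FROM dba_registry WHERE comp_id = 'ODM';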

    I am waiting on the beta version since I have installed Oracle10gR2.
    I've been checking the OTN website every day to see when it is released.
    If it is not a bother, can you send me an email when I can download it.
    Thanks in advance.
    Have a wonderful day/weekend,
    Andy

  • Oracle Data Integrator 11.1.1.5 Work Schema - List of Privileges

    Hi All,
    Oracle Data Integrator 11.1.1.5.
    Extracting data from Oracle DB for Oracle EBS 12.1.3.
    Customer created read-only schema (XXAPPS) to extract the data from EBS.
    For the ODI work schema we have now created one schema, 'XBOL_ODI_TEMP', on the source DB. We are now looking for the appropriate privileges that need to be granted to XXAPPS and XBOL_ODI_TEMP so that we won't face any permission-related error messages when we run the ODI scenario.
    We are now facing the error message : ODI-1227: Task SrcSet0 (Loading) fails on the source ORACLE connection VTB_ORACLE_EBS_1213.
    Caused By: java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist.
    Similar privileges can be granted to the work schema on the target.
    Venkat

    I think it would be fine with only one schema (user) created on the source system that has read access to the tables of the EBS DB. To resolve this error, assuming the XXAPPS user is the one used:
    In Topology --> Data Server (for EBS) --> Physical Schema, select the EBS schema name as the Schema and XXAPPS as the Work Schema (for all ODI work-related objects, e.g. CDC).
    Also, in the Data Server, use the XXAPPS user, which has read access to the EBS tables.
    Now, every time ODI generates a query it will access a table, say DUMMY, as <EBS Schema>.DUMMY, so the reference is fully qualified.
    Alternatively, you can create synonyms for EBS tables in XXAPPS schema.
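    A minimal sketch of the grants involved, with the hypothetical EBS table AP.AP_INVOICES_ALL standing in for whatever the interface actually reads (all object names here are placeholders, not from the thread):

    -- Run as the EBS object owner or a DBA: let the read-only and work users see the source table.
    GRANT SELECT ON ap.ap_invoices_all TO xxapps;
    GRANT SELECT ON ap.ap_invoices_all TO xbol_odi_temp;

    -- The ODI work schema also needs to create its own C$/I$ temporary objects.
    GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW TO xbol_odi_temp;

    -- Optional: a synonym so unqualified references resolve from XXAPPS.
    CREATE SYNONYM xxapps.ap_invoices_all FOR ap.ap_invoices_all;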

  • BP_TASK Data Cleansing

    Happy New Year Experts,
    I have a question or three for you on Data Cleansing in the Web IC.  I will explain what I have done and what I need answers to.
    The setup of the Data Cleansing Cases and Account search is fine.  We can search for Business Partners and Merge Now or Merge Later and then search for cases if we chose to Merge Later.
    When we go into the Case to process it most of the functionality is fine.  What isn't ok is this BP_TASK config .
    I understand that in order to execute the 'START' button in the Case Processing screens you need to have the Task config setup.  I have done this to an extent as described below:
    1) Setup Number Ranges - IMG->CRM->Master Data->BP->DQA-> Maintain Number Ranges [Create line 01-0000000001-9999999999]
    2) In Task (1003) add the action profile BP_TASK - IMG->CRM->Transactions->Basic Settings->Define transaction types
    3) Create job CRM_BUPA_REALIGNMENT (periodic job 5-10 min)
    Point 3 is where I am coming unstuck. I cannot create a periodic job without assigning an ABAP program to run, and there is nothing anywhere that says 'use this program or method' when creating the job step.
    Secondly, when I select the 'START' button in the Case Processing screen, after confirming the changes the case errors with a vague message saying the case cannot be saved. However, if I actually hit the 'SAVE' button, the case saves and the changes I confirmed are processed between the various accounts. So the whole process is about 95% great and 5% annoying.
    Before the questions, all authorisation settings are good as well.
    The questions are then:
    1)  What parameters are required above what I described for the periodic job?
    2)  Does the 'Task' transaction type need to be in the Business Transaction profile for the specific business role the Data Cleansing functionality is assigned within?
    3)  Once a task is created, I assume the job will process these and that a user does not necessarily need to process these tasks manually?
    4)  Should I change the Action Definition and Condition at all over and above the standard setup that it is currently in?
    Any help and guidance would be great - I'm afraid I have 'wood for the trees' syndrome now.
    Many Thanks for any assistance.
    Regards,
    Mat.

    Hi Gregor,
    I have read your interesting blog. We have a similar kind of data cleansing activity, but when I tried to implement the same, we get a message that the Data Cleansing option is not available with the insurance industry-specific settings. Can you please help us here? When I analysed this, the FM 'FKK_CLEANSING_ALLOWED' has a hard-coded code element as follows:
    * 4.71: Event 7500 does not exist yet
      IF 1 = 2.
         REFRESH: t_fbstab[].
         CALL FUNCTION 'FKK_FUNC_MODULE_DETERMINE'
           EXPORTING
              i_fbeve            = gc_event_7500
              i_only_application = gc_x
              i_applk            = applk
           TABLES
              t_fbstab           = t_fbstab[].
         LOOP AT t_fbstab INTO fbstab.
           CALL FUNCTION fbstab-funcc
              CHANGING
                c_cleansing_allowed = cleansing_allowed.
         ENDLOOP.
         IF 1 = 2.
    *     for where-used reference (für Verwendungsnachweis)
           CALL FUNCTION 'FKK_SAMPLE_7500'
              CHANGING
                c_cleansing_allowed = cleansing_allowed.
         ENDIF.
      ENDIF.
    Our SAP version is as follows:
    SAP_BASIS      620         0056 SAPKB62056             SAP Basis Component            
    INSURANCE     472           0010     SAPKIPIN10     INSURANCE 472 : Add-On Installation
    Any help is much appreciated.
    Thanks.
    I Peter
