The 2014 ASUG Data Quality Survey is Live

This is the third ASUG survey from the Data Governance Special Interest Group (SIG) / Metric Working Group. Like the previous two surveys from 2009 and 2011, this survey polls how companies are managing quality control for their master data, in particular how many controls they run and in which areas.
The main difference from the previous surveys is that we ask you to respond for a single domain (customer, material, or vendor) per survey; you can take it more than once if you have input for multiple domains. In previous surveys, most respondents could confidently complete the questions for only one domain, so this change will allow for a cleaner analysis.
This survey will be used for benchmarking what companies are measuring, not the quality of your data; data quality itself will be a future topic focused on the most popular areas of measurement.
Before you take the survey, we recommend you have on hand the list of what your company measures and the ASUG Taxonomy Handbook (available on the ASUG Data Governance SIG discussion board, at www.DataIntent-LLC.com, and via the LinkedIn discussion group ASUG Data Quality Controls Taxonomy). Some questions will ask for the number of queries you are measuring in each taxonomy category.
Please do not get caught up in precision; it is better to have many participants with close estimates than to lose your input because you spent too long trying to pin down the exact number of queries per taxonomy node.
This survey is divided into three main parts: demographics; a master data controls poll (customer, material, or vendor); and operations. It closes with a few questions about the survey itself, including an opt-in opportunity if you would like to see the results. To qualify to see the composite data, you must complete 29 of the 43 questions. Everyone qualifies for the summary report, which will be available in the near future on the ASUG website and through various presentations.
We expect this survey to take about 20 minutes, and we assure you that all individual responses are confidential.
As long as you return from the same PC, you may save the survey and come back to it later (this makes it easier to collaborate).
If at any time you would like assistance with this survey or the ASUG Taxonomy Handbook, please reply to the discussion thread, or email [email protected] or any member of the data quality metrics steering committee listed at the end of the survey.
We welcome your participation in this survey - please contact any steering committee member with questions or respond to the discussion on the website.
Thank you,
ASUG Data Governance Team

Just updating in case anyone else has this problem.
1) The missing help files are apparently a known issue: our Oracle account manager says they can get a complete copy from somewhere and provide it to us. So if you have this problem, as far as I know you'll need to chase up someone at Oracle at this stage.
2) I think the Linux install is the server component only; the client is Windows-only, and the help files come with the client, so the help files are not likely to be available from some other install, as far as I can work out.

Similar Messages

  • Question on CKM and Data Quality

    As I understand it, CKM only supports checks based on DB constraints. If I want more complicated business logic built into the checking, does Data Quality sound like a good choice, or are there other suggestions?
    In my case, I will need to check the data in the source table against table data from different sources (both target and source tables). This should be doable through Data Quality, correct? I am new to ODI. When I first installed ODI, I didn't choose to install the Data Quality module. I suppose I can install DQ separately and link it back to ODI? Do they share the same master repository?
    Sorry for the naive questions; your help is greatly appreciated.
    -Wei

    Hi,
    My idea is something like the following, for instance for an FK validation:
    create or replace function f$_validate_fk (value1 number) return number
    as
      v_return number;
    begin
      -- Look up the parent key; a hit means the FK value is valid.
      select my_fk
        into v_return
        from any_table
       where column_fk = value1;
      return v_return;
    exception
      -- No parent row found: flag the invalid value with -1.
      when no_data_found then
        return -1;
    end;
    And in the constraint you will have:
    -1 <> (select f$_validate_fk(table1.column_to_be_validated) from dual)
    Any record for which the function returns -1 will not be valid for the flow.
    The F$ function can be created in an ODI procedure before the interface runs and dropped at the end if you think it necessary.
    Does that make sense?
    (There may be syntax errors in this example; I wrote it without compiling, just to show the idea.)
    Edited by: Cezar Santos on 28/04/2009 10:20
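    A quick sanity check of the sketch above (a sketch only: F$_VALIDATE_FK, ANY_TABLE and the literal key values are hypothetical placeholders carried over from the example, not real objects):
    -- A value that has a parent row in ANY_TABLE returns the matched key;
    -- a value with no parent row hits NO_DATA_FOUND and returns -1.
    select f$_validate_fk(42) from dual;    -- existing key: returns my_fk
    select f$_validate_fk(-999) from dual;  -- missing key: returns -1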

  • ODI Data Profiling and Data Quality

    Hi experts,
    Searching for ODI features for data profiling and data quality, I found (I think) many... extensions? for the product, which confuse me, because I think there are at least three different ways to do data profiling and data quality:
    First, I found that ODI has out-of-the-box features for data profiling and data quality, but, according to the paper, these features are quite limited.
    Second, there is the product Oracle Data Profiling and Oracle Data Quality for Oracle Data Integrator 11gR1 (11.1.1.3.0), which is on the ODI download page. According to that page, this product extends the existing inline data quality and data profiling features of ODI.
    Finally, the third way is Oracle Enterprise Data Quality, another product that can be integrated with ODI.
    I don't know if I have understood my alternatives correctly, but, in fact, I need a general explanation of what ODI offers for data quality and data profiling. Can you help me understand this?
    Many thanks in advance.

    Hi, after the 11.1.1.3 release of ODI, Oracle no longer supports ODP/ODQ, which is a Trillium Software product and not owned by Oracle. Oracle recommends using OEDQ for data quality purposes. It's better to spend your time on OEDQ rather than trying to learn and implement ODP/ODQ in ODI.

  • Verification of data quality in migration process

    Hi All,
    I am in a project that is migrating data from SQL Server to an Oracle database. My question is not about performance but about checking data quality.
    My procedure for moving data is: a) extract the data to a flat file from SQL Server via a GUI tool; b) ftp it to UNIX; c) load it into Oracle temp tables with sqlldr; d) copy the data from the temp tables to the fact tables.
    My point is to only check the SQL Server log file and the sqlldr log file; if there are no errors in them and the row counts match between SQL Server and Oracle, then we can say a, b, and c were successful.
    And since d is a third-party stored procedure, we can trust its correctness. I don't see any point where an error could happen.
    But the QA team thinks we have to do at least two more verifications: 1. compare some rows column by column; 2. sum some numeric columns and compare the results.
    Can someone give me some suggestions on how you check data quality in your migration projects, please?
    Best regards,
    Leon

    Without wishing to repeat anything that's already been said by Kim and Frank, this is exactly the type of thing you need checks around.
    1. Exporting from SQL Server into a CSV
    Potential to lose precision in data types such as numbers, dates and timestamps, or in character sets (Unicode, UTF, etc.)
    2. Moving from Windows to UNIX
    Immediately there are differences in EOL characters
    Potential for differences in character sets
    Potential for issues with incomplete ftp of files
    3. CSV into temp tables with SQL*Loader
    Potential to lose precision in data types such as numbers, dates and timestamps, or in character sets (Unicode, UTF, etc.)
    Potential to have control files that don't cater for special characters
    4. Copy from temp tables to fact tables
    Potential to have column mappings wrong
    Potential to lose precision in data types such as numbers, dates and timestamps, or in character sets (Unicode, UTF, etc.)
    And I'm sure there are loads more things that could go wrong at any stage. You have to cater not only for things going wrong in the disaster sense, i.e. disk fails, network fails, data precision lost, but also consider that there could be an obscure bug in any of the technologies you're working with. They're not things you can directly predict, but you should have verification in place to make sure you know if something has gone wrong - however subtle.
    HTH
    David
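    The QA team's two extra checks are cheap to script. A minimal SQL sketch of the verifications David lists (TEMP_ORDERS, FACT_ORDERS and the column names are hypothetical placeholders; run the equivalent COUNT and SUM on the SQL Server side and compare the results):
    -- 1. Row counts: the coarsest check, catches incomplete loads.
    select count(*) from temp_orders;
    select count(*) from fact_orders;
    -- 2. Column sums: catch silent precision loss in numeric columns.
    select sum(order_amount) as total_amount,
           sum(quantity) as total_qty
      from fact_orders;
    -- 3. Spot check: compare a sample of rows column by column, by key.
    select order_id, customer_id, order_amount, order_date
      from fact_orders
     where order_id in (1001, 2002, 3003);  -- sample keys to eyeball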

  • Just used Match consolidations & match comparison with data quality for add

    I recently started working on jobs in BODS and Data Quality; I have used the Data Quality match and consolidation transforms.
    Are there any other important elements which are widely used in the industry from a data quality perspective?
    Kind regards.
    Edited by: cplusplus1 on Feb 8, 2012 7:08 AM

    I am not entirely sure I understand your question. We do have Data Quality blueprints for use in Data Services; you can find these on the SCN.
    The SAP Community Network hosts several Data Quality blueprints which are available for download. Go to http://www.sdn.sap.com/irj/sdn/ds, and if necessary enter your user ID and password or create an account.
    Scroll down to Downloads and click the link in Samples: Data Services Blueprints. There are several download packages that specialize in different areas, and one that has all available samples in a single download package.
    Note: a "review" page displays after clicking the link of the blueprint to be downloaded. Click View this Code Sample to download the file.
    Hope this is helpful.
    Thanks,
    Paula

  • CDS Data Quality Health Check Error -(Code: 207,103)

    Using the CDS Individual Data Quality Health Check.
    Created a new data source, snapshot and mapping. Cloned the job and modified the snapshot and mapping. Ran the job and got the following: An error occurred whilst creating task '[I2A] Profile Individual Misc Data': Failed to load process "[I2A] Profile Individual Misc Data" (Code: 207,103). Any suggestions?

    Determined the error to be a missing processor; I must have deleted it by mistake. You can identify which one is missing by the processor name: [I2A] Profile Individual Misc Data.

  • TS3999 How do I get my MacBook Pro and iPhone 5 to sync calendars in 2014? It worked fine in 2013, but the 2014 dates are blank on my MacBook.

    How do I get my MacBook Pro and iPhone 5 to sync calendars in 2014? It worked fine in 2013, but the 2014 dates are blank on my MacBook. The All iCloud option is checked on my phone, and the Enable iCloud account and Push options are selected on the MacBook.
    Thanks

    Do the events from your iPhone calendar appear on icloud.com? If not, go to Settings>Mail,Contacts,Calendars>Default Calendar (in the Calendars section) and make sure you have selected an iCloud calendar as your default. If you haven't, make this change, then add a new event on your iPhone calendar and see if it appears on icloud.com and your Mac now.

  • Please explain the Associate transform in Data Quality with the help of an example.

    Please explain the Associate transform in Data Quality with the help of an example.

    Hi Neha,
    If we are using multiple Match transforms and need to consolidate the final output, we use the Associate transform.
    Let me explain with an example based on the Data Quality blueprints for the USA.
    We have customer/vendor data and we need to find the duplicates.
    1. First we find duplicates on Name and Address.
    2. Second we find duplicates on Name and Email.
    3. Third we find duplicates on Name and Phone.
    Why do we find the duplicates in multiple stages? If we looked for duplicates on the combination of Name, Address, Email and Phone all at once, we might miss real duplicates, since we are looking for potential duplicates. That's why we look for duplicates on different combinations.
    In this case we get a different group number for each match combination, each under its own combination name.
    We want to consolidate and give one group number to the whole set of duplicates, so we pass these 3 match groups to the Associate transform and generate a consolidated match group for the input data. For example, if records A and B match on Name and Address while B and C match on Name and Email, the Associate transform places A, B and C in one consolidated match group.
    I hope this makes the concept clear.
    Thanks & Regards,
    Ramana.

  • The Data quality knowledge base property is empty

    Hello 
    I built a knowledge base and published it successfully (using the DQS client wizard normally).
    But when I open a new SSIS project and try to use the DQS component, I can't use the knowledge base I built; I get the message "The Data Quality knowledge base property is empty".
    How can I be sure that I published successfully (is there any query on the repository?)
    What am I missing?

    Hello,
    Use the Data Quality Client to check, in the Open Knowledge Base screen, whether the knowledge base was published. To do so, click Open Knowledge Base in the home screen of Data Quality Client, and then check the Date Published column.
    Alternatively, you can check the PUBLISH_DATE column in the DQS_MAIN.dbo.A_KNOWLEDGEBASE table.
    Thanks,
    Vivek
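    A minimal T-SQL sketch of that repository check (a sketch only: it assumes the default DQS_MAIN catalog and that A_KNOWLEDGEBASE carries a NAME column alongside PUBLISH_DATE; 'MyKB' is a hypothetical knowledge base name). A non-NULL PUBLISH_DATE means the knowledge base has been published:
    -- List knowledge bases and when (or whether) they were published.
    SELECT [NAME], PUBLISH_DATE
    FROM DQS_MAIN.dbo.A_KNOWLEDGEBASE
    WHERE [NAME] = 'MyKB';  -- hypothetical knowledge base name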

  • Data Quality Services Installer script - Where is the 64 bit version

    Hi.
    I have Microsoft SQL Server 2012 - 11.0.2218.0 Enterprise Edition on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (WOW64) installed on my laptop. The Data Quality Services feature has been installed. When I try to run the DQS client it says I must run the DQS installer script. I therefore run the Data Quality Server Installer, which is the other option in the DQS menu, and it errors saying 'You are running 32-bit version of DQS installer on 64-bit OS'. I've looked for the 64-bit version but I can't find it. Any ideas where I can get it from?
    Thanks in advance for any help.

    iTunes 64bit version for Windows is somewhere in the future. Nice to see Apple is obviously doing something to remedy the issue.

  • The latest Premiere Pro CC 2014 update forced me to convert and create a new version of my project while guaranteeing that the project would not be altered; however, the timelines are not the up-to-date versions I had.

    The latest Premiere Pro CC 2014 update forced me to convert and create a new version of my project while guaranteeing that the project would not be altered; however, the timelines/sequences are not the up-to-date versions I had. I saved different versions of this project for various purposes, but they were all named differently (i.e. joe and mary wedding, joe and mary wedding 2, joe and mary wedding 3, joe and mary wedding 3.1). I do not know if it just pulled the timelines from the earliest version, eradicated the data entirely, or something else.
    Since the program prompted me to create and save a new version for the updated Premiere, I still have the previous projects, but cannot open them. I imagine they are unchanged, but cannot be sure.
    Thoughts? Is it possible to uninstall an update and re-install it later?
    Please advise.
    Thank you.
    Brian

    Also...
    Add to the right-click menu of timelines an entry for "go to next keyframe" and "go to last keyframe" à la After Effects; we should also be able to assign a keyboard shortcut to this.
    There are buttons on either side of the create keyframe button to advance forward or backward in the effects control panel or timeline.
    You can also advance forward or backward in the effects control window by holding shift while dragging the playhead.
    Allow copy/paste of a selection of audio/video effects, not just one at a time. Sure, we can copy/paste attributes, but since we can also copy/paste effects it would only make sense to allow copying/pasting a selection of effects.
    I guess I don't get this one. If I have 5 effects on a clip, I can simply command-click on the ones I want to copy over to the other clip.

  • How to install Data Quality Client on Windows when the Servers are on AIX

    I have installed the ODI, ODI DP and DQ servers on an AIX machine. However, the Oracle Data Quality client is a Windows-only component. How do I install it? When I use the ODI software for the Windows platform, I get the following error when I try to connect.
    An unexpected error has occurred. Please refer this information to Trillium Software support.
    Description: invalid command name "::com::avellino::metabase::authenticate"
    File: ..\include\ExecutionContextHandler.h
    Line: 326
    Stack:
    invalid command name "::com::avellino::metabase::authenticate"
    while executing
    "::com::avellino::metabase::authenticate madmin 303f4450-8ce1-436f-9a16-54fae4d8a4cf ::com::trilliumsoftware::tssauto::pwd_answer"
    invoked from within
    "remote0 read"

    Hi,
    We are hitting the same issue (Metabase Manager from Windows) against an ODQ server on Solaris, and Event Viewer on the Windows server shows exactly the same message.
    Any luck or pointers?
    Atul Kumar
    http://onlineAppsDBA.com

  • Can you re-import your CDs in high quality, but keep the original track data?

    Dear All
    I got roughly halfway through importing my collection of about 800 CDs into iTunes 12 before realising that the default import settings applied by iTunes are not 'the best'. In fact it's not set anywhere near 'the best'; it's something like a 128 kbps lossy format. I wanted to import my CDs in lossless CD quality; WAV would have been ideal. However, it would appear I've missed a trick by assuming it would be full quality unless I specified something else.
    I'm not using portable devices to listen to my music; I have a Mac Mini and an endless supply of hard-drive storage space, and I'm only accessing it at home.
    A large number of my discs are classical and jazz, and in many cases I've needed to amend or reorganise the metadata so that it all complies with an overall filing and categorising system that makes sense and is useful. I also have a large number of rock and blues CDs, but their metadata is normally correct when the discs are issued, so it usually doesn't need adjusting. If you're wondering, yes, my taste in music is eclectic; I'm just as big a fan of Mozart as I am of Motorhead as I am of Mo Foster. The problem is that with classical and opera it's important that the 'artist' category shows the names of the performer/soloist/conductor/orchestra, not the name of the composer! (I wish I *did* have actual recordings of JS Bach playing the Well-Tempered Clavier!!)
    So I'm looking at this big pile of CDs thinking, "is it worth re-importing them all again at CD quality?", because I'd really like top-quality copies, and I only really want to have to do this once. It *is* worth doing so long as I don't have to do all the jiggery-pokery with the artist/composer data again, because that takes an enormous amount of time.
    So the question is... will iTunes 12 let me re-import my discs at maximum quality, but at the same time keep the data that was painstakingly edited and stored with the previous (lesser-quality) versions? Or is there a workaround? Maybe I could surreptitiously replace the old audio with new audio when iTunes isn't running and hope that it won't notice?
    It goes without saying that I have spent quite a while trying to do this for myself before asking the forum, and so far I've drawn a blank so I'd be very grateful for any expert thoughts and opinions.
    Best wishes
    Nick

    Hi
    Don't use WAV format, as you will not retain metadata. You should use AIFF if space is not a problem, but Apple Lossless format will be just as good with a smaller file size.
    When you re-import your CDs, you will be asked if you wish to replace the existing copies. Select yes, and your data changes should be retained. Be aware that when you re-import, the CD track names and album titles must match your existing files, otherwise they will not replace the existing versions.
    Jim

  • The Data Quality Services license validation has failed.

    Hello 
    I'm Harsha. I'm getting a strange error in my MS SQL Server 2012 Data Quality Services. I installed DQS with DQSInstaller and everything was working properly until yesterday, but today, after I restarted the machine, I am not able to open the DQS client; it gives a strange error: "The Data Quality Services license validation has failed." Please see the screenshot below. Please let me know how to rectify this issue.

    After patching, DQSInstaller.exe should be run to regenerate the SQL CLR procedures and refresh the version. Try this:
    Launch Cmd as Administrator:
    cd "C:\Program Files\Microsoft SQL Server\MSSQL11.InstanceName\MSSQL\Binn"
    DQSInstaller.exe -upgradedlls
    (Replace InstanceName with the name of your SQL Server instance.)
    Thanks, Jason
    Didn't get enough help here? Submit a case with the Microsoft Customer Support team for deeper investigation - http://support.microsoft.com/select/default.aspx?target=assistance

  • Emptying the buffer of a receiver of a live stream

    Hi,
    In my application, a publisher starts a live video and a receiver can receive it through FMS RTMP. The problem:
    1) Once the receiver faces latency due to a slow connection, the latency keeps on increasing from 1 second to 10s, 20s and above. What I feel here is that the receiver's buffer starts accumulating video data, keeps on accumulating it, and never flushes it. I am unable to find a property in NetStream or Application.xml to empty the buffer of a receiver on a poor connection.
    How can I fix this issue, so that a slow-connection receiver's latency remains constant at around 2-3 seconds? Please help.
    Regards,
    Sahil.

    Hi Nitin,
    1) FMS version is 3.5.7.
    2) I don't have issues with audio; it's video that raises latency on slow connections at the video receivers. To work around this issue I set the publisher's video quality according to the receiver's bandwidth. This helps BUT does not solve the problem.
    Example:
    The publisher is broadcasting video at 800 kbps. The receiver of this stream has a lower bandwidth of 512 kbps; this difference (800 - 512 = 288 kbps) produces latency at the receiver. The extra data, i.e. the 288 kbps, is buffered somewhere by the receiver in its stream and keeps accumulating on and on, so the latency never drops or reduces.
    This is why I wish to clear/empty/drop the extra receiver stream data.
    Thanks,
    Sahil.
