Microsoft Data Quality Services API
Hi,
Microsoft DQS doesn't provide APIs. Is there a plan to release an API anytime soon?
We have a requirement to evaluate several data quality issues and integrate the results into our application.
DQS is great and provides most of the validation we need for our customers.
However, our requirement is to integrate the same functionality into our application and customize it.
Please let me know if there is any plan to release the APIs, or whether there is a workaround for us.
Hi All,
I have used DQS for my organization's data cleansing project, and we have built data quality reports using the DQS output.
We also use the Profiler summary report in the Activity Monitor, but this is a manual process: data owners need to open the Activity Monitor and generate the profiler report by hand.
My question is: which DQS table stores this Profiler summary report, so that I can automate the report from that source?
Here is a screenshot of the report we are looking at; we need to find out which DQS source table holds this information.
Thanks for the help
Similar Messages
-
Backup and Restore of DQS (Data Quality Service SQL 2012) Databases
We are currently using DPM 2010 running on Server 2008 R2 as our backup solution. We will soon be leveraging the Data Quality Services in SQL 2012 along with the Master Data Service.
In the SQL 2012 documentation from Microsoft it states, “The backup and restore operations of the DQS databases must be synchronized.”
Otherwise the restored Data Quality Server will not be functional. Currently I believe that DPM will run serialized backups of databases from one SQL server.
I was hoping someone could point me towards some documentation for backing up DQS with DPM.
Is anybody currently doing this? If so, have you been successful restoring?

The LogicalName can't be the same for the mdf and ldf. Verify again with FILELISTONLY. You have also put the wrong logical name:
MOVE N'OriginalDB' TO N'D:\sql data\TargetDB.mdf'
Please paste the output of
RESTORE FILELISTONLY FROM DISK = 'D:\backup.bak' -
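For illustration, a restore that relocates both files might look like the sketch below. The logical names ('OriginalDB' / 'OriginalDB_log') and paths are placeholders taken from this thread, so check the actual names reported by FILELISTONLY before running anything:

```sql
-- List the logical file names contained in the backup first.
RESTORE FILELISTONLY FROM DISK = N'D:\backup.bak';

-- Then restore, moving each file under its own logical name.
-- Logical names and paths below are placeholders.
RESTORE DATABASE [TargetDB]
FROM DISK = N'D:\backup.bak'
WITH MOVE N'OriginalDB'     TO N'D:\sql data\TargetDB.mdf',
     MOVE N'OriginalDB_log' TO N'D:\sql data\TargetDB_log.ldf',
     RECOVERY;
```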
Data Quality Services Installer script - Where is the 64 bit version
Hi.
I have Microsoft SQL Server 2012 - 11.0.2218.0 Enterprise Edition on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (WOW64) installed on my laptop. The Data Quality Services feature has been installed. When I try to run the DQS client, it says I must run the DQS installer script. I therefore run the Data Quality Server Installer, which is the other option in the DQS menu, and it errors, saying 'You are running 32-bit version of DQS installer on 64-bit OS'. I've looked for the 64-bit version but I can't find it. Any ideas where I can get it from?
Thanks in advance for any help.
-
The Data Quality Services license validation has failed.
Hello
I'm Harsha. I'm getting a strange error in my MSSQL Server 2012 Data Quality Services. I installed DQSInstaller and everything was working properly until yesterday, but today, after I restarted the machine, I am not able to open the DQS client; it gives a strange error: "The Data Quality Services license validation has failed." Please see the screenshot below, and please let me know how to rectify this issue.

After patching, dqsinstaller.exe should be run to regenerate the SQL CLR procedures to refresh the version. Try this:
Launch Cmd as Administrator
cd "C:\Program Files\Microsoft SQL Server\MSSQL11.InstanceName\MSSQL\Binn"
DQSInstaller.exe -upgradedlls
Thanks, Jason
Didn't get enough help here? Submit a case with the Microsoft Customer Support team for deeper investigation - http://support.microsoft.com/select/default.aspx?target=assistance -
Oracle Database Inserts Via Microsoft Data Transformation Services (DTS)
This question involves a SQL Server database and an Oracle database. The databases reside on different servers. One of our developers periodically uses Microsoft DTS (Data Transformation Services) to read data from a SQL Server database and insert it into an Oracle database. Normally the job runs once a day and reportedly inserts about 20,000 rows. The job usually runs fine.
About a month ago execution of the daily job was suspended. Two days ago the developer ran a job to select and insert nine days of information. He estimated that 80,000 rows would be inserted. The job cancelled after twenty-three minutes when it filled up the 512 MB UNDO tablespace. (FYI, we use automatic UNDO management.)
At the point of failure the number of active sessions spiked sharply in the Oracle database because of system I/O waits (log file parallel write, db file parallel write, and control file parallel write). The number of active sessions also spiked sharply in three other Oracle databases whose files reside on the same array of disk drives. Most of those sessions were waiting on commits (log file sync). The spikes lasted for one minute or less.
Grid Control's performance monitor shows that sqlservr.exe is the module being executed when the UNDO tablespace fills up. We ran the job a second time and closely monitored it, watching the amount of UNDO space grow until it used all 512 MB available. The symptoms described above for the first cancellation were repeated in the second cancellation.
We reran the job by processing a single day's worth of information and that ran fine. Then we ran it for two days of information, then for six days of information. Everything ran fine. During those tests no more than 70 MB of UNDO space was used.
Our developer reported that last week he ran the job for nine days of information, the same amount as the job that cancelled twice today. He estimates that it ran for about 80 minutes and went to a normal end-of-job.
Can anyone here offer an explanation of why we seem to be getting these varied demands for space in the UNDO tablespace? Do you know if Microsoft DTS issues a commit after each insert or only a single commit at the end-of-job?
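One way to observe the commit behavior directly is to watch the undo held by the insert transaction while the job runs. This is only a sketch using Oracle's dynamic performance views; it lists every active transaction, so identify the DTS session by its PROGRAM value:

```sql
-- Undo blocks currently held per active transaction.
-- If DTS commits only once at end-of-job, USED_UBLK grows steadily
-- for the insert session; with frequent commits it stays small.
SELECT s.sid,
       s.program,
       t.used_ublk,                                    -- undo blocks held
       t.used_ublk * TO_NUMBER(p.value) AS used_bytes  -- approximate bytes
FROM   v$transaction t
       JOIN v$session   s ON s.taddr = t.addr
       JOIN v$parameter p ON p.name  = 'db_block_size'
ORDER  BY t.used_ublk DESC;
```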
Thank you,
Bill

Hi Arthur,
Yes both instances are same.
Microsoft SQL Server 2008 R2 (SP2) - 10.50.4263.0 (X64) Aug 23 2012 15:56:56 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor)
I ran the main package using SQL Agent; the main package calls the child packages.
The error message shown on the SQL Agent job is:
R6025 - pure virtual function call. The return value was unknown. The process exit code was 255.
or sometimes
The step did not generate any output. The return value was unknown. The process exit code was -532459699.
In the event log it says:
Error Level:
Event ID 1000
Faulting application name: DTExec.exe, version: 2009.100.4263.0, time stamp: 0x5036ba73
Faulting module name: DTSPipeline.dll, version: 2009.100.4263.0, time stamp: 0x5036ba53
Exception code: 0x40000015
Fault offset: 0x00000000000a33c5
Faulting process id: 0x98c
Faulting application start time: 0x01cf64ba9b72b27c
Faulting application path: C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTExec.exe
Faulting module path: C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTSPipeline.dll
Report Id: e8eb9b4f-d0ad-11e3-babd-005056997b14
Information Level:
Windows error reporting Event ID 1001
Fault bucket , type 0
Event Name: APPCRASH
Response: Not available
Cab Id: 0
Problem signature:
P1: DTExec.exe
P2: 2009.100.4263.0
P3: 5036ba73
P4: DTSPipeline.dll
P5: 2009.100.4263.0
P6: 5036ba53
P7: 40000015
P8: 00000000000a33c5
P9:
P10:
Attached files:
These files may be available here:
C:\ProgramData\Microsoft\Windows\WER\ReportQueue\AppCrash_DTExec.exe_ccc7a4e176faafbea69955957371ea96e175b_44c41e3e
Analysis symbol:
Rechecking for solution: 0
Report Id: e8eb9b4f-d0ad-11e3-babd-005056997b14
Report Status: 4 -
Advantages of Data Quality Services(DQS) over fuzzy loop
Hi There
I am using DQS in my packages. It works similarly to Fuzzy Lookup.
I am wondering what the advantages of DQS are over Fuzzy Lookup.
Can you please help me?
Regards,
Nidhi

DQS is much more advanced.
You have the concept of domains within a knowledge base. A domain can either have a static list of data, or it can be made dynamic by accessing an online data service. You can also specify additional business rules inside it. You also have the flexibility of determining which values you want to include as valid ones, based on the suggestions it provides, and you can approve the ones you want.
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
^^^ this.
In addition to the above (which is a very good answer), DQS allows you to offload the work of maintaining the data quality rules to a "data steward". For example, when a new rule is added (or a change is made to an existing rule), it doesn't require modifications to the underlying ETL by an IT developer along with a release.
BI Developer and lover of data (Blog |
Twitter)
Yep, indeed. That's definitely a big plus.
-
-
Data Quality Services - Summary report
Hi,
Does anyone have an idea about which table stores the summary information coming out of the Activity Monitor?
An example is below.
When I exported the data, the summary was as follows.
My purpose is to automate this report, if it is stored in the DQS databases.
| Field | Domain | Corrected Values | Suggested Values | Completeness | Accuracy |
| --- | --- | --- | --- | --- | --- |
| EmployeeName | EmpName | 5 (0.06%) | 0 (0%) | 7303 (88.73%) | 8222 (99.89%) |
| EmployeeKey | EmployeeKey | 1 (0.01%) | 0 (0%) | 8231 (100%) | 8215 (99.81%) |
| CostCentreKey | CostCentreKey | 0 (0%) | 0 (0%) | 8231 (100%) | 8141 (98.91%) |
| CostGroupKey | CostCentreGroupKey | 0 (0%) | 0 (0%) | 7188 (87.33%) | 7094 (86.19%) |
| LeaveGroupKey | LeaveGroupKey | 0 (0%) | 0 (0%) | 8231 (100%) | 8129 (98.76%) |
| EmployeeStatusKey | EmployeeStatusKey | 0 (0%) | 0 (0%) | 8231 (100%) | 8212 (99.77%) |
| EmployeePositionNumber | EmployeePositionNumber | 0 (0%) | 0 (0%) | 8214 (99.79%) | 8117 (98.61%) |
| EmployeeEmail | EmployeeEmail | 0 (0%) | 0 (0%) | 5133 (62.36%) | 8220 (99.87%) |
| HoursPerWeek | HoursPerWeek | 0 (0%) | 0 (0%) | 8214 (99.79%) | 8213 (99.78%) |
| Gender | Gender | 0 (0%) | 0 (0%) | 8231 (100%) | 8231 (100%) |
| EmployeeEFT | EmployeeEFT | 0 (0%) | 0 (0%) | 8214 (99.79%) | 8213 (99.78%) |
| EmployeePostCode | EmployeePostCode | 0 (0%) | 0 (0%) | 8153 (99.05%) | 8124 (98.7%) |
| EmployeeSuburb | EmployeeSuburb | 133 (1.62%) | 0 (0%) | 8152 (99.04%) | 8134 (98.82%) |
| ReportToManager | ReportToManager | 0 (0%) | 0 (0%) | 7037 (85.49%) | 7036 (85.48%) |
| PositionClassificationCode | PositionClassificationCode | 0 (0%) | 0 (0%) | 8144 (98.94%) | 8144 (98.94%) |
| PositionClassificationDesc | PositionClassificationDesc | 0 (0%) | 0 (0%) | 8144 (98.94%) | 8144 (98.94%) |
| PositionLocation | PositionLocation | 0 (0%) | 0 (0%) | 8214 (99.79%) | 8122 (98.68%) |
| Age | Age | 0 (0%) | 0 (0%) | 8231 (100%) | 8229 (99.98%) |
| CurrentClassCode | CurrentClassCode | 0 (0%) | 0 (0%) | 7908 (96.08%) | 7906 (96.05%) |
| CurrentCLassDescription | CurrentCLassDescription | 0 (0%) | 0 (0%) | 7908 (96.08%) | 7907 (96.06%) |
| EmpState | EmpState | 0 (0%) | 0 (0%) | 8153 (99.05%) | 8137 (98.86%) |
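As for where DQS persists this summary: I'm not aware of a documented table, but DQS keeps project data in the DQS_PROJECTS database, so one hedged approach is to search its catalog for likely tables. The LIKE patterns below are guesses, not a documented schema; the internal tables are undocumented and may change between versions:

```sql
-- Search the DQS_PROJECTS catalog for tables/columns that might
-- hold the profiler summary. Patterns are guesses only.
USE DQS_PROJECTS;

SELECT t.name AS table_name,
       c.name AS column_name
FROM   sys.tables  t
JOIN   sys.columns c ON c.object_id = t.object_id
WHERE  c.name LIKE '%PROFIL%'
   OR  c.name LIKE '%ACCURACY%'
   OR  c.name LIKE '%COMPLETENESS%'
ORDER  BY t.name, c.name;
```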
-
Exporting Data Quality Statistics Into An Excel File Or Database
Hi,
I would like to ask if it is possible to export the data profiling statistics into an Excel file, flat file, or database table.
The output required by our development team is that we would like to be able to manipulate and review the data profiling outside of the Data Quality Services user interface.
I'm aware that after cleansing a specific set of data you can save/export the output results into an Excel file or database; however, the feature I'm looking for is for the data profiling statistics themselves.
Mostly information on the knowledge base; specifically, how many new records a given column has, how many unique and invalid values it has, etc.
The reason for this is so we can control and track the data profiling information and be a bit more flexible in creating reports and presenting the data. Using the DQS user interface would not suit our needs for the project.
Sorry if this has been asked before, but I've tried searching around and couldn't find any information regarding this functionality.
Thanks!

I'm not too sure where they are stored, but you could use the directories shown in transaction AL11 to find them.
-
hi
How do you handle data quality using SSIS, and what is meant by quality of data?

Hi coool_sweet,
Data quality functionality provided by Data Quality Services is built into a component of SQL Server Integration Services (SSIS) 2012 named the DQS Cleansing component, which enables us to perform data cleansing as part of an Integration Services package.
If you are using an earlier version of SSIS, you can use the Data Profiling Task to profile data that is stored in SQL Server and to identify potential problems with data quality.
Besides, as Vaibhav suggested, we can also make use of Derived Column, Data Conversion, Lookup, etc. in the package to apply business rules and take care of data quality.
Thanks,
Katherine Xiong
TechNet Community Support -
DQS matching data quality project finished but stuck on Matching - Export screen
Attempting to open the data quality project again results in the export process screen. Back and Next are greyed out, i.e. it is not possible to run the matching project again.
To repeat:
New data quality project
Map - 7 mappings applied
Matching - overlapping clusters
Export - selected matching results to export, clicked Export. Success. Clicked "Finish" (Next is greyed out)
Attempting to open the project again puts you back on the export screen again
DQP is in State: in Work, Locked: False, Contains unpublished content: True
Data Quality Services Client : 11.0.3401.0
Data Quality Services Server : 11.0.3401.0
Any ideas?

Have spent some time battling the UI. As a workaround, never click Finish. You will then be able to re-run the match process with updated results over and over.
Note: if at any time you click Finish, the data quality project will be "stuck" again and you will have to create a new one to get updated match results. -
Microsoft Azure Monitoring Services Management Library stable release date
With the general availability of autoscale and the Microsoft Azure Management Libraries, what's the timeline on the general availability of Microsoft Azure Monitoring Services Management Library? Is there an estimated release date for v1.0?
We have the preview release for the API/REST and a Nuget package
REST/API
http://msdn.microsoft.com/en-us/library/azure/dn510374.aspx
Nuget
http://www.nuget.org/packages/Microsoft.WindowsAzure.Management.Monitoring/0.10.0-preview
Johnny Coleman [MSFT] Any code posted to this Forum is [As-Is] With No Warranties -
Data Services and Data Quality Recommended Install Process
Hi Experts,
I have a few questions. We have some groups that have requested Data Quality be implemented, along with another request for Data Services to be implemented. I've seen the request for Data Services to be installed on the desktop, but from what I've read, it appears best to install this on the server side to provide more of a central benefit to all.
My questions are:
1. Can Data Services (Server) install X.1 3.2 be installed on the same server as X.I 3.1 SP3 Enterprise?
2. Is the Data Services (CLIENT) version dependent on whether the Data Services (Server) install is completed? Basically, can the "Data Services Designer" be used without the server install?
3. Do we require a new license key for this, or can I use the Enterprise Server license key?
4. At this time we are not using this to move data in and out of SAP, just using it to read data that is coming from SAP.
From what I have read, Data Services comes with the SAP BusinessObjects Data Integrator or SAP BusinessObjects Data Quality Management solutions. Right now it seems we don't have a need for the SAP Connection supplement, but it's definitely something we would implement in the near future. What would be the recommended architecture? A new server with Tomcat and CMC (separate from our current BOBJ Enterprise servers)? Or can Data Services be installed on the same?
Thank you,
Teresa

Hi Teresa,
I hope you are referring to BOE 3.1 (Business Objects Enterprise) and BODS (Business Objects Data Services) installation on the same server machine.
I am not an expert on BODS installation, but this is my observation:
We recently tested a BOE XI 3.1 SP3 (full build) installation on a test machine before upgrading our BOE system.
We also have BODS in our environment, and we wanted to check whether we could keep it on the same server.
On this test machine, which already had the XI 3.1 SP3 build, when I ran the BODS server installation, what we observed was that all the menus of BOE went away and only the menus of BODS were visible.
Maybe the BODS installation overwrites or uninstalls BOE if it already exists? I don't know. I could not find any documentation saying that we cannot have BODS and BOE on the same server machine, but this is what we observed.
So we have kept BODS and BOE on two different machines running independently, and we do not see any problems.
Cheers
indu -
Using WebCenter Spaces Web Service API through JDeveloper's Data Control
Hi,
I'm trying to access WebCenter Spaces Web Service API (located at http://host:port/webcenter/SpacesWebService)
using JDeveloper's Web Center Data Control.
I created a data control in my portal project using JDeveloper's Web Service Data Control wizard.
I also created and configured key stores (jps-config.xml) at both sides (spaces server and my portal client).
How I created a data control:
* First I entered the name and the URL (http://host:port/webcenter/SpacesWebService?WSDL)
* Then I entered HTTP basic authentication details (user name and password)
* After that I drag-and-dropped the getGroupSpaces() method from the data control onto a .jspx page as an ADF read-only table.
* Then I ran my portal project and navigated to this .jspx page, and it worked. The list of group spaces appeared on that page.
The problem is that I got only public group spaces and group spaces created by the user I entered in the HTTP basic authentication details.
It makes no sense to enter static user details in a web service client (or a data control).
So the question is; can I use identity propagation to get only group spaces created by the same user which I logged in my portal?
Edited by: 832886 on Feb 18, 2011 3:09 AM

Hi,
You generally get a NameError when you are executing createCred/updateCred from an incorrect location. Are you using the wlst from oracle_common\common\bin?
Also, in your steps I don't see what you did to populate the wallet at the JDeveloper end after your updateCred failed. This is a required step. Use the wlst from the location mentioned above and you should be able to proceed.
Thanks,
Vishal -
Forum for Data Services, RapidMarts, Data Quality
SAP BO Moderators,
Could you tell me whether there is a forum for BO Data Services, Rapid Marts, and Data Quality?
If not, could you tell me when this forum is going to start?
Thank you.
- Anil

Anil,
you can post your questions at the below forum for Enterprise Information Management
Data Services and Data Quality
regards,
Shiva