Exclude matches between records from a particular source system (EDQ + Siebel UCM)

I have a requirement to modify the EDQ-CDS Individual Match processor.
I want to exclude a match between two records (driver and candidate) if they have the same value in a particular field, with the value either hard-coded or specified as Reference Data in EDQ (e.g. if both driver and candidate are from the same source, say 'Source ABC', then NO MATCH; records from all other sources should match as normal).
We tried adding a Match rule that does an 'Exact Match' comparison on the 'source' field and set the rule result to 'NO MATCH', but this excludes matches between records from any system, which is not my requirement. I only want to exclude the match when the source value is, say, 'Source ABC'.
I think it should be possible with the comparison type 'In Value', 'In List' or 'All in List', but I'm not sure how to use these comparisons and there is not much EDQ documentation available online.
Any help on this is appreciated.
We are on Siebel 8.2.2.4 + EDQ 11.1.1.7 on Solaris 64-bit.

In List is documented in the 12.1.3 online help. Before this, it was an extension to the product used by CDS.
It is very straightforward. You add it as a comparison and create a Reference Data set with your list of values in it. If the identifier value hits the list, the comparison evaluates to true.
I've pasted most of the help from 12.1.3 below:
Comparison: In List
The In List comparison provides a way of making the application of a Match rule conditional on one or both identifier values in a comparison being matched with a single value or list of values.
Use
Use this comparison as a way to apply a Match rule only to a subset of the data in the Match process. For example:
Where a Match rule is designed to match Middle Eastern names, only apply it if a Country identifier matches a list of Middle Eastern countries.
Where a Match rule is designed to match Electronic products, only apply it if a Product Category identifier matches a single value.
The comparison may also be used to eliminate matches that would otherwise be caught by Match rules lower in the decision table, if used in a Match rule with a 'No Match' result; for example, so that matches on certain rules are only presented if Country or Nationality values are not in a list of safe values.
Options
Each option is listed with its type, purpose and default value.

Require data in both records?
Type: Yes/No
Purpose: If Yes, both input identifier values must contain data; if they do not, the comparison will always be 'False'. If No, only one input identifier value must contain data.
Default Value: No

Match whole value?
Type: Yes/No
Purpose: If Yes, the whole identifier value (or values) specified must match. If No, tokens within the identifier will be matched against the list; in this case, delimiters must be specified in the relevant fields below to determine how to split tokens.
Default Value: No

Required value reference data
Type: Reference Data
Purpose: A Reference Data set with a list of values, for example a set of country codes.
Default Value: Clear

Required value
Type: Free text
Purpose: A single value to match against. Note that if both a Reference Data set and a value are specified, both are matched against.
Default Value: Clear

Require match for all values?
Type: Yes/No
Purpose: If Yes, all tokens in the identifier value(s) must match against the required list or value. If No, any one of the tokens must match. This option is only used if Match whole value? is set to "No".
Default Value: No

Delimiter characters reference data
Type: Free text or browse
Purpose: A Reference Data set with a list of delimiter characters used to tokenize identifier values before matching against the list. This option is only used if Match whole value? is set to "No".
Default Value: None

Delimiter characters
Type: Free text
Purpose: Specifies the delimiter characters to use as an alternative to linking to a Reference Data set. This option is only used if Match whole value? is set to "No". Note: If a Reference Data list of delimiters and specific characters are entered here, both are considered delimiter characters.
Default Value: None
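To make the option semantics above concrete, here is a minimal sketch in plain Python (purely illustrative, not EDQ code; the function name and parameters are invented for this example). It mimics how a single identifier value would be checked against the required list under the 'Match whole value?' and 'Require match for all values?' options:

    def value_hits_list(value, required_values, match_whole_value=False,
                        require_all_tokens=False, delimiters=" ,;"):
        """Return True if the identifier value hits the required list of values."""
        if not value:
            return False
        if match_whole_value:
            # "Match whole value? = Yes": the whole identifier value must be in the list
            return value in required_values
        # "Match whole value? = No": split the value into tokens on the delimiter characters
        tokens = [value]
        for d in delimiters:
            tokens = [t for part in tokens for t in part.split(d) if t]
        if require_all_tokens:
            # "Require match for all values? = Yes": every token must hit the list
            return all(t in required_values for t in tokens)
        # "Require match for all values? = No": any one token is enough
        return any(t in required_values for t in tokens)

    # A Reference Data set holding the source system to single out
    source_list = {"Source ABC"}
    print(value_hits_list("Source ABC", source_list, match_whole_value=True))   # True
    print(value_hits_list("Source XYZ", source_list, match_whole_value=True))   # False

For the original requirement, the Reference Data set (or the 'Required value' option) would hold 'Source ABC', and the comparison would sit in a Match rule with a 'No Match' result, as described under 'Use' above. It is worth confirming in a small test match process whether 'Require data in both records?' should also be set to Yes, so the rule only applies when both driver and candidate carry a source value.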

Similar Messages

  • Insert space between records from open .. close dataset

    I am generating a file that contains different information for a business partner with the aid of datasets. I want my records to look like this ...
    partner1 BPname1 address1 contact info1   - from table1
    partner2 BPname2 address2 contact info2
    partner1 agency1 address1 insurance obj1  - from table2
    partner2 agency2 address2 insurance obj2
    But the one I am seeing in the generated file in AL11 is like this ...
    partner1 BPname1 address1 contact info1
    partner2 BPname2 address2 contact info2
    partner1 agency1 address1
    partner2 agency2 address2
    I want to see a space between records from tables 1 and 2.
    Code:
    OPEN DATASET FILE FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF SY-SUBRC = 0.
      LOOP AT TABLE1.
        TRANSFER TABLE1 TO FILE.
      ENDLOOP.
      CLOSE DATASET FILE.
      OPEN DATASET FILE FOR APPENDING IN TEXT MODE ENCODING DEFAULT.
      IF SY-SUBRC = 0.
        " A blank line could be transferred here (e.g. TRANSFER space TO FILE.)
        " to separate the TABLE1 block from the TABLE2 block.
        LOOP AT TABLE2.
          TRANSFER TABLE2 TO FILE.
        ENDLOOP.
      ELSE.
        MESSAGE 'Error opening dataset for appending' TYPE 'E'.
      ENDIF.
    ELSE.
      MESSAGE 'Error opening dataset for output' TYPE 'E'.
    ENDIF.
    CLOSE DATASET FILE.
    Thanks.

    Hi Cor,
    Here is my code to create datagridview (see below)
    By the way, I would like to ask whether I have to declare this code every time I do an insert, update or delete of records. Say I have separate buttons for each process; is it possible to create it once and just call that function?
    Another question: can I put an auto-number in the first column of the DataGridView when records are added?
    Dim conn As New System.Data.OleDb.OleDbConnection()
    conn.ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Users\jov\Desktop\FeedBackSystem\FBSystems\FBSystems\Data\KPI.accdb"
    form load event (DataGridView)
    Dim dt As New DataTable
    Private cma As CurrencyManager = DirectCast(BindingContext(dt), CurrencyManager)
    'Set Data GridView
    With dgvReport
        dt.Columns.Add("ItemNumber", GetType(String))
        dt.Columns.Add("ReportName", GetType(String))
        dt.Columns.Add("Ratings", GetType(String))
        dt.Columns.Add("Comment", GetType(String))
        AddHandler cma.CurrentChanged, AddressOf CurrentChanged
        .ReadOnly = True
        .MultiSelect = False
        .AllowUserToAddRows = False
        .AllowUserToDeleteRows = False
        .DataSource = dt
        For Each c As DataGridViewColumn In .Columns
            c.Width = 200
        Next
    End With
    This is the code to insert records from textbox to datagridview
    Private Sub btnAdd_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnAdd.Click
    Dim i As Integer
    dt.Rows.Add(txtItemNo.Text, CmbReportName.Text, CmbRate.Text, txtComment.Text)
    End Sub

  • How do I stop notifications from a particular source?

    Perhaps I inadvertently clicked on something recently, but I've begun getting notifications from "Mac Prices Australia" (I live in the United States). I can see the item listed in Notifications in System Preferences, but I don't see an option to remove it. Plus, there are a number of items here that I don't actually use and would like to remove. How do I go about this?

    Well, I realize that would keep the notifications from appearing, but I also assume there's a bit of code for Mac Prices Australia that would still remain on my Mac, and I'd like to delete that entirely. I poked around in the Library, hoping to find a folder with these notifications, but I haven't found one yet.
    I guess I titled my question incorrectly. I should have asked how to remove rather than just stop notifications from a particular source.

  • Delete All Indexed data in Oracle SES or Indexed data from a particular source (UCM,Discussion Forum)

    Hi,
    We have configured Oracle SES 11.2.2.2 with Oracle WebCenter 11.1.1.8 and UCM.
    We are able to index all the data successfully.
    For some test cases, I would like to delete all the indexed data (or the indexed data from a particular source) from the back end and run the crawlers again.
    Is there any stored procedure that can do this, or any table that I can purge to achieve it?
    Thanks
    Regards,
    Sid

    Hi Sripathy,
    From WebCenter we had crawled everything, including the user profiles, which are appearing in the search results. We would like to get rid of that indexed data and re-index with the parameters /rsscrawl?excludedServiceIds=oracle.webcenter.peopleconnections.profile,oracle.webcenter.community,oracle.webcenter.list
    We will try your suggestion of recrawling the whole source for our first requirement.
    -Sid

  • How to record from an external source

    I recently bought an HP Envy 17" and I can't figure out how to record onto this laptop. I have two jacks on the left side, one with a headphones icon, the other with a headset icon. I want to be able to plug a stereo jack into one of these and record into a program like Audacity. Under Device Manager it says I have an ATI High Definition Audio Device. How do I record from an external source? Thanks.

    @toolchic, first make sure that in the IDT HD Sound properties [right-click the IDT icon near the clock, which says "Controls IDT Audio Settings" on mouse-over], you choose "Open Audio control panel".
    Next choose "Recording > Settings". Here choose the input you want to record with. In your case choose "Integrated Microphone Array" on the top left. Click "Settings" just below, slide the bar to "10 dB boost" on the right, and slide the recording level bar about midway up [at the bottom of the window where all the volume controls for all recording devices are].
    While you're there, it's a good idea to mute the "External mic", and mute any other recording device that you do not need. Next, click the "Processing" tab below the Settings tab and choose "Reduce noise". Leave all the others unchecked, as they will produce a not-so-nice sound. This helps to eliminate fan noise and other background noise in your recording.
    Now when you open your preferred digital audio program, e.g. Audacity [it's free and pretty easy to use, see the link below], you can choose the INTEGRATED MIC ARRAY in the recording options. Choose that and then "arm" the track for recording [usually an icon with an R in the recording controls]. Next click "Record" to begin recording. Of course these steps depend on the digital audio recording program you're using.
    If you don't have a dedicated digital audio recording program, and if you have Windows 7 or Vista, you can record long sessions with the default Microsoft recording program [this program is useless in XP, as it only records for a minute or so, maximum]. A dedicated program such as Audacity records for as long as your hard drive space allows, or close to that.
    @fritz, in essence, on the input jacks of most current notebooks, external computer mics are mono, but they use a stereo mini jack [or 3-band, if I understood you correctly]. These are the same as regular earphone jacks. One band is signal, one is earth, and the other is +5 volts to drive the condenser [these are known as "condenser mics"]. But if you use a dedicated studio-type recording mic, then you need a dedicated preamp, and if it is a condenser as opposed to a dynamic mic, the preamp needs to supply what's known as "phantom power" [typically 48 V] through the input jack the mic is plugged into.
    However, if you record into the mic-in/line-in jack from the line out of, say, a mic preamp, channel mixer deck, CD/DVD/TV etc., then this has to be a stereo mini jack [3 bands]. Special cables with two [stereo] RCA plugs to a mini jack are the ones to use, that is, if the line-out connector of your recording source is RCA [your typical TV/CD/DVD audio/video outs and ins are RCA connectors, for example].
    Did this help?
    http://audacity.sourceforge.net/download/
    be-well

  • How to check data records in R/3 (source system)

    Hello,
    I need to check the data records in R/3 (the source system). Is transaction RSA3 the only option, or is there another way? When I use RSA3, all I see is the hourglass cursor; nothing seems to happen.
    Pls help.
    SD

    Hi Sebastian,
    To some extent this works out, i.e. a comparison of tables in R/3 vs. the ODS in BW.
    Compare tables in R/3 such as VBAK (order), VBRK (billing), MKPF (inventory management) and LIKP (delivery)
    with the respective DataSource/PSA/ODS
    2LIS_11_VAITM
    2LIS_12_VCITM
    2LIS_13_VDHDR
    2LIS_03_BF
    in BW.
    The total number of document numbers, and the quantities or values, should match.
    Hope it helps !!
    Rgds
    SVU123
    Edited by: svu123 on Mar 3, 2009 1:16 PM

  • Collecting data from two R3 source systems

    Dear All,
    Scenario:
    We are in the process of implementing SAP BI 7.0 in our organisation.
    We have a single BI server (7.01) collecting data from two source
    systems from two different divisions within the group.
    The first system is for Company 1, where we have IS-MILL (ECC 6.0). BI has been implemented and is running successfully in this unit. Now we have started implementing BI in the second unit, 'Company 2', where the industry-specific solution IS-AFS (ECC 6.0) is the source system.
    We are activating BI Standard Content for AFS. When we try to upload master data, we face a problem.
    The master data (0Division, 0Mat_Grp_3, etc.) is already loaded with IS-MILL data (data from the first source system). Now when the data is loaded from IS-AFS (the second source system), wherever the same key exists in the InfoObject, we notice that the existing data is overwritten by the freshly loaded data, thus causing loss of existing data.
    A typical example is as follows:
    BW data from source system 1 (before upload of data from source system 2):
    0Division
    99   EN   Stock Transfer Out
    AFS data (source system 2):
    0Division
    99   EN   STO
    BW data from source system 1 (after upload of data from source system 2):
    After uploading the master data from AFS, the values are:
    0Division
    99   EN   STO
    The original data is lost.
    A similar problem is noticed with other master data as well.
    Please suggest the right methodology for uploading master data when data to BW is sourced from two systems. How do we retain the data in the same InfoObject but at the same time maintain a distinction between the data from the two systems?
    We don't find any specific mention of such scenarios in the standard documentation.
    Will such a problem recur when we upload transactional data?
    Regards,
    Aslam Khan

    Hi Aslam,
    Please use a compounding attribute for your master data object, such as a source system ID. While loading the master data from the different source systems, you should have two different flows filling the same master data object. In each flow you can specify the source system ID as a constant value, e.g. SS1 (source system 1) and SS2 (source system 2) in the respective data flows, as illustrated below. This should solve your issue without overwriting your master data.
    Thanks
    Kishore
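    To illustrate the idea behind the compounding suggestion above, here is a toy sketch in plain Python (not BW configuration; the keys and constants are made up to mirror the 0Division example in the question). Keying master data by the value alone lets the second load overwrite the first, while compounding the key with a source system ID keeps both entries:

        # Master data keyed by division and language alone: the second load overwrites the first.
        flat = {}
        flat[("99", "EN")] = "Stock Transfer Out"   # load from source system 1 (IS-MILL)
        flat[("99", "EN")] = "STO"                  # load from source system 2 (IS-AFS) replaces it

        # Key compounded with a source system ID: both descriptions survive.
        compounded = {}
        compounded[("SS1", "99", "EN")] = "Stock Transfer Out"  # constant "SS1" set in flow 1
        compounded[("SS2", "99", "EN")] = "STO"                 # constant "SS2" set in flow 2

        print(flat)         # {('99', 'EN'): 'STO'} - the original text is gone
        print(compounded)   # both rows retained, distinguished by source system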

  • Keyfigures in particular source system.

    Hi All,
    I would like to know the key figures that are used in a particular source system in BI.
    Could anyone please let me know how to find this out in one place, rather than searching in each transformation of that source system?
    Thank you in advance.
    Regards,
    Prem.

    Hi,
    We have a table, RSOSFIELDMAP. This table gives the mapping between DataSource fields and the InfoObjects in BI for a particular source system.
    On the selection screen you can give your source system name, and in the 'InfoObject' field you can provide the starting characters of your key figure InfoObjects, as sketched below.
    For example, all the key figure InfoObjects in our BW system are named ZKF*.
    So on the selection screen enter
    Source system - <your source system name>
    InfoObject - ZKF*
    This will bring out all the InfoObjects that are key figures, along with their corresponding DataSource fields.
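    As a rough illustration of that wildcard filter (plain Python, not SAP code; the rows and system names are invented, standing in for RSOSFIELDMAP entries):

        from fnmatch import fnmatch

        # (source system, DataSource field, InfoObject) - made-up sample rows
        rows = [
            ("R3CLNT100", "NETWR",  "ZKF_NETVALUE"),
            ("R3CLNT100", "KWMENG", "ZKF_QUANTITY"),
            ("R3CLNT100", "MATNR",  "ZMATERIAL"),
            ("R3CLNT200", "NETWR",  "ZKF_NETVALUE"),
        ]

        source_system = "R3CLNT100"
        pattern = "ZKF*"   # same wildcard you would enter in the InfoObject selection

        hits = [(field, iobj) for system, field, iobj in rows
                if system == source_system and fnmatch(iobj, pattern)]
        print(hits)   # [('NETWR', 'ZKF_NETVALUE'), ('KWMENG', 'ZKF_QUANTITY')]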

  • Moving the 80 Million records from Conversion database to System Test database (Just for one transaction table) taking too long.

    Hello Friends,
    The background is that I am working as conversion manager: we move the data from Oracle to SQL Server using SSMA, then apply the conversion logic, and then move the data to system test, UAT and production.
    Scenario:
    Moving the 80 million records from the conversion database to the system test database (for just one transaction table) is taking too long. Both databases are on the same server.
    The questions are:
    What is the best option?
    If we use SSIS it is very slow, taking 17 hours (sometimes it gets stuck and won't allow us to do anything else).
    If I use my own script (a stored procedure) it takes only 1 hour 40 minutes. I would like to know whether there is a better process to speed this up, and why SSIS is taking so long.
    When we move the data using SSIS, does it commit after a particular row count, or does it commit all the records together after writing to the transaction log?
    Thanks
    Karthikeyan Jothi

    http://www.dfarber.com/computer-consulting-blog.aspx?filterby=Copy%20hundreds%20of%20millions%20records%20in%20ms%20sql
    Processing hundreds of millions of records can be done in less than an hour.
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/
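    On the commit question above: independent of how SSIS batches internally, the idea of committing in chunks rather than in one huge transaction can be sketched in plain Python with pyodbc. This is only an illustration; the DSNs, table and column names below are made up.

        import pyodbc

        BATCH_SIZE = 50000

        src = pyodbc.connect("DSN=ConversionDB", autocommit=True)
        dst = pyodbc.connect("DSN=SystemTestDB", autocommit=False)

        read = src.cursor()
        write = dst.cursor()
        write.fast_executemany = True   # send parameter batches in bulk

        read.execute("SELECT col1, col2, col3 FROM dbo.big_transaction_table")
        while True:
            rows = read.fetchmany(BATCH_SIZE)
            if not rows:
                break
            write.executemany(
                "INSERT INTO dbo.big_transaction_table_copy (col1, col2, col3) VALUES (?, ?, ?)",
                rows)
            dst.commit()   # commit per batch so the log never holds one giant transaction

        src.close()
        dst.close()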

  • Master Data cleansing and transformation from non-SAP source systems

    Hi all,
    Our client (media sector) wants to cleanse and transform master data from a non-SAP source system before uploading it into BW (no R/3 yet). If anybody has a document on this topic that I could use, I would appreciate it if you sent it to me.
    thanks.

    Hi,
    https://websmp203.sap-ag.de/~sapidb/011000358700001965262003
    https://websmp203.sap-ag.de/~sapidb/011000358700006591612001
    https://websmp203.sap-ag.de/~sapidb/011000358700001971392004
    https://websmp203.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000471477&_OBJECT=011000358700008927932002E
    /manfred

  • Differentiating Master Data from two different source systems

    Friends,
    I have used standard InfoObjects that provide master data for two InfoCubes which take data from two different source systems. Some of the master data keys are identical in both source systems (for example, 10 stands for "Industrial" in one, whereas 10 stands for "Agricultural" in the other). What do I do so that the system (BW) differentiates the two?
    Thanks in advance for all the help.
    Mike

    I tried to include 0SOURSYSTEM in the compounding of this InfoObject (for master data), but it gives me a list of other objects that use this InfoObject as a reference characteristic, and it also says that this InfoObject is used in an ODS and the data needs to be emptied from the ODS before activating the InfoObject. Please let me know if there is any way around this.
    Thanks
    Mike

  • Error in Previewing data from a UD Source System

    Hi,
    I have created a data source from a UD Source System; however, when I preview data, I am prompted by this message:
    "Inbound processing of data package 000001 finished"
    There is no error, but there is also no data preview that is displayed.
    Can you help me with my problem?
    I look forward to your reply experts!
    Regards,
    Ramon

    Hi,
    Sampling is done from a directory placed on the CLIENT. At runtime the directory must be placed on the SERVER.
    So you must have two locations: one for sampling (_local) and one for runtime (_remote), and before deployment you must switch from _local to _remote.
    The _remote location points to a directory on the SERVER; don't forget to grant read/write privileges on the file system to the oracle user.
    Regards,
    Detlef

  • HELP:  0 from 0 records: Initialization Option of source system

    I executed an 'Initialization option for source system' in the scheduler and no data is coming from the source system.
    Steps taken:
    1. Deleted data from the target DSO.
    2. Picked the InfoPackage and selected 'Initialization option for source system' in the scheduler.
    3. Deleted the initialization setting.
    4. Executed the InfoPackage.
    Result: "0 from 0 records" found. How do I fill the queue table in the source system if no records are available?
    5. I went to delete the request in the PSA to retry: "No data in the PSA".
    This is to resolve a HIGH-status ticket, because I stopped the delta process of other InfoProviders until this situation is resolved.
    Can you provide detailed steps to properly trigger delta data from R/3 via an 'Initialization option for source system'? We have cancelled delta updates until this is resolved.

    Hi
    Now that you have run the init once, there is no need to delete the init flag.
    The delta pointer is set now; you can run the delta InfoPackage from now onwards.
    But if there were already some previous deltas for this, and the delta had failed, because of which you have to initialise the delta again, then you need to delete the previous initialisation.
    To delete the init flag:
    Open your delta InfoPackage in RSA1 (or in the process chain).
    Go to the Scheduler menu option
    'Initialization options for source system'.
    A window will pop up showing the successful load, and you will see a tick mark in the first column.
    If you see a cross mark, that means the initialization has failed.
    This mark is what is called the init flag.
    If you need to reinitialize the delta, then you have to delete this.
    Select the entire row and click the third button at the bottom.
    This deletes the initialization for the source system (i.e. the init flag).
    Now, before running the delta, you have to run the init InfoPackage again.
    It depends on the scenario whether you have to run it with data transfer or without data transfer.
    Hope this clarifies the query.
    Regards
    Shilpa

  • Mapping is inserting multiple records from a single source to Dimension.

    Hi All,
    I am very new to OWB. Please help me out. I've created a dimension with the help of the wizard and then a mapping which consists of a single source and a single dimension. The mapping is populating nearly 500 times the actual number of records. Here are some details to give you a better understanding of the mapping: I created a dimension with four levels and two hierarchies. The levels are L1, L2, L3 and L4, and the hierarchies are H1 -> L1, L2 and L4, and H2 -> L3 and L4. L4 is the lowest level of each hierarchy. L1 and L3 are the parent levels in the respective hierarchies. I assigned an attribute of each level as the business identifier, which means the business identifier attribute is different in each level. In the mapping I mapped the parent natural key (the key for the parent level in a hierarchy) to the value that was mapped for the parent level. The result is 500 times the number of records that exist in the source table. I've even tried a single common business identifier for each level, but then the result is 5 times the number of records. Please let me know the solution.
    Thanks in advance.
    Amit

    Hi,
    You may not actually have multiple records in your dimension.
    To better understand how the records are inserted, try a snowflake version of the dimension and see how the records are inserted into the respective level tables.
    Thanks

  • Listener doesn't pick up messages from a particular source.

    I have a queue populated from two different sources. My listener configured on that queue receives messages sent from one source but not the other.
    I can see the messages from both sources in the queue when I check through the server administration page.
    To make things interesting, messages from both sources look identical (in terms of all properties except id and timestamp).
    Any clues anyone?
    Thanks

    Do the two queues have two different names?
    Which server are you using?
    Can you tell me how to see the messages in the queue from the administration console?
