City Level information

Hello All,
I am new to Data Quality and am currently trying to clean up our address information using DQ 3.2. We have a number of requirements to report on city-level information, for example sales by city.
Now here's the problem. Say the users want to report on the city of Los Angeles. The Los Angeles area actually consists of a number of cities and communities such as Beverly Hills, North Hollywood, Burbank, etc., and the words "Los Angeles" never appear anywhere in the address. We are doing this globally for all countries, and I noticed a similar pattern in a number of other countries as well.
The big question: is there a field I can select in the output of the address cleanse transform that actually gives me "Los Angeles"? Note that it should ideally work for all countries. If this is possible, I could populate a separate field with it and have the users report on that field.
Thanks in advance!
Rohit

Hi Rohit,
the reference data used for address cleansing generally comes from the country-specific postal authority; for the US it is the United States Postal Service. Since the USPS's core purpose is to deliver letters and parcels to existing addresses, it has set up its postcode system accordingly, and the boundaries of the postcode regions sometimes do not match the boundaries of other, even also official, authorities such as municipalities. (See a [map of the Postcodes in the City of Los Angeles|http://lahd.lacity.org/lahdinternet/Portals/0/Policy/LAZipCodes.pdf] on the Internet.)
Also, the reference data suppliers often do not include the additional regional, neighborhood, or community information that would be helpful for your granularity of mapping.
A workaround would be to add a Query transform to your job after Global Address Cleanse and do a lookup against an additional table, e.g. one with Postcode and Reporting Group Value (e.g. "City of Los Angeles") columns, populating this extra column based on the corrected, cleansed address output from Global Address Cleanse.
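As a minimal sketch of such a lookup (the table and column names are hypothetical, and the example mapping rows are a business decision you would maintain yourself, not postal truth):

    -- Hypothetical mapping table: cleansed postcode -> reporting group.
    CREATE TABLE postcode_report_group (
      country_code VARCHAR(2),
      postcode     VARCHAR(10),
      report_group VARCHAR(60)   -- e.g. 'City of Los Angeles'
    );
    INSERT INTO postcode_report_group VALUES ('US', '91601', 'City of Los Angeles');
    INSERT INTO postcode_report_group VALUES ('US', '90210', 'City of Los Angeles');

    -- In plain SQL terms, the lookup the Query transform performs is:
    SELECT a.*, g.report_group
    FROM   cleansed_addresses a
    LEFT   JOIN postcode_report_group g
           ON  g.country_code = a.country
           AND g.postcode     = a.postcode;

In Data Services itself you could express this with a lookup_ext() call on the new output column instead of a join.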
The number of reporting groups could be huge, depending on industries, markets, and so on. Some companies want to go down to statistical groupings that combine just a few buildings into one unit (statistical areas), while others just split a country into 4 or 5 regions spanning multiple states (Nielsen regions). In most cases external lists are used to generate this additional field.
What other regions did you identify in your analysis?
Niels

Similar Messages

  • GL cost allocation batches transaction level information

    Hi
    In GL_LINES_ALL, all reference column information is NULL.
    After posting the cost batches into GL, what is the procedure for finding transaction-level information for these cost batches?
    E.g., for inventory and receivables batches, the transaction information is stored in the GL_JE_LINES reference columns; how can we find the same for cost batch transactions?
    Regards
    Kishore S.

    Hi Dianne,
    Check whether the real-time integration between CO and FI has been set up in transaction OK17.
    Thanks
    Aravind

  • Problem with "View project-level information" permission when accessing build definition in Visual Studio Online

    Hi, 
    for some time all our team members have been experiencing problems when accessing the list of builds on Visual Studio Online or when trying to edit a build definition in Visual Studio. The error message suggests a problem with the missing "View project-level information" permission:
    "TF50309: The following account does not have sufficient permissions to complete the operation: XXXXXXXX. The following permissions are needed to perform this operation: View project-level information."
    I've checked the permissions of my user account and group. "View project-level information" is set to Allow at both levels. We didn't make any changes to the security configuration recently. Has anyone faced a similar problem?
    The short-term solution was to add all users to the Project Collection Administrators group, but that is not something we would like to live with.

    It seems that the problem was fixed and Project Collection Administrators permissions are no longer required. Great :-)

  • What kind of log can I get with "level informational"? - c2950 & 6500

    Hello everybody, it is nice to meet you. I am working in Tokyo, Japan as a network engineer.
    It is my first time writing here on the Cisco Support Community and I don't speak English very well.
    So I hope everyone understands my poor question and writing, and I hope somebody answers this question.
    I am using a Kiwi server to manage four 6500s and about thirty C2950s.
    The Kiwi server is logging both the 6500s' and the C2950s' syslogs normally.
    But I have a question about the difference between the 6500's "level informational" and the C2950's.
    For example, when I log in to a 6500 I get a log on my Kiwi server,
    and when I enter a command to change the config I also get a log on my Kiwi server.
    However, when it comes to the C2950s I can't get those logs; I only get logs such as 'interface up/down' or 'changed config'.
    I think I have the correct logging config on both the 6500s and the C2950s, as follows:
    Trap logging: level informational, OOOOO message lines logged
            Logging to XX.XX.XX.XX, OOOOO message lines logged, xml disabled,
                   filtering disabled
    Does anybody know what kind of logs I can get with "level informational" on a C2950?
    Thanks everybody.

    Informational is a fairly low severity level, which means you'll get everything except debugging messages. This includes link up/down, config change,
    CPU hog, etc. If you're not seeing messages other than link up/down and config change, then there may not be anything else happening on the switch that triggers a log message. Check the output of "show log"; it should agree with your syslog logs.

  • Sales Order - Line Level Information in Info cube

    Hi BW Guru's,
    We are using the following scenario: -
    Info cube = Sales Order – line-level information (yes, we want to capture product-level sales order information in the info cube for internal reporting purposes)
    ODS = NO. we are not using the same
    Info Source – Same as data source (standard one)
    Data Source – 2LIS_11_VAITM (standard business content data source for sales order information)
    Delta in above data source = contract related information (5 extra fields) – working fine – added as append structure in the data source
    Extractor = Standard using SBIW - Perform setup - SD Sales Order (SD – 11)
    Number of Sales order line records / month = 80,000
    Historical Sales order data to be uploaded = 1 year = 80,000 * 12 = Nearly 1 Million records
    Hardware configuration = 4 × 750 MHz CPUs, 8 GB RAM
    Please advise on the following things:
    1) How long will it take to upload 1 million records into BW?
    2) Is our server capacity sufficient to handle a load of 1 million records?
    3) Which delta procedure should we follow for the day-to-day sales order line-level data updates? Please advise on the data selection procedure.
    4) In case a sales order from last month is changed, how do we reflect that in our info cube? Please suggest a suitable method.
    Waiting for your reply.
    Thanks
    Rajiv

    Hi Rajiv,
    1) How long will it take to upload 1 million records into BW?
    This depends on your network, the width of your data records, the performance of your machine, the settings for communication between BW and R/3, and so on.
    It can be done in 2 hours or 3 hours, or it can take a whole day. No exact answer is possible.
    2) Is our server capacity sufficient to handle a load of 1 million records?
    If your disk space is sufficient, no problem. If you have already loaded your master data, not much additional space is needed to load the transactional data.
    3) Which delta procedure should we follow for the day-to-day sales order line-level data updates? Please advise on the data selection procedure.
    Just run your initial setup and initialize your delta, then run a daily delta package (this is what most companies do).
    4) In case a sales order from last month is changed, how do we reflect that in our info cube? Please suggest a suitable method.
    The extractor takes care of this; just post the delta to your cube, and that's it.
    Hope this helps,
    regards
    Siggi

  • Can't print- 27" 2013 iMac-my epson stylus 3800 pro will not print-i updated driver n added printer in syst prefs-When i print the printer queue opens and shows the job as "stopped"-Supply level "information is not available"-maybe part of problem...thx

    I can't print from my 27" late 2013 iMac. I am using an Epson Stylus 3800 Pro. I updated the driver and added the printer in System Preferences. When I print, the printer queue opens and shows the job as "stopped". The supply level shows "information is not available", which may be part of the problem... thank you.

    The first thing you could try is a reset of the printing system. Please note that this action will remove all printers and scanners from the Printers & Scanners preference pane.
    With the reset complete, unplug the Epson USB cable from the Mac for a minute and then reconnect it. This action should result in the printer being added again to the Mac automatically. If this does happen, then see if you can print again.
    If the printer is not added again automatically, open Printers & Scanners and click the plus button to add the Epson. Once this is done, then see if you can print again.

  • N8 buggy battery level information

    Hi,
    I own a Nokia N8 and have installed Nokia Battery Monitor 2.1, and I've noticed that the native battery level indicator always reports a higher battery level than Nokia Battery Monitor.
    I'm also experiencing problems with the native indicator because it seems to get stuck at the same level (it was stuck at 100% while Nokia Battery Monitor reported 70%).
    Switching off the phone didn't solve the problem until I used the restore-to-factory-defaults option, but now it is stuck at 80% (while Nokia Battery Monitor reports 53%).
    Any idea on how to solve this?

    I don't think there is a way to solve it. They measure differently: Battery Monitor bases its reading on past use, while the native battery indicator is supposed to measure the "actual" battery level. However, I know that on mine it only reports the steps 100%, 70%, 50%, 40%, 30%, 20%, 10%. I do like Battery Monitor better, though, even though it has a bad habit of getting stuck, and I sometimes have to reboot my phone to get it to work.

  • Update trigger on table for audit purposes to record column-level information

    Hi,
    I want to create an update trigger which will record data in an audit table.
    In the audit table I want to have columns like old_value_of_Col and new_value_of_Col.
    This is easily achievable. But my main concern is to add three more columns holding the following information:
    1) Which column was updated?
    2) The old value of the column that was updated
    3) The new value of the column that was updated
    Also, if possible: if an update changes three columns, for example, I would like to have three entries in the audit table, one for each of the three columns.
    Please help.
    Thanks in advance.

    A few approaches to consider.
    First, if you are on 11g, take a look at Flashback Data Archive: http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28424/adfns_flashback.htm#ADFNS01011
    Second, you can use Fine-Grained Auditing to capture DML statements and bind variables. It is not as easy to reconstruct the before/after picture, but it may be sufficient for some purposes.
    For trigger-based solutions, I have seen the approach you propose (one row for each column changed) and it is tedious and prone to maintenance headaches. You have to write code for each column, doing compares (remembering to consider NULLs), dealing with different datatypes, etc. We used that design because there was an actual requirement to produce a report that needed such a structure.
    An easier trigger-based solution is to create a history table for the table you want to track, with all of the columns from the original plus whatever else you need for housekeeping. Write an after insert/update/delete trigger on your base table and populate/insert rows into the history table:
    - For inserts, populate and insert a row from the :new values
    - For deletes, populate and insert a row from the :old values
    - For updates, populate and insert a row from the :old values and another from the :new values
    I would also have a column designating whether a row is an Insert, Delete, Update-Old or Update-New row.
    Once you have done one example, the rest are easy. If you were sufficiently motivated (I have not yet been :-) ), you could probably write a script to query DBA_TAB_COLS and generate the code.
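    To make the history-table approach concrete, here is a minimal sketch (the base table EMP_SAL and all column names are hypothetical; adapt the column list to your own table):

        -- Hypothetical base table: EMP_SAL(emp_id, salary, dept_id).
        -- The history table mirrors it plus housekeeping columns.
        CREATE TABLE emp_sal_hist (
          emp_id      NUMBER,
          salary      NUMBER,
          dept_id     NUMBER,
          change_type VARCHAR2(10),  -- 'INSERT', 'DELETE', 'UPDATE-OLD', 'UPDATE-NEW'
          changed_by  VARCHAR2(30),
          changed_on  DATE
        );

        CREATE OR REPLACE TRIGGER emp_sal_audit
        AFTER INSERT OR UPDATE OR DELETE ON emp_sal
        FOR EACH ROW
        BEGIN
          IF INSERTING THEN
            INSERT INTO emp_sal_hist VALUES
              (:new.emp_id, :new.salary, :new.dept_id, 'INSERT', USER, SYSDATE);
          ELSIF DELETING THEN
            INSERT INTO emp_sal_hist VALUES
              (:old.emp_id, :old.salary, :old.dept_id, 'DELETE', USER, SYSDATE);
          ELSE
            -- Updates get two rows: the before image and the after image.
            INSERT INTO emp_sal_hist VALUES
              (:old.emp_id, :old.salary, :old.dept_id, 'UPDATE-OLD', USER, SYSDATE);
            INSERT INTO emp_sal_hist VALUES
              (:new.emp_id, :new.salary, :new.dept_id, 'UPDATE-NEW', USER, SYSDATE);
          END IF;
        END;
        /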

  • Understanding the table-level information from SAP HANA Studio

    Hello Gurus,
    Need some clarification from following information provided by SAP HANA Studio.
    I have a table REGIONS and the contents are as follows:
    REGION_ID REGION_NAME
    11      Europe
    12      Americas
    13      Asia
    44      Middle East and Africa
    15      Australia
    6      Africa
    The Runtime Information about the table is as follows:
    Image# 1, Image# 2, Image# 3 (screenshots not reproduced here)
    The total size in KB shown in Image# 2 and Image# 3 do not match. Why?
    The total size in KB shown in Image# 2 and the Total Memory Consumption do not match. Why?
    The values of memory consumption in Main Storage and Delta Storage do not match between Image# 1 and Image# 2. Why?
    Estimated Maximum Memory Consumption (Image# 1) and Estimated Maximum Size (Image# 2) do match.
    Why is the Loaded column in Image# 2 showing the value ‘PARTIALLY’? The table has just 6 rows; why is it still only partially loaded?
    What is the significance of the Loaded column in Image# 2 and 3?
    Thanks,
    Shirish.

    Have a look at this:
    Playing with SAP HANA
    That presentation should help you answer the questions yourself.
    Regards,
    Krishna Tangudu

  • Dunning level information for FIAR

    Hi,
    are there any objects for dunning levels like 01, 02, 03, 04 and 05?

    Hi Suneel,
    The dunning level data will be in the transactional data. You may check the FIAP: Line Item 0FIAP_C03 and
    FIAR: Line Item 0FIAR_C03 InfoCubes for this, and check the dunning levels there.
    There is one more InfoObject, 0FCDUNN_LEV, for the dunning level. You can check dunning levels in the Open Items cube 0FC_C07 as well.
    Regards,
    Pratap Sone

  • Tuxedo Server Queue Level Information

    I have two queries related to Tuxedo Information base
    1. How does Tuxedo decide how many requests can reside in the server queue of one server instance?
    2. How can one get the total number of requests queued on one server through a C API function? What are the preconditions (limitations) for getting this information?
    Thanks & Regards,
    - Ajeet

    Ajeet,
    Tuxedo allows messages to be placed on a message queue as long as there is available space on the queue below the operating-system-imposed maximum byte limit. If a particular message would take up over 75% of the maximum byte limit, or if the message would not fit on the queue at all but a short header message would fit, then Tuxedo writes a short header message to the queue and transfers the actual message data using a file. This is slower than transferring the entire message in the queue but avoids queue-full error conditions whenever possible.
    Queries to the T_MSG MIB class can be used to see the OS maximum allowable byte limit, the number of bytes currently on the queue, and the number of messages currently on the queue. The T_QUEUE MIB class is focused on the Tuxedo attributes of the queue rather than the operating-system attributes of the queue.
    The operating system command "ipcs -q -a" can also be used to see the maximum number of bytes, current number of bytes, and current number of messages for all queues in the system.
    Ed

  • Customer city-level data...

    Hi Guru,
    I am trying to do something like this....
    I have a customer table with a list of customer names, and these customers are related to various different cities.
    My question is: whenever a customer logs in, he/she should see only the data related to his/her city.
    Can we do that? Can we do something with a variable? Please suggest.
    thanks in advance.
    k

    Hi,
    you should be able to do this by creating a session variable. I suppose you know how to do this?
    - you create a query returning the different cities of the user who logs in - you can use the :USER variable to do this
    - by making use of row-wise initialization you will be able to restrict your data
    Once you've created your session variable, you can limit your fact table by adding a filter on your region dimension,
    so a user only has access to the data of his region(s).
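    As a minimal sketch (the table USER_CITY_MAP and its columns are hypothetical), the initialization block query for a row-wise session variable named CITY could look like this:

        -- Row-wise initialization: each row returns the variable name
        -- ('CITY') plus one value; OBIEE collects the rows into a value list.
        -- :USER is the built-in variable holding the login name.
        SELECT 'CITY', city
        FROM   user_city_map
        WHERE  UPPER(username) = UPPER(':USER')

    A data filter on the fact table would then compare the city column against VALUEOF(NQ_SESSION.CITY).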
    I hope it's a bit clear...if you need more help, just let us know.
    Kr,
    A

  • OBIEE-Essbase Federation Ques

    Hello All,
    We are doing federation between Essbase and relational data sources. All our region-level information (up to state) is in Essbase, and city-level information is in the relational tables, so we drill down to city to fetch those details.
    The problem I am facing here is with the Essbase defaults. The Time dimension in Essbase is defaulted to the most recent month (Oct).
    So now in a report, say you pull state and dollars; that report will fetch from the cube for the most recent month. So far so good, but when you drill on state, it won't pass the current month down. Is there a way to pass that current month to the city-level SQL query?
    I guess this would be a problem with every other dimension because of those Essbase defaults. In the case of months I could create some default current-date variables and make it work, but for several other dimensions the defaults will always create a problem.
    I have a frequency dimension (DLY, QTD, MTD, YTD) in Essbase which is defaulted to 'DLY'. How would I be able to pass the Essbase defaults to the relational queries?
    In reports we can use default values for dashboard prompts, but I was wondering how we would do it in Answers?
    Please advise
    Thanks
    Prash

    Refer to this http://www.oracle.com/technetwork/middleware/bi-enterprise-edition/overview/obiee-essbase-modeling-guide-130897.pdf for modeling. And for any Essbase-related question, post it in the Essbase forum.

  • Discoverer - Dynamic Drill Level puzzle

    Hi,
    I came across a strange requirement for Discoverer reports. The client has a sales organisation hierarchy with levels like country, state, region, city.
    1) The first problem is showing the data in the reports according to the country, region or city of the user; that is, a manager from country X should be able to see only data from country X. A possible solution for this kind of requirement would be a master table storing the mapping of usernames to their geography, and then mandatory conditions at the EUL level to filter the data in the report. Is there any better approach?
    2) The TOUGHER puzzle is the same report opening at different levels of the hierarchy according to the user's position. That is, a sales SUM report opens at the country level for a country head, showing separate sums for each state under that country, and can be drilled down to city level - two levels of drill-down (region, city).
    When a state head logs in, he gets the report with the sum of sales for each region under that state, which can be drilled further down to city level (one level of drill-down).
    Is it possible to open the same report at the country, state, region or city level according to the user?
    I checked one of the posts from Rod in the message: HR Intelligence - Salary Information - Can we restrict?
    In it he suggested a very feasible approach: replace the item with a custom item where the actual underlying item is decided in a DECODE. So in the current puzzle, I could create an item SALES_AREA which decodes to country, state, region or city depending on the user - see the sketch below. But I don't know how user-friendly it'll be for super users who want to use all the areas in the same report as different columns.
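    As a minimal sketch of that idea (GET_USER_LEVEL is a hypothetical helper function that would look up the logged-in user's position, e.g. in the mapping table from problem 1):

        -- Hypothetical calculated item SALES_AREA: DECODE picks which
        -- column to report on based on the user's position.
        DECODE(GET_USER_LEVEL(USER),
               'COUNTRY', country,
               'STATE',   state,
               'REGION',  region,
               'CITY',    city)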
    Is there any other possible way out for both of the above problems? I know this would take a really creative solution.
    Thanks

    Hi
    We really are only limited by our own imagination when it comes to what can be done inside Discoverer. There are many different ways you can implement what you have proposed.
    Have you considered using a VPD or VPD-like scenario to control what you are asking?
    Did you consider using some of the code that I supplied in the same posting you referenced? Using this methodology you use a separate control / bridge table to handle who sees what. Then you create a function of the username / userid. If that function returns true, or 1, or yes, or whatever you choose, you open up access to the column or row.
    The code I supplied can be used to manage both column-based and row-based security such that protected columns like SSN, date of birth, salary are hidden from prying eyes, or where the whole row will only display when a user has the correct clearance. If you've stayed with me and tuned in to my logic, you will be thinking Mandatory Conditions at this point. :-)
    I hope this helps
    Best wishes
    Michael

  • Authentication on local SQL Server 2008 R2 Express server fails after Lan Manager authentication level changed to "Send NTLMv2 response only\refuse LM & NTLM"

    I'm upgrading my organisation's Active Directory environment and I've created a replica of our environment in a test lab.
    One medium-priority application uses a SQL server express installation on the same server that the application itself sits on.
    The application itself recently broke after I changed the following setting in group policy:
    "Send LM & NTLM - use NTLMv2 session security if negotiated"
    to
    "Send NTLMv2 response only\refuse LM & NTLM"
    The main intent was to determine which applications would break, if any - I was very surprised when troubleshooting this particular application to find that the issue was actually with SQL Server Express itself.
    The errors I get are as follows (note that there are hundreds of them, all the same two):
    Log Name:      Application
     Source:        MSSQL$SQLEXPRESS
     Date:          1/19/2015 2:53:28 PM
     Event ID:      18452
     Task Category: Logon
     Level:         Information
     Keywords:      Classic,Audit Failure
     User:          N/A
     Computer:      APP1.test.dev
     Description:
     Login failed. The login is from an untrusted domain and cannot be used with Windows authentication. [CLIENT: 127.0.0.1]
     Event Xml:
     <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
       <System>
         <Provider Name="MSSQL$SQLEXPRESS" />
         <EventID Qualifiers="49152">18452</EventID>
         <Level>0</Level>
         <Task>4</Task>
         <Keywords>0x90000000000000</Keywords>
         <TimeCreated SystemTime="2015-01-19T22:53:28.000000000Z" />
         <EventRecordID>37088</EventRecordID>
         <Channel>Application</Channel>
         <Computer>APP1.test.dev</Computer>
         <Security />
       </System>
       <EventData>
         <Data> [CLIENT: 127.0.0.1]</Data>
         <Binary>144800000E00000017000000570053004C004400430054004D00540052004D0053005C00530051004C0045005800500052004500530053000000070000006D00610073007400650072000000</Binary>
       </EventData>
     </Event>
    Log Name:      Application
     Source:        MSSQL$SQLEXPRESS
     Date:          1/19/2015 2:53:29 PM
     Event ID:      17806
     Task Category: Logon
     Level:         Error
     Keywords:      Classic
     User:          N/A
     Computer:      APP1.test.dev
     Description:
     SSPI handshake failed with error code 0x8009030c, state 14 while establishing a connection with integrated security; the connection has been closed. Reason: AcceptSecurityContext failed. The Windows error code indicates the cause of failure.  [CLIENT:
    127.0.0.1].
    Event Xml:
     <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
       <System>
         <Provider Name="MSSQL$SQLEXPRESS" />
         <EventID Qualifiers="49152">17806</EventID>
         <Level>2</Level>
         <Task>4</Task>
         <Keywords>0x80000000000000</Keywords>
         <TimeCreated SystemTime="2015-01-19T22:53:29.000000000Z" />
         <EventRecordID>37089</EventRecordID>
         <Channel>Application</Channel>
         <Computer>APP1.test.dev</Computer>
         <Security />
       </System>
       <EventData>
         <Data>8009030c</Data>
         <Data>14</Data>
         <Data>AcceptSecurityContext failed. The Windows error code indicates the cause of failure.</Data>
         <Data> [CLIENT: 127.0.0.1]</Data>
         <Binary>8E4500001400000017000000570053004C004400430054004D00540052004D0053005C00530051004C004500580050005200450053005300000000000000</Binary>
       </EventData>
     </Event>
    All of the documentation that I have followed suggests that the errors are caused by incorrect SPN configuration. I figured that the SPNs were never correct and authentication had always failed over to NTLM in the test environment (I can't look at production - we couldn't replicate the setup due to special hardware and also RAM considerations), but only NTLMv2 has issues.
    So I spent some time troubleshooting this. We have a 2003 forest/domain functional level, so our service accounts can't automatically register the SPN. I delegated the write/read service principal name ACEs in Active Directory. SQL Server confirms that it is able to register the SPN.
    So next I researched more into what is needed for Kerberos to work, and it seems that Kerberos is not used when authenticating with a resource on the same computer:
    http://msdn.microsoft.com/en-us/library/ms191153.aspx
    In any scenario where the correct username is supplied, "Local connections use NTLM, remote connections use Kerberos". So the above errors are not Kerberos errors (since it is a local connection, it will use NTLM). It makes sense, I guess - since it worked in the past when LM/NTLM were allowed, I don't see how changing the LAN Manager settings would affect Kerberos.
    So I guess my question is:
    What can I do to fix this? It looks like the SQL server is misconfigured for NTLMv2 (I really doubt it's a problem with the protocol itself...). I have restarted the SQL service and the server a number of times. Also, all of my other SQL applications in the environment work. This specific case, where the application is authenticating to a local SQL installation, is where I get the failure - it works with LAN Manager authentication set to "Send LM & NTLM - use NTLMv2 session security if negotiated", but not with "Send NTLMv2 response only\refuse LM & NTLM".
    Note also that this behaviour is identical whether I set the LAN Manager authentication level at the domain or the domain controller level in Active Directory - I did initially figure I had set up some kind of mismatch where the two sides would not agree on the authentication protocol, but this isn't the case.

    Maybe your application doesn't support "Send NTLMv2 response only. Refuse LM & NTLM".
    https://support.software.dell.com/zh-cn/foglight/kb/133971
