Secondary Unit Designations from Data Services' Global Address Cleanse

We are using Data Services' Global Address Cleanse transform and want multiple secondary unit designations retained, parsed, standardized, and returned in the output data.
e.g.: 1 Dexter Ave Floor 10 Suite 5
Only "Suite 5" is stored in the output SECONDARY_ADDRESS field.
The first of the two units, "Floor 10", is consistently lost by the Global Address Cleanse transform (not the USA Regulatory Address Cleanse transform). I want to standardize and output secondary and tertiary unit designations and numbers like "Building 2 Floor 10 Suite 5", and other multi-level unit designations such as those listed by the USPS at http://pe.usps.com/text/pub28/pub28apc_003.htm .
The same applies to any Complex_Name and Complex_Type address line information like "Rockafellow Center", "Eastdale Mall", "Manhattan Mall", "Building 2", etc.
The behavior is the same for the US and Canada. Multiple units or dwellings are very common in input addresses and should be retained per USPS CASS certification. How can this be accomplished using Data Services' Global Address Cleanse transform?

Clark,
you need to download one of the following additional Address Directories
- Address Directory - All-World
- Address Directory - United Kingdom
for Data Services XI 3.x from the SMP. To have access to them you need an annual subscription to the directories.
Niels
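As a workaround while the transform keeps only one unit, the extra designator/number pairs could be pre-parsed out of the address line and carried through in user-defined fields. The following is a minimal, purely illustrative Python sketch (not the transform's own logic) with an abbreviated designator list based on USPS Publication 28 Appendix C:

```python
import re

# Abbreviated set of USPS secondary unit designators (Pub. 28, Appendix C).
# Extend the alternation as needed; longer forms must precede their
# abbreviations (FLOOR before FL) so the regex matches the full word first.
DESIGNATORS = r"(?:APARTMENT|APT|BUILDING|BLDG|FLOOR|FL|SUITE|STE|UNIT|ROOM|RM|DEPARTMENT|DEPT)"

# Match "designator + value" pairs anywhere in the line, case-insensitively.
UNIT_RE = re.compile(rf"\b({DESIGNATORS})\s+(\S+)", re.IGNORECASE)

def split_units(address_line):
    """Return (base_address, [(designator, value), ...])."""
    units = [(m.group(1).upper(), m.group(2)) for m in UNIT_RE.finditer(address_line)]
    base = UNIT_RE.sub("", address_line).strip()
    base = re.sub(r"\s{2,}", " ", base)  # collapse leftover double spaces
    return base, units

base, units = split_units("1 Dexter Ave Floor 10 Suite 5")
# base  -> "1 Dexter Ave"
# units -> [("FLOOR", "10"), ("SUITE", "5")]
```

The parsed pairs could then be mapped to pass-through fields before the cleanse and re-appended afterwards, so nothing is lost even if the transform keeps only the last unit.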

Similar Messages

  • Migration from Data Services 3.2 - Data Services 4.0 document

    Hi Experts,
    Can somebody provide a document or link where I can find the migration process flow from Data Services 3.2 to Data Services 4.0?
    Any help is appreciated.
    Thanks
    AJ

    Hi,
    check the DS Upgrade Guide:
    https://websmp109.sap-ag.de/~sapidb/011000358700001323242010E/sbo401_ds_upgrade_en.pdf
    You need an S-user with a password to log in.
    Regards
    -Seb.

  • How to execute a HANA stored proc from Data Services

    Hi
    How do I execute a HANA stored procedure from Data Services?
    In the HANA SQL editor, we run the stored procedure "name_of_sp" as:
    call name_of_sp();
    call name_of_sp(1, 2); // suppose 1 and 2 are two integer input parameters
    So how do I call the above from Data Services?
    SQL('name_of_datastore', 'call name_of_sp()') does not seem to work.
    Rishi

    I got the answer: we don't need to import the stored procedure.
    I just had a syntax error.
    The statement below works:
    SQL('name_of_datastore', 'call name_of_sp()');
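For procedures that take parameters, the same pattern works with the arguments inside the call string. This Python sketch is purely illustrative (`name_of_sp` is the placeholder name from the thread); in a DS script you would concatenate with ||, but the string being built is the same:

```python
def build_call(proc_name, params):
    """Render a CALL statement for the second argument of SQL(),
    e.g. ("name_of_sp", [1, 2]) -> "call name_of_sp(1, 2)"."""
    args = ", ".join(str(p) for p in params)
    return f"call {proc_name}({args})"

print(build_call("name_of_sp", []))      # call name_of_sp()
print(build_call("name_of_sp", [1, 2]))  # call name_of_sp(1, 2)
```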

  • Is it possible to invoke a Java class from Data Services 4.0?

    Is it possible to invoke a Java class from Data Services? I have a query transform with a varchar column, and I want to run an external Java class against it to encrypt the string value. In the Management Console I created an adapter of type TestAdapter and referenced my jar file in the Classpath section, but when I create a Datastore of type Adapter I can't import any functions related to my Java class. It seems like I need to create a new adapter type similar to the TestAdapter, with the 'Adapter Class' set to my Java class? I can't figure out how to do this. Which is the correct approach, and is there documentation available? Thanks!

    First you need to import the class, which you are already doing.
    Then you need to call the function of the class, and then you can put the value in a string:
    DbCon.function();
    String data = DbCon.db;
    where db is a string in DbCon.
    Cheers
    Varun Rathore

  • Difference Between Data Services Designer and Data Services Workbench

    Hello All,
    I am new to Data Services.
    What is the difference between Data Services Designer and Data Services Workbench?
    I am a bit confused about the two.
    Please help me understand.
    Thanks in advance.
    Aisurya

    Workbench is used to create, display, and modify objects. It displays source table data, lets us see the logs of executed jobs, and shows a job's status. In BODS 4.2 you can design dataflows in Workbench; previous releases don't have that option. Designer, on the other hand, offers a debugging option, lets you write scripts, and supports all databases; these options are not available in Workbench. For more information refer to this document:
    https://decisionfirst.files.wordpress.com/2014/07/data-services-workbench-intro.pdf
    http://scn.sap.com/community/data-services/blog/2014/03/01/data-services-42-workbench
    http://scn.sap.com/community/data-services/blog/2013/01/24/data-services-workbench-part-1

  • Global Address Cleanse Transform - (Japanese To English conversion)

    Hi All,
    Is there a way to provide the input to the Global Address Cleanse transform (Japan engine) in Unicode (Japanese Kana), such as input LOCALITY1 of 高島市, and get the output in English, like "Takashima City"?
    Thanks,
    Amit

    Data Services XI 12.1 Technical Manuals, Designer Guide, Data Quality, Page. 555:
    Caution:
    The USA Regulatory Address Cleanse Transform does not accept Unicode
    data. If an input record has characters outside the Latin1 code page
    (character value is greater than 255), the USA Regulatory Address Cleanse
    transform will not process that data. Instead, the input record is sent to the
    corresponding standardized output field without any processing. No other
    output fields (component, for example) will be populated for that record. If
    your Unicode database has valid U.S. addresses from the Latin1 character
    set, this transform processes as usual.
    Best regards,
    Niels
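Given that constraint, records could be pre-screened before being routed to the USA Regulatory path (in Data Services the routing itself would typically be done with a Case transform; the Python below only sketches the predicate the caution describes):

```python
def is_latin1(text):
    """Return True when every character fits in the Latin1 code page
    (code point <= 255), the condition the caution above describes."""
    return all(ord(ch) <= 255 for ch in text)

# Latin1 records can go to the USA Regulatory path; anything else
# (e.g. Japanese input) must be handled by a Unicode-capable engine.
print(is_latin1("123 Main St"))  # True
print(is_latin1("高島市"))        # False
```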

  • Global Address Cleansing with suggestion

    I'm using Global Address Cleanse with the suggestion list enabled. The problem I'm having is that Canadian addresses do not populate the address-related fields (i.e. primary name, number, etc.) until the selection process is complete. This differs from the US engine, which populates the fields with each reply. Is there a way to replicate how the US engine handles suggestion replies within the Global Address Cleanse transform?
    We are using Data Services v 3.2 (12.2.1.2)
    Thanks for your help in advance

    HI,
    I am also facing the same issue with Canadian addresses.
    Did you resolve your issue?
    Thanks,
    Ravi

  • blank values using Global Address Cleanse Transform

    Hi,
    We are trying to cleanse global addresses using the Global Address Cleanse transform (with the USA and Global engines). We pass locality, region, and postal code as multiline items. In the output, some of the records are not populated. For these records, if we set USA as the default country, the fields are populated. The problem is we cannot use USA as the default country, because the data contains global addresses and the transform then fills in USA as the country name for the other countries as well. Why are the fields not populated for some records unless USA is set as the default country?
    Below are some of the sample addresses.
    1)     10 INDUSTR. HWY MS6     LESTER     PA     19029
    2)     PO BOX_22964     JACKSON     MS     39225
    3)     306 EASTMAN     GREENWOOD     MS     38930
    4)     3844 W NORTHSIDE DR     JACKSON     MS     39209
    5)     259 W QIANJIANG RD     ZHEJIANG     CN     31440
    Can you please suggest a way to fill the countries for these addresses? Any inputs on this will be appreciated.
    regards,
    Madhavi

    Hi,
    As Lance indicates, you set up your address cleanse (for US I would suggest using the URAC transform) and map in your input fields as normal.  In the output, you will select to output postcode2 along with all the other standardized fields you want posted in the output.
    Note:  If an address is assignable using the CASS rules established by the USPS to the USPS referential data, the postcode2 will be populated.  In cases where it is not assignable, the postcode2 can be empty or the input postcode2 data could be preserved based on a user's settings.
    Thanks,
    Paula
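Another pragmatic pre-processing step is to derive the country from the region code before the cleanse, so no single default country is needed. The sketch below is hypothetical and illustrative only; the lookup tables are tiny samples, not anything shipped with Data Services, and a real mapping would cover all states, provinces, and ISO codes:

```python
# Illustrative region-code -> country lookup used to pre-fill COUNTRY
# before the Global Address Cleanse transform runs. Both sets are small
# samples for demonstration purposes.
US_STATES = {"PA", "MS", "NY", "CA", "TX"}
ISO_COUNTRIES = {"CN", "DE", "GB", "JP"}

def guess_country(region):
    region = region.strip().upper()
    if region in US_STATES:
        return "US"          # a US state code implies a US address
    if region in ISO_COUNTRIES:
        return region        # the "region" field actually held a country code
    return ""                # leave blank; let the transform decide

print(guess_country("MS"))   # US
print(guess_country("CN"))   # CN
```

Applied to the sample rows above, addresses 1-4 would be pre-filled as US while address 5 (region "CN") would be tagged China, avoiding the blanket USA default.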

  • USA and Canada Engines in Global Address Cleanse

    Hi All, my client has a license for the All World directories. We deployed the All World directories on the BODS 4.0 application. I can see the "ga_all_world_gen" and "ga_country" directories but no others. When I use the Global Address engine it works fine for other countries' data. But when I use the Canada engine I get an error that cancity.dir is missing, and when I use the USA engine I get errors about other missing directories.
    Do we need a separate license to use the Canada or USA engines? Can I cleanse USA and Canada addresses using the Global Address engine?
    Thanks,

    hi,
    I am not sure about the license issue.
    If you are using Global Address Cleanse, USA and Canada addresses can also be cleansed, but only to the country, city, and postal code level. Street-level cleansing is possible if you have the country-specific address directories.

  • Output Language in Global Address Cleansing

    Hi,
    How can the DQ Global Address transform be restricted to return all output field values in English text?
    Even though the input text is in English, the DQ transform returns the address fields in non-English characters, since it is a Hungarian address.
    Is there a way to force it to return English only?
    Many thanks,

    Go to this URL:
    Global Address Cleanse Transform - Enterprise Information Management - SCN Wiki

  • Error while calling ABAP program from Data Services

    Hi All,
    We have an ABAP program which accepts two parameters: 1] a date, 2] a string of comma-separated article numbers.
    We have used an ABAP transform in an ABAP dataflow which refers to this ABAP program.
    If I pass a string of 6 articles as the second parameter, the job executes successfully.
    But if I pass 9 articles as follows:
    $GV_ITEM_VALUES='3564785,1234789,1234509,1987654,1234567,2345678,3456789,4567890,5456759';
    I get the following error:
    ABAP program syntax error: <Literals that take up more than one line are not permitted>.
    The error occurs immediately after the ABAP dataflow starts, i.e. even before the ABAP job gets submitted to ECC.
    I am using BODS 4.2. The datatype of $GV_ITEM_VALUES is varchar(1000).
    The ABAP program generated by the DS job has the following datatype for this parameter:
    PARAMETER $PARAM2(1000) TYPE C
    Is there a different way to pass string characters to the ABAP transform in Data Services?
    I have attached the screen shot of trace log and error
    Regards,
    Sharayu

    Hi Sharayu,
    The error you're getting is because the literal exceeds 72 characters.
    It seems that the length of the string is more than 72 characters.
    Can you check the following in the ECC GUI:
    Go to transaction SE38 => Utilities => Settings => ABAP Editor => Editor => Downwards-Comp. Line Length (72).
    The error occurs because the checkbox which enforces the 72-character line length is checked. Can you uncheck the checkbox and then try passing the parameter $GV_ITEM_VALUES using the BODS job?
    Regards
    Arun Sasi
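If unchecking the editor setting is not an option, another workaround is to split the long value across several shorter global variables, each within the literal length limit. A hypothetical Python sketch of the chunking logic (the limit and variable layout are assumptions, not Data Services behavior):

```python
def chunk_csv(value, limit=72):
    """Split a comma-separated string into pieces no longer than `limit`,
    breaking only at commas so no article number is cut in half.
    A single item longer than `limit` is emitted as its own (oversized) piece."""
    chunks, current = [], ""
    for item in value.split(","):
        candidate = item if not current else current + "," + item
        if len(candidate) <= limit:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = item
    if current:
        chunks.append(current)
    return chunks

ids = "3564785,1234789,1234509,1987654,1234567,2345678,3456789,4567890,5456759"
parts = chunk_csv(ids, limit=24)
# Each part stays within the limit, and joining them with "," restores the original.
```

Each chunk could then be passed as a separate $GV_* literal and reassembled inside the ABAP program.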

  • How to Connect to UniData from Data Services

    Other ETL tools like DataStage have a UniData connector stage.
    Is there anything similar in BusinessObjects Data Services ?

    There is no separate connectivity available out of the box. However, if a JDBC driver is available for it, you can use that to connect to it as a SQL technology. Or you can invoke SSAS web services and process the XML output in ODI.

  • DataServices Global Address Cleansing code 3010 issue

    Hello,
    During the address cleansing we're getting "3010: Locality, region, and postcode are valid. Unable to match primary name to directory." I don't understand what this means, since the address is valid.
    Appreciate your insight
    Cheers
    Tansu

    Hi Tansu,
    Please check this KB Article[1589996|https://service.sap.com/sap/support/notes/1589996].
    Let me know if it helps.
    Thanks,
    George

  • Data Quality - USA Address Cleanse

    I receive the following error when executing the USA Address Cleanse.
    "The job to calculate usage dependencies cannot be executed because the job contains obsolete connection information."
    I read in other blogs that my repository might be corrupt. I recreated the repository using the Repository Manager and added it back to the job server, but still received the above error. I am not doing anything special, just Source > Address Cleanse > Target.
    Has anyone seen this issue before?
    Thank you,
    Lance

    Are you seeing this error when you are executing the job, or while saving the dataflow?
    The calculate-usage-dependencies job uses an internal object. It looks like you changed the password of the repository database, and the connection details stored in the internal object were not updated.
    To update the password of the internal object, go to the Management Console and register the repository; if it is already registered, modify the repository connection details and click Apply. This will update the password of the internal object. You don't have to recreate your repo if it was created successfully earlier.
    What is the DS version?
