Problem with universal data cleanse

I have a problem with Universal Data Cleanse.
I am using DS 3.2.x (12.2.2).
I created:
Dictionary: TEST
Classification: TEST_CL
Custom outputs: category: TEST_CAT
Dictionary Entry:
Primary: BO
Classification: TEST_CL
Gender: Unassigned
Secondary information:
When used as: TEST_CAT
Standard: Business Objects
Rule file:
DataCleanse Rule File v2.0;
TEST_CAT = TEST_CL;
action = TEST_CAT;
TEST_CAT = 1 : TEST_CAT : 1;
end_action
Data Cleanse:
Input: Multiline1
Options:
Parsing Dictionary: TEST
Rule file: .../test_rules.dat
Break On Whitespace Only: Yes
Parser Sequences Multiline1: TEST_CAT
Output:
Parent_component: TEST_CAT1
Generated_field_name: TEST_CAT and RULE_LABEL
Generated_field_class: parsed/standardized
Content_type: none
and EXTRA field
I wanted to replace the word "BO" with the standard "Business Objects", but this is the result: "BO" appears in the EXTRA field,
and the other fields are null.
What am I doing wrong?
Thanks for all help!
P.S. I don't have the cleansing packages installer.

There seem to be a couple of things going on here:
1. If you are using your custom dictionaries, then you have to map your input to MULTILINE1 and enable the custom parsers - just something to be aware of
2. You mentioned that you made some changes to the existing dictionary and you are not seeing any changes. To be clear, do you have different TEST and PRODUCTION environments? Or is it the same environment, except that you have a local DS repository and for the dictionary you are pointing to another repository (using Dictionary --> Manage Connections)?
Having the dictionaries on a different repository should not make any difference as long as you point to them in your Designer using the Dictionary --> Manage Connections option.
So I think there may be some issue with your job setup, and/or the dictionary values need to be looked at. You can start by adding another output field named "EXTRA" to see whether or not your data is getting parsed at all. Also, make sure the entry "CLEANME" is classified as FIRM_NAME_ALONE in the dictionary and that you are selecting the correct dictionary name in the Data Cleanse transform options.

Similar Messages

  • Experimenting with Universal Data Cleanse

    I am working with Universal Data Cleanse and custom dictionaries for a project.
    The goal is to clean the legacy firm information with standards. The issue I am having is: using the dictionary, how do I replace words with the standard word?
    For example, the text contains the word: CLEANME
    I want to have the dictionary replace that with the standard: CLEAN ME
    Both were added, one as a dictionary entry and the other as the Alias or the Standard in that dictionary entry. I get no results.
    Reading the documentation, it seems I should be setting it up like this:
    Dictionary Entry: CleanMe
    Classification: Firm_Name, Firm_Name_Only
    Gender: Unassigned
    Standard: Clean Me
    Thanks for all help!

    There seem to be a couple of things going on here:
    1. If you are using your custom dictionaries, then you have to map your input to MULTILINE1 and enable the custom parsers - just something to be aware of
    2. You mentioned that you made some changes to the existing dictionary and you are not seeing any changes. To be clear, do you have different TEST and PRODUCTION environments? Or is it the same environment, except that you have a local DS repository and for the dictionary you are pointing to another repository (using Dictionary --> Manage Connections)?
    Having the dictionaries on a different repository should not make any difference as long as you point to them in your Designer using the Dictionary --> Manage Connections option.
    So I think there may be some issue with your job setup, and/or the dictionary values need to be looked at. You can start by adding another output field named "EXTRA" to see whether or not your data is getting parsed at all. Also, make sure the entry "CLEANME" is classified as FIRM_NAME_ALONE in the dictionary and that you are selecting the correct dictionary name in the Data Cleanse transform options.

  • Problem with Java Dates and UPDATE for SQL2000

    I am having problems with the date formats for Java. I am trying to put the current date and time into a SQL table; here is the code I am using:
    var Today = new Date()
    var conn = Server.CreateObject( "ADODB.Connection" )
    conn.Open( "Provider=SQLOLEDB;Server=(local);Database=BillTracking;UID=sa;PWD=;")
    var sql = "UPDATE BillAssignments SET DatePosted = " + Today + " WHERE AssignmentID = '" + Request.QueryString("AssignmentID") + "'"
    var rs = conn.execute(sql)
    I keep getting different errors and I have been unable to find a solution yet. I know that I need to change the date format from the Java standard to the one that SQL likes.
    Help....
    Norm...

    Please tell us where the Java part of this comes in. I see that you are using JavaScript to load data via an ADO connection (presumably on an IIS platform), but I do not see where you are using Java.
    Lee
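    If the goal really is to do this update from Java (rather than server-side JavaScript), a parameterized JDBC statement sidesteps the date-formatting problem entirely, because the driver converts the bound value for you. A minimal sketch, assuming the Microsoft SQL Server JDBC driver; the connection URL and credentials are illustrative, only the table and column names come from the post:

        import java.sql.*;

        public class UpdateDatePosted {
            public static void main(String[] args) throws SQLException {
                String assignmentId = args[0]; // e.g. taken from the request
                // Placeholder connection URL; adjust server/credentials for your setup.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:sqlserver://localhost;databaseName=BillTracking", "sa", "");
                     PreparedStatement ps = conn.prepareStatement(
                         "UPDATE BillAssignments SET DatePosted = ? WHERE AssignmentID = ?")) {
                    // Bind the current timestamp instead of concatenating a date string
                    // into the SQL; the driver handles the format SQL Server expects.
                    ps.setTimestamp(1, new Timestamp(System.currentTimeMillis()));
                    ps.setString(2, assignmentId);
                    ps.executeUpdate();
                }
            }
        }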

  • Problem with billing date at closed period

    Hi
    We have a problem with the billing date in a transaction. Previously we could set a different (earlier) billing date than the actual posting period, and now it is not working. We receive an error message that the posting period is closed.
    I attached an example invoice which we managed to post and where you can see what I'm thinking about, parking number: 1024337, billing document: 841835. In this case the billing date was 30.11.2007 but the posting was on 15.07.2008, and we managed to post.
    Please help with this issue.
    yps

    Hi,
    It is due to the posting period being closed for the month 07/2008.
    You need to open the posting period for that month (in FI this is typically done via transaction OB52).
    Regards,
    Greeshma

  • Facing problem with a date column in select query

    Hi,
    I am facing a problem with a date column. Below is my query; it is failing with "invalid number format model".
    Query: SELECT *
    FROM EMP
    WHERE trunc(LAST_UPDATED) >= to_date(to_char(22-05-2009,'dd-mm-yyyy'),'dd-mm-yyyy')
    LAST_UPDATED column is "DATE" data type.
    Please help me. Thanks

    Radhakrishna Sarma wrote:
    SeánMacGC wrote:
    WHERE LAST_UPDATED >= to_date('22-05-2009','dd-mm-yyyy');
    You do not need the TRUNC here in any case.
    I don't think so. What if the user wants only the data for 22nd May and the table has records with dates later than the 22nd as well? In that case your query will not work. In order for the index to be used, I think the query needs to be written differently.
    I think Sean is right, though. Use of the TRUNC function is pointless given the condition here, since the TO_DATE call used by the OP will always point to midnight of the specified date, in this case 22-05-2009 00:00:00.
    Regards,
    Jo
    Edit: I think Sean proved his point... ;)
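    If the goal is to match exactly one day while still letting an index on LAST_UPDATED be used, the usual rewrite is a half-open range on the bare column. A minimal JDBC sketch of that predicate (the connection URL and credentials are placeholders, not from the thread):

        import java.sql.*;
        import java.time.LocalDate;

        public class DateRangeQuery {
            public static void main(String[] args) throws SQLException {
                // Placeholder connection details; adjust for your environment.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//localhost:1521/ORCL", "scott", "tiger")) {
                    // Half-open range [22-May-2009, 23-May-2009): matches the whole
                    // day without wrapping LAST_UPDATED in TRUNC, so an index on
                    // the column can still be used.
                    String sql = "SELECT * FROM EMP "
                               + "WHERE LAST_UPDATED >= ? AND LAST_UPDATED < ?";
                    try (PreparedStatement ps = conn.prepareStatement(sql)) {
                        LocalDate day = LocalDate.of(2009, 5, 22);
                        ps.setDate(1, java.sql.Date.valueOf(day));
                        ps.setDate(2, java.sql.Date.valueOf(day.plusDays(1)));
                        try (ResultSet rs = ps.executeQuery()) {
                            while (rs.next()) {
                                System.out.println(rs.getString(1));
                            }
                        }
                    }
                }
            }
        }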

  • Problem with loading data to Essbase

    Hi All,
    I have a problem with loading data into Essbase. I've prepared a MaxL script to load the data, calling a rule file. The source table is located in an Oracle RDBMS. The script works correctly, i.e. it generally loads data into Essbase.
    But the problem lies in the fact that after deleting the data from Essbase, when I try to load it again from the source table I get the message: WARNING - 1003035 - No data values modified by load of this data file - although there is no data in Essbase... I've also tried changing the load mode from 'overwrite' to 'add to existing values' (in the rule file) but it doesn't help... Any ideas what I can do?

    Below are a few lines from EPM_ORACLE_INSTANCE/diagnostics/logs/essbase/dataload_ODL.err:
    [2013-09-24T12:01:40.480-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received Validate Login Session request
    [2013-09-24T12:01:40.482-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received client request: Get App and Database Status (from user [admin@Native Directory])
    [2013-09-24T12:01:54.488-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: MaxL: Execute (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Received client request: MaxL: Describe (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [MLEXEC-2] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Output columns described
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Define (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [MLEXEC-3] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Record(s) fetched
    [2013-09-24T12:01:54.496-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.498-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received Validate Login Session request
    [2013-09-24T12:01:54.499-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: Get Application State (from user [admin@Native Directory])

  • Problem with the date conversion

    Hi Friends,
    I am facing a problem with date conversion. My requirement is to pass the date to the screen based on the user settings (SU01).
    I fetched the user's date format setting by using the function module SUSR_GET_USER_DEFAULTS. This function module picks up the exact user date setting (such as MM/DD/YYYY, MM.DD.YYYY, DD.MM.YY).
    After that I used the function module FORMAT_DATE_4_OUTPUT to convert between the user's date format and the system date format.
    For English the function module FORMAT_DATE_4_OUTPUT works fine, but it is not supported for other languages.
    Can you please provide a function module for converting dates according to the user's settings?
    This function module is very important for us.
    Thanks
    Charan
    Moderator message: date conversion questions = FAQ, please search before posting.
    Edited by: Thomas Zloch on Dec 21, 2010 2:19 PM

    Hope this logic helps you.
    DATA LF_DATE    TYPE DATS VALUE '20101221'. " 21-Dec-2010; DATS stores dates internally as YYYYMMDD
    DATA LF_DATE_BI(10).
    WRITE LF_DATE TO LF_DATE_BI.  "Now LF_DATE_BI contains the date formatted per the user's settings
    "Now populate the value LF_DATE_BI to the screen field

  • PSP: problems with viewing data

    Hello.
    I'm currently working on an online shop and have some problems with viewing data from the database. When there are not many inserts into the table it works very well. But after running all the INSERTs I have, it acts weird.
    Sample with 10 INSERTS:
    http://gafgarion.atspace.com/psp/1.jpg
    Sample with 100 INSERTS:
    http://gafgarion.atspace.com/psp/2.jpg
    I'm using Oracle 9i. When I have more data in my database it acts weird. There is a SELECT from only one table, but sometimes I get data from other tables as well.
    I didn't touch any config files or anything else. I only created a new user and DAD.
    Any ideas what I should do to fix that?
    Thanks in advance

    Hello,
    My guess is that you are speaking about PL/SQL Server Pages (PSP) and the PL/SQL Web Toolkit.
    This is why I do not think that you will get many answers, since this forum is targeted at Web Services developers (XML, SOAP, and so on).
    I invite you to ask your question on the general Oracle Application Server or PL/SQL forums.
    Regards
    Tugdual Grall

  • Problems with retrieving data from tables with 240 and more records

    Hi,
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    I installed Oracle 11.2.0 Client and I started to have problems with retrieving data from tables.
    First I used the same connection string, driver and so on (O10 Oracle 10g) then I tried ORA Oracle but with no luck. The result is like this:
    I'm able to connect to database. I'm able to retrieve data but from small tables (e.g. with 110 records it works perfectly using both O10 and ORA drivers). When I try to retrieve data from tables with like 240 and more records retrieval simply hangs (nothing happens at all - no error, no timeout). Application seems to hang forever.
    I'm using Powerbuilder to connect to Database (either PB10.5 using O10 driver or PB12 using ORA driver). I used DBTrace, so I see that query hangs on the first FETCH.
    So for the retrievals that hang I have something like:
    (3260008): BIND SELECT OUTPUT BUFFER (DataWindow):(DBI_SELBIND) (0.186 MS / 18978.709 MS)
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=1
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): EXECUTE:(DBI_DW_EXECUTE) (192.982 MS / 19171.691 MS)
    (3260008): FETCH NEXT:(DBI_FETCHNEXT)
    and this is the last line,
    while for retrievals that complete, I see each FETCH producing a time, data in the buffer, and movement to the next FETCH until all data is retrieved.
    On the side note, I have no problems with retrieving data either by SQL Developer or DbVisualizer.
    Problems started when I installed 11.2.0 Client. Even if I want to use 10.0.1 Client, the same problem occurs. So I guess something from 11.2.0 overrides 10.0.1 settings.
    I will appreciate any comments/hints/help.
    Thank you very much.

    pgoel wrote:
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    Earlier (before installing the new client), did you ever try retrieving data from big tables (like 240 and more records), and if yes, was it working?
    Yes, with the Oracle 10g client (before installing 11g) I was able to retrieve any data, whether it was 10k+ records or 100 records. Installing the 11g client changed something, so that even using the old 10g client (which I still have installed) fails to work. The same problem occurs whether I use the 10g or the 11g client now. PowerBuilder hangs on retrieving tables with more than about 240 records.
    Thanks.

  • Facing lot of problems with the DATA object  -- Urgent

    Hi,
    I am facing a lot of problems with the data object in VC.
    1. I created the RFC initially and then imported the data object into VC. Later I made some modifications to the RFC function module, and when I reload the data object I am not able to see the new changes done to the RFC in VC.
    2. Even if I delete the function module, after redeploying the iView the results still get displayed.
    3. How stable is VC?
      I restarted the SQL server, and the portal connection to R3 was also made afresh... still I am seeing such surprising results.
    Please let me know what might be the problem.

    Hi Lior,
    Are you aware of this problem?
    If yes, please let me know...
    Thanks,
    Manjunatha.T.S

  • Problem with a data set: DIAdem crashes

    Hi,
    I've got a problem with a data set. When I want to zoom in DIAdem-View, DIAdem crashes with the following message (translated from German ;-):
    error type: FLOAT INEXACT RESULT or FLOAT INVALID OPERATION or FLOAT STACK CHECK
    error address: 00016CB8
    module name: gfsview.DLL
    I've got some similar data sets that do not show such problems. Further, I scanned the data a bit, but among the 59000 points I didn't see anything special. I also tried to delete the "NOVALUE"s, but after that there still exist "NOVALUE"s.
    Does anyone have an idea what to look for?
    Thanks,
    Carsten

    Carsten,
    Could you please upload your Citadel database to the following FTP site:
    ftp.ni.com/incoming
    If you want to compress (ZIP) and/or put a password on the data, that's fine. Please send me a private email at [email protected] (with the file name and password if you put one on the file) once you have uploaded the file and I will check it out.
    Otmar
    Otmar D. Foehner
    Business Development Manager
    DIAdem and Test Data Management
    National Instruments
    Austin, TX - USA
    "For an optimist the glass is half full, for a pessimist it's half empty, and for an engineer is twice bigger than necessary."

  • Problem with input data format - not "only" XML

    Hi Experts,
    I have a problem with the input data format.
    I get some data from JMS (MQSeries), but the input format is not clean XML.
    It is something like a flat file with XML content inside...
    Example:
    0000084202008-11-0511:37<?xml version="1.0" encoding="UTF-8"?>
    <Document xmlns="urn:xsd:test.01" xmlns:xsi=http://www.w3.org/2001/XMLSchema-instance Sndr="0001" Rcvr="SP">
    ...content....
    </Document>000016750
    The problems in this file are:
    1. data before the XML declaration <?xml version="1.0"?> -> 0000084202008-11-0511:37
    2. data after the closing tag </Document> -> 000016750
    This data breaks the XML format.
    Unfortunately XI is not the only receiver of these files, and we can't change the file format in the MQSeries queue (before it gets to XI).
    My goal is to get XML from this file:
    <?xml version="1.0" encoding="UTF-8"?>
    <Document xmlns="urn:xsd:test.01" xmlns:xsi=http://www.w3.org/2001/XMLSchema-instance Sndr="0001" Rcvr="SP">
    ...content....
    </Document>
    My questions:
    1. Is there any way or technique in XI to delete the data 0000084202008-11-0511:37 from this file?
    2. Is there any way to get only the XML from this file?
    Thanks.
    Regards,
    Seo

    Hi Buddy
    Which XI adapter are you using?
    If you use the inbound file adapter with content conversion, you could replace these values with none and then pass the result on to the scenario.
    Does that help?
    Regards
    cK
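    If content conversion cannot strip the wrapper, another option is a Java mapping (or adapter module) that cuts the payload down to the XML document itself. A minimal sketch, assuming the root element is always <Document> as in the example and the payload is UTF-8:

        import java.nio.charset.StandardCharsets;

        public class XmlExtractor {
            // Cuts the leading length/timestamp prefix and the trailing counter,
            // keeping only the XML between the declaration and the closing root tag.
            public static String extractXml(byte[] raw) {
                String s = new String(raw, StandardCharsets.UTF_8);
                int start = s.indexOf("<?xml");
                int end = s.lastIndexOf("</Document>");
                if (start < 0 || end < 0) {
                    throw new IllegalArgumentException("no XML document found in input");
                }
                return s.substring(start, end + "</Document>".length());
            }
        }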

  • Spotify and facebook problem with the date

    Hi, everyone. My Spotify account has a problem with the date. I changed the date on Facebook and nothing happened; the Spotify account still has the wrong date, not the same as the Facebook account.

    Hi ,
    By default, an RFC accepts dates in SQL date format (yyyy-mm-dd). If you are using a date picker from Web Dynpro, it sets the date directly in SQL date format. In case you are trying to pass a date to the RFC in some other way, you have to convert it into SQL date format before passing it.
    If you are passing a String date in the format dd-mm-yyyy, you can try this method to convert it to a SQL date and pass it to your RFC.
    public java.sql.Date sqlDateConvert(String date) {
        //@@begin sqlDateConvert()
        // Requires java.util.StringTokenizer and java.util.Calendar.
        java.sql.Date dateObj = null;
        try {
            StringTokenizer tempStringTokenizer = new StringTokenizer("" + date, "-");
            int dd = Integer.parseInt(tempStringTokenizer.nextToken().trim());
            int mm = Integer.parseInt(tempStringTokenizer.nextToken().trim());
            mm = mm - 1; // Calendar months are zero-based
            int yyyy = Integer.parseInt(tempStringTokenizer.nextToken().trim());
            Calendar cal = Calendar.getInstance();
            cal.set(yyyy, mm, dd);
            dateObj = new java.sql.Date(cal.getTime().getTime());
        } catch (Exception e) {
            // fall through and return null for a malformed input
        }
        return dateObj;
        //@@end
    }
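    For example (hypothetical input), sqlDateConvert("21-12-2010") would return the SQL date 2010-12-21, which can then be bound to the RFC input.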
    Hope this will help you.

  • I have a problem with Cellular Data Network: it is not appearing in iPhone Settings, so help me bring this option back on the iPhone

    I have a problem with Cellular Data Network: it is not appearing in iPhone Settings, so please help me bring this option back on my iPhone.

    What brand/model USB drive? Is it bus or AC powered?
    On Mail...
    First quit Mail, then I'd back up these two Mail folders by right-clicking on them in the Finder and choosing Archive/Compress.
    Users/YourUserName/Library/Mail
    Users/YourUserName/Library/Mail Downloads
    (Could be a different folder here if you chose such in Mail Prefs)
    Right-click on that Mail folder and choose Archive; you'll get everything in the folder, and the folder itself, in a file called Mail.zip. Move it to a safe place; do the same for the Mail Downloads folder... only the plist is separate.
    /Users/YourUserName/Library/Preferences/com.apple.mail.plist

  • Is anyone else having problems with Apertures Date fields?

    After experiencing a systems failure I began restoring my images from backups.  I have had no end of problems with image dates.  I have come to the determination that Aperture is NOT using either the "Create Date" or "DateTimeOriginal" as the image "Date" or "Date Created" data if other fields hold other dates.
    Here is a clip from my image metadata using exiftool:
    File Modification Date/Time     : 2012:09:23 19:53:15-04:00
    File Access Date/Time           : 2013:01:16 11:34:38-05:00
    File Inode Change Date/Time     : 2013:01:16 11:31:32-05:00
    Create Date                          : 2006:02:20 11:51:12.10
    Date/Time Original              : 2006:02:20 16:51:12.10
    Modify Date                          : 2006:02:20 11:51:12.10
    And here is what Aperture utilized on import
    Date:                               9/23/12 7:53:15 PM EDT
    Date Created                         9/23/2012 7:53:15 PM
    So Aperture utilized the File Modification Date/Time as the Create Date, despite the fact that the Create Date field is present and properly formatted in the original image.
    According to the Aperture mapping table, this shouldn't be happening.
    Now, before someone recommends that I use Aperture's Date Adjust utility - I'm talking about slightly over 30,000 images. Editing images one at a time, or in blocks, when you don't know which field Aperture is using as the Create Date, would require individual inspection of each image followed by manual adjustment of each image. That approach isn't acceptable.

    Hmm, ok, let me restate the issue then; I thought I was clear.  I could have pointed out in my original post, however, that only the EXIF and IPTC date fields were displayed from the metadata dump.
    The data set presented in my first post is the EXIF dataset from the file which clearly shows the image was captured by digital camera at
    Create Date                          : 2006:02:20 11:51:12.10
    Date/Time Original              : 2006:02:20 16:51:12.10
    (The delta of 5 hours is the result of Zulu versus local time offset)
    But when I ingested it into Aperture, the import routine utilized the field
    File Modification Date/Time     : 2012:09:23 19:53:15-04:00
    which is updated by the OS whenever you move the file around outside of Aperture (a number of file metadata fields are updated by the OS - filename, for example, is another). This resulted in the image date fields being stamped in Aperture as:
    Date:                               9/23/12 7:53:15 PM EDT
    Date Created                         9/23/2012 7:53:15 PM
    which obviously came from the File Modification Date/Time field and not from either the Create Date or the Date/Time Original field.
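    For bulk triage, it may help to check outside Aperture which EXIF capture dates each file actually carries before re-importing. A minimal sketch using the open-source metadata-extractor library (my assumption; any EXIF reader would do):

        import com.drew.imaging.ImageMetadataReader;
        import com.drew.metadata.Metadata;
        import com.drew.metadata.exif.ExifSubIFDDirectory;
        import java.io.File;
        import java.util.Date;

        public class ExifDateCheck {
            public static void main(String[] args) throws Exception {
                for (String path : args) {
                    Metadata metadata = ImageMetadataReader.readMetadata(new File(path));
                    ExifSubIFDDirectory dir =
                            metadata.getFirstDirectoryOfType(ExifSubIFDDirectory.class);
                    // Null means the file carries no EXIF capture date at all, in
                    // which case a fallback to file modification time is expected.
                    Date original = (dir == null)
                            ? null
                            : dir.getDate(ExifSubIFDDirectory.TAG_DATETIME_ORIGINAL);
                    System.out.println(path + " -> DateTimeOriginal: " + original);
                }
            }
        }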
