Running a Data cleansing project through ODI

Hello everyone,
I'm trying to run a data cleansing project in my ODI package. I have already exported the project, and when I run the task it looks fine in ODI Operator, but nothing happens to my output file in Data Quality.
All the configurations seem to be OK.
Can anybody help?
thanks,
Questbr

Hi Balaji,
Are you able to run that ODQ project independently?
Regards,
Rathish A M

Similar Messages

  • Running a DQ project through ODI

    Hello everyone,
    I'm trying to run a data cleansing project in my ODI package. I have already exported the project, and when I run the task it looks fine in ODI Operator, but nothing happens to my output file.
    All the configurations seem to be OK.
    Can anybody help?
    thanks,

  • Error while using Rule file in loading data into Essbase through ODI

    Hi Experts,
    I am facing a problem while loading data into Essbase. I am able to load data into Essbase successfully, but when I use a rule file to add values to existing values I get an error.
    "test" is my rule file.
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
         at java.lang.Thread.run(Thread.java:662)
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "done executing query"
    #load the data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "done loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me on this...
    Thanks & Regards,
    Chinnu

    Hi Priya,
    Thanks for the reply. I already checked that no locks are held on the rule file. I don't know what the problem is. It works fine without the rule file, but throws an error only when the rule file is used.
    Please help on this.
    Thanks,
    Chinnu

  • Error in loading data into essbase while using Rule file through ODI

    Hi Experts,
    Referring to my previous post, Error while using Rule file in loading data into Essbase through ODI:
    I am facing a problem while loading data into Essbase. I am able to load data into Essbase successfully, but when I use a rule file to add values to existing values I get an error.
    "test" is my rule file.
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
    at java.lang.Thread.run(Thread.java:662)
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "done executing query"
    #load the data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "done loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me on this...
    Thanks & Regards,
    Chinnu

    Hi Priya,
    Thanks for the reply. I already checked that no locks are held on the rule file. I don't know what the problem is. It works fine without the rule file, but throws an error only when the rule file is used.
    Please help on this.
    Thanks,
    Chinnu

  • Extracting data from Essbase & loading into flat file through ODI

    Hi,
    I want to extract data from Essbase and load it into a flat file through ODI (for extraction from Essbase I'm using a report script). I'm using these KMs: LKM Hyperion Essbase DATA to SQL, IKM SQL to FILE Append, and for reversing, RKM Hyperion Essbase. All the mappings have been done and the interface has been built, but when I execute the interface it throws the error below:
    ODI-1217: Session ESS_FILEI (114001) fails with return code 7000.
    ODI-1226: Step ESS_FILEI fails after 1 attempt(s).
    ODI-1240: Flow ESS_FILEI fails while performing a Loading operation. This flow loads target table ESS_FILE.
    ODI-1228: Task SrcSet0 (Loading) fails on the target FILE connection FILE_PS_ODI.
    Caused By: java.sql.SQLException: ODI-40417: An IOException was caught while creating the file saying The system cannot find the path specified
    at com.sunopsis.jdbc.driver.file.impl.commands.CommandCreateTable.execute(CommandCreateTable.java:62)
    at com.sunopsis.jdbc.driver.file.CommandExecutor.executeCommand(CommandExecutor.java:33)
    at com.sunopsis.jdbc.driver.file.FilePreparedStatement.execute(FilePreparedStatement.java:178)
    at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:619)
    Please let me know what I'm missing and how I can resolve this error.
    Thanks

    It seems that you are trying to use the file as your staging area. The Hyperion LKM extracts Essbase data into a database staging area, which your file IKM can then read to load the flat file.
    You need to use an RDBMS as your staging area.
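    For illustration, here is a minimal Jython sketch of that two-step flow once a relational staging area is in place: the LKM has already populated a C$ work table, and this step reads it and appends the rows to the flat file, much as the file IKM would. The staging table name C$_0ESS_FILE and the output path are hypothetical; in practice the IKM generates the equivalent step for you.
    # Hypothetical sketch of the IKM-side step: read the C$ staging table the
    # LKM populated and append its rows to the target flat file.
    from java.io import BufferedWriter, FileWriter

    sql = 'select * from STAGING."C$_0ESS_FILE"'    # hypothetical staging table
    srcCx = odiRef.getJDBCConnection("SRC")         # staging-area connection
    stmt = srcCx.createStatement()
    rs = stmt.executeQuery(sql)
    cols = rs.getMetaData().getColumnCount()

    out = BufferedWriter(FileWriter("C:/temp/ess_file.txt", True))  # hypothetical target file
    while rs.next():
        # nulls come out as the text 'None' in this rough sketch
        fields = [str(rs.getObject(i)) for i in range(1, cols + 1)]
        out.write(";".join(fields))
        out.newLine()

    out.close()
    rs.close()
    stmt.close()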

  • ODI: How determine if auto load is complete and all data has gone through

    Hi All,
    I am currently executing the below scenario and following http://docs.oracle.com/cd/E26070_11/otn/pdf/integration/E26075_01.pdf for the same.
    SRC: AGILE
    TRG: OBIEE
    As per the above, I get out-of-the-box packages which are used and executed. Since this comes out of the box, how do I determine whether my data load is complete and all the data has gone through from SRC -> ODM -> TRG?
    My Operator sessions show success, and I don't see any errors in the logs either.
    Any help? If anybody has already worked through this scenario, please help.
    Cheers,
    Mak
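    One simple sanity check, beyond the Operator status, is to compare row counts between a source table and its target once the out-of-the-box package has finished. Below is a minimal Jython sketch, assuming it runs as an ODI procedure step where odiRef is available; the two table names are placeholders, not the actual AGILE or OBIEE objects.
    # Hypothetical post-load check: compare source and target row counts.
    srcCx = odiRef.getJDBCConnection("SRC")
    tgtCx = odiRef.getJDBCConnection("DEST")

    def rowCount(cx, table):
        stmt = cx.createStatement()
        rs = stmt.executeQuery("select count(*) from " + table)
        rs.next()
        n = rs.getInt(1)
        rs.close()
        stmt.close()
        return n

    srcRows = rowCount(srcCx, "AGILE_SOURCE_TABLE")   # placeholder name
    tgtRows = rowCount(tgtCx, "OBIEE_TARGET_TABLE")   # placeholder name
    print("source rows: %d, target rows: %d" % (srcRows, tgtRows))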

    So who do I believe?
    Blog:  http://www.jamoroki.com
    James King
    about.me/jamoroki

  • [Resolved][ODQ] Error Running Exported Batch Script Outside of ODI

    Dear ODI experts,
    I have installed 10.1.3.4.0 ---> selected "Oracle Data Integrator, Oracle Data Profiling, Oracle Data Quality 10.1.3.4.0" ---> Complete (1012 MB)
    - followed all steps of the "Design a Name and Address Cleansing Project" section in the Sample Tutorial
    - in the "Oracle DQ User Interface", right-clicked Projects->Quality->customer master[1]
    - selected "Run" from the context menu
    - the background tasks completed 100% properly
    - exported the project as a batch script
    - made changes to config.txt, runProjectN.cmd, and the *.stx files as specified in the Sample Tutorial document
    - at step 13-f, got the following errors when executing runProjectN.cmd
    Wed Nov 28 17:18:37 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e7_us_globrtr_p2.dat>. errno = 2; No such file or directory Occurred in Transformer - (cl_open).
    I did a full search in ODI folders and couldn't find "e7_us_globrtr_p2.dat".
    I tried both exporting with data and exporting without data (with the proper change to specify the input file path), and got the same error.
    I would appreciate any suggestions or hints to help me debug or fix this issue.
    Regards,
    Roy
    The following is a complete message echo of runProjectN.cmd
    ===========================================================================
    D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\scripts>runproject1.cmd
    D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\scripts>set TS_PROJECT=D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1
    D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\scripts>set TS_CONFIG=D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\scripts\config.txt
    D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\scripts>set TS_SETTINGS=D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings
    D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\scripts>set TS_DATA=D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data
    D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\scripts>cd /D D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call tranfrmr D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e5_tranfrmr_p1.stx
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call globrtr D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e6_globrtr_p2.stx
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call tranfrmr D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e9_tranfrmr_p3.stx
    Wed Nov 28 17:18:37 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e7_us_globrtr_p2.dat>. errno = 2; No such file or directory Occurred in Transformer - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call cusparse D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e10_cusparse_p4.stx
    Wed Nov 28 17:18:37 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e9_us_tranfrmr_p3.dat>. errno = 2; No such file or directory Occurred in Customer Data Parser - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call tsqsort D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e11_srtforpm_p5.stx
    Wed Nov 28 17:18:37 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e10_us_cusparse_p4.dat>. errno = 2; No such file or directory Occurred in tsqsort - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call uspmatch D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e12_pmatch_p6.stx
    Wed Nov 28 17:18:37 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e11_us_srtforpm_p5.dat>. errno = 2; No such file or directory Occurred in CGeoIO::initGIO - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call winkey D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e13_winkey_p7.stx
    Wed Nov 28 17:18:38 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e12_us_pmatch_p6.dat>. errno = 2; No such file or directory Occurred in CWKD::initWKD - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call tsqsort D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e14_srtforrl_p8.stx
    Wed Nov 28 17:18:38 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e13_us_winkey_p7.dat>. errno = 2; No such file or directory Occurred in tsqsort - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call rellink D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e15_rellink_p9.stx
    Wed Nov 28 17:18:38 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e14_us_srtforrl_p8.dat>. errno = 2; No such file or directory Occurred in CMatcher::InitMatcher - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call common D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e16_common_p10.stx
    Wed Nov 28 17:18:39 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e15_us_rellink_p9.dat>. errno = 2; No such file or directory Occurred in Create Common - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>call datarec D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings\e17_datarec_p11.stx
    Wed Nov 28 17:18:39 2007 - 03503E ERROR: Can't open file: <D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data/e16_us_common_p10.dat>. errno = 2; No such file or directory Occurred in CReCons::initRC - (cl_open).
    D:\Bin\Oracle\ODIDQ10134\oracledq\quality_server\tsq11r0s\Software\bin>
    ===========================================================================
    Solution:
    You need to modify "e9_tranfrmr_p3.stx" to point to "e6_nomatch_globrtr_p2_us.dat".
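    To spot this kind of mismatch quickly, a small script can list every .dat file referenced by the exported .stx settings and flag the ones missing from the data directory. This is a hypothetical helper, not part of the ODQ export, and it assumes the exported settings files are readable as plain text (as they appear to be in the tutorial export); the directory paths are the ones from the batch output above.
    # Hypothetical helper: cross-check the .dat files referenced by the exported
    # .stx settings against the files actually present in the project data folder.
    import os
    import re

    settings_dir = r"D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\settings"
    data_dir = r"D:\Bin\Oracle\ODIDQ10134\demo\oracledq\projects\oracledq\project1\data"

    # Files that actually exist in the project data folder
    existing = set(name.lower() for name in os.listdir(data_dir))

    for stx in sorted(os.listdir(settings_dir)):
        if not stx.lower().endswith(".stx"):
            continue
        with open(os.path.join(settings_dir, stx)) as fh:
            text = fh.read()
        # Grab anything that looks like a .dat file name referenced in the settings
        for dat in sorted(set(re.findall(r"[\w.\-]+\.dat", text, re.IGNORECASE))):
            status = "OK" if dat.lower() in existing else "MISSING"
            print("%-28s -> %-32s %s" % (stx, dat, status))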

    It seems the DRM service went down; try running the batch after stopping and restarting the service.

  • Error while Loading Entity Dimension through ODI

    Hi,
    When I try to load the outline for the Entity dimension into Planning (11.1.1.0) through ODI, I get the following error:
    7000 : null : java.sql.SQLException: Invalid COL ALIAS "STORAGE C2_DATA_STORAGE" for column "DATA"
    java.sql.SQLException: Invalid COL ALIAS "STORAGE C2_DATA_STORAGE" for column "DATA"
         at com.sunopsis.jdbc.driver.file.bb.b(bb.java)
         at com.sunopsis.jdbc.driver.file.bb.a(bb.java)
         at com.sunopsis.jdbc.driver.file.w.e(w.java)
         at com.sunopsis.jdbc.driver.file.w.executeQuery(w.java)
         at com.sunopsis.sql.SnpsQuery.executeQuery(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Kindly help.
    -Jitendra

    Hi,
    Send over the models/interface and I will have a look.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Data Cleansing Terms Clarification

    Folks,
    Another question and thanks again to all those that have been helpful so far.
    In a data flow that I am building I have both customer information and address information. I plan on running my data through both an English base data cleanse and a Canada address cleanse.
    I notice that there are many options for the same output field, the difference being the generated field class. I am confused about the difference between Parsed and Standardized with regard to the base data cleanse.
    I am also confused that there are extra types such as alternate or none. In addition, when dealing with address cleansing there are other items, such as generated field category and generated field addrclass, which also seem to have an impact on the data that is output from these transforms.
    I would like my transforms to correct erroneous data and also, if possible, add items such as city or province when they are missing, based on the postal code that is present, for example. I have gone through both the Designer Guide and the Reference Guide with regard to these terms, and it has left me more confused than anything.
    Which way of implementing this would be best?
    Thanks in advance,
    Bill
    Edited by: William Grdovich on Oct 5, 2010 4:13 PM

    Bill,
    I notice that there are many options for the same output field. The difference being the generated field  class. I am confused as to what is the difference between Parsed and Standardized in regards to the base data cleanse.
    Parsed means that the address has been separated into its components (house number, street name, city, etc.). Standardized means that the address has been parsed, and each component value has been corrected, updated or enhanced.
    Also I am confused that there are extra types such as alternate or none.
    An alternate type means that an alternate value is available. For example, in New York City, 6th Ave (the official name) is also known as Avenue of the Americas (the alternate name). If a field has type 'None', it means that there is only one type associated with this field.
    Also in dealing with address cleansing there are other items such as generated field category and generated field addrclass which also seem to have an impact on the data that is output from these transforms.
    A description of the field category columns is provided in the SAP Business Objects Data Services Reference Guide, Data Quality Fields, Global Address Cleanse fields.
    I would like my transforms to correct error data and also if it is possible add items such as city or province when they are missing based on the postal code which is there for example.
    Use Generated Field Class 'Best', Generated Field Category 'Component', Generated Field Addrclass 'Official'. If your selected output field has Generated Field Class 'None' then use Generated Field Category 'Standardized'.
    Paul

  • QUESTION:  Essbase data extraction and Installing ODI Agent??

    For extracting data from Essbase cubes, ODI has "LKM Hyperion Essbase DATA to SQL".
    We can use (1) a report script, (2) an MDX query, or (3) a calc script.
    For data extraction using a calc script, the ODI agent must be running on the same server as the Essbase server.
    Does anyone know whether the ODI agent needs to be on the Essbase machine if we use the MDX query method for data extraction?
    We would like to avoid installing an ODI agent just for Essbase data extraction.

    Thanks John.
    One related question: to move data from one Essbase cube to another using an ODI interface, can we do it efficiently through an MDX query?
    We want to avoid Replicated-partitioning OR CalcScripts, if possible.
    BTW... Your ODI/Hyperion blog is a bible for us.
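    For illustration, this is the sort of MDX the LKM would submit for such an extraction, shown here as a Jython string in the style of the generated steps quoted above. The members and the Sample.Basic cube are just the standard demo outline, not a real application, and as far as I understand it the result set comes back over the Essbase connection rather than as a file written on the Essbase host (which is where the calc-script co-location requirement comes from).
    # Illustrative only: an MDX extraction query held as a Jython string,
    # using the standard Sample.Basic demo members.
    mdx = """
    SELECT
      {[Measures].[Sales]} ON COLUMNS,
      CrossJoin({[Year].Children}, {[Market].Children}) ON ROWS
    FROM Sample.Basic
    """
    print(mdx)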

  • Need to Design a project through SSIS

    Dear All,
    I have a project which I need to complete with maximum use of SSIS.
    The task is: there is one source DB which has many tables, and on the other side one destination DB with the same number of tables as the source DB.
    I want that, on a particular date, all tables of the source DB should be updated into the destination tables. For example, if there is a table source.x with 10 records, then the same records should be inserted into destination.x, and after that an alert should be sent
    with an Excel attachment which has all the information for those records along with the table names.
    I want to complete this through SSIS only.
    Please help.
    Thanks,

    Is this your homework? The hint is to use a query on the destination tables which filters the data on the particular date.
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • Error while loading Data into Essbase using ODI

    Hi,
    I am very new to ODI. I have installed ODI and am working in the demo environment only. I haven't done any configuration. I am using the Essbase technology that comes by default.
    I have created a sample outline in Essbase and a text file to load data into Essbase using ODI.
    Following is my text file:
    Time     Market     Product     Scenario     Measures     Data
    Jan     USA     Pepsi     Actual     Sales     222
    I am getting an error. I have checked in Operator; it fails at step 6, i.e. "Integration SampleLoad data into essbase".
    Here is the description.
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C3_C1 ""Time"",C5_C2 ""Market"",C2_C3 ""product"",C6_C4 ""Scenario"",C1_C5 ""Measures"",C4_C6 ""Data"" from "C$_0Demo_Demo_genData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    stmt.setFetchSize(srcFetchSize)
    rs = stmt.executeQuery(sql)
    #load the data
    stats = pWriter.loadData(rs)
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me to proceed further...

    Hi John,
    Here is the error message in the Execution tab:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 20, in ?
    java.sql.SQLException: Unexpected token: TIME in statement [select   C3_C1    ""Time]
         at org.hsqldb.jdbc.jdbcUtil.sqlException(jdbcUtil.java:67)
         at org.hsqldb.jdbc.jdbcStatement.fetchResult(jdbcStatement.java:1598)
         at org.hsqldb.jdbc.jdbcStatement.executeQuery(jdbcStatement.java:194)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx4.f$0(<string>:20)
         at org.python.pycode._pyx4.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    java.sql.SQLException: java.sql.SQLException: Unexpected token: TIME in statement [select   C3_C1    ""Time]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
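    Looking at the trace above, the unexpected token comes from the doubled quotes in the generated select: ""Time"" closes the quoted alias immediately and leaves a bare TIME token, which the HSQLDB engine behind the staging area refuses to parse. Purely for illustration (this is not necessarily the poster's fix; it only shows why the parser stops at TIME), the same statement reads cleanly with a single pair of double quotes around each alias:
    # Illustrative only: aliases written as single-pair delimited identifiers.
    sql = """select C3_C1 "Time", C5_C2 "Market", C2_C3 "product",
           C6_C4 "Scenario", C1_C5 "Measures", C4_C6 "Data"
           from "C$_0Demo_Demo_genData" where (1=1)"""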

  • Error running Execution plan for 'Project Analytics in PeopleSoft 9.0 '

    Hi ,
    I am running the execution plan for Projects PeopleSoft 9.0 (BI Apps version 7.9.6).
    The issue is with the file-type data source going through to the database (data warehouse); the tasks that completed with SUCCESS were from the file source (SrcFiles) only.
    The error is below.
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SDE_PSFT_90_Adaptor:SDE_PSFT_ExchangeRateDimension_Full:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SDE_PSFT_ExchangeRateDimension_Full
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    One more piece of information, if it helps: the values defined in the Parameters of the execution plan are:
    1  DATASOURCE  DBConnection_OLAP    is 'DataWarehouse'
    1  DATASOURCE  DBConnection_OLTP    is 'PSFT_9_0_FINSCM'
    1  DATASOURCE  FlatFileConnection   is 'PSFT_9_0_FlatFile'
    The 'Relational Connection' in Informatica Repository Manager is currently pointing only to 'PSFT_9_0_FINSCM' and 'DataWarehouse'.
    Please let me know what is wrong.
    Regards,
    JK

    Hi ,
    I created the PSFT connection under 'Application Connections' in Repository Manager.
    Also, the Informatica server's tnsnames.ora should have entries for the source (the PSFT database) and for the data warehouse.
    thanks,
    JK

  • BP_TASK Data Cleansing

    Happy New Year Experts,
    I have a question or three for you on Data Cleansing in the Web IC.  I will explain what I have done and what I need answers to.
    The setup of the Data Cleansing Cases and account search is fine. We can search for Business Partners and Merge Now or Merge Later, and then search for cases if we choose to Merge Later.
    When we go into the case to process it, most of the functionality is fine. What isn't OK is the BP_TASK config.
    I understand that in order to execute the 'START' button in the Case Processing screens you need to have the task config set up. I have done this to an extent, as described below:
    1) Setup Number Ranges - IMG->CRM->Master Data->BP->DQA-> Maintain Number Ranges [Create line 01-0000000001-9999999999]
    2) In Task (1003) add the action profile BP_TASK - IMG->CRM->Transactions->Basic Settings->Define transaction types
    3) Create job CRM_BUPA_REALIGNMENT (periodic job 5-10 min)
    Point 3 is where I am coming unstuck. I cannot create a periodic job without assigning an ABAP program to run, and there is nothing anywhere that says 'use this program or method' when creating the job step.
    Secondly, when I select the 'START' button in the Case Processing screen after confirming the changes, the case errors out with a vague message saying the case cannot be saved. However, if I actually hit the 'SAVE' button, the case saves and the changes I confirmed are processed between the various accounts. So the whole process is about 95% great and 5% annoying.
    Before the questions, all authorisation settings are good as well.
    The questions are then:
    1)  What parameters are required above what I described for the periodic job?
    2)  Does the 'Task' transaction type need to be in the Business Transaction profile for the specific business role the Data Cleansing functionality is assigned within?
    3)  Once a task is created, I guess that the job will process these and that a user does not necessarily need to manually process these tasks?
    4)  Should I change the Action Definition and Condition at all over and above the standard setup that it is currently in?
    Any help and guidance would be great - I'm afraid I have 'Wood for Trees' syndrome now
    Many Thanks for any assistance.
    Regards,
    Mat.

    Hi Gregor,
    I have read your interesting blog. We have a similar kind of data cleansing activity, but when I tried to implement the same, we get a message that the Data Cleansing option is not available for the Insurance industry-specific settings. Can you please help us here? When I analysed it, the FM 'FKK_CLEANSING_ALLOWED' has a hard-coded code element as follows:
    *4.71: Event 7500 does not exist yet*
      IF 1 = 2.
         REFRESH: t_fbstab[].
         CALL FUNCTION 'FKK_FUNC_MODULE_DETERMINE'
           EXPORTING
              i_fbeve            = gc_event_7500
              i_only_application = gc_x
              i_applk            = applk
           TABLES
              t_fbstab           = t_fbstab[].
         LOOP AT t_fbstab INTO fbstab.
           CALL FUNCTION fbstab-funcc
              CHANGING
                c_cleansing_allowed = cleansing_allowed.
         ENDLOOP.
         IF 1 = 2.
    *     für Verwendungsnachweis
           CALL FUNCTION 'FKK_SAMPLE_7500'
              CHANGING
                c_cleansing_allowed = cleansing_allowed.
         ENDIF.
      ENDIF.
    Our SAP version is as follows:
    SAP_BASIS      620         0056 SAPKB62056             SAP Basis Component            
    INSURANCE     472           0010     SAPKIPIN10     INSURANCE 472 : Add-On Installation
    Any help is much appreciated.
    Thanks.
    I Peter

  • Domain Names Not Appearing In Drop Down of Data Cleansing Task

    For some reason, in several of my DQS data cleansing tasks, the domain names of the knowledge base I choose are not appearing. This in turn leads to an error message stating that "No domain field has been mapped to the input column...".
    The task can see the DQS server and it can see the various knowledge bases on that server, but the domains are not appearing.
    Has anyone else run into this issue?
    Thanks!!
    A. M. Robinson

    I have just found the same issue.
    For me, I believe it is because those domains are only used in matching rules, which the SSIS task does not support; therefore they are not in the drop-down, as the SSIS DQS task cannot use them. Only map domains used for data correction and you should be OK.
