Error while populating Xref table

Hi all,
I have created a project where I extract the job_id from the source instance (which I am getting from the AIAServiceConfigProperties.xml file) and populate it into the XREF table.
I have imported the following knowledge modules for this project:
1. KM_LKM SQL to SQL (Mediator XREF)
2. KM_IKM SQL Control Append (Mediator XREF)
I have not imported a CKM, as it cannot handle LONG datatypes.
I have kept the XREF table in the target datastore panel and the job table in the source datastore panel. I have created a variable which extracts the sourceID from the AIAServiceConfigProperties file. But when it comes to the step of populating data into the XREF table, the following error crops up:
com.sunopsis.sql.SnpsMissingParametersException: Missing parameter
     at com.sunopsis.sql.SnpsQuery.completeHostVariable(SnpsQuery.java)
     at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
     at com.sunopsis.sql.SnpsQuery.executeUpdate(SnpsQuery.java)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execStdOrders(SnpSessTaskSql.java)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
     at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
     at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
     at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
     at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
     at com.sunopsis.dwg.cmd.e.i(e.java)
     at com.sunopsis.dwg.cmd.g.y(g.java)
     at com.sunopsis.dwg.cmd.e.run(e.java)
     at java.lang.Thread.run(Unknown Source)
Am I missing something here?
Regards,
Sourav

Hi SH,
I have created a variable GetSourceColumnName, and in the interface I have mapped #GetSourceColumnName against XREF_COLUMN_NAME in the XREF table. When I execute the package the variable shows the required value, but the step that loads the data into the XREF table still fails with this error.
(In my AIAServiceConfig file my systemID is EBIZ_01, so when I click on the variable after execution I can see this system ID.)
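One thing worth checking (a hedged suggestion, not a confirmed fix): string values mapped into the XREF columns usually need to be enclosed in single quotes in the interface mapping. Otherwise the colon in the unquoted oramds: path and the unresolved #variable end up in the generated SQL as host variables, which is exactly what SnpsMissingParametersException complains about. A minimal sketch of the two target mappings, reusing the path and project-qualified variable name that appear in the generated SQL quoted in the first similar message below (treat both as placeholders for your own names):

-- XREF_TABLE_NAME mapping: quote the oramds path so the ':' is not parsed as a bind variable
'oramds:/apps/AIAMetaData/xref/JOBCODE_ID.xref'

-- XREF_COLUMN_NAME mapping: quote the project-qualified variable so its refreshed value
-- is injected as a string literal (the project code shown is an assumption)
'#LOADJOBDETAILSINSTAGINGTABLE.GetSourceColumnName'

Also make sure the variable is declared or refreshed in the package before the interface step, so ODI can substitute its value at code-generation time.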

Similar Messages

  • Error while populating Xref database

    Hi All,
    I am populating the XREF database through ODI, but while executing it I am getting this error:
    com.sunopsis.sql.SnpsMissingParametersException: Missing parameter
    The error is generated at the step "Insert flow into I$ table". The SQL code is:
    /* Use of a PL-SQL bloc to perform the Insert (management of LONGS and LOBS etc.) */
    declare cursor myCursor is
    select
         oramds:/apps/AIAMetaData/xref/JOBCODE_ID.xref     XREF_TABLE_NAME,
         #LOADJOBDETAILSINSTAGINGTABLE.GetSourceColumnName     XREF_COLUMN_NAME,
         SYS.GUID()     ROW_NUMBER,
         C1_VALUE     VALUE,
         'N'     IS_DELETED,
         CURRENT_TIMESTAMP     LAST_MODIFIED,
         'I'     IND_UPDATE
    from      AIA11G_XREF.C$_0XREF_DATA
    where     (1=1);
    begin
         /* Loop over the Cursor and execute the insert statement */
         for aRecord in myCursor loop
              insert into AIA11G_XREF.I$_XREF_DATA
              (
                   XREF_TABLE_NAME,
                   XREF_COLUMN_NAME,
                   ROW_NUMBER,
                   VALUE,
                   IS_DELETED,
                   LAST_MODIFIED,
                   IND_UPDATE
              )
              values
              (
                   aRecord.XREF_TABLE_NAME,
                   aRecord.XREF_COLUMN_NAME,
                   aRecord.ROW_NUMBER,
                   aRecord.VALUE,
                   aRecord.IS_DELETED,
                   aRecord.LAST_MODIFIED,
                   aRecord.IND_UPDATE
              );
         end loop;
    end;
    In the step "Create flow table I$", the SQL generated is:
    create table AIA11G_XREF.I$_XREF_DATA
    (
         XREF_TABLE_NAME     VARCHAR2(2000) NULL,
         XREF_COLUMN_NAME     VARCHAR2(2000) NULL,
         ROW_NUMBER     VARCHAR2(48) NULL,
         VALUE     VARCHAR2(2000) NULL,
         IS_DELETED     VARCHAR2(1) NULL,
         LAST_MODIFIED     TIMESTAMP NULL,
         IND_UPDATE     char(1)
    )
    NOLOGGING
    My source and target are both Oracle, and I have imported LKM SQL to Oracle, IKM Oracle Incremental Update (PL SQL) and a CKM.
    My ODI version is 10.1.3.5.
    Please help me solve this problem.
    Regards,
    Sourav

    Hi,
    You can copy the SQL code generated by ODI (for the step giving this error) and try to execute it via any SQL client such as TOAD or SQL Developer.
    This will help you find the offending data.
    Also, do LAST_MODIFIED and LAST_ACCESSED have VARCHAR2 datatypes? From the names it seems they are meant to be DATE.
    Thanks,
    Sutirtha
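    For illustration, here is one way the suggestion above could be applied to the generated statement from this thread: replace the unquoted oramds path and the unresolved #variable with hard-coded string literals and run the block in SQL Developer against the staging schema. This is only a test sketch; it assumes the C$_0XREF_DATA work table from the failing session still exists, 'EBIZ_01' is a placeholder value rather than the real configuration, and the standard SYS_GUID() function is used where the generated code shows SYS.GUID():

    declare
         cursor myCursor is
         select
              'oramds:/apps/AIAMetaData/xref/JOBCODE_ID.xref' XREF_TABLE_NAME,
              'EBIZ_01'          XREF_COLUMN_NAME,   -- placeholder literal for the #variable
              SYS_GUID()         ROW_NUMBER,
              C1_VALUE           VALUE,
              'N'                IS_DELETED,
              CURRENT_TIMESTAMP  LAST_MODIFIED,
              'I'                IND_UPDATE
         from AIA11G_XREF.C$_0XREF_DATA
         where (1=1);
    begin
         for aRecord in myCursor loop
              insert into AIA11G_XREF.I$_XREF_DATA
                   (XREF_TABLE_NAME, XREF_COLUMN_NAME, ROW_NUMBER, VALUE,
                    IS_DELETED, LAST_MODIFIED, IND_UPDATE)
              values
                   (aRecord.XREF_TABLE_NAME, aRecord.XREF_COLUMN_NAME, aRecord.ROW_NUMBER,
                    aRecord.VALUE, aRecord.IS_DELETED, aRecord.LAST_MODIFIED, aRecord.IND_UPDATE);
         end loop;
    end;
    /
    If the block runs cleanly with quoted literals, the "Missing parameter" error points at the unquoted mapping expressions rather than at the data itself.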

  • GENERATE_SUBPOOL_DIR_FULL  error while populating dynamic tables

    Hello!
    I have created a program that extensively uses dynamic internal tables. However, I am facing a problem with the dynamic population of tables.
    Here is the sample:
    REPORT  ZTEST.
    DATA: t_fcat    TYPE lvc_t_fcat,
          dt_outtab TYPE REF TO DATA.
    FIELD-SYMBOLS: <tab> TYPE STANDARD TABLE.
    CALL FUNCTION 'LVC_FIELDCATALOG_MERGE'
            EXPORTING i_structure_name = 'CSKS'
            CHANGING  ct_fieldcat      = t_fcat.
    DO 100 TIMES.
      CALL METHOD cl_alv_table_create=>create_dynamic_table
                   EXPORTING it_fieldcatalog = t_fcat
                   IMPORTING ep_table        = dt_outtab
                   EXCEPTIONS generate_subpool_dir_full = 1
                              others                    = 2.
      IF sy-subrc = 0.
          WRITE / sy-index.
      ELSE.
          WRITE: / 'Error - sy-subrc = ', sy-subrc.
          EXIT.
      ENDIF.
    ENDDO.
    WRITE / 'End'.
    By using this program I found out that I can create no more than 36 internal tables per program run. At the 37th one, the program raises the exception GENERATE_SUBPOOL_DIR_FULL.
    Each method call creates a subroutine pool that stores a table, and their number is obviously limited somehow, in my case to 36.
    We could ask the user to leave the program and start it again, but that is really not acceptable in a professional application.
    I'd like a solution without this limit, as my program needs to create as many as 100 tables dynamically in one stretch.
    I'm using release 4.6.
    Are there any experiences with this?
    Thanks in advance!
    Kind regards,
    Lalitha.

    Check this FORM; it is one alternative that may be useful for you:
    FORM create_corresponding_tables TABLES not_standard
                                     STRUCTURE not_standard.
      DATA: progname LIKE sy-repid.
      DATA: BEGIN OF repcode OCCURS 10,
              line(72),
            END OF repcode.
      repcode-line = 'REPORT NEW_INFOTYPE_DATA.'.
      APPEND repcode.
      repcode-line = 'FORM DATA_DECLARATIONS.'.
      APPEND repcode.
      LOOP AT not_standard.
        CONCATENATE 'P' not_standard-infty INTO repcode-line.
        CONCATENATE 'DATA: ' repcode-line 'LIKE' repcode-line
                    'OCCURS 10 WITH HEADER LINE.'
                    INTO repcode-line SEPARATED BY space.
        APPEND repcode.
      ENDLOOP.
      repcode-line = 'ENDFORM.'.
      APPEND repcode.
      GENERATE SUBROUTINE POOL repcode NAME progname.
      PERFORM data_declarations IN PROGRAM (progname).
    ENDFORM.                    " CREATE_CORRESPONDING_TABLES

  • Getting an error while populating the XREF table

    Hi all,
    I have created a package for populating the XREF table (I have imported the required KMs into this project) in which the steps are as follows:
    1. Get the source column name from the AIAserviceConfigProperties file.
    2. Run the interface that extracts the job_id into the XREF table.
    I am getting the source column name, but I am not able to populate the XREF table. I am getting the following error:
    com.sunopsis.core.SnpsInexistantObjectException: There is no connection for this logical schema/context pair:ESB_XREF / GLOBAL
    Please throw some light on this.
    Regards,
    Sourav

    Hi all,
    Do I have to create a new context because of importing KM_LKM SQL to SQL (ESB XREF) into my project?
    Regards,
    Sourav

  • Error while dropping a table

    Hi All,
    I got an error while dropping a table:
    ORA-00600: internal error code, arguments: [kghstack_free1], [kntgmvm: collst], [], [], [], [], [], [], [], [], [], []
    I have now learnt that the -600 error is DBA-related. How should I proceed?
    thanks and regards,
    sri ram.

    ORA-00600 errors should be raised as a service request with Oracle, as they imply an internal bug.
    You can search Oracle Support first to see whether anyone has had the same class of ORA-00600 error, and then, if not (and therefore no patch exists), raise your issue with Oracle.
    http://support.oracle.com

  • Error while loading AS400 table

    Hi,
    I was trying to load an AS400 table using a flat file as the source. The table and the flat file have a single field only. I am getting the error "<SQLExecute>: <[IBM][System i Access ODBC Driver]Error in assignment.>"
    The screenshot of the full error message is attached. Can someone please help me resolve this?
    Thanks,
    david


  • Full Load: Error while executing :TRUNCATE TABLE: S_ETL_PARAM

    Hi All,
    We are using BI Apps 7.9.6.1. The full load was running fine, but now we are facing a problem truncating the table S_ETL_PARAM.
    I have restarted the Informatica server and also the DAC server, but I am still getting the same message in the DAC log:
    ANOMALY INFO::: Error while executing : TRUNCATE TABLE:S_ETL_PARAM
    MESSAGE:::com.siebel.etl.database.IllegalSQLQueryException: DBConnection_OLTP:SIEBTRUN ('siebel.S_ETL_PARAM')
    Values :
    Null Map
    EXCEPTION CLASS::: java.lang.Exception
    Any suggestions?
    Thanks in Advance,
    Deepak

    Are you trying to run an incremental load when you get this truncate error? Can you re-run the full load and see whether that still runs OK? Please also check your DW-side database logs (such as the alert log) for any DB-level issue; such errors do not produce friendly messages on the DAC/Informatica side.
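    As an additional, hedged check: the error message names SIEBTRUN on the OLTP connection, so it is worth confirming that an object with that name exists and is valid in the OLTP database, and that the target table is reachable from the DAC connection user. The owner siebel below is taken from the error text; everything else is a generic sketch:

    -- Is the SIEBTRUN object referenced in the DAC error present and valid?
    select owner, object_name, object_type, status
    from   all_objects
    where  object_name = 'SIEBTRUN';

    -- Sanity check that the target table itself is reachable
    select count(*) from siebel.S_ETL_PARAM where rownum = 1;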

  • Error while filling setup tables for 2lis_13_vahdr

    Dear friends,
    I am getting the below error while filling setup tables for 2lis_13_vahdr:
    More faulty documents found than the tolerated 0000000000
    Message no. M2222
    Can anyone guide me on how to address this issue?
    Thanks and Regards
    Nithya

    Hello Nithya,
    It seems the number of tolerated faulty documents that you have given is 0. So when you initialize, try this:
    While executing OLI*BW, maintain a value in "No. tolerated faulty documents" - 5000 or 10000.
    Then execute the program in the background.
    Once the job is completed, check the background job log for any errors.
    Thanks
    Chandran

  • Error while filling setup table for Sales

    Dear Team,
    We are getting the following error while filling the setup tables for Sales (application component 11) and for Billing (application component 13):
    application component 11
    Error determining rate: foreign curr.  local curr. INR date 08.10.2007 (doc. 673624) (JOB - RMCVNEUA)
    Message No M2810
    application component 13
    Billing document 480050000: error determining stats. currency rate (no updating)
    More faulty documents found than the tolerated 0000000000 (JOB - RMCVNEUF)
    I have seen a lot of threads on this, but all of them show a currency different from INR, whereas for the above-mentioned documents the currency is INR only, so I am not able to determine why it is showing the error.
    If anybody has faced this problem, kindly reply.
    Best Regards,
    SG

    Hi There,
    It might be that you are using a wrong document number.
    I mean, if you are using application 11 you should use only sales document numbers, and for 13 you should use only billing document numbers.
    If you use them vice versa, that error will be thrown.
    Regards,
    MQ

  • Timeout error while filling setup tables

    HI all
    I am getting a timeout error while filling setup tables (OLI1BW). I clicked the Execute button on the screen; it is not scheduled as a background job.
    Going through other threads, I have come to know that to solve this issue:
    1. the BASIS team has to increase background processes or memory, or
    2. I should run it as a background job.
    When I went to Program -> Execute in Background, a "Print background parameters" window popped up.
    What should I do there? I am afraid that if I include parameters like a printer name... what is it, and how long is it going to print?
    Please let me know.
    Thanks,
    Harika.

    Thanks for the reply.
    It was set to LOCL only.
    But I have 3 options in the dropdown for the Windows printer:
    Send to OneNote 2007
    Microsoft XPS Document Writer
    OurPrinterAddress
    Which one should I choose here?

  • 3-1674105521 Multiple Paths error while using Bridge Table

    https://support.us.oracle.com/oip/faces/secure/srm/srview/SRViewStandalone.jspx?sr=3-1674105521
    Customer Smiths Medical International Limited
    Description: Multiple Paths error while using Bridge Table
    1. I have an urgent customer who encountered a design issue. The customer was trying to add 3 logical joins between SDI_GPOUP_MEMBERSHIP and these 3 tables (FACT_HOSPITAL_FINANCE_DTLS, FACT_HOSPITAL_BEDS_UTILZN and FACT_HOSPITAL_ATRIBUTES).
    2. They found out that by adding these 3 joins, they ended up with a circular error:
    [nQSError: 15001] Could not load navigation space for subject area GXODS.
    [nQSError: 15009] Multiple paths exist to table DIM_SDI_CUSTOMER_DEMOGRAPHICS. Circular logical schemas are not supported.
    In response to this circular error, the developer was able to bypass it using aliases, but this is not desired by the client.
    3. They want to know how to avoid this error entirely without using alias tables, and they would like a suggested way to resolve the circular join (multiple paths) error.
    It would be appreciated if someone could give some pointers or suggestions, as the customer is on a tight deadline.
    Thanks
    Teik

    The strange thing compared to your output is that I get an error when I have the table prefix in the QUERY clause:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** DUMPFILE=TMP1.dmp LOGFILE=imp.log PARALLEL=8 QUERY=SYSADM.TMP1:"WHERE TMP1.A = 2" REMAP_TABLE=SYSADM.TMP1:TMP3 CONTENT=DATA_ONLY
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    ORA-31693: Table data object "SYSADM"."TMP3" failed to load/unload and is being skipped due to error:
    ORA-38500: Unsupported operation: Oracle XML DB not present
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 1 error(s) at Fri Dec 13 10:39:11 2013 elapsed 0 00:00:03
    And if I remove it, it works:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** DUMPFILE=TMP1.dmp LOGFILE=imp.log PARALLEL=8 QUERY=SYSADM.TMP1:"WHERE A = 2" REMAP_TABLE=SYSADM.TMP1:TMP3 CONTENT=DATA_ONLY
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "SYSADM"."TMP3"                             5.406 KB       1 out of 2 rows
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at Fri Dec 13 10:36:50 2013 elapsed 0 00:00:01
    Nicolas.
    PS: as you can see, I'm on 11.2.0.4; I do not have the 11.2.0.1 that you seem to be using.

  • Out of Disk Space Error while Generating Pivot Table

    Hi!
    I encountered the following error while Generating Pivot Table -
    View pivotTableView!1 returned the following message:
    Odbc driver returned an error (SQLExtendedFetch).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC
    Location: saw.odbc.statement.fetch,
    saw.cube.engine.execute.dataTraversal, saw.cube.engine.execute,
    saw.cube.cache.processCube, saw.views.pivottable.displayer,
    saw.subsystem.portal, saw.httpserver.request,
    saw.rpc.server.responder, saw.rpc.server,
    saw.rpc.server.socketServer, saw.threadPool, saw.threadPool,
    saw.threads
    File: ./project/webodbcaccess/odbcresultsetimpl.cpp, Line: 182
    State: S1000. Code: 10058. [NQODBC] [SQL_STATE: S1000] [nQSError:
    10058] A general error has occurred. Out of disk space. (S1000)

    user11204020, what's your question?? :)
    You're getting an out of disk space error ... which means you've run out of disk space, no?
    Since it's saw-related, I'm guessing you'll need to make more space available to your OracleBIdata/tmp directory.

  • DAC: "error while reading repository table". Please help!

    Hi everybody,
    I copied DAC directory from server to a workstation.
    DAC client on server machine runs normally.
    When trying to run DAC client on the workstation, I get an error message "Error while reading repository table".
    I tried to find any DAC log file to get some additional information, but I couldn't.
    The connection in DAC seems to be configured properly (test connection successful).
    The only difference I found between the server and the workstation is that there are two Oracle homes installed on the workstation and only one on the server.
    Thanks,
    Alex

    So you can read the tables now? Good to know.
    Just as a hint: if you "tnsping" your data source from a command window, you'll see which path and file is used, so you can find out which home.
    TNS Ping Utility for 32-bit Windows: Version 11.1.0.6.0 - Production on 04-MAR-2009 16:42:43
    Copyright (c) 1997, 2007, Oracle. All rights reserved.
    Used parameter files:
    C:\Oracle\product\11.1.0\client_1\network\admin\sqlnet.ora <--- my 11g home is used and not the 9i or 10g which also reside on this machine.
    @DAC client can't connect to server: i.e. the icon stays red?
    - Is your server up and running? (I don't want to start a flame war, but it happens to the best of us)
    - Did you go through the steps here? http://download.oracle.com/docs/cd/E12513_01/doc/bic.101/e12653/dac_configure.htm
    Cheers,
    C.

  • DAC: error while reading repository table

    Hi everybody,
    I installed DAC (copied from server to a workstation).
    When trying to run DAC client on the workstation, I get an error message "Error while reading repository table"
    The connection seems to be configured properly (test connection successful).
    DAC client on server machine starts normally.
    What could the problem be?
    Thanks,
    Alex

    Hello again!
    Has anybody seen this problem?
    What can it be?
    Where can I find DAC logs?
    Cheers,
    Alex

  • Error: while populating the target.

    While populating the target table, I am getting the following error:
    ODI-1228: Task Int_HDM_IND_PRTY_ETHN (Export) fails on the target ORACLE connection ORACLE_ETLDEV_INOVA_3.
    Caused By: java.sql.SQLException: ORA-01471: cannot create a synonym with same name as object
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:457)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
         at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:889)
         at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:476)
         at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:204)
         at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:540)
         at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)
         at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1079)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1466)
         at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3752)
         at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3937)
         at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1535)
         at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
         at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
         at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)

    Hi,
    Bottom line: an object with that name already exists in your back-end DB.
    Log in to your back-end schema and query:
    select * from all_objects where object_name like '<object name which ODI generates>%'
    Drop it manually and run the interface again.
    Cause of the problem: initially you probably ran the same interface with a different KM and either set DELETE_TEMPORARY_OBJECTS to False, or ODI failed without dropping the temporary table, so ODI kept the C$_<object name> object in the back end (possibly as a table).
    Now, with the DBLINK, you are trying to create a synonym with the same name as the earlier object, which eventually fails.
    Makes sense?
    Thanks,
    Guru
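    A minimal sketch of that clean-up, assuming the leftover object is the C$_ work table belonging to the failing interface; the name C$_0HDM_IND_PRTY_ETHN is hypothetical and used only for illustration:

    -- Find any leftover ODI work objects with the conflicting name
    select owner, object_name, object_type
    from   all_objects
    where  object_name like 'C$_%HDM_IND_PRTY_ETHN%';

    -- Drop whichever form it was left in (run only the statement that applies)
    drop table C$_0HDM_IND_PRTY_ETHN;
    drop synonym C$_0HDM_IND_PRTY_ETHN;

    After the conflicting object is gone, re-run the interface so ODI can create its synonym over the DBLINK.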
