DAC in DTAP

Hi all,
We are using the DAC in a DTAP environment.
This week we did a first migration from Development to Test.
We exported the DAC repository and imported it to the Test server.
We now have a problem with tasks that were inactivated in Development but become active again in Test.
This is an example:
Task ‘Extract for Person Dimension’, originally from Source System Container ‘Siebel 8.0 Vertical’.
The underlying Informatica mapping was customized as part of the development process.
Then a new task was created with a custom indicator: ‘Extract for Person Dimension C’, under Source System Container ‘Case Management’.
The original task would still be selected when assembling a subject area, so it needed to be set to inactive. Modifying the original task in Source System Container ‘Siebel 8.0 Vertical’ is not possible; the only option is to clone the record into Source System Container ‘Case Management’ and set it to inactive there.
Thus in Development the following situation exists:
•     ‘Extract for Person Dimension’, Source System Container ‘Siebel 8.0 Vertical’, active
•     ‘Extract for Person Dimension’, Source System Container ‘Case Management’, inactive
•     ‘Extract for Person Dimension C’, Source System Container ‘Case Management’, active
After the DAC repository is exported and imported into the Test environment, the following situation exists:
•     ‘Extract for Person Dimension’, Source System Container ‘Siebel 8.0 Vertical’, active
•     ‘Extract for Person Dimension’, Source System Container ‘Case Management’, inactive
•     ‘Extract for Person Dimension C’, Source System Container ‘Case Management’, active
The custom subject area still contains the correct tasks; however, after assembling it again in the Test environment, all active tasks are included in the subject area.
Did we do something wrong? Did we miss a step?
Thanks!
Michael

It seems that the table W_ETL_OBJ_REF isn't imported correctly.
We import through exported XML files.
Any tips?
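A quick check (a minimal sketch; run it against both the Development and the Test DAC repository schemas, adjusting owner and credentials to your environment) is to compare the row counts of the table that holds the container-level overrides:
-- count the object reference/override rows in each repository
SELECT COUNT(*) FROM W_ETL_OBJ_REF;
If the counts differ, the inactive-flag overrides for the cloned tasks were probably dropped during the XML import.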

Similar Messages

  • Error while creating a new DAC connection using connection type MSSQL

    Hi,
    I am trying to create a new DAC connection i.e. a new DAC repository in the SQL Server 2008 database.
    DAC version : 10.1.3.4.1
    Database : SQL Server 2008
    I have downloaded the sqljdbc4.jar file from the below link and placed it in the D:\orahome\10gR3_1\bifoundation\dac\lib folder.
    [http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=11774]
    I have entered all the details correctly for database name, database host, database port. I created a new Authentication file.
    I get the below error when I try to test the connection.
    MESSAGE:::MSSQL driver not available!
    EXCEPTION CLASS::: java.lang.IllegalArgumentException
    com.siebel.etl.gui.login.LoginDataHandler$LoginStructure.testConnection(LoginDataHandler.java:512)
    com.siebel.etl.gui.login.LoginDataHandler.testConnection(LoginDataHandler.java:386)
    com.siebel.etl.gui.login.ConnectionTestDialog$Executor.run(ConnectionTestDialog.java:290)
    ::: CAUSE :::
    MESSAGE:::com.microsoft.sqlserver.jdbc.SQLServerDriver
    EXCEPTION CLASS::: java.lang.ClassNotFoundException
    java.net.URLClassLoader$1.run(URLClassLoader.java:200)
    java.security.AccessController.doPrivileged(Native Method)
    java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:276)
    java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
    java.lang.Class.forName0(Native Method)
    java.lang.Class.forName(Class.java:169)
    com.siebel.etl.gui.login.LoginDataHandler$LoginStructure.testConnection(LoginDataHandler.java:510)
    com.siebel.etl.gui.login.LoginDataHandler.testConnection(LoginDataHandler.java:386)
    com.siebel.etl.gui.login.ConnectionTestDialog$Executor.run(ConnectionTestDialog.java:290)
    The error seems to be a connectivity issue with SQL Server. Am I using the correct jar file?
    Please help me out in resolving this issue. Appreciate the help provided on this forum earlier.
    Thank You

    Add
    .\lib\sqljdbc4.jar
    at the end of the line starting with SQLSERVERLIB in the config.bat file.
    Please mark as correct if this helps.

  • Error while creating a task for creating an generic task  /app/dac/CustomSQ

    Hi,
    I have created a SQL file on the DAC server in /app/dac/CustomSQLs/, just to fire an update SQL statement against the database.
    In the DAC Task tab I have created a task with the following:
    Command for incremental load: /app/dac/CustomSQLs/DBNameBeforeLoad.sql
    Primary source: flatfileconnection
    Target source: DBCONNECTION_OLAP
    Execution type: SQL FILE
    Task phase: GENERAL
    I created a subject area and assembled it, then created an execution plan.
    When I try to execute this EP, it shows the following error in the DAC log:
    ANOMALY INFO::: Error while creating a task for creating an generic task /app/dac/CustomSQLs/DBNameBeforeLoad.sql
    MESSAGE:::/app/dac/CustomSQLs/DBNameBeforeLoad.sql - invalide template name!
    EXCEPTION CLASS::: com.siebel.analytics.etl.etltask.TaskInitializationException
    com.siebel.analytics.etl.etltask.SQLFileTask.doInit(SQLFileTask.java:69)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.init(GenericTaskImpl.java:194)
    Is my above configuration correct?
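    For reference, a minimal sketch of what /app/dac/CustomSQLs/DBNameBeforeLoad.sql might contain (purely hypothetical; the post only says it fires a single update, and my_audit_table is an invented name):
    -- hypothetical pre-load update; replace with your actual statement
    UPDATE my_audit_table
       SET load_db_name = SYS_CONTEXT('USERENV', 'DB_NAME'),
           load_start   = SYSDATE
     WHERE audit_id = 1;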

    verify the following settings:
    1. mapping of ECC plant and storage location and logical system number with the appropriate availability group code since this links the stock types in EWM to ECC org elements (SPRO -> EWM -> Interfaces -> ERP Integration -> Goods movement)
    2. ensure the availability group configuration is correct (SPRO -> EWM -> GR Process -> Configure availability group for putaway) - check all applicable nodes in this configuration since the availability group code is assigned in the pertinent storage types

  • Error while creating the DWH tables using DAC

    Hi,
    I am getting an error while creating the DWH tables using DAC. I created an ODBC DSN using the Merant driver with the DAC repository DB credentials, and the test connection is successful. While creating the tables I gave the OLAP DW credentials and the DSN name which I created earlier, but it throws the error mentioned below:
    =====================================
    STD OUTPUT
    =====================================
    CREATING SIEBEL DATABASE OBJECTS
    F:\DAC\bifoundation\dac\UTILITIES\BIN\DDLIMP /I N /s N /u infdomain /p ******* /c DB_DAC /G "SSE_ROLE" /f F:\DAC\bifoundation\dac/conf/sqlgen/ctl-file/oracle_bi_dw.ctl /b "" /K "" /X "" /W N
    Error while importing Siebel database schema.
    =====================================
    ERROR OUTPUT
    =====================================
    Siebel Enterprise Applications ODBC DDL Import Utility, Version 7.7 [18030] ENU
    Copyright (c) 2001 Siebel Systems, Inc. All rights reserved.
    This software is the property of Siebel Systems, Inc., 2207 Bridgepointe Parkway,
    San Mateo, CA 94404.
    User agrees that any use of this software is governed by: (1) the applicable
    user limitations and other terms and conditions of the license agreement which
    has been entered into with Siebel Systems or its authorized distributors; and
    (2) the proprietary and restricted rights notices included in this software.
    WARNING: THIS COMPUTER PROGRAM IS PROTECTED BY U.S. AND INTERNATIONAL LAW.
    UNAUTHORIZED REPRODUCTION, DISTRIBUTION OR USE OF THIS PROGRAM, OR ANY PORTION
    OF IT, MAY RESULT IN SEVERE CIVIL AND CRIMINAL PENALTIES, AND WILL BE
    PROSECUTED TO THE MAXIMUM EXTENT POSSIBLE UNDER THE LAW.
    If you have received this software in error, please notify Siebel Systems
    immediately at (650) 295-5000.
    F:\DAC\bifoundation\dac\UTILITIES\BIN\DDLIMP /I N /s N /u infdomain /p ***** /c DB_DAC /G SSE_ROLE /f F:\DAC\bifoundation\dac/conf/sqlgen/ctl-file/oracle_bi_dw.ctl /b /K /X /W N
    Connecting to the database...
    28000: [DataDirect][ODBC Oracle driver][Oracle]ORA-01017: invalid username/password; logon denied
    Unable to connect to the database...
    any help is appreciated.
    Thanks,
    RM

    The fact that you are getting an "ORA-01017: invalid username/password; logon denied" message indicates that you are at least talking to the database.
    The log shows that username "infdomain" is being used. Can you double check the username and password you have in DAC in a SQL*Plus/SQL Developer session?
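    For example (a sketch; the password and connect alias are placeholders for your own values):
    -- from SQL*Plus, confirm the credentials DAC has stored actually work
    CONNECT infdomain/<password>@<dac_db_alias>
    SHOW USER
    If this fails with the same ORA-01017, the stored password (or the account itself) is the problem rather than the DDLIMP call.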
    Please mark if useful/helpful,
    Andy.

  • Error while importing a DW table into DAC

    Hi,
    We are on OBIEE 7.9.6 and we have a requirement to add a new DW table. I created the new table in the DW and am getting the error "Could not connect to data source process. Process failed during creation of connection pool" while trying to import this new table into DAC. I would like to know why I am getting this error.
    Any help is greatly appreciated.
    Thanks,
    Chandra

    Thank you for your response.
    I added this table in the DW database and have imported it into Informatica. When I look at the traffic light it is red. Looks like that's the problem.
    Thanks again.

  • Error While running the ETL Load in DAC (BI Financial Analytics)

    Hi All,
    I have installed and configured BI Applications 7.9.5 and Informatica 8.1.1. The first time we ran the ETL load in DAC it failed, even though every Test Connection was successful. We are getting the error message below.
    The log file which I pasted below is from the path
    /u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared
    /SessLogs
    SDE_ORAR12_Adaptor.SDE_ORA_GL_AP_LinkageInformation_Extract_Full.log
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value ['Y'] for mapping parameter:[$$FILTER_BY_LEDGER_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_TYPE].
    DIRECTOR> VAR_27028 Use override value [04/02/2007] for mapping parameter:[$$INITIAL_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [1] for mapping parameter:[$$LEDGER_ID_LIST].
    DIRECTOR> VAR_27028 Use override value ['NONE'] for mapping parameter:[$$LEDGER_TYPE_LIST].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] at [Thu Feb 12 12:49:33 2009]
    DIRECTOR> TM_6683 Repository Name: [DEV_Oracle_BI_DW_Rep]
    DIRECTOR> TM_6684 Server Name: [DEV_Oracle_BI_DW_Rep_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_GL_AP_LinkageInformation_Extract [version 1]
    DIRECTOR> TM_6827 [u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared/Storage] will be used as storage directory for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB,apps@devr12 bawdev@devbi]
    DIRECTOR> TM_6703 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] is run by 64-bit Integration Service [node01_oratestbi], version [8.1.1 SP4], build [0817].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [ISO 8859-1 Western European]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Thu Feb 12 12:49:34 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Thu Feb 12 12:49:34 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 12582912 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [devr12.tessco.com], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [DEVBI], user [bawdev], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_GL_LINKAGE_INFORMATION_GS :SQL INSERT statement:
    INSERT INTO W_GL_LINKAGE_INFORMATION_GS(SOURCE_DISTRIBUTION_ID,JOURNAL_LINE_INTEGRATION_ID,LEDGER_ID,LEDGER_TYPE,DISTRIBUTION_SOURCE,JE_BATCH_NAME,JE_HEADER_NAME,JE_LINE_NUM,POSTED_ON_DT,SLA_TRX_INTEGRATION_ID,DATASOURCE_NUM_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_GL_LINKAGE_INFORMATION_GS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Thu Feb 12 12:49:34 2009
    Target tables:
    W_GL_LINKAGE_INFORMATION_GS
    READER_1_1_1> RR_4029 SQ Instance [SQ_XLA_AE_LINES] User specified SQL Query [SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
          AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AP_INV_DIST', 'AP_PMT_DIST'
              , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('04/02/2007 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Thu Feb 12 12:49:34 2009)
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_GL_LINKAGE_INFORMATION_GS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Thu Feb 12 12:49:34 2009
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_GL_LINKAGE_INFORMATION_GS (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
    WRT_8044 No data loaded for this target
    WRITER_1__1> WRT_8043 ****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_XLA_AE_LINES] has completed: Total Run Time = [0.673295] secs, Total Idle Time = [0.000000] secs, Busy Percentage = [100.000000].
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GL_LINKAGE_INFORMATION_GS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Thu Feb 12 12:49:35 2009)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Thu Feb 12 12:49:35 2009)
    MAPPING> TM_6018 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [SQ_XLA_AE_LINES] (Instance Name: [SQ_XLA_AE_LINES])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_GL_LINKAGE_INFORMATION_GS] (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] completed at [Thu Feb 12 12:49:36 2009]
    Thanks in Advance,
    Prashanth

    You need to increase the temp tablespace; the ORA-01114/ORA-27072 errors show the database ran out of space while writing temporary blocks for the extract query. A sketch follows.
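    The usual check and fix looks roughly like this (a minimal sketch; the tablespace name and file path are examples, and the new tempfile must go on a volume that actually has free space, since ORA-27072 says the current device is full):
    -- see how big the temporary tablespace currently is
    SELECT tablespace_name, ROUND(SUM(bytes)/1024/1024/1024, 1) AS gb
      FROM dba_temp_files
     GROUP BY tablespace_name;
    -- add space on a filesystem with room
    ALTER TABLESPACE temp ADD TEMPFILE '/u02/oradata/devbi/temp02.dbf' SIZE 8G AUTOEXTEND ON;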

  • Regarding REFRESHING of Data in Data warehouse using DAC Incremental approa

    My client is planning to move from Discoverer to OBIA, but before that we need some answers.
    1) My client needs the data to be refreshed every hour (incremental load using DAC) because they are using a lot of real-time data.
    We don't have much updated data (e.g. 10 invoices in an hour plus a few other records). How much time does it usually take to refresh those tables in the data warehouse using DAC?
    2) While a table is getting refreshed, can we use that table to generate a report? If yes, what is the state of the data: stale or incorrect (undefined)?
    3) How does the refresh of Financial Analytics work? Is it one module at a time, or does it treat all 3 modules (GL, AR and AP) as a single unit of refresh?
    I would really appreciate an answer to all of these questions.
    Thank You,

    Here you go for answers:
    1) It shouldn't be much of a problem for such a small amount of data. It all depends on your execution plan in DAC, which can always be created new and customized to load data only for those tables (star schema). Expect approximately 15-20 minutes, as the plan does many things apart from loading the tables.
    2) Reports in OBIEE will show the previous data, as the cache will (and should) be turned on. You will get the new data in reports after the refresh is complete and the cache is cleared using one of the various methods (event polling preferred; see the sample insert after this answer).
    3) Again, for Financial Analytics or any other module, you will have out-of-the-box execution plans, but you can create your own plans and execute them. GL, AR and AP are also provided separately.
    Hope this answers your questions. You will learn more by going through the Oracle docs, in particular those for DAC.
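    A sketch of the event-polling signal, assuming the standard S_NQ_EPT event table created by the shipped OBIEE script (column names can differ between versions, and W_GL_BALANCE_F is only an example table name):
    -- tell the BI Server that a warehouse table has been refreshed
    INSERT INTO S_NQ_EPT
      (UPDATE_TYPE, UPDATE_TS, DATABASE_NAME, CATALOG_NAME, SCHEMA_NAME, TABLE_NAME, OTHER_RESERVED)
    VALUES
      (1, SYSDATE, NULL, NULL, NULL, 'W_GL_BALANCE_F', NULL);
    COMMIT;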

  • Unable to capture video from VHS through DAC-200

    (Running Final Cut Pro 5.1.4)
    I have a DAC-200 to hook up my VCR to my G5 in order to capture video from VHS tapes into Final Cut. I had this set-up working a few months ago, but it was disconnected to rearrange equipment and now I can't get it working again. I have followed the advice from the DAC-200 installation instructions and advice from other forum topics but can't seem to get it to work. I finally was able to get something in the log and capture window besides "preview disabled" and "cannot capture because there is no video." I now can get Final Cut to capture video; however, it only captures a blank black screen (and this is all I can see in the preview window).
    I've triple-checked every cable hook-up, the G5 recognizes the converter box, and the video tape plays fine to a TV. I've even had our IT guy here at work come check out my set-up and he can't figure out what is wrong either. (He did get my Log & Capture window to show white fuzz in the black screen at one point though. Does that mean anything?)
    Any help is greatly appreciated! (I'm happy to answer any other questions too.) Thanks!

    I had difficulty with OS X 10.4.10 plus QT 7.2 and put my info on the forum as well as sending feedback to Apple. Shortly afterwards, Apple brought out a small security update which I downloaded and, hey presto, I was then able to use QT 7.2 with OS X 10.4.10. It may be that you missed out on the later small update to QT 7.2? Another factor I found was that a little while later I again had a problem, so I again installed QT 7.2 and over-installed FCP 5.1.4, which has resolved my problems for the time being. Worth a try if you still can't get through on the DAC.
    Ron.

  • IMovie not recognising Data Video DAC AV/DV converter

    I use my DataVideo DAC-1 to capture old VHS family movies, drop them into iMovie to edit, and make short DVDs. The system worked a treat with our old G4 and Panther. I have just bought a dual G5 2 Gig machine and iMovie HD, and the system is not seeing the DAC on the FireWire bus. I have tried new cables, plugging into a native bus, etc., but no go. The DAC is admittedly 5 years old but worked perfectly up to now. I am loath to have to scrap it because of a FireWire incompatibility issue. Anyone else out here with similar issues and a possible solution - or else off to the mail-order I go and cough up €200.

    Hi Richard,
    I cannot help, sorry, but I have read many reports here concerning converters that worked and then failed....
    I recommend http://www.apple.com/feedback/imovie.html to report in detail your system's profile and converter specs.... maybe there IS some <bad word following> bug???

  • Unit Testing in DAC

    Hi,
    I am facing a problem with unit testing in DAC. I am able to run full and incremental loads, but I am unable to do unit testing in DAC. It shows an error that my Integration Services are not up when I run an individual task. Are there any setups required for unit testing in DAC? I had followed http://docs.oracle.com/cd/E15586_01/fusionapps.1111/e14849/dacquickstart.htm#BABCIIIH, topic: To unit test an execution plan task. If anyone could help me in running a unit test, that would be a great help.
    Thanks,
    Ram

    I'm struggling with the idea of testing View/Controller classes which depend on "container things" like bindings. Since there are many ways to interact with the 'container things', I guess it depends on exactly how you're using those bindings in your View/Controller classes. I've had success using Mockito along with my own fixture classes that do things like set up a mock ADFContext and RequestContext. Within my test setUp and tearDown, I set up and tear down these fixture classes, which results in a safe, mocked-out environment in which the test can execute. It's often tricky because many of these objects are static and singleton-like, but so far I've always found a back door by which to inject a mock object. We also have our own utility classes that look up 'container things', so we of course make sure we have a way to provide mock 'container things' through those utility classes.
    If you haven't come across the concept of 'mock objects' before, then it may be best to do a bit of background reading first:
    [http://www.mockobjects.com/2009/09/brief-history-of-mock-objects.html]
    [http://en.wikipedia.org/wiki/Mock_object]
    There are many mocking libraries about (EasyMock, jMock, rMock) but IMO Mockito is the one to choose right now.
    Of course, the alternative is to apply some [Inversion of Control|http://martinfowler.com/articles/injection.html] and make sure that the code you write is always ignorant of exactly where the 'container things' have come from. This makes mocking much easier, but I've not yet investigated use of a D.I. container (e.g. Spring) along with ADF - I suspect your options in 10g would be limited.

  • How do I use a DAC with my AirPort Express?

    Hello,
    Can I use a DAC with AirPort Express? If yes, how?
    TIA

    Check out this CNET Crave article: Using Apple's AirPort Express with a DAC: A how-to guide

  • Problem with Itunes and Remote when external DAC is used

    I am using the iPad Remote app to connect to an iTunes library paired with a Windows Vista Toshiba laptop.
    When the PC speakers or the iPad are used for audio reproduction, everything works OK, and iPad Remote can connect to the iTunes library.
    Unfortunately, when the optical audio output of the PC is used for audio reproduction through an external DAC, amplifier and speakers, the iPad cannot connect to the iTunes library!
    Thanks in advance.

    Perhaps Tom could assist.

  • I am trying to start the DAC 11g server on AIX but am getting an error (MESSAGE:::Schema version is missing or old).

    OS detected: AIX
    Jan 14, 2014 4:33:52 PM com.siebel.etl.etlmanager.DerivedConstants getDacDomainHome
    INFO: The DAC_DOMAIN_HOME set to /apps/DAC11g/dac
    Jan 14, 2014 4:33:52 PM com.siebel.etl.etlmanager.DerivedConstants getDacConfigLocation
    INFO: The DAC_CONFIG_LOCATION value hasn't been found, the USERDIR value was used instead
    Jan 14, 2014 4:33:52 PM com.siebel.etl.etlmanager.DerivedConstants getDacConfigLocation
    INFO: The DAC_CONFIG_LOCATION set to /apps/DAC11g/dac
    Jan 14, 2014 4:33:52 PM com.siebel.etl.etlmanager.DerivedConstants getOracleDacHome
    INFO: The BI_ORACLE_HOME value hasn't been found, the USERDIR value was used instead
    Jan 14, 2014 4:33:52 PM com.siebel.etl.etlmanager.DerivedConstants getOracleDacHome
    INFO: The ORACLEDIR set to /apps/DAC11g/dac
    Jan 14, 2014 4:33:53 PM com.siebel.etl.bootup.ApplicationManager <init>
    INFO: Application booting up...
    Jan 14, 2014 4:33:54 PM oracle.security.jps.internal.common.util.XmlSchemaValidationUtil$StrictErrorHandler warning
    WARNING: Failed to validate the xml content. SchemaLocation: schemaLocation value = 'http://xmlns.oracle.com/oracleas/schema/11/jps-config-11_1.xsd' must have even number of URI's. Location: line 2 column 272.
    Jan 14, 2014 4:33:56 PM com.siebel.etl.database.ConnectionManager init
    INFO: Creating repository pool
    Jan 14, 2014 4:34:00 PM com.siebel.etl.database.ConnectionManager init
    INFO: Repository pool created
    Jan 14, 2014 4:34:00 PM com.siebel.etl.net.QServer <init>
    GLOBAL: DAC Version: Dac Build AN 11.1.1.6.4.20121119.2022
    Jan 14, 2014 4:34:00 PM com.siebel.etl.net.QServer <init>
    GLOBAL: Process name: 18940116@lfx007qpa
    Jan 14, 2014 4:34:00 PM com.siebel.etl.net.QServer <init>
    GLOBAL: The System properties are: java.assistive       ON
    java.runtime.name       Java(TM) SE Runtime Environment
    ibm.signalhandling.rs   false
    sun.boot.library.path   /usr/java6_64/jre/lib/ppc64/default:/usr/java6_64/jre/lib/ppc64
    java.vm.version 2.4
    com.ibm.oti.configuration       scar
    java.vm.vendor  IBM Corporation
    java.vendor.url http://www.ibm.com/
    path.separator  :
    java.vm.name    IBM J9 VM
    user.country    US
    sun.java.launcher       SUN_STANDARD
    user.dir        /apps/DAC11g/dac
    java.vm.specification.name      Java Virtual Machine Specification
    java.runtime.version    pap6460sr9fp1-20110208_03 (SR9 FP1)
    java.fullversion        JRE 1.6.0 IBM J9 2.4 AIX ppc64-64 jvmap6460sr9-20110203_74623 (JIT enabled, AOT enabled)
    J9VM - 20110203_074623
    JIT  - r9_20101028_17488ifx3
    GC   - 20101027_AA
    java.awt.graphicsenv    sun.awt.X11GraphicsEnvironment
    java.endorsed.dirs      /usr/java6_64/jre/lib/endorsed
    os.arch ppc64
    com.ibm.vm.bitmode      64
    java.io.tmpdir  /tmp/
    line.separator
    com.ibm.util.extralibs.properties
    java.vm.specification.vendor    Sun Microsystems Inc.
    user.variant
    java.awt.fonts
    os.name AIX
    sun.java2d.fontpath
    java.jcl.version        20110203_01
    sun.jnu.encoding        ISO8859-1
    java.library.path       /usr/java6_64/jre/lib/ppc64/default:/usr/java6_64/jre/lib/ppc64:/usr/java6_64/jre/lib/ppc64:/usr/java6_64/jre/lib/ppc64/default:/usr/java6_64/jre/lib/ppc64/j9vm:/usr/java6_64/jre/lib/ppc64:/usr/java6_64/jre/../lib/ppc64:/apps/oracle/oraclehomes/11.2.0/lib:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1:/apps/Informatica9/9.0.1/ODBC6.0/lib:.:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/usr/lib:/apps/Informatica9/9.0.1/server/bin:/apps/oracle/oraclehomes/11.2.0/lib:/apps/Informatica9/9.0.1:/apps/Informatica9/9.0.1/ODBC6.0/lib:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/apps/Informatica9/9.0.1/server/bin:/usr/lib
    jxe.current.romimage.version    15
    com.ibm.oti.vm.bootstrap.library.path   /usr/java6_64/jre/lib/ppc64/default:/usr/java6_64/jre/lib/ppc64
    com.ibm.cpu.endian      big
    java.specification.name Java Platform API Specification
    java.class.version      50.0
    ibm.system.encoding     ISO8859-1
    java.util.prefs.PreferencesFactory      java.util.prefs.FileSystemPreferencesFactory
    os.version      6.1
    com.ibm.oti.vm.library.version  24
    com.ibm.jcl.checkClassPath
    user.home       /home/snlmgr
    user.timezone   America/Los_Angeles
    java.awt.printerjob     sun.print.PSPrinterJob
    file.encoding   ISO8859-1
    java.specification.version      1.6
    user.name       snlmgr
    java.class.path ./lib/msbase.jar:./lib/mssqlserver.jar:./lib/msutil.jar:./lib/sqljdbc.jar:./lib/ojdbc6.jar:./lib/ojdbc5.jar:./lib/ojdbc14.jar:./lib/db2java.zip:./lib/teradata.jar:./lib/terajdbc4.jar:./lib/log4j.jar:./lib/tdgssjava.jar:./lib/tdgssconfig.jar:./lib/nzjdbc.jar:./lib/bijdbc.jar:./lib/ttjdbc6.jar:./lib/orai18n.jar:./lib/timestenjmsxla.jar:./lib/jms.jar:./lib/javax.jms.jar:./DAWSystem.jar:./lib/biacm.paramproducer.jar::./lib/oracle_common/modules/oracle.pki_11.1.1/oraclepki.jar:./lib/oracle_common/webservices/wsclient_extended.jar:./lib/oracle_common/modules/oracle.jmx_11.1.1/jmxspi.jar:./lib/oracle_common/modules/oracle.odl_11.1.1/ojdl.jar:./lib/oracle_common/modules/oracle.jps_11.1.1/jps-internal.jar:./lib/oracle_common/modules/oracle.jps_11.1.1/jps-platform.jar:./lib/oracle_common/modules/oracle.jps_11.1.1/jps-se.jar:./lib/oracle_common/modules/oracle.idm_11.1.1/identitystore.jar:./lib/oracle_common/modules/oracle.jps_11.1.1/jps-az-rt.jar:./lib/oracle_common/modules/oracle.jps_11.1.1/jacc-spi.jar:./lib/oracle_common/modules/oracle.iau_11.1.1/fmw_audit.jar:./lib/oracle_common/modules/oracle.jmx_11.1.1/jmxframework.jar:./lib/oracle_common/modules/oracle.igf_11.1.1/identitydirectory.jar::./lib/oracle_common/jlib/help-share.jar:./lib/oracle_common/jlib/ohj.jar:./lib/oracle_common/jlib/jewt4.jar:./lib/oracle_common/jlib/share.jar:./lib/oracle_common/jlib/oracle_ice.jar:
    com.ibm.oti.shared.enabled      false
    java.vm.specification.version   1.0
    sun.arch.data.model     64
    sun.java.command        com.siebel.etl.net.QServer
    java.home       /usr/java6_64/jre
    com.ibm.oti.jcl.build   20110202_1316
    user.language   en
    ibm.signalhandling.sigint       true
    java.specification.vendor       Sun Microsystems Inc.
    os.encoding     ISO8859-1
    oracle.security.jps.config      /apps/DAC11g/dac/conf/security/jps-config-jse.xml
    java.vm.info    JRE 1.6.0 IBM J9 2.4 AIX ppc64-64 jvmap6460sr9-20110203_74623 (JIT enabled, AOT enabled)
    J9VM - 20110203_074623
    JIT  - r9_20101028_17488ifx3
    GC   - 20101027_AA
    java.version    1.6.0
    java.ext.dirs   /usr/java6_64/jre/lib/ext
    jxe.lowest.romimage.version     15
    sun.boot.class.path     /usr/java6_64/jre/lib/ppc64/default/jclSC160/vm.jar:/usr/java6_64/jre/lib/annotation.jar:/usr/java6_64/jre/lib/beans.jar:/usr/java6_64/jre/lib/java.util.jar:/usr/java6_64/jre/lib/jndi.jar:/usr/java6_64/jre/lib/logging.jar:/usr/java6_64/jre/lib/security.jar:/usr/java6_64/jre/lib/sql.jar:/usr/java6_64/jre/lib/ibmorb.jar:/usr/java6_64/jre/lib/ibmorbapi.jar:/usr/java6_64/jre/lib/ibmcfw.jar:/usr/java6_64/jre/lib/rt.jar:/usr/java6_64/jre/lib/charsets.jar:/usr/java6_64/jre/lib/resources.jar:/usr/java6_64/jre/lib/ibmpkcs.jar:/usr/java6_64/jre/lib/ibmcertpathfw.jar:/usr/java6_64/jre/lib/ibmjgssfw.jar:/usr/java6_64/jre/lib/ibmjssefw.jar:/usr/java6_64/jre/lib/ibmsaslfw.jar:/usr/java6_64/jre/lib/ibmjcefw.jar:/usr/java6_64/jre/lib/ibmjgssprovider.jar:/usr/java6_64/jre/lib/ibmjsseprovider2.jar:/usr/java6_64/jre/lib/ibmcertpathprovider.jar:/usr/java6_64/jre/lib/ibmxmlcrypto.jar:/usr/java6_64/jre/lib/management-agent.jar:/usr/java6_64/jre/lib/xml.jar:/usr/java6_64/jre/lib/jlm.jar:/usr/java6_64/jre/lib/javascript.jar
    java.vendor     IBM Corporation
    file.separator  /
    java.compiler   j9jit24
    sun.io.unicode.encoding UnicodeBig
    ibm.signalhandling.sigchain     true
    Jan 14, 2014 4:34:00 PM com.siebel.etl.engine.core.ETLUtils logException
    SEVERE:
    ANOMALY INFO::: QServer will stop owing to repository version mismatch: Schema version is missing or old.
    Try upgrading repository.
    MESSAGE:::Schema version is missing or old.
    Try upgrading repository.
    EXCEPTION CLASS::: com.siebel.etl.upgrade1.VersionMismatchException
    com.siebel.etl.upgrade1.UpgradeUtils.isRepositoryValid(UpgradeUtils.java:96)
    com.siebel.etl.net.QServer.versionIntegrityCheck(QServer.java:461)
    com.siebel.etl.net.QServer.<init>(QServer.java:132)
    com.siebel.etl.net.QServer.main(QServer.java:499)

    (1) Download the Windows Installer CleanUp utility installer file (msicuu2.exe) from the following Major Geeks page (use one of the links under the "DOWNLOAD LOCATIONS" thingy on the Major Geeks page):
    http://majorgeeks.com/download.php?det=4459
    (2) Doubleclick the msicuu2.exe file and follow the prompts to install the Windows Installer CleanUp utility. (If you're on a Windows Vista or Windows 7 system and you get a Code 800A0046 error message when doubleclicking the msicuu2.exe file, try instead right-clicking on the msicuu2.exe file and selecting "Run as administrator".)
    (3) In your Start menu click All Programs and then click Windows Install Clean Up. The Windows Installer CleanUp utility window appears, listing software that is currently installed on your computer.
    (4) In the list of programs that appears in CleanUp, select any iTunes entries and click "Remove", as per the following screenshot:
    (5) Quit out of CleanUp, restart the PC and try another iTunes install. Does it go through properly this time?

  • Multiple issues with PCI-5640R FPGA: DAC and Strange Execution at Host

    We are working on a communications systems project using the PCI-5640R
    IF-RIO transceiver and the FPGA module. At the FPGA, a sequence of bits are
    being modulated through multiplication with the sine wave generator.  The
    next step is to take the modulated sinusoidal signal and send it through the
    DAC. Throughout this project, we have been using the Analog Input and Output
    example project from Getting Started with the 5640-R IF... as a template
    to build this project.  There are, however, several issues/questions we
    have. Attached are the HOST and FPGA VIs that we are working with.
    1.  The host only runs every other time. At the host (BPSK_TX(HOST).VI), execution halts indefinitely at one of the FIFOs until 'stop' is hit. But then on the subsequent run, the host completes execution of the program. In other words, the host only receives data from the FPGA every second time it is run. Why is this happening? Are we missing something at the host or FPGA VI?
    2.  How exactly do we send our own digitized signal through the DAC? As seen in one of the FPGA VIs, we have tried modifying the output section of the FPGA VI in the Analog Input and Output project, in which the FPGA reads from the FIFO. In our case, we are modulating the signal in a separate section, writing it to a target-scoped FIFO, and then reading from that FIFO and processing the data as in the example. This modified FPGA VI is "BPSK_TX(FPGA).vi".
    Unfortunately, we are not observing anything on an oscilloscope connected to the transceiver. Even when we try to pass in a "custom" signal at the HOST, we have no luck observing anything coherent. As seen in bare_sine_wave_test (FPGA).vi, we have attempted a relatively simple way of sending a signal through the DAC, yet still no luck. I am guessing that this is related to issue #1.
    On a related note, when
    receiving the signal and running it through the ADC, what steps are
    necessary?  Can one assume that it is
    similar to the FPGA.VI in the analog input and output example? 
    3.  How do for loops and while loops synchronize with timed loops
    and frames in the FPGA?  In the FPGA we are using a for loop to
    modulate the signal because the sine wave generator cannot be contained within
    a timed loop on the FPGA.  This will be important to us because at the
    receiver we will need to know the symbol rate in order to recover the signal.
     I would sincerely appreciate any feedback or help that can be provided
    on this,
    Attachments:
    BPSK_TX(FPGA).vi ‏152 KB
    BPSK_TX(HOST).vi ‏257 KB
    bare_sine_wave_test (HOST).vi ‏135 KB

    It may be that the FPGA Reference has not been bound. The issue was that the VIs need to be bound to the ni5640R FPGA VI Reference.ctl control. This is an option on the popup menu when clicking on the Open FPGA VI Reference VI. In some cases, it may already be selected in the popup menu; in that case, unselect the Bind to Typedef option. For good measure, I usually select the FPGA VI to use with the host VI, and then reset the Bind to Typedef option. In most cases this should fix the ni5640R FPGA VI Reference.ctl control mismatches throughout the VI. In some cases, I have to Save All, close the host VI and all subVIs, and then reopen the host VI. This has always worked in all cases for me.
    Jerry

  • Bug report: 3.2; DAC; GridControl; Navigation - Invalid reaction to mouse drag

    Hi,
    This is a bug report, I think.
    I have a one-column GridControl with a mandatory column. Normally, the DAC framework prohibits navigation out of an empty field and emits one of those DAC-1002 or DAC-603 errors. The only exception I noticed was a double-click on another row; in this case navigation is allowed, but an error is still produced.
    Today I was playing with this GridControl and discovered that I can drag the yellow frame across rows. And here comes the bug: the GridControl doesn't change the current row on the InfoBus according to the current selection marked by the yellow frame.
    In my case it brings the following problem: if the cursor is placed on the new empty row, and the user drags the cursor to another row and then tries to use any navigation means other than dragging, he gets a navigation error. The DAC framework still thinks that the current row is the empty one and produces an error telling him that the attribute cannot be empty.
    Dev Team, any comments?
    Regards,
    Vladimir

    Hmm... Nobody answers. I'll try another variation of the same question.
    The DAC framework implicitly validates attributes based on the Mandatory constraint in the Entity Object definition. Is it possible to set the level of this validation? For example, I want to conduct validation only when the user navigates out of the row or the whole rowset, but not when he just leaves a field.
    Regards,
    Vladimir
