Operating mode of the DAC script block

While the DAC script block in DIAdem 8.x offered the selection box "Operating mode: synchronous/asynchronous to the measurement clock", this field has disappeared in DIAdem 9.
Why? (The 'asynchronous' mode did not work in DIAdem 8.1 under WinXP.)
Have I overlooked something?
In which operating mode does the script block of DIAdem 9.1 actually run?
If the block is now always supposed to run synchronously to the measurement clock, and the (relatively slow) serial interfaces have to be serviced inside the script, then a DAC block diagram can hardly be said to run in 'hard real time' any more.
(see attachment)
Apparently the ticks from the block Takt2 (software interrupts?) also occur while the script block is executing, but they are not handled immediately; instead they pile up and are only processed, practically all at once, after the script block has finished.
Imagine what would happen if real measurement hardware stood in place of the generator simulation block, or if the DAC scheme also had to perform real control tasks.
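(For illustration only, not taken from the attachment: a minimal VBS sketch of a driver routine that spends roughly 100 ms per call, standing in for a slow serial transfer; the SFD_ReadChannel name follows the Script DAC driver template. In synchronous mode the whole scheme, including the Takt2 ticks, has to wait for every such call.)
' Illustrative stand-in for a slow serial transfer (~100 ms busy wait).
Sub SFD_ReadChannel( ChannelNumberP, ParamP, DataP, ErrorP )
  Dim tStart
  tStart = Timer                      ' seconds since midnight
  Do While Timer - tStart < 0.1       ' block for about 100 ms
  Loop
  DataP = 0                           ' still return a valid numeric value
End Sub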
Martin Bohm
[email protected]
Attachments:
script1.zip ‏4 KB

Thank you for the hint; it had slipped my mind. It is probably just as you write. However, I cannot follow the reasoning why the processing block should only run synchronously. It would be better for the script block to wait while handing data over to slow hardware than for the entire DAC scheme to wait; the scheme usually has other important tasks to perform (e.g. control sequences), as represented in the example by the frequency generator.
I have adapted my test scheme accordingly (Script2.zip). In asynchronous mode the execution is clearly improved, but still anything but satisfactory.
Martin Bohm
[email protected]
Attachments:
Script2.zip ‏4 KB

Similar Messages

  • Receiving string data from a DAC script block?

    Is there any way to receive non-numeric data that is produced inside a script block?
    The data lines (green lines) can only carry numeric values. The channel
    and script parameters can only pass data to the script, but cannot
    receive data from it. However, I need a way to somehow receive a string
    (it is a path) that is produced during script processing.
    In my desperation I modified an example from the help and tried to
    shovel data into the data area via an OLE/ActiveX connection. Here is
    the relevant excerpt:
    Function ConnectToDIAdem
    'Create the DIAdem object
      Dim nValueT
      ConnectToDIAdem = 0
        On Error Resume Next
        Set oDIAdem = CreateObject("DIAdem.TOCommand")
        If Err.Number > 0 then
          MsgBox ("Err No " & CStr(Err.Number) & " " & Err.Description)
          Err.Clear
        Else
          oDIAdem.bNoErrorDisplay = true
          oDIAdem.bNoWarningDisplay = true
          ConnectToDIAdem = 1
        End If
    End Function
    Sub DisconnectFromDIAdem
    'Destroy the DIAdem object
      Set oDIAdem   = Nothing
    End Sub
    Dim oDIAdem
    Sub SFD_Init( DeviceParam1V, DeviceParam2V, ErrorP )
        Const strCanNotStart = "An error has occurred while executing the example."
    'Execute the DIAdem command
            Dim Exe_All, Exe_One, Exe_Type, Para
        If ConnectToDIAdem Then
            If oDIAdem.CmdExecuteSync("ChD(2,2) = 15" ) <> 1 Then
               MsgBox strCanNotStart
            End If
            oDIAdem.CmdExecuteSync("WndShow('SHELL','Show')")
            Call DisconnectFromDIAdem
        Else
            MsgBox strCanNotStart
        End If
    End Sub
    When the DAC scheme (with the script block) is started, DIAdem first
    crashes without any message. Trying to terminate DIAdem with the Task
    Manager then promptly resets the computer. (Win2000)
    I am posting in German since DIAdem-DAC is probably only used in German-speaking countries anyway.

    Hello,
    There is only one small error in the script itself: assigning a value to the ChD variable must be done with :=.
    If you change this, the program can be executed in DIAdem SCRIPT.
    What you should avoid, however, is calling the DIAdem.ToCommand interface from the Script DAC driver. Accessing the data matrix from the driver is generally not possible, not even via this detour. The crash is not caused by the ChD variable, though; the ActiveX access to the DIAdem API from this context is enough by itself to cause problems.
    To be able to use the strings from your measurement device, you would have to write them to a file during the measurement. File I/O is permitted in the Script DAC driver context.
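    For illustration only (the file name and the use of Scripting.FileSystemObject are assumptions, not part of the original reply), such file logging inside the driver could look roughly like this:
    ' Sketch: append a string (e.g. a path) to a text file from the driver.
    Sub SFD_ReadChannel( ChannelNumberP, ParamP, DataP, ErrorP )
      Dim oFSO, oFile
      Set oFSO  = CreateObject("Scripting.FileSystemObject")
      Set oFile = oFSO.OpenTextFile("C:\temp\device_strings.txt", 8, True)  ' 8 = ForAppending
      oFile.WriteLine "path produced during the measurement"
      oFile.Close
      DataP = 0   ' the data line itself still carries only a numeric value
    End Sub
    A normal SCRIPT/VBS procedure can then read that file during or after the measurement.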
    Best regards
    Ingo Schumacher
    Systems Engineer Sound & Vibration, National Instruments Germany

  • DAC script block: Warning, "bad" variable names cause a system crash!

    Using the variable name "tu" in the script of a DAC script block causes a system crash! (see attachment)
    Tested with DIAdem 9.10.2160 under Win2000 Pro and WinXP Pro.
    Martin Bohm
    [email protected]
    This posting is not really a question. I just want to spare other DIAdem users the days of costly troubleshooting that I went through.
    Attachments:
    script3.zip ‏3 KB

    Hello Mr. Bohm,
    TU is a DIAdem variable that marks underlining in the object for simple texts. TXTUNDL is the "long" variant, which is also logged, e.g. with CTRL-A. In general, such variables can only be used in the context of the corresponding object, e.g.:
    Call GRAPHObjOpen("Text1")
    TXTUNDL = 1
    Call GRAPHObjClose("Text1")
    We will add a check for this in the next version.
    Regards
    Walter

  • Data exchange between main script (SCRIPT) and script block (DAC)

    Is there any way to exchange data between a main script (SCRIPT) with
    user dialogs and a script block (DAC), such that the script in the
    script block can access this data?
    Background: I am writing a DAC application with several script blocks for
    reading and writing data to/from real devices. During development
    I'd like to simulate all device accesses because I don't have the
    devices in my office. I write all scripts with a branch for simulation
    and real measurement on startup.
    How can I perform a switch (simulation / measurement) without changing
    all my scripts every time? Can a script read a variable at all (a variable
    from the main script, a DIAdem user variable, or an auxiliary variable like L1)?
    Can I fill "DeviceParam1V" with the content of a variable?
    I could use an input channel connected to a formula block for this. (The
    formula block can read a variable.) But this approach is awkward and
    does not work for input blocks.
    Martin Bohm
    [email protected]

    Because the DAC script is executed in its own runtime environment, you cannot use the DIAdem variables as you would in a normal VBS script or a SUD.
    Still, there are ways to exchange information.
    First of all, by an extra channel used as an input (as you mentioned).
    Secondly, there are several variables you can use. Have a look at the Script DAC block. There are two fields called Parameter1 and Parameter2. And each signal you configure has a parameter of its own.
    Prior to starting the scheme, you can use a script to change the value of those parameters:
    Call DACObjOpen("Script-in1")
      VBSSignalParam(1) = "abc"
    Call DACObjClose("Script-in1")
    This changes the parameter of the first signal that is configured.
    Call DACObjOpen("Script-in1")
      VBSParameter1 = "1st device parameter"
      VBSParameter2 = "2nd device parameter"
    Call DACObjClose("Script-in1")
    This changes the global device parameters.
    On the Script DAC driver (VBS) side, you can use the ParamP argument to access the signal parameter that corresponds to the current channel (as referenced by ChannelNumberP):
    ' SFD_ReadChannel
    ' Purpose             : Reads a value for the channel "ChannelNumberP"
    ' ChannelNumberP      | Channel number from the block dialog
    ' ParamP              | User-defined variable from the block dialog
    ' DataP               | Variable for returning the new channel value. This
    '                     | variable should at least be initialized to a valid value.
    ' ErrorP              | Variable for returning an error message. If this
    '                     | variable is set, DIAdem stops the measurement.
    Sub SFD_ReadChannel( ChannelNumberP, ParamP, DataP, ErrorP )
    End Sub
    To access the device parameters, use the init function:
    ' SFD_Init
    ' Purpose             : This procedure is called when the measurement starts
    ' DeviceParam1V       | First parameter that the user can enter in the DAC block
    ' DeviceParam2V       | Second parameter that the user can enter in the DAC block
    ' ErrorP              | Variable for returning an error message. If this
    '                     | variable is set, DIAdem stops the measurement.
    Sub SFD_Init( DeviceParam1V, DeviceParam2V, ErrorP )
    End Sub
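    For illustration only (the flag name gSimulate and the values "SIM"/"REAL" are made up, not part of this reply), the simulation/measurement switch asked about above could be passed in through DeviceParam1V and evaluated on every read:
    ' Sketch: branch between simulation and real measurement based on the
    ' device parameter entered in the DAC block dialog.
    Dim gSimulate   ' script-level flag, set once when the measurement starts
    Sub SFD_Init( DeviceParam1V, DeviceParam2V, ErrorP )
      gSimulate = (UCase(DeviceParam1V) = "SIM")   ' e.g. enter "SIM" or "REAL" as Parameter1
    End Sub
    Sub SFD_ReadChannel( ChannelNumberP, ParamP, DataP, ErrorP )
      If gSimulate Then
        DataP = Rnd        ' simulated reading
      Else
        DataP = 0          ' placeholder: real device access would go here
      End If
    End Sub
    The switch then only has to be changed in one place, the block dialog, instead of in every script.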
    Ingo Schumacher
    Systems Engineer Sound & Vibration, National Instruments Germany

  • Error while creating a new DAC connection using connection type MSSQL

    Hi,
    I am trying to create a new DAC connection i.e. a new DAC repository in the SQL Server 2008 database.
    DAC version : 10.1.3.4.1
    Database : SQL Server 2008
    I have downloaded the sqljdbc4.jar file from the below link and placed it in the D:\orahome\10gR3_1\bifoundation\dac\lib folder.
    [http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=11774 ]
    I have entered all the details correctly for database name, database host, database port. I created a new Authentication file.
    I get the below error when I try to test the connection.
    MESSAGE:::MSSQL driver not available!
    EXCEPTION CLASS::: java.lang.IllegalArgumentException
    com.siebel.etl.gui.login.LoginDataHandler$LoginStructure.testConnection(LoginDataHandler.java:512)
    com.siebel.etl.gui.login.LoginDataHandler.testConnection(LoginDataHandler.java:386)
    com.siebel.etl.gui.login.ConnectionTestDialog$Executor.run(ConnectionTestDialog.java:290)
    ::: CAUSE :::
    MESSAGE:::com.microsoft.sqlserver.jdbc.SQLServerDriver
    EXCEPTION CLASS::: java.lang.ClassNotFoundException
    java.net.URLClassLoader$1.run(URLClassLoader.java:200)
    java.security.AccessController.doPrivileged(Native Method)
    java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:276)
    java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
    java.lang.Class.forName0(Native Method)
    java.lang.Class.forName(Class.java:169)
    com.siebel.etl.gui.login.LoginDataHandler$LoginStructure.testConnection(LoginDataHandler.java:510)
    com.siebel.etl.gui.login.LoginDataHandler.testConnection(LoginDataHandler.java:386)
    com.siebel.etl.gui.login.ConnectionTestDialog$Executor.run(ConnectionTestDialog.java:290)
    The error seems to be a connectivity issue with SQL Server. Am I using the correct jar file?
    Please help me out in resolving this issue. Appreciate the help provided on this forum earlier.
    Thank You

    Add
    .\lib\sqljdbc4.jar
    to the end of the line starting with SQLSERVERLIB in the config.bat file.
    Please mark as correct if this helps.

  • Error while creating a task for creating an generic task  /app/dac/CustomSQ

    Hi ,
    I have created a SQL file on the DAC server in /app/dac/CustomSQLs/, just to fire an update SQL statement against the database.
    In the DAC Task tab I have created a task with the following:
    Command for incremental load :/app/dac/CustomSQLs/DBNameBeforeLoad.sql
    Primary source :flatfileconnection
    target source:DBCONNECTION_OLAP
    Execution type:SQL FILE
    Task phase:GENERAL
    I created a subject area and assembled it, then created an execution plan.
    When I try to execute this EP, it shows the following error in the DAC log:
    ANOMALY INFO::: Error while creating a task for creating an generic task /app/dac/CustomSQLs/DBNameBeforeLoad.sql
    MESSAGE:::/app/dac/CustomSQLs/DBNameBeforeLoad.sql - invalide template name!
    EXCEPTION CLASS::: com.siebel.analytics.etl.etltask.TaskInitializationException
    com.siebel.analytics.etl.etltask.SQLFileTask.doInit(SQLFileTask.java:69)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.init(GenericTaskImpl.java:194)
    Is my above configuration correct?

    verify the following settings:
    1. mapping of ECC plant and storage location and logical system number with the appropriate availability group code since this links the stock types in EWM to ECC org elements (SPRO -> EWM -> Interfaces -> ERP Integration -> Goods movement)
    2. ensure the availability group configuration is correct (SPRO -> EWM -> GR Process -> Configure availability group for putaway) - check all applicable nodes in this configuration since the availability group code is assigned in the pertinent storage types

  • Error while creating the DWH tables using DAC

    Hi,
    I am getting an error while creating the DWH tables using DAC. I created an ODBC DSN using the Merant driver with the DAC repository DB credentials, and the test connection is successful. While creating the tables I gave the OLAP DW credentials and the DSN name which I created earlier. But it throws the error below:
    Please find the below mentioned error message
    =====================================
    STD OUTPUT
    =====================================
    CREATING SIEBEL DATABASE OBJECTS
    F:\DAC\bifoundation\dac\UTILITIES\BIN\DDLIMP /I N /s N /u infdomain /p ******* /c DB_DAC /G "SSE_ROLE" /f F:\DAC\bifoundation\dac/conf/sqlgen/ctl-file/oracle_bi_dw.ctl /b "" /K "" /X "" /W N
    Error while importing Siebel database schema.
    =====================================
    ERROR OUTPUT
    =====================================
    Siebel Enterprise Applications ODBC DDL Import Utility, Version 7.7 [18030] ENU
    Copyright (c) 2001 Siebel Systems, Inc. All rights reserved.
    This software is the property of Siebel Systems, Inc., 2207 Bridgepointe Parkway,
    San Mateo, CA 94404.
    User agrees that any use of this software is governed by: (1) the applicable
    user limitations and other terms and conditions of the license agreement which
    has been entered into with Siebel Systems or its authorized distributors; and
    (2) the proprietary and restricted rights notices included in this software.
    WARNING: THIS COMPUTER PROGRAM IS PROTECTED BY U.S. AND INTERNATIONAL LAW.
    UNAUTHORIZED REPRODUCTION, DISTRIBUTION OR USE OF THIS PROGRAM, OR ANY PORTION
    OF IT, MAY RESULT IN SEVERE CIVIL AND CRIMINAL PENALTIES, AND WILL BE
    PROSECUTED TO THE MAXIMUM EXTENT POSSIBLE UNDER THE LAW.
    If you have received this software in error, please notify Siebel Systems
    immediately at (650) 295-5000.
    F:\DAC\bifoundation\dac\UTILITIES\BIN\DDLIMP /I N /s N /u infdomain /p ***** /c DB_DAC /G SSE_ROLE /f F:\DAC\bifoundation\dac/conf/sqlgen/ctl-file/oracle_bi_dw.ctl /b /K /X /W N
    Connecting to the database...
    28000: [DataDirect][ODBC Oracle driver][Oracle]ORA-01017: invalid username/password; logon denied
    Unable to connect to the database...
    any help is appreciated.
    Thanks,
    RM

    The fact that you are getting an "ORA-01017: invalid username/password; logon denied" message indicates that you are at least talking to the database.
    The log shows that username "infdomain" is being used. Can you double check the username and password you have in DAC in a SQL*Plus/SQL Developer session?
    Please mark if useful/helpful,
    Andy.

  • Error while importing a DW table into DAC

    Hi,
    We are on OBIEE 7.9.6 and we have a requirement to add a new DW table. I created a new table in the DW and am getting the error "Could connect to data source process. Process failed during creation of connection pool" while trying to import this new table into DAC. I would like to know why I am getting this error.
    Any help is greatly appreciated.
    Thanks,
    Chandra

    Thank you for your response.
    I added this table in the DW database and have imported it into Informatica. When I look at the traffic light it is red. Looks like that's the problem.
    Thanks again.

  • Error While running the ETL Load in DAC (BI Financial Analytics)

    Hi All,
    I have installed and configured BI Applications 7.9.5 and Informatica 8.1.1. The first time we ran the ETL load in DAC it failed, even though every test connection was successful. We are getting the error message below.
    The log file which I pasted below is from the path
    /u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared
    /SessLogs
    SDE_ORAR12_Adaptor.SDE_ORA_GL_AP_LinkageInformation_Extract_Full.log
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value ['Y'] for mapping parameter:[$$FILTER_BY_LEDGER_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_TYPE].
    DIRECTOR> VAR_27028 Use override value [04/02/2007] for mapping parameter:[$$INITIAL_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [1] for mapping parameter:[$$LEDGER_ID_LIST].
    DIRECTOR> VAR_27028 Use override value ['NONE'] for mapping parameter:[$$LEDGER_TYPE_LIST].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] at [Thu Feb 12 12:49:33 2009]
    DIRECTOR> TM_6683 Repository Name: [DEV_Oracle_BI_DW_Rep]
    DIRECTOR> TM_6684 Server Name: [DEV_Oracle_BI_DW_Rep_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_GL_AP_LinkageInformation_Extract [version 1]
    DIRECTOR> TM_6827 [u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared/Storage] will be used as storage directory for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB,apps@devr12 bawdev@devbi]
    DIRECTOR> TM_6703 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] is run by 64-bit Integration Service [node01_oratestbi], version [8.1.1 SP4], build [0817].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [ISO 8859-1 Western European]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Thu Feb 12 12:49:34 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Thu Feb 12 12:49:34 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 12582912 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [devr12.tessco.com], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [DEVBI], user [bawdev], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_GL_LINKAGE_INFORMATION_GS :SQL INSERT statement:
    INSERT INTO W_GL_LINKAGE_INFORMATION_GS(SOURCE_DISTRIBUTION_ID,JOURNAL_LINE_INTEGRATION_ID,LEDGER_ID,LEDGER_TYPE,DISTRIBUTION_SOURCE,JE_BATCH_NAME,JE_HEADER_NAME,JE_LINE_NUM,POSTED_ON_DT,SLA_TRX_INTEGRATION_ID,DATASOURCE_NUM_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_GL_LINKAGE_INFORMATION_GS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Thu Feb 12 12:49:34 2009
    Target tables:
    W_GL_LINKAGE_INFORMATION_GS
    READER_1_1_1> RR_4029 SQ Instance [SQ_XLA_AE_LINES] User specified SQL Query [SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
          AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AP_INV_DIST', 'AP_PMT_DIST'
              , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('04/02/2007 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Thu Feb 12 12:49:34 2009)
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_GL_LINKAGE_INFORMATION_GS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Thu Feb 12 12:49:34 2009
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_GL_LINKAGE_INFORMATION_GS (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
    WRT_8044 No data loaded for this target
    WRITER_1__1> WRT_8043 ****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_XLA_AE_LINES] has completed: Total Run Time = [0.673295] secs, Total Idle Time = [0.000000] secs, Busy Percentage = [100.000000].
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GL_LINKAGE_INFORMATION_GS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Thu Feb 12 12:49:35 2009)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Thu Feb 12 12:49:35 2009)
    MAPPING> TM_6018 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [SQ_XLA_AE_LINES] (Instance Name: [SQ_XLA_AE_LINES])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_GL_LINKAGE_INFORMATION_GS] (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] completed at [Thu Feb 12 12:49:36 2009]
    Thanks in Advance,
    Prashanth
    Edited by: user10719430 on Feb 11, 2009 7:33 AM
    Edited by: user10719430 on Feb 12, 2009 11:31 AM

    You need to increase the temp tablespace; the ORA-27072 "No space left on device" errors in the log show the database ran out of disk space.

  • Regarding REFRESHING of Data in Data warehouse using DAC Incremental approa

    My client is planning to move from Discoverer to OBIA but before that we need some answers.
    1) My client needs the data to be refreshed every hour (incremental load using DAC) because they are using a lot of real-time data.
    We don't have much updated data (e.g. 10 invoices an hour plus some others). How much time does it usually take to refresh those tables in the data warehouse using DAC?
    2) While a table is being refreshed, can we use that table to generate a report? If yes, what is the state of the data? Stale or incorrect (undefined)?
    3) How does the refresh of Financials Analytics work? Is it one module at a time, or does it treat all 3 modules (GL, AR and AP) as a single unit of refresh?
    I would really appreciate if I can get an answer for all the questions.
    Thank You,

    Here you go for answers:
    1) It shouldn't be much of a problem for such a small amount of data. It all depends on your execution plan in DAC, which can always be created anew and customized to load data only for those tables (star schema). Approximately 15-20 minutes, since DAC does many things apart from loading the table.
    2) A report in OBIEE will show the previous data, as the cache will be (should be) turned on. You will get the new data in reports after the refresh is complete and the cache is cleared using one of various methods (event polling preferred).
    3) Again, for Financials Analytics or any other module you will have out-of-the-box execution plans, but you can create your own plans and execute them. GL, AR and AP are also provided separately.
    Hope this answers your questions. You will learn more by going through the Oracle docs, in particular those for DAC.

  • Unable to capture video from VHS through DAC-200

    (Running Final Cut Pro 5.1.4)
    I have a DAC-200 to hook up my VCR to my G5 in order to capture video from VHS tapes into Final Cut. I had this set-up working a few months ago, but it was disconnected to rearrange equipment and now I can't get it working again. I have followed the advice from the DAC-200 installation instructions and advice from other forum topics but can't seem to get it to work. I finally was able to get something in the log and capture window besides "preview disabled" and "cannot capture because there is no video." I now can get Final Cut to capture video, however it only captures a blank black screen (and this is all I can see in the preview window).
    I've triple-checked every cable hook-up, the G5 recognizes the converter box, and the video tape plays fine to a TV. I've even had our IT guy here at work come check out my set-up, and he can't figure out what is wrong either. (He did get my Log & Capture window to show white fuzz in the black screen at one point though. Does that mean anything?)
    Any help is greatly appreciated! (I'm happy to answer any other questions too.) Thanks!

    I had difficulty with OS X 10.4.10 plus QT 7.2 and put my info on the forum as well as sending feedback to Apple. Shortly afterwards, Apple brought out a small security update which I downloaded and, hey presto, I was then able to use QT 7.2 with OS X 10.4.10. It may be that you missed out on the later small update to QT 7.2? Another factor I found was that some little while later I again had a problem, so I again installed QT 7.2 and over-installed FCP 5.1.4, which has resolved my problems for the time being. Worth a try if you still can't get through on the DAC.
    Ron.

  • IMovie not recognising Data Video DAC AV/DV converter

    I use my DataVideo DAC-1 to capture old VHS family movies, drop them into iMovie, edit, and make short DVDs. The system worked a treat with our old G4 and Panther. I have just bought a dual G5 2 Gig machine and iMovie HD. The system is not seeing the DAC on the FireWire bus. I have tried new cables, plugging into a native bus, etc., but no go. The DAC is admittedly 5 years old but worked perfectly up to now. I am loath to have to scrap it because of a FireWire incompatibility issue. Anyone else out there with similar issues and a possible solution? Otherwise it's off to mail-order I go, to cough up €200.

    Hi Richard,
    I cannot help, sorry, but I have read many reports here of converters that worked and then failed....
    I recommend http://www.apple.com/feedback/imovie.html to report your system profile and converter specs in detail.... maybe there IS some <bad word following> bug???

  • Unit Testing in DAC

    Hi,
    I am facing a problem with unit testing in DAC. I am able to run full and incremental loads, but I am unable to do unit testing in DAC. It shows an error that my Integration Services are not up when I run an individual task. Are there any setups required for unit testing in DAC? I followed http://docs.oracle.com/cd/E15586_01/fusionapps.1111/e14849/dacquickstart.htm#BABCIIIH, topic: "To unit test an execution plan task". If anyone could help me run a unit test, that would be a great help.
    Thanks,
    Ram

    I'm struggling with the idea of testing View/Controller classes which depend on "container things" like bindings. Since there are many ways to interact with the 'container things' I guess it depends on exactly how you're using those bindings in your View/Controller classes. I've had success using Mockito along with my own fixture classes that do things like set up a mock ADFContext and RequestContext. Within my test setUp and tearDown, I set up and tear down these fixture classes, which results in a safe, mocked-out environment in which the test can execute. It's often tricky because many of the these objects are static and singleton-like, but so far I've always found a back door by which to inject a mock object. We also have our own utility classes that look-up 'container things', so we of course make sure we have a way to provide mock 'container things' through those utility classes.
    If you haven't come across the concept of 'mock objects' before, then it may be best to do a bit of background reading first:
    [http://www.mockobjects.com/2009/09/brief-history-of-mock-objects.html]
    [http://en.wikipedia.org/wiki/Mock_object]
    There are many mocking libraries about (EasyMock, jMock, rMock) but IMO Mockito is the one to choose right now.
    Of course, the alternative is to apply some [Inversion of Control|http://martinfowler.com/articles/injection.html] and make sure that the code you write is always ignorant of exactly where the 'container things' have come from. This makes mocking much easier, but I've not yet investigated use of a D.I. container (e.g. Spring) along with ADF - I suspect your options in 10g would be limited.

  • How do I use a DAC with my AirPort Express?

    Hello,
    Can I use a DAC with AirPort Express? If yes, how?
    TIA

    check out this CNET Crave article: Using Apple's AirPort Express with a DAC: A how-to guide

  • Problem with Itunes and Remote when external DAC is used

    I am using the iPad Remote app to connect to an iTunes library on a paired Windows Vista Toshiba laptop.
    When the PC speakers or the iPad are used for audio reproduction, everything works OK and iPad Remote can connect to the iTunes library.
    Unfortunately, when the optical audio output of the PC is used for audio reproduction through an external DAC, amplifier and speakers, the iPad cannot connect to the iTunes library!
    Thanks in advance.

    Perhaps Tom could assist.
