SQL query log file

Hi All,
I am using a standalone BIP environment. Is there a way to see the actual query log that BIP generates? Where is it located on the server, or how can it be activated?
I want to see the actual SQL query that is generated, along with the variables in it.
FYI: I want the query log, not the server error log.
-dev

Hi Eric,
I would like you to re-check the content level settings here, as they are the primary cause of this kind of behavior. You may notice that the same information is written in the logical plan of the query too.
Also, as per your description:
"In the SOURCES for this logical table, I've set the logical level of the content for E2 appropriately (detail level, same as E1)"
I would like you to check this point again: if you have mapped E2 to E1 in the same logical table source with an inner join, you should set the content level at the E1 levels themselves, not at E2 (E2 effectively becomes part of the E1 hierarchy). This might be the reason the BI Server is choosing to eliminate (null) the values from E2, even though you can see them in the SQL client.
Hope this helps.
Thank you,
Dhar

Similar Messages

  • Summary SQL*LDR log file

    Hi,
    I need to get only a summary log file that shows me record counts by error. For example, if the log file has 5000 errors for datatype, 300 for PK violation, and 2000 for not-null violations, the problem is that the log shows these errors row by row. Can I get a summary like this:
    - 5000 rows rejected due to datatype error
    - 300 rows rejected due to PK violation
    - 2000 rows rejected due to not-null violation
    Database: 9i
    O/S: Windows 2000 Server

    eng. Habeeli wrote:
        schavali wrote:
            Pl see your duplicate post here - big SQL*LDR log file
            Pl post what you have found in the documentation so far. AFAIK, there is no way to get a summary log file.
            Srini
        It's not a duplicate; I moved to this forum because it is specialized in SQL*Loader. I didn't find anything about a summary log in the documentation.
    Please post a link to the document you read. Did you see my comment above?
    HTH
    Srini
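    Since there is no built-in summary, one workaround is to post-process the log file yourself. Below is a minimal sketch in PL/SQL that counts rejected records by ORA- error code; it assumes the default log format (each rejected record is followed by a line carrying its ORA- code), an existing directory object called LOG_DIR pointing at the log's folder, and a 10g-or-later database (REGEXP_SUBSTR and VARCHAR2-indexed collections do not exist on 9i, where an OS script would be simpler). Run it with SERVEROUTPUT ON to see the counts.
    DECLARE
      l_file   UTL_FILE.FILE_TYPE;
      l_line   VARCHAR2(32767);
      l_code   VARCHAR2(30);
      TYPE t_counts IS TABLE OF PLS_INTEGER INDEX BY VARCHAR2(30);
      l_counts t_counts;
    BEGIN
      -- LOG_DIR is an assumed directory object; load.log is the SQL*Loader log to summarize
      l_file := UTL_FILE.FOPEN('LOG_DIR', 'load.log', 'r', 32767);
      LOOP
        BEGIN
          UTL_FILE.GET_LINE(l_file, l_line);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
        END;
        l_code := REGEXP_SUBSTR(l_line, 'ORA-[0-9]+');
        IF l_code IS NOT NULL THEN
          IF l_counts.EXISTS(l_code) THEN
            l_counts(l_code) := l_counts(l_code) + 1;
          ELSE
            l_counts(l_code) := 1;
          END IF;
        END IF;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
      -- print one summary line per error code
      l_code := l_counts.FIRST;
      WHILE l_code IS NOT NULL LOOP
        DBMS_OUTPUT.PUT_LINE(l_counts(l_code) || ' rows rejected due to ' || l_code);
        l_code := l_counts.NEXT(l_code);
      END LOOP;
    END;
    /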

  • Location of query log files in OBIEE 11g (version 11.1.1.5)

    Hi,
    I wish to know the location of the query log files in OBIEE 11g (version 11.1.1.5).

    Hi,
    Log Files in OBIEE 11g
    Log in to the URL http://server.domain:7001/em and navigate to:
    Farm_bifoundation_domain -> Business Intelligence -> coreapplication -> Diagnostics -> Log Messages
    You will find the available files:
    Presentation Services Log
    Server Log
    Scheduler Log
    JavaHost Log
    Cluster Controller Log
    Action Services Log
    Security Services Log
    Administrator Services Log
    However, you can also review them directly on the hard disk.
    The log files for OBIEE components are under <OBIEE_HOME>/instances/instance1/diagnostics/logs.
    Specific log files and their locations are listed in the following table:
    Log                                                  Location
    Installation log                                     <OBIEE_HOME>/logs
    nqquery log                                          <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    nqserver log                                         <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_NQSAdminTool log                          <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_NQSUDMLExec log                           <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_obieerpdmigrateutil log (Migration log)   <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    sawlog0 log (Presentation)                           <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1
    jh log (Java Host)                                   <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIJavaHostComponent/coreapplication_obijh
    webcatupgrade log (Web Catalog Upgrade)              <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1
    nqscheduler log (Agents)                             <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBISchedulerComponent/coreapplication_obisch1
    nqcluster log                                        <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIClusterControllerComponent/coreapplication_obiccs1
    ODBC log                                             <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIODBCComponent/coreapplication_obips1
    opmn log                                             <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    debug log                                            <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    logquery log                                         <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    service log                                          <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    opmn out                                             <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    Upgrade Assistant log                                <OBIEE_HOME>/Oracle_BI1/upgrade/logs
    Regards
    MuRam

  • Best way to spool DYNAMIC SQL query to file from PL/SQL

    Best way to spool DYNAMIC SQL query to file from PL/SQL [Package], not SqlPlus
    I'm looking for suggestions on how to create an output file (fixed width or comma delimited) from a SELECT that is built dynamically. Basically, I've got some tables that are used to define the SELECT and to describe the output format. For instance, one table has the SELECT, while another is used to define the column "formats" (e.g., Column Order, Justification, FormatMask, Default value, min length, ...). The user has an app that they can use to customize the output, which leaves the gathering of the data untouched. I'm trying to keep this formatting and/or default logic out of the actual query. This led me to a problem.
    Example query:
    SELECT CONTRACT_ID, PV_ID, START_DATE
    FROM CONTRACT
    WHERE CONTRACT_ID = <<value>>
    Customization table:
    CONTRACT_ID : 2, Numeric, Right
    PV_ID : 1, Numeric, Mask(0000)
    START_DATE : 3, Date, Mask(mm/dd/yyyy)
    The first value is the kicker (ColumnOrder), as well as the fact that the number of columns is dynamic. Technically, if I could use SQL*Plus then I could just use SPOOL. However, I'm not.
    So basically, I'm trying to build a generic routine that can take a SQL string, execute the SELECT, and map the output to a file using data from another table.
    Any suggestions?
    Thanks,
    Jason

    You could build the SELECT statement within PL/SQL and open it with a cursor variable, then write the output to a file using the UTL_FILE package. If you want to display the output in SQL*Plus, you could have an OUT parameter as a ref cursor.
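    To make that idea concrete, here is a minimal sketch (not a definitive implementation) that opens an arbitrary SELECT with DBMS_SQL, fetches every column as text, and writes comma-delimited lines with UTL_FILE. EXPORT_DIR is an assumed directory object, and the per-column format/justification metadata from your customization table is deliberately left out:
    CREATE OR REPLACE PROCEDURE spool_query (p_sql IN VARCHAR2, p_file IN VARCHAR2) IS
      l_cur    INTEGER := DBMS_SQL.OPEN_CURSOR;
      l_cols   DBMS_SQL.DESC_TAB;
      l_ncols  INTEGER;
      l_val    VARCHAR2(4000);
      l_line   VARCHAR2(32767);
      l_out    UTL_FILE.FILE_TYPE;
      l_ignore INTEGER;
    BEGIN
      l_out := UTL_FILE.FOPEN('EXPORT_DIR', p_file, 'w', 32767);  -- EXPORT_DIR: assumed directory object
      DBMS_SQL.PARSE(l_cur, p_sql, DBMS_SQL.NATIVE);
      DBMS_SQL.DESCRIBE_COLUMNS(l_cur, l_ncols, l_cols);
      FOR i IN 1 .. l_ncols LOOP
        DBMS_SQL.DEFINE_COLUMN(l_cur, i, l_val, 4000);            -- fetch every column as text
      END LOOP;
      l_ignore := DBMS_SQL.EXECUTE(l_cur);
      WHILE DBMS_SQL.FETCH_ROWS(l_cur) > 0 LOOP
        l_line := NULL;
        FOR i IN 1 .. l_ncols LOOP
          DBMS_SQL.COLUMN_VALUE(l_cur, i, l_val);
          l_line := l_line || CASE WHEN i > 1 THEN ',' END || l_val;
        END LOOP;
        UTL_FILE.PUT_LINE(l_out, l_line);
      END LOOP;
      DBMS_SQL.CLOSE_CURSOR(l_cur);
      UTL_FILE.FCLOSE(l_out);
    EXCEPTION
      WHEN OTHERS THEN
        IF DBMS_SQL.IS_OPEN(l_cur) THEN DBMS_SQL.CLOSE_CURSOR(l_cur); END IF;
        IF UTL_FILE.IS_OPEN(l_out) THEN UTL_FILE.FCLOSE(l_out); END IF;
        RAISE;
    END spool_query;
    /
    Applying the ColumnOrder, justification, and format masks would then be a matter of looking up each column (e.g. by l_cols(i).col_name, which DESCRIBE_COLUMNS provides) before concatenating the value into the output line.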

  • How to Open Or read SQL Server log file .ldf

    Hi all,
    How do I open or read a SQL Server log file (.ldf)?
    Whenever we create a database in SQL Server, it creates two files: (1) .mdf and (2) .ldf.
    I want to see what's inside the .ldf file.
    Thanks,
    Ashok

    I am not too sure, but the two undocumented commands below might yield the desired result:
    DBCC LOG
    The fn_dblog function
    Refer to these links for more info:
    http://www.mssqlcity.com/Articles/Undoc/SQL2000UndocDBCC.htm
    http://blogs.sqlserver.org.au/blogs/greg_linwood/archive/2004/11/27/37.aspx
    http://searchsqlserver.techtarget.com/tip/0,289483,sid87_gci1173464,00.html
    Some 3rd party tools like Log Explorer can do the job for you.
    http://www.lumigent.com/products/le_sql.html
    - Deepak

  • Problem specifying SQL Loader Log file destination using EM

    Good evening,
    I am following the example given in the 2 Day DBA document chapter 8 section 16.
    In step 5 of 7, EM does not allow me to specify the destination of the SQL Loader log file to be on a mapped network drive.
    The question: Does SQL Loader have a limitation that I am not aware of, that prevents placing the log file on a network share or am I getting this error because of something else I am inadvertently doing wrong ?
    Note: I have placed the DDL, load file data, and the steps I follow in EM at the bottom of this post to facilitate reproducing the problem (drive Z is a mapped drive).
    Thank you for your help,
    John.
    DDL (generated using SQL Developer; you may want to reduce the space allocated):
    CREATE TABLE "NICK"."PURCHASE_ORDERS"
      (
        "PO_NUMBER"      NUMBER NOT NULL ENABLE,
        "PO_DESCRIPTION" VARCHAR2(200 BYTE),
        "PO_DATE" DATE NOT NULL ENABLE,
        "PO_VENDOR" NUMBER NOT NULL ENABLE,
        "PO_DATE_RECEIVED" DATE,
        PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      )
      SEGMENT CREATION DEFERRED PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
      STORAGE (INITIAL 67108864)
      TABLESPACE "USERS";
    Load.dat file contents
    1, Office Equipment, 25-MAY-2006, 1201, 13-JUN-2006
    2, Computer System, 18-JUN-2006, 1201, 27-JUN-2006
    3, Travel Expense, 26-JUN-2006, 1340, 11-JUL-2006
    Steps I am carrying out in EM
    log in, select data movement -> Load Data from User Files
    Automatically generate control file
    (enter host credentials that work on your machine)
    continue
    Step 1 of 7 ->
      Data file is located on your browser machine
      "Z:\Documentation\Oracle\2DayDBA\Scripts\Load.dat"
       click next
    step 2 of 7 ->
      Table Name
      nick.purchase_orders
      click next
    step 3 of 7 ->
      click next
    step 4 of 7 ->
      click next
    step 5 of 7 ->
      Generate log file where logging information is to be stored
      Z:\Documentation\Oracle\2DayDBA\Scripts\Load.LOG
      Validation Error
      Examine and correct the following errors, then retry the operation:
      LogFile - The directory does not exist.

    Hi John,
    But I didn't get any error when I did the same as you did.
    My Oracle version is 10.2.0.1, on Windows XP. See what I did and how it worked:
    1. I created one table in the SCOTT schema:
    SCOTT@orcl> CREATE TABLE "PURCHASE_ORDERS"
      2  (
      3      "PO_NUMBER"      NUMBER NOT NULL ENABLE,
      4      "PO_DESCRIPTION" VARCHAR2(200 BYTE),
      5      "PO_DATE" DATE NOT NULL ENABLE,
      6      "PO_VENDOR" NUMBER NOT NULL ENABLE,
      7      "PO_DATE_RECEIVED" DATE,
      8      PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      9  )
    10  TABLESPACE "USERS";
    Table created.
    I logged into EM: Maintenance --> Data Movement --> Load Data from User Files --> My Host Credentials
    Here there are a total of 3 text boxes:
    1. Server Data File: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\USERS01.DBF
    2. Data File is Located on Your Browser Machine: z:\load.dat <--- Here z:\ is another machine's shared documents folder; I selected this option (radio button) and created the same load.dat as you mentioned.
    3. Temporary File Location: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\ <--- I didn't change anything here.
    Step 2 of 7: Table Name: scott.PURCHASE_ORDERS
    Step 3 of 7: I just clicked Next
    Step 4 of 7: I just clicked Next
    Step 5 of 7: I just clicked Next
    Step 6 of 7: I just clicked Next
    Step 7 of 7: Here are the Control File Contents:
    LOAD DATA
    APPEND
    INTO TABLE scott.PURCHASE_ORDERS
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (
      PO_NUMBER INTEGER EXTERNAL,
      PO_DESCRIPTION CHAR,
      PO_DATE DATE,
      PO_VENDOR INTEGER EXTERNAL,
      PO_DATE_RECEIVED DATE
    )
    Then I just clicked Submit Job.
    Now I got all 3 rows in purchase_orders:
    SCOTT@orcl> select count(*) from purchase_orders;
      COUNT(*)
             3
    So, there is no bug; it worked. Please retry if you get any error or issue.
    HTH
    Girish Sharma
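    A side note on the mapped-drive error above: if EM keeps rejecting a log destination on a network drive, one workaround sketch is to run SQL*Loader directly from a command prompt on a machine that has an Oracle client installed and can see the data file, pointing the log at a local path. The credentials, control file, and paths below are illustrative only:
    sqlldr userid=nick/password control=load.ctl data=Z:\Documentation\Oracle\2DayDBA\Scripts\Load.dat log=C:\temp\Load.log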

  • MaxDB SQL query log

    Hi,
    I was wondering if there is a way to see SQL query logs.
    I am trying to put some BLOBs in the database; on my development environment this works fine,
    but on a new machine with a new database it doesn't work.
    It would be helpful to see some kind of query log, to get an idea of what is about to be inserted
    into the database.
    It would be helpful anyway to get a log of the SQL statements passed to the database system.
    Thanks for any help

    Hi,
    although SAP MaxDB logs all data-manipulating statements in the log volumes, this information cannot be translated back into SQL statements.
    If you wanted to track DDL (Data Definition Language) commands, you would find these via the SYSDDLHISTORY table. However, there is no such internal table for DML (Data Manipulation Language) commands in SAP MaxDB.
    I think you should try to catch your insert statements at the application level. If this is not possible, you could try to issue a 'select * from runningcommands' and enable the SAP MaxDB Command Monitor to check which commands are caught by your constraint settings.
    This might seem obvious, but have you verified that your transactions were actually committed, i.e. is 'autocommit' enabled in your SQL client, or did you make sure to manually commit your inserts?
    Thorsten
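    For reference, the two statements mentioned above, as they could be issued from a MaxDB SQL client (table names exactly as given in the reply; the column layout depends on your MaxDB version):
    -- DDL history; there is no equivalent table for DML
    SELECT * FROM SYSDDLHISTORY
    -- statements currently executing, useful while reproducing the failing INSERT
    SELECT * FROM RUNNINGCOMMANDS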

  • Webanalysis : Query Log File

    Hi,
    Please suggest where I can find the path for the report's query log file, i.e. I run a WebAnalysis report and now I want to see the query that it fires against the source.
    I want to see the detail-level log.
    Is there any other setting that I need to make, like an entry in a property file or something?
    Please suggest the way.
    Thanks

    Create two environment variables:
    ADM_TRACE_LEVEL=0
    REDIRECTOR_TRACE_LEVEL=0
    You can also add this line to the WA.prop file:
    LogQueries=true
    Hope it helps
    Regards
    CK

  • Query log file missing

    I am trying to create a query log file for an ASO cube. I have created the database.cfg file in the database directory and restarted the application. Then I run a few queries and stop the application, but the query log file (database.qlg) is not created. My database.cfg file looks like this:
    QUERYLOG [LOB]
    QUERYLOG LOGFILESIZE 2
    QUERYLOG TOTALLOGFILESIZE 1024
    QUERYLOG ON
    Thanks


  • Enable SQL Query logging while generating a report

    I am a newbie to BI Publisher. We are using Oracle BI Publisher 11.1.1.5.0 for report generation.
    I am trying to get the actual SQL query executed when a report is generated.
    It would be great if anyone could share some tips on enabling SQL query logging and how to verify it.
    Thanks,
    Satya

    Thanks for your response.
    It seems the NQQuery.log is for OBIEE, not for BI Publisher. I could see the admin guide refers to the Oracle Fusion Middleware System Administrator's Guide for Oracle Business Intelligence Enterprise Edition.
    I raised an SR on My Oracle Support (MOS) and got a response saying that an option to enable SQL query logging is not available in BIP.
    Best regards,
    Satya

  • Logical sql in log file.

    Can someone please tell me how to see the complete SQL query in the log file? If I run the same query again, the SQL is not produced. I looked in the server log file and also in the Manage Sessions log. It just says all columns from 'Subject Area'; I want to see all the joins and filters as well. Even for repeated queries, how can I see the complete SQL? I set my logging level to 2.

    http://lmgtfy.com/?q=obiee+disable+query+caching
    http://catb.org/esr/faqs/smart-questions.html#homework
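    What the first link is hinting at: a request answered from the BI Server cache does not regenerate physical SQL, so nothing new is written to nqquery.log for repeated queries. A commonly used sketch, assuming you can edit the request's logical SQL prefix (the Advanced tab of the analysis), is to bypass the cache and raise the log level for that one request; LOGLEVEL and DISABLE_CACHE_HIT are standard OBIEE session variables, but verify the exact prefix in your environment:
    SET VARIABLE LOGLEVEL=7, DISABLE_CACHE_HIT=1;
    After running the request with this prefix, the full logical and physical SQL should show up under Manage Sessions and in nqquery.log.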

  • Query log file location?

    Is a log file created when a query has completed execution? If yes, please tell me its location.

    Put a 'set timing on' before executing the query in SQL*Plus, like:
    SQL> set timing on
    SQL> select 1 from dual;
    1
    1
    Elapsed: 00:00:00.00
    To turn tracing on, just do the following:
    SQL> SET AUTOTRACE TRACEONLY
    SQL> SELECT 1 FROM DUAL;
    Elapsed: 00:00:00.01
    Execution Plan
    0 SELECT STATEMENT Optimizer=ALL_ROWS (Cost=2 Card=1)
    1 0 FAST DUAL (Cost=2 Card=1)
    Statistics
    1 recursive calls
    0 db block gets
    0 consistent gets
    0 physical reads
    0 redo size
    419 bytes sent via SQL*Net to client
    508 bytes received via SQL*Net from client
    2 SQL*Net roundtrips to/from client
    0 sorts (memory)
    0 sorts (disk)
    1 rows processed
    SQL>
    Regards
    Edited by: Virendra.k.Yadav on Aug 20, 2010 2:37 AM
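    If what you need is an actual file on the database server rather than on-screen statistics, a hedged sketch using SQL trace (file names are illustrative; on 11g the trace files live under the diagnostic destination rather than user_dump_dest):
    SQL> ALTER SESSION SET TIMED_STATISTICS = TRUE;
    SQL> ALTER SESSION SET SQL_TRACE = TRUE;
    SQL> SELECT 1 FROM dual;
    SQL> ALTER SESSION SET SQL_TRACE = FALSE;
    SQL> SHOW PARAMETER user_dump_dest
    The last command shows the server directory where the .trc file is written; the file can then be formatted with tkprof, e.g. tkprof <tracefile>.trc trace_report.txt.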

  • Repeated Errors in SQL Server log file

    I have hundreds of these errors saying 'Login failed for user 'Reporting'. The user is not associated with a trusted SQL Server connection. [CLIENT: ip address]'
    The IP address is that of the server that SQL Server is installed on.
    Looking in my log file, all looks good until I get to 'Service Broker manager has started'; then I get 'Error: 18452, Severity: 14, State: 1', and these two lines have repeated about every minute for the last 3 days!
    I think I must have just missed a tick box somewhere, but where?
    I have been into one of the databases and input and checked data, both via an application I wrote and via SQL Server Management Studio.
    I am also having trouble connecting to the database using my application; I can only connect if I use a Windows administrator account (this is SQL Server 2005 running on a Windows 2003 Server, with the app on a PC running Windows 2000).

    Hello Graham,
    did you find any solution to that problem? I'm seeing a similar problem. The state of my error message is 5. According to the following source, http://blogs.msdn.com/sql_protocols/archive/2006/02/21/536201.aspx, this means the user is not known. This is correct: neither the SQL Server nor the Windows system has such a user ID.
    The faulty user name is 'Reporting'. What is that user used for? I have set up other SQL Server 2005 servers but was never asked for such a name.
    Greetings,
    Frank

  • Big SQL*LDR log file

    Hi,
    I need to get only a summary log file that shows me record counts by error. For example, if the log file has 5000 errors for datatype, 300 for PK violation, and 2000 for not-null violations, the problem is that the log shows these errors row by row. Can I get a summary like this:
    - 5000 rows rejected due to datatype error
    - 300 rows rejected due to PK violation
    - 2000 rows rejected due to not-null violation
    Can you give me a hand with this?

    Please post details of your OS and database versions. There is a dedicated SQL*Loader forum at Export/Import/SQL Loader & External Tables.
    Have you reviewed the documentation for your database version at http://docs.oracle.com? If so, what have you determined so far?
    HTH
    Srini

  • SQL Server log file

    We are facing one issue; can anyone please help me out?
    The logs are filling up the path below. Can we remove the logs? Are they archive logs?
    Path: H:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA

    If T-log backups are configured, the log automatically gets truncated once the backup is completed. If you don't have backups configured and you don't want to keep the transactions, you can manually truncate it.
    Hi,
    From SQL Server 2008 onwards TRUNCATE_ONLY is removed; it is replaced by
    backup log db_name to disk='Null'
    Also, if your log is growing and you do not need point-in-time recovery, switch the recovery model to simple; it will force a checkpoint and the log will get truncated (if no transaction still requires the log). Avoid using backup log to NULL or truncate only.
    Swapna,
    >> Also if you dont want the transactions to be saved then you can set the recovery model and Simple
    This is incorrect language: saving/committing a transaction does not depend on the recovery model. The recovery model only controls logging and recovery. If a transaction is committed, it will be present in the database no matter what recovery model you use.
