Extract SQL history from 10046 trace files
Hi all,
I need to extract the complete sql history from sql trace files to "debug" a client application.
I know I can read the raw trc file and rebuild the sql history looking for the PARSING / EXEC / FETCH entries.
However, this is a very long and boring manual task: do you know if there is some free tool to automate this task?
thanks
Andrea
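Rebuilding the history by hand means walking the PARSING IN CURSOR / EXEC / FETCH entries, and that bookkeeping is easy to script. Below is a minimal Python sketch, not a polished tool - the function name and dict layout are my own, and it assumes the standard 10046 block format where each statement sits between a PARSING IN CURSOR line and END OF STMT:

```python
import re

def extract_sql_history(trace_text):
    """Collect SQL statements from a raw 10046 trace, in parse order.

    Assumes the standard block layout:
        PARSING IN CURSOR #n len=... dep=... ...
        <statement text, possibly spanning lines>
        END OF STMT
    dep=0 entries are the application's own calls; dep>0 is recursive SQL.
    """
    history = []
    lines = iter(trace_text.splitlines())
    for line in lines:
        m = re.match(r'PARSING IN CURSOR #(\d+) .*\bdep=(\d+)', line)
        if not m:
            continue
        stmt = []
        for body in lines:
            if body.startswith('END OF STMT'):
                break
            stmt.append(body)
        history.append({'cursor': m.group(1),
                        'depth': int(m.group(2)),
                        'sql': '\n'.join(stmt)})
    return history
```

Filtering the result on depth == 0 keeps only the statements the client application itself issued, which separates them from the recursive SQL that can clutter a tkprof report.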
user585511 wrote:
I agree that the 10046 trace captures everything. If I do read the raw trc file I see the DML. The problem is that tkprof's report does not record the DML (maybe it thinks that some DML is recursive SQL and it gets misled... I am not sure), so I am looking for an alternate tool to process 10046 trace files.
Regards
Andrea

Really?
Generate a trace of some dml:
oracle:orcl$
oracle:orcl$ sqlplus /nolog
SQL*Plus: Release 11.2.0.1.0 Production on Thu May 16 08:28:55 2013
Copyright (c) 1982, 2009, Oracle. All rights reserved.
SQL> conn snuffy/snuffy
Connected.
SQL> alter session set tracefile_identifier = "snuffy_session";
Session altered.
SQL> alter session set events '10046 trace name context forever, level 12';
Session altered.
SQL> insert into mytest values (sysdate);
1 row created.
SQL> commit;
Commit complete.
SQL> ALTER SESSION SET EVENTS '10046 trace name context off';
Session altered.
SQL> exit

Run tkprof on the trace:
oracle:orcl$ ls -l $ORACLE_BASE/diag/rdbms/$ORACLE_SID/$ORACLE_SID/trace/*snuffy*.trc
-rw-r----- 1 oracle asmadmin 3038 May 16 08:29 /u01/app/oracle/diag/rdbms/orcl/orcl/trace/orcl_ora_4086_snuffy_session.trc
oracle:orcl$ tkprof /u01/app/oracle/diag/rdbms/orcl/orcl/trace/orcl_ora_4086_snuffy_session.trc snuffy.rpt waits=YES sys=NO explain=system/halftrack
TKPROF: Release 11.2.0.1.0 - Development on Thu May 16 08:31:32 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Look at the report:
oracle:orcl$ cat snuffy.rpt
TKPROF: Release 11.2.0.1.0 - Development on Thu May 16 08:31:32 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Trace file: /u01/app/oracle/diag/rdbms/orcl/orcl/trace/orcl_ora_4086_snuffy_session.trc
Sort options: default
count = number of times OCI procedure was executed
cpu = cpu time in seconds executing
elapsed = elapsed time in seconds executing
disk = number of physical reads of buffers from disk
query = number of buffers gotten for consistent read
current = number of buffers gotten in current mode (usually for update)
rows = number of rows processed by the fetch or execute call
SQL ID: 938dgt554gu98
Plan Hash: 0
insert into mytest <<<<<<<<<<<<<<<< oh my! Here is the insert statement
values
(sysdate)
call count cpu elapsed disk query current rows
Parse 1 0.00 0.00 0 0 0 0
Execute 1 0.00 0.00 0 1 5 1
Fetch 0 0.00 0.00 0 0 0 0
total 2 0.00 0.00 0 1 5 1
Misses in library cache during parse: 0
Optimizer mode: ALL_ROWS
Parsing user id: 86 (SNUFFY)
Rows Row Source Operation
0 LOAD TABLE CONVENTIONAL (cr=1 pr=0 pw=0 time=0 us)
error during execute of EXPLAIN PLAN statement
ORA-00942: table or view does not exist
parse error offset: 83
Elapsed times include waiting on following events:
Event waited on Times Max. Wait Total Waited
---------------------------------------- Waited ---------- ------------
SQL*Net message to client 1 0.00 0.00
SQL*Net message from client 1 3.35 3.35
SQL ID: 23wm3kz7rps5y
Plan Hash: 0
commit
call count cpu elapsed disk query current rows
Parse 1 0.00 0.00 0 0 0 0
Execute 1 0.00 0.00 0 0 1 0
Fetch 0 0.00 0.00 0 0 0 0
total 2 0.00 0.00 0 0 1 0
Misses in library cache during parse: 0
Parsing user id: 86 (SNUFFY)
Elapsed times include waiting on following events:
Event waited on Times Max. Wait Total Waited
---------------------------------------- Waited ---------- ------------
SQL*Net message to client 2 0.00 0.00
SQL*Net message from client 2 4.72 8.50
log file sync 1 0.00 0.00
SQL ID: 0kjg1c2g4gdcr
Plan Hash: 0
ALTER SESSION SET EVENTS '10046 trace name context off'
call count cpu elapsed disk query current rows
Parse 1 0.00 0.00 0 0 0 0
Execute 1 0.00 0.00 0 0 0 0
Fetch 0 0.00 0.00 0 0 0 0
total 2 0.00 0.00 0 0 0 0
Misses in library cache during parse: 0
Parsing user id: 86 (SNUFFY)
OVERALL TOTALS FOR ALL NON-RECURSIVE STATEMENTS
call count cpu elapsed disk query current rows
Parse 3 0.00 0.00 0 0 0 0
Execute 3 0.00 0.00 0 1 6 1
Fetch 0 0.00 0.00 0 0 0 0
total 6 0.00 0.00 0 1 6 1
Misses in library cache during parse: 0
Elapsed times include waiting on following events:
Event waited on Times Max. Wait Total Waited
---------------------------------------- Waited ---------- ------------
SQL*Net message to client 3 0.00 0.00
SQL*Net message from client 3 4.72 11.86
log file sync 1 0.00 0.00
OVERALL TOTALS FOR ALL RECURSIVE STATEMENTS
call count cpu elapsed disk query current rows
Parse 0 0.00 0.00 0 0 0 0
Execute 0 0.00 0.00 0 0 0 0
Fetch 0 0.00 0.00 0 0 0 0
total 0 0.00 0.00 0 0 0 0
Misses in library cache during parse: 0
3 user SQL statements in session.
0 internal SQL statements in session.
3 SQL statements in session.
0 statements EXPLAINed in this session.
Trace file: /u01/app/oracle/diag/rdbms/orcl/orcl/trace/orcl_ora_4086_snuffy_session.trc
Trace file compatibility: 11.1.0.7
Sort options: default
1 session in tracefile.
3 user SQL statements in trace file.
0 internal SQL statements in trace file.
3 SQL statements in trace file.
3 unique SQL statements in trace file.
58 lines in trace file.
8 elapsed seconds in trace file.
oracle:orcl$
Similar Messages
-
SQL*NET waits in trace file
Hi All,
There is a long-running query, and I generated a trace file for this request. In the trace file I found that there are huge waits on SQL*Net message from client.
The below is the trace file output:
Elapsed times include waiting on following events:
Event waited on Times Waited Max. Wait Total Waited
SQL*Net message to client 16 0.00 0.00
SQL*Net more data to client 17 0.00 0.00
db file sequential read 1450 0.02 4.26
SQL*Net message from client 16 1414.20 2702.84
How can I resolve these waits on SQL*Net message from client? I checked the network connection; there are no delays in the network.
Any inputs on this issue will be appreciated.

As Satish indicated, the "SQL*Net message from client" wait is an event which indicates that the database server was waiting for the next request from the client computer, and not an indication that the query needs to be tuned. Manually review the trace file. At one point in the trace file, you will see this wait event with an ela= value which begins with 14142 - please post to this thread that line from the trace file along with the 20 lines before it and the 20 lines after it. You may just have a long wait on this event at the beginning, and another long wait on this event at the end of the query.
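Scanning for the large ela= values described above is easy to automate. Here is a small sketch - the function name and the one-million-microsecond threshold are my own choices, and it assumes the 10g-and-later WAIT line format where ela is reported in microseconds:

```python
import re

def long_client_waits(trace_lines, threshold_us=1_000_000):
    """Return (line number, ela in microseconds) for long client waits.

    Assumes WAIT lines shaped like:
        WAIT #3: nam='SQL*Net message from client' ela= 14142042 ...
    """
    pat = re.compile(r"nam='SQL\*Net message from client' ela=\s*(\d+)")
    hits = []
    for n, line in enumerate(trace_lines, start=1):
        m = pat.search(line)
        if m and int(m.group(1)) >= threshold_us:
            hits.append((n, int(m.group(1))))
    return hits
```

The returned line numbers tell you exactly where to pull the surrounding 20 lines of context from the raw file.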
Charles Hooper
IT Manager/Oracle DBA
K&M Machine-Fabricating, Inc. -
Extracting SQL Queries from Crystal Reports
I am trying to find a way or a utility to be able extract SQL queries from Crystal reports into a text file for documentation purposes. These queries are not in the repository, they were entered into each of the reports when the reports were being built and I can't find a way to extract them. Any ideas/suggestions?
Hello,
CR doesn't have the ability, and I don't recall if this has ever been asked previously. It's a great suggestion for the Idea Place tab in the upper right corner of this page.
If you find a developer it's quite simple to get:
// log onto the server and then get the SQL.
rptClientDoc.DatabaseController.LogonEx("van-w-13-dwilli", "xtreme", "sa", "pw");
GroupPath gp = new GroupPath();
string tmp = String.Empty;
rptClientDoc.RowsetController.GetSQLStatement(gp, out tmp);
// show the SQL but easy enough to save the SQL text to a file.
MessageBox.Show(tmp, "Data Source Set and SQL Statement", MessageBoxButtons.OK, MessageBoxIcon.Information);
Of course you need to open the report first, lots of samples on how to...
Thank you
Don -
How can I use Automator to extract specific Data from a text file?
I have several hundred text files that contain a bunch of information. I only need six values from each file and ideally I need them as columns in an excel file.
How can I use Automator to extract specific Data from the text files and either create a new text file or excel file with the info? I have looked all over but can't find a solution. If anyone could please help I would be eternally grateful!!! If there is another, better solution than automator, please let me know!
Example of File Contents:
Link Time =
DD/MMM/YYYY
Random
Text
161 179
bytes of CODE memory (+ 68 range fill )
16 789
bytes of DATA memory (+ 59 absolute )
1 875
bytes of XDATA memory (+ 1 855 absolute )
90 783
bytes of FARCODE memory
What I would like to have as a final file:
EXCEL COLUMN1
Column 2
Column3
Column4
Column5
Column6
MM/DD/YYYY
filename1
161179
16789
1875
90783
MM/DD/YYYY
filename2
xxxxxx
xxxxx
xxxx
xxxxx
MM/DD/YYYY
filename3
xxxxxx
xxxxx
xxxx
xxxxx
Is this possible? I can't imagine having to go through each and every file one by one. Please help!!!

Hello
You may try the following AppleScript script. It will ask you to choose a root folder where to start searching for *.map files and then create a CSV file named "out.csv" on desktop which you may import to Excel.
set f to (choose folder with prompt "Choose the root folder to start searching")'s POSIX path
if f ends with "/" then set f to f's text 1 thru -2
do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & f's quoted form & " > ~/Desktop/out.csv
use strict;
use open IN => ':crlf';
chdir $ARGV[0] or die qq($!);
local $/ = qq(\\0);
my @ff = map {chomp; $_} qx(find . -type f -iname '*.map' -print0);
local $/ = qq(\\n);
# CSV spec
# - record separator is CRLF
# - field separator is comma
# - every field is quoted
# - text encoding is UTF-8
local $\\ = qq(\\015\\012); # CRLF
local $, = qq(,); # COMMA
# print column header row
my @dd = ('column 1', 'column 2', 'column 3', 'column 4', 'column 5', 'column 6');
print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
# print data row per each file
while (@ff) {
my $f = shift @ff; # file path
if ( ! open(IN, '<', $f) ) {
warn qq(Failed to open $f: $!);
next;
}
$f =~ s%^.*/%%og; # file name
@dd = ('', $f, '', '', '', '');
while (<IN>) {
chomp;
$dd[0] = \"$2/$1/$3\" if m%Link Time\\s+=\\s+([0-9]{2})/([0-9]{2})/([0-9]{4})%o;
($dd[2] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of CODE\\s/o;
($dd[3] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of DATA\\s/o;
($dd[4] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of XDATA\\s/o;
($dd[5] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of FARCODE\\s/o;
last unless grep { /^$/ } @dd;
}
close IN;
print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
}
EOF
Hope this may help,
H -
How to load SQL scripts from a text file.
Hi, I tried several times to load a text file / SQL script with 10 different tables and data, but 10g Express doesn't allow me to do that. Can anyone direct me or point out what I should do, or do I need to adopt any special method to get this done? I am sure there must be something that lets you upload SQL scripts from a text file (in the SQL command editor!). Thanks
Hi,
see my other answer here:
SQL command editor doesn't take more than 1 insert command
This seems to be a duplicate question, right? Or am I missing something?
Regards,
~Dietmar. -
Which Version of Adobe do I need to be able to "extract" a page from a existing file and save/download to another file?
Acrobat Pro or Standard.
-
SQL History from Entreprise Manager
Hi experts, I need your help. Actually I need the SQL text that was executed by a DB user. I want to know how to get SQL text history from Enterprise Manager. My DB version is 10gR2 (10.2.0.4).
Regards,

Duplicate thread!
SQL History from Enterprise Manager -
Explain Plan from TKPROF trace file.
Hello,
My procedure is taking a long time, so I have turned tracing on in the procedure to find the explain plan of a particular query.
Tracing is turned on using the statements below:
EXECUTE IMMEDIATE 'ALTER SESSION SET SQL_TRACE = TRUE';
EXECUTE IMMEDIATE 'ALTER SESSION SET TIMED_STATISTICS = TRUE';

Now, to get the explain plan from TKPROF I have used the statement below, but for some queries I found an explain plan and in some cases I cannot find one.
tkprof mf_ora_23773.trc mf_ora_23773.txt explain=abc/abc
Can you please help me analyze where I am going wrong?
Thanks.

First of all, you should best avoid using the explain= clause on the tkprof command line.
This will run explain plan on the statement in the trace file, and that explain plan can even be wrong, as there is no information on datatypes in the trace file.
The real explain plan data is flushed to the trace file, when the program issues commit or rollback. Oracle always issues an implicit commit when the program disconnects, so when you run the program to completion you should have explain plan output in your trace files.
You won't get explain plan output if you don't have access to the objects in the SQL statement. Also recursive SQL won't produce explain plan result.
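Since the real plan data lands in the trace file as STAT lines once the cursor closes, you can read it back without the explain= clause at all. A minimal Python sketch (the function name is my own; it assumes the 10g/11g STAT line layout):

```python
import re

def row_source_plan(trace_text, cursor):
    """Return (id, parent id, operation) tuples from one cursor's STAT lines.

    Assumes STAT lines shaped like:
        STAT #26 id=1 cnt=0 pid=0 pos=1 obj=0 op='LOAD TABLE CONVENTIONAL  (cr=1 ...)'
    """
    pat = re.compile(
        r"STAT #" + re.escape(str(cursor)) +
        r" id=(\d+) cnt=\d+ pid=(\d+) .*?op='([^']*)'")
    plan = []
    for line in trace_text.splitlines():
        m = pat.match(line)
        if m:
            plan.append((int(m.group(1)), int(m.group(2)), m.group(3)))
    return plan
```

Because these lines come from the actual execution, the plan they describe cannot disagree with what the database really did, unlike a plan produced by explain=.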
One would need to see part of your trace file to verify your assertion the explain clause doesn't always work.
Sybrand Bakker
Senior Oracle DBA -
Need help understanding Explain Plan from 10046 trace
Below is a query and Explain Plan from a 10046 trace shown with trcanlzr.sql.
In the explain plan I don't understand what's happening at line IDs 10 and 11. Specifically, is the result at line 11 rowids from lines 12 & 14? And then what? Are those rowids somehow used in line ID 10?
SELECT cp.cred_process_id, cp.provider_id,
brdg_credentialing.get_appl_specialist(cp.cred_process_id,'R') specialist_name,
brdg_cred_report_pkg.provider_name(cp.cred_process_id) provider_name,
ctc_apptype.description appl_type_desc,
TRUNC (brdg_credentialing.get_appl_received_dt(cp.cred_process_id)) init_received_dt,
brdg_code_util.code_descr(brdg_credentialing.get_appl_status_cd_ctc_id(cp.cred_process_id)) appl_status_desc,
brdg_credentialing.get_appl_prac_specialties(cp.cred_process_id,'Y') primary_specialty,
cwh.city practice_city,
UPPER (cwh.state) practice_state,
TRUNC (ch.event_dt) specialist_assign_dt,
DECODE (ctc_apptype.code,'INITPPO', TRUNC (brdg_credentialing.get_appl_received_dt(cp.cred_process_id)),
'REAPP', TRUNC (brdg_credentialing.get_appl_received_dt(cp.cred_process_id)),
'SPECCRED', TRUNC (brdg_credentialing.get_appl_received_dt(cp.cred_process_id)),
'TRANS', TRUNC (brdg_credentialing.get_appl_received_dt(cp.cred_process_id)),
'RECPPO', p.next_recred_dt,
'RECAPP', p.next_recred_dt, NULL) sort_date,
p.next_recred_dt
FROM brdg_cred_app_open_vw cp,
brdg_cat_type_codes ctc_apptype,
brdg_cred_work_history cwh,
brdg_cred_history ch,
brdg_providers p
WHERE cp.type_cd_ctc_id = ctc_apptype.cat_type_code_id
AND ctc_apptype.category_cd = 'CRED'
AND ctc_apptype.type_cd = 'APPTYPE'
AND cp.cred_process_id = cwh.cred_process_id (+)
AND cwh.primary_practice_flag (+) = 'Y'
AND cp.cred_process_id = ch.cred_process_id
AND ch.cred_history_id = (SELECT MAX(cred_history_id)
FROM brdg_cred_history
WHERE cred_process_id = cp.cred_process_id
AND event_cd_ctc_id = brdg_credentialing.get_event_ctc_id ('SEVENT','SPESTCHG'))
AND cp.provider_id = p.provider_id (+)
and brdg_credentialing.get_appl_specialist_id(cp.cred_process_id) = 5
ORDER BY 3 ASC, 3, 5, 12, 6
Explain Plan Operation
ID PID Card Rows Cost SearchCols / Indexed Cols Predicates
0: 1 36 SELECT STATEMENT
1: 0 1 139 36 SORT ORDER BY
2: 1 139 . FILTER [+]
3: 2 1 311 11 .. NESTED LOOPS OUTER
4: 3 1 311 10 ... NESTED LOOPS OUTER
5: 4 1 311 9 .... NESTED LOOPS
6: 5 1 311 8 ....+ NESTED LOOPS
7: 6 4 16 1 ....+. TABLE ACCESS BY INDEX ROWID CAT_TYPE_CODES
8: 7 4 16 1 ....+.. INDEX RANGE SCAN CAT_TYPE_CODE_UK 2/3 [+] [+]
9: 6 1 311 2 ....+. TABLE ACCESS BY INDEX ROWID CRED_PROCESSES [+]
10: 9 183 61927 1 ....+.. INDEX RANGE SCAN CDPR_CTCD_FK1 1/1 [+] [+]
11: 10 1 3 2 ....+... NESTED LOOPS
12: 11 1 16 1 ....+.... TABLE ACCESS BY INDEX ROWID CAT_TYPE_CODES
13: 12 1 16 1 ....+....+ INDEX UNIQUE SCAN CTCD_PK 1/1 [+] [+]
14: 11 1 3 1 ....+.... INDEX UNIQUE SCAN CAT_TYPE_CODE_UK 3/3 [+] [+]
15: 5 1 11 1 ....+ TABLE ACCESS BY INDEX ROWID CRED_HISTORY [+]
16: 15 1 311 1 ....+. INDEX UNIQUE SCAN CDHT_PK 1/1 [+] [+]
17: 16 1 311 ....+.. SORT AGGREGATE
18: 17 1 526 2 ....+... TABLE ACCESS BY INDEX ROWID CRED_HISTORY [+]
19: 18 23 9950 1 ....+.... INDEX RANGE SCAN CDHT_CDPR_FK 1/1 [+] [+]
20: 4 1 219 1 .... TABLE ACCESS BY INDEX ROWID PROVIDERS
21: 20 1 219 1 ....+ INDEX UNIQUE SCAN PROV_PK 1/1 [+] [+]
22: 3 1 311 1 ... TABLE ACCESS BY INDEX ROWID CRED_WORK_HISTORY [+]
23: 22 3 1057 1 .... INDEX RANGE SCAN CDWH_CDPR_FK 1/1 [+] [+]
24: 2 172 .. INLIST ITERATOR
25: 24 1 172 1 ... INDEX UNIQUE SCAN CAT_TYPE_CODE_UK 3/3 [+] [+]
26: 2 1 0 2 .. TABLE ACCESS BY INDEX ROWID CRED_HISTORY [+]
27: 26 23 2004 1 ... INDEX RANGE SCAN CDHT_CDPR_FK 1/1 [+] [+]
(1) X/Y: Where X is the number of searched columns from index, which has a total of Y columns.
(2) Actual rows returned by operation (average if there were more than 1 execution).
2 - filter( NOT EXISTS (SELECT 0 FROM "PPO"."CAT_TYPE_CODES" "BRDG_CAT_TYPE_CODES" WHERE
"CODE"="BRDG_CODE_UTIL"."ID_CODE"("BRDG_CREDENTIALING"."GET_APPL_STATUS_CD_CTC_ID"(:B1)) AND
("TYPE_CD"='APPROVAL' OR "TYPE_CD"='DENIED' OR "TYPE_CD"='INACTIVE' OR "TYPE_CD"='TERMED') AND
"CATEGORY_CD"='APPSTAT') AND NOT EXISTS (SELECT 0 FROM "PPO"."CRED_HISTORY" "BRDG_CRED_HISTORY"
WHERE "CRED_PROCESS_ID"=:B2 AND "EVENT_CD_CTC_ID"="BRDG_CODE_UTIL"."GET_ID"('CRED','SEVENT','MSODC
8 - access("CTC_APPTYPE"."CATEGORY_CD"='CRED' AND "CTC_APPTYPE"."TYPE_CD"='APPTYPE')
9 - filter("BRDG_CREDENTIALING"."GET_APPL_SPECIALIST_ID"("CP"."CRED_PROCESS_ID")=5 AND
("CP"."INS_DT">=TO_DATE(' 2007-12-20 17:00:00', 'syyyy-mm-dd hh24:mi:ss') OR
"CP"."TYPE_CD_CTC_ID"<>"BRDG_CODE_UTIL"."GET_ID"('CRED','APPTYPE','RECPPO')))
10 - access("CP"."TYPE_CD_CTC_ID"="CTC_APPTYPE"."CAT_TYPE_CODE_ID")
filter( NOT EXISTS (SELECT 0 FROM "PPO"."CAT_TYPE_CODES"
"CTC_APPTYPE","PPO"."CAT_TYPE_CODES" "CTC_TYPE" WHERE "CTC_TYPE"."CAT_TYPE_CODE_ID"=:B1 AND
"CTC_TYPE"."CODE"="CTC_APPTYPE"."CODE" AND "CTC_APPTYPE"."TYPE_CD"='APPSENT' AND
"CTC_APPTYPE"."CATEGORY_CD"='APPTYPE'))
13 - access("CTC_TYPE"."CAT_TYPE_CODE_ID"=:B1)
14 - access("CTC_APPTYPE"."CATEGORY_CD"='APPTYPE' AND "CTC_APPTYPE"."TYPE_CD"='APPSENT' AND
"CTC_TYPE"."CODE"="CTC_APPTYPE"."CODE")
15 - filter("CP"."CRED_PROCESS_ID"="CH"."CRED_PROCESS_ID")
16 - access("CH"."CRED_HISTORY_ID"= (SELECT MAX("CRED_HISTORY_ID") FROM "PPO"."CRED_HISTORY"
"BRDG_CRED_HISTORY" WHERE "CRED_PROCESS_ID"=:B1 AND
"EVENT_CD_CTC_ID"="BRDG_CREDENTIALING"."GET_EVENT_CTC_ID"('SEVENT','SPESTCHG')))
18 - filter("EVENT_CD_CTC_ID"="BRDG_CREDENTIALING"."GET_EVENT_CTC_ID"('SEVENT','SPESTCHG'))
19 - access("CRED_PROCESS_ID"=:B1)
21 - access("CP"."PROVIDER_ID"="P"."PROVIDER_ID"(+))
22 - filter("CWH"."PRIMARY_PRACTICE_FLAG"(+)='Y')
23 - access("CP"."CRED_PROCESS_ID"="CWH"."CRED_PROCESS_ID"(+))
25 - access("CATEGORY_CD"='APPSTAT' AND ("TYPE_CD"='APPROVAL' OR "TYPE_CD"='DENIED' OR
"TYPE_CD"='INACTIVE' OR "TYPE_CD"='TERMED') AND "CODE"="BRDG_CODE_UTIL"."ID_CODE"("BRDG_CREDENTIAL
ING"."GET_APPL_STATUS_CD_CTC_ID"(:B1)))
26 - filter("EVENT_CD_CTC_ID"="BRDG_CODE_UTIL"."GET_ID"('CRED','SEVENT','MSODC'))
27 - access("CRED_PROCESS_ID"=:B1)Welcome to the forums!
user11987210 wrote:
In the explain plan I don't understand what's happening at line IDs 10 and 11. Specifically, is the result at line 11 rowids from lines 12 & 14? And then what? Are those rowids somehow used in line ID 10?
9: 6 1 311 2 ....+. TABLE ACCESS BY INDEX ROWID CRED_PROCESSES [+]
10: 9 183 61927 1 ....+.. INDEX RANGE SCAN CDPR_CTCD_FK1 1/1 [+] [+]
11: 10 1 3 2 ....+... NESTED LOOPS
12: 11 1 16 1 ....+.... TABLE ACCESS BY INDEX ROWID CAT_TYPE_CODES
13: 12 1 16 1 ....+....+ INDEX UNIQUE SCAN CTCD_PK 1/1 [+] [+]
14: 11 1 3 1 ....+.... INDEX UNIQUE SCAN CAT_TYPE_CODE_UK 3/3 [+] [+]

The NESTED LOOPS operation (ID #11) has two children: ID #12, sometimes called the driving source, and ID #14, the inner loop. ID #14 is executed once for each row returned by ID #12. The results of ID #11 are then fed to ID #10, which performs an INDEX RANGE SCAN.
Hope this helps! -
Max wait from a trace file...what does it mean
Hi,
following is a part of an extended sql trace file :
Elapsed times include waiting on following events:
Event waited on Times Max. Wait Total Waited
---------------------------------------- Waited ---------- ------------
library cache lock 5 0.00 0.00
row cache lock 7 0.00 0.00
library cache pin 3 0.00 0.00
rdbms ipc reply 2 0.00 0.00
SQL*Net message to client 8185 0.00 0.01
SQL*Net message from client 8185 1671.98 1688.26
Here, what does the Max. Wait column mean? This trace file is from a proc that was run remotely against the database. It has lots of dbms_output statements and creates an output file (on the remote PC from which it runs); the proc calls several SQL scripts and does a lot of queries and DML.
Now, when the same script is run locally from the server, those SQL*Net waits are not there. And the interesting fact is: when run locally it takes 4 minutes,
and remotely it takes 1 hour. How can we interpret this, and what does the max wait indicate?
Thanks.

Can you explain more on this? The same proc, when run locally, does not have this wait, and when run from a remote IP it does - so does that mean this is due to a network issue?
Should we, for example, remove the dbms_output statements and try? What should we do to be able to run the script from a remote IP in 6-7 minutes? When the script is run locally from the server it takes 4 minutes, and from a remote IP it takes one hour. The script also has a spool statement, as it has to log its output. Could that spooling be causing this?
Thanks again,
Thanks again,
Edited by: orausern on Jan 18, 2010 7:09 AM -
Getting timestamps for SQL session without the trace file enabled
Hi, I have a clarification. Is there a method or way by which I can get the timestamps of a SQL session without the trace file enabled? If so, please give me the details.
Thanks in advance.

Hi,
I don't quite understand what you want.
SQL> set timing on
SQL> select * from dual;
D
X
Elapsed: 00:00:01.07
SQL>

Is this it?
Nicolas. -
Extracting SQL statement from a Webi document's data provider using SDK.
Hi all,
Is it possible to extract the SQL statement from an existing Webi document's data provider using the BO SDK? I've searched through the class library but haven't found any information on this yet. If you have done it, could you provide some guidance? Many thanks.

I found the following Java code that might be of some help to you. I realize you are using .NET, but this might push you down the right path.
The trick here is to use the Report Engine SDK to get the DataProvider of the DocumentInstance. Then, look at the SQLDataProvider to get your SQLContainer.
My apologies for the poor formatting. This didn't copy and paste over to the forums very well. I've cleaned up as much as I could.
<%@ page import="com.crystaldecisions.sdk.framework.*" %>
<%@ page import="com.crystaldecisions.sdk.exception.SDKException" %>
<%@ page import="com.crystaldecisions.sdk.occa.infostore.*" %>
<%@ page import="com.businessobjects.rebean.wi.*" %>
<%
boolean loginSuccessful = false;
IEnterpriseSession oEnterpriseSession = null;
String username = "username";
String password = "password";
String cmsname = "cms_name";
String authenticationType = "secEnterprise";
try {
//Log in.
oEnterpriseSession = CrystalEnterprise.getSessionMgr().logon( username, password, cmsname, authenticationType);
if (oEnterpriseSession == null) {
out.print("<FONT COLOR=RED><B>Unable to login.</B></FONT>");
} else {
loginSuccessful = true;
}
} catch (SDKException sdkEx) {
out.print("<FONT COLOR=RED><B>ERROR ENCOUNTERED</B><BR>" + sdkEx + "</FONT>");
}
if (loginSuccessful) {
IInfoObject oInfoObject = null;
String docname = "WebI document name";
//Grab the InfoStore from the httpsession
IInfoStore oInfoStore = (IInfoStore) oEnterpriseSession.getService("", "InfoStore");
//Query for the report object in the CMS. See the Developer Reference guide for more information on the query language.
String query = "SELECT TOP 1 * " + "FROM CI_INFOOBJECTS " + "WHERE SI_INSTANCE = 0 And SI_Kind = 'Webi' " + "AND SI_NAME='" + docname + "'";
IInfoObjects oInfoObjects = (IInfoObjects) oInfoStore.query(query);
if (oInfoObjects.size() > 0) {
//Retrieve the latest instance of the report
oInfoObject = (IInfoObject) oInfoObjects.get(0);
}
// Initialize the Report Engine
ReportEngines oReportEngines = (ReportEngines) oEnterpriseSession.getService("ReportEngines");
ReportEngine oReportEngine = (ReportEngine) oReportEngines.getService(ReportEngines.ReportEngineType.WI_REPORT_ENGINE);
// Opening the document
DocumentInstance oDocumentInstance = oReportEngine.openDocument(oInfoObject.getID());
DataProvider oDataProvider = null;
SQLDataProvider oSQLDataProvider = null;
SQLContainer oSQLContainer_root = null;
SQLNode oSQLNode = null;
SQLSelectStatement oSQLSelectStatement = null;
String sqlStatement = null;
out.print("<TABLE BORDER=1>");
for (int i=0; i<oDocumentInstance.getDataProviders().getCount(); i++) {
oDataProvider = oDocumentInstance.getDataProviders().getItem(i);
out.print("<TR><TD COLSPAN=2 BGCOLOR=KHAKI>Data Provider Name: " + oDataProvider.getName() + "</TD></TR>");
if (oDataProvider instanceof SQLDataProvider) {
oSQLDataProvider = (SQLDataProvider) oDataProvider;
oSQLContainer_root = oSQLDataProvider.getSQLContainer();
if (oSQLContainer_root != null) {
for (int j=0; j<oSQLContainer_root.getChildCount(); j++) {
oSQLNode = (SQLNode) oSQLContainer_root.getChildAt(j);
oSQLSelectStatement = (SQLSelectStatement) oSQLNode;
sqlStatement = oSQLSelectStatement.getSQL();
out.print("<TR><TD>" + (j+1) + "</TD><TD>" + sqlStatement + "</TD></TR>");
}
}
} else {
out.print("<TR><TD COLSPAN=2>Data Provider is not a SQLDataProvider. SQL Statement can not be retrieved.</TD></TR>");
}
}
out.print("</TABLE>");
oDocumentInstance.closeDocument();
}
oEnterpriseSession.logoff();
%> -
Extracting specific data from multiple text files to single CSV
Hello,
Unfortunately my background is not scripting so I am struggling to piece together a powershell script to achieve the below. Hoping an experienced powershell scripter can provide the answer. Thanks in advance.
I have a folder containing approx. 2000 label type files that I need to extract certain information from to index a product catalog. Steps to be performed within the script as I see are:
1. Search folder for *.job file types
2. Search the files for certain criteria and where matched return into single CSV file
3. End result should be a single CSV with column headings:
a) DESCRIPTION
b) MODEL
c) BARCODE

Try:
# Script to extract data from .job files and report it in CSV
# Sam Boutros - 8/24/2014
# http://superwidgets.wordpress.com/category/powershell/
$CSV = ".\myfile.csv" # Change this filename\path as needed
$Folders = "d:\sandbox" # You can add multiple search folders as "c:\folder1","\\server\share\folder2"
# End Data entry section
if (-not (Test-Path -Path $CSV)) {
    Write-Output """Description"",""Model"",""Barcode""" | Out-File -FilePath $CSV -Encoding ascii
}
$Files = Get-ChildItem -Path $Folders -Include *.job -Force -Recurse
foreach ($File in $Files) {
    $FileContent = Get-Content -Path $File
    $Keyword = "viewkind4"
    if ($FileContent -match $Keyword) {
        for ($i=0; $i -lt $FileContent.Count; $i++) {
            if ($FileContent[$i] -match $Keyword) {
                $Description = $FileContent[$i].Split("\")[$FileContent[$i].Split("\").Count-1]
            }
        }
    } else {
        Write-Host "Keyword $Keyword not found in file $File" -ForegroundColor Yellow
    }
    $Keyword = "Code:"
    if ($FileContent -match $Keyword) {
        for ($i=0; $i -lt $FileContent.Count; $i++) {
            if ($FileContent[$i] -match $Keyword) {
                $Parts = $FileContent[$i].Split(" ")
                for ($j=0; $j -lt $Parts.Count; $j++) {
                    if ($Parts[$j] -match $Keyword) {
                        $Model = $Parts[$j+1].Trim()
                        $Model = $Model.Split("\")[$Model.Split("\").Count-1]
                    }
                }
            }
        }
    } else {
        Write-Host "Keyword $Keyword not found in file $File" -ForegroundColor Yellow
    }
    $Keyword = "9313"
    if ($FileContent -match $Keyword) {
        for ($i=0; $i -lt $FileContent.Count; $i++) {
            if ($FileContent[$i] -match "9313") {
                $Index = $FileContent[$i].IndexOf("9313")
                $Barcode = $null
                for ($j=0; $j -lt 12; $j++) {
                    $Barcode += $FileContent[$i][($Index+$j)]
                }
            }
        }
    } else {
        Write-Host "Keyword $Keyword not found in file $File" -ForegroundColor Yellow
    }
    Write-Output "File: '$File', Description: '$Description', Model: '$Model', Barcode: '$Barcode'"
    Write-Output """$Description"",""$Model"",""$Barcode""" | Out-File -FilePath $CSV -Append -Encoding ascii
}
Sam Boutros, Senior Consultant, Software Logic, KOP, PA http://superwidgets.wordpress.com (Please take a moment to Vote as Helpful and/or Mark as Answer, where applicable) -
Memory Dump for Bind Variable included in 10046 trace file
A curious thing I've seen today. While looking through an extended Oracle Trace file, I see the following:
=====================
PARSING IN CURSOR #26 len=88 dep=0 uid=28 oct=6 lid=28 tim=2667421262 hv=3259943383 ad='4bbb4ad8'
UPDATE V_QRTZ_TRIGGERS SET JOB_DATA = :1 WHERE TRIGGER_NAME = :2 AND TRIGGER_GROUP = :3
END OF STMT
PARSE #26:c=0,e=54,p=0,cr=0,cu=0,mis=0,r=0,dep=0,og=1,tim=2667421260
BINDS #26:
bind 0: dty=113 mxl=3876(3876) mal=00 scl=00 pre=00 oacflg=03 oacfl2=0 size=3876 offset=0
bfp=0cd99aa4 bln=3876 avl=86 flg=05
value=
Dump of memory from 0x0CD99AA4 to 0x0CD99AFA
*CD99AA0 01005400 00002C01 00000100 [.T...,......]*
*CD99AB0 00000100 EB23EF03 581D0000 571D0000 [......#....X...W]*
*CD99AC0 0F000F00 00000000 0F005920 0E14E12F [........ Y../...]*
*CD99AD0 CDE21ADA 00000000 737E06D9 0400FB09 [..........~s....]*
*CD99AE0 07000F00 800387E6 A17B3F20 0000000E [........ ?{.....]*
*CD99AF0 571D0000 EE56CF00 00001500 [...W..V.....]*
bind 1: dty=1 mxl=128(45) mal=00 scl=00 pre=00 oacflg=03 oacfl2=10 size=256 offset=0
bfp=0cd99984 bln=128 avl=15 flg=05
value="EC-MHM Retrieve"
bind 2: dty=1 mxl=128(96) mal=00 scl=00 pre=00 oacflg=03 oacfl2=10 size=0 offset=128
bfp=0cd99a04 bln=128 avl=32 flg=01
value="2BBDE87AF15D4B5E867AB6482D7D58C8"
BINDS #9:
bind 0: dty=1 mxl=32(18) mal=00 scl=00 pre=00 oacflg=03 oacfl2=1 size=192 offset=0
bfp=0c2de90c bln=32 avl=18 flg=05
value="EC_SCHEDULE_PIN_TO"
bind 1: dty=1 mxl=128(15) mal=00 scl=00 pre=00 oacflg=13 oacfl2=1 size=0 offset=32
bfp=0c2de92c bln=128 avl=15 flg=01
value="EC-MHM Retrieve"
bind 2: dty=1 mxl=32(32) mal=00 scl=00 pre=00 oacflg=13 oacfl2=1 size=0 offset=160
bfp=0c2de9ac bln=32 avl=32 flg=01
value="2BBDE87AF15D4B5E867AB6482D7D58C8"
EXEC #9:c=15625,e=1060,p=0,cr=0,cu=0,mis=0,r=0,dep=1,og=1,tim=2667424863
FETCH #9:c=0,e=95,p=0,cr=4,cu=0,mis=0,r=1,dep=1,og=1,tim=2667425091
FETCH #9:c=0,e=1,p=0,cr=0,cu=0,mis=0,r=0,dep=1,og=0,tim=2667425236
Is this normal and is it caused by size of Bind Variable?
Kind regards,
TRONd

It's normal, and it happens due to the bind variable datatype. In your case this is a BLOB (there are a couple of other datatypes - timestamp among others) which will be represented that way in the trace file.
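For quickly decoding the dty= codes in a BINDS section, a small lookup table helps. The mappings below are assembled from widely published trace-reading notes rather than from this thread, so treat the list as a convenience cheat sheet, not an authoritative reference:

```python
# Common internal datatype (dty) codes seen in BINDS sections of a
# 10046 trace. Collected from widely published notes - verify against
# your release before relying on them.
DTY_CODES = {
    1: 'VARCHAR2/NVARCHAR2',
    2: 'NUMBER',
    12: 'DATE',
    23: 'RAW',
    96: 'CHAR/NCHAR',
    112: 'CLOB',
    113: 'BLOB',
    180: 'TIMESTAMP',
}

def describe_bind(dty):
    """Translate a dty code to a readable datatype name."""
    return DTY_CODES.get(dty, 'unknown (dty=%d)' % dty)
```

So the dumped bind above, with dty=113, is the JOB_DATA BLOB, which is why its value appears as a raw memory dump instead of text.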
Best regards
Maxim -
SQL Developer generates strange trace files on server
Hello out there,
I observed the generation of some strange trace files on the database server (Oracle 11.0.2.0.2 64bit on Win 2008R2).
Whenever I start SQL Developer (3.2.20.09.87 64bit with JDK 1.7.0_17 64bit on Win7 64bit) for each connection I defined one trace file like this is generated:
Trace file C:\ORACLE\diag\rdbms\ora\ora\trace\ora_ora_8500.trc
Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
Windows NT Version V6.1 Service Pack 1
CPU : 2 - type 8664, 2 Physical Cores
Process Affinity : 0x0x0000000000000000
Memory (Avail/Total): Ph:990M/3959M, Ph+PgF:3743M/7918M
Instance name: ora
Redo thread mounted by this instance: 1
Oracle process number: 23
Windows thread id: 8500, image: ORACLE.EXE (SHAD)
*** 2013-03-06 08:04:13.842
*** CLIENT ID:() 2013-03-06 08:04:13.842
*** SERVICE NAME:() 2013-03-06 08:04:13.842
*** MODULE NAME:() 2013-03-06 08:04:13.842
*** ACTION NAME:() 2013-03-06 08:04:13.842
Breaking the connection before proto/dty negotiation, error raised 3113

I enabled the listener log to find out the origin of this, and it contains lines like the following:
06-MRZ-2013 08:04:13 * (CONNECT_DATA=(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))(SERVICE_NAME=ora.vu)(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))) * (ADDRESS=(PROTOCOL=tcp)(HOST=192.168.36.143)(PORT=49320)) * establish * ora.vu * 0
06-MRZ-2013 08:04:13 * (CONNECT_DATA=(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))(SERVICE_NAME=ora.vu)(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))) * (ADDRESS=(PROTOCOL=tcp)(HOST=192.168.36.143)(PORT=49322)) * establish * ora.vu * 0
06-MRZ-2013 08:04:13 * (CONNECT_DATA=(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))(SERVICE_NAME=ora.vu)(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))) * (ADDRESS=(PROTOCOL=tcp)(HOST=192.168.36.143)(PORT=49323)) * establish * ora.vu * 0
06-MRZ-2013 08:04:13 * (CONNECT_DATA=(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))(SERVICE_NAME=ora.vu)(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))) * (ADDRESS=(PROTOCOL=tcp)(HOST=192.168.36.143)(PORT=49325)) * establish * ora.vu * 0
06-MRZ-2013 08:04:14 * (CONNECT_DATA=(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))(SERVICE_NAME=ora.vu)(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))) * (ADDRESS=(PROTOCOL=tcp)(HOST=192.168.36.143)(PORT=49329)) * establish * ora.vu * 0
06-MRZ-2013 08:04:14 * (CONNECT_DATA=(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))(SERVICE_NAME=ora.vu)(CID=(PROGRAM=null)(HOST=__jdbc__)(USER=null))) * (ADDRESS=(PROTOCOL=tcp)(HOST=192.168.36.143)(PORT=49331)) * establish * ora.vu * 0

The IP address is mine, and I have exactly 6 connections defined for that server. On other servers, similar trace files are generated, one for each connection in my SQL Developer.
This also occurred with JDK 1.6 so I don't think it's a Java issue.
Besides the generation of the trace files there seem to be no other problems.
Any ideas?

Hi,
I think Srini is probably correct. The noted bug applies to 11.2.0.1 and up, is fixed in 12c, and included in an 11.2.0.3 patch. However the version of SQL Developer also affects the creation of trace files on product startup (prior to any user initiated db connect attempts).
For example,
A. SQL Developer 3.1.07.42 - no such trace files created.
B. SQL Developer 3.2.20.09.87 - such trace files created for 11.2.0.1 connections, but not 10g XE or 12c connections.
So I presume an OCIServerAttach call got added in 3.2.2, not sure in support of which feature, but the bug will only impact users of 11.2.0.1, 11.2.0.2, and unpatched 11.2.0.3 DB releases.
Regards,
Gary
SQL Developer Team