Table reorganization takes a long time
Hi Support
I ran a reorganization of a 128 GB table; the run took 15 hours.
The reorganization was run with BRTOOLS; the options used were:
- brtools
- (3)
- (1)
- BRSPACE options for reorganization of tables
1 - BRSPACE profile (profile) ...... [initRQ1.sap]
2 - Database user/password (user) .. [/]
3 ~ Reorganization action (action) . []
4 ~ Tablespace names (tablespace) .. []
5 ~ Table owner (owner) ............ []
6 ~ Table names (table) ............ [GLPCA]
7 - Confirmation mode (confirm) .... [yes]
8 - Extended output (output) ....... [no]
9 - Scrolling line count (scroll) .. [20]
10 - Message language (language) .... [E]
11 - BRSPACE command line (command) . [-p initRQ1.sap -s 20 -l E -f
tbreorg -t "GLPCA"]
Standard keys: c - cont, b - back, s - stop, r - refr, h - help
BR0662I Enter your choice:
BR1001I BRSPACE 7.00 (41)
BR1002I Start of BRSPACE processing: sehujakt.tbr 2012-02-01 16.11.59
BR0484I BRSPACE log file: /oracle/RD1/sapreorg/sehujakt.tbr
BR0280I BRSPACE time stamp: 2012-02-01 16.12.03
BR1009I Name of database instance: RD1
BR1010I BRSPACE action ID: sehujakt
BR1011I BRSPACE function ID: tbr
BR1012I BRSPACE function: tbreorg
BR0280I BRSPACE time stamp: 2012-02-01 16.12.08
BR0657I Input menu 353 - please enter/check input values
Options for reorganization of tables: SAPRD1.GLPCA (degree 1)
1 ~ New destination tablespace (newts) ........ []
2 ~ Separate index tablespace (indts) ......... []
3 - Parallel threads (parallel) ............... [1]
4 ~ Table/index parallel degree (degree) ...... []
5 - Create DDL statements (ddl) ............... [yes]
6 ~ Category of initial extent size (initial) . []
7 ~ Sort by fields of index (sortind) ......... []
8 - Table reorganization mode (mode) .......... [online]
Standard keys: c - cont, b - back, s - stop, r - refr, h - help
Its execution took 15 hours.
Now we have to run this same process in production, where table GLPCA is approximately 329 GB.
The table has already been archived, so the reorganization is now required to reclaim the real space in the table.
I need to run this reorganization with the best option set to avoid the long execution time seen in quality (15 hours); extrapolated to production, this table would take around 35 hours, and so far I have not found a good way to reduce the execution time of this reorganization.
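As a rough sanity check of that estimate, a simple linear-scaling assumption (illustrative Python; the figures are the ones quoted above) gives a similar number:

```python
# Linear-scaling estimate (an assumption, not a guarantee): 128 GB took
# 15 h in the quality system, so scale proportionally to 329 GB.
qa_size_gb, qa_hours = 128, 15
prod_size_gb = 329

estimated_hours = prod_size_gb * qa_hours / qa_size_gb
print(round(estimated_hours, 1))  # ~38.6 h, in line with the ~35 h quoted
```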
Thanks for your attention, I look forward to your comments
Regards
Hi,
Here is a simple execution of an online table reorganization:
-- Check table can be redefined
EXEC Dbms_Redefinition.Can_Redef_Table('SCOTT', 'EMPLOYEES');
-- Create new table with CTAS
CREATE TABLE scott.employees2
TABLESPACE tools AS
SELECT empno, first_name, salary as sal
FROM scott.employees WHERE 1=2;
-- Start Redefinition
EXEC Dbms_Redefinition.Start_Redef_Table( -
'SCOTT', -
'EMPLOYEES', -
'EMPLOYEES2', -
'EMPNO EMPNO, FIRST_NAME FIRST_NAME, SALARY*1.10 SAL');
-- Optionally synchronize new table with interim data
EXEC dbms_redefinition.sync_interim_table( -
'SCOTT', 'EMPLOYEES', 'EMPLOYEES2');
-- Add new keys, FKs and triggers
ALTER TABLE employees2 ADD
(CONSTRAINT emp_pk2 PRIMARY KEY (empno)
USING INDEX
TABLESPACE indx);
-- Complete redefinition
EXEC Dbms_Redefinition.Finish_Redef_Table( -
'SCOTT', 'EMPLOYEES', 'EMPLOYEES2');
-- Remove the interim table, which now holds the original (pre-redefinition) data
DROP TABLE employees2;
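For readers scripting this flow, the required call ordering can be made explicit; a minimal, illustrative Python helper (the schema/table names mirror the example above; it only assembles the PL/SQL calls in sequence, it does not talk to a database):

```python
# Illustrative sketch only: assemble the DBMS_REDEFINITION call sequence
# used above so that the mandatory ordering is explicit and checkable.
def redef_sequence(owner="SCOTT", orig="EMPLOYEES", interim="EMPLOYEES2"):
    pair = f"('{owner}', '{orig}')"
    triple = f"('{owner}', '{orig}', '{interim}')"
    return [
        f"DBMS_REDEFINITION.CAN_REDEF_TABLE{pair}",
        f"DBMS_REDEFINITION.START_REDEF_TABLE{triple}",
        f"DBMS_REDEFINITION.SYNC_INTERIM_TABLE{triple}",  # optional step
        f"DBMS_REDEFINITION.FINISH_REDEF_TABLE{triple}",
    ]

for call in redef_sequence():
    print(call)
```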
Regards,
Venkata S Pagolu
Edited by: Venkata Pagolu on Feb 17, 2012 1:48 PM
Similar Messages
-
Report takes a long time on table TST03
Hi,
Our developers have created a job that writes a spool order to a file. Among others, this job runs the SAPLSPOX and SAPLSPOC reports, but they take a long time when they read from or insert into table TST03.
The spool order has about 40,000 or 50,000 pages and the job can last 5,000 or 6,000 seconds. Is that normal? If not, what can I do?
Regards.
Do you clean your old spools regularly?
Note 130978 - RSPO1041 - Replacement for RSPO0041
Markus -
Printing tables takes a long time
Hi,
I am using printerJob.printDialog() to print tables (I implemented a print method).
The problem is that when the table has many entries, printing takes a very long time (printing a table of 100 entries might take a couple of minutes).
I am using the SwingWorker when printing in the following way:
final SwingWorker worker = new SwingWorker() {
    public Object construct() {
        try {
            PrinterJob printerJob = PrinterJob.getPrinterJob();
            printerJob.setPrintable(printable);
            if (!printerJob.printDialog()) {
                return 0;
            }
            NMSMainWindow.getInstance().setCursor(new Cursor(Cursor.WAIT_CURSOR));
            printerJob.print();
            NMSMainWindow.getInstance().setCursor(new Cursor(Cursor.DEFAULT_CURSOR));
            JOptionPane.showMessageDialog(NMSMainWindow.getInstance(), "Printing completed successfully", "Print", JOptionPane.INFORMATION_MESSAGE);
        } catch (PrinterException pe) {
            pe.printStackTrace();
            NMSMainWindow.getInstance().setCursor(new Cursor(Cursor.DEFAULT_CURSOR));
        }
        return 0;
    }
};
worker.start();
My question is: Is there a way to send a print job in the background, but in the meanwhile, while it is being processed, keep on working?
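The general pattern being asked for (run the slow job on a background thread, keep the main thread free) can be sketched language-neutrally; an illustrative Python version (in Swing the same idea means keeping printerJob.print() off the Event Dispatch Thread):

```python
import queue
import threading
import time

# Illustrative background-worker pattern: the slow job runs on its own
# thread and reports completion via a queue the main loop can poll.
def slow_print_job(done):
    time.sleep(0.1)  # stands in for the slow printerJob.print() call
    done.put("Printing completed successfully")

done = queue.Queue()
threading.Thread(target=slow_print_job, args=(done,), daemon=True).start()

# The main thread stays responsive while the job runs...
msg = done.get(timeout=5)  # ...and later collects the result
print(msg)
```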
Thanks,
Efrat
Edit init<sid>.ora
and add the following line
_fast=true
Seriously, you need to run the usual selects to find out 'what it is waiting for' and/or run Statspack.
Information about this can be found on the websites you have so far avoided using.
And Cost Based Optimizer is not supported for SYS in 8i.
Sybrand Bakker
Senior Oracle DBA -
Query optimization - Query is taking long time even there is no table scan in execution plan
Hi All,
The query below takes a very long time to execute, even though all required indexes are present.
Also, there is no table scan in the execution plan. I did a lot of research but I am unable to find a solution.
Please help, this is required very urgently. Thanks in advance. :)
WITH cte
AS (
    SELECT Acc_ex1_3
    FROM Acc_ex1
    INNER JOIN Acc_ex5 ON (
        Acc_ex1.Acc_ex1_Id = Acc_ex5.Acc_ex5_Id
        AND Acc_ex1.OwnerID = Acc_ex5.OwnerID
        )
    WHERE (
        cast(Acc_ex5.Acc_ex5_92 AS DATETIME) >= '12/31/2010 18:30:00'
        AND cast(Acc_ex5.Acc_ex5_92 AS DATETIME) < '01/31/2014 18:30:00'
        )
    )
SELECT DISTINCT R.ReportsTo AS directReportingUserId
    ,UC.UserName AS EmpName
    ,UC.EmployeeCode AS EmpCode
    ,UEx1.Use_ex1_1 AS PortfolioCode
    ,(
        SELECT TOP 1 TerritoryName
        FROM UserTerritoryLevelView
        WHERE displayOrder = 6
            AND UserId = R.ReportsTo
        ) AS BranchName
    ,GroupsNotContacted AS groupLastContact
    ,GroupCount AS groupTotal
FROM ReportingMembers R
INNER JOIN TeamMembers T ON (
    T.OwnerID = R.OwnerID
    AND T.MemberID = R.ReportsTo
    AND T.ReportsTo = 1
    )
INNER JOIN UserContact UC ON (
    UC.CompanyID = R.OwnerID
    AND UC.UserID = R.ReportsTo
    )
INNER JOIN Use_ex1 UEx1 ON (
    UEx1.OwnerId = R.OwnerID
    AND UEx1.Use_ex1_Id = R.ReportsTo
    )
INNER JOIN (
    SELECT Accounts.AssignedTo
        ,count(DISTINCT Acc_ex1_3) AS GroupCount
    FROM Accounts
    INNER JOIN Acc_ex1 ON (
        Accounts.AccountID = Acc_ex1.Acc_ex1_Id
        AND Acc_ex1.Acc_ex1_3 > '0'
        )
    WHERE Accounts.OwnerID = 109
    GROUP BY Accounts.AssignedTo
    ) TotalGroups ON (TotalGroups.AssignedTo = R.ReportsTo)
INNER JOIN (
    SELECT Accounts.AssignedTo
        ,count(DISTINCT Acc_ex1_3) AS GroupsNotContacted
    FROM Accounts
    INNER JOIN Acc_ex1 ON (
        Accounts.AccountID = Acc_ex1.Acc_ex1_Id
        AND Acc_ex1.OwnerID = Accounts.OwnerID
        AND Acc_ex1.Acc_ex1_3 > '0'
        )
    INNER JOIN Acc_ex5 ON (
        Accounts.AccountID = Acc_ex5.Acc_ex5_Id
        AND Acc_ex5.OwnerID = Accounts.OwnerID
        )
    WHERE Accounts.OwnerID = 109
        AND Acc_ex1.Acc_ex1_3 NOT IN (
            SELECT Acc_ex1_3
            FROM cte
            )
    GROUP BY Accounts.AssignedTo
    ) TotalGroupsNotContacted ON (TotalGroupsNotContacted.AssignedTo = R.ReportsTo)
WHERE R.OwnerID = 109
Please mark it as an answer/helpful if you find it useful. Thanks, Satya Prakash Jugran
Hi All,
Thanks for the replies.
I have optimized that query to make it run in few seconds.
Here is my final query.
select ReportsTo as directReportingUserId,
UserName AS EmpName,
EmployeeCode AS EmpCode,
Use_ex1_1 AS PortfolioCode,
BranchName,
GroupInfo.groupTotal,
GroupInfo.groupLastContact,
case when exists
(select 1 from ReportingMembers RM
where RM.ReportsTo = UserInfo.ReportsTo
and RM.MemberID <> UserInfo.ReportsTo
) then 0 else UserInfo.ReportsTo end as memberid1,
(select code from Regions where ownerid=109 and name=UserInfo.BranchName) as BranchCode,
ROW_NUMBER() OVER (ORDER BY directReportingUserId) AS ROWNUMBER
FROM
(select distinct R.ReportsTo, UC.UserName, UC.EmployeeCode,UEx1.Use_ex1_1,
(select top 1 TerritoryName
from UserTerritoryLevelView
where displayOrder = 6
and UserId = R.ReportsTo) as BranchName,
Case when R.ReportsTo = Accounts.AssignedTo then Accounts.AssignedTo else 0 end as memberid1
from ReportingMembers R
INNER JOIN TeamMembers T ON (T.OwnerID = R.OwnerID AND T.MemberID = R.ReportsTo AND T.ReportsTo = 1)
inner join UserContact UC on (UC.CompanyID = R.OwnerID and UC.UserID = R.ReportsTo )
inner join Use_ex1 UEx1 on (UEx1.OwnerId = R.OwnerID and UEx1.Use_ex1_Id = R.ReportsTo)
inner join Accounts on (Accounts.OwnerID = 109 and Accounts.AssignedTo = R.ReportsTo)
union
select distinct R.ReportsTo, UC.UserName, UC.EmployeeCode,UEx1.Use_ex1_1,
(select top 1 TerritoryName
from UserTerritoryLevelView
where displayOrder = 6
and UserId = R.ReportsTo) as BranchName,
Case when R.ReportsTo = Accounts.AssignedTo then Accounts.AssignedTo else 0 end as memberid1
from ReportingMembers R
--INNER JOIN TeamMembers T ON (T.OwnerID = R.OwnerID AND T.MemberID = R.ReportsTo)
inner join UserContact UC on (UC.CompanyID = R.OwnerID and UC.UserID = R.ReportsTo)
inner join Use_ex1 UEx1 on (UEx1.OwnerId = R.OwnerID and UEx1.Use_ex1_Id = R.ReportsTo)
inner join Accounts on (Accounts.OwnerID = 109 and Accounts.AssignedTo = R.ReportsTo)
where R.MemberID = 1
) UserInfo
inner join (
select directReportingUserId, sum(Groups) as groupTotal, SUM(GroupsNotContacted) as groupLastContact
from (
select distinct R.ReportsTo as directReportingUserId, Acc_ex1_3 as GroupName, 1 as Groups,
case when Acc_ex5.Acc_ex5_92 between GETDATE()-365*10 and GETDATE() then 1 else 0 end as GroupsNotContacted
FROM ReportingMembers R
INNER JOIN TeamMembers T
ON (T.OwnerID = R.OwnerID AND T.MemberID = R.ReportsTo AND T.ReportsTo = 1)
inner join Accounts on (Accounts.OwnerID = 109 and Accounts.AssignedTo = R.ReportsTo)
inner join Acc_ex1 on (Acc_ex1.OwnerID = 109 and Acc_ex1.Acc_ex1_Id = Accounts.AccountID and Acc_ex1.Acc_ex1_3 > '0')
inner join Acc_ex5 on (Acc_ex5.OwnerID = 109 and Acc_ex5.Acc_ex5_Id = Accounts.AccountID )
--where TerritoryID in ( select ChildRegionID RegionID from RegionWithSubRegions where OwnerID =109 and RegionID = 729)
union
select distinct R.ReportsTo as directReportingUserId, Acc_ex1_3 as GroupName, 1 as Groups,
case when Acc_ex5.Acc_ex5_92 between GETDATE()-365*10 and GETDATE() then 1 else 0 end as GroupsNotContacted
FROM ReportingMembers R
INNER JOIN TeamMembers T
ON (T.OwnerID = R.OwnerID AND T.MemberID = R.ReportsTo)
inner join Accounts on (Accounts.OwnerID = 109 and Accounts.AssignedTo = R.ReportsTo)
inner join Acc_ex1 on (Acc_ex1.OwnerID = 109 and Acc_ex1.Acc_ex1_Id = Accounts.AccountID and Acc_ex1.Acc_ex1_3 > '0')
inner join Acc_ex5 on (Acc_ex5.OwnerID = 109 and Acc_ex5.Acc_ex5_Id = Accounts.AccountID )
--where TerritoryID in ( select ChildRegionID RegionID from RegionWithSubRegions where OwnerID =109 and RegionID = 729)
where R.MemberID = 1
) GroupWiseInfo
group by directReportingUserId
) GroupInfo
on UserInfo.ReportsTo = GroupInfo.directReportingUserId
Please mark it as an answer/helpful if you find it useful. Thanks, Satya Prakash Jugran -
Table valueset taking long time to open the LOV
Hi,
We added a table valueset to a concurrent program. The table valueset shows Transaction numbers from the ra_interface_lines_all table, which is a long list. So we added the partial-string-entry message before opening the long list, but it still takes a long time.
Please any help on this highly appreciated.
Thanks,
Samba
Hi,
Try modifying the query, or create an index to speed up the process.
Thanks & regards
Rajan -
Long time taken to fetch data from Database Table
Moved to correct forum by moderator.
I had a requirement where I need to fetch records from table A912 for the matching entries in an internal table, satisfying some conditions.
The internal table (it_out) has 1,206 entries, while the database table (A912) has 9,758,234 entries.
The Structure of Internal Table and Select Query are,
TYPES : BEGIN OF ty_a912,
matnr LIKE a912-matnr, "Material
kunwe LIKE a912-kunwe, "Ship-to party
datbi LIKE a912-datbi, "Validity end dt
datab LIKE a912-datab, "Validity start
knumh LIKE a912-knumh, "Cond rec no.
END OF ty_a912.
TYPES : BEGIN OF ty_out,
erdat LIKE vbak-erdat, "Date
vbeln LIKE vbak-vbeln, "Sales document
posnr LIKE vbap-posnr, "Item
kunnr LIKE vbpa-kunnr, "Customer
fkdat LIKE vbkd-fkdat, "Billing Date
ndc LIKE zndc-zndc, "EAN/UPC
matnr LIKE vbap-matnr, "Material
zr00p LIKE konv-kbetr, "ZR00 Price
zcarp LIKE konv-kbetr, "ZCAR Price
zrfcp LIKE konv-kbetr, "ZRFC Price
ctrnr TYPE char80, "Contract#
ctrnm TYPE char80, "Contract Name
a912p TYPE konv-kbetr,
END OF ty_out.
SELECT
matnr kunwe datbi datab knumh
FROM a912
INTO TABLE t_a912
FOR ALL ENTRIES IN t_out
WHERE matnr EQ t_out-matnr
AND kunwe EQ t_out-kunnr
AND datbi GE t_out-fkdat
AND datab LE t_out-fkdat.
It takes a very long time to process this SELECT query; is there any alternate way?
Please suggest some valid solution for this issue.
Edited by: Matt on Dec 3, 2008 10:08 AM
Prabhakar Manoharan wrote:
> Moved to correct forum by moderator.
>
> I had a requirement where i need to fetch records from table A912 for the matching entries in an internal table and satisfying some conditions.
> The Internal Table(it_out) has 1,206 entries, while Databse table(A912) has 9,758,234 entries.
>
Hi,
can you give us an SQL trace from ST05? See: The SQL Trace (ST05) Quick and Easy
The FAE will be processed in a special way: see https://forums.sdn.sap.com/click.jspa?searchID=-1&messageID=6630311
If an index is NOT supported, the FAE part will take forever...
If t_out is filled from another table, you may go for a join? But again, index support is the key.
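To illustrate the point about FOR ALL ENTRIES (FAE) processing, here is a language-neutral sketch in Python (illustrative only; the chunk size and the toy fetch function are assumptions for the demo, not SAP internals):

```python
# Illustrative sketch (not ABAP): FOR ALL ENTRIES is conceptually executed
# as a series of smaller queries over chunks of the driver table, with the
# union of the results de-duplicated at the end. This is why an index that
# supports the chunked WHERE clause is essential.
def for_all_entries(driver_keys, fetch_chunk, chunk_size=5):
    results = []
    for i in range(0, len(driver_keys), chunk_size):
        results.extend(fetch_chunk(driver_keys[i:i + chunk_size]))
    seen, unique = set(), []
    for row in results:  # implicit de-duplication of result rows
        if row not in seen:
            seen.add(row)
            unique.append(row)
    return unique

# Toy "database": fetch stands in for one generated SQL statement per chunk.
table = {1: "A", 2: "B", 3: "C"}
fetch = lambda keys: [(k, table[k]) for k in keys if k in table]
print(for_all_entries([1, 2, 2, 3], fetch, chunk_size=2))
```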
bye
yk
Edited by: YukonKid on Dec 4, 2008 10:31 AM -
Taking snapshot of oracle tables to sql server using transactional replication is taking a long time
Hi All,
I am trying to replicate around 200 Oracle tables to SQL Server using transactional replication, and it is taking a long time: the initial snapshot has been running for more than 24 hours and is still going.
Is there any way to replicate these tables faster?
Kindly help me out..
Thanks
Hi,
According to the description, I understand the replication is working, but it is very slow.
1. Check the CPU usage on the Oracle publisher and on SQL Server. This issue may be due to slow client processing (Oracle performance) or network performance issues.
2. Based on SQL Server 2008 Books Online, 'Performance Tuning for Oracle Publishers' (http://msdn.microsoft.com/en-us/library/ms151179(SQL.100).aspx), you can enable the transaction job set and follow the instructions in http://msdn.microsoft.com/en-us/library/ms147884(v=sql.100).aspx.
3. You can enable replication agent logging to check the replication behavior. Follow these steps to collect the logs.
To enable Distribution Agent verbose logging:
a. Open SQL Server Agent on the distribution server.
b. Under the Jobs folder, find the Distribution Agent job.
c. Right-click the job and choose Properties.
d. Select the Steps tab.
e. Select the Run agent step, click Edit, and append the following to the end of the scripts in the command box:
-Output C:\Temp\OUTPUTFILE.txt -Outputverboselevel 2
f. Exit the dialogs.
For more information about the steps, please refer to:
http://support.microsoft.com/kb/312292
Hope the information helps.
Tracy Cai
TechNet Community Support -
Hi,
I built a query with 4 tables inside (loaded from an Oracle DB; two of them are quite big, more than a million rows). After filtering, I tried to build relationships between the tables using the Table.Join formula. However, the process took an extremely long time to produce results (I ended it after 15 minutes), with a status bar that kept updating while the query was processing. I suppose this is because query folding wasn't working, so Power Query had to load all the data into local memory first and then do the operation, instead of doing all the work on the source system side. Am I right? If yes, is there any way to solve this issue?
Thanks.
Regards,
Qilong
Hi Curt,
Here's the query I'm referring to:
let
Source = Oracle.Database("reporting"),
AOLOT_HISTS = Source{[Schema="GEN",Item="MVIEW$_AOLOT_HISTS"]}[Data],
WORK_WEEK = Source{[Schema="GEN",Item="WORK_WEEK"]}[Data],
DEVICES = Source{[Schema="GEN",Item="MVIEW$_DEVICES"]}[Data],
AO_LOTS = Source{[Schema="GEN",Item="MVIEW$_AO_LOTS"]}[Data],
Filter_WorkWeek = Table.SelectRows(WORK_WEEK, each ([WRWK_YEAR] = 2015) and (([WORK_WEEK] = 1) or ([WORK_WEEK] = 2) or ([WORK_WEEK] = 3))),
Filter_AlotHists = Table.SelectRows(AOLOT_HISTS, each ([STEP_NAME] = "BAKE" or [STEP_NAME] = "COLD TEST-IFLEX" or [STEP_NAME] = "COLD TEST-MFLEX") and ([OUT_QUANTITY] <> 0)),
#"Added Custom" = Table.AddColumn(Filter_AlotHists, "Custom", each Table.SelectRows(Filter_WorkWeek, (table2Row) => [PROCESS_END_TIME] >= table2Row[WRWK_START_DATE] and [PROCESS_END_TIME] <= table2Row[WRWK_END_DATE])),
#"Expand Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", {"WRWK_YEAR", "WORK_WEEK", "WRWK_START_DATE", "WRWK_END_DATE"}, {"WRWK_YEAR", "WORK_WEEK",
"WRWK_START_DATE", "WRWK_END_DATE"}),
Filter_AolotHists_byWeek = Table.SelectRows(#"Expand Custom", each ([WORK_WEEK] <> null)),
SelectColumns_AolotHists = Table.SelectColumns(Filter_AolotHists_byWeek,{"ALOT_NUMBER", "STEP_NAME", "PROCESS_START_TIME", "PROCESS_END_TIME", "START_QUANTITY", "OUT_QUANTITY", "REJECT_QUANTITY",
"WRWK_FISCAL_YEAR", "WRWK_WORK_WEEK_NO"}),
Filter_Devices= Table.SelectRows(DEVICES, each ([DEPARTMENT] = "TEST1")),
SelectColumns_Devices = Table.SelectColumns(Filter_Devices,{"DEVC_NUMBER", "PCKG_CODE"}),
Filter_AoLots = Table.SelectRows(AO_LOTS, each Text.Contains([DEVC_NUMBER], "MC09XS3400AFK") or Text.Contains([DEVC_NUMBER], "MC09XS3400AFKR2") or Text.Contains([DEVC_NUMBER], "MC10XS3412CHFK") or Text.Contains([DEVC_NUMBER],
"MC10XS3412CHFKR2")),
SelectColumns_AoLots = Table.SelectColumns(Filter_AoLots,{"ALOT_NUMBER", "DEVC_NUMBER", "TRACECODE", "WAFERLOTNUMBER"}),
TableJoin = Table.Join(SelectColumns_AolotHists, "ALOT_NUMBER", Table.PrefixColumns(SelectColumns_AoLots, "AoLots"), "AoLots.ALOT_NUMBER"),
TableJoin1 = Table.Join(TableJoin, "AoLots.DEVC_NUMBER", Table.PrefixColumns(SelectColumns_Devices, "Devices"), "Devices.DEVC_NUMBER")
in
TableJoin1
Could you please give me some hints why it needs so long to process?
Thanks. -
INSERT INTO TABLE using SELECT takes long time
Hello Friends,
--- Oracle version 10.2.0.4.0
--- I am trying to insert around 2.5 lakh (250,000) records into a table using INSERT ... SELECT. The insert takes a long time and seems to be hung.
--- When I try to SELECT, the query fetches the rows in 10 seconds.
--- Any clue why it is taking so much time?
vishalrs wrote:
> Hello Friends,
hello
> --- Oracle version 10.2.0.4.0
alright
> --- I am trying to insert around 2.5 lakh records into a table using INSERT ... SELECT. The insert takes a long time and seems to be hung.
I don't know how much a lakh is, but it sounds like a lot...
> --- When I try to SELECT, the query fetches the rows in 10 seconds.
how did you test this? and did you fetch the last record, or just the first couple of hundred?
> --- Any clue why it is taking so much time
Without seeing anything, it's impossible to tell the reason.
Search the forum for "When your query takes too long"
My SELECT query (2M records) returns within a second, but creating a table (NOLOGGING) based on the same SELECT clause takes a long time.
Can anybody suggest which part I should look at to improve the performance?
Plan
SELECT STATEMENT ALL_ROWS Cost: 11 Bytes: 655 Cardinality: 1
19 FILTER
18 NESTED LOOPS Cost: 11 Bytes: 655 Cardinality: 1
15 NESTED LOOPS Cost: 9 Bytes: 617 Cardinality: 1
12 NESTED LOOPS Cost: 8 Bytes: 481 Cardinality: 1
9 NESTED LOOPS Cost: 6 Bytes: 435 Cardinality: 1
6 NESTED LOOPS Cost: 4 Bytes: 209 Cardinality: 1
3 TABLE ACCESS BY INDEX ROWID TABLE OYSTER_WEB3.TRANSACTION Cost: 2 Bytes: 155 Cardinality: 1
2 BITMAP CONVERSION TO ROWIDS
1 BITMAP INDEX SINGLE VALUE INDEX (BITMAP) OYSTER_WEB3.IX_LINE_COMMODITY_ID
5 TABLE ACCESS BY INDEX ROWID TABLE OYSTERPLUS_DATA.BRIO_SUPPLIERS Cost: 2 Bytes: 54 Cardinality: 1
4 INDEX UNIQUE SCAN INDEX (UNIQUE) OYSTERPLUS_DATA.PK_BRIO_SUPPLIERS Cost: 1 Cardinality: 1
8 TABLE ACCESS BY INDEX ROWID TABLE OYSTER3.FLAT_SITE_MV Cost: 2 Bytes: 226 Cardinality: 1
7 INDEX UNIQUE SCAN INDEX (UNIQUE) OYSTER3.PK_FLAT_SITE_MV Cost: 1 Cardinality: 1
11 TABLE ACCESS BY INDEX ROWID TABLE OYSTER3.SITE_COMMODITY_CODING Cost: 2 Bytes: 46 Cardinality: 1
10 INDEX UNIQUE SCAN INDEX (UNIQUE) OYSTER3.PK_SITE_COMMODITY_CODING Cost: 1 Cardinality: 1
14 TABLE ACCESS BY INDEX ROWID TABLE OYSTERPLUS_DATA.BRIO_COMMODITIES Cost: 1 Bytes: 136 Cardinality: 1
13 INDEX UNIQUE SCAN INDEX (UNIQUE) OYSTERPLUS_DATA.PK_BRIO_COMMODITIES Cost: 0 Cardinality: 1
17 TABLE ACCESS BY INDEX ROWID TABLE OYSTER3.SUPPLIER_ALIAS Cost: 2 Bytes: 38 Cardinality: 1
16 INDEX UNIQUE SCAN INDEX (UNIQUE) OYSTER3.PK_SUPPLIER_ALIAS Cost: 1 Cardinality: 1 -
CDHDR table query taking long time
Hi all,
A SELECT query on the CDHDR table is taking a long time. In the WHERE condition I am giving OBJECTCLAS = 'MAT_FULL', udate = sy-datum and langu = 'EN'.
Any suggestion to improve the performance? I want to select all the articles that were changed on the current date.
regards
Shibu
This will always be slow for large data volumes, since CDHDR is designed for quick access by object ID (in this case material number), not by date.
I'm afraid you would need to introduce a secondary index on OBJECTCLAS and UDATE, if that query is crucial enough to warrant the additional disk space and processing time taken by the new index.
Greetings
Thomas -
Query takes a long time on EBAN table
Hi,
I am trying to execute a simple SELECT statement on the EBAN table. This query takes an unexpectedly long time to execute.
Query is :
SELECT banfn bnfpo ernam badat ebeln ebelp
INTO TABLE gt_eban
FROM eban FOR ALL ENTRIES IN gt_ekko_ekpo
WHERE
banfn IN s_banfn AND
ernam IN s_ernam
and ebeln = gt_ekko_ekpo-ebeln AND
ebelp = gt_ekko_ekpo-ebelp.
Structure of gt_ekko_ekpo
TYPES : BEGIN OF ty_ekko_ekpo,
ebeln TYPE ekko-ebeln,
ebelp TYPE ekpo-ebelp,
bukrs TYPE ekko-bukrs,
aedat TYPE ekko-aedat,
lifnr TYPE ekko-lifnr,
ekorg TYPE ekko-ekorg,
ekgrp TYPE ekko-ekgrp,
waers TYPE ekko-waers,
bedat TYPE ekko-bedat,
otb_value TYPE ekko-otb_value,
otb_res_value TYPE ekko-otb_res_value,
matnr TYPE ekpo-matnr,
werks TYPE ekpo-werks,
matkl TYPE ekpo-matkl,
elikz TYPE ekpo-elikz,
wepos TYPE ekpo-wepos,
emlif TYPE ekpo-emlif,
END OF ty_ekko_ekpo.
Structure of GT_EBAN
TYPES : BEGIN OF ty_eban,
banfn TYPE eban-banfn,
bnfpo TYPE eban-bnfpo,
ernam TYPE eban-ernam,
badat TYPE eban-badat,
ebeln TYPE eban-ebeln,
ebelp TYPE eban-ebelp,
END OF ty_eban.
The query seems OK to me, but I am still not able to figure out the reason for this performance issue.
Please provide your inputs.
Thanks.
Richa
Hi Richa,
Maybe you are executing the query with S_BANFN empty. Still, based on note 191492, you should change your query like the following:
1st Suggestion:
if gt_ekko_ekpo[] is not initial.
SELECT banfn bnfpo INTO TABLE gt_eket
FROM eket FOR ALL ENTRIES IN gt_ekko_ekpo
WHERE
ebeln = gt_ekko_ekpo-ebeln AND
ebelp = gt_ekko_ekpo-ebelp.
if sy-subrc = 0.
delete gt_eket where banfn not in s_banfn.
if gt_eket[] is not initial.
SELECT banfn bnfpo ernam badat ebeln ebelp
INTO TABLE gt_eban
FROM eban FOR ALL ENTRIES IN gt_eket
WHERE
banfn = gt_eket-banfn
and bnfpo = gt_eket-bnfpo.
if sy-subrc = 0.
delete gt_eban where ernam not in s_ernam.
endif.
endif.
endif.
endif.
2nd Suggestion:
if gt_ekko_ekpo[] is not initial.
SELECT banfn bnfpo INTO TABLE gt_eket
FROM eket FOR ALL ENTRIES IN gt_ekko_ekpo
WHERE
ebeln = gt_ekko_ekpo-ebeln AND
ebelp = gt_ekko_ekpo-ebelp.
if sy-subrc = 0.
delete gt_eket where banfn not in s_banfn.
if gt_eket[] is not initial.
SELECT banfn bnfpo ernam badat ebeln ebelp
INTO TABLE gt_eban
FROM eban FOR ALL ENTRIES IN gt_eket
WHERE
banfn = gt_eket-banfn
and bnfpo = gt_eket-bnfpo
and ernam in s_ernam.
endif.
endif.
endif.
Hope this helps.
Regards,
R -
Query takes very long time and analyze table hangs
Hi
One of our Oracle queries is taking a very long time (i.e. more than a day) and is affecting the business requirement of getting the report in time.
I tried to analyze the table with the COMPUTE STATISTICS option; however, it hangs/runs forever on one of the huge tables.
Please let me know how to troubleshoot this issue.
Hi,
What's your Oracle version?
You should use DBMS_STATS package not ANALYZE..
Regards, -
CV04N takes long time to process select query on DRAT table
Hello Team,
While using CV04N to display DIRs, it takes a long time to process the select query on the DRAT table. This query includes all the key fields. Any idea how to analyse this?
Thanks and best regards,
Bobby
Moderator message: please read the sticky threads of this forum, there is a lot of information on what you can do.
Edited by: Thomas Zloch on Feb 24, 2012
Be aware that XP takes approx 1 GB of your RAM, leaving you with 1 GB for whatever else is running. MS Outlook is also a memory hog.
To check Virtual Memory Settings:
Control Panel -> System
System Properties -> Advanced Tab -> Performance Settings
Performance Options -> Advanced Tab - Virtual Memory section
Virtual Memory -
what are
* Initial Size
* Maximum Size
In a presentation at one of the Hyperion conferences years ago, Mark Ostroff suggested that the initial size be set to the same as the maximum. (The maximum is typically 2x physical RAM.)
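As a quick worked example of that rule of thumb (assuming 2 GB of physical RAM, the figure mentioned at the start of this reply):

```python
# Pagefile sizing per the advice above: the maximum is typically 2x
# physical RAM, and the initial size is set equal to the maximum.
physical_ram_mb = 2 * 1024  # assumption: 2 GB of RAM, as in this thread

max_pagefile_mb = 2 * physical_ram_mb
initial_pagefile_mb = max_pagefile_mb  # initial = max, per the suggestion

print(initial_pagefile_mb, max_pagefile_mb)  # 4096 4096
```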
These changes may provide some improvement. -
Hi All,
We used the Report Generation Toolkit to generate the report in Word, and with the other APIs under it we get good reports.
But when there are more data points (> 100 on all channels), it takes a long time to write all the data, create a table in Word, and generate the report.
Any suggestions on how to make this happen in a few seconds?
Please assist.
Well, I just tried my suggestion. I simulated a 24-channel data producer (I actually generated 25 numbers -- the first number was the row number, followed by 24 random numbers) and generated 100 of these for a total of 2500 double-precision values. I then saved this table to Excel and closed the file. I then opened Word (all using RGT), wrote a single text line "Text with Excel", inserted the previously-created "Excel Object", and saved and closed Word.
First, it worked (sort of). The Table in Word started on a new page, and was in a very tiny font (possibly trying to fit 25 columns on a page? I didn't inspect it very carefully). This is probably "too much data" to really try to write the whole table, unless you format it for, say, 3 significant figures.
Now, timing. I ran this four times, two duplicate sets, one with Excel and Word in "normal" mode, one in "minimized". To my surprise, this didn't make a lot of difference (minimized was less than 10% faster). Here are the approximate times:
Generate the data -- about 1 millisecond.
Write the Excel Report -- about 1.5 seconds
Write the Word Report -- about 10.5 seconds
Seems to me this is way faster than trying to do this directly in Word.
Bob Schor