Gather table stats takes a long time for a big table
The table has millions of records and is partitioned, with one index. When I analyze the table using the following syntax, it takes more than 5 hours.
I tried AUTO_SAMPLE_SIZE and also various ESTIMATE_PERCENT values like 20, 50, 70, etc., but the time is almost the same.
exec dbms_stats.gather_table_stats(ownname=>'SYSADM',tabname=>'TEST',granularity =>'ALL',ESTIMATE_PERCENT=>100,cascade=>TRUE);
What should I do to reduce the analyze time for big tables? Can anyone help me?
Hello,
The behaviour of ESTIMATE_PERCENT may change from one release to another.
In some releases, when you specify a "too high" ESTIMATE_PERCENT (> 25% or so), the statistics are in fact collected over 100% of the rows, as in COMPUTE mode: see "Using DBMS_STATS.GATHER_TABLE_STATS With ESTIMATE_PERCENT Parameter Samples All Rows [ID 223065.1]".
For later releases, *10g* or *11g*, you have the possibility to use the following value:
estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE
In fact, you may use it even in *9.2*, but in that release a specific estimate value is recommended.
Moreover, starting with *10.1*, it's possible to schedule the statistics collection using DBMS_SCHEDULER and specify a window so that the job doesn't run during production hours.
So the answer may depend on the Oracle release and also on the application (SAP, PeopleSoft, ...).
Best regards,
Jean-Valentin
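For reference, the suggestions above could be combined like this (a sketch only; the owner, table name, and parallel degree are placeholders taken from the question, and AUTO_SAMPLE_SIZE is the 10g/11g recommendation):

```sql
-- Let Oracle pick the sample size and parallelize the scan.
-- GRANULARITY 'AUTO' gathers only the (sub)partitions that need it,
-- instead of 'ALL', which reworks every level every time.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'SYSADM',
    tabname          => 'TEST',
    granularity      => 'AUTO',
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
    degree           => 4,      -- adjust to the CPUs you can spare
    cascade          => TRUE);  -- also gather the index
END;
/
```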
Similar Messages
-
Gather schema stats not running for custom schemas in EBS 12.1.1
Hi All,
We are running the Gather Schema Stats program periodically in our EBS system with the schema name "ALL", but it is not generating statistics for the custom schema. We have the custom schema registered in our EBS application. Can you please let us know if there is an issue with our setup, or if this is the standard behaviour of the Gather Schema Stats concurrent program?
Thanks,

Hi,
What percentage are you using with the Gather Schema Stats program, e.g. 10% or 20% (in the Gather Schema Stats program form)? I think there were no updates on the tables of that custom schema, which is why the gather schema program ignored it. Can you check whether there were updates?
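One way to check whether the custom schema's tables actually had changes is the modification-tracking view (a sketch; `XXCUST` is a placeholder schema name, and table monitoring must be enabled for rows to appear):

```sql
-- DML recorded since the last stats gathering; a table with no
-- inserts/updates/deletes will be skipped by change-based gathering.
SELECT table_owner, table_name, inserts, updates, deletes, timestamp
FROM   dba_tab_modifications
WHERE  table_owner = 'XXCUST'
ORDER  BY timestamp DESC;
```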
Regards -
I need your help. I have to unplug and replug the charging pin on my iPad many times until I can see the icon in the upper right showing that it is already charging. I often see a "Not Charging" message many times before it charges successfully. Need your help, guys. Thanks.
Try this first - Reset the iPad by holding down on the Sleep and Home buttons at the same time for about 10-15 seconds until the Apple Logo appears - ignore the red slider - let go of the buttons. (This is equivalent to rebooting your computer.)
The quickest way (and really the only way) to charge your iPad is with the included 10W or 12W (5W on Mini) USB Power Adapter. iPad will also charge, although more slowly, when attached to a computer with a high-power USB port (many recent Mac computers) or with an iPhone Power Adapter (5W). When attached to a computer via a standard USB port (2.5W, most PCs or older Mac computers) iPad will charge very slowly (but iPad indicates not charging). Make sure your computer is on while charging iPad via USB. If iPad is connected to a computer that’s turned off or is in sleep or standby mode, the iPad battery will continue to drain.
Apple recommends that once a month you let the iPad fully discharge & then recharge to 100%.
How to Calibrate Your Mac, iPhone, or iPad Battery
http://www.macblend.com/how-to-calibrate-your-mac-iphone-or-ipad-battery/
At this link http://www.tomshardware.com/reviews/galaxy-tab-android-tablet,3014-11.html , tests show that the iPad 2 battery (25 watt-hours) will charge to 90% in 3 hours 1 minute. It will charge to 100% in 4 hours 2 minutes. The new iPad has a larger capacity battery (42 watt-hours), so using the 10W charger will obviously take longer. If you are using your iPad while charging, it will take even longer. It's best to turn your new iPad OFF and charge over night. Also look at The iPad's charging challenge explained http://www.macworld.com/article/1150356/ipadcharging.html
Also, if you have a 3rd generation iPad, look at
Apple: iPad Battery Nothing to Get Charged Up About
http://allthingsd.com/20120327/apple-ipad-battery-nothing-to-get-charged-up-about/
Apple Explains New iPad's Continued Charging Beyond 100% Battery Level
http://www.macrumors.com/2012/03/27/apple-explains-new-ipads-continued-charging-beyond-100-battery-level/
New iPad Takes Much Longer to Charge Than iPad 2
http://www.iphonehacks.com/2012/03/new-ipad-takes-much-longer-to-charge-than-ipad-2.html
Apple Batteries - iPad http://www.apple.com/batteries/ipad.html
iPhone: Hardware troubleshooting (Power/Battery section also applies to iPad)
http://support.apple.com/kb/TS2802
Extend iPad Battery Life (Look at pjl123 comment)
https://discussions.apple.com/thread/3921324?tstart=30
New iPad Slow to Recharge, Barely Charges During Use
http://www.pcworld.com/article/252326/new_ipad_slow_to_recharge_barely_charges_during_use.html
iPad: Charging the battery
http://support.apple.com/kb/HT4060
Best Practices for iPad Battery Charging
http://www.ilounge.com/index.php/articles/comments/best-practices-for-ipad-battery-charging/
Tips About Charging for New iPad 3
http://goodscool-electronics.blogspot.com/2012/04/tips-about-charging-for-new-ipad-3.html
How to Save and Prolong the battery life of your new ipad
https://discussions.apple.com/thread/4480944?tstart=0
Prolong battery lifespan for iPad / iPad 2 / iPad 3: charging tips
http://thehowto.wikidot.com/prolong-battery-lifespan-for-ipad
iPhone, iPod, Using the iPad Charger
http://support.apple.com/kb/HT4327
Install and use Battery Doctor HD
http://itunes.apple.com/tw/app/battery-doctor-hd/id459702901?mt=8
To Extend a Device’s Battery Life, Get to Know It Better
http://tinyurl.com/b67c7xz
iPad Battery Replacement
http://www.apple.com/batteries/replacements.html
In rare instances when using the Camera Connection Kit, you may notice that iPad does not charge after using the Camera Connection Kit. Disconnecting and reconnecting the iPad from the charger will resolve this issue.
Cheers, Tom -
Execution takes time for the report in Application
I have developed a report for payroll run results. Its execution time is 3 minutes, but when I run it in the application it takes half an hour.
The query i use is...
SELECT round((sum(nvl(DECODE(PET.element_name,'Basic Salary',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'House Rent Allowance',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Transport Allowance',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Petrol Allowance',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Vehicle Allowance',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Additional HRA',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Other Allowance',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Carried Balance',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Mobile Allowance',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Fixed Over Time',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Rent Deduction',ROUND(prrv.result_value,3),0),0)) -
sum(nvl(DECODE(PET.element_name,'Vehicle Allowance Deducted',ROUND(prrv.result_value,3),0),0)) -
sum(nvl(DECODE(PET.element_name,'HRA Deducted',ROUND(prrv.result_value,3),0),0)) -
sum(nvl(DECODE(PET.element_name,'Rent Deduction',ROUND(prrv.result_value,3),0),0)) -
sum(nvl(DECODE(PET.element_name,'Insurance Deduction',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Medical Insurance',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Family Passage Allowance',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Salik Allowance',ROUND(prrv.result_value,3),0),0)) -
sum(nvl(DECODE(PET.element_name,'Family Insurance Deduction',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Fixed Incentive',ROUND(prrv.result_value,3),0),0)) +
sum(nvl(DECODE(PET.element_name,'Company Accomodation Provided',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Mess Deduction',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Normal Overtime',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Special Overtime',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Telephone Allowance',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Other Allowance',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Other Earnings',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Other Deductions',ROUND(prrv.result_value,3),0),0)) -
sum(nvl(DECODE(PET.element_name,'Salary Advance Recovery',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Air Ticket Refund',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Vehicle Fines and Charges',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Abroad Emergency Leave',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Abscond Leave',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Air Fare Paid',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Recovery Carried Balance',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Hajj Leave',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Local Leave',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Sick Leave Deduction',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Sick Leave Without certificate',ROUND(prrv.result_value,3),0),0))-
sum(nvl(DECODE(PET.element_name,'Round Off Deduction',ROUND(prrv.result_value,3),0),0))+
sum(nvl(DECODE(PET.element_name,'Round Off Earnings',ROUND(prrv.result_value,3),0),0))),0)Net_Salary,papf.national_identifier groupcode,papf.FULL_NAME,paav.effective_date,
haout.name costcentre,pg.name grade, pj.name designation ,paaf.ASSIGNMENT_NUMBER employee_number,decode(pop.org_payment_method_name,'RI_CASH','C','B')Payment
FROM pay_run_results_v prrv,pay_run_results prr,pay_assignment_actions paa,per_all_assignments_f paaf,per_all_people_f papf,hr_all_organization_units_tl haout,
per_grades pg, per_jobs pj,pay_assignment_actions_v paav,pay_personal_payment_methods_f ppp,pay_all_payrolls_f pap,pay_payroll_actions ppa,
pay_org_payment_methods_f_tl pop,pay_element_types_f PET
WHERE --prrv.assignment_action_id = 49449 and
ppp.assignment_id(+) = paa.assignment_id
AND pop.org_payment_method_id(+) = ppp.org_payment_method_id AND
paa.ASSIGNMENT_ACTION_ID=prrv.ASSIGNMENT_ACTION_ID and paaf.ASSIGNMENT_ID=paa.ASSIGNMENT_ID
and haout.organization_id=paaf.ORGANIZATION_ID
and papf.EMPLOYEE_NUMBER=paaf.ASSIGNMENT_NUMBER
AND prrv.run_result_id = prr.run_result_id
AND ppa.effective_date BETWEEN pet.effective_start_date
AND pet.effective_end_date
AND prr.element_type_id = pet.element_type_id
and prrv.CLASSIFICATION_NAME not like 'Info%'
-- and prrv.element_name not like 'Gross Salary%'
-- and prrv.element_name not like 'Annual%'
and pj.job_id(+)=paaf.job_id
and haout.name=nvl(:P_Dept,haout.name)
and ppa.PAYROLL_ACTION_ID=paa.PAYROLL_ACTION_ID
and pap.PAYROLL_ID=ppa.PAYROLL_ID
and paav.ASSIGNMENT_ID=paaf.ASSIGNMENT_ID
and paav.ASSIGNMENT_ACTION_ID=paa.ASSIGNMENT_ACTION_ID
and pap.PAYROLL_NAME=NVL(:P_PAYROLL_NAME,pap.PAYROLL_NAME)
--and papf.national_identifier = 03917
and paav.effective_date >= TO_DATE ('01-' || :p_start_date, 'DD-MON-YYYY')
AND paav.effective_date <= LAST_DAY (TO_DATE ('01-' || :p_start_date, 'DD-MON-YYYY'))
-- and paav.DATE_EARNED=to_char(:p_date,'DD-MON-YYYY')
and pg.grade_id(+)=paaf.grade_id group by papf.national_identifier,papf.FULL_NAME,paav.effective_date,
haout.name ,pg.name , pj.name,paaf.ASSIGNMENT_NUMBER,pop.org_payment_method_name,papf.employee_number
I can't analyse where it takes the time...
Regards,
Mera

I don't think that the "SUM" structure in the SELECT makes a difference in performance (and DO NOT use PL/SQL for this!). And as long as you SUM most PET.element_name values, you do not need to filter by them.
But, of course, the solution from "Sven W." would make the code MUCH nicer (took me a while to get it), and the current "format" of the join conditions would take anyone ages to maintain.
But your problem is that the execution time differs massively, right?
Do you do the same thing in both cases? E.g. bind variables in the interactive part? Do the variables have the same values?
Do you filter on the other variables like "P_DEPT" etc.?
"and haout.name=nvl(:P_Dept,haout.name)"
The good thing is that you have an execution that makes you happy, so you need to compare the SQL and the execution for both. Somewhere there must be a difference. (Is the result the same, by the way, just in case your parameters differ?)
If the results and parameters are the same, then perhaps you are not working with bind parameters in the "interactive" test and Oracle is choosing a different execution plan because it knows more about your question. If that's the case, try hinting the query so that it uses the desired execution plan when you work with parameters.
And how did you get that "execution plan"? It's missing a lot of data and it's not formatted (put code tags around it and check in "Preview" that it looks as desired).
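One way to compare the two executions is to pull the plan Oracle actually used for each run from the cursor cache (a sketch; the LIKE pattern and the substituted sql_id are placeholders you would adapt, and runtime row counts only appear if the statement ran with the gather_plan_statistics hint or STATISTICS_LEVEL=ALL):

```sql
-- Locate the statement, then display its real execution plan.
SELECT sql_id, child_number, plan_hash_value
FROM   v$sql
WHERE  sql_text LIKE 'SELECT round((sum(nvl(DECODE%';

SELECT *
FROM   TABLE(DBMS_XPLAN.DISPLAY_CURSOR('&sql_id', NULL, 'ALLSTATS LAST'));
```

Running this once for the 3-minute execution and once for the half-hour one should show where the plans diverge.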
-- andy -
Takes time for clear photo to come up (pixellated).
Hello. I am using iPhoto 2 and notice that when I click on a photo, it takes 3 or 4 seconds for the picture to become clear, ie: it looks pixellated but finally clears to a good picture. Hope someone can tell me how to fix this if possible.
Thanks.

Sorry Terrance, this is in reference to our iBook G4, NOT the iMac. (iPhoto 2 on an i5 does sound a bit bizarre, all right! Even just having iPhoto 2 is a big lag, and I will update it one of these days!)
I tried Preview and they do seem to open quicker in that, ie about 1-2 seconds as opposed to 4-5 in iPhoto.
Re your reference to size, you may have something there, as the files are quite large, i.e. 4 MB +/-. Ones in the 1 MB range open fairly quickly. -
Getting ResultSet from Statement takes time..
Hi!!
I am using Oracle 9 and my stored procedure returns a REFCURSOR. For a number of Ids input to the procedure it returns the corresponding Id,Name pairs.
For a particular set of data, the following statement takes almost 3 seconds.
ResultSet rs = (ResultSet) stmt.getObject(1);
The prefetch size is set to 10 and the number of records it returns is 2.
Can anyone tell why it is taking so much time and what can be done to improve it.
TIA,
Basu.

Hi Basu,
This just a guess -- based on the (lack of) information you have provided, but the time may be due to the SQL (i.e. stored procedure), or building the "ResultSet" (if the "ref cursor" returns a large, complex structure) or the network load, or the database load. Perhaps you could start out by seeing how long your PL/SQL function takes (in case you haven't already done so). Then you could run your java code from the same machine as the database server. Then ...?
Hope this has helped you.
Good Luck,
Avi. -
Gather table stats taking longer for Large tables
Version : 11.2
I've noticed that gathering stats (using dbms_stats.gather_table_stats) takes longer for large tables.
Since row count needs to be calculated, a big table's stats collection would be understandably slightly longer (Running SELECT COUNT(*) internally).
But for a non-partitioned table with 3 million rows, it took 12 minutes to collect the stats ? Apart from row count and index info what other information is gathered for gather table stats ?
Does table size actually matter for stats collection?

Max wrote:
Version : 11.2
I've noticed that gathers stats (using dbms_stats.gather_table_stats) is taking longer for large tables.
Since row count needs to be calculated, a big table's stats collection would be understandably slightly longer (Running SELECT COUNT(*) internally).
But for a non-partitioned table with 3 million rows, it took 12 minutes to collect the stats ? Apart from row count and index info what other information is gathered for gather table stats ?
09:40:05 SQL> desc user_tables
Name Null? Type
TABLE_NAME NOT NULL VARCHAR2(30)
TABLESPACE_NAME VARCHAR2(30)
CLUSTER_NAME VARCHAR2(30)
IOT_NAME VARCHAR2(30)
STATUS VARCHAR2(8)
PCT_FREE NUMBER
PCT_USED NUMBER
INI_TRANS NUMBER
MAX_TRANS NUMBER
INITIAL_EXTENT NUMBER
NEXT_EXTENT NUMBER
MIN_EXTENTS NUMBER
MAX_EXTENTS NUMBER
PCT_INCREASE NUMBER
FREELISTS NUMBER
FREELIST_GROUPS NUMBER
LOGGING VARCHAR2(3)
BACKED_UP VARCHAR2(1)
NUM_ROWS NUMBER
BLOCKS NUMBER
EMPTY_BLOCKS NUMBER
AVG_SPACE NUMBER
CHAIN_CNT NUMBER
AVG_ROW_LEN NUMBER
AVG_SPACE_FREELIST_BLOCKS NUMBER
NUM_FREELIST_BLOCKS NUMBER
DEGREE VARCHAR2(10)
INSTANCES VARCHAR2(10)
CACHE VARCHAR2(5)
TABLE_LOCK VARCHAR2(8)
SAMPLE_SIZE NUMBER
LAST_ANALYZED DATE
PARTITIONED VARCHAR2(3)
IOT_TYPE VARCHAR2(12)
TEMPORARY VARCHAR2(1)
SECONDARY VARCHAR2(1)
NESTED VARCHAR2(3)
BUFFER_POOL VARCHAR2(7)
FLASH_CACHE VARCHAR2(7)
CELL_FLASH_CACHE VARCHAR2(7)
ROW_MOVEMENT VARCHAR2(8)
GLOBAL_STATS VARCHAR2(3)
USER_STATS VARCHAR2(3)
DURATION VARCHAR2(15)
SKIP_CORRUPT VARCHAR2(8)
MONITORING VARCHAR2(3)
CLUSTER_OWNER VARCHAR2(30)
DEPENDENCIES VARCHAR2(8)
COMPRESSION VARCHAR2(8)
COMPRESS_FOR VARCHAR2(12)
DROPPED VARCHAR2(3)
READ_ONLY VARCHAR2(3)
SEGMENT_CREATED VARCHAR2(3)
RESULT_CACHE VARCHAR2(7)
09:40:10 SQL>
Does table size actually matter for stats collection?

Yes.
Handle: Max
Status Level: Newbie
Registered: Nov 10, 2008
Total Posts: 155
Total Questions: 80 (49 unresolved)
why so many unanswered questions? -
How to gather table stats for tables in a different schema
Hi All,
I have a table present in one schema and I want to gather stats for the table in a different schema.
I gave GRANT ALL ON SCHEMA1.T1 TO SCHEMA2;
And when I tried to execute the command to gather stats using
DBMS_STATS.GATHER_TABLE_STATS(OWNNAME=>'SCHEMA1',TABNAME=>'T1');
The function is failing.
Is there any way we can gather table stats for tables in one schema in an another schema.
Thanks,
MK.

You need to grant ANALYZE ANY to SCHEMA2.
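In other words, a sketch of the fix (GRANT ALL on the table covers DML, not stats gathering; the grant below must be run by a suitably privileged user):

```sql
-- Gathering stats on another schema's table needs the ANALYZE ANY privilege.
GRANT ANALYZE ANY TO schema2;

-- Then, connected as SCHEMA2:
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCHEMA1', tabname => 'T1');
END;
/
```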
SY. -
Gather schema stats for all objects in DB taking too much time
Dear ,
We schedule the concurrent request "Gather Schema Statistics" weekly to gather ALL schemas in the DB. It takes 4.5 hours to complete on Test and 5 hours on the Production system. We have only HR in Production, and very soon it will add Supply Chain and Financials, so I'm afraid this time will increase in the near future. How could we reduce this time? I tried to gather stats only for changed objects using the "GATHER AUTO" option instead of "GATHER", but it gave me the following error:
Cause: FDPSTP failed due to ORA-20001: NEW TABLE is an invalid identifier
ORA-06512: at "APPS.FND_STATS", line 799
ORA-06512: at line 1
The SQL statement being executed at the time of the error was: SELECT R.Conc_Login_Id,
Start of log messages from FND_FILE
In GATHER_SCHEMA_STATS , schema_name= ALL percent= 10 degree = 8 internal_flag= NOBACKUP
ORA-20001: NEW TABLE is an invalid identifier
End of log messages from FND_FILE
Please, if anyone knows how to reduce the time for gathering stats, or how to solve this error, it will be much appreciated.

Hi,
i tried to gather stats only with changed objects using " GATHER AUTO " option instead of " GATHER "
You do not have to change this parameter; just schedule the concurrent program for HR (as suggested above), and you can schedule it for other schemas on a regular basis (at the weekends) when there is no activity on the system. Please see (Note: 168136.1 - How Often Should Gather Schema Statistics Program be Run?).
Regards,
Hussein -
Hi All,
DB version:10.2.0.4
OS:Aix 6.1
I want to gather table stats for a table since the query which uses this table is running slow. Also I noticed that this table is using full table scan and it was last analyzed 2 months back.
I am planning to execute the below query for gathering the stats. The table has 50 million records.
COUNT(*)
51364617
I expect this is going to take a long time if I execute the query like below.
EXEC DBMS_STATS.gather_table_stats('schema_name', 'table_name');
My doubts specified below.
1. can i use the estimate_percent parameter also for gathering the stats ?
2. how much percentage should I specify for the parameter estimate_percent?
3. what difference will it make if I use the estimate_percent parameter?
Thanks in advance
Edited by: user13364377 on Mar 27, 2012 1:28 PM

If you are worried about the stats gathering process running for a long time, consider gathering stats in parallel.
1. Can you use estimate_percent? Sure! Go for it.
2. What % to use? Why not let the database decide with auto_sample_size? Various "rules of thumb" have been thrown around over the years, usually around 10% to 20%.
3. What difference will it make? Very little, probably. Occasionally you might see where a small sample size makes a difference, but in general it is perfectly ok to estimate stats.
Perhaps something like this:
BEGIN
dbms_stats.gather_table_stats(ownname => user, tabname => 'MY_TABLE',
estimate_percent => dbms_stats.auto_sample_size, method_opt=>'for all columns size auto',
cascade=>true,degree=>8);
END;
/
hi!
I have some confusions about analyze and gather table stats command. Please answers my questions to remove these confusions:
1 - What's the major difference between analyze and gather table stats?
2 - Which is better?
3 - Suppose I am running the analyze/stats table command while some transactions are being performed on the process table; what will the effect on performance be?
Hopes you will support me.
regards
Irfan Ahmad

[email protected] wrote:
1 - What’s major difference between analyze and gather table stats ?
2 - Which is better?
3 - Suppose i am running analyze/stats table command when some transactions are being performed on the process table then what will be affected of performance?

1. analyze is older and probably not being extended with new functionality/support for new features
2. overall, dbms_stats is better - it should support everything
3. Any queries running when the analyze takes place will have to use the old stats
Although dbms_stats is probably better, I find the analyze syntax LOTS easier to remember! The temptation for me to be lazy and just use analyze in development is very high. For advanced environments though dbms_stats will probably work better.
There is one other minor difference between analyze and dbms_stats. There's a column in the user/all/dba_histograms view called endpoint_actual_value that I have seen analyze populate with the data value the histogram was created for but have not seen dbms_stats populate. -
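To see that column for yourself, a sketch along these lines (the table `T`, column `COL`, and bucket count are placeholder names for illustration):

```sql
-- Old-style ANALYZE, which has been seen to fill ENDPOINT_ACTUAL_VALUE:
ANALYZE TABLE t COMPUTE STATISTICS FOR COLUMNS col SIZE 75;

-- Compare what the histogram view shows after ANALYZE vs. after
-- DBMS_STATS.GATHER_TABLE_STATS with a METHOD_OPT histogram:
SELECT endpoint_number, endpoint_value, endpoint_actual_value
FROM   user_histograms
WHERE  table_name = 'T' AND column_name = 'COL'
ORDER  BY endpoint_number;
```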
Down time for 2LIS_03_BX setup table
Hello SDN,
I have a data reconciliation issue in my BW server for inventory management, for data sources 2LIS_03_BX and 2LIS_03_BF. For this I have to refill the setup table of 2LIS_03_BX on the R3 server. To fill the setup table on the R3 system, I need to ask the client for down time.
I need to know how to calculate the required down time for filling the setup table for data source 2LIS_03_BX.

Dear Pravender,
I understand your statement ("For the complete load, check in how much time you can do initialization? That much down time only you need, later you can fill setup tables for history data and load.") as the following steps:
1. In the down time, start initialization without data transfer info package in BIW for 2LIS_03_BF and 2LIS_03_UM.
2. Then during up time (after releasing down time, transactional data posting allowed in R3), fill setup table for 2LIS_03_BX.
3. Run full upload info package for 2LIS_03_BX
4. Start Delta info package for data sources 2LIS_03_BF and 2LIS_03_UM
Let me know if I am right about the above procedure. These steps will allow us to take very little down time.
Thanks for the reply
Regards,
Jaydeep
Edited by: Jaydeepsinh Rathore on Sep 4, 2009 8:23 AM -
Gather table stats after shrink?
Hello,
do I need (or is it useful) to run dbms_stats.gather_table_stats after table shrink or does oracle update statistics automatically?
Regards,
Arndt

Hi,
I'd suggest you write a script run from crontab to gather statistics automatically.
You can do the same via Enterprise Manager scheduling it on daily/hourly basis you wish.
select 'Gather Schema Stats' from dual;
select 'Started: '|| to_char(sysdate, 'DD-MM-YYYY HH24:MI:SS') from dual;
-- exec dbms_stats.gather_schema_stats('<my_schema>');
select '- for all columns size auto' from dual;
exec DBMS_STATS.gather_schema_stats(ownname => '<my_schema>', estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE, method_opt => 'FOR ALL COLUMNS SIZE AUTO', degree => DBMS_STATS.AUTO_DEGREE, cascade => true, force => true);
select '- for all indexed columns size skewonly' from dual;
exec DBMS_STATS.gather_schema_stats(ownname => '<my_schema>', estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE, method_opt => 'FOR ALL INDEXED COLUMNS SIZE SKEWONLY', degree => DBMS_STATS.AUTO_DEGREE, cascade => true, force => true);
select 'Finished: '|| to_char(sysdate, 'DD-MM-YYYY HH24:MI:SS') from dual;

Good luck!
http://dba-star.blogspot.com/ -
Temp table, and gather table stats
One of my developers is generating a report from Oracle. He loads a subset of the data he needs into a temp table, then creates an index on the temp table, and then runs his report from the temp table (which is a lot smaller than the original table).
My question is: Is it necessary to gather table statistics for the temp table, and the index on the temp table, before querying it?

It depends. Yesterday I had a very bad experience with stats: one of my tables had NUM_ROWS 300 while count(*) was 7 million, and the database version is 9.2.0.6 (a bad era, with optimizer bugs), so queries started breaking, with lots of buffer busy and latch free waits. It took a while to figure out, but I deleted the stats and everything came under control. My point is that statistics can be both good and bad: once you start collecting them, you should keep an eye on them.
Thanks. -
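If you do gather stats on such a work table, it can be a one-liner right after the load and index creation (a sketch; the table name is a placeholder for the developer's temp table):

```sql
-- Gather table stats, and index stats via CASCADE, after loading the
-- subset, so the optimizer sees realistic row counts for the report query.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname => USER,
    tabname => 'REPORT_SUBSET_T',
    cascade => TRUE);
END;
/
```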
On refreshing the page ,the table takes time to appear completely.(in adf)
Hi,
I have a table inside the page. I have set its property "content delivery=immediate", but even then it takes time to load completely (I have set "column selection=multiple" and given a width in % to every column, but it takes time to stretch to 100% on refresh).
Please help
thanks>>
I have made its property "content delivery=immediate"
even then its taking time to load completely
(i have made its property "column selection=multiple" and have given
width in % to every column.....but it takes time to stretch to 100% on refresh ).
>>
I am not sure that stretching the fields is what makes the time too long.
AFAIK, an overly long load time usually comes from the query in your VO; don't make it complex.
The iterator range size, the immediate option, the content delivery options, and an enormous number of records can each add a little time, but not too much.