Setting Optimal db_cache_size
How can I do it?
I've looked at V$DB_CACHE_ADVICE but don't know how to interpret it (what information is needed to determine db_cache_size).
Please help!
Thanks,
OK.
Take the contents and graph them as shown in that section of the doc. The curve has an 'if you double the cache, you get huge benefits' region, and it has an 'if you double the cache, you get almost no added benefit' region.
Stick to the 'huge benefits' portion, and stop increasing the cache when you reach the 'almost no added benefit' portion.
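The raw numbers for that graph come straight from the view. A minimal sketch, assuming db_cache_advice = ON and that the instance has been running a representative workload for a while:

```sql
-- List the predicted physical reads for each candidate cache size of
-- the DEFAULT pool; plot cache_mb against estd_physical_reads to see
-- where the curve flattens out.
SELECT size_for_estimate          AS cache_mb,
       size_factor,               -- 1.0 = current cache size
       estd_physical_read_factor, -- 1.0 = current physical reads
       estd_physical_reads
FROM   v$db_cache_advice
WHERE  name       = 'DEFAULT'
AND    block_size = (SELECT value FROM v$parameter
                     WHERE name = 'db_block_size')
ORDER  BY size_for_estimate;
```

Rows with size_factor below 1.0 show what you would lose by shrinking the cache; rows above 1.0 show what you would gain by growing it.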
Similar Messages
-
Set optimal db_cache_size param
I read this article http://articles.techrepublic.com.com/5100-10878_11-5031958.html because I want to set the optimal db_cache_size parameter on my database. But I still don't know which Oracle metric I should check to draw such a chart and choose the best value for my database. Could anybody give me some advice?
The article you read is 8 years old and doesn't apply to Oracle 10g and beyond, which are self-tuning.
Setting db_cache_size in 10g and beyond will disable self-tuning.
So if you are running 10g or beyond (again, you didn't include your version), the best things you can do are
- not to read 8-year-old articles
- not to set db_cache_size
and, better still,
- be very cautious about anything Donald K. Burleson writes, as his tuning approach often relies on generic measures, and most other experts, like Jonathan Lewis, Tom Kyte, Richard Foote, Cary Millsap and many others, fundamentally disagree with him.
Finally: how to determine db_cache_size is in the article, so I suggest you re-read it. Hint: especially read the bits on the cache advisor. Try not to be distracted by the impressive multi-color graphs.
Sybrand Bakker
Senior Oracle DBA -
PS: Excuse me for having to bring up an ancient question.
http://www.dba-oracle.com/oracle_tips_rollback_segments.htm
According to the link above, the OPTIMAL parameter is not recommended.
I'd like to understand, given the scenario below, why that is so.
SVRMGR> select USN, EXTENTS, XACTS, OPTSIZE, SHRINKS, WRAPS, EXTENDS, AVESHRINK, AVEACTIVE, STATUS from v$rollstat;

USN  EXTENTS  XACTS  OPTSIZE   SHRINKS  WRAPS  EXTENDS  AVESHRINK   AVEACTIVE   STATUS
0    15       0                0        0      0        0           0           ONLINE
12   7        1      41943040  0        6      5        0           37391448    ONLINE
2    2        0      41943040  0        0      0        0           0           ONLINE
3    2        0      41943040  2        18     17       178257920   217073014   ONLINE
4    2        0      41943040  0        0      0        0           0           ONLINE
5    2        0      41943040  0        0      0        0           0           ONLINE
6    2        0      41943040  0        0      0        0           0           ONLINE
7    2        0      41943040  0        0      0        0           0           ONLINE
8    2        0      41943040  0        0      0        0           0           ONLINE
9    2        0      41943040  2        0      0        125829120   0           ONLINE
10   2        0      41943040  0        0      0        0           0           ONLINE

Last month on another database I ran into this error:
ORA-01595: error freeing extent (1) of rollback segment (7)
ORA-01594: attempt to wrap into rollback segment (7) extent (1) which is being freed
And I was expecting the same in my scenario on this database as well: that "OPTIMAL" would make the segment shrink once that size had been reached. But that didn't happen. Could anyone please tell me the pros and cons of using it with this setup?
DB: Oracle 7.3.4
INITIAL = NEXT = 20MB
MINEXTENTS = 2; MAXEXTENTS = 121; OPTIMAL = 40MB

A rollback segment will not shrink if there are active transactions in it --- that is what "XACTS" shows.
Generally, it is preferable to NOT use OPTIMAL but to manually resize and/or drop and recreate Rollback segments periodically -- e.g. as a weekly scheduled job.
In an OLTP environment where all transactions are always approximately the same size you do not have issues. However, there are very few such real world databases. Even having a single daily batch job that is 100x larger than other transactions results in rollback segments growing oversized. OPTIMAL, in theory, was designed to handle such scenarios. However, because you cannot predict which rollback segment would be growing on any day, you can have different rollback segments growing on different days.
Most DBAs have, out of experience, found it better to resize rollback segment manually and/or drop and recreate them (manually or as scheduled jobs) in the V7/V8 days.
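That weekly job can be sketched in the V7/V8 dialect. The segment name, sizes, and tablespace here are illustrative, not taken from the output above:

```sql
-- Shrink an oversized rollback segment back to a known size:
ALTER ROLLBACK SEGMENT rbs03 SHRINK TO 40M;

-- Or drop and recreate it outright (it must be offline first):
ALTER ROLLBACK SEGMENT rbs03 OFFLINE;
DROP ROLLBACK SEGMENT rbs03;
CREATE ROLLBACK SEGMENT rbs03
  TABLESPACE rbs
  STORAGE (INITIAL 20M NEXT 20M MINEXTENTS 2 MAXEXTENTS 121);
ALTER ROLLBACK SEGMENT rbs03 ONLINE;
```

The shrink fails silently past any extent holding an active transaction, which is why a scheduled quiet-hours job is the usual home for this.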
Edited by: Hemant K Chitale on Jul 18, 2009 5:45 PM -
Hi Folks
My System:
OS: linux OEL 5.2 x86_64
Memory: 112GB
CPU: 4*6Core = 24Cores
Filesystem: ASM 11.1.0.7
RDBMS: Oracle 11.1.0.7 Archivelogmode
We have recently migrated an OLTP database (~1.3TB) to the server above. It is a single-instance DB and also the only DB on this server; there are no other apps. I have several questions about how to set the parameters. This is the current configuration:
memory_max_target big integer 95G
memory_target big integer 95G
sga_max_size big integer 65G
sga_target big integer 65G
pga_aggregate_target big integer 30G

I set sga_target to 65G because I cannot set db_cache_size to a value higher than ~28G. I also set log_buffer to 64MB, but then I found the following note on Metalink:
The Log_buffer Default Size Cannot Be Reduced In 10g R2 ID: 351857.1
I set it to 64MB because of the migration: I created a new empty 11g instance and imported the whole DB with impdp over the network, with parallel index creation for the huge tables.
Hence my redo buffer is ~212MB, which seems a bit large?
Oracle sets the memory setting to the following values:
Shared Pool 12800
Buffer Cache 48128
Large Pool 1024
Java Pool 3072
Other 1536
SQL> show sga
Total System Global Area 6.9482E+10 bytes
Fixed Size 2161480 bytes
Variable Size 1.8790E+10 bytes
Database Buffers 5.0466E+10 bytes
Redo Buffers          223121408 bytes

I have 6 redo log groups, each with 2 members of 1GB per member. And there are 8 DBWR and 4 ARCH processes.
How would you use this much memory optimally? Is the number of DBWR processes oversized, in your opinion? Do you have any other suggestions on how to use these resources optimally?
I think the "bottleneck" for this system will be the filesystem?
btw: it is not yet in production.
Thanks a lot
Regards, Oviwan

I got it now. The target for the backup set is an NFS export (NetApp filer), and I used native NFS (kNFS), which eats up all resources. Now I switched to Direct NFS; it is as fast as kNFS and it doesn't need as many resources. The backup times are also equal with dNFS and kNFS :)
But one thing is a bit crazy. On the filer I monitored the I/O: with kNFS ~5000 I/O and with dNFS ~500 I/O, i.e. 10 times fewer I/Os but the same duration... nothing else was running on this filer during the tests...
mount options: rw,bg,hard,rsize=32768,wsize=32768,nfsvers=3,nointr,timeo=600,tcp,noac -
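Going back to the memory question in that thread: on 11g, one hedged alternative to juggling sga_target and db_cache_size by hand is to rely on Automatic Memory Management alone. This is only a sketch with illustrative values, not a recommendation for that specific server:

```sql
-- Illustrative AMM-only sketch for 11g: set just the memory targets
-- and let the instance balance SGA vs. PGA on its own.
ALTER SYSTEM SET memory_max_target = 95G SCOPE = SPFILE;
ALTER SYSTEM SET memory_target     = 95G SCOPE = SPFILE;

-- Remove the hard SGA/PGA settings so AMM is free to shift memory
-- (a set sga_target acts as a minimum for the SGA, not a cap):
ALTER SYSTEM RESET sga_target           SCOPE = SPFILE SID = '*';
ALTER SYSTEM RESET pga_aggregate_target SCOPE = SPFILE SID = '*';

-- memory_max_target is static, so a restart is needed.
```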
OPTIMAL size for rollback segment in Oracle 8i
In our database we have 13 rollback segments, and the total size of all rollback segments is 33GB.
The used % shows 99.84%.
Each rollback segment is around 4GB in size.
I am unsure what OPTIMAL value to set for each rollback segment.
Can anyone help me out regarding this?
My rollback segment statistics are as follows:
SEGMENT_NAME  TABLESPACE_NAME  WAITS  SHRINKS  WRAPS  STATUS
R00           SYSTEM           0      0        12     ONLINE
RBS0          RBS              0      0        19     ONLINE
RBS01         RBS              0      0        16     ONLINE
RBS02         RBS              1      0        12     ONLINE
RBS03         RBS              0      0        11     ONLINE
RBS04         RBS              0      0        21     ONLINE
RBS05         RBS              1      0        22     ONLINE
RBS06         RBS              0      0        16     ONLINE
RBS07         RBS              0      0        15     ONLINE
RBS08         RBS              0      0        12     ONLINE
RBS09         RBS              1      0        19     ONLINE
RBS12         RBS              0      0        13     ONLINE
RBS13         RBS              0      0        18     ONLINE
SYSTEM        SYSTEM           0      0        0      ONLINE

Aman,
Definitely that would be a great approach to share the knowledge, but right now my notes are still in the shape of notes only; I haven't tested them. I simply read the forum and AskTom, and whenever I find good lines I copy and paste them, and whenever I find a similar question in the forum I paste / rephrase them there in my own language and understanding (because by then I have tested and learnt them very well).
Although I am highly obliged by your suggestion, let me become that knowledgeable first, so that I may run my own blog...!
Regards
Girish Sharma -
Usual dilemma: how to configure SGA
Hi all! I'm trying to configure my Oracle 9i database on Win2k to guarantee optimal performance. My system has 1GB RAM (this is effectively the free memory). I don't want to use sub-cache areas for the 4 tablespaces I have. I don't know the workload of my system (is there a way to know it approximately?). How can I set the init.ora parameters to achieve optimal performance? I think the parameters I have to set are:
DB_CACHE_SIZE
SHARED_POOL_SIZE
LOG_BUFFERS
DB_BLOCK_SIZE
DB_BLOCK_BUFFERS
SGA_MAX_SIZE
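For what it's worth, a purely illustrative init.ora sketch for a machine with roughly 1GB of free RAM might look like the following. These values are assumptions for the sake of example, not tuning advice; note that in 9i you set DB_CACHE_SIZE rather than DB_BLOCK_BUFFERS (the two are mutually exclusive), and then let V$DB_CACHE_ADVICE and the shared pool advisory guide adjustments under a real workload:

```
# Illustrative 9i starting values for ~1GB of free RAM -- verify
# against the advisories after running a representative workload.
sga_max_size         = 600M
db_block_size        = 8192      # fixed at database creation
db_cache_size        = 400M      # 9i style; replaces db_block_buffers
shared_pool_size     = 150M
log_buffer           = 1048576
pga_aggregate_target = 200M
```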
Thank you very much! Ste.

Thanks guys,
Got it !! Hybrid extensions.
I am using Adobe Extension builder 3 in Eclipse to code and create the zxp for Premiere pro panel.
Seeing the hybrid extension part in HTML Panels Tips: #10 Packaging / ZXP Installers | Photoshop, etc., I added a hybrid extension panel using the bundle manifest editor in Extension Builder. It creates an mxi file automatically with all the properties; I had to change the maximum host version number in the mxi file in order to make the extension work in Premiere Pro CC 2014. Using the Hybrid option in the bundle manifest editor, we can choose a set of folder paths for placing the files during installation itself, and later those files can be accessed from JavaScript functions using 'csInterface.getSystemPath(SystemPath.CODE)' to modify them.
Thanks for your replies,
Anoop NR -
K9N Platinum (not SLI) Ethernet/LAN ports shut down
Not sure if this has come up before, but I could not find this exact problem. I have the K9N Platinum with an AMD 64 X2 4200+ CPU and OCZ 1GB x2 memory. My build and install of XP went without incident. My problem is that when I start to download anything of size, the LAN shuts down. It will say "cable unplugged". If I switch to the second port, it will be good for a while, then the same thing happens. The only fix is to reboot. I am connected through a switch. I loaded the latest drivers (nVidia) from the monitor software. I found a different set on the nVidia web site and installed them, but now it will not pick up DHCP. It will work if I hard-code the IP. I read that it should be using Realtek drivers, not nVidia. Please help!!! I opened a case with support, but they have not answered yet.
That is a start on specs. You can look at anyone's signatures here to see what is most commonly looked for in specs. You are missing the PSU specs.
However, one of the first things to try is to go into the BIOS and set optimized defaults. Then you will need to make sure nothing else in the BIOS has changed, as setting optimized defaults usually changes things quite a bit.
Then make sure that you're using the drivers that came with the board, including the nVidia and network drivers.
You may have to enter Device manager and remove the Network Driver and let it restart.
You did answer one question many folks would have with the specs you gave about memory. The memory you have listed should be fine. -
I just threw in my 4 Corsair 3200 256s when I built this system, not realizing the complexity with which you can affect memory in these new systems. I have the K8N Platinum SLI board, and for the first few days I had problems of one sort or another; then I ran Memtest and Prime, which both showed serious memory problems, which I isolated to one of the sticks, which I have removed and sent back to the seller. I have 3 sticks in now and wasn't particular about which slots they went in, but they are all together.
I have begun to notice forums with people discussing memory configs and altering them to get the best or most stable performance. I then looked at my motherboard manual on this subject and it says, '...DO NOT install three memory modules on three DIMMs, or it may cause some failure.' Is it telling me not to do what I have done? I have run my system like this for 3 days without problems, apart from the system hanging a couple of times in the middle of or after I had finished playing PES4. The manual also says, 'Always insert the memory modules into the GREEN slots first, and it is strongly recommended not to insert the memory modules into the PURPLE slots while the green are left empty.'
Is this in all cases, or just if you are running dual-channel? Is it better to just fill the green slots and run 512, or carry on running 768?

1. You have not changed any BIOS settings from optimized defaults, is that correct?
2. You are running 4x256 corsair value ram (identical chips on each stick)?
3. You have not given us a "sig" with your system specs and you should do that now. See Forum guide for how to create a profile with a sig. CPU-Z first tab will give all your cpu info.
4. Have you cleared CMOS? If you have not, you should do it as described in the manual (being careful to unplug your power supply and wait 5 minutes before opening your case, moving the JBAT1 jumper, then replacing the JBAT1 jumper, closing the case, and reinserting the power supply plug). AFTER you have cleared CMOS, when you reboot for the first time you should hit "del", set optimized defaults, F10, yes to save, and then reboot normally.
If that does not work, proceed to the next steps.
5. It would be helpful if you ran CPU-Z with 4 sticks and showed us what it says under SPD tab for each slot (they should all be identical but check- if they are not all identical then your sticks are not identical and that explains everything) and then what Memory tab shows. There are things that can be done if the SPD result does or does not match Memory result, but we need to see the results first.
6. Download memtest and use it to see if you have a bad stick. Run it for about 30 minutes or until you have completed at least two complete rounds of tests. If you get errors, test sticks two at a time in slots 1 and 2 to see if there is one particular stick that is failing. It's a process of elimination, time consuming and a PITA, but nobody said perfection was easy. Clear cmos carefully as described in the manual before each test. If you think you have isolated a bad stick, test it by itself in slot 1. If you think slot 1 is bad, use a single stick for your memtest and see if that stick fails in slot 1 and passes in slot 3. If you get errors in memtest note what test the errors appeared in.
I will be checking back online in about 8 hours and in the meantime, if you post some results and answers here, maybe somebody else can help too. Phew.... I am tired just thinking about all of that. Good luck. -
So yeah, I dropped my computer and nothing comes up. I'm using an adapter to connect a monitor I have, to see if the computer still works. It shows the default Leopard galaxy picture, but the login window doesn't come up so I can log in. What do I do????
I assume this is due to the fact that the system detects your internal monitor as the default one, and that is where the system sends the main output. To force the system to send output to the external monitor, get a mouse and a physical keyboard, close the MacBook lid, and then press any key on the external keyboard.
When you get there, go to System Preferences / Displays, and manually move the Apple menu bar to the external monitor, then choose mirroring; finally, set the optimal resolution. -
Trouble with Display fitting to Samsung Monitor
Hey,
I just got an HDMI Samsung 27" monitor to connect to my Mac Pro. I set the resolution in System Preferences to 1920x1080, matching the monitor, but it still stretches things. It makes circles into ovals. It also doesn't take up the whole monitor, so it has black bars on each side. On the monitor there is no way to change this. I have heard that I might need to get a DVI-to-HDMI cord converter, but I'm not sure.
Anyone know how to fix this?
Thanks!

The video card uses DVI, and I downloaded your monitor's manual from here.
Look at section 3-1 about setting the optimal resolution (which may be too late for you for that method).
Look at section 3-4, Standard Signal Mode Table.
Section 3-6 looks interesting about the on screen display (OSD) particularly Size & Position starting on page 3-7.
The specs on page 6-5 do confirm the max resolution for that monitor is 1920 x 1080.
So, I don't know. Maybe you need to do something with that Size and Position setting with the monitor's OSD.
The video card uses DVI and the monitor uses HDMI so presumably you can use a DVI-to-HDMI adaptor or cable. I think it shouldn't matter which you use. -
Proper screen resolution for optimum work with SD
A couple of questions.
I've just set up a 27" IMAC with FC. What's surprised me is the poor quality of the SD image I'm getting in the Viewer and Canvas. Two years ago I edited another SD project on a system using two 24" Apple monitors, and at that time we were all amazed by the quality of the image. Hardly looked like SD at all. We were even impressed when we looked at the work on the full screen. Yet that's not what I'm getting now.
I'm just wondering if there has been a change in the most recent version of FC. (Mine is FCP 7; I don't know what version I was using two years ago on that other project.) I looked in the manual and under "Viewer/Pop-Up Menu" I read: "If the Viewer is scaled to anything other than 100 percent and you're displaying a DV clip, only one field is shown during playback or while scrubbing through the clip." Has this always been the case with DV, which my SD footage is? Could my problem just be a problem with this latest version of FC?
As it is, set optimally on the 27" with a second monitor for the Browser, my Viewer and Canvas are producing a 199% size image. So given that, plus the fact that I'm only seeing one field, maybe I shouldn't be surprised at the weak image I'm getting.
Second question.
After reading about this loss of one field when the DV image is at anything other than 100%, I thought I'd set the Viewer pop-up menu to 100% rather than "Fit Window". The problem here, of course, is that my image is now very small: a diagonal of only 7 inches. To try to deal with this I've played around with the iMac screen resolution, and when I change it from 2560x1440 to 1920x1080 I can get a 100%, two-field image with a 9 1/2 inch diagonal. Not great, but better.
Does anyone think this is a recommended way to go? I'm always confused about how a monitor is reproducing a format that is 525 TV lines so the issue of the optimal resolution on my screen is confusing me. (I tried setting the screen to 1600X900. Now my 100% image is 11 1/2 inches but there is a very noticeable drop in image quality.)
Thanks for any help or advice I can receive.
John

Has this always been the case with DV, which my SD footage is?
Yes. Perhaps your old system had the Viewer and Canvas windows set to 100% and your new one doesn't.
Could my problem only be a problem with this latest version of FC?
Not likely.
As it is, set optimally on the 27" with a second monitor for the Browser, my Viewer and Canvas are producing a 199% size image. So given that, plus the fact that I'm only seeing one field, maybe I shouldn't be surprised at the weak image I'm getting.
BINGO!
For best results, always use a properly calibrated, external TV monitor for critical viewing when editing.
-DH
Message was edited by: David Harbsmeier -
Video Playback Fine On iPhone - Horrible On Computer
Seen others with this issue, but I think this is NOT talked about enough, and therefore NOT addressed. Hopefully someone can help me?
Summary: I record video on the iPhone, and it plays back great on the iPhone (A PHONE !!). I get the raw footage to a computer, and the video plays back horribly choppy with the audio out of sync (on a COMPUTER !!). The results for me are the same in ALL players on my computer. Others seem to have it play fine with VLC, but not me.
Here's my stats:
iPhone 4, latest O.S., latest update.
Computer - PowerBook (laptop) G4 17", 1.67Ghz Aluminum Last PPC Edition (Higher Res model), 2 GB Ram, ALL up to date, latest possible O.S., etc..
Players tested with SAME horrible results - Quicktime PRO (default .mov file), VLC, iTunes, WMV, Flip For Mac. ALL latest versions for above computer at least, etc..
I shoot some video (any length) with my iPhone 4 (which has no video camera settings, sooooo...IDK). Video plays back flawless on iPhone.
Plug the iPhone into the computer, download the RAW uncompressed footage to the desktop via "Image Capture". From that moment on (if viewing on the computer), the raw footage's video playback acts like the computer can't handle its quality (choppy, skipping lots of frames, timing horribly off, etc.). It reacts the same way with ANY player I use. Actually, VLC is the worst; it just completely chokes. QuickTime Pro will at least play it, just horribly choppy. ALL players are up to date.
BUT if I burn a video DVD of it using Toast Pro, etc., then play it on a conventional DVD player and TV, it plays back awesome! ALSO, if I "export for web" in QuickTime Pro and wait an amazing amount of time, I get a new file (MP4 or something), and that file will play fine in any player on my computer. BUT it is of low quality, and it is BS that I have to go that low to view it, because I've played other videos downloaded from the net and various places and they play back fine via QT or ANY player for that matter. Full-screen, high-res videos too !!
So I ask, what is the issue? And why has this not been extensively discussed here, and why has Apple not at least publicly acknowledged this issue and given its official word? Is there something I can do? Why can't the iPhone have camera settings to allow me to lower its video quality? Or, more importantly, why can't my computer seem to handle a simple video file that is really NOT true HD (which is 1920x1080)? It's honestly not even much above TV resolution (720 x something). My computer's screen resolution is 1680 x 1050, and my video card is an ATI Mobility Radeon 9700, AGP 128.
I could be mistaken, but I am almost sure I've downloaded and viewed movies (in file format) via QT that are of a higher res and better quality, and they play back fine. So can I get some clarification on this please? What's going on with the iPhone's video capture? Is it out of frame rate? Or what?
Thanks for anyone's time.

mcsound wrote:
Given the specs of your computer, I would not expect smooth playback of HD video. You definitely lack the processor power and are fairly low on the amount of RAM needed to do so.
Hi mcsound, thanks for chiming in to help.
I don't know if I agree with this. Mathematical reasoning just doesn't support it. I agree consumerism does, though - lol.
Playing HD doesn't take many resources, as my old Xbox 360 plays HD movies just fine. But the real point is, 1280 x 720 is not full HD; sure, it's a higher res than standard TV, but not by some overwhelming amount that should cause issues like this. I can play other high-res videos on this computer just fine, from DVD (which actually should be more taxing on the computer), and also from file (which is literally the same).
The Xbox 360 has less processing power and less RAM than my old "out of spec" computer, and it plays HD movies just fine. The point of all this is, it just isn't adding up. The first inclination, I agree, is to go buy a new computer. But if you do the math, there is no reason why a 1.67GHz processor can't play HD movies. Because actually it can: my computer plays them fine in iMovie, and edits, renders, and burns HD video, and real 1920 x 1080 HD video at that, not this faux HD of 1280 x 720. So you see how this is not adding up?
RAM has nothing to do with this either. Playing movies from a file or disc uses almost no RAM. Editing film does, for sure, and at that, these 2GB of RAM have edited, rendered and made plenty of HD movies.
Look at the screenshots I supplied: the before and during screens of the "System Memory" (RAM) show that while playing the faux HD iPhone clip, it used only 100MB of RAM.
So again, there is no reason to think it is the specs of the computer. I agree it could be the video card, or MAYBE the processor, but again, why do other HD clips/movies play smoothly on this computer, while any taken from the iPhone don't?
If it is the case that the iPhone is just producing an amazingly high quality of 1280 x 720 (higher than other 1280 x 720 cameras for some reason), then why in the world would Apple not give the iPhone a camera setting to let us pick a "capture" setting optimized for the end application we plan on using?
Either way, this is not OK, and either way, it's funny how it gets most people buying into the whole "consumerism" thing, which is the last thing we need. Don't get me wrong, I buy things like crazy, and will be buying a couple of new computers soon, BUT there is no need for that to play HD, or for several other things people think. HD is not rocket science; HD is old and was supported even when my computer came out. My screen resolution exceeds the iPhone video clips' resolution!
There is some framerate controversy, I think, or something the iPhone is doing that other sources are not. But I just can't figure it out, and I find it strange there is not more talk about this.
Hello all,
I am hoping someone can help me. I have a dev and a prod instance, with identical rollback segment specs. The problem I have is that I am running out of space on one of my instances (dev). The prod instance's rbs's are all the same size, while the dev instance's rbs's are all over the place. One rbs (in dev) may be at 10MB and another at 800MB. I don't know why the size doesn't reduce after I issue a commit. I think the problem is that the rbs's for both instances are public and only one set of rbs's is being used. Can anyone offer any suggestions? Thank you!

I would like to suggest 2 things:
1) make all rollback segments the same size
2) set the OPTIMAL storage parameter of each rollback segment. -
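The two suggestions above can be sketched in SQL. The segment name and sizes are illustrative only:

```sql
-- Give a rollback segment an OPTIMAL size so it shrinks back
-- after a large transaction has forced it to extend:
ALTER ROLLBACK SEGMENT rbs01 STORAGE (OPTIMAL 40M);

-- To equalize the segments, shrink an oversized one back explicitly:
ALTER ROLLBACK SEGMENT rbs01 SHRINK TO 40M;
```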
I have just started reading the Concepts guide, and I learned that Oracle database data is stored in data blocks. The standard block size is specified by the DB_BLOCK_SIZE init parameter. Additionally, we can specify up to 5 other block sizes using the DB_nK_CACHE_SIZE parameters.
Let us say I define in the init.ora
DB_BLOCK_SIZE = 8K
DB_CACHE_SIZE = 4G
DB_4K_CACHE_SIZE=1G
DB_16K_CACHE_SIZE=1G
Questions:
a) Does this mean I can create tablespaces with 8K, 4K and 16K block sizes only?
b) whenever I query data from these tablespaces, it will go and sit in these respective cache sizes?
Thanks in advance.
Neel

Yes, it will give an error message if you create a tablespace with a non-standard block size without specifying the corresponding db_nk_cache_size parameter in the init parameter file.
Use the BLOCKSIZE clause of the CREATE TABLESPACE statement to create a tablespace with a block size different from the database standard block size. In order for the BLOCKSIZE clause to succeed, you must have already set the DB_CACHE_SIZE and at least one DB_nK_CACHE_SIZE initialization parameter. Further, the integer you specify in the BLOCKSIZE clause must correspond with the setting of one DB_nK_CACHE_SIZE parameter. Although redundant, specifying a BLOCKSIZE equal to the standard block size, as specified by the DB_BLOCK_SIZE initialization parameter, is allowed.
The following statement creates tablespace lmtbsb, but specifies a block size that differs from the standard database block size (as specified by the DB_BLOCK_SIZE initialization parameter):
CREATE TABLESPACE lmtbsb DATAFILE '/u02/oracle/data/lmtbsb01.dbf' SIZE 50M
EXTENT MANAGEMENT LOCAL UNIFORM SIZE 128K
BLOCKSIZE 8K;
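To tie this together: before a CREATE TABLESPACE like the one above can use a non-standard block size, the matching cache must already exist. A minimal sketch (the 16K size, cache size, and file name here are illustrative):

```sql
-- Allocate a buffer cache for 16K blocks first...
ALTER SYSTEM SET db_16k_cache_size = 1G;

-- ...then a 16K-block tablespace can be created:
CREATE TABLESPACE ts_16k
  DATAFILE '/u02/oracle/data/ts_16k01.dbf' SIZE 100M
  BLOCKSIZE 16K;
```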
Reference: http://download.oracle.com/docs/cd/B28359_01/server.111/b28310/tspaces003.htm