Oracle JRockit Real Time 3.1.2 and WebLogic?

I went to http://www.oracle.com/technology/software/products/jrockit/index.html and found a product called Oracle JRockit Real Time 3.1.2 for Windows 64-bit. My question is: in order to get WebLogic 9.2 working on a Windows 2003 R2 64-bit machine, must I download "Oracle JRockit Real Time 3.1.2 for windows 64 bit", or do I not need to download anything? So far I have only downloaded a file called jdk-6u18-windows-x64.exe. Please clarify this for me, thanks! One more thing: what does Oracle JRockit Real Time 3.1.2 actually do? Is it a kind of JRE?

Hi,
For more information on JRockit Real Time, have a read of: http://www.oracle.com/technology/products/jrockit/jrrt/index.html
As far as I am aware you don't have to install the Real Time product; you can install the standalone JRockit JVM and use that with WebLogic. I think newer versions of JRockit come bundled as Real Time, which is the JVM plus the monitoring and analysis tools. There are still standalone JRockit JVM installs available; you can get them from Oracle E-Delivery.
JRockit is a JVM optimized for use with WebLogic, so you should get better overall performance.
Cheers
John
http://john-goodwin.blogspot.com/

Similar Messages

  • Welcome to the Oracle JRockit Real Time forum

The JRockit forum over at forums.bea.com has been in read-only mode since the first of July - this is now the official forum for the JRockit Real Time product.
    Best regards,
    Stefan

    Bonjour "Etudiant from Tunisia",
    I'm not sure what you mean with "please give me the user/password".
    As far as I know there is no online OWB available where you could try out the product (if that is what you mean).
    If you would like to learn more about OWB, read online documentation or for instance the link that is provided in an earlier post in this thread.
    Otherwise simply install OWB and try it yourself. The Installation and Configuration Guide is clear enough even without much experience installing Oracle software, and the OWB User Guide provides some basic insight on working with OWB itself.
    Good luck, Patrick
PS If people are still expecting what Igor mentioned when he started this forum ("Oracle Warehouse Builder (OWB) product management and development will monitor the discussion forum on regular basis"), don't count on it; fortunately the product has been around long enough now that there are quite a few users who can share usable insights and/or drop some useful lines on new threads... ;-)

  • What is the real time use of implicit and explicit cursors in pl/sql

What is the real-world (practical) use of implicit and explicit cursors in PL/SQL? Please tell me.

You can check the following link:
http://www.smart-soft.co.uk/Oracle/oracle-plsql-tutorial-part5.htm
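For a concrete feel of the difference, here is a minimal sketch (assuming the classic SCOTT.EMP demo table; adjust the table and column names to your own schema):
DECLARE
   v_name emp.ename%TYPE;
   -- Explicit cursor: you control OPEN, FETCH and CLOSE; handy for multi-row sets
   CURSOR c_emp IS
      SELECT ename FROM emp WHERE deptno = 10;
BEGIN
   -- Implicit cursor: SELECT ... INTO; Oracle opens, fetches and closes for you
   SELECT ename INTO v_name FROM emp WHERE empno = 7839;
   DBMS_OUTPUT.PUT_LINE('Implicit: ' || v_name);
   OPEN c_emp;
   LOOP
      FETCH c_emp INTO v_name;
      EXIT WHEN c_emp%NOTFOUND;
      DBMS_OUTPUT.PUT_LINE('Explicit: ' || v_name);
   END LOOP;
   CLOSE c_emp;
END;
/
In practice, the implicit form is the idiom for single-row lookups (it raises NO_DATA_FOUND or TOO_MANY_ROWS for you), while an explicit cursor gives you row-by-row control over multi-row result sets.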
But I have a question: are you a student?
    Regards.
    Satyaki De.

  • Using JRockit Real Time with Large Heap

I want to know whether JRockit Real Time can work with an extremely large heap, e.g. 100 GB, and still provide effective deterministic GC, before proceeding with an evaluation for my application.
    Thanks in advance!
    Wing

    Hi Wing,
In general, extremely large heaps can make all GCs slower, and this is true for the Deterministic GC as well; however, it is very application dependent. 100 GB is above our standard recommendation, but you can download JRockit Real Time and give it a try.

  • Oracle Enterprise Real-Time Scheduling (Sidewinder)

    Hi Friends,
Any idea where I can find some documentation on Oracle Enterprise Real-Time Scheduling (Sidewinder)?
    Regards,
    Anirban Roy

    Start here:
    http://www.oracle.com/corporate/press/2007_mar/030507_Oracle%20Utilities%20MWM.htm
Those contacts might lead you to the documentation.

  • Oracle SGA Real Time Consumption Information(9i,10g and 11g)

    Hello,
    I need to prepare a comparative analysis report of SGA for an Oracle Production Instance
The analysis would show the memory pre-allocated to SGA components vs. the real-time consumption of memory by these components. I need to do this for each of the following components:
    SGA itself
    Fixed Size
    Variable Size
    Database Buffers
    Redo Buffers
The memory pre-allocated to the above SGA components can be obtained by querying v$sga. But from where do I get their real-time (current) memory consumption in an Oracle production environment?
In addition to the above, I need the same information (pre-allocated and real-time consumption) for the following:
    Keep buffer cache
    Recycle buffer cache
    Specific block size caches
    Shared pool
    Large pool
    Java pool
    streams pool
Which tables do I need to consider in order to derive 1) the pre-allocated memory and 2) the real-time consumption for the above-mentioned SGA components?
Please advise.
    Thank you for your time in reading this post.
    Thanks,
    Ruchir

    Hi,
Have a look at v$sgastat. Also, use Statspack on 9i and AWR reports on 10g. Note that the caches won't grow unless they are used. The parameters you have specified in the parameter file, like sga_target (10g onwards) and possibly the other pools if you have specified them, show you what the caches can grow to.
For example, you could just log onto the DB and do 'show parameter sga_' or 'show parameter shared_pool' and you will see values for these. Also, it depends whether you are running in automatic memory management mode - where the sga_target parameter is set - or manual. 9i will be manual, but 10g could be auto. In the manual case (9i), check out the parameters individually.
Also, read the docs about the parameters shown and you will see what they say about them. There is a lot in the docs about performance tuning and monitoring of the instance. You might even learn some other interesting facts while reading through the docs...
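For example, a minimal sketch of the queries involved (v$sga and v$sgastat exist in 9i onwards; v$sga_dynamic_components on 10g and, I believe, 9iR2):
-- Pre-allocated totals (fixed size, variable size, database buffers, redo buffers)
SELECT * FROM v$sga;
-- Current consumption, broken down by pool and area, in bytes
SELECT pool, name, bytes FROM v$sgastat ORDER BY pool, bytes DESC;
-- Current size of each dynamically managed component, where available
SELECT component, current_size FROM v$sga_dynamic_components;
Comparing v$sga against the per-pool figures in v$sgastat (look for the 'free memory' rows) gives you the allocated-versus-used picture you are after.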
    Hope this helps,
    Rob
    http://www.ora00600.com

Oracle 11g "real time apply" not working?

We have a database originally on Oracle 10.2.0.4 and we upgraded it to 11.1.0.7.
After that we created a standby database and tried to use the "real time apply" feature.
The primary database can transfer log files to the standby database, and the standby database can also apply logs. The problem is that it does NOT work with "real time apply".
Any idea what's wrong?
    === procedures ====== (standby database)
    SQL> startup mount;
    ORACLE instance started.
    Total System Global Area 2087780352 bytes
    Fixed Size 2161272 bytes
    Variable Size 1795163528 bytes
    Database Buffers 251658240 bytes
    Redo Buffers 38797312 bytes
    Database mounted.
    SQL> alter database open read only;
    Database altered.
    SQL> alter database recover managed standby database using current logfile disconnect;
    Database altered.
    SQL> select PROTECTION_MODE, PROTECTION_LEVEL, DATABASE_ROLE, SWITCHOVER_STATUS, OPEN_MODE, GUARD_STATUS from v$database;
PROTECTION_MODE: MAXIMUM PERFORMANCE
PROTECTION_LEVEL: MAXIMUM PERFORMANCE
DATABASE_ROLE: PHYSICAL STANDBY
SWITCHOVER_STATUS: NOT ALLOWED
OPEN_MODE: MOUNTED
GUARD_STATUS: NONE
    SQL> select process, status from v$managed_standby;
    PROCESS STATUS
    ARCH CONNECTED
    ARCH CONNECTED
    ARCH CONNECTED
    ARCH CONNECTED
    RFS IDLE
    MRP0 APPLYING_LOG
    6 rows selected.
    ========== Primary database init.ora file setup =====
    ### for DG use
    db_unique_name = DBPMY
    log_archive_config='dg_config=(DBPMY,DBSBY)'
    log_archive_dest_1='LOCATION=/Archive/DBPMY/arch/arch MANDATORY'
    log_archive_dest_2='service=DBSBY valid_for=(online_logfiles,primary_role) db_unique_name=DBSBY LGWR ASYNC=20480 OPTIONAL REOPEN=15 NET_TIMEOUT=30'
    *.log_archive_format='DBPMY_%r_%t_%s.arc'
    log_archive_dest_state_1 = enable
    log_archive_dest_state_2 = enable

    There are a couple of things to look at.
    1. Real time apply requires standby redo logs on the standby database. On the standby database run this query:
    SELECT * FROM v$logfile where type = 'STANDBY';
If you get 0 rows back, you'll need to create standby logfiles.
The general guideline is to size them exactly like your redo logs, but add one additional standby log group to ensure it doesn't become a bottleneck.
    2. Get the size of your logfiles:
    SELECT GROUP#, BYTES FROM V$LOG;
3. For example, if you have 3 redo logs that are 50 MB in size, create 4 standby redo logs of 50 MB each and don't multiplex them.
    ALTER DATABASE ADD STANDBY LOGFILE ('/Archive/DBSBY/onlinelog/slog1.rdo') SIZE 50M;
    ALTER DATABASE ADD STANDBY LOGFILE ('/Archive/DBSBY/onlinelog/slog2.rdo') SIZE 50M;
    ALTER DATABASE ADD STANDBY LOGFILE ('/Archive/DBSBY/onlinelog/slog3.rdo') SIZE 50M;
    ALTER DATABASE ADD STANDBY LOGFILE ('/Archive/DBSBY/onlinelog/slog4.rdo') SIZE 50M;
    4. Cancel recovery on standby
    recover managed standby database cancel;
    5. Restart recovery using real time apply
    recover managed standby database using current logfile disconnect;
6. To validate that real time apply is working, you can check a few places.
- The alert log on the standby will say that it's using real time apply
OR
- Check the primary:
SELECT status, recovery_mode FROM v$archive_dest_status WHERE dest_name = 'LOG_ARCHIVE_DEST_2';
If the recovery_mode is "MANAGED REAL TIME APPLY" then real time apply is working; if it's anything else then we'll need to check more things.
NOTE: if you are going to allow your current primary to switch roles and become a standby, you'll want to create standby redo logs on the primary as well.
    Sometimes recovery gets "stuck" and simply resetting the destination parameters can resolve it:
    alter system set log_archive_dest_2='service=DBSBY valid_for=(online_logfiles,primary_role) db_unique_name=DBSBY LGWR ASYNC=20480 OPTIONAL REOPEN=15 NET_TIMEOUT=30';
    There are some other things we can check next but let's start with the easiest fixes first.
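For completeness, a quick sketch of how to confirm things from the standby side as well (v$standby_log shows whether the standby redo logs are in use; v$dataguard_stats reports the lag):
-- On the standby: an ACTIVE standby redo log means redo is streaming in
SELECT group#, thread#, sequence#, status FROM v$standby_log;
-- Apply and transport lag should stay near zero with real time apply
SELECT name, value FROM v$dataguard_stats WHERE name IN ('apply lag', 'transport lag');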

  • Real time logging: best practices and questions ?

I have 4 pairs of DS 5.2p6 servers in MMR mode on Windows 2003.
Each server is configured with the default setting of "nsslapd-accesslog-logbuffering" enabled, and the log files are stored on a local file system, then later centrally archived by a log-sender daemon.
I now have a requirement from a monitoring tool (used to establish correlations/links/events between applications) to provide the directory server access logs in real time.
At first glance, each directory server generates about 1.1 MB of access log per second.
    1)
I'd like to know if there are any known best practices / experiences in such a case.
    2)
Also, should I upgrade my DS servers to benefit from any log-management-related features? Should I think about using an external disk subsystem (SAN, NAS, ...)?
    3)
In DS 5.2, what's the default access log buffering policy: is there a maximum buffer size and/or time limit before flushing to disk? Is it configurable?

Usually log buffering should be enabled; I don't know of any customers who turn it off, and even if you do, it should be after careful evaluation in your environment. AFAIK, there is no configurable limit for the buffer size or a time limit before it is committed to disk.
Regarding faster disks, I had the bright idea that you could create a ramdisk and point the logs there instead of disk. Let's say the ramdisk is 2 GB max in size, you receive about 1 MB/sec in writes, and max-log-size is 30 MB. You could schedule a job to run every minute that copies the newly rotated file(s) from the ramdisk to your filesystem and then sends them over to logs HQ. If the server crashes, you'll lose up to a minute of logs. Of course, the data disappears after a reboot, so you'll need to manage that as well. Sounds like fun to try, but it may not be practical.
    Ramdisk on windows
    [http://msdn.microsoft.com/en-us/library/dd163312.aspx]
    Ramdisk on solaris
    [http://wikis.sun.com/display/BigAdmin/Talking+about+RAM+disks+in+the+Solaris+OS]
    [http://docs.sun.com/app/docs/doc/816-5166/ramdiskadm-1m?a=view]
I should ask: how real-time does this log correlation need to be?
    Edited by: etst123 on Jul 23, 2009 1:04 PM

  • Oracle 10g Real Time Backup

    Hi
My company needs to set up real-time backup for an Oracle 10g Release 2 (Standard Edition) database.
I see in the Oracle Database 10g Product Family document [Oracle Database 10g Product Family|http://www.oracle.com/technology/products/database/oracle10g/pdf/twp_general_10gdb_product_family.pdf] that Standard Edition does not support Data Guard.
Can I do it in Oracle 10g SE?
    Thanks in advance

Hi,
The trouble is that the cost of EE can be quite expensive.
Another way of creating a pseudo Data Guard would be to script a method of shipping archive files to a DR server.
    Live System
    =========
BEGIN
    IF ORACLE ONLINE THEN
        SWITCH LOG;
    END IF;
    IF ANOTHER INSTANCE OF SCRIPT RUNNING (LOCAL OR REMOTE) THEN
        ERROR;
        EXIT;
    END IF;
    IF STANDBY NOT ONLINE THEN
        ERROR;
        SEND EMAIL;
        EXIT;
    END IF;
    GET LAST SHIPPED LOG FILE TIMESTAMP FROM THE REMOTE SITE OR FROM LOCAL SERVER;
    WHILE LOG > SHIPPED LOG FILE TIMESTAMP DO
        TRANSFER LOG FILE TO REMOTE SITE;
        DO CHECKSUM ON LOG FILES;
        IF NOT MATCH THEN
            RETRY TRANSFER;
            IF FAILURE 3 TIMES THEN
                ERROR;
                SEND EMAIL;
            END IF;
        END IF;
        ' GOT THIS FAR SO ALL IS OK
        MARK THE LOG FILE AS TRANSFERRED;
    WEND;
END;
    DR System
    ========
BEGIN
    CREATE A LOCK SEMAPHORE;    ' Only one instance allowed
    CHECK LAST APPLIED LOG FILE;
    IF CONNECTION TO LIVE SERVER OK THEN
        CHECKSUM NEXT LOG FILE AGAINST LIVE SYSTEM;
        IF CHECKSUM NOT SAME THEN
            IF LOG FILE ON ITS WAY FROM LIVE THEN
                EXIT;
            END IF;
            WHILE LOGFILE EXISTS DO
                GET NEXT LOG FILE FROM LIVE;
            WEND;
        END IF;
    ELSE
        ' Extra work
        APPLY NEXT LOG FILE;
    END IF;
    APPLY LOG FILES;
    WORK OUT NO OF LOG FILES APPLIED;
    KEEP SPECIFIED PERCENTAGE OF LOG FILES;
    DELETE EXTRA LOG FILES;
    TRANSFER LAST APPLIED LOG FILE ATTRIBUTES TO LIVE SERVER;
    EXIT;
END;
I cannot give you the actual code that I have, as there are some more bits that I have added for our client, but this gives you an overview of the process involved in creating a pseudo DG environment.
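For what it's worth, the Oracle-side building blocks of such a script are plain SQL; a quick sketch (the retention window is illustrative):
-- Live system: force a log switch so the current redo gets archived
ALTER SYSTEM SWITCH LOGFILE;
-- Live system: list recently archived logs; NAME is the file to ship
SELECT sequence#, name, completion_time
  FROM v$archived_log
 WHERE completion_time > SYSDATE - 1
 ORDER BY sequence#;
-- DR system (mounted standby): apply the shipped logs
-- RECOVER STANDBY DATABASE;  (supply each log file when prompted)
Everything else - the shipping, checksums and bookkeeping - is ordinary OS scripting around those statements.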
    regards
    Alan

  • Oracle EDQ Real time Integration

    Hi,
Will it be possible to create a 1:M relationship WSDL file in EDQ? I have a requirement where Customer is the parent entity and child entities such as Address, References, Privacy Preferences etc. would have multiple entries for one customer record.
The default process to create a web service creates a 1:1 WSDL only. Is there any other approach to create a WSDL with a 1:M relationship in EDQ?
    Thanks and Regards,
    Rajesh.

    Hi Rajesh,
Your question is not very clear - perhaps you could say more about what you are trying to achieve?
There is no 'default process to create a web service' that assumes 1:1. The wizard for creating web services lets you tick a box on either the input or output interface of the web service to enable multiple records. Typical examples are:
    1. Record cleaning services 1:1
    2. Record clustering services 1:M (driving record in, many cluster keys out)
    3. Reference matching services: 1:M (working record in, matching reference records returned)
    4. Matching services M:M (driving and candidate records in, matching records returned)
    Where you are dealing with hierarchical data, you may need to use the capabilities in EDQ to consolidate or split records. EDQ works with a flat record structure but can easily consolidate or split records at several phases of a process.
    Regards,
    Mike

  • Oracle SOA Suite 11.1.1.5 and weblogic server with OEPE

    Hello
Where can I find the installers for Oracle SOA Suite 11.1.1.5 and the corresponding supported WebLogic Server with OEPE?
I want to install this on both Debian and Oracle Solaris 10, so it would be better to have generic installers.
A few months ago (2 months ago) I could find them, but now I can't find them anywhere, not even on edelivery.
Where does Oracle store all the older-version installers?
    Regards

    Hi
Download the wls1035_oepe111172 generic version from the link below.
    http://www.oracle.com/technetwork/middleware/ias/downloads/wls-main-097127.html
    Thanks,
--Vijay

  • BT NetProtect plus – Real time scanning and update...

    I have posted this message as both a pointer in case others have the same issue as well as whether anyone has found a proper resolution.
I have recently experienced problems with my BT NetProtect Plus indicating that real-time scanning has stopped, and any attempt to restart it does not work. This is accompanied by it constantly prompting that there is a new update and that I should restart the computer. This happened every time I logged on.
I have spent considerable time with the BT support people, who have now reinstalled the software about 6 times, on top of the 2 times I had already tried.
I run Windows 7 64-bit and recently installed an SSD drive. The key thing I have subsequently discovered about my upgrade is that I changed some registry settings to make my D: drive the default installation location. I followed this post:
The 64-bit version has two versions of regedit. Make this change as well:
1.) Enter into Start > Run: %systemroot%\syswow64\regedit
2.) Go to: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion
3.) Change the path in the DWORDs ProgramFilesDir and ProgramFilesDir (x86) to the new path, probably just changing the drive letter
Everything works fine apart from BT NetProtect Plus, which has the above issue when installed fresh. If I change the default install location back to the C: drive then it installs OK, and it works OK when I subsequently change the default install location back to the D: drive. When I say it works OK, that is until BT NetProtect Plus issues a major update, at which point the same problem appears again.
I am awaiting further contact from the 3rd line helpdesk, but the 2nd line team did not seem to know of this problem.
Has anyone else experienced it, or does anyone know of a proper solution?

A full McAfee scan on my desktop PC can take about 2 hours or so, and a full scan sometimes slows down my over-3-year-old desktop PC. That's why I only do a full McAfee scan once in a while, but I do a quick McAfee scan every 6 or 7 days.
I do like the McAfee shredder, though, and use that now and again. I also have McAfee QuickClean set to run at least twice a week.
I have used Norton in the past and that slowed down my last PC more than McAfee from BT does. The free version of AVG is good, but it's only an anti-virus. Some say the Windows firewall is OK, but not the best.
I also have Windows Defender set to do a quick scan every evening around 9pm, as most nights I have my desktop PC on at that time.
I also do a disk clean-up at least once a day.
    Darren

  • Regarding PSA real time scenarios and things

Hi,
I know about the PSA in theory, but I want detailed real-time scenarios concerning the PSA. For example, if a data load has failed, how do I look at the PSA, and after identifying the problem, how do I update the corresponding data target?
So please describe some real-time PSA scenarios in detail, and also send any documents, weblogs or old forum threads to [email protected].
I'll assign points as thanks.
    bye
    mohammed

    hi chakri, go through this.
    1.The Persistent Staging Area (PSA) is the inbound storage area for data from the source systems in the SAP Business Information Warehouse. The requested data is saved, unchanged from the source system. Request data is stored in the transfer structure format in transparent, relational database tables in the Business Information Warehouse. The data format remains unchanged, meaning that no summarization or transformations take place, as is the case with Info Cubes.
    2. The data records in BW are transferred to the transfer structure when you load data with the transfer method PSA. One TRFC is carried out per data package. Data is written to the PSA table from the transfer structure, and stored there. A transparent PSA table is created for every transfer structure that is activated. The PSA tables each have the same structure as their respective transfer structures.
    3. The number of fields is limited to a maximum of 255 when using TRFCs to transfer data. The length of the data record is limited to 1962 bytes when you use TRFCs.
    4. Four types of updates a) psa and data targets parallel b)psa and data targets serially c) only psa with subsequent updates to data targets d) only data targets
5. Since the data is stored in transparent database tables, we can edit the data in the PSA with ABAP routines using the PSA APIs.
6. You can delete requests with incorrect data from the PSA, provided that the request data concerned has not yet been loaded into data targets.
7. Table for PSA: there is only one table for all of a PSA's requests. One way to find the PSA table name is to right-click the PSA node and choose 'maintain export datasource'; you will see the table name /BIC/B0000xxxx (the system-generated PSA table name) ('ExtractStru.'). With SE16 you can see the contents; each request number is displayed in the field REQUEST.
8. Reporting: we cannot report directly on the PSA; since the same data is coming from the source system, we can see a report from the source system. However, technically speaking, we can report on it: create an export datasource on the PSA and report using a RemoteCube. You could even create an ODS with the same structure as the PSA, push the PSA data into it with no data transformation, and then report on that. When you create an export data source on a PSA it will start with 7 followed by your PSA name. The link below gives the reporting-on-PSA document.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/968dab90-0201-0010-c093-9d2a326969f1
    Re: Reporting on PSA
    Reporting on PSA
9. Change log and PSA: if we delete data in the change log, the data in the PSA will not be deleted; you have to delete the data in the PSA separately. The change log acts as the PSA for the data target if the ODS acts as a datasource: if the ODS supplies another cube/ODS, then the source ODS's change log is the PSA for the target ODS/cube.
10. PSA as a data source: create an export data source (it starts with 7 followed by the PSA name). Go to Data Marts in the InfoSources tab and replicate that data source. Create an InfoSource and assign this data source. Activate the transfer rules and communication structure, then create a data package and schedule the load to the data target.
11. PSA deletion:
      Re: PSA Deletion
    12. PSA data in EP:
    Re: urgent: PSA Data in EP?
13. Deleting PSA data source:
    To delete the export data source
Just go to RSA6 and find your datasource in the node 'DM', click once (to highlight it); there is a delete icon on the standard icon bar, and clicking on that should let you delete the datasource.
    Delete PSA datasource
14. Debug load in PSA:
    Re: How to Debug a Load into PSA    
To load data manually from the PSA to the data targets:
1. Go to RSA1, i.e. the Administrator Workbench.
2. Under the Modelling tab, choose the PSA option.
3. Select your application component.
4. Choose your InfoSource and expand the tree to see the requests.
5. Double-click on the PSA icon.
6. A new screen will appear asking for the number of records to display; enter the number and press Enter, and you can see the data in the PSA.
7. To update PSA data manually, go to the context menu of that request and select the option "Update data immediately"; it will be loaded to the data target.
    cheers
    ravi
Don't forget to assign points - that is the way of saying thanks on SDN.

  • How to create a Real Time Interactive Business Intelligence Solution in SharePoint 2013

    Hi Experts,
I was recently given the below requirements to architect/implement a business intelligence solution that deals with instant/real-time data modifications in the data sources. After going through many articles, e-books and expert blogs, I am still unable to piece together the right information to design an effective solution to my problem. Also, the client is ready to invest in the best infrastructure in order to achieve all the below requirements, but it still feels like a sword of Damocles hanging over my neck in every direction I go.
    Requirements
    1) Reports must be created against many-to-many table relationships and against multiple data sources(SP Lists, SQL Server Custom Databases, External Databases).
    2) The Report and Dashboard pages should refresh/reflect with real time data immediately as and when changes are made to the data sources.
3) The reports should be cross-browser compatible (must work in Google Chrome, Safari, Firefox and IE), cross-platform (Mac, Android, Linux, Windows) and cross-device compatible (tablets, laptops and mobiles).
4) The client is branding/UI conscious and wants the reports to look animated and pixel-perfect, similar to what is possible in Excel 2013 today.
5) The reports must be interactive, parameterized, sliceable, must load fast, and must have the ability to drill down or expand.
    6) Client wants to leverage the Web Content Management, Document Management, Workflow abilities & other features of SharePoint with key focus being on the reporting solution.
    7) Client wants the reports to be scalable, durable, secure and other standard needs.
Is SharePoint 2013 Business Intelligence a good candidate? I see the below limitations in the product for achieving all the above requirements.
a) Cannot use PowerPivot with Excel deployed to SharePoint, as the minimum granularity of the refresh schedule is daily. This violates Requirement 2.
b) Excel Services, PerformancePoint and Power View work in an in-memory representation mode. This violates Requirements 1 and 2.
c) SSRS does not render the reports as stated above in Requirements 3 and 4, and report rendering on the page is very slow even for sample data. This violates Requirements 6 and 7.
Has anyone been able to achieve all of the above requirements using the SharePoint 2013 platform or any other platform? Please let me know the best possible solution. If possible, redirect me to whitepapers, articles, or material that will help me design an effective solution. Eagerly looking forward to hearing from you experts!
    Please feel free to write in case you have any comments/clarifications.
    Thanks, 
    Bhargav

    Hi Experts,
Requesting your valuable inputs and support on achieving the above requirements.
Looking forward to your responses.
    Thanks,
    Bhargav

  • Multiclip Editing in Real Time

I have this concert footage I am syncing, and I am now up to 9 angles, all in standard-def DV NTSC. I have 1 or 2 more to put in, so I will have, say, 11 total angles. I've read that I can view and make cuts in real time with up to 16 angles on screen at one time. I need to see the 16 angles in REAL time and edit these concerts in real time. I've done it with 4 angles and it works great in FCP 5; I have even done it with 6 angles.
I notice when I play it back with 9 angles, though, that within about 20 seconds I start to see the timecode updating only every 5 or 10 frames, and then I start to get sound dropouts and video dropouts... sometimes the video freezes as well. I have unchecked all the boxes in the View menu that are important for speed according to the manual, but nevertheless, at around the 8th angle or so, I get this problem.
Where is the bottleneck? I have this card for my display: GeForce 6800 Ultra. And I have 4 GB of RAM. My second 400 GB hard drive has about 30 GB left on it.
Is my GeForce 6800 Ultra what determines my ability to see more streams in real time? Is there some way I can watch these 11 streams without any dropouts and edit in real time?
And if I need a faster system, what area of my system would I need to improve? Ideally I want to be able to edit 16 streams in real time without any slowdown in the sound and without video dropouts. I need to make these cuts precisely as I feel the flow of the music and see all streams together. Soon I will do this with HDV footage as well.
    Any help would be VERY much appreciated. Thanks-
    Dual 2.5 Ghz PowerPC G5   Mac OS X (10.4.1)  

    What displays are connected to the 6800?
    If it's driving two displays, I think it divides its onboard RAM between the two (128/128). I read some time ago that ATI had released a little app that allowed you to turn off the second display connector on its cards, so all of the RAM would be funneled to the main display. I don't know if there's a way to do that with the nVidia card.
But if there is, it might help. IOW, you'd have to work on a single display, but all 256 MB of RAM would be available to it.
    If you pare down the files on your media drive, so that it's less than 60% full, with all files living in the first 60% of the platters (the outer and middle tracks), that would help. To do that, you'd trash enough files to get down to 60% full then copy them all to another volume, wipe the original media volume, then move the files back.
    Make sure External Devices is set to None, so the G5 doesn't have to push video out the FW port.
Repair permissions, quit all unneeded apps and processes, and make sure your boot volume isn't more than 60% full. You could also set RT Extreme Display Quality to Low. Leave Frame Rate at Full, though, or FCP will intentionally drop frames to maintain real-time playback.
    Others may have additional suggestions, but if none of our suggestions resolve this, then you might consider buying a fast RAID array. Expensive, yes, and not typically needed for DV, but it will drive more streams of video than you can do now.
    Then again, I don't need to use multi-clip, so I don't know whether your particular problem is typical or not.
