Optimization of Remote PM

To optimize a remote PM, it might be useful if the client could specify which fields, beyond the default fetch group, should be fetched in a single round trip. This is the same issue that eager fetching solves for database access. Ideally it would be configured separately from eager database fetching, because optimizing database access is not always the same as optimizing state transfer. In practice, though, the two could probably share one mechanism (that is, use eager fetching to configure both database access and state transfer).

Yes I agree
"Abe White" <[email protected]> wrote in message
news:c4fgg2$gl1$[email protected]..
Currently state is transferred in the same chunks as eager fetching uses. I think this is optimal from a complexity-vs-performance standpoint.
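The idea the thread converges on can be illustrated with a small, language-neutral sketch (Python; all names here are hypothetical illustrations, not JDO/Kodo API): the client widens the default fetch group for one call, so the extra fields come back in the same round trip instead of triggering follow-up fetches.

```python
# Illustrative sketch only (hypothetical names, not JDO/Kodo code): a remote
# state server that lets the client name extra fields to fetch along with the
# default fetch group, so one round trip replaces several.

DEFAULT_FETCH_GROUP = {"id", "name"}

class RemoteStateServer:
    def __init__(self, rows):
        self.rows = rows          # object id -> full field dict
        self.round_trips = 0      # count simulated network calls

    def fetch(self, oid, fields):
        """One simulated round trip returning only the requested fields."""
        self.round_trips += 1
        row = self.rows[oid]
        return {f: row[f] for f in fields}

def load_state(server, oid, extra_fields=()):
    # Client side: widen the default fetch group for this call instead of
    # issuing a second fetch later for each extra field.
    return server.fetch(oid, DEFAULT_FETCH_GROUP | set(extra_fields))

server = RemoteStateServer({1: {"id": 1, "name": "a", "address": "x", "phone": "y"}})
state = load_state(server, 1, extra_fields=["address", "phone"])
assert server.round_trips == 1
assert state["address"] == "x"
```

The point of the sketch is only the shape of the API: whether the extra-field list reuses the eager-fetch configuration or is a separate setting is exactly the open question in the message above.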

Similar Messages

  • Optimization of remote calls of EJB

    Hello,
    does the SAP Web AS support automatic optimization of remote calls of EJBs such that the overhead associated with remote calls is avoided iff the target EJB of the call actually runs in the same JVM (and therefore would allow a local call)? This would imply that in this case objects are passed by reference instead of by value.
    To clarify: I am aware of the fact that EJBs can be called through Local and LocalHome interfaces. But that prevents the distribution of the EJBs. What I am looking for is to always use Remote and Home interface (remote call) and let the AppServer optimize the call to be virtually local if possible.
    From what I know, JBoss and WebLogic support this feature. Is there anything like that for the Web AS. What do I need to configure?
    Any hint is greatly appreciated. Please let me know if you need additional clarification on my question. Thanks!
    With kindest regards,
    Nick.

    Hi Nick,
    The optimizations I was talking about are a proprietary internal functionality and not something application developers can rely on. That's why they are not documented in any external documentation. According to your problem, my proposal is to declare both the remote and local interfaces of the beans and use the proper one depending on whether the bean client wants to pass parameters by value or by reference.
    SAP does not have plans to dynamically (or automatically as you call it) switch from calling by value to calling by reference as this is not just a performance optimization - this breaks the functionality. If we decide to do it, we will have at least two problems:
    1. Incompatibility with the RMI specification
    2. Incompatibility with previous versions
    As I already mentioned, there are EJB components that rely on being called by value, no matter whether the client resides in the same JVM or is a remote one.
    I still cannot see your goal: you both insist on remote interfaces and expect objects to be passed by reference.
    Best regards,
    Viliana
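    Viliana's warning that silently switching from call-by-value to call-by-reference "breaks the functionality" can be illustrated outside of EJB entirely. A minimal sketch (Python standing in for Java; no EJB or RMI API involved, the names are made up) shows how a component that mutates its parameter behaves differently under the two calling conventions:

    ```python
    import copy

    def call_by_value(bean_method, arg):
        # Models RMI semantics: the callee receives a serialized copy.
        return bean_method(copy.deepcopy(arg))

    def call_by_reference(bean_method, arg):
        # Models an in-JVM local call: the callee shares the caller's object.
        return bean_method(arg)

    def bean_method(items):
        items.append("modified by bean")   # the component mutates its parameter
        return len(items)

    data = ["client data"]

    call_by_value(bean_method, data)
    assert data == ["client data"]         # caller's list untouched

    call_by_reference(bean_method, data)
    assert data == ["client data", "modified by bean"]  # caller sees the mutation
    ```

    A caller written against the first behavior observably breaks under the second, which is why an app server cannot flip between them as a mere performance optimization.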

  • Remote console session freezes when scrolling

    My host: Laptop running Windows XP SP3.
    Guest: Windows 2008 R2, physically on a USB hard drive.
    My VMware server version is 2.0.2.
    I have ongoing problems with the host session freezing up. I have had this problem intermittently while working in MMC on the guest, or consistently in MS Excel 2008 on the guest. Because I can consistently reproduce the problem in Excel, I'll focus on it. The session will freeze simply by selecting a cell and then dragging down to select multiple rows of cells. If I exceed the number of rows visible, the Excel session should scroll down. Instead, the entire session becomes unresponsive. I cannot move the mouse pointer or switch to any other application in the remote session. My only option is to shut down the virtual machine and restart it.
    I have also had the same problem when navigating MMC. The entire session will sometimes freeze after I click the plus sign to open up an item.  In this case, I have not yet identified under what circumstances the "freezing" occurs.

    I am experiencing what I think is the same issue. However, my VMware Tools seem to be up to date (declared version is 7.7.6, the update option is inactive), and there is no SVGA driver under c:\program files\common files\vmware\drivers (there is one in c:\program files\vmware\vmwaretools\drivers\video, though).
    The VM freezes after scrolling up/down in Notepad++; exact conditions are hard to define, but usually after 2 minutes of editing a text file a freeze is guaranteed. The only way to recover is a reboot of the VM (a shutdown never completes). Help appreciated.
    Host : Ubuntu 9.10,
    Version : VMwareServer 2.0.2
    Guest : Windows 7 Pro 32bit
    Log from attaching console to freeze:
    Jun 22 16:10:46.994: vcpu-0| Guest display topology changed: numDisplays 1
    Jun 22 16:10:46.995: vcpu-0| Guest: vmx_fb: SVGADBG: ready
    Jun 22 16:10:46.996: vcpu-0| Guest: vmx_fb: This is the primary surface: PPDEV ffb81010
    Jun 22 16:10:46.997: vcpu-0| Guest: vmx_fb: Display driver is out of sync with virtual hardware.  Disabling 3d.
    Jun 22 16:10:46.997: vcpu-0| Guest: vmx_fb:   Current hardware revision: 0.0.
    Jun 22 16:10:46.997: vcpu-0| Guest: vmx_fb:   Driver compiled against:   2.0.
    Jun 22 16:10:46.997: vcpu-0| Guest: vmx_fb: DrvGetDirectDrawInfo: Overlay flags set
    Jun 22 16:10:46.997: vcpu-0| Guest: vmx_fb: This is the primary surface: PPDEV ffb81010
    Jun 22 16:10:46.997: vcpu-0| Guest: vmx_fb: Display driver is out of sync with virtual hardware.  Disabling 3d.
    Jun 22 16:10:46.997: vcpu-0| Guest: vmx_fb:   Current hardware revision: 0.0.
    Jun 22 16:10:46.998: vcpu-0| Guest: vmx_fb:   Driver compiled against:   2.0.
    Jun 22 16:10:46.998: vcpu-0| Guest: vmx_fb: DrvGetDirectDrawInfo: Overlay flags set
    Jun 22 16:10:48.009: mks| HostOps showCursor before defineCursor!
    Jun 22 16:10:57.939: vcpu-0| GuestRpc: Channel 0, guest application toolbox.
    Jun 22 16:10:57.939: vcpu-0| TOOLS sending 'OS_PowerOn' (3) state change request
    Jun 22 16:10:57.949: vcpu-0| TOOLS autoupgrade protocol version 2
    Jun 22 16:10:57.953: vcpu-0| TOOLS ToolsCapabilityGuestTempDirectory received 1 C:\Windows\TEMP
    Jun 22 16:10:57.953: vcpu-0| TOOLS ToolsCapabilityGuestConfDirectory received C:\ProgramData\VMware\VMware Tools
    Jun 22 16:10:57.989: vcpu-0| TOOLS setting the tools version to '7398'
    Jun 22 16:10:57.999: vcpu-0| TOOLS installed version 7398, available version 7398
    Jun 22 16:10:57.999: vcpu-0| TOOLS will not be autoupgraded.
    Jun 22 16:10:57.999: vcpu-0| TOOLS Setting autoupgrade-checked TRUE.
    Jun 22 16:10:58.093: vcpu-0| Guest: toolbox: Version: build-203138
    Jun 22 16:10:58.093: vcpu-0| TOOLS unified loop capability requested by 'toolbox'; now sending options via TCLO
    Jun 22 16:10:58.397: vcpu-0| TOOLS state change 3 returned status 1
    Jun 22 16:11:25.297: mks| VNCENCODE 2 encoding mode change: (1024x768x24depth,32bpp,4096bytes/line)
    Jun 22 16:11:25.332: mks| VNCENCODE 2 encoding mode change: (1024x768x24depth,32bpp,4096bytes/line)
    Jun 22 16:11:28.630: mks| SVGA: display status changed, using optimizations for remote consoles.
    Jun 22 16:11:38.210: vcpu-0| VMMouse: CMD Read ID
    Jun 22 16:11:38.218: mks| MKS switching absolute mouse on
    Jun 22 16:11:45.233: vcpu-0| MKS Backdoor get pointer: first time, notify tools are running
    Jun 22 16:11:46.905: vcpu-0| TOOLS unified loop capability requested by 'toolbox-dnd'; now sending options via TCLO
    Jun 22 16:11:46.938: vcpu-0| GuestRpc: Channel 1, guest application toolbox-dnd.
    Jun 22 16:12:02.442: vmx| ide1:0: Command READ(10) took 2.000 seconds (ok)
    Jun 22 16:12:20.540: vmx| GuestRpcSendTimedOut: message to toolbox-dnd timed out.
    Jun 22 16:12:35.540: vmx| GuestRpcSendTimedOut: message to toolbox-dnd timed out.
    Jun 22 16:12:35.540: vmx| GuestRpc: app toolbox-dnd's second ping timeout; assuming app is down

  • How to enable SSL optimization only for a single remote WAE and specific website?

    Hi guys.
    I have to enable SSL optimization for a specific HTTPS website only and for a specific remote site only (branch office).
    The scenario is as follows:
    Multiple sites connected via a MPLS cloud. Each site has its own WAE device (module or appliance).
    There is a central manager and core WAE in the main site (central site).
    There is a website accessed via HTTPS by all the remote sites. This specific website is hosted within the main site.
    For only a specific branch office (remote site) we want to enable SSL optimization for this specific website.
    I saw this great and useful doc, but I still have some concerns.
    https://supportforums.cisco.com/docs/DOC-16452
    Basically, from what I see, I should do the following if I want to enable SSL optimization for the entire environment:
    - export the certificate and keys;
    - enable secure store in the central manager;
    - In the remote and core WAE, Check "initialize CMS secure store" and "Open CMS Secure Store";
    - In the core WAE, import the CA certificate (upload PEM file);
    - In the core WAE, create the SSL Accelerated Service by:
        --importing the client certificate and the key;
        -- Match interesting traffic;
        -- Put the SSL Acc Service in service;
    - Finally, make sure SSL acceleration is enabled in both remote and core WAE.
    The concerns:
    I only need to enable SSL optimization for a specific location accessing a specific website.
    Should the steps above work fine if I enable the SSL service for this specific website in the core WAE and enable secure store only in a single remote site (branch office)?
    How will the other remote locations behave?
    Will they access the website normally, with no SSL optimization, even passing through the core WAE?
    What about the other SSL sites which have no certificate? They will be treated as normal HTTPS with no optimization, right?
    If the site uses proxy, will any flow be impacted?
    If the steps above do not fit my case, how can I configure SSL optimization for only one remote WAE?
    Thanks in advance.

  • Optimize Powershell script (reboot remote Server and Check Services)

    Hi there,
    For my next maintenance weekend I am trying to write a script that reboots all servers and, after the reboot, checks whether all automatic services are running.
    The script works fine for me so far, but it is not perfect. I would like to ask if anybody could help me optimize it.
    # First Part: Reboot Servers
    $date = Get-Date -Format dd-MM-yyyy
    # Get list of Servers
    $Servers = Get-Content "D:\Scripts\Reboot\servers.txt"
    # Reboot each server
    ForEach ($Server in $Servers) {
        "Computer $Server initiated reboot at $(Get-Date)" | Add-Content -Path D:\Logs\Reboot\Rebootlogs_$date.txt
        Restart-Computer $Server -Force -Wait
    }
    # Check each server
    ForEach ($Server in $Servers) {
        if (Test-Connection $Server -Quiet) { "Computer $Server verified to be responding to ping at $(Get-Date)" | Add-Content -Path D:\Logs\Reboot\Rebootlogs_$date.txt }
        else { "Computer $Server unresponsive to ping at $(Get-Date)" | Add-Content -Path D:\Logs\Reboot\Rebootlogs_$date.txt }
    }
    # Second Part: Check Services
    # Get list of Servers
    $Servers = Get-Content "D:\Scripts\Reboot\servers.txt"
    # Check Auto services on each server
    # (note: -ComputerName added so the remote server is queried, not the local machine)
    ForEach ($Server in $Servers) {
        Write-Output $Server | Out-File -FilePath "D:\Logs\Reboot\services_$date.txt" -Append
        # get Auto services that are not Running:
        Get-WmiObject Win32_Service -ComputerName $Server |
            Where-Object { $_.StartMode -eq 'Auto' -and $_.State -ne 'Running' } |
            # process them; in this example we just show them:
            Format-Table -AutoSize @(
                'Name'
                'DisplayName'
                @{ Expression = 'State'; Width = 9 }
                @{ Expression = 'StartMode'; Width = 9 }
                'StartName'
            ) | Out-File -FilePath "D:\Logs\Reboot\services_$date.txt" -Append
    }
    As you can see I do it in two parts; the perfect way might be one part, where I check one server and also write just one log with all the information.
    At the moment the log files look like this:
    Reboot:
    Computer server1 initiated reboot at 04/28/2015 15:14:51
    Computer server2 initiated reboot at 04/28/2015 15:16:40
    Computer server1 verified to be responding to ping at 04/28/2015 15:17:41
    Computer server2 verified to be responding to ping at 04/28/2015 15:17:44
    Service:
    Server1
    Name           DisplayName         State   StartMode StartName                
    RemoteRegistry Remote Registry     Stopped Auto      NT AUTHORITY\LocalService
    sppsvc         Software Protection Stopped Auto      NT AUTHORITY\NetworkSer...
    Server2
    Name           DisplayName         State   StartMode StartName                
    RemoteRegistry Remote Registry     Stopped Auto      NT AUTHORITY\LocalService
    sppsvc         Software Protection Stopped Auto      NT AUTHORITY\NetworkSer...
    Now my question is how to change my loop or code to get one log like this:
    Computer server1 initiated reboot at 04/28/2015 15:14:51
    Computer server1 verified to be responding to ping at 04/28/2015 15:17:41
    Name           DisplayName         State   StartMode StartName                
    RemoteRegistry Remote Registry     Stopped Auto      NT AUTHORITY\LocalService
    sppsvc         Software Protection Stopped Auto      NT AUTHORITY\NetworkSer...
    Computer server2 initiated reboot at 04/28/2015 15:16:40
    Computer server2 verified to be responding to ping at 04/28/2015 15:17:44
    Name           DisplayName         State   StartMode StartName                
    RemoteRegistry Remote Registry     Stopped Auto      NT AUTHORITY\LocalService
    sppsvc         Software Protection Stopped Auto      NT AUTHORITY\NetworkSer...
    Thanks for helping.

    You could probably have something nice using Function and CmdletBinding so you can get something like:
    function reboot-myservers { blablabla }
    function check-myservices { blablabla }
    $Servers = Get-Content "D:\Scripts\Reboot\servers.txt"
    $Servers | reboot-myservers | check-myservices
    Bruce Jourdain de Coutance - Consultant MVP Exchange http://blog.brucejdc.fr
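    The single-log restructuring the original poster asks for is really just a control-flow change: do all the work for one server before moving to the next, appending to a single log. A language-neutral sketch of that loop shape (Python, with stubbed-out helpers standing in for Restart-Computer, Test-Connection, and the WMI service query):

    ```python
    # Sketch of the one-pass-per-server structure. The reboot/ping/service
    # helpers are stubs; in the real script they would be the PowerShell
    # cmdlets from the thread above.

    def process_server(server, reboot, ping, stopped_auto_services, log):
        log.append(f"Computer {server} initiated reboot")
        reboot(server)
        if ping(server):
            log.append(f"Computer {server} verified to be responding to ping")
        else:
            log.append(f"Computer {server} unresponsive to ping")
        for svc in stopped_auto_services(server):
            log.append(f"{server}: {svc} is Auto but not Running")

    log = []
    for server in ["server1", "server2"]:
        process_server(server,
                       reboot=lambda s: None,                       # stub
                       ping=lambda s: True,                         # stub
                       stopped_auto_services=lambda s: ["sppsvc"],  # stub
                       log=log)

    assert log[0] == "Computer server1 initiated reboot"
    assert log[3] == "Computer server2 initiated reboot"
    ```

    Because everything for one server happens in a single iteration, the entries come out grouped per server in one log, which is exactly the output format the poster wants.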

  • Remote Control Optimization under Windows XP

    Hello -
    I am having problems getting remote control optimizations activated under Windows XP. RC works like a dream on our 98 boxen, but XP is slow as molasses.
    Zenworks 3.2 (non-SP1)
    Our server is Netware 5.1 sp6 (I think)
    Novell Client 4.83 (non-SP1)
    Dell Optiplex GX240
    Windows XP (non-SP1)
    16 mb ATI Rage Pro Ultra Video Card
    driver: ati2dvaa.dll
    version: 6.59
    This is the newest one on Dell's site, albeit dated August 2001.
    The thing that gets my goat is that the Zen 3.0 readme says the Installer can tell which cards are bad so it doesn't install the .dlls, but then doesn't list the bad cards. Also, all my machines have drishi.dll in the right spot. If anyone has any tips other than 'apply all patches' I would greatly appreciate it.
    -Dan Wheeler
    Holden RIII School District

    Rajkumar -
    The Optimization Status is reported as "Disabled".
    Video card info:
    >> 16 mb ATI Rage Pro Ultra Video Card
    >> driver: ati2dvaa.dll
    >> version: 6.59
    >> This is the newest one on Dell's site, albeit dated August 2001.
    Thanks.
    -Dan Wheeler
    Holden RIII School District
    On Wed, 19 Feb 2003 03:56:18 GMT, "Rajkumar V" <[email protected]> wrote:
    >Find out the Optimization Status if it is "Enabled" from Agent Icon on the
    >task tray->Information->General.
    >
    >What is the video driver installed on your XP workstation?
    >
    >Regards
    >Rajkumar
    >
    ><[email protected]> wrote in message
    >news:F1d2a.14$[email protected]..
    >> Hello -
    >>
    >> I am having problems getting remote control optimizations activated
    >> under Windows XP. RC works like a dream on our 98 boxen, but XP is
    >> slow as molasses.
    >>
    >> Zenworks 3.2 (non-SP1)
    >> Our server is Netware 5.1 sp6 (I think)
    >> Novell Client 4.83 (non-SP1)
    >> Dell Optiplex GX240
    >> Windows XP (non-SP1)
    >> 16 mb ATI Rage Pro Ultra Video Card
    >> driver: ati2dvaa.dll
    >> version: 6.59
    >> This is the newest one on Dell's site, albeit dated August 2001.
    >>
    >> The thing that gets my goat is that the Zen 3.0 readme says the
    >> Installer can tell which cards are bad so it doesn't install the
    >> .dlls, but then doesn't list the bad cards. Also, all my machines
    >> have drishi.dll in the right spot. If anyone has any tips other than
    >> 'apply all patches' I would greatly appreciate it.
    >>
    >> -Dan Wheeler
    >> Holden RIII School District
    >>
    >

  • 20 Index Restriction on Remote Tables (i.e. using Database Links)

    The Oracle Database Administrator's Guides for 10g and 11g document a performance restriction that "No more than 20 indexes are considered for a remote table." If I go back to the 8i documentation it says "In cost-based optimization, no more than 20 indexes per remote table are considered when generating query plans. The order of the indexes varies; if the 20-index limitation is exceeded, random variation in query plans may result."
    Does anyone have more details on this performance restriction? In particular I am trying to answer these questions:
    1) Are the 20 indexes which are considered by the CBO still random in 10g?
    2) Can I influence which indexes are considered with index hints or will my hints only be considered if they are for one of the "random" 20 indexes which are being considered by the CBO?
    3) Are there any other approaches or work-arounds to this restriction assuming you need to select from a large remote table with more than 20 indexes (and need to perform the selection using 1 of those indexes to get adequate performance) or do we need to abandon database links for this table?
    Thanks in advance for your input.

    So, here's my simple test.
    SQL> create table gurnish.indexes20plus ( n1 number, n2 number, n3 number, n4 number, n5 number, n6 number, n7 number,
      2  n8 number, n9 number, n10 number, n11 number, n12 number, n13 number, n14 number, n15 number, n16 number,
      3  n17 number, n18 number, n19 number, n20 number, n21 number, n22 number, n23 number, n24 number,
      4  n25 number, n26 number, n28 number);
    Table created.
    SQL> create index xin1 on indexes20plus (n1);
    Index created.
    SQL> create index xin2 on indexes20plus (n2);
    Index created.
    SQL> create index xin3 on indexes20plus (n3);
    Index created.
    SQL> create index xin4 on indexes20plus (n4);
    Index created.
    SQL> create index xin5 on indexes20plus (n5);
    Index created.
    SQL> create index xin6 on indexes20plus (n6);
    Index created.
    SQL> create index xin7 on indexes20plus (n7);
    Index created.
    SQL> create index xin8 on indexes20plus (n8);
    Index created.
    SQL> create index xin9 on indexes20plus (n9);
    Index created.
    SQL> create index xin10 on indexes20plus (n10);
    Index created.
    SQL> create index xin11 on indexes20plus (n11);
    Index created.
    SQL> create index xin12 on indexes20plus (n12);
    Index created.
    SQL> create index xin13 on indexes20plus (n13);
    Index created.
    SQL> create index xin14 on indexes20plus (n14);
    Index created.
    SQL> create index xin15 on indexes20plus (n15);
    Index created.
    SQL> create index xin16 on indexes20plus (n16);
    Index created.
    SQL> create index xin17 on indexes20plus (n17);
    Index created.
    SQL> create index xin18 on indexes20plus (n18);
    Index created.
    SQL> create index xin19 on indexes20plus (n19);
    Index created.
    SQL> create index xin20 on indexes20plus (n20);
    Index created.
    SQL> create index xin21 on indexes20plus (n21);
    Index created.
    declare
    i number;
    begin
    for i in 1..100
    loop
    dbms_random.seed(i+100);
    insert into indexes20plus values (dbms_random.value(1,5),dbms_random.value(1,21),dbms_random.RANDOM, dbms_random.RANDOM,dbms_random.value(1,20),
    dbms_random.value(1,4),dbms_random.value(1,6), dbms_random.value(1,7),dbms_random.value(1,9),dbms_random.value(1,10),
    dbms_random.value(1,11),dbms_random.value(1,12),dbms_random.value(1,13),dbms_random.value(1,14),dbms_random.value(1,1),
    dbms_random.value(1,1),dbms_random.value(1,19),dbms_random.value(1,122),dbms_random.value(1,20),dbms_random.value(1,20)
    ,dbms_random.value(4,20),dbms_random.value(1,20),dbms_random.value(1,20),dbms_random.value(1,20),dbms_random.value(1,20)
    ,dbms_random.value(4,20),dbms_random.value(4,20));
    end loop;
    commit;
    end;
    /
    SQL> set autotrace traceonly
    SQL> l
    1* select * from gurnish.indexes20plus@lvoprds where n1 = 4
    SQL> /
    no rows selected
    Execution Plan
    Plan hash value: 441368878
    | Id | Operation                    | Name          | Rows | Bytes | Cost (%CPU)| Time     | Inst  |
    |  0 | SELECT STATEMENT REMOTE      |               |    1 |   351 |     1   (0)| 00:00:01 |       |
    |  1 |  TABLE ACCESS BY INDEX ROWID | INDEXES20PLUS |    1 |   351 |     1   (0)| 00:00:01 | LVPRD |
    |* 2 |   INDEX RANGE SCAN           | XIN1          |    1 |       |     1   (0)| 00:00:01 | LVPRD |
    Predicate Information (identified by operation id):
    2 - access("A1"."N1"=4)
    Note
    - fully remote statement
    - dynamic sampling used for this statement
    Statistics
    0 recursive calls
    0 db block gets
    0 consistent gets
    0 physical reads
    0 redo size
    1897 bytes sent via SQL*Net to client
    481 bytes received via SQL*Net from client
    1 SQL*Net roundtrips to/from client
    0 sorts (memory)
    0 sorts (disk)
    0 rows processed
    SQL> select * from gurnish.indexes20plus@lvoprds where n21 = 4;
    no rows selected
    Execution Plan
    Plan hash value: 2929530649
    | Id | Operation                    | Name          | Rows | Bytes | Cost (%CPU)| Time     | Inst  |
    |  0 | SELECT STATEMENT REMOTE      |               |    1 |   351 |     1   (0)| 00:00:01 |       |
    |  1 |  TABLE ACCESS BY INDEX ROWID | INDEXES20PLUS |    1 |   351 |     1   (0)| 00:00:01 | LVPRD |
    |* 2 |   INDEX RANGE SCAN           | XIN21         |    1 |       |     1   (0)| 00:00:01 | LVPRD |
    Predicate Information (identified by operation id):
    2 - access("A1"."N21"=4)
    Note
    - fully remote statement
    - dynamic sampling used for this statement
    Statistics
    1 recursive calls
    0 db block gets
    0 consistent gets
    0 physical reads
    0 redo size
    1897 bytes sent via SQL*Net to client
    481 bytes received via SQL*Net from client
    1 SQL*Net roundtrips to/from client
    0 sorts (memory)
    0 sorts (disk)
    0 rows processed
    SQL>

  • Need to Optimize 3D performance on an Envy 17 3D? (...and other random bits)

    Hey there,
    The name's Darren. Nice to meet you. While I'm new to posting around these parts, I have been lurking for a little bit. Here's the deal: I work for another part of HP -- I run the blog, thenextbench.com. There, I'm working on various stories: How-tos, tweaks, tips and whatnot. What I'm wondering is if you guys would find it useful for me to post bits of some of my stories here. For example, I did one a while back about setting up games to work in 3D on an ENVY 17 3D....and getting better performance. 
    (The story originally ran here)
    You’ve bought an ENVY 17 3D. Awesome. You’re rocking it with 3D movies and I’m going to make the wild assumption that you’ve played some games since the Envy 17 3D got updated with that snazzy TriDef 3D ignition software. It’s actually dead-simple to get up-and-running with its 300-plus supported games…but what if there is no preset profile for that brand new game you just bought or that super-obscure title you downloaded from some cool, underground hipster indie gaming site? Well, I’ve been tinkering a little with this machine and wanted to walk you through the proper steps to get you situated. So strap those fancy goggles firmly to your noggin and read on, my friends.
    For the sake of this story, I’m going to walk you through how I got things set up, step-by-step. If any of this seems a little redundant, bear with me. Also, the fine folks at TriDef have been great to work with on this - and while I don’t have all the answers, feel free to hit the comment box below and I’ll do my best to get the straight scoop from them. Also, I’d highly recommend you check the DDD forums as well. It is a VERY handy resource for 3D gaming on the ENVY 17 3D.
    STEP 1: The initial setup
    The first time you run the TriDef 3D Ignition software, hit the “Scan” button. It checks directories for known EXE files and instantly populates them on the game launch list. If you installed a popular game directly from a disc, it usually doesn’t have a problem. But if you’re like me, you download your games from digital download services like Steam. (What can I say? I lose discs all the time.) That’s when it gets a little trickier.  The game is afoot!
    STEP 2: Manually adding a game
    Click the “Add” button and it calls up a window. The first thing to look at is the drop down menu. It contains a current list of all the games automatically supported. Your game not there? Don’t sweat it yet. There’s a link in the window to the TriDef forums – there is an active community of users always creating new game profiles for you to download. Still nothing? There is still hope. Select the “Generic” profile for now. We’ll get back to that in Step 3.
    In the same window, you’re going to see a prompt to find the game location. You can either click a shortcut to the game or find the actual EXE file yourself. After that, make sure you create a name for the profile and save it.
    STEP 2a: Adding a Steam game
    I figured that it’d be a piece of cake. And it was at first. I downloaded Borderlands through Steam, and when I created a profile pointing to the game file in the Steam directory, everything was groovy.
    (PROTIP: The TriDef software can work with game shortcuts, but Steam holds its game files in the “\Steam\steamapps\common” directory).
    Many other Steam-downloaded games started giving me this oddball warning: “This game doesn’t support DirectX 9, 10 or 11.” These were new games – OF COURSE they supported the latest DX files. So I did a little digging and there is an extra step required to make some Steam games work.
         1.       Click the “Add” button in the TriDef menu
         2.       In the “Executable” field, point to the “steam.exe” file in the main Steam directory.
         3.       Find a shortcut for the game you want to download. (If you don’t have one, open up the Steam client, right-click on the game and select “create shortcut on desktop.”)
         4.       Right-click on the shortcut for the game. At the end of the link location it’ll have a number. Copy that number
         5.       Within the TriDef’s Add window, enter “-applaunch [NUMBER]” in the field where it says “Command Line Arguments (optional)”
         6.       Look for the game’s profile as described above in Step 2.
         7.       Save your progress.
    STEP 3: Optimizing your 3D performance
    Once you’ve cleared those first couple steps, it’s actually not that bad from here. You just want to optimize the experience so that you can get good 3D effects and keep the game playable. What you have to remember is that in order to render a 3D image, the Envy is effectively doubling what’s happening on-screen. My gut reaction with any game is to run it at the laptop’s native screen resolution (1920 by 1080). It looks pretty and can handle running those games in 2D just fine. Bring 3D into the equation and your frame rate will drop. But with a couple tweaks, I’ll get you back up to speed.
         1.       First, start up a test game and just sit around in the game environment, not the game menus.
         2.       Next, on your computer’s number pad, hit the “0” key to call up the 3D overlay menu. Use the 8 and 2 to navigate up and down and the 6 key to make selections.
         3.       Push 2 until the “Performance” option is highlighted and hit the 6 key. There you should see the frame rate displayed (it’s labeled “FPS”). If the FPS number is above 30, you should be fine. That, of course, can change if there’s a lot of action happening on screen. In short, the higher the frame rate, the better.
                  a.       If your frame rate is below 30, consider lowering the game’s resolution or move the cursor in the 3D overlay menu and lower the game’s 3D effects settings. Just highlight “Quality” and push the 6 to toggle the 3D effects between High, Medium and Low.
         4.       When you find the performance settings you like, hit Alt-Shift-S to save them. The next time you fire up that game, it’ll remember what you set.
    STEP 4: Tweaking your 3D experience.
    All right, so you’ve got the game running great, the 3D effects are there, but maybe you still want to adjust the settings a little further. For instance, the 3D effect is a little more jarring in real-time strategy games like StarCraft II and MMO games because you have menus and cursors floating over the world out of perspective with the rest of the 3D depth.  (Try selecting a target far downfield in an MMO and you’ll know what I’m talking about). There are all sorts of settings here that you can adjust. Experiment by adjusting the numbers for the “Depth” and “Focus” under the 3D menu. Under the “Options” and “Window and Cursor” sections, there are plenty of other toggles to switch on and off to your liking.
    It goes without saying: hit Alt-Shift-S when you’re done, and the Ignition software will remember all your preferences.
    What About….?
    Just so you know, this story is an on-going work-in-progress that I plan to update as I learn more. Here are a couple things that I’m currently looking into with the Envy 17 3D:
    [This Game] Doesn’t Work at All / Is Glitchy in 3D. Yeah, I run into that problem as well every so often. DC Universe Online looks broken with tearing images when the 3D goggles are on. (Looks great in 2D, though). Other games, like Telltale Games’ new Back to the Future titles look five kinds of crazy. Those might be more specific fixes that require a deeper dive later on.
    What about Flash-based games? My gut reaction is that the technology requires DirectX 9, 10 or 11 to work so this one might not be in the cards.
    What about older games optimized for Windows 7? There are plenty of old-skool classics I’d love to try in 3D, but they were all created in a pre-DirectX 9 world. That’s not stopping me from looking around for solutions, but no word yet.
    =-=-=-=-=-
    So....was this even remotely helpful? Would you want to see more stuff like this? Or bits from stories I've written posted here? Heck, if there were topics you wanted tackled in story-form, I'm all ears for that as well. 
    Thanks in advance for any feedback!
    GizmoGladstone
    Blogger-in-Chief, HP's thenextbench.com
    thenextbench.com
    While I professionally blog for HP about the latest laptops and desktops, these words are all mine.
    My job: Come up with unusual angles for talking about HP gear, dissecting how stuff works and provide tips on getting better performance with your tech.

    Hi @fjward ,
    Thank you for visiting the HP Support Forums and Welcome. I have looked into your issue with the HP ENVY 17-3090nr 3D Edition Notebook PC, the brightness control, and the Catalyst Control Center. I would uninstall any graphics drivers that are listed along with the CCC software, restart the computer, and then reinstall only the AMD package, which includes the AMD graphics driver and the Catalyst Control Center. Restart the computer again afterward.
    Here is a link to the HP Support Assistant if you need it. Just download and run the application and it will help with the software and drivers on your system.
    You can also do a System Restore, which helps if something updated automatically and did not go well on the notebook.
    When performing a System Restore, remove any and all USB devices and disconnect all non-essential devices, as they can cause issues.
    Please let me know how this goes.
    Thanks.
    Please click “Accept as Solution ” if you feel my post solved your issue, it will help others find the solution.
    Click the “Kudos, Thumbs Up" on the bottom to say “Thanks” for helping!

  • The type or namespace name 'Optimization' does not exist in the namespace 'System.Web'

    App_Start\BundleConfig.cs (1): The type or namespace name 'Optimization' does not exist in the namespace 'System.Web' (are you missing an assembly reference?)
     Global.asax.cs (4): The type or namespace name 'Optimization' does not exist in the namespace 'System.Web' (are you missing an assembly reference?)
     App_Start\BundleConfig.cs (8): The type or namespace name 'BundleCollection' could not be found (are you missing a using directive or an assembly reference?)
    I'm getting the above errors when attempting to create a remote build.
    I've tried the solution found here (http://blog.davidebbo.com/2014/01/the-right-way-to-restore-nuget-packages.html) but no luck.

    Hi,
    I have an ASP.NET MVC project in Visual Studio 2013. I'm hosting it on Visual Studio Online. 
    The project builds fine on my local machine in Visual Studio 2013. When I try to do a build on Visual Studio Online (I created a build definition there), I get similar errors. 
     App_Start\BundleConfig.cs (2): The type or namespace name 'Optimization' does not exist in the namespace 'System.Web' (are you missing an assembly reference?)
     App_Start\FilterConfig.cs (2): The type or namespace name 'Mvc' does not exist in the namespace 'System.Web' (are you missing an assembly reference?)
     App_Start\RouteConfig.cs (5): The type or namespace name 'Mvc' does not exist in the namespace 'System.Web' (are you missing an assembly reference?)
     Controllers\HomeController.cs (5): The type or namespace name 'Mvc' does not exist in the namespace 'System.Web' (are you missing an assembly reference?)
     Controllers\HomeController.cs (9): The type or namespace name 'Controller' could not be found (are you missing a using directive or an assembly reference?)
     Global.asax.cs (5): The type or namespace name 'Mvc' does not exist in the namespace 'System.Web' (are you missing an assembly reference?)
     Global.asax.cs (6): The type or namespace name 'Optimization' does not exist in the namespace 'System.Web' (are you missing an assembly reference?)
     App_Start\BundleConfig.cs (9): The type or namespace name 'BundleCollection' could not be found (are you missing a using directive or an assembly reference?)
     App_Start\FilterConfig.cs (8): The type or namespace name 'GlobalFilterCollection' could not be found (are you missing a using directive or an assembly reference?)
     Controllers\HomeController.cs (11): The type or namespace name 'ActionResult' could not be found (are you missing a using directive or an assembly reference?)
     Controllers\HomeController.cs (16): The type or namespace name 'ActionResult' could not be found (are you missing a using directive or an assembly reference?)
     Controllers\HomeController.cs (23): The type or namespace name 'ActionResult' could not be found (are you missing a using directive or an assembly reference?)
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "System.Web.Helpers, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35,
    processorArchitecture=MSIL". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "System.Web.Optimization". Check to make sure the assembly exists on disk. If this
    reference is required by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "System.Web.Razor, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL".
    Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "System.Web.WebPages, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35,
    processorArchitecture=MSIL". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "System.Web.WebPages.Deployment, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35,
    processorArchitecture=MSIL". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "System.Web.WebPages.Razor, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35,
    processorArchitecture=MSIL". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "WebGrease". Check to make sure the assembly exists on disk. If this reference is required
    by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "Antlr3.Runtime". Check to make sure the assembly exists on disk. If this reference
    is required by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Could not resolve this reference. Could not locate the assembly "Newtonsoft.Json". Check to make sure the assembly exists on disk. If this reference
    is required by your code, you may get compilation errors.
     C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (1697): Assembly strong name "System.Web.Mvc, Version=__MvcPagesVersion__, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL" is
    either a path which could not be found or it is a full assembly name which is badly formed. If it is a full assembly name it may contain characters that need to be escaped with backslash(\). Those characters are Equals(=), Comma(,), Quote("), Apostrophe('),
    Backslash(\).
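Errors like the two sets above usually mean the NuGet packages folder is simply missing on the build machine. If the projects were set up with the old MSBuild-integrated restore (the approach the linked post recommends migrating away from), the .csproj files contain elements like the following sketch. This is illustrative of what "Enable NuGet Package Restore" typically generates, not the exact contents of these projects:

```xml
<!-- Illustrative .csproj fragments added by the old "Enable NuGet Package Restore";
     remove these when migrating to automatic package restore. -->
<PropertyGroup>
  <RestorePackages>true</RestorePackages>
</PropertyGroup>

<Import Project="$(SolutionDir)\.nuget\NuGet.targets"
        Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />

<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
  <!-- Emits a build error when NuGet.targets is missing; also safe to remove. -->
</Target>
```

The usual fix is to remove those elements, delete the .nuget\NuGet.targets and NuGet.exe files, commit packages.config, and let the build server restore packages before compiling (for example with a `nuget restore` step). Separately, the `Version=__MvcPagesVersion__` in the last error is an unsubstituted template token and is worth correcting in the project file as well.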

  • How to - correct episode order within the Remote App

    Recently I found that when I was using the Remote app to control my iTunes library, all my TV show episodes were in alphabetical order instead of episode-number order.
    After searching for how to solve this and speaking with Apple Support (who couldn't figure out why this was happening), I couldn't find the answer, and I noticed that many people were having this same issue and were in the same boat as me.
    I managed to solve the issue and wanted to share. Sorry if I have posted this in the wrong area; it's my first time giving advice on here.
    I'll run through this using The Walking Dead - Season 1 as an example, but this can obviously be done with any TV show.
    Even with the correct episode numbers, iTunes displays the episodes in the correct order, but they appear in alphabetical order within your library on the Remote app.
         1.       in iTunes, highlight all episodes and Get Info
         2. under the Options Tab, change the Media Kind to Music Video and click OK at the bottom
         3. Then head over to where you Music Videos are stored
         4. select each of the episodes, get info
         5. under the Details Tab, you will now have more options to select
         6. scroll to the bottom of the Details Tab, and within the Album Artist section, type in "The Walking Dead"
         7.       in the Disc Number section, you want to add which season it is of how many seasons there are. example - I have 4 seasons of The Walking Dead within my library, and as this is season 1, I will enter 1 of 4
         8.       next, in the Track section, you want to add which episode it is of how many episodes in that season. example - this is episode 1 of 6 in the season
         9. next go into the Options Tab, and change the Media Kind back to TV Show
         10. click OK at the bottom
         11. once you have done this for each episode, head back into your TV Shows
         12. when you now get info for each of the episodes, the Album Artist, Disc Number and Track sections should be at the bottom of the Details Tab (if they were not there before)
         13. if this has been done correctly - all episodes should be correctly listed within the remote app by episode order
    Hopefully this has helped somebody out there!

    Gotcha - came up with a workaround... since the photos were living in the Photos Library package itself, and not simply in another regular folder, I right-clicked on the library, chose Show Package Contents, searched for the folders the files lived in, and then copied the files out of the package into a regular folder. Then I was able to give the Photos library access to the new folder with the files.
    With all my files now referenced, I enabled iCloud Photo Library and am uploading now.
    Hopefully my new local Photo Library will be smart enough to optimize local images and shrink to give me more space.
    Thanks.

  • Remote Invocable object is not constructed/initialized correctly

    Hi, I got a strange problem.
    Basically, a node assigns a range of ids (for example 1-1000) to all nodes, including itself, via the InvocationService. Upon receipt of the Invocable, none of the remote nodes have the Invocable's information initialized correctly; ONLY the node that originally starts the InvocationService call gets the Invocable initialized correctly. In my program any node can start the InvocationService, and I tried all the different nodes and they all have the same problem.
    If I fake the code a little bit so that the InvocationService call (query()) always sends to itself, then all Invocable objects are initialized correctly when run() starts.
    Can someone help provide some insight?
    The following is some of code related to the problem.
    The node calls InvocationService.query():
    SearchInvocable oneRange = new SearchInvocable(CacheHelper.getLocalMember());
    for (int j = 0; j < members.length; j++) {
        oneRange.setMin(nextMinKey);
        nextMaxKey = ((nextMinKey + rangeSize) < maxKey) ? (nextMinKey + rangeSize) : maxKey;
        oneRange.setMax(nextMaxKey);
        // Inform the member of the assigned key range via the invocation service
        // CHANGING TO targets.add(members[0]); WORKS FINE! SO STRANGE
        targets.add(members[j]);
        System.out.println("Range[" + j + "]: " + oneRange);
        invokeService.query(oneRange, targets);
        targets.clear();
        nextMinKey = ((nextMaxKey + 1) < maxKey) ? (nextMaxKey + 1) : maxKey;
    }
    The Invocable object:
    public class SearchInvocable extends KeyRange implements Invocable {
        SearchInvocable(Member sourceMember) {
            super();
            this.sourceMember = sourceMember;
        }

        public void run() {
            System.out.println("SearchInvocable::run(): " + this);
        }
    }
    Here is the output when a remote node prints out the Invocable (the Invocable is not initialized correctly):
    SearchInvocable::run(): type:null min:0 max:0
    Here is the output when the local node prints out the Invocable:
    SearchInvocable::run(): type:Accountl min:10001 max:99996
    Thanks,
    Jasper

    Hi Robert,
    That's it - KeyRange does not implement Serializable. I missed it. I thought the local node went through the same serialization as the remote nodes do, so I overlooked the problem. This shows Coherence optimizes the local-node call by skipping serialization altogether. Excellent!
    Regards,
    Jasper
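Robert's diagnosis can be reproduced in plain Java, independent of Coherence: when a Serializable subclass extends a non-Serializable superclass, Java serialization skips the inherited fields and reinitializes them by running the superclass's no-arg constructor on the receiving side - which is exactly why the remote nodes printed min:0 max:0 while the local (unserialized) call looked fine. A minimal sketch; the class and field names here are simplified stand-ins, not the actual Coherence classes:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Stand-in for the poster's KeyRange: note it does NOT implement Serializable.
class KeyRange {
    long min;
    long max;
    KeyRange() { }  // invoked during deserialization because the class is not Serializable
}

// Stand-in for SearchInvocable: Serializable itself, but its inherited fields are not.
class SearchInvocable extends KeyRange implements Serializable {
}

public class SerializationDemo {
    // Serialize and deserialize the object, as the network hop to a remote node would.
    static SearchInvocable roundTrip(SearchInvocable in) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            new ObjectOutputStream(bos).writeObject(in);
            ObjectInputStream ois = new ObjectInputStream(
                    new ByteArrayInputStream(bos.toByteArray()));
            return (SearchInvocable) ois.readObject();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        SearchInvocable range = new SearchInvocable();
        range.min = 10001;
        range.max = 99996;
        SearchInvocable received = roundTrip(range);
        // The inherited fields are lost: prints min:0 max:0, matching the remote-node output.
        System.out.println("min:" + received.min + " max:" + received.max);
    }
}
```

Making KeyRange itself Serializable (or having the hierarchy implement Coherence's ExternalizableLite/PortableObject) lets the fields survive the trip, which is why the fix worked.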

  • A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections.

    A network-related or instance-specific error occurred while establishing a connection to SQL Server. The
    server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)
    How can I solve this?

    1. Make sure SQL Server Service is running
    2. If a named instance, make sure SQL Server browser service is running
    3. Make sure SQL Server is configured to allow remote connections
    4. Examine the SQL Server error log for messages confirming that SQL is listening on the expected network interfaces and ports
    5. Test server connectivity with PING from the client machine
    6. Test port connectivity using TELNET or PowerShell to the server and port (from step 4) from the client machine.  For example
    a. TELNET <server-name> 1433
    b. PowerShell: 1433 | % { echo ((new-object Net.Sockets.TcpClient).Connect("YourServerName",$_)) "server listening on TCP port $_" }
    7. Check firewall settings if step 5 or 6 connectivity test fails
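The port test in step 6 can also be done with a tiny Java program when TELNET isn't installed; a minimal sketch (the host name and the 1433 default port are placeholders for your actual server):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    public static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // Refused, timed out, or unresolvable host - all count as unreachable.
            return false;
        }
    }

    public static void main(String[] args) {
        String host = args.length > 0 ? args[0] : "YourServerName";
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 1433; // default SQL Server port
        System.out.println(host + ":" + port + " reachable: " + canConnect(host, port, 3000));
    }
}
```

If this prints `reachable: false` but PING works, the problem is almost always the port (firewall or SQL Server not listening on it) rather than basic network connectivity.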
    Best Regards, Uri Dimant SQL Server MVP,
    http://sqlblog.com/blogs/uri_dimant/
    MS SQL optimization: MS SQL Development and Optimization
    MS SQL Consulting:
    Large scale of database and data cleansing
    Remote DBA Services:
    Improves MS SQL Database Performance
    SQL Server Integration Services:
    Business Intelligence

  • Can't upload "put" files to remote server unless they are in a folder. Please help.

    I have done many searches on this and found no solutions. Here is the problem:
    When I upload a file from my local site to the remote server it works fine as long as that file is contained in a folder.
    Example:
    I can upload an image from my images folder, or a template page from my template folder with no problem.
    BUT, I can not upload any web/html pages, my CSS page, site map, or any file for that matter that is simply in the root directory and not in its own folder.
    When I try to do that I get this error:
    index.html - error occurred - An FTP error occurred - cannot put index.html.  Access denied.  The file may not exist, or there could be a permission problem.   Make sure you have proper authorization on the server and the server is properly configured.
    File activity incomplete. 1 file(s) or folder(s) were not completed.
    Files with errors: 1
    index.html
    So every time I make a change to my index page I can't upload it to the server without getting that error, unless I use a third-party program like FileZilla - which works fine. It is only Dreamweaver doing this.
    I have been using Dreamweaver for a long time and had no problems with this site. This all started happening out of nowhere, and it is happening on both my work and home computers.
    I made no changes to my server preferences or anything like that.
    I was just doing my thing - being happy - and then BAM, not happy any more....

    Hi there. First off thank you for your interest and your help.
    I have done what you suggest probably 30 times, as I obsessed over this crazy problem for a couple straight weeks before I finally gave up on CS5.5.
    I'm not sure what you mean by REMOTE server. The only server settings I am aware of are the ones in Dreamweaver. Here they are:
    5.0 Settings:
    Server Name = the same values on each
    Connect Using = FTP
    FTP Address = same numeric value on 5.0 & 5.5 port 21
    User name = same on both
    Password = same on both
    Root directory = is blank (I have tried it using / with no luck as well)
    web URL = http://ftp.westbrookadvertising.com/www/htdocs/
    Passive FTP is checked & Use FTP Performance optimization is checked.
    5.5 Settings:
    Server Name = the same values on each
    Connect Using = FTP
    FTP Address = same numeric value on 5.0 & 5.5 port 21
    User name = same on both
    Password = same on both
    Root directory = is blank (I have tried it using / with no luck as well)
    web URL = http://ftp.westbrookadvertising.com/www/htdocs/
    Passive FTP is checked & Use FTP Performance optimization is checked.

  • Local vs. Remote Response Why such a diff?

    I'm trying to find out why there is such a difference
    in query finish times and how to improve the time.
    ftp is fast between the machines.
    The network folks say everything is ok there, though
    the sniffer sees many Parses.
    Changing the arraysize in SQL*Plus makes no real diff.
    When running a query to a spool file on the local windows 2000 machine, the time to finish the job is approx 1 min.
    When the same query to a spool file is run connecting from a remote machine (Sun Solaris), the time is approx 5 min.
    Both are run in SQL*Plus, 8.1.7
    both are selecting * from a table with 95,000 rows.
    doing a full scan as expected
    Trace shows the fetch line on each machine (local vs. fetch from unix):
                 local    from unix
    count        954      954
    cpu          0.08     2.85
    elapsed      2.55     3.76
    disk         1069     1074

    Martin,
    I can only guess at the response time issue without seeing the underlying sql. But it may have to do with the difference between local (memory) speed and network (remote) speed. Remember the local query only has to run against the local machine, whereas the remote one has the overhead of the tcp/ip stack and a limited packet size of 1056 bytes. If you are retrieving a large amount of data, it must be broken up for transmission and then reassembled on your end. Also, are you table scanning on both machines? Do you have an explain plan for both of them? It may be that the local machine has a better optimizer plan than the remote machine. Check out the indexes and which optimizer is being used on both machines. Finally, it may just be how the Oracle servers are configured. So, you now have a few ideas to start checking.
    Jim

  • Trying to optimize eSATA and internal disk configurations

    I'm trying to optimize the HD setup on my dual 2.5GHz G5 with 4.5GB RAM.
    Major considerations:
    - massive itunes library (260GB), and big iphoto lib (25GB) as well
    - lots of video editing in Final Cut with large capture files and many video exports
    - regular podcasting and other media creation with all my music and photos
    - need for regular COMPLETE backups
    - speed
    Here's the current setup. I have six disks as part of the system
    1. internal 160GB disk (maps simply to a MACHD volume)
    2. internal 250GB disk (maps simply to a COMMONS volume for Democracy player files, torrent downloads etc)
    then on my 4-port eSATA controller card
    4x 500GB SATA drives from Western Digital for a total of 2TB eSATA disk space
    they are in 2 eSATA enclosures from FirmTek
    I'm managing the disks with SoftRAID
    Before I get into the problem, how would YOU use this incredible amount of disk space, considering the goals I have? (video, media storage, backup).
    Now, The problem
    I've been disappointed with the speed of my system and suspect it's my HD configuration. I have enough RAM, right!?
    I've got some raid stripes going on
    2 of the 500GB disks (disk2 and disk3) support two "active" volumes
    a) a striped ATLAS volume of 800GB (holds itunes, documents, iphoto, basically all media files)
    b) a striped VIDEO SCRATCH volume of 200GB (for working files in FCP, imovie, etc)
    the other 2 of the 500GB disks (disk0 and disk1) support two "clone" volumes
    a) a mirror MACHDCLONE volume on both disk0 and disk1 (to protect the system drive. I run Super Duper 3x per week)
    b) a striped ATLAS_CLONE volume to backup the active ATLAS volume
    the COMMONS volume is not backed up in any way. I figure I can live without my Democracy files and torrents, etc.
    My ideas:
    based on my performance observations, my setup above is just wrong, and I don't know where to turn for the best advice. Google is very poor at dealing with such complexity in search results. There are some video advice sites, but they only cover part of my problems. I have a few theories of how I should be using these drives
    1. use the eSATA drives strictly for performance benefits, not for backup. consider a USB2.0 drive for backups and use Mozy for offsite backup
    2. simplify the disk allocation. No single disk should support more than one volume
    3. the video scratch SHOULD be striped in order to benefit from speed. and should be on its own physical disk(s) separate from ANY other function
    So I'm thinking
    a) stripe two of the eSATAs into a single 1 TB array for my media or ATLAS volume
    - this solves me running out of space on the volume (getting closer with the iTunes video downloads every day)
    - it's also just physically easier to deal with. I can SEE what drives make up ATLAS alone
    - will be easier for me to eventually replace the G5 with an MBP running its own eSATA pc card with easy access to the same ATLAS volume
    this still leaves two 500GB eSATA disks around
    b1) I could extend the ATLAS volume to an array including a 3rd eSATA disk for a 1.5TB volume. this would allow me to bring COMMONS files onto ATLAS
    b2) the remaining 500GB eSATA disk can be video scratch
    OR
    c1) dedicate 1 500GB disk to VIDEO SCRATCH
    and
    c2) partition the other 500GB disk as a clone of both COMMONS and the internal System drive
    see how CONFUSING THIS IS ?
    there are too many permutations of things.
    I know I like keeping the system drive simple and internal. Ideally, the second internal disk would mirror this volume, but they do not match in size or brand
    part of me wants to stripe all FOUR eSATA drives into a blazing 2TB masterpiece, but it seems like a bad idea to put VIDEO SCRATCH on the same array as ATLAS
    Other questions:
    should the itunes library get its own disk altogether? is striping of benefit here?
    are there some sites that explain HD management well?
    PowerMac G5 2.5GHz 4.5GB RAM   Mac OS X (10.4.9)   also own a blacbook

    Thanks so much for that awesome feedback.
    A few points.
    I have the dual processor G5, not the quad core. Purchased in Jan 2005.
    My RAM pageouts are fine (didn't know what that was until you mentioned it)
    Love the idea of moving COMMONS "outside the box"
    I used to have my system volume boot from an external RAID, but didn't notice a big improvement, and it meant my G5 would ONLY boot if the eSATAs were powered up. I just didn't like that feel. I want the tower to work in a self-contained fashion, even if I don't have access to all media. I want access to the OS and apps.
    I'm unlikely to buy more SATA controllers and enclosures or too many new disks. I'm on a serious budget and want to work with as much of what I have as possible. That said, I just checked out the Drobo and am drooling. I'll wait to see how well it performs for data access (and video) and not just storage.
    It sounds dreamy to stripe all four of the eSATAs into a 1.8TB storage megaplex. I imagine they would scream in an ATLAS_BADASS volume, but then I've got nothing left for VIDEO SCRATCH used to capture and render.
    The VIDEO_SCRATCH doesn't need to be large, and I think that's where I'm having a conflict. My eSATA drives are way too big to use even ONE as a video scratch, much less striping two of those bad boys just for that purpose
    Purchasing a 10K drive for video scratch (or system volume) is not really in the cards yet.
    So here's where I sit now:
    1. My Media Storage
    ++ the 4 eSATA drives (2.0TB raw)
    I go with the badass steroid injected ATLAS volume striped across all four.
    this is my media array and holds all the contents of ATLAS and COMMONS (iTunes, iPhoto, Documents, FCP training videos, ripped DVDs, the works)
    2. My System Volume
    ++ the 250GB internal SATA
    move COMMONS out
    migrate system volume to this disk
    better storage-to-free space ratio
    3. My Scratch Disk
    a) use the now-spare 160GB internal Maxtor (probably weak and slow)
    b) get an external FW800 RAID disk from OWC
    http://eshop.macsales.com/shop/firewire/hard-drives/EliteAL/StripedRAID
    I'd go with the 160GB or 320GB
    4. Backup Plan - level 1 - local
    ++ use my spare external Maxtor 250GB FW drive
    clone the system volume regularly
    ++ get an external FW drive (like the 1TB My Book Premium II from WD)
    clone ATLAS_BADASS regularly
    the WD is just $400
    i know its capacity is lower than my super striped RAID, but i don't know of any cheap way to clone ATLAS_BADASS to a 1.5TB drive
    5. Backup Plan - level 2 - remote
    pay for a Mozy storage account which has unlimited capacity
    upload system and ATLAS_BADASS every few weeks
    any new thoughts? and thanks again!
