Do we need an Xserve? Advice sought ...

Hi, I have a small design studio and we're in the process of upgrading our gear! There are only three of us, and we'll probably end up with two Mac Pros and an iMac for the junior. We've got gigabit Ethernet.
*Background*
The majority of our work is publishing, and we work with the Adobe Creative Suite. Recently we've changed the way we work: rather than working separately on projects, we now have multiple people working on a single project at different times. So far we just open the files across the network, which is fine until the machine holding them goes down (which we've had happen).
*The question*
I've never worked from a server before, and the decision I think we need to make is whether to go with an Xserve or just a NAS. I'd love to go the Xserve route, but I'm not sure whether it's overkill. The files we work on are routinely in the 30MB-80MB range. We need to store around 150GB of current working files at any one time, and we also need to hold our clients' photo libraries in a central place (about 500GB).
We also have a number of small utilities and applications that we rarely (but all) use. Can I install a central copy of an application on the server so that I only have to purchase one copy and we can each use it? (Not all at once!) I don't need to do this for our day-to-day software, just the things that rarely get used.
Your advice and recommendations would be appreciated!
Thanks
Simon
D10 Creative
(Brisbane. Australia)

Hi D10 Creative-
Greetings and welcome to the Apple boards.
How fortunate for you that you are able to upgrade your gear. Depending on your ultimate needs, you may be able to use one of the retired machines for server duty. A PowerMac G5 filled with large hard drives and a few external FireWire drives fuels more than a few creative studios. You could also fill a Mac Pro with drives configured as a RAID, turn on file sharing and go.
If all you will be doing for now is sharing large files among fewer than ten users as part of your workflow, the file-sharing capabilities built into OS X may serve your needs well.
Check with each individual software publisher to see whether their particular product can be used that way. Some products may require different licensing for shared use. Best to be careful here and not get yourself in trouble over something stupid.
A big issue is that you will be pushing around many large files, and every extra second those files take to get from point A to point B is time and money lost. Make sure that your internal wiring and networking gear are all capable of supporting gigabit Ethernet traffic.
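As a rough back-of-the-envelope sketch of what that waiting costs (the link speeds and the ~60% real-world efficiency factor below are assumptions, not measurements):

```python
# Rough transfer-time estimate for the file sizes mentioned in the question.
# The 0.6 efficiency factor is an assumed allowance for protocol overhead.
def transfer_seconds(file_mb, link_mbit, efficiency=0.6):
    """Seconds to move file_mb megabytes over a link_mbit megabit/s link."""
    return (file_mb * 8) / (link_mbit * efficiency)

# An 80 MB layout file over 100 Mbit vs. gigabit Ethernet:
print(round(transfer_seconds(80, 100), 1))   # 10.7 seconds on 100 Mbit
print(round(transfer_seconds(80, 1000), 1))  # 1.1 seconds on gigabit
```

Multiply that difference by every open and save, every day, and the faster network pays for itself.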
No matter what machine functions as your server or NAS, make sure to have it and any necessary external components (like hard drives) on a robust UPS, and keep three rotating backups with one always stored securely off-site. It sounds like you will be working with some fairly large files, so storage of said files will always be an issue.
Luck-
-DaddyPaycheck

Similar Messages

  • Hi, I need help and advice. Basically me and my ex-partner both had iPhones and synced them with the same computer under the same ID. We split; I have a new laptop and now it keeps asking for the old ID or it'll erase the apps bought on the old account.

    Hi, I need help and advice. Basically me and my ex-partner both had iPhones and synced them with the same computer under the same ID. We split up and now I'm trying to get all my apps and info onto my new laptop with a new account, but it keeps asking me for the old Apple ID, which she is still using, and she changed the password. I tried backing it up but still nothing. When I try to back up purchased items (apps etc.) it keeps asking for the old one. Help!

    See Recover your iTunes library from your iPod or iOS device. But you'll still need the password.
    Once you have the computer authorized to use the account, she could change the password again to stop you buying apps on her card (assuming it's not on yours!). It would lock you out of upgrading them too, but they should work unless she uses the Deauthorize All feature.
    It depends on how amicable the split is...
    tt2

  • Need help/advice and guidance on whether to switch my career to SAP-PM from core industries after spending 09 years in the core field, specifically in Maintenance.

    Hello Adviser /expert,
    Need help/advice and guidance on whether to switch my career to SAP-PM from core industries after spending 09 years in the core field, specifically in Maintenance.
    I am now thinking of doing a SAP-PM certification course with an authorized SAP partner in India, specifically in Pune.
    Can anyone suggest an authorized SAP partner in Pune?
    My introduction (about myself): I have a Diploma in Mechanical Engineering and a total of 09 years of experience in the Mechanical Maintenance field, across different manufacturing companies such as steel, auto ancillary and cotton plants.
    Is it the right decision to change my career from the core sector to SAP-PM?
    Is there good scope in SAP-PM for me after 09 years in the core Maintenance field?
    Please guide me.
    Warm regards,
    Ravin

    Ravindra,
    SAP PM is a very niche module and very much in demand; at the same time, being niche, getting into it and getting your first implementation is also difficult.
    Your decision to join SAP authorized training is the best option. As a certified consultant you have a better chance of getting a break as a fresher, and you can continue working in the field; otherwise it would be a waste of your intellectual energy.
    Just search sap.com for training, or email or chat with them, and they will give you the training centre details.
    Very few training classes are available, though. Keep trying and you will get lucky soon.

  • Need an expert advice in Implementing continuous delivery across environments

    Hi All,
    I need an expert advice/solution in implementing continuous delivery in our project. (PF Screenshot)
    Deployment goes like this.. 
    Dev checks in -> TFS builds -> RM invokes -> using PS/DSC -> copies the binaries to the DIT environment's IIS virtual dir -> manual replacement of Web.config files (since we have many environments) -> app runs
    PS/DSC does following things : Copies binaries using X copy , Uploads SP excel reports , Executing Dacpac scripts etc ..
    The challenge faced earlier with the Web.config files was that for every DIT environment many configuration settings and connection strings would change. To overcome this, I implemented Web.config transformations (like Web.DIT1.config ... Web.DIT6.config in my Solution Explorer).
    I added a custom MSBuild script to my .csproj file, and before triggering the build definition I'd change Process tab -> Basic -> Configuration = DIT1 (based on the requirement) and trigger it.
    It deploys perfectly to DIT1 without manually replacing Web.config files.
    1) Now the blocker for me: can we achieve this without Web.config transformations? Since I don't want to expose sensitive data in config files, is there any way to implement the same?
    2) I want to have continuous deployments from DIT to SIT, from SIT to UAT, etc. How can we achieve that? I'm planning to implement it with Web.config tokenizing in RM.
    Looking for some expert advice on achieving this.
    Many Thanks ,
    Abraham Dhanyaraj

    HI Abraham,
    Instead of having a web config transformation to set the config values for each environment, do the following.
    1. Have one transformation config, say web.release.config, which contains tokens understood by RM (starting and ending with double underscores) for each config value:
    <log4net>
      <appender name="FileAppender" type="log4net.Appender.FileAppender">
        <file value="__log4netLogFile__" xdt:Transform="Replace" />
      </appender>
    </log4net>
    <ClientsSection>
      <Clients xdt:Transform="Replace">
        <add Id="__ClientId1__" Secret="__SecretCode1__" RedirectUrl="__RedirectUrl1__"></add>
        <add Id="__ClientId2__" Secret="__SecretCode2__" RedirectUrl="__RedirectUrl2__"></add>
      </Clients>
    </ClientsSection>
    <appSettings xdt:Transform="Replace">
      <add key="MinCaps" value="2" />
      <add key="MinSymbols" value="1" />
      <add key="MinNumerics" value="1" />
      <add key="MinLength" value="6" />
      <add key="MaxLength" value="" />
      <add key="webpages:Version" value="3.0.0.0" />
      <add key="webpages:Enabled" value="false" />
      <add key="PreserveLoginUrl" value="true" />
      <add key="ClientValidationEnabled" value="true" />
      <add key="UnobtrusiveJavaScriptEnabled" value="true" />
      <add key="WebPortalName" value="__WebPortalName__" />
      <add key="TimeZoneFrom" value="Pacific Standard Time"/>
      <add key="TimeZoneTo" value="Mountain Standard Time"/>
    </appSettings>
    The TFS build can then generate output with the above tokenized configs via the web config transformation.
    2. Then, in the RM server templates, change the config files to have values appropriate to each environment. You can do this using a custom deployment component, or the default xcopy deployment component, to set the parameters.
    3. Then set the config values in the release template.
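    As a sketch of what RM's token substitution amounts to (the token names and environment values below are invented examples, not from this project):

```python
import re

# Hypothetical per-environment values; RM would normally supply these.
DIT1_VALUES = {
    "log4netLogFile": r"D:\logs\dit1\app.log",
    "WebPortalName": "Portal-DIT1",
}

def replace_tokens(config_text, values):
    """Replace __Token__ placeholders (the double-underscore
    convention RM understands) with environment-specific values."""
    def sub(match):
        key = match.group(1)
        return values.get(key, match.group(0))  # leave unknown tokens intact
    return re.sub(r"__(\w+)__", sub, config_text)

line = '<add key="WebPortalName" value="__WebPortalName__" />'
print(replace_tokens(line, DIT1_VALUES))
# <add key="WebPortalName" value="Portal-DIT1" />
```

    This is only an illustration of the convention; in practice RM performs the substitution for you during deployment.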
    Cheers!
    Chaminda

  • XServe in XrackPro2 Rack Temps: Need Other Owner Advice!!

    So as part of a new network deployment which mandated that the Xserve (dual Intel, 2 x 2 cores) and Xserve RAID (7TB) be in the same room as clients, we purchased an XRackPro2 6U. We have installed a Gigabit switch, Fibre switch, Xserve, and Xserve RAID. We're noticing that the temps with the door closed are 10 degrees warmer than with the door open.
    Door open:
    Left inlet: 84°F
    Right inlet: 77°F
    Fans: 8200 RPM in and 4800 RPM out across the CPUs
    With the door closed, the inlet temp jumps to 95°F and the fans to 12000 RPM, and this is with the server under NO load.
    What are other users of this rack seeing? The room is climate controlled to 72 degrees F.

    How close are the front of the Xserves to the door? The airflow of the Xserves is front to back so you need some clearance in front of them or you can get a "dead spot" where air flows around the unit but not to the front of the Xserve. I think the high inlet temperature is a sign that the air is circulating around inside the unit rather than getting good in-out flow.
    If you have removed the extra insulation so the unit is as open as it can be and the Xserves are at the same level as the exhaust fans I'm not sure what else there is to do. Some people advocate filling any empty slots with the 1U blank faces to restrict clear front to back flow except through the servers but I'm not sure how much I believe that really helps.
    I have used both of the bigger (12U and 24U) cases and they do run cooler with the door open, but they shouldn't vary that much. One other warning: don't leave the back open with the front closed for any amount of time, because opening the back moves the exhaust fans away while the front flow is still constricted.
    My $.02,
    =Tod

  • Need second Xserve RAID, need advice

    We have 1 Xserve RAID with 14 x 400GB drives, configured as RAID 50.
    We are rapidly using up the space and need to know the easiest method for adding capacity.
    We want the two RAIDs to show up as a single volume, so when we buy the second Xserve RAID, will it need to be the same capacity as the first, i.e. 14 x 400GB drives?
    Also, what is the procedure for adding this second RAID? Will it involve backing up the data, then wiping and rebuilding both RAIDs as RAID 50, or is there a way to do this without wiping data?
    I would be grateful for the advice of anyone who has been in this position.

    First off, ideally the arrays should be the same size. Mixing drive sizes in an array is never a good idea. Usually the RAID controller will base the array on the smallest volume in the set and use only that much from the others, leaving the rest of the space unused (and inaccessible). E.g. if you try to RAID a 500GB volume with a 750GB volume, the 750GB volume will be treated like a 500GB volume and you lose the extra 250GB.
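    The smallest-member rule described above is easy to sketch as arithmetic (a simplification that ignores controller and filesystem overhead):

```python
def smallest_member_waste(drive_sizes_gb):
    """Each member of a RAID set contributes only as much space as the
    smallest member; return (contribution per drive, total wasted GB)."""
    smallest = min(drive_sizes_gb)
    wasted = sum(size - smallest for size in drive_sizes_gb)
    return smallest, wasted

# The 500GB + 750GB example from above:
print(smallest_member_waste([500, 750]))  # (500, 250)
```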
    By far the safest option is to do as you describe - backup the data, rebuild the array and restore the data.
    There is one other option, though, that might work, but I'd still backup the data before trying it.
    diskutil has an addToRAID option which allows the addition of an extra drive to a RAID set. I've only ever used it on RAID 1 mirrors, never on a RAID 0 stripe, so I don't know if it'll work, but it might be worth trying. You don't have anything to lose.
    Note that this kind of expansion isn't really viable long-term. If you expect to do this frequently, you might consider alternative options like Xsan, which can add volumes of any size to a virtual pool of storage that grows organically.

  • Time to upgrade to Xserve-Advice Please

    Hi,
    I'm currently the default "IT" guy at my company. We are a small company without any kind of dedicated IT department. We started out 15-20 years ago with a single Mac and have grown throughout the years. We now have 16 stations, 4 of them PCs.
    At first we had our network set up with simple file sharing: when we needed to share files, each computer would connect to whichever other computer we needed. This worked OK for a while, but then we had the problem of archiving all of our data. I would manually compile everyone's work on a specific project and back it up to an external hard drive. This was not the most efficient solution, but it was effective and we used this method for many years.
    We've recently expanded our business to include Maya work, which is done on PCs. At the same time we introduced 2x 2TB LaCie Ethernet network storage devices. Now I have everyone working within the same folder on the LaCie drives, which makes my life easier as far as archiving goes.
    Part of the problem I've got now is with the PCs. We've had them for a few months now and I've never really gotten them fully integrated into our environment. I have a single PC at home, but the work environment has always been completely Mac based, meaning I have some PC knowledge but really none as far as networking goes. I know that we need to get a server to accomplish a couple of different things:
    1) Better integration of PCs and Macs
    2) Improved Rendering for Maya (PC based Maya)-We are going to implement a render farm into the mix as well- we are currently looking at a Boxx system and an SGI system as well
    3) Central Storage with backup
    4) Improved speed (with the Lacie storage devices, there is some slowdown when first connecting)
    5) Remote login--i.e. something like the ability to login from home and set up a render over the weekend
    I would prefer to go with an Xserve because of my familiarity with Mac products. My problem is that when I look at the specs for the Xserve, I really don't know what anything is. I have no idea how to price anything for the boss.
    It makes me feel like I am getting in over my head, but at the same time, I am confident enough in my abilities that if I get the correct system set up, I should have no problem administering it for our needs.
    A) I imagine I'd want to go with the 8 core to start with
    B) What kind of performance boost-and in what areas-would I get with upgrading to the 2.93 or 2.66 GHz?
    C) I imagine I should get the most RAM we could fit into the budget?
    D) what is the Solid State drive? Do I need it?
    E) I can have 3 hard drives. How many do I need? why would I need more than one?
    F) Extra power supply..I understand why this makes sense
    G) XSan? what is it? do I need it?
    We currently also only have everything connected via ethernet. What's the deal with Fibre channel? Do I need to install a Fibre channel card in each workstation plus run all new wiring? If so then this is a very pricey deal and way unrealistic as far as budget at this time.
    I'm also wondering about the render farm. Will either a Boxx system or a SGI system integrate well with an Xserve when rendering PC Maya files? Would there be an option to configure an Xserve as a render farm? If so could I render PC Maya files on it?
    Sorry for the long winded post. I'm lost and need some help from some experts. Hopefully there is someone out there who can lead me in the right direction.
    Thanks for your help.
    Phil

    Hi
    My two cents.
    +I know that we need to get a server to accomplish a couple of different things+
    Apart from load balancing etc., you don't really mention anything that couldn't be provided either by a 3rd-party product or by what you've already got in place.
    +1) Better integration of PCs and Macs+
    With the environment you're describing you don't specifically need a Server for this although it depends on what you want to achieve.
    +2) Improved Rendering for Maya (PC based Maya)-We are going to implement a render farm into the mix as well- we are currently looking at a Boxx system and an SGI system as well+
    Perhaps XGrid might fit the bill?
    +3) Central Storage with backup+
    Depending on what Configuration you choose (Standard, Workgroup or Advanced) this could be easily achieved with what's already built in to the Server OS. On a side note the XServe is not the Server. The OS is. OSX Server can be installed on any qualifying hardware. The XServe is dedicated Server hardware built specifically to perform a Server (with all that that means) role. Redundancy is built-in. With what you're describing perhaps a suitable MacPro would be better?
    With the XServe you have to consider a rack as well as noise if a rack is not an option.
    +4) Improved speed (with the Lacie storage devices, there is some slowdown when first connecting)+
    This is probably due to the fact the drives are connected to a client machine and OS that is now not coping with the amount of client access required. I'm guessing someone is also working at that client mac?
    +5) Remote login--i.e. something like the ability to login from home and set up a render over the weekend+
    This might be possible with XGrid although you might have a real problem with bandwidth
    +I would prefer to go with an Xserve because of my familiarity with Mac products. My problem is that when I look at the specs for the Xserve, I really don't know what anything is. I have no idea how to price anything for the boss.+
    +It makes me feel like I am getting in over my head, but at the same time, I am confident enough in my abilities that if I get the correct system set up, I should have no problem administering it for our needs.+
    Leopard Server is a whole new ballgame, and if not approached correctly, Apple's marketing slogan of "No IT Required" may seem like a sick joke.
    +A) I imagine I'd want to go with the 8 core to start with+
    If you have the budget get the best you can afford
    +B) What kind of performance boost-and in what areas-would I get with upgrading to the 2.93 or 2.66 GHz?+
    This depends on what you want and what you're expecting, and you won't really know until some time has passed. Besides, how could you compare? Unless you buy and bench-test them both, you can't really know. It's possible you know someone who has exactly the same environment as you, doing exactly the same work and wanting exactly the same solution; failing that, a Google search should turn up some published benchmark results. Whether they're applicable to your environment is another question.
    +C) I imagine I should get the most RAM we could fit into the budget?+
    See (A)
    +D) what is the Solid State drive? Do I need it?+
    http://support.apple.com/kb/SP511
    http://www.apple.com/xserve/features/storage.html
    http://blogs.computerworld.com/newapple_xserves_offer_additional_ssdbay
    +E) I can have 3 hard drives. How many do I need? why would I need more than one?+
    See (A). To be honest, only you can really answer that. With the optional Apple RAID card you could have 3 x 1TB drives as a RAID 5. This provides some measure of redundancy (which is not a backup) as well as performance. However, you could just as easily use a Mac Pro, which has 4 drive bays and potentially 3+TB of storage. A simple rule of thumb: How much data do I have now? Does the storage I have now cope with it? Will the amount of data grow over time? Do I have enough storage to cope with that growth?
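    Those rule-of-thumb questions can be turned into a quick projection; the capacity and growth figures below are invented examples, and the linear-growth assumption is only a rough planning aid:

```python
def months_until_full(capacity_gb, current_gb, monthly_growth_gb):
    """Whole months of linear growth that fit before storage runs out."""
    months = 0
    used = current_gb
    while used + monthly_growth_gb <= capacity_gb:
        used += monthly_growth_gb
        months += 1
    return months

# e.g. roughly 2TB usable from 3 x 1TB in RAID 5 (one drive's worth
# goes to parity), starting at 650GB and growing ~75GB a month:
print(months_until_full(2000, 650, 75))  # 18
```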
    +F) Extra power supply..I understand why this makes sense+
    The XServe can be built to order with two PSUs, providing redundancy. You should factor in a UPS as well; you don't want a power cut to potentially ruin all your data.
    +G) XSan? what is it? do I need it?+
    http://manuals.info.apple.com/en/xsan/XsanGettingStarted.pdf
    I think this would be way too much in terms of your environment as well as your budget. However read the pdf and come to a reasoned judgment.
    +We currently also only have everything connected via ethernet. What's the deal with Fibre channel? Do I need to install a Fibre channel card in each workstation plus run all new wiring? If so then this is a very pricey deal and way unrealistic as far as budget at this time.+
    No to the first two questions. Read the XSan pdf.
    +I'm also wondering about the render farm. Will either a Boxx system or a SGI system integrate well with an Xserve when rendering PC Maya files? Would there be an option to configure an Xserve as a render farm? If so could I render PC Maya files on it?+
    I have no experience of either of the two products you mention. Perhaps someone who does may post?
    Does this help? Tony

  • Pl/sql parameter portlet - need some help/advice - how to create

    I want to create a PL/SQL portlet that accepts a parameter and on submit passes the parameter to other portlets (SQL reports), which are then automatically run to display the new data.
    E.g.
    parameter portlet = deptno
    On submit
    SQL reports are then refreshed using the parameter.
    I am aware of, and have tried, the mycompany demo, which works exactly as I want, but its parameter portlet cannot be amended and is written in Java.
    I need a PL/SQL equivalent so I can tailor the code.
    Any advice examples or guidance would be really appreciated.
    Thanks in anticipation.
    SD

    Hi,
    You can use a form portlet to accept parameters and then call a report in the success procedure of the form. In this example it calls a report with the value in the flightno field.
    declare
        flightno number;
        blk varchar2(10) := 'DEFAULT';
    begin
        flightno := p_session.get_value_as_varchar2(
            p_block_name => blk,
            p_attribute_name => 'A_FLIGHT_NO');
        call('SJDEMO30.report1.show?p_arg_names=flightno&p_arg_values='
            || flightno);
    end;
    Thanks,
    Sharmila

  • Need help and advice on designing relations of classes

    I tried to use a JInternalFrame which contains an extension of JPanel as its UI content (with an ActionListener for the buttons inside it). The panel is contained in a JScrollPane.
    jIntFrm = new JInternalFrame("-Title-", true, true, true, true);
    jIntFrm.setLayout(new FlowLayout());
    jIntFrm.add(new JScrollPane(new PnlEntry()));
    The user clicks a button inside the panel (PnlEntry) to close the JInternalFrame containing it, or another button to do operations that need to display another JInternalFrame containing another extension of JPanel.
    Please give me advice on how this should be done so that it makes for good OO relations.
    Right now, I'm doing it like this:
    public class MainFrame extends JFrame {
        JInternalFrame jIntFrm; // and other internal frames

        public MainFrame() {
            jIntFrm = new JInternalFrame("-Title-", true, true, true, true);
            jIntFrm.setLayout(new FlowLayout());
            // the panel is given the reference and keeps it until it
            // invokes panelExit(JInternalFrame)
            jIntFrm.add(new JScrollPane(new PnlEntry(this, jIntFrm)));
        }

        void panelExit(JInternalFrame jif) {
            if (jif == jIntFrm) {
                jIntFrm.dispose();
                jIntFrm = null;
            }
        }

        void panelContinueProcess(JInternalFrame jif, Object someID) {
            if (jif == jIntFrm) {
                jIntFrm.dispose();
                jIntFrm = null;
                // start displaying another JInternalFrame with the appropriate panel in it
            }
        }
    }
    Is this all OK?
    Please give me your advice.
    Thx,
    David

    Maybe read up on the MVC (Model-View-Controller) pattern? It's practically the standard for Swing applications (even Swing itself follows the pattern).
    You don't want to clutter your frame with behavioral code.
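    A minimal sketch of that separation (the class and method names here are invented for illustration; a real Swing app would hang these off listeners):

```python
# Minimal MVC sketch: the controller, not the frame, owns the behavior.
class EntryModel:
    def __init__(self):
        self.entries = []

    def add(self, text):
        self.entries.append(text)

class EntryView:
    """Stands in for the JPanel/JInternalFrame; it only renders."""
    def render(self, model):
        return f"{len(model.entries)} entries"

class EntryController:
    """Button callbacks belong here, not in the JFrame subclass."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def on_submit(self, text):
        self.model.add(text)
        return self.view.render(self.model)

ctrl = EntryController(EntryModel(), EntryView())
print(ctrl.on_submit("hello"))  # 1 entries
```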

  • Advice sought for a new PC build

    Hello,
    I could use a little advice about an "as silent as silent can be" system that I am trying to put together right now. I plan to build it myself. I have installed / replaced components in PC systems before but it will be the first time I build a PC from scratch.
    Purpose of PC
    To blitz anything that CS6 can throw at it right now and still be putting up a respectable fight when CS11 is eventually released. In order of importance, the system needs to excel at:
    Video editing and effects using Premiere and AfterEffects
    3D animation using Autodesk 3DS Max etc
    Multimedia content authoring (Flash, Photoshop, Illustrator, Fireworks, Dreamweaver)
    Music production (Propellerhead Reason, REAPER)
    Internet browsing, emails, iTunes and all other 21st century time-wasting activities.
    Before I place any orders, I would appreciate it if you would let me know if you can see anything that I might have forgotten, any potential incompatibility issues or just any terrible decisions that I really need to rethink.
    Components
    Windows Professional  64bit OS
    Do I need Ultimate?
    Corsair Obsidian Series 550D Quiet Mid-Tower Case
    Is this case cool and quiet? Are there better options than this that don't look like either a teenage warrior's wet dream or a hotel minibar?
    Corsair Professional Series AX 760 Watt ATX/EPS Fully Modular 80 PLUS Platinum PSU
    Honestly I have no idea. I picked it because it is highly rated for efficiency and should go with the case. Are there better options at a similar price point, especially in terms of heat / noise / known reliability issues?
    Intel  Core i7-3930K 6-Core Processor
    I am definitely going with this CPU. I think.
    Corsair 120mm Hydro Series H80i Digital High Performance Liquid CPU Cooler
    I chose this rather than the 100i because it can exhaust out the back of the case. I think this will be a better configuration for the case I have chosen and the location where the PC will live.
    Asus P9X79 PRO Motherboard (Socket 2011, Intel X79 chipset)
    It has all the features I want except on-board firewire which is not a deal breaker and has had good reviews. Can anyone recommend something better at a similar price point? Is it worth spending more to get a higher specified / workstation grade motherboard?
    Asus Nvidia GeForce 2GB GTX 670 DirectCU II Graphics Card
    Supposedly one of the quietest GTX670 cards on the market and ought to be fully compatible with the motherboard.
    Pioneer BDR-208DBK 15X SATA Blu-ray writer
    Storage Components
    Corsair Vengeance Performance 32GB DDR3 1866MHz CL10 (4 x 8GB) RAM
    I am also considering getting 64GB and using 32 – 40GB as a RAM disk. Good idea? No?
    Corsair Force Series 3 GT 180GB SSD  
    Windows OS, Programs and Pagefile(s).
    Corsair Force Series 3 GT 180GB SSD  
    Caches, Previews, Temporary files for all CS6 programs
    4 x WD Caviar Black 1TB in RAID 10 configuration
    Media files, project files, exported / rendered files, sample library. Is it worth getting a dedicated RAID controller or should the motherboard be able to handle this without hurting performance elsewhere?
    1 x WD Caviar Black 1TB
    Everything else.
    My budget is reasonably flexible but I do not want to waste money on components that offer only minimal performance benefits.  Also, I have no plans at present to get into overclocking etc. Far too scary for a newbie.
    If you made it this far, thank you for taking the time to read this and any advice at all would be very much appreciated.
    Regards
    Michael

    Hello
    I finished my build at the weekend and installed the OS and so far the process has been much better than I could have hoped for. Thank you for all of your help.
    The computer is much faster and quieter than any computer I have ever owned before and (touch wood) is a joy to use. The Windows Experience rating is 7.8.
    I am still running through checks to make sure everything is OK before installing the rest of the software but I have come across one issue that I think I ought to double check before I get in too deep.
    So far I have connected 4 drives to the Intel SATA controller of my ASUS P9X79 Deluxe motherboard set to RAID mode:
    Samsung 840 Pro 128GB SSD    ---- Not in RAID
    Samsung 840 Pro 256GB SSD    ---- Not in RAID
    2 x WD Black 500GB in RAID 0
    Pioneer Bluray Re-Writer BDR-208DBK
    I have a WD Black 2TB waiting to be added to the system for non performance-critical storage once I am certain that I don't have an issue with set up.
    I have 32GB of RAM installed.
    I have installed and run the Samsung Magician software and the performance benchmarks for the drives are:
    Drive                  Seq Read 128K  Seq Write 128K  Random Read 4K  Random Write 4K  Test Range
    Samsung 840 Pro 128GB  533            226             88655           4788             1GB
    Samsung 840 Pro 256GB  534            263             91169           4956             1GB
    2 x WD Black RAID 0    241            265             2610            1462             100MB
    The SSD read speeds look fine  but write speeds seem to be very low to me, certainly much lower than the speeds quoted by Samsung and confirmed in multiple reviews.
    Does this look OK to you and if not, do you have any ideas about how I could improve SSD write performance?
    So far I have followed the steps in the following guide to SSD/HDD optimization for Windows 7 to no avail (checked for latest firmware and drivers, turned off indexing, disabled hibernation, checked that TRIM is enabled, shrank page file to 4GB on OS drive, changed power options so drives never sleep .... have I missed anything?):
    http://www.overclock.net/t/1156654/seans-windows-7-install-optimization-guide-for-ssds-hdds
    Finally, does the RAID array's performance look OK?
    I am suspicious that the write performance of the SSDs seems to be pegged to the write performance of the RAID, although that might be just coincidence.
    One last thing, and thanks for your patience.
    I am tempted to switch back to AHCI mode in the BIOS and re-run the benchmarks to see if the SATA RAID mode is the problem. This won't be too much of a hassle as the system is empty at present.
    But if / when I return to  RAID mode, I shouldn't have to do any pre-installing because the RSTe drivers are now already installed and up to date, is this correct?
    My apologies if these issues have been covered before in the forum. I have looked but I couldn't find anything specific.
    Regards
    Michael

  • Back Up Strategy Advice Sought

    Hello!
    I'd appreciate some advice on how best to set up a back up regime, please.
    I've just bought a Western Digital My Passport external drive for backup purposes. At first I thought I'd simply back up my home folder to it on a regular basis and - I presume - all my documents, photos and music would be backed up en masse.
    Having read up a bit on it (partly on these forums), I note that some do a 'clone' of their entire hard drive.
    Is there any mileage in doing a clone of the drive since I'm aware this is very useful to get up and running swiftly in the event of, say, the internal hard drive failing? My external drive is USB only, and I gather it isn't possible to boot from it (Firewire required?)
    If it is still worth having a clone of my system, should I partition the external so the clone goes on one partition and other 'often changing' files go on the other? (e.g. bookmarks/iTunes store-purchased music).
    If it is suggested I go the 'clone route' do I require something like Carbon Copy Cloner or Superduper! (I think I read that cloning can be done via Disk Utility?)
    Space-wise I'm fine: my new external drive is 160GB and my mini has an 80GB hard drive with, to date, 35GB used.
    I hope I'm making some sense here - your thoughts and advice would be much appreciated.

    retrieve 'other stuff'...
    Address Book and iCal have a Backup function in their File menu. I usually save the files regularly to a USB stick and sync them to .Mac (now MobileMe) too.
    Safari: from the File menu, "Export Bookmarks".
    The folder "Mail" in yourusername/Library.
    Software Update has a function in its "Update" menu bar item: "Install and Keep Package".
    The trick here is remembering the correct order in which to install the updates. Some are obvious, others less so, and Installer does not always give an easily understood reason when it can't run an update. It might just say "The software cannot be installed" or "Error (number)".
    It would be nice if it would say "You need to install version xx, before you can install version xy", but I think we might have to wait a long time for that.
    Also consider that many individual updates will require a restart before proceeding further. By the time you have worked through manual updates from your archive, Software Update would probably have done the job faster anyway.

  • Sql Server Replication Advice - In desperate need of good advice

    I am a new developer and have written an application which solves some very important problems for my company. At the moment I am stuck on replication. I believe what I need is merge replication, but before I waste any more valuable time I wanted to ask for advice.
    The specific problem I am trying to solve is this: the company has around thirty marine vessels which frequently lose internet connection. The application I have written writes to a SQL Server 2012 database installed locally on each marine vessel. When a vessel has a connection, it also writes to a SQL Server 2012 database which currently resides on an Azure VM. What I need to happen is this: I need the databases on the marine vessels to replicate with the remote database. I believe that the local databases on the marine vessels should be the Publisher and Distributor, and that the changes should be pushed to the remote server on the Azure VM. The setup has been so hard to implement (probably because I'm still dealing with the learning curve) that I am wondering if this is the best way to go. Any advice on whether this is the correct approach, or whether there are better alternatives, is greatly appreciated.
    I have also tried using the Sync Framework and the SQL Azure sync agent. I have gotten both to achieve exactly what I am after, except for one thing: the application residing on the marine vessels will create new tables, and I have not found any way to add the new tables to the sync schema without manually going in and setting them up. Since I will not be on these vessels, that's not really an option. If there is a way to add the new tables programmatically, then the sync agent would work perfectly for me.
    Thanks to all for any help.

    Hi,
    Yes, you can configure SQL Server replication for this. Replication processing resumes at the point where it left off if a connection is dropped. If you are using merge replication over an unreliable network, consider using logical records,
    which ensure related changes are processed as a unit. For more information, see
    Group Changes to Related Rows with Logical Records.
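    On the point about tables created after the publication is set up: merge replication articles can be added with the replication stored procedures rather than through Management Studio, so that step could be scripted from the application. A rough T-SQL sketch (the publication and table names are made up for illustration):

    ```sql
    -- Add a newly created table as an article of an existing merge publication,
    -- then regenerate the snapshot so subscribers pick up the new article.
    -- 'VesselPub' and 'dbo.NewReadings' are hypothetical names.
    EXEC sp_addmergearticle
        @publication   = N'VesselPub',
        @article       = N'NewReadings',
        @source_owner  = N'dbo',
        @source_object = N'NewReadings';

    EXEC sp_startpublication_snapshot @publication = N'VesselPub';
    ```

    This is only a sketch; the full set of article options (identity range management, conflict resolution, filters) would need to match the rest of your publication.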
    These articles may be helpful:
    Synchronize two tables using SQL Server Integration Services (SSIS)–Part I of II
    http://blogs.msdn.com/b/jorgepc/archive/2010/12/07/synchronize-two-tables-using-sql-server-integration-services-ssis-part-i-of-ii.aspx
    Synchronize SQL Server databases in different remote sources
    http://solutioncenter.apexsql.com/synchronize-sql-server-databases-in-different-remote-sources/
    Thanks.
    Tracy Cai
    TechNet Community Support

  • Need graphics card advice.

    I purchased a new laptop from BB about 6 months ago, a Toshiba Satellite P875. I need a graphics card with at least 128MB of dedicated memory, and I think I only have a chip with 32MB of dedicated memory. I was wondering if anyone knows what would be good, keeping in mind that cost is an issue.
    Thanks. 

    http://simswiki.info/wiki.php?title=Game_Help:TS3_System_Requirements
    That link lists all the expansion packs, and whether they will, will not, or might work with various graphics cards (the list is fairly extensive).
    My guess is that your CPU picked up the slack for your lacking GPU in the base game, but the later expansion packs seem to weed out all but the higher end cards.
    Unfortunately, you don't have a gaming computer, so the best you can do is see just how far down you can turn the graphics settings/resolution.
    If you like my post, or solution to your issue/question, go ahead and click on the little star by my name and/or accept the post as the Solution. It makes me happy.
    I'm NOT an employee of Best Buy, or Geek Squad, though I did work as an Agent for a year 5 years ago. None of my posts are to be taken as the official stance that Best Buy will take on your situation. My advice is just that, advice.
    Unfortunately, that's the bad luck of any electronic, there's going to be bad Apples... wait that's a horrible pun.

  • Need Expert's Advice - How to use LabView Efficiently and to increase Readability

    My application is fairly complex. It is a real world testing applications that simultaneously controls 16 servo motors running various stress testing routines asynchronously and all at the same time. The application includes queues, state machines, sub VI's, dynamically launched VI's, subpanels, semaphores, XML files, ini files, global variables, shared variables, physical analog and digital interfaces and industrial networking. Just about every technique and trick that LabView 2010 has to offer and the kitchen sink as well.
    Still I am not happy with the productivity that LabVIEW 2010 has provided, nor the readability of my final product.
    Sometimes there are too many wires. Many of my state machines have a dozen or more wires just going from input to output, doing nothing, simply because one or two states in the machine need that variable. Yes, I could spend a lot of time bundling, unbundling and rebundling those values, but I don't think that would improve things much.
    We have had a long discussion about the use or misuse of local variables in this forum, and I don't want to repeat that here. I use them sparingly, where I think it is relatively safe to do so. I also hit a bug whenever I try to copy some code that contains one or more local variables: on pasting the code, the result is something other than what I expected (I am not sure exactly what), and I have to undo the paste and rebuild the code one object at a time.
    I am also having trouble using Variable Property Nodes. When I cut and paste them, they often lose their reference object, and I have to go back into the code and redo the Link To on each one. That wastes a lot of time and effort.
    Creating subVIs is often not appropriate when the code makes many references to objects on the front panel. Some simple code will turn into a bunch of object references and dereferences, which also tends to take a lot of work to clean up and often does not help overall readability. I use subVIs when appropriate, but because of the interface overhead, not as often as I would like to. My application already has over 150 subVIs.
    The LabVIEW Clean Up Diagram function often works poorly. It leaves far too much empty space between objects, making my diagrams three to four 24" screens wide, which is too much and difficult to navigate effectively. The Clean Up function puts objects in strange places relative to other objects used nearby, does a poor job routing wires, and often makes deciphering diagrams more difficult rather than easier.
    My troubleshooting strategies don't work well for large diagrams and complex applications. The diagrams are so complex that execution highlighting may take 20 minutes for a single pass. Probes help, but breakpoints aren't of much use, because single-stepping afterwards often takes you somewhere else in the same diagram. I can't follow the logic well doing this.
    Using structures, I may have Case structures nested 5 to 10 levels deep inside some Event structure inside a While loop. Difficult to work with, and not very readable.
    All in all, I can make it work, but I am not happy about the end result.
    I am hoping to benefit from some expert advice from those that are experienced in producing large complex applications efficiently, debugging efficiently and producing readable diagrams that they are proud of.
    Can anyone offer their advice on how best to use the LabView features to achieve these results in complex applications? I hope that you can help show me the light.

    I'm not an expert but I'm charged out as one at work.
    I am off today, so I'll share some thoughts that may help, or possibly inspire others to chime in. I have tried to continually improve my code in those areas and would greatly welcome others sharing their approaches and insights.
    Note:
    I do refactoring services to help customers with this situation. What I will write does not represent what we do in a code review, since our final deliverable is a complete final design, and that is beyond the scope of this reply.
    I'll comment on your points.
    dbaechtel wrote:
    My application is fairly complex. ...
    While watching slow-motion replays of Olympic figure skating competitions, I learned how subtleties in the way the launching skate is planted entering a jump can make the difference between a good jump and a bad one.
    In software, we plant our foot when we turn from design to development. I have to admit there were a couple of times when I moved from design to development too early and found myself in a situation like the one you have described.
    How to know when design is done?
    Waterfall says "cross every 't' and dot every 'i'," Agile says "code now, worry about design later," and Bottom-up says "the demo works, why bother designing?" (Please feel free to comment on these over-simplifications, gang.)
    My answer is not much more helpful for those new to LabVIEW. 
    My design work is done when my design diagrams are more complicated than the LabVIEW diagrams they describe.
    dbaechtel wrote:
     simultaneously controls 16 servo motors running various stress testing routines asynchronously and all at the same time. The application includes ...and the kitchen sink as well.
    Have you posted any design documents you have? These will help greatly in letting us understand your application. More on diagrams later.
    Anytime I see multiple "variations on a theme", I think LVOOP (LabVIEW OOP ) . I'll spare you the LVOOP sales pitch but will testify that once you get your first class cloned off and running as a sibling (or child) you'll appreciate how nice it is to be able to use LVOOP.
    Disclaimer:
    If you don't already have an OOP frame of mind, the learning curve will be steep.
    dbaechtel wrote:
    Still I am not happy with the productivity that LabVIEW 2010 has provided, nor the readability of my final product.
    Sometimes there are too many wires....going from input to output, doing nothing,... spend alot of time bundling and unbundling and rebundling those values, but I don't think that would improve things much.
     Full disclaimer:
    I used to be of the same opinion and even used performance arguments to make my point. I have since changed my mind.
    Let me illustrate (hopefully). This link (if it works for you, use the left-hand pane to navigate the hierarchy) shows an app I wrote about 10 years ago, in my early days of routing wires. Even the "Main" VI started to suffer from too many wires, as this preview from that link shows.
    Clustering related data values using type definitions is the first method I would urge. This makes it easier to find the VIs that use the type def via Browse Relationships >> Callers. If I implement my code correctly, any problem I believe is associated with a particular piece of data that is a type def has to be in one of the VIs that use that type def, and is therefore easier to find and maintain.
    When I wrote "related data" I am referring to data normalization rules (which my wife knows, and which I picked up from her; I claim no expertise in this area), where only values that are used together are grouped. E.g. a cluster named File contains "Path" and "Refnum" but not "PhaseOfMoon". This works out nicely when first creating sub-VIs, since all of the data related to file operations is right there when I need it, and it leads into the next concept...
    When I look at a value in a shift register on the diagram, taking up space, that is only used in a small subset of states, I consider using an Action Engine. This moves the wire from the current diagram into the Action Engine (AE) and cleans up the diagram. The AE brings with it built-in protection, so provided I keep all of the operations related to the type def inside the AE, I am protected when I start using multiple threads that need that data (trust me, it may not make a difference now, but end users are clever). So that extra wire is effectively encapsulated and abstracted away from the diagram you are looking at.
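For readers who don't speak LabVIEW: an Action Engine is a non-reentrant VI holding state in an uninitialized shift register, with an enum input selecting the action to perform. The same pattern in a text language looks roughly like the sketch below (Python, with a lock standing in for the VI's non-reentrancy; the class and method names are illustrative, not from the original post):

```python
import threading

class FileActionEngine:
    """Analogy of a LabVIEW Action Engine: one object owns the state
    (here, an open file handle) and serializes every action on it."""

    def __init__(self):
        # The lock plays the role of the AE's non-reentrant execution:
        # only one caller can act on the state at a time.
        self._lock = threading.Lock()
        self._handle = None

    def open(self, path):               # action: Open
        with self._lock:
            self._handle = open(path, "a+")

    def write(self, text):              # action: Write
        with self._lock:
            self._handle.write(text)
            self._handle.flush()

    def close(self):                    # action: Close
        with self._lock:
            if self._handle:
                self._handle.close()
                self._handle = None
```

Callers never touch the handle directly; they only invoke actions, which is exactly what keeps the wire (and the race conditions) off the main diagram.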
    But I said earlier that I would not sell LVOOP so I'll show you what LVOOP based LV apps look like to contrast what I was doing ten years ago in that earlier link. This is what the top level VI looks like.
     And this is the Analysis mode of that app.
    I suppose I should not mention that LVOOP has wizards that automatically create the sub-VIs (accessors) that bundle/unbundle the clusters, should I?
    Continuing...
    dbaechtel wrote:
    We have had a long discussion about the use or misuse of Local variables...I also have a bug whenever I try and copy some code...
    If you can simplify the code and duplicate the bug, please do so. We can get it logged and fixed.
    dbaechtel wrote:
    I am also having trouble using trouble using Variable Property Nodes....
    That sounds like a usage issue. Posting code to illustrate the process will let us take a shot at figuring out what is happening.
     dbaechtel wrote:
    Creating subVIs is often not appropriate... My application already has over 150 sub VIs.
    "Back in the day..." LV would not even try to create a sub-VI that involved controls/indicators. I often use sub-VIs to maintain a common GUI, but I do it on purpose; when I find myself creating a sub-VI that involves a control/indicator, I hit Ctrl-Z immediately!
    I figure out a way around it (an AE?) and then do the sub-VI.
    Judging by your brief explanation, and assuming you do an LVOOP implementation, I would estimate that the app needs 750-1500 VIs.
     dbaechtel wrote:
    The LabView Clean Up Diagram function often works poorly.... 
    The clean-up works fine for how I use it. After throwing together "scratch code" and debugging the "rat's nest", I'll hit clean-up as a first step. It does well enough on simple diagrams, and in some cases inspires me to structure the diagram in a way I may not have thought of. If I don't like it, Ctrl-Z.
    Good design and modular implementation lead to smaller diagrams that just don't need three screens.
     dbaechtel wrote:
    My troubleshooting strategies don't work well for large diagrams and complex applications....Can anyone offer their advice on how best to use the LabView features to achieve these results in complex applications? I hope that you can help show me the light.
    Smaller diagrams single-step faster, since the sub-VIs run at full speed. I cringe thinking about a three-screen diagram with multiple probes open (shiver!).
    Re: nested structures
    Sub-VIs (wink, wink, nudge, nudge).
    If it works, you have proven the concept is possible. That is the first step in an application.
    I hope that gives you some ideas.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Need some Tech Advice -- Harddrive configuation

    I've searched the last week or so, just seeing what everyone else has asked, what others have suggested and/or done. So I've gotta ask a couple of questions myself.
    System Specs:
    G5 Quad Core 2.5Ghz
    8GB of RAM
    Pair of 500GB Hard Drives (currently two separate drives)
    NVIDIA Quadro FX 4500
    Dual 30" Displays
    There are 2 500GB drives in the computer, but the 2nd one is not being utilized currently. The boot/primary drive has less than 1GB available. System performance is sluggish. 2 applications that are being used are Photoshop CS2 and Aperture. All images are shot with a Canon 1Ds in RAW.
    I like the idea of RAID 0 in regard to speed, but I don't like the idea of no redundancy. This 500GB drive filled up since February of this year, so hard drives get filled quickly. What can anyone recommend as a means to increase capacity, performance and stability?
    Hardware RAID configuration? Software RAID configuration? External enclosure? Based on what I've read here, hardware RAID, though more expensive, would be a better solution than SoftRAID. An external enclosure would need to be FW800 rather than USB 2.0.
    The thing I'm trying to stay away from is a 1TB RAID 0 plus another 1TB RAID 0 as a means for backup.
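    For what it's worth, the redundancy arithmetic is easy to spell out: a two-drive RAID 0 is lost if either drive fails, while a two-drive mirror (RAID 1) is lost only if both fail. A small illustrative calculation (the 3% annual failure rate per drive is an assumed figure, not a measured one):

    ```python
    def raid0_fail(p):
        """Probability a 2-drive stripe is lost: either drive failing loses all data."""
        return 1 - (1 - p) ** 2

    def raid1_fail(p):
        """Probability a 2-drive mirror is lost: both drives must fail
        (ignoring the window where a failed drive is being rebuilt)."""
        return p ** 2

    if __name__ == "__main__":
        p = 0.03  # assumed annual failure rate per drive
        print(f"RAID 0 annual loss risk: {raid0_fail(p):.2%}")  # ~5.91%
        print(f"RAID 1 annual loss risk: {raid1_fail(p):.2%}")  # ~0.09%
    ```

    So striping roughly doubles the risk relative to a single drive, while mirroring reduces it by orders of magnitude, at the cost of half the raw capacity.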
    I currently have the 250GB drive that came with the computer, so I could set that drive back up as the boot drive, then take the pair of 500GB drives to store all the data. Just looking for the advice of the more experienced.
    Any advice would be greatly appreciated. Thanks for everyone's time.

    don,
    After posting this and searching further, I found this:
    http://www.caldigit.com/S2VRDuo.asp
    I am actually looking at CalDigit's SATA RAID too.
    This 2-bay RAID subsystem provides performance that no FireWire drive can match. If it runs at 150MB/s as they claim, it is twice as fast as a FireWire 800 RAID.
    I also checked with B&H: such a SATA RAID, with a 4-port Mac Pro-compatible SATA card, is fairly inexpensive (only 500+ bucks for 500GB).
    Or, if you want to use a FireWire interface, your best bet is the new CalDigit FW800 dual-bay array, the FireWire VR, which is hot-swappable and cooled by a nice quiet fan. This might be the best option and give you the most flexibility (it can be upgraded to a larger capacity if needed, and supports RAID 0, 1 and JBOD, same as their S2VR Duo):
    http://www.caldigit.com/FireWireVR.asp.
    Cheers,
    Darrelo Brown
    Mac Pro   Mac OS X (10.4.8)   AJA Kona 3
