Best use of disk drives for data transfer

I have three different data "disk" drives on my system, and I timed transferring 13 GB of data between them. The drives are a standard 7500 RPM disk, a pair of 7500 RPM disks in RAID 0, and a solid-state flash drive. The times, in minutes and seconds, were as follows:
From \ To  | Standard | RAID 0 | Flash
-----------+----------+--------+-------
Standard   |   6:39   |  2:43  |  3:22
RAID 0     |   2:46   |  4:41  |  3:33
Flash      |   2:41   |  1:51  | 16:58
The first thing to notice is that when the source and destination drives are the same, the times are much longer. The flash drive was particularly bad.
The best time was from Flash to RAID 0, but in the reverse direction the time was worse than to a standard disk, suggesting that the flash drive reads fast but writes slowly.
Conclusion: for disk-bound operations, such as transferring DV AVI files with little or no processing, use different drives where practicable. In other words, set the scratch-file drive to something other than the drive holding the source files. If possible, put your source files on a flash drive and your scratch files on a RAID 0 array.
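If you want to repeat the measurement on your own drives, a minimal Python sketch along these lines times a single copy and reports throughput (the paths are placeholders; substitute your own source and scratch drives, and note that the OS write cache can flatter short copies):

    import shutil
    import time
    from pathlib import Path

    def time_copy(src: Path, dst: Path) -> float:
        """Copy one file and return the elapsed wall-clock time in seconds."""
        start = time.perf_counter()
        shutil.copyfile(src, dst)
        return time.perf_counter() - start

    # Placeholder paths: source on one drive, destination on another.
    src = Path("D:/capture/clip.avi")   # e.g. the flash drive
    dst = Path("E:/scratch/clip.avi")   # e.g. the RAID 0 array

    elapsed = time_copy(src, dst)
    size_mb = src.stat().st_size / 2**20
    print(f"{size_mb:.0f} MB in {elapsed:.1f} s -> {size_mb / elapsed:.1f} MB/s")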

Similar Messages

  • Need to build communication redundancy using serial RS-232 for Data Transfer b/w Host and RT irrespective of TCP/IP Data Transfer

    Hi - I would like to build logic that provides communication redundancy using serial RS-232 for data transfer between the host and the RT target, independent of the TCP/IP link.
    I want to transfer data between the host and the RT target through an RS-232 VISA session whenever the TCP/IP Ethernet cable has been unplugged from the controller. The code should also keep checking whether the TCP/IP link has been re-established; whenever the link comes back, communication should switch back to using that link only. This is accomplished by deploying the RT VI as an executable. I wrote some logic along these lines, but it does not work as well as I expected.
    Please go through the two attached VIs and let me know what I did wrong.
    Attachments:
    TCP_Serial_Host.vi ‏33 KB
    TCP_Serial_RT.vi ‏41 KB

    I am also new to this topic and am trying to get familiar with these protocols.
    Refer to the TCP server/client examples among the LabVIEW examples.
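    The attached VIs are LabVIEW, so as a language-neutral reference, here is a rough Python sketch of the failover logic described above (the host address, port, and COM port are assumptions; the serial side uses the third-party pyserial package):

    import socket
    import serial  # third-party pyserial package

    HOST, PORT = "192.168.1.10", 6340   # RT target address (assumed)
    SERIAL_PORT = "COM1"                # fallback RS-232 port (assumed)

    def open_tcp(timeout: float = 2.0):
        """Try to (re)establish the primary TCP link; None if it is down."""
        try:
            return socket.create_connection((HOST, PORT), timeout=timeout)
        except OSError:
            return None

    def send(payload: bytes) -> str:
        """Prefer TCP; fall back to RS-232 while the Ethernet link is down.

        Because open_tcp() is retried on every call, traffic moves back to
        the TCP link as soon as the cable is replugged.
        """
        sock = open_tcp()
        if sock is not None:
            with sock:
                sock.sendall(payload)
            return "tcp"
        with serial.Serial(SERIAL_PORT, baudrate=115200, timeout=2) as port:
            port.write(payload)
        return "serial"

    print(send(b"sensor-frame-001"))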

  • Advantages of using BAPI over LSMW for data transfer

    Hello,
    Can anybody please outline the advantages of using BAPIs for data transfer compared to LSMW? If we use a BAPI to transfer master and transaction data, how is it more useful than a developed LSMW?
    Thanks in advance
    Raghav

    Hi Raghavendra,
    Refer to the links below; they may help:
    [http://www.*******************/search?q=BAPIandLSMW+comparision]
    [http://www.*******************/search?q=bapi]
    Regards,
    Mdi.Deeba

  • Using a flash drive for data storage - questions

    I take video clips whenever I travel, and when creating DVD videos I want to intermix the clips with still images. Over a long trip, so many video clips accumulate that I have to download them to free up space in the camcorder. For me, a flash drive (or pen drive) is an ideal medium. However, I have found that even with a "fast" drive rated at 20 MB/sec read and 18 MB/sec write, the data transfer slows significantly when transferring either video (.AVI) or music (.mp3) files.
    For example, I have a Sony flash drive with the above specs. Transferring straight data files, such as word-processing or spreadsheet documents, I measure the speed at approximately 12 MB/sec. But when I transfer video or audio, this drops to about 4 MB/sec.
    When I look up transfer rates on the net, a single figure is quoted as the "data transfer" rate. I could find nothing indicating how, if at all, this rate changes when video or audio files are transferred.
    Does anyone have experience with this? Basically, I am trying to find the fastest flash drive of at least 2 GB at a reasonable price, but will I lose all of the speed depending on the type of file?
    Thanks

    Yes, if these are normal data files and the flash drive is formatted for Mac OS Extended (Journaled) or FAT32.
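    Drive specs also rarely distinguish file mixes, so it can be worth measuring what your particular stick actually sustains. A minimal Python sketch, assuming the drive is mounted at the placeholder path shown (the fsync call matters, because otherwise the OS cache hides the drive's real write speed):

    import os
    import time

    def write_speed(path: str, size_mb: int = 200, block_kb: int = 1024) -> float:
        """Write size_mb of data to path and return the sustained MB/s."""
        block = os.urandom(block_kb * 1024)
        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(size_mb * 1024 // block_kb):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())   # force the data onto the drive itself
        elapsed = time.perf_counter() - start
        os.remove(path)
        return size_mb / elapsed

    # "F:" is a placeholder for the flash drive's mount point.
    print(f"{write_speed('F:/speedtest.bin'):.1f} MB/s sustained write")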

  • Creating multiple libraries and using an external drive for data

    I'm a recent switcher and new to iPhoto. On the Windows platform I used Adobe Album. I have around 10k images that I put on an external drive attached to my new iMac. I deleted the iPhoto folder and redirected it to the images on my external drive. All is working great. I'm looking for some advice on file/iPhoto management workflow; i.e., I'm assuming the more images I accumulate, the slower iPhoto is going to run? I hear you can create multiple iPhoto libraries. Will that load only the photos associated with the current library and, I'm assuming, optimize iPhoto's performance, versus loading every photo you have ever taken each time you work in iPhoto? If adding libraries is the way to go, can someone direct me to instructions on how to do this?
    This may be beyond this thread's scope, but I plan to put all my data (iPhoto, iMovie, iTunes) onto an external drive in order to keep the iMac's main drive as clean as possible. Is my thinking correct in doing so? Are there any negative repercussions?

    Joe:
    I'm assuming the more images I accumulate, the slower iPhoto is going to run?
    You'll have to get quite a lot of photos into iPhoto before you start to see a real slowdown in performance. Users have reported libraries of 40K files that perform just fine. However, do not keep all of the rolls open, as that will really do a job on the performance. Just open the rolls you need and close them all before quitting.
    Yes, you can use multiple libraries and only load what's in each. I use iPhoto Library Manager to manage and move quickly between 10 libraries. With the paid version of iPLM you can move albums and/or rolls between libraries as needed and keep the keywords, ratings, and comments intact.
    Using an external HD in order to maintain sufficient free space (10G minimum is what I recommend) on the boot drive to help optimize performance is a very good idea.
    As far as any negative aspects, it would be a good idea to have a second external HD as a backup to the working external HD. You can't be too careful with those image files (and other important files) that can't be replaced.
    TIP: For insurance against the iPhoto database corruption that many users have experienced, I recommend making a backup copy of the Library6.iPhoto database file and keeping it current. If problems crop up where iPhoto suddenly can't see any photos, or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current, I mean back up after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That ensures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup, and it's good insurance.
    I've written an Automator workflow application (requires Tiger), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.
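    For anyone who would rather script the tip above than use Automator, a minimal Python sketch (the paths assume the default iPhoto Library location and the Pictures folder as the backup target; this is only an illustration, not the author's tool):

    import shutil
    from pathlib import Path

    # Default locations; adjust if your iPhoto Library lives elsewhere.
    db = Path.home() / "Pictures" / "iPhoto Library" / "Library6.iPhoto"
    backup = Path.home() / "Pictures" / "Library6.iPhoto.backup"

    # copy2 preserves timestamps, so you can see when the backup was taken.
    shutil.copy2(db, backup)
    print(f"Backed up {db.name} ({db.stat().st_size / 2**20:.1f} MB)")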

  • Use of Shared Variables for data transfer from an RT target to a desktop application

    Hi,
    I want to adopt shared variables to share data in an existing distributed application, comparable to the T3 benchmark configuration in this document:
    [1] http://zone.ni.com/devzone/conceptd.nsf/webmain/5B4C3CC1B2AD10BA862570F2007569EF
    The current implementation uses a well-tuned solution with RT FIFOs and TCP/IP communication to a desktop PC, which monitors and stores the data.
    The desktop application runs at a much slower execution rate than the time-critical loop (TCL), so the TCP/IP packets are used to buffer the data, with a packet size that depends on the data send rate.
    To use shared variables instead of this RT FIFO + TCP/IP implementation, I need to read the whole shared variable buffer at once each time the desktop application reads the shared variable.
    But according to this quotation from [1]:
    "With buffering, you can account for temporary fluctuations between read/write rates of a variable. Readers that occasionally read a variable slower than the writer can miss some updates."
    this seems to be impossible. Am I right?
    Are there any suggestions for circumventing this problem? Or are shared variables not made for sharing data between unsynchronized processes without data loss?
    Regards, Till

    You can use the error cluster to detect the end of your queue. The error output of the shared variable returns a -2220 warning if it reads a value it has already read before. I attached a modified example, saved for LabVIEW 8.2; I hope you can open it.
    Attachments:
    readbuffered82.zip ‏51 KB
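    The pattern in that example, reading the buffer until the -2220 warning signals there is no new data, is not LabVIEW-specific. A rough Python analog with a bounded queue shows why a slow reader that drains the whole buffer on each cycle misses nothing (the sizes and rates are made up):

    import queue
    import threading
    import time

    buf = queue.Queue(maxsize=50)    # stands in for the buffered shared variable

    def writer():
        for i in range(200):         # fast producer, like the RT loop
            buf.put(i)
            time.sleep(0.001)

    threading.Thread(target=writer, daemon=True).start()

    received = []
    for _ in range(20):              # slow consumer, like the desktop loop
        time.sleep(0.02)
        while True:                  # drain everything buffered since last cycle
            try:
                received.append(buf.get_nowait())
            except queue.Empty:      # analogous to the -2220 "no new value" warning
                break

    print(f"received {len(received)} of 200 updates without loss")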

  • The iPod module for data transfer

    I recently enabled the iPod's "disk use" option so I could use the iPod for data transfer. I went into My Computer, located the iPod, and tried to copy its service module onto my desktop, but it said the module is being used by another program, although it isn't at all. And when I drag data such as my spreadsheets onto the iPod, it doesn't appear under the iPod in iTunes. Why is that? What do I have to do to confirm that the data has been received on the iPod? Thanks, Mr Roberts

    Hi,
    To retrieve a data file from the presentation server (i.e. upload it from the PC):

    " The file to read from the PC; GUI_UPLOAD expects a STRING filename.
    DATA: i_file TYPE string VALUE '/usr/sap/tmp/file.txt'.

    " One row per line of the file, up to 500 characters wide.
    DATA: BEGIN OF it_datatab OCCURS 0,
            row(500) TYPE c,
          END OF it_datatab.

    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename        = i_file
        filetype        = 'ASC'
      TABLES
        data_tab        = it_datatab
      EXCEPTIONS
        file_open_error = 1
        OTHERS          = 2.
    IF sy-subrc <> 0.
      WRITE: / 'Upload failed, sy-subrc =', sy-subrc.
    ENDIF.

    Regards,
    Ameet

  • HT5287 If DVD movies, audio CDs, and even burning CDs and DVDs are not supported, what is the point of DVD & CD Sharing? Wouldn't it just be better to remote in or use a thumb drive if it can only be used for data transfer?

    If DVD movies, audio CDs, and even burning CDs and DVDs are not supported, what is the point of DVD & CD Sharing? Wouldn't it just be better to remote in or use a thumb drive if it can only be used for data transfer?
    Or am I missing the bigger picture?

    As long as you have a SuperDrive or an external burner/drive, burning, watching, installing, etc., from CD or DVD will work just fine. And so will sharing.

  • Using Nokia 1200 CA45 connection for data transfer

    I have a USB CA45 connector but am unable to connect my 1200 to a PC for data transfer. Is this simply because this entry-level phone is not able to connect to a PC? If so, why does it have a CA45 connector installed?
    Any help or advice appreciated
    ottoman62

    Regarding my problem above, I followed the process below, which is described in the user guide:
    "To use a cable connection, a USB data cable driver must be installed on your PC. You can use the Data transfer functionality without installing the USB data cable driver. Select Menu > Connectivity > USB cable. You can connect your device to a compatible PC using a USB data cable.
    Connect the USB data cable to the base of the device. To change the device type that you normally connect with the data cable, press the joystick. [...]
    Data transfer - Access data such as audio or photo files on your computer and transfer them from it over a data cable connection. To use Data transfer mode, make sure the USB connection type is not selected in the Manage connections settings of Nokia PC Suite. Insert a memory card in your device, connect it to a compatible computer with the USB data cable and, when the device asks which mode to use, select Data transfer. In this mode your device acts as a mass storage device and appears on your computer as a removable hard disk."
    The USB cable used is a genuine Nokia one.

  • HT3275 I am using an external drive for backup. I am now getting a message that the drive needs a disk repair, but when I run disk repair it just seems to freeze.

    I am using an external drive for backup. Yesterday the backup failed, and I received a message that the external drive (a Passport drive) needed a "disk repair". When I ran disk repair, nothing happened; it seemed to freeze. Suggestions? Should I simply erase the external drive and start over? My computer itself is fine.

    Get a different drive for backup.
    In the meantime, once you are safely backed up to some other drive, you can attempt to repair the backup drive by repairing its directory.

  • JPA - Best Practice For Data Transfer?

    I've been considering an alternative method for data transfer between applications: using JPA entities serialized or encoded to file (either binary or XML).
    I know this procedure may have several drawbacks compared to traditional exported SQL queries or data manipulation statements; however, I want to know if anyone has considered or used this process for data transfer.
    The process would be to:
    - query the database and load the JPA entities
    - serialize or encode them to file
    - zip up the entire folder with the JPA entities
    - transfer the data to the destination machine
    - extract the data to a temp directory
    - reload the JPA entities by deserializing and persisting them to the database
    The reason I'm considering this process is basically that I have a desktop application (managing member records, names, dates, contributions, etc.) used by different organisations in different locations (which are not related except by purpose, i.e. clubs or churches), and I would like a simple way of transporting all data associated with a single profile (information about a member of the organisation) from one location to another; i.e., users interact only with the application, without the need for any database management tool.
    I'm also considering this because it is not easy to generate an SQL script file without using a dedicated database management tool, which I do not want the application users to have to learn.
    I would appreciate ANY suggestions and probable alternative solutions for this problem. FYI: I'm using a Java DB database.
    ICE

    jschell wrote:
    "In summary you are moving data from one database to another."
    True.
    "You only discussed flow one way."
    Well, the process is meant to be bi-directional. Basically, what I envision would be something like:
    - the user goes to File -> Export Profile...
    - then selects one or more profiles to export
    - the export process completes and the zip archive is created and transferred (copied, mailed, etc.) to the destination location
    then, on the destination PC:
    - the user goes to File -> Import Profile
    - selects the profile package
    - the app extracts, processes, and imports the data (JPA-serialized, for example)
    "Flow both ways is significantly more complicated in general."
    Well, if done well, it shouldn't be.
    "And what does this have to do with users creating anything?"
    Well, as shown above, the user would be generating the zip archive (assuming that is the final format).
    Does this make the problem clearer?
    ICE
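    Not JPA itself, but the round trip described in this thread is easy to see in miniature. A Python sketch with pickle standing in for Java serialization and a zip archive as the hypothetical profile package (all names are illustrative):

    import pickle
    import zipfile

    def export_profiles(profiles: list, package: str) -> None:
        """Serialize each profile record and bundle them into one archive."""
        with zipfile.ZipFile(package, "w", zipfile.ZIP_DEFLATED) as zf:
            for p in profiles:
                zf.writestr(f"profile_{p['id']}.pkl", pickle.dumps(p))

    def import_profiles(package: str) -> list:
        """Extract and deserialize every record; persisting is up to the app.

        Note: only unpickle archives from trusted sources.
        """
        with zipfile.ZipFile(package) as zf:
            return [pickle.loads(zf.read(name)) for name in zf.namelist()]

    # Round trip: export on one machine, copy or mail the file, import on the other.
    export_profiles([{"id": 1, "name": "A. Member", "contributions": [50, 75]}],
                    "profiles.zip")
    print(import_profiles("profiles.zip"))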

  • How many disk groups for +DATA?

    Hi All,
    Does Oracle recommend having one big, shared ASM disk group for all of the databases?
    In our case we are going to have 11.2 and 10g RAC databases running against 11.2 ASM.
    Am I correct in saying that I have to set ASM's compatibility attribute to 10 in order to be able to use the same disk group?
    Is this a good idea? Or should I create another disk group for the 10g DBs?
    I'm assuming there are features that will not be available when the compatibility is reduced to 10g...

    Oviwan wrote:
    What kind of storage system do you have? NAS? What is the protocol between server and storage? TCP/IP (=> NFS)? FC? ...
    If you have a storage array with several disks, you mostly create more than one LUN (RAID 0, 1, 5, or whatever). If the requirement is a 1 TB disk group, I would not create one 1 TB LUN; I would create 5 x 200 GB LUNs, for example, just so you can extend the disk group with a LUN of the same size. If it is one 1 TB LUN, you have to add another 1 TB LUN; if there are 5 x 200 GB LUNs, you can simply add 200 GB.
    I have nowhere found a document that says "exactly 16 LUNs per disk group is best"; it depends on the OS, the storage, etc.
    So if you create a 50 GB disk group, I would create just one 50 GB LUN, for example.
    HTH
    Yes, it's NAS, connected using iSCSI. It has 5 disks of 1 TB each, configured with RAID 5. I found the requirements below for ASM; they indicate four LUNs as the minimum per disk group, but they don't clarify whether that is for external redundancy or for the ASM redundancy types.
    •A minimum of four LUNs (Oracle ASM disks) of equal size and performance is recommended for each disk group.
    •Ensure that all Oracle ASM disks in a disk group have similar storage performance and availability characteristics. In storage configurations with mixed speed drives, such as 10K and 15K RPM, I/O performance is constrained by the slowest speed drive.
    •Oracle ASM data distribution policy is capacity-based. Ensure that Oracle ASM disks in a disk group have the same capacity to maintain balance.
    •Maximize the number of disks in a disk group for maximum data distribution and higher I/O bandwidth.
    •Create LUNs using the outside half of disk drives for higher performance. If possible, use small disks with the highest RPM.
    •Create large LUNs to reduce LUN management overhead.
    •Minimize I/O contention between ASM disks and other applications by dedicating disks to ASM disk groups for those disks that are not shared with other applications.
    •Choose a hardware RAID stripe size that is a power of 2 and less than or equal to the size of the ASM allocation unit.
    •Avoid using a Logical Volume Manager (LVM) because an LVM would be redundant. However, there are situations where certain multipathing or third party cluster solutions require an LVM. In these situations, use the LVM to represent a single LUN without striping or mirroring to minimize the performance impact.
    •For Linux, when possible, use the Oracle ASMLIB feature to address device naming and permission persistency.
    ASMLIB provides an alternative interface for the ASM-enabled kernel to discover and access block devices. ASMLIB provides storage and operating system vendors the opportunity to supply extended storage-related features. These features provide benefits such as improved performance and greater data integrity.
    One more question, about fdisk partitioning: is it correct that we should create only one partition per LUN (5 x 200 GB LUNs in my case)? Is it because this way I will have a more consistent set of LUNs (in terms of performance)?
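    On the compatibility question in the original post: a 10g database can mount an 11.2 disk group only if the group's compatible.rdbms attribute is at or below the 10g release, and compatibility attributes can be raised later but never lowered. The attribute itself is plain SQL on the ASM instance; a sketch via the cx_Oracle driver, where the credentials, DSN, and disk group name are all assumptions:

    import cx_Oracle

    # Connect to the ASM instance itself with SYSASM privileges.
    conn = cx_Oracle.connect("sys", "password", "asmhost/+ASM",
                             mode=cx_Oracle.SYSASM)
    cur = conn.cursor()

    # Let 10g databases mount the disk group. This can be raised again once
    # the last 10g client is gone, but it can never be lowered.
    cur.execute("ALTER DISKGROUP data SET ATTRIBUTE "
                "'compatible.rdbms' = '10.1'")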

  • Best Solid State Hard Drive for ThinkPad T430s

    Hi all,
    I was thinking of replacing the 500 GB drive in my two-year-old T430s with a solid-state drive. Prices have come down.
    I was thinking of buying this new SanDisk, which has about the best reviews of any SSD:
    http://www.amazon.com/SanDisk-Extreme-2-5-Inch-Warranty-SDSSDXPS-240G-G25/dp/B00KHRYRNM
    It doesn't seem very hard to install an SSD, and I'd do a fresh install of Windows. But I'm curious whether I'd run into any bottlenecks on the T430s with such a high-end SSD. I have 8 GB of RAM.
    I was also considering this best seller:
    http://www.amazon.com/Samsung-Electronics-2-5-Inch-Internal-MZ-7TE250BW/dp/B00E3W1726/ref=sr_1_1?s=p...
    But I'd prefer to shell out 30 bucks more for the top-of-the-line one with a better lifespan, etc.
    Any thoughts appreciated.
    Thanks.

    SSDs provide the most improvement when reading lots of small files, where a spinning disk has to move its heads and wait for the platter to reach the right sector. For most people, booting and shutdown are the tasks helped most. If you don't need a WWAN card, I suggest you consider using a smaller mSATA card as the boot drive (I use a 256 GB one) and keeping your 500 GB drive for data. mSATA drives are available up to 1 TB.

  • My external hard drive is 'seen' by my iMac, and I can go into the Finder and open files and folders. I am using the hard drive for Time Machine backup. However, Time Machine says it can't find the drive. The same thing has happened with Final Cut Express.

    My new LaCie external hard drive is 'seen' by my iMac, and I can go into the Finder and open files and folders. I am using the hard drive for Time Machine backup. However, Time Machine says it can't find the drive.
    The same thing happened recently between Final Cut Express and my other LaCie external hard drive, used as the scratch disk. That time it fixed itself.
    I've run out of ideas. Help would be very much appreciated. Thanks.

    Have you done any searches on FCP X and Time Machine? Is there a known issue with using a TM drive with FCP X? Dunno, but... wait... I'll take 60 seconds for you, 'cause I'm just that kind of guy... Google "fcpx time machine problem". First-page link:
    http://www.premiumbeat.com/blog/fcpx-bug-best-practices-for-using-external-hard-drives-and-final-cut-pro-x/
    "You cannot have time machine backups on your hard drive if you intend to use it in FCPX."
    booya!

  • Going from xp to windows 7 64 bit using an external drive for music storage

    Greetings Techies!
    I hope you all are well.
    I need to move my iTunes from an XP machine to a Windows 7 64-bit machine. On the XP machine I am using an external drive for music storage, and the iTunes database is stored on the internal drive. I am looking for advice on how to transfer all of this to my new Windows 7 64-bit machine. I know iTunes can be picky about everything, and I don't want to take any chances and lose my playlists or music.
    Any help would be much appreciated.
    Blessings,
    paul

    You are right about it being picky. I just upgraded from Vista to Windows 7 and was forced to format my drive. I had all the music backed up, but when I imported it I got two or three copies of some songs, artwork went missing, and I lost my playlists. I hope someone responds to your question, because I could use similar help! Best of luck to ya...
