XML Large Data Transfer

I am trying to retrieve about 3000 rows from a table, convert them to XML using the Oracle-provided classes (oracle.xml.sql), and display the result in IE 5.0/Netscape 4.5, but I get a timeout (the same thing works with 10 rows coming back) with Oracle Application Server 4.0.8 on Solaris 2.6. I am trying to figure out where it is timing out.
Thanks for your help in advance.
Rajan

I assume that you've been successful in running this query from SQL*Plus, and that it runs in a reasonable amount of time. If not, you may need to tune your query or split it to make it run faster.
Also, take a look at your servlet log and verify that you are not overrunning the StringBuffer. If you are, make sure that you have configured the XSQL servlet properly: there's a parameter you can set so that it does not buffer the output. I can't remember what it's called, but it's in the docs.
Hope this helps,
Mark
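If the servlet configuration doesn't solve it, another approach is to fetch and emit the XML in chunks instead of building one 3000-row document in memory. Here is a minimal Java sketch using the XDK's oracle.xml.sql.query.OracleXMLQuery; the connect string and table name are placeholders, and the keepObjectOpen/getNumRowsProcessed pagination pattern should be checked against the XSU docs for your XDK release.

import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import oracle.xml.sql.query.OracleXMLQuery;

public class PagedXmlQuery {
    public static void main(String[] args) throws Exception {
        // Placeholder connect string and credentials.
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
        PrintWriter out = new PrintWriter(System.out);

        OracleXMLQuery qry = new OracleXMLQuery(conn, "SELECT * FROM big_table");
        qry.keepObjectOpen(true);  // keep the cursor open across getXMLString() calls
        qry.setMaxRows(500);       // emit at most 500 rows per call

        String chunk = qry.getXMLString();
        while (qry.getNumRowsProcessed() > 0) {
            out.print(chunk);      // write each chunk out instead of buffering everything
            out.flush();
            chunk = qry.getXMLString();
        }
        qry.close();
        conn.close();
    }
}

In a servlet you would write each chunk to the response writer and flush it, so the browser starts receiving data well before the query finishes and the application server timeout never fires.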

Similar Messages

  • INPUT: KT4/V vs CRC error in large data transfer/CD burning HERE!

    This issue can be solved with a BIOS update. KT4V & KT4 Ultra users who are having this problem can request the TEST BIOS to test on their systems. You may either PM/email me or Bas, or get it at ftp://ftp.heppen.be/MSI/
    Please report back whether the test BIOS really fixes the problem, causes any new problems, or brings any performance hit.
    ** this sounds like a Christmas Gift to KT4V users AND New Year Gift to KT4 Ultra users!!!  :D  **
    To all KT4 Ultra and KT4V users: whether or not you have data corruption or CRC errors in large data transfers and CD burning, your input is needed.
    Please list your system specs in as much detail as possible. Below is a guideline; you may copy it (CTRL-C) and fill in your specs in your post.
    1. System specs:
    CPU:
    Motherboard:
    RAM Slot-1: (exact brand and model)
    RAM Slot-2:
    RAM Slot-3:
    display card:  [no overclock]
    IDE-1M:  (exact HDD brand and model pls)
    IDE-1S:
    IDE-2M:
    IDE-2S:
    IDE-3:
    SER-1:
    SER-2:
    PCI-1:
    PCI-2:
    PCI-3:
    PCI-4:
    PCI-5:
    PCI-6:
    PSU: (brand, model, total power, (estimated) combined power)
    BIOS revision:
    Operating System:
    VIA 4-in-1 drivers : (if you installed it, tell us the version)
    other drivers, services or applications that might affect data transfer, such as: PCI Latency patch, WPCREDIT modifications, VCool, CoolerXP...
    2. CRC ERROR?
    PASS or FAIL
    If PASS, let us know your BIOS settings.
    If FAIL, proceed as below:
    3. Please use these BIOS settings:
    1. Load BIOS Setup Default
    2. NO OVERCLOCK ON FSB! Set it according to your CPU
    3. Set RAM to
    a) SPD; if that fails, try
    b) User defined with the slowest RAM timings, i.e. 266, 2.5, 3, 6, 3, disable interleave, 4, disable 1T, normal
    If PASS, go for more extreme BIOS settings as you usually use:
    1. High Performance Default
    2. set RAM to the extreme timings
    3. DO NOT overclock yet until both 1. and 2. are PASS
    4. Try these suggestions:
    1. Microsoft IDE drivers (uninstall VIA 4-in-1's)
    2. VIA 4-in-1 different version's IDE filter driver?
    3. VIA IDE Miniport driver?
    4. use IDE-3 RAID channel for one HDD data transfer
    5. same HDD transfer, ie C:\dir1\*.* -> C:\dir2\*.*
    6. burn CD at 1x speed
    7. Set the HDD and/or CD to PIO mode, or slower UDMA mode.
    8. If and only if you know how to update the BIOS correctly and are willing to take some risk, try the BETA BIOS too: KT4 (1.25), KT4V (1.64).
    Please report back your tests and experiments with these suggestions.
    If you have a workaround for this issue other than setting the FSB back to 100MHz, please tell us too.
    Thanks for your inputs!

    My system, just got it 2 days ago
    CPU: Athlon XP 2000+
    Motherboard: MSI KT4V (MS-6712)
    RAM Slot-1:
    RAM Slot-2: 512 Mb Kingston DDR 333 CAS 2.5
    RAM Slot-3:
    display card: Abit Siluro GF3 Ti200
    IDE-1M: Western Digital WD800JB (8 Mb Cache) - 80 GB
    IDE-1S: Seagate U-Series ST360020A - 60 GB
    IDE-2M: Sony DVD-ROM 16x (DDU1621)
    IDE-2S: Creative CDRW121032
    IDE-3:
    SER-1:
    SER-2:
    PCI-1:
    PCI-2:
    PCI-3: Accton 1207F 10/100 Fast Ethernet Card
    PCI-4: SBLIVE 5.1 Platinum with Live!Drive II
    PCI-5:
    PCI-6:
    PSU: 400Watts (Generic)
    BIOS revision: 1.6
    Operating System: Windows 2000 with SP3
    VIA 4-in-1 drivers : Hyperion 4.45, only AGP and INF installed. IDE drivers are standard Win2k/SP3 ones.
    2. CRC ERROR?
    FAIL.
    When I got my system, I tried installing Windows ME, as I wanted to dual-boot together with Win2K. When I tried to install the NVIDIA Detonator drivers (ver 30.82), the install proceeded normally and asked for a reboot, which I did; then it just hung before the start of Windows ME. I rebooted, selected "Normal" when Windows ME detected a failed startup, and was then able to enter Windows ME, but it reported that the NVIDIA Detonator drivers were invalid and of the wrong type for my display card.
    Later I tried to move my files from C:\ to D:\ and it reported that the destination file was invalid.
    When I changed my OS to Win2k/SP3 (no more dual-boot) and installed the same NVIDIA Detonator driver version, it worked. When I started to copy files again, it later BSODed and said PAGE_FAULT_ERROR (something like this). When installing from CDs, it would report that my .CAB files were corrupt or that there was insufficient swap file space (I set mine manually at 1.2Gb). Then there were times, during reboots into Win2K, when I found my keyboard and mouse (all PS/2) not working while Windows loaded as usual.
    Later I changed my PCI latency setting from 32 to 96, and I managed to install from CD without much further issue.
    Upon reading the posts here, I hadn't realized that the MSI KT4 series or the KT400 chipset had so many issues! I have read countless sites like ExtremeTech and AnandTech, and none reported the particular errors I encountered during my first 2 days with this setup. (This is my first Athlon setup; I was previously an Intel person.)
    So far, my conclusions for my settings are:
    - 32-bit access in the BIOS settings for CD/DVD/CD-RW must be disabled; I concur with Shumway's recommendations.
    - DMA settings in Windows 2000 for CD/DVD/CD-RW must be set to PIO mode, otherwise copying from CD to HDD will produce read errors.
    - Installing the PCI latency fix really does wonders for my setup (PCI Latency fix ver 1.9). Now I can copy files between all my drives without worrying so much about CRC errors. Thank you, George E. Breese.
    - I really want to know why I can't install the NVIDIA Detonator drivers in WinME, while in Win2K I can.
    I'll post again once I have done some more tests on my system, especially CD-R writes.
    Angel17

  • ADO's Use with XML for Data Transfer

    Does anyone know if ADO is supported as a data transfer method
    between XML and Oracle8i and back (from Oracle8i to XML)? If
    not, are any other Microsoft drivers supported for this type of
    operation?

    Hi there,
    I have a similar question for you: I have an ASP page (coded in VBScript) that connects to an Oracle 9i database via ODBC and accesses data using ADO.
    Using this method I can read and write to the database successfully for all data types except XMLType.
    It is my understanding that a column of XMLType is fundamentally treated as a CLOB column underneath the covers. Then why can I write data to a CLOB column but not to an XMLType column using ADO methods?
    Here is the code I am using:
    'put my dom object into a variable
    Dim xmlDoc
    xmlDoc = objDom.xml
    ' Connection string
    Dim cnxn
    Set cnxn = Server.CreateObject("ADODB.Connection")
    cnxn.Open "dsn=sc3;uid=user;pwd=password;"
    ' Recordset object
    Dim rs
    Set rs = Server.CreateObject("ADODB.Recordset")
    rs.Open "Select * FROM xmltabletest", cnxn, 2, 2
    'write xml document to column of xml type
    rs.AddNew
    rs("xmlcol") = xmldoc
    rs.Update
    ' clean up
    rs.Close
    cnxn.Close
    Set rs = Nothing
    Set cnxn = Nothing
    Any help with this would be great. I'd also like to write the xml document to a table of xml type using the same/similar method.
    Can anyone help me please?!?
    Thanks in advance,
    Ari
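    A workaround often suggested for this, shown here as a self-contained Java/JDBC sketch (the idea carries over to a parameterized ADODB.Command in ASP): bind the document as plain character data and let the SQL XMLType() constructor do the conversion on the server. The connect string is a placeholder; the table and column names are the ones from the post above.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class XmlTypeInsert {
        public static void main(String[] args) throws Exception {
            // Placeholder connect string and credentials.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");
            String xmlDoc = "<doc><value>hello</value></doc>";

            // The bind variable is ordinary character data; the server-side
            // XMLType() constructor turns it into an XMLType instance.
            PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO xmltabletest (xmlcol) VALUES (XMLType(?))");
            ps.setString(1, xmlDoc);
            ps.executeUpdate();
            ps.close();
            conn.commit();
            conn.close();
        }
    }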

  • Large data transfer hangs timecapsule

    While transferring many small files, my Time Capsule hangs after a short while. I can still use the device from my Mac, but from Windows the device no longer exists. Even web traffic doesn't work.
    I'm using the latest Time Capsule firmware, 7.6.1, and the Time Capsule is a model bought in 2011. I had hoped that the 7.6.1 firmware would solve the problem, as 7.6 also had this issue.

    Click here and fill out the form.
    If desired, you can use the ditto command in the Terminal instead of the Finder to copy files.

  • Data Transfer Workbench - XML File

    Hi,
    I intend to use the Task Scheduler to import data into the SAP Business One system. For this, I created a batch file with this content:
    cd "C:\Program Files (x86)\SAP\Data Transfer Workbench"
    DTW -S C:\Temp\DTW_ScheduleItemImport.xml
    Also, using Data Transfer Workbench I exported the data to be imported into the system to an XML file. The content of the XML file is:
    <DTW>
      <BOM>
        <BO>
          <AdmInfo>
            <Object>13</Object>
            <Version>2</Version>
          </AdmInfo>
          <Documents>
            <row>
              <CardCode>C1001</CardCode>
              <DocDate>20050223</DocDate>
              <DocDueDate>20050223</DocDueDate>
              <Series>1000</Series>
              <TaxDate>20050223</TaxDate>
            </row>
          </Documents>
          <Document_Lines>
            <row>
              <ItemCode>A1008</ItemCode>
              <Price>8</Price>
              <Quantity>2</Quantity>
              <WarehouseCode>01</WarehouseCode>
            </row>
          </Document_Lines>
        </BO>
      </BOM>
    </DTW>
    When I schedule a task in Task Scheduler, the task runs, but the data are not imported.
    I believe the import fails because the XML file does not carry the database (BD) information certified in Data Transfer Workbench.
    Anyone help me?
    Thanks.
    Sandra Pereira

    Hello Sandra,
    I trust you.
    >The objective of this process is to schedule importing data into the system using the mapping in XML, right?
    Yes, and all the login information, file information, etc. The filename should be the same each time you import the data.
    Maybe you are using an older version of DTW than this file format expects. You may try to upgrade to the latest DTW, or at least the B1 2007 SP 01 version (I have DTW 2005.0.30).
    my xml file looks like
    <Transfer>
      <Logon>
        <UserName>manager</UserName>
        <Password>tCmKcJvLfKqJ</Password>
        <Company>FRMARPSAPHU</Company>
        <Server>(local)\sqlexpress</Server>
        <UserAuthentication>True</UserAuthentication>
        <Language></Language>
        <LicenseServer></LicenseServer>
        <ChooseDB>False</ChooseDB>
        <DBType>4</DBType>
        <DBUser></DBUser>
        <SybasePort></SybasePort>
        <DBPassword></DBPassword>
      </Logon>
      <ObjectCode>oItems</ObjectCode>
      <FileExtractor>
        <Extorlogin>
          <ExID></ExID>
          <ExDSN></ExDSN>
        </Extorlogin>
        <FilesTypes>2</FilesTypes>
        <Files>
          <Items>C:\SBOProjects\Bio-RAD2010\Service Implementation\items.txt</Items>
        </Files>
      </FileExtractor>
    etc
    Regards
    János

  • Data Transfer Workbench Slow on Large Imports

    I am using version 6.5.11 of the workbench, API version 6.50.53
    I have a problem with the speed of imports where the file sizes become large. e.g. I am importing sales invoices using a header file and line items file. Both are tab-delimited text files. If I import 2000 invoices, it takes around 1.8 seconds per invoice. If I import 10000 invoices, it takes around 18 seconds per invoice (50 hours). I actually wanted to import around 30000 invoices, but that was taking a minute per invoice.
    Another problem is that the first data line after the header line in every import file is ignored and not processed.
    Does anyone know if these have been fixed or will be fixed soon?

    To import a large number of items or invoices with the Data Transfer Workbench, I always split the file into several files of 1000 records each. That's the only way to get a fast process.
    Apparently this problem is corrected with 2004, which is much faster. I haven't tested the DTW with 2004 yet, as I just received it.
    As for the bug you have with the first line, it occurred to me one time too, but not anymore, and I can't work out why it did that. Maybe it would be more efficient for you to create a message on the marketplace.
    Vincent
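    Vincent's splitting trick is easy to automate. Below is a rough Java sketch (the file names are hypothetical) that cuts a tab-delimited DTW import file into chunks of 1000 records, repeating the column-header line in each chunk. For invoices, make sure the header file and the line-items file are split consistently, so the lines of one invoice never end up in a different chunk than its header.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.PrintWriter;

    public class SplitImportFile {
        public static void main(String[] args) throws Exception {
            final int CHUNK = 1000;                 // records per output file
            BufferedReader in = new BufferedReader(new FileReader("invoices.txt"));
            String header = in.readLine();          // DTW column-name line

            String line;
            int row = 0, part = 0;
            PrintWriter out = null;
            while ((line = in.readLine()) != null) {
                if (row % CHUNK == 0) {             // start a new chunk file
                    if (out != null) out.close();
                    out = new PrintWriter("invoices_part" + (++part) + ".txt");
                    out.println(header);            // repeat the header in every chunk
                }
                out.println(line);
                row++;
            }
            if (out != null) out.close();
            in.close();
        }
    }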

  • Data Transfer Between 8i's OO4O, ODBC, or OLE DB Drivers & XML

    I have researched that Oracle has 3 database drivers for
    accessing Windows NT data - they are Oracle 8i's: OO4O, ODBC
    Driver, and OLE DB drivers. Do these drivers allow data
    transfer between XML and Oracle 8i?

    Donnell Brockenbrough (guest) wrote:
    : I have researched that Oracle has 3 database drivers for
    : accessing Windows NT data - they are Oracle 8i's: OO4O, ODBC
    : Driver, and OLE DB drivers. Do these drivers allow data
    : transfer between XML and Oracle 8i?
    Here are the details I have:
    1. XML Support in OO4O:
    The next major release of OO4O will allow query result sets to
    be rendered as XML documents. Query result sets can also be
    accessed, and data can be inserted or updated via a COM/DCOM
    implementation of DOM interfaces. XML support in OO4O will cover
    both Relational as well as Object-Relational databases. OO4O
    will also include a COM/DCOM implementation of the Oracle XML
    Parser.
    2. XML support in ODBC or OLE DB
    Since the ODBC and OLE DB specifications are controlled by
    Microsoft, they will be the vendor to contact regarding XML
    support for these interfaces.
    Oracle XML Team
    http://technet.oracle.com
    Oracle Technology Network

  • Java.lang.OutOfMemoryError during transfer of large data from SAP to PI

    Hi experts,
    We are trying to transfer a large amount of data from SAP to an external system via PI, but the transfer got stuck in SM58 of the SAP system with the error 'java.lang.OutOfMemoryError'. We have tested this before, and we can only get approximately 100K records to go through successfully. We need to transfer approximately 300K records per run. The current max heap size (in MB) of our PI system is 512 at the message server and bootstrap tab. At the server's general tab, the max heap size (in MB) is 2048 and its Java parameters have the values below:
    -Xmx1024m
    -Xms1024m
    -Djco.jarm=1
    -XX:PermSize=512m
    -XX:MaxPermSize=512m
    -XX:NewSize=320m
    -XX:MaxNewSize=320m
    What need to be increased to solve the error?
    Thanks.
    Regards,
    Thava

    Hi Thava,
    We can set a limit on the request body message length that can be accepted by the HTTP Provider Service on the Java dispatcher. The system controls this limit by inspecting the Content-Length header of the request or monitoring the chunked request body (in case chunked encoding is applied to the message). If the value of the Content-Length header exceeds the maximum request body length, then the HTTP Provider Service will reject the request with a 413 "Request Entity Too Large" error response. You can limit the length of the request body using the MaxRequestContentLength property of the HTTP Provider Service running on the Java dispatcher. By default, the maximum permitted value is 131072 KB (or 128 MB). You can configure the MaxRequestContentLength property using the Visual Administrator tool. Proceed as follows:
           1.      Go to the Properties tab of the HTTP Provider Service running on the dispatcher.
           2.      Choose MaxRequestContentLength property and enter a value in the Value field. The length is specified in KB.
           3.      Choose Update to add it to the list of properties.
           4.      To apply these changes, choose Save Properties.
    The value of the parameter MaxRequestContentLength has to be set to a high value.
    In short parameters to reset values for ABAP side are
    icm/HTTP/max_request_size_KB
    icm/server_port_ TIMEOUT
    rdisp/max_wprun_time
    ztta/max_memreq_MB
    Parameter to reset values for JAVA side is MaxRequestContentLength.
    You can use following link to know more about ICM parameters
    http://help.sap.com/saphelp_nw04/helpdata/EN/61/f5183a3bef2669e10000000a114084/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/EN/25/b0a4f6f2b03a43a727a165a4d6a555/frameset.htm
    regards
    Anupam

  • XML Solutions for Large Data Sets

    Hi,
    I'm working with a large data set (9 million records comprising 36 gigabytes) and am exploring the use of XML with it.
    I've experimented with a JDBC app (taken straight from Steve Muench's excellent Oracle XML Applications) for writing to CLOBs, but achieve throughputs of much less than 40k/s (the minimum speed required to process the data in < 10 days).
    What kind of throughputs are possible loading XML records from CLOBs into multiple tables (using server-side Java apps)?
    Could anyone comment whether XML is a feasible possibility for this size data set?
    Regards,
    Mike

    Just would like to identify myself (I'm the submitter):
    Michael Driscoll <[email protected]>.
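    For what it's worth, the two changes that usually move this kind of load rate the most are committing in batches instead of per row and using JDBC statement batching. A minimal sketch, assuming a staging table xml_staging and placeholder connect details (neither is from the original post):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BatchedClobLoad {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
            conn.setAutoCommit(false);              // commit per batch, not per row

            PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO xml_staging (id, doc) VALUES (?, ?)");
            for (int i = 0; i < 10000; i++) {
                String xml = "<record id='" + i + "'>...</record>"; // stand-in payload
                ps.setInt(1, i);
                ps.setString(2, xml);   // use setCharacterStream for very large documents
                ps.addBatch();
                if (i % 500 == 499) {               // flush every 500 rows
                    ps.executeBatch();
                    conn.commit();
                }
            }
            ps.executeBatch();                      // flush the remainder
            conn.commit();
            ps.close();
            conn.close();
        }
    }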

  • Large data usage transfer every 6 hours while on a WiFi network, who is doing this transfer? Is Verizon polling my phone causing this usage?

    I have three iPhone 5s. All are using my WiFi while at home. One iPhone is sending data over Verizon's network, not WiFi, every 6 hours or so. The time of this data transfer changes every few days. I see data usage numbers on my Verizon account ranging from a few kilobytes to almost a gigabyte. Is this due to Verizon polling the iPhone for data usage and billing purposes on a regular basis? And what do these billing numbers mean? Where does Verizon get these numbers? Any running apps, while on my WiFi, should use the WiFi, not Verizon, correct?

    I'm having the same issue, and this month these data pulls added up to 1891.51 MB!!!
    Model: iPhone 5S 64GB. Today I received a notification from Verizon that I had exceeded my data plan. I just went through my entire usage detail report on my phone and separated out everything that appears to be part of a 6-hour pattern. When I'm at home, and most of the time while I'm at work, I am connected to wifi. These data pulls often happen in the middle of the night, when I know for a fact that my phone is charging and connected to wifi at home. Below are the results of the pattern analysis. Keep in mind that I removed all lines that, as far as I could tell, were not part of a 6-hour pattern.
    THE PROOF: You sent me three alerts immediately after pulling data from my phone.
    The first alert said that I had used 75% of my data plan on 8/11/2014 at 9:45pm while I was at home on wifi.  There was a data pull logged at 9:45pm which was part of the 6-hour pattern above.
    The second alert stated that I had used 90% of my data at 10:46pm on 8/14/2014; the data log shows a pull just before the alert at 10:45pm.
    The third alert telling me I had used all of my data came today while my phone was sitting next to me, connected to wifi at the office.  The alert arrived at 12:48pm, the log shows a data pull at 12:48pm.
    Screenshots:
    How about an explanation Verizon?

  • Best way to transfer XML based data

    All,
    I am looking for the best possible way of transferring XML based data from a server to a web based client.
    The XML file is generated on the server and its size is massive. I am looking at all possible alternatives and, ideally, the best way.
    Could it be AJAX, SOAP, parsing through a servlet, creating value objects?
    I appreciate all inputs.
    Thanks,
    S.

    You can make a bootable copy of your system now, but you won't be able to boot from it with the new computer. They are not usually backwards compatible with the OS. But when you first boot into the new system, use SetUp Assistant to migrate the data from the cloned copy. That will work just fine.
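    On S.'s actual question: for a massive server-generated XML file, the usual advice is to stream the document to the client and compress it on the way out, rather than assembling it in memory or wrapping it in a SOAP envelope. A minimal servlet sketch; the row loop stands in for whatever produces the XML, and a production version would first check the request's Accept-Encoding header:

    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.util.zip.GZIPOutputStream;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class XmlStreamServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/xml");
            // Assumes the client sent "Accept-Encoding: gzip".
            resp.setHeader("Content-Encoding", "gzip");
            Writer out = new OutputStreamWriter(
                    new GZIPOutputStream(resp.getOutputStream()), "UTF-8");
            out.write("<rows>");
            for (int i = 0; i < 100000; i++) {      // stand-in for a result-set loop
                out.write("<row id='" + i + "'/>");
            }
            out.write("</rows>");
            out.close();                            // finishes the gzip stream
        }
    }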

  • How to do Bulk data transfer using Web Service

    In my application I have to write various web services, but the majority of them have to query a database and return bulk data (rows > 10K) through the web service.
    So I would like to ask: what is the efficient way of transferring bulk data using a web service? Presently I am returning the dataset as an XML string (built with a StringBuilder) from the web service and consuming it at the client end.
    Is there a better way to do this in a web service?
    My env:
    Front end can be in any other technology ,UI like C#
    Back end : Tomcat 6 on Java 6 with Axis2
    Thanks in advance

    Innova wrote:
    But then I also have to mention a return type, although it's XML that is getting transferred.
    Can you provide me with a small sample code? Like, if I have an Emp object with properties Empname, EmpID, DOJ, DOB, Dpt, in this case what will the return type be?
    My major concern is that my result set is going to be huge, in terms of >10,000 rows, with that much time for the formation of the XML and then the transfer. How can I reduce the transfer time? I mean a faster, optimised approach for transferring large data as part of a web service.
    Simply put: how can I transfer bulk data in minimum time, so that I can reduce the waiting time at the client side for the response?
    Thanks in advance
    I've done this with thousands before now, and not had a performance problem... at least nothing worth writing home about.
    Your return type will be an array of Emp objects. You can use a SOAP array for that (you'll need to look that up for the syntax, since I can't remember it off the top of my head), which gets restricted to being an array of Emp.
    Should the normal technique prove to be too slow then you should look at alternatives, but I would expect this to work well enough. And you have no faffing about with devising a new transfer system that then has to be explained to your client system...it'll all be standard.
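    To make the reply concrete, a bean-plus-array service could look like the sketch below. It uses JAX-WS annotations for brevity; under Axis2 the same class would be exposed through a services.xml deployment descriptor instead, but the returned Emp[] maps to a SOAP array either way. Everything except the field names mentioned in the thread is made up.

    import java.util.ArrayList;
    import java.util.List;
    import javax.jws.WebService;

    // Plain bean with the fields named in the thread; a real service
    // would populate these from the database result set.
    class Emp {
        public String empName;
        public String empID;
        public String doj;   // dates kept as strings for brevity
        public String dob;
        public String dpt;
    }

    @WebService
    public class EmpService {
        // Serialized by the web-service stack as an array of Emp elements.
        public Emp[] getEmployees(int maxRows) {
            List<Emp> result = new ArrayList<Emp>();
            for (int i = 0; i < maxRows; i++) {     // stand-in for the DB query
                Emp e = new Emp();
                e.empID = String.valueOf(i);
                e.empName = "Employee " + i;
                result.add(e);
            }
            return result.toArray(new Emp[result.size()]);
        }
    }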

  • HCM Best Practices version 1.500 (HCM Data Transfer Toolbox)

    Hi,
    Does anyone have the HCM Data Transfer Toolbox (BPHRDTT, HR-DTT, HRDTT)? It seems to be discontinued and has been replaced by LSMW, which does not provide anything even close to this tool!!!!
    I wish this would have been posted in the Downloads on SDN.
    Thanks
    Check out the following link to see what I mean:
    http://help.sap.com/bp_hcmv1500/HRBP_DTT_V1_500/index.htm
    Edited by: Andreas Mau on Jan 30, 2009 8:42 AM

    Hi Uwe,
    Thanks for giving it a try, but my question is very specific, for the reason that SAP has discontinued a perfectly good tool and replaced it with some do-over.
    LSMW - is not always suitable, and for large loads the performance is out of whack. You always have to start from scratch creating the stuff.
    PU12 - The Interface Toolbox is nice, but it could offer something more like XML or MHTML formatting so you can actually reuse the data without pain. Also, for payroll extraction of retro-results for only deltas, this is not useful either.
    HR-DTT - Was a tool that had all you really needed and was working just fine. It had everything, not just infotypes, but also the T558B/C/E load, although I prefer T558D to T558C. Here SAP is totally lacking support. The 2 BAPIs for BUS1023 do not support everything: one is only for outsourcing, the other only for B/D tables. In the code, the E-table update is commented out, since they forgot to actually add the corresponding structure for the load as an importing parameter...
    SXDA - A rather unknown Data Transfer Workbench which also uses projects to organize the data transfer and hooks up to LSMW as well, but again that is where the tools end. There are no example projects for any area that are of any use. SCM/SRM has some minor examples, but nothing that could be called exhaustive.
    The Link: Thanks to SAP support, this link has been discontinued so that no one can ask questions any more on this issue. You may still find the docu on the Service Marketplace in 50076489.ZIP for HRBP_USA_V1500.
    Edited by: Andreas Mau on Apr 23, 2009 10:36 PM
    Edited by: Andreas Mau on Apr 24, 2009 1:39 AM

  • Data transfer b/w SAP to Java using IDOC and Interface SAP Jco

    Dear Experts,
    The challenging requirement we have is: we need to create an interface for data transfer between an SAP system and a Java system. The data will be transferred from SAP to Java, and similarly, once some processing is done in Java, the details need to be transferred from Java back to SAP.
    For this data transfer we are planning to use the IDoc process, and for the interface we plan to use the SAP Java Connector (version 3.0.5). As per our understanding, on the Java side a program needs to be written that connects to SAP as a "registered program". This registered program will appear in the SAP gateway automatically, and using tRFC over the TCP/IP connection both the SAP and Java systems will be connected.
    In this case we have some doubts:
    1. The data from SAP is going to be transferred from one custom transaction (Z tcode). An "outbound IDoc" will be triggered and will carry the details. Now the doubt is: will the data/details be transferred to the Java system automatically, or do we need to perform any other steps in SAP ABAP coding (like converting to a flat file, XML file, etc.)?
    2. We are planning to install "SAP JCo" on the Java server. Is this correct?
    3. Other than SAP JCo, does any other software need to be installed or not?
    4. Since we are going to trigger the "outbound IDoc" from a custom transaction, we are planning to develop one function module in SE37. Other than this, do we need to develop any other program or not?
    5. Any sample Java program for SAP JCo version 3.0.5 to create the "registered program" with SAP (e.g. a SAP listener program)?
    If anybody has detailed steps or explanation please share it with us.
    Thanks in advance
    Warm Regards,
    VEL

    Hi All,
      For the above-mentioned issue, we implemented the JCo software on the Java system and created the Java program, including the SAP logon credential details like client, user name, password and language.
    When this Java program ran successfully, the non-SAP system appeared in the SAP gateway transaction.
    Once the non-SAP system starts appearing in the SAP gateway, it means both SAP and non-SAP systems are connected automatically.
    Regards,
    Velmurugan P
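    For question 5, a "registered program" in JCo 3 follows a standard pattern. The sketch below registers a handler for the test function module STFC_CONNECTION. "MYSERVER" refers to a MYSERVER.jcoServer properties file (gateway host, gateway service, program ID, repository destination) that you have to supply; an IDoc scenario would additionally use the JCo IDoc add-on on top of this skeleton.

    import com.sap.conn.jco.JCoFunction;
    import com.sap.conn.jco.server.DefaultServerHandlerFactory;
    import com.sap.conn.jco.server.JCoServer;
    import com.sap.conn.jco.server.JCoServerContext;
    import com.sap.conn.jco.server.JCoServerFactory;
    import com.sap.conn.jco.server.JCoServerFunctionHandler;

    public class RegisteredServerExample {

        // Called by JCo whenever SAP invokes STFC_CONNECTION on this destination.
        static class StfcHandler implements JCoServerFunctionHandler {
            public void handleRequest(JCoServerContext ctx, JCoFunction function) {
                String text = function.getImportParameterList().getString("REQUTEXT");
                function.getExportParameterList().setValue("ECHOTEXT", text);
                function.getExportParameterList().setValue("RESPTEXT", "Processed in Java");
            }
        }

        public static void main(String[] args) throws Exception {
            // Reads MYSERVER.jcoServer from the working directory.
            JCoServer server = JCoServerFactory.getServer("MYSERVER");

            DefaultServerHandlerFactory.FunctionHandlerFactory factory =
                    new DefaultServerHandlerFactory.FunctionHandlerFactory();
            factory.registerHandler("STFC_CONNECTION", new StfcHandler());
            server.setCallHandlerFactory(factory);

            server.start();   // the registered program now shows up in the SAP gateway
        }
    }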

  • Data transfer very slow on Mac Pro!! Mac H/W unable to meet specifications.

    Hi,
    I am installing internal disks in the quad Mac Pro and the transfer rates on the Mac are very slow!!!
    The disks are Barracuda 7200.10 disks, 500 GB, SATA II 300, model ST3500630AS, firmware 3.AAE, rated throughput 300MB/s (jumper removed to get the 300).
    Apple says the Quad Mac Pro (2x2.66GHz dual-core Intel Xeon) comes with a 3-Gigabit specification / 300MB/sec (Intel ESB2 AHCI, speed: 3.0 Gigabit, AHCI version 1.10 supported)...
    And the benchmarks of the ST3500630AS disk show a maximum of 80MB/sec and a minimum of 37.3MB/sec, average 63.1MB/sec (AnandTech tests).
    According to seagate.com the maximum sustained data transfer rate is 78MB/s,
    but on the Mac the reality is different: I am getting a transfer rate of 22MB/sec!!!!! Without any application running, and only copying files!!!!
    That means the Mac is providing only 1/3 of the expected transfer rate for the disks.
    So, is there a way for the Mac Pro to get the full normal transfer rate for those disks???
    Is the Quad Mac Pro hardware unable to get the maximum throughput of the disks???
    It seems the Mac is unable to reach even the average transfer rate of those disks... how can this be corrected???
    thanks a lot
    Alberto
    PS, just for fun: I am getting a faster transfer rate from an old PowerPC, an old PC, FireWire 400 and old IDE disks!!! (36MB/sec!!!)

    Yup.
    The latest Mac Pros are better, so if you have a new or 2-month-old unit, Seagate should work, though I think there should be later firmware.
    Some would say those drives take the cake, or worse.
    http://www.barefeats.com/hard91.html
    SEAGATE PUZZLES
    Firmware version 3.AEE or later solves the slow sustained large-block write speed issue for a single 7200.10 inside the Mac Pro.
    The remaining performance issue is slow small random read speeds for one, two, three or four drives. No matter how many drives you configure in RAID 0 sets, the average random read speed for combined block sizes from 64K to 1024K is less than 30MB/s (based on QuickBench 3 testing).
    Until Seagate fixes this, we can't recommend the 7200.10 series as the ideal boot drive for the Mac Pro (or Power Mac).
    http://www.barefeats.com/quad08.html
