Upload time faster on Wintel than Solaris

I posted a similar question before but have not resolved the issue so...
I set up WLS 6.1 on a Wintel 500 MHz laptop
and WLS 6.1 on an E220R running Solaris 8.
My client browser is IE 5.5 running on Win2K.
I tested the time it takes to upload an EAR file using the console.
When I upload a 2.5 MB EAR to the Wintel box it takes about 5 seconds to upload and install;
on the Solaris box it takes about 35 seconds, and most of this time is spent uploading the file.
Why is it so much slower to upload to Solaris?
Any help on this will be greatly appreciated.
Jim Durnford [email protected]
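(Worked from the figures above: 2.5 MB in about 5 seconds is roughly 4 Mbit/s, while the same file in about 35 seconds is under 0.6 Mbit/s, so the Solaris upload path is running nearly an order of magnitude slower than the Windows one, and both are well below LAN speed.)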

Thanks for the tips, Varjak.
I never use so-called 'download accelerators'; there's nothing like more useless software on the system.
My anti-virus programs and settings are such that they don't affect my downloads. Anything that I download from other sites (Google, Netflix, etc.) comes in at similar speeds regardless of whether I'm on the PC, Mac, tablet, or whatever.
It's not really an issue; I was just curious if anyone else has compared their PC vs. Mac and noticed the same thing I experience.
This happens across all other OSes, not just Windows. The Mac isn't faster anywhere else, only in Apple's own iTunes. I just find it funny, and it wouldn't surprise me at all if their system 'sniffs out' Macs and gives them the fast lane for downloading from iTunes!

Similar Messages

  • Playback speed in Sample Editor window many, many times faster than track (at correct speed) in arrange area. How do I sync Sample Editor playback speed to correct speed/tempo in arrange area? Track is spoken word.

    Sample Editor playback sounds like Alvin on a meth binge. The spoken phrase is generated by Textspeech, which can export files as WAV or MP3. Perhaps a clue: when the exported Textspeech WAV file is dragged and dropped into a track in the arrange area of a new project, it exhibits the same supersonic speed. When the Textspeech file is exported as an MP3 and dragged and dropped into an arrange-area track, it plays at the correct speed.

    Thanks Erik,
    If nothing else, this huge list of updates and fixes shows clearly that the Logic dev team is working hard on fixing and improving LPX, and the list of fixes done shows that they do read the bug reports submitted!
    As an aside...
    I recall how all the 'naysayers' prior to LPX (and in some cases, since) were proclaiming that Logic was dead, the team was being disbanded, we wouldn't see any further development, the dev team doesn't listen or care, and so on. I wonder where those people are now?

  • Mencoder H.264 20 times faster than Compressor 2

    I tested mencoder against Compressor running on 5 G5s. The H.264 implementation in mencoder was four times faster than the 5 dual-core quads clustered with Compressor 2 and Qmaster. My single computer alone, with just a dual 2 GHz processor, encoded a movie 20 times faster than Compressor with Qmaster on this same machine.
    Compressor costs more than mencoder (free in DVision). To get Compressor you have to buy an expensive Pro app.
    What's wrong with this picture?

    Well, I have heard this lament before with the G5s, and all I can say is that I guess Apple is slowly starting to drop support for the PowerPC generation (it was inevitable). I assume you've upgraded to 3.0.1?
    As for Motion 3 (and someone correct me here if I am wrong), I believe it's slower because of the full 3D integration. Whether or not you have a lot of 3D aspects, I think it still calculates for it, causing your response and render times to increase.

  • Web Report - ABAP Vs JAVA engine - ABAP 10 times faster than JAVA

    Guys,
    I want to share what we found in our project and see if any of you have insights
    into our findings.We are on NW2004S SP14 and we are moving to SP15 in a couple of weeks.We created query, developed WAD for it and executing the WAD takes for this query takes 22 secs (Vs 2 secs using ABAP) the query output has 1 million records and most of the actions we take from that point on like right click on account takes 20 secs (Vs 0 secs/instant using ABAP) , drilldown to level 4 of account hierarchy takes 58 secs (Vs 5 secs using ABAP), drilldown on cost center level 6 takes 42 secs (Vs 4 secs using ABAP), , right click on cost center takes 32 secs (Vs 3 secs using ABAP), ..etc.
    Basically every action we take in the JAVA report takes an average of  28 secs.There are 9 aggregates built on the cube that are barely hit by this query but the same query performing same actions with same selections hit the aggregates many many times.The questions I have is why is ABAP so fast compared to JAVA ? What is true explanation behind this behavior ? What are the dis-advantages by using ABAP engine ? Users are loving the performance and features of ABAP while they weren't really on board with the original JAVA report (as it was slow). ABAP is sure enough 10 times faster than JAVA. Query/Query Properties are exactly the same in ABAP and JAVA.Please explain.
    Cheers
    RT

    Hi All,
    Thanks to all of you for your responses. I appreciate your time in going through my questions and coming forward to express your views.
    However, I was looking for more specific, factual answers. My question is: "What does a client miss if they opt to install only ABAP-based BI 7.0, as against Java-based BI 7.0?"
    thanks again.
    Naga

  • "tp import all" runs 24 times faster than "tp import TR "  for long queue?

    After a test upgrade to ECC 6.0, I processed 1200 individual transports in 12 hours. When I rebuilt the queue of 1200 transports and processed "tp import all", it completed in 30 minutes.
    Should I expect "tp import all" to process 24 times faster than individual imports?
    800 transports were client independent (code) and 400 transports were client-specific (configuration).
    What run times have you seen?
    What problems with "tp import all" have you seen?
    Best regards,
    Gary Sherwood

    Hi Gary
    You don't know what the 800 transports waiting for import might do to your system in an "import all", so that's why I would prefer that you import individual requests instead of using "import all".
    Of course "import all" is faster than individual imports, because it prepares all the steps once and then starts the import.
    Regards
    Anwer Waseem
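    As a quick check of the 24x figure from the original post: 1200 transports in 12 hours averages about 36 seconds per transport, while the same 1200-transport queue finishing in 30 minutes averages about 1.5 seconds per transport, which is indeed a factor of roughly 24.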

  • [Solved] ADF BC - PL/SQL block 6 times faster than CreateRow/InsertRow

    THE ENVIRONMENT:
    - JDeveloper 10.1.2.0.0
    - Oracle 9.2.0.8.0
    THE PROBLEM:
    I found one of my pages in an ADF BC web application to be quite slow. After some profiling, I found a significant bottleneck in a loop which inserts approximately 270 rows into a table (only inserts, using insertRow). Would there be an improvement if we rewrote the transaction in PL/SQL? To find out, I came up with the following simple test case:
    1) Table
    create table TES_FSOLICFONDO (
    ID_SOLICITUD NUMBER not null,
    A_PAIS VARCHAR2(2),
    C_IDEMPRESA NUMBER,
    A_CODMONEDA VARCHAR2(3),
    C_LOCAL NUMBER,
    D_FECSOLICIT DATE,
    C_IDBILLMONE NUMBER not null,
    M_MONTO NUMBER,
    A_CREADOPOR VARCHAR2(30),
    D_CREACION DATE,
    A_MODIFPOR VARCHAR2(30),
    D_MODIF DATE);
    alter table TES_FSOLICFONDO
    add primary key (ID_SOLICITUD);
    create index IND_TES_FSOLICFONDO_01 on TES_FSOLICFONDO (C_IDBILLMONE, C_LOCAL, D_FECSOLICIT);
    create index IND_TES_FSOLICFONDO_02 on TES_FSOLICFONDO (C_IDEMPRESA, A_CODMONEDA, C_LOCAL, D_FECSOLICIT,C_IDBILLMONE);
    2) Sequence
    create sequence SEQ_TES_FSOLICFONDO
    minvalue 1
    maxvalue 999999999999999999999999999
    start with 1
    increment by 1
    cache 2;
    3) ADF BC
    - Create a Business Components Project with a BC package
    - Create new Entity Object based on the previous table, with standard wizard settings (this will be named TesFsolicfondo)
    - Create new View Object linked to previous Entity Object, with standard wizard settings (this will be named TesFsolicfondoVO).
    - Create an Application Module named AMTESAsignaSuperAvance, and include the previous View Object in the Data Model
    4) The test case
    - Create a test() method in the AMTESAsignaSuperAvanceImpl class as follows:
    public void test() throws Exception {
        TesFsolicfondoVOImpl solVO = getTesFsolicfondoVO1();
        TesFsolicfondoVORowImpl r = null;
        for (int i = 0; i < 270; i++) {
            r = (TesFsolicfondoVORowImpl) solVO.createRow();
            Sequence seq = new Sequence("SEQ_TES_FSOLICFONDO", this);
            r.setIdSolicitud(new Number(seq.getData().toString()));
            r.setAPais("PE");
            r.setCIdempresa(new Number(637292));
            r.setACodmoneda("USD");
            r.setCLocal(new Number(388));
            r.setDFecsolicit(new Date("2006-04-07"));
            r.setCIdbillmone(new Number(116));
            r.setMMonto(new Number(0));
            r.setACreadopor("10000003480");
            r.setDCreacion(new Date("2006-04-03"));
            solVO.insertRow(r);
        }
        getTransaction().commit();
    }
    - Create a test.jsp page including the following code within a scriptlet:
    AMTESAsignaSuperAvanceImpl am = null;
    try {
        am = (AMTESAsignaSuperAvanceImpl) Configuration.createRootApplicationModule("cl.falabella.fpi.tes.bc.AMTESAsignaSuperAvance", "AMTESAsignaSuperAvanceLocal");
        am.test();
    } catch (Exception e) {
        am.getTransaction().rollback();
    } finally {
        Configuration.releaseRootApplicationModule(am, true);
    }
    - Create an equivalent PL/SQL block as follows:
    BEGIN
    FOR I IN 1..270
    LOOP
    INSERT INTO TES_FSOLICFONDO (ID_SOLICITUD, A_PAIS, C_IDEMPRESA, A_CODMONEDA, C_LOCAL,
    D_FECSOLICIT, C_IDBILLMONE, M_MONTO, A_CREADOPOR, D_CREACION)
    VALUES (SEQ_TES_FSOLICFONDO.NEXTVAL, 'PE', 637292, 'USD', 388, TO_DATE('07-04-2006', 'DD-MM-YYYY'), 116, 0,
    '10000003480', TO_DATE('03-04-2006', 'DD-MM-YYYY'));
    END LOOP;
    COMMIT;
    END;
    THE RESULTS:
    The PL/SQL block was executed in a standard SQL*Plus window, and the test() method within the AM was called from the test.jsp page.
    The PL/SQL block takes 1125 milliseconds in my setup, while the test() method within the AM takes 6017 milliseconds, so PL/SQL is about 6 times faster in this test.
    THE QUESTION:
    Is it supposed to be this way? I want to avoid creating a PL/SQL package; I would rather use the ADF BC framework throughout the application. Would someone kindly point out any configuration issues I may have missed that would improve the BC performance? (A plain-JDBC batching sketch is included after this thread for comparison.)

    Hello Jan,
    Thanks for the tip, good to be born again as a name instead of a number.
    I omitted the code used for timing, but here goes:
    test.jsp:
    java.util.Date d1 = new java.util.Date();
    AMTESAsignaSuperAvanceImpl am = null;
    try {
        am = (AMTESAsignaSuperAvanceImpl) Configuration.createRootApplicationModule("cl.falabella.fpi.tes.bc.AMTESAsignaSuperAvance", "AMTESAsignaSuperAvanceLocal");
        am.test();
    } catch (Exception e) {
        am.getTransaction().rollback();
    } finally {
        Configuration.releaseRootApplicationModule(am, true);
    }
    java.util.Date d2 = new java.util.Date();
    System.out.println("Time (ms) = " + Double.toString(d2.getTime() - d1.getTime()));
    PL/SQL:
    I use a tool called "PL/SQL Developer". It has a "Command Window" with a user interface that is extremely similar to SQL*Plus. After running the PL/SQL code, it reported "PL/SQL procedure successfully completed in 1.125 seconds".
    From the tool documentation:
    The Command Window
    The Command Window allows you to execute SQL scripts in a way that is very much similar to Oracle's SQL*Plus. To create a Command Window press the New button on the toolbar or select the New item in the File menu. A Command Window is created and you can type SQL and SQL*Plus commands like you are used to, without leaving PL/SQL Developer’s IDE.
    Regards
    Rodrigo
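    A note on the numbers above, added for comparison: the ADF BC loop issues 270 single-row INSERTs, each a separate database round trip, while the PL/SQL block runs entirely inside the database. The sketch below is illustrative only, not the poster's code and not the ADF BC fix itself; it shows the same 270 inserts done through plain JDBC statement batching, which is the kind of behaviour an entity object's update-batching tuning option (if your JDeveloper release exposes it) is meant to give you. The class name and connection details are hypothetical.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    public class BatchInsertSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical connect string and credentials; substitute your own.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger")) {
                conn.setAutoCommit(false);
                // Same INSERT as the PL/SQL block above; the sequence supplies the key.
                String sql = "INSERT INTO TES_FSOLICFONDO (ID_SOLICITUD, A_PAIS, C_IDEMPRESA, A_CODMONEDA, "
                           + "C_LOCAL, D_FECSOLICIT, C_IDBILLMONE, M_MONTO, A_CREADOPOR, D_CREACION) "
                           + "VALUES (SEQ_TES_FSOLICFONDO.NEXTVAL, 'PE', 637292, 'USD', 388, "
                           + "TO_DATE('07-04-2006','DD-MM-YYYY'), 116, 0, '10000003480', "
                           + "TO_DATE('03-04-2006','DD-MM-YYYY'))";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    for (int i = 0; i < 270; i++) {
                        ps.addBatch();     // queue one row; no round trip yet
                    }
                    ps.executeBatch();     // send the whole batch in one (or very few) round trips
                }
                conn.commit();
            }
        }
    }
    Even with batching, some per-row framework overhead remains, so a gap versus in-database PL/SQL is still expected; batching mainly removes the per-row network round trips.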

  • Can I reduce a Template image upload time to Azure RemoteApp ?

    I want to reduce the template image upload time to Azure RemoteApp from 2 hours to 1 hour.
    Today (Jan 14, 2015) the upload time was over 2 hours.
    Environment
      Region : Japan West.
      VHD size: 43 GB
      Upload Tool: Upload-AzureRemoteAppTemplateImage.ps1
    Result
      Elapsed time for the operation = 00:32:38
      Elapsed time for upload = 01:48:29
      Traffic rate = 58 Mbps
    Regards,
    Yoshihiro Kawabata

    Hi Yoshihiro,
    1. You can reduce the hash calculation time by storing the VHD on faster hard drive(s) and using a faster CPU. For a VHD of that size you should be able to get the calculation time down to less than 10 minutes.
    2. You can reduce the upload time by using a faster Internet connection that is as close as possible to the Azure Japan West facility, with low latency and no packet loss. Under excellent conditions (high-bandwidth link, low latency, no loss) you should be able to get several hundred Mbps of throughput.
    -TP
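    A quick check with the figures above: 43 GB is roughly 43 × 8 × 1024 ≈ 350,000 megabits, and at the reported 58 Mbps that is about 6,000 seconds, or roughly 1 hour 40 minutes, which matches the 01:48:29 elapsed upload time. To fit the transfer itself into one hour you would need a sustained rate of roughly 100 Mbps, on top of trimming the ~33-minute hash/operation step described above.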

  • Can I upload transaction data first rather than master data, and what will happen?

    Hi,
    Can I upload transaction data first, rather than master data? What happens in BW regarding performance issues like indexes?
    Let me know.
    Regards,
    Murali

    Hi Murali,
    You can load the transaction data before the master data. The main impact will be that the system has to generate SIDs and populate the SID tables of the master data objects while loading the data into a data target (a cube, or an ODS that is activated for BEx reporting). This basically means that your data loads will take some more time. Additionally, if you need the master data attributes somewhere in your update or transfer rules, you will not get the correct information.
    So basically it is always a good approach to load and activate the master data before loading the transactional data. But anyway, especially in the development environment, I usually don't load all the necessary master data.
    regards
    Siggi
    PS: navigational attributes of the data targets will not be available until you have loaded the master data and scheduled the attribute and hierarchy change run.

  • Upload time events using PA61?

    Hello friends,
    I have written code to upload time events using BDC for transaction PA61.
    But when I try to upload data, in the LIST ENTRY part there's a field ORIGF that automatically gets filled with 'M' (indicating a manual entry).
    The requirement is that I need to change the content of ORIGF to SPACE.
    I tried updating table TEVEN but there's no change.
    Can anybody please suggest what I must do?
    Thanks in advance.
    Sanghamitra.

    Hello friends,
    Yeah, that's right.
    It's always a good gesture to write down how we solved it.
    I faced problems in two areas:
    1) uploading more than three rows in the time events part of transaction PA61;
    2) updating the ORIGF field of TEVEN.
    Solution:
    1) I set a counter while looping through the time events and also set the DEFSIZE field of the CTU_PARAMS options structure.
    2) I observed the records getting uploaded in foreground mode, so I could not notice the difference in the contents of the ORIGF field, but when I ran the same upload in the background I could.
    I hope the reason is understood.
    Regards,
    Sanghamitra.

  • File upload time

    Hi,
    We are experiencing performance problems uploading files using multipart form
    POST requests to a WebLogic Server 6.1 SP1 from IE 5.5 on Win 2000 SP2.
    The upload time seems to increase exponentially with the file size, with a
    90 MB file (video) taking as long as half an hour.
    We have done a very simple test which reads the bytes coming in without
    parsing or processing of any kind.
    If anybody knows of a solution to this, or at least what the problem is, it
    would be greatly appreciated.
    Jim [email protected]

    I'm wondering where the bottleneck is, client or server. Try doing the same
    from Mozilla on Windows and from Mozilla on Solaris to try to prove that the
    problem is the server (i.e. if Mozilla on two OSs has the same problem, then
    it's probably an issue on the server end).
    Peace,
    Cameron Purdy
    Tangosol, Inc.
    Clustering Weblogic? You're either using Coherence, or you should be!
    Download a Tangosol Coherence eval today at http://www.tangosol.com/
    "Jim Durnford" <[email protected]> wrote in message
    news:[email protected]...
    > In my haste I left out critically important information. Sorry.
    > The server is running on Solaris and the client is running on Window2000
    > IE5.5.
    >
    > "Kevin L. Burns" <[email protected]> wrote in message
    > news:[email protected]...
    > > This is curious. I wrote my own multipart processing code and have not
    > > experienced this. How are you reading off of the stream? I would
    > > assume you're just looping through the read or readLine functions and
    > > passing the data into byte arrays. I actually do some processing while
    > > I'm reading in the data and have not had any issues. At the same time,
    > > I haven't done much with files of quite that size. You might try doing
    > > some investigation to peripheral issues.
    > >
    > > -Kevin
    > >
    > > Jim Durnford wrote:
    > > > Hi,
    > > > We are experiencing performance problems uploading file using multipart form
    > > > post requests to a Weblogic Server V6.1 SP1 from IE 5.5 on Win 2000 SP2.
    > > > The upload time seems to increase exponentially with the file size with a
    > > > 90Mb file (video) taking as long as 1/2 hour.
    > > > We have done a very simple test which reads the bytes coming in without
    > > > parsing or processing of any kind.
    > > > If anybody knows of a solution to this or at least what the problem is it
    > > > would be greatly appreciated.
    > > > Jim [email protected]

  • AccessHW is 5 times slower under XP than 2000

    We are using the Port and Memory Utilities for Windows
    (http://sine.ni.com/apps/we/niepd_web_display.display_epd4?p_guid=B45EACE3DEBD56A4E034080020E74861)
    to get data from a legacy/custom piece of hardware that is memory
    mapped on the ISA bus at 0xE0000.
    Specifically, we are using ReadMemory16.vi from AccessHW.zip to read 2-byte data samples
    in a LabVIEW 7.1 application under Windows XP.
    On a clean, dual boot (Windows 2000 and XP) system, the ReadMemory16.vi is
    about 5.5 times faster when called under Windows 2000 than under Windows XP.
    (100000 calls = 2.3 sec vs. 12.9 sec)
    I see 2 possible reasons for this: the CVI code under the hood in AccessHW
    is not optimized for XP, or XP is not optimized for memory-mapped devices on
    the ISA bus.
    Anyone have any ideas?
    test program attached
    mlewis
    Attachments:
    test speed 2.vi (16 KB)

    thanks for your response...
    Indeed, we are using the native LabVIEW 7.1 Outport.vi to control
    a 3rd-party PIO card. Unfortunately, the traditional memory-mapping space
    for ISA devices was 0xD0000-0xEFFFF, which cannot be accessed using 16-bit addressing.
    That's why I am using ReadMemory16 to do the 32-bit access.
    Since most legacy ISA buses are an expansion of the PCI bus, I'm wondering if XP handles
    the crazy memory mapping differently. In essence, if no PCI card responds to a given address,
    then the PCI bus expansion gets it, the so-called "if no one claims it, it must be yours" logic.
    mlewis
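    For scale, from the figures in the original post: 100,000 calls in 2.3 seconds is about 23 microseconds per 16-bit read under Windows 2000, versus roughly 129 microseconds per read under Windows XP.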

  • Why is my batch process four times faster on battery vs. plugged in?

    Hi-
    I have an illustrator file with dynamic text / variables.
    I load an xml file with the dynamic data and batch save as copy the variations.
    I am using Ai CS6 on Windows 7 Dell laptop with 16gb RAM.
    When I batch save as copy and my laptop is on battery power, the batch is 4 times faster than when the laptop is plugged in.
    For those familiar with batch, when I am on battery power, no dialog pops up showing the "save as copy" left-to-right progress bar. When I plug in, the "save as copy" dialog does show up, and the batch slows down dramat i c  a  l  l  y.
    To further the confusion, when my laptop is in a docking station, it's about 25% slower than on battery alone, but three times faster than when plugged into the power brick.
    I thought maybe it was the power brick, but I used another power brick - same thing. Even tried different outlets! Same behavior.
    I would think the batch would go faster plugged in. But it absolutely is faster on battery. Inside the same batch, if I start the batch on battery, it will go smoking fast, and if I plug in the power source, it slows down to a crawl, and then it will rocket back to full speed when I unplug the power cord!
    Any advice / solution would be appreciated. Obviously super fast on battery can only last as long as the battery.
    Thanks!
    -Michael

    I have no way of knowing what the performance controls are on your laptop. Since the performance issue has to do with your laptop's power source, my suggestion was to just try some things to see if it makes a difference. Try EVERYTHING. Maybe including calling your laptop maker's tech support and asking them to explain the performance difference.

  • HT202299 Is there a way to automatically upload videos that last more than 5 minutes to iCloud?

    I saw that iCloud has a limitation of 5 minutes on videos.
    Is there a way to automatically upload videos that last more than 5 minutes from iPhoto/Photos?
    I'm trying to find a way to store my photo library in the cloud AND keep access to each photo/video from the net (not just storing the .iphotolibrary in the iCloud folder), but I can't do it for videos of more than 5 min...

    I believe you are probably referring to iCloud photo sharing. If you wait a while, the next version of Yosemite will include the new Photos application which will sync video through the iCloud photo library feature.

  • HT4623 I have updated my iPhone 3 but am unable to start it. It takes too much time on authentication and then a message appears that authentication failed

    I have updated my iPhone 3 but am unable to start it. It takes too much time on authentication and then a message appears that authentication failed.

    I don't know whether it's jailbroken or otherwise hacked.
    It was working properly before I updated the OS through iTunes. After the update, this message appears:
    Authentication failed, please try after few minutes
    Please help

  • Upload time data using BDC

    Hi HR gurus,
    Please help.
    We have 3 shifts: 6-2, 2-10, and 10-6.
    While uploading time data to SAP from a third-party system,
    do we need to take care of the shift times in the BDC,
    or do we need to manage it in configuration?
    If you have any code, please send it to me.
    Regards

    Hi,
    As said above, that is also the procedure for doing the BDC recording. Hope you understand the procedure in detail; otherwise have a look at this:
    1. Go to transaction SHDB for recording.
    2. There, give the recording a name starting with the letter Z or Y.
    3. Click the "New Recording" button in the application toolbar.
    4. You will get a new pop-up window, Create Recording; there give the recording name (starting with Z or Y) and the transaction code that you are going to record, e.g. PA30 or XK01.
    5. Leave the Mode as it is and click "Start Recording".
    6. You will then get the transaction itself; e.g. if you entered PA30 it will call the corresponding transaction.
    7. Start recording by entering the values which you want to record.
    8. After recording, save the recording and go back.
    9. There you will find the list of recordings that have been processed.
    10. Enter your recording name and press Enter; you will get the program name.
    11. Select the recording and press the "Program" button in the application toolbar. It will ask about the program name; give "read from file".
    12. You will get the generated code for the recording in SE38.
    13. Edit your program to upload the flat file from your legacy system to the application server by using CALL FUNCTION "GUI_UPLOAD".
    14. Get the flat file into an internal table and pass the internal table values by looping at the internal table after the READ DATASET statement.
    15. Use call transaction for a small number of records; for a large number of records we have to use the session method. (Refer to other links for the difference between the session and call transaction methods.)
    16. By executing the program we will get output showing which records were created.
    Thanks,
    Sakthi.C
    Reward points if useful.
