Large amount of space going missing

I was just clearing some space on my iPod. I deleted around five songs, and when I tried to put new music on, it said there was no space.
So I have a few questions.
http://i138.photobucket.com/albums/q251/yurksemesh/ipodhelpbar.jpg
Should there be that much space in the "Other" section?
http://i138.photobucket.com/albums/q251/yurksemesh/itunescontrol.jpg
What is iTunesControl? Should it be taking up that much space? Is it safe to delete?
http://i138.photobucket.com/albums/q251/yurksemesh/artwork.jpg
Should artwork take up this much space?
Any help at all would be greatly appreciated
Thanks

I was just clearing some space on my iPod. I deleted around five songs, and when I tried to put new music on, it said there was no space.
Did you delete actual songs and not simply a playlist? Deleting a playlist doesn't free up space because it is only a link to songs and not the actual files.
Should there be that much space in the "Other" section?
The storage space taken up by "Other" can consist of notes, files of any kind you've added to the iPod whilst using it as a hard disk, or album artwork. It can also grow when the iPod database has become corrupt.
There will always be a few MB of used space under "Other", and yours seems perfectly normal.
What is iTunesControl? Should it be taking up that much space? Is it safe to delete?
The iTunes control file (iTunes Music Library.xml) contains artwork, volume & sound adjustments, ratings, play count info, etc. Not worth deleting.
Should artwork take up this much space?
Again, perfectly normal.

Similar Messages

  • What does "other" mean at the bottom of the sync page? It is taking up a large amount of space on my phone.

    What does "other" mean at the bottom of the sync page? It is taking up a large amount of space on my phone.

    The other section contains temporary files, such as text messages, iMessages, voicemail, and other storage that is meant for small, short-term use. If the other section of your phone has grown quite large and you can't account for what it could be, then you may have experienced a syncing error with iTunes.
    I have experienced this problem frequently during my years with the iPhone. The other portion of my phone grows with every sync of my phone to iTunes. There comes a point where it takes up the majority of the space on my iPhone, making the device nearly unusable.
    When it grows to this extent, it is usually caused by an error during the syncing process. Any information iTunes can't reconcile during the syncing process is saved out of sight, out of mind, in the 'other' section.
    You will never be able to see or recover this section unless you are very savvy and know how to poke around the archived iPhone backup on your computer. (If you are thinking of trying, stop. It will not be worth your time and effort, or, if using third-party software, the money or spyware associated with it *cough* iTools *cough*.)
    The only fix I have found that works is to back up your phone to iTunes and perform a 'restore'. This erases the content of your phone and restores it to factory settings.
    That is, your phone will be as if you took it fresh out of the box. You can perform a restore from backup through iTunes or iCloud to get your content back.
    Lo and behold, you have trimmed the mystery meat.

  • School iPad Syncing Taking Up Large Amounts of Space

    We have a computer syncing 50 iPads. After realizing that backups of each iPad were taking up around 80 GB of space on the hard drive, we deleted the "Device Backups" in the preferences of iTunes and gained all the space back. This is only a temporary fix, because it's much easier to sync new app downloads (which will just recreate the backups) than to download them by hand on each iPad.
    We've not upgraded them to iOS 5.
    Is there a way around this issue besides backing up the data to an external hard drive (not really solving the problem) or using iCloud (bandwidth issues)? I'd like to see a way of cloning iPads so they just use the same backup, or something to that effect, but I realize that might not be possible.

    Yes, though not advisable for content that you may not be able to redownload on demand. I'm not sure what Audible's policy is, but with iTunes there is only one download.
    You can either place the device in manually managed mode, in which case content you delete from the library won't be automatically removed from the device, or delete the files from Windows Explorer while keeping their entries in the iTunes library. With the second approach iTunes will warn that it cannot find the files to sync (you can disable the warning), but it won't remove them, and you retain the advantage of being able to automatically sync other content.
    tt2

  • [865PE/G Neo2 Series] Why do I have a big ol' amount of space reserved for an MFT on my SATA drive?

    The hard drive is on the Promise controller, and for some reason it has a large amount of space used for the MFT, while my system drive and my other storage drive, both located on the Intel SATA ports, do not have this space set aside.

    How did you figure that out? How large an amount is it?

  • I'm trying to import a large amount of pictures to iPhoto. I have plenty of space on my HD, but halfway through the transfer an error message comes up and says there is not enough disk space. How can I fix this?

    I'm trying to import a large amount of pictures to iPhoto. I have plenty of space on my HD, but halfway through the transfer an error message comes up and says there is not enough disk space. How can I fix this?

    1. I want to import 58.64 GB of photos from an external HD.
    2. I have 278.28 GB of space left on my HD.
    3. The exact error message goes like this:
    Insufficient Disk Space
    iPhoto cannot import your photos because
    there is not enough free space on the
    volume containing your iPhoto library
    I have researched this and most advice is to empty the iPhoto trash. I have already done this, and it did not help.
    Thank you for helping

  • Is there a way to put a large amount of music on your iPod without having to keep all the files in iTunes on your computer as well? I want to put my entire music collection (including CDs) on my iPod but don't want to take up the space on my computer.

    Is there a way to put a large amount of music on the iPod without having to keep all the files in iTunes as well? I want to use my iPod as an external drive and put all of my music on it without taking up the space on my computer. I also don't want to lose all my files every time I plug the iPod into my computer. Is this possible? Is there a way to avoid using iTunes and only use the iPod as an external drive?

    You cannot put music onto your iPod without using iTunes; that's what iTunes is for.
    It's also not a good idea to wipe the music from your computer and have it only on your iPod. We see countless posts here from people who have done just that - and then lost everything when the iPod needs a Restore. Even if you never need to Restore your iPod, what happens when you eventually replace the iPod? You'll be back here asking how to get the music from one iPod to another. That's not easy to do; we see countless posts about that too!
    A much better idea is to buy an external drive (a proper external drive, not simply an iPod) and put your large amount of music onto that drive. Then point your iTunes Library to that drive. However, you need to remember two things:
    You still need a backup of that Library.
    Using an external drive as your iTunes Library means that the drive must be connected and ready to read before starting your iTunes programme. If it isn't, then iTunes will look on the C: drive - and you will find no music in your Library. (Once again, lots of posts about that as well!)

  • HT1338 I keep running out of disk space. When I check, I have a large amount of stuff in yellow called Other. What is it and how do I get rid of it?

    I am getting a full disk message.
    When I check, I have a large amount of material described as Other, which is coded yellow on the information bar.

    "Other" simply means anything that isn't recognised as music, video, images, software (apps) or a backup. So, PDFs, .doc files, text files, disk images yadda yadda yadda.  All sorts of things.
    If you want to get an idea of what's taking up space on your drive, try GrandPerspective.

  • Default partitions taking up large amount of disk space

    While investigating in Disk Management after having my Yoga 11s laptop for a few months, I noticed a bunch of partitions that are taking up a large portion of space on my computer. This brought up a bunch of questions (related to the screenshot below):
    What are the EFI System Partition and OEM Partitions?
    Does the recovery partition work with that OneKey Recovery?
    What is OneKey actually supposed to back up? Both the core OS files and all of the personal files/data?
    Why are there two recovery partitions listed?
    If the other 4 partitions are needed/used, why are they all 100% free space?
    Should I be storing personal files on the D: primary partition or C: partition?
    The allocation of file space doesn't seem to line up with Lenovo's description here (which also mentions only 3 partitions, not 6): http://download.lenovo.com/express/HT062547.html
    I just purchased a 2TB external drive and planned to partition it into two 1TB partitions and use one for backup and one for storage. In this case, do I need such a large recovery partition, or any at all, on my computer?
    I had planned on using the Windows 8.1 native File History for backups on the external. OneKey seems to be more of a manual backup process, but I'm trying to figure out the best way to back up both my own personal files and the system files themselves. What's the best way to set that up?
    See screenshot:
    Thank you for the help!
    Kevin

    Thanks, I also found this article after some digging that explains the default partitions: http://www.lionhack.com/2013/12/25/lenovo-yoga-2-pro-partitions/
    What's the reasoning behind Lenovo creating the separate C and D drives?

  • My iMac running 10.7.5 crashes when copying and pasting large amounts of information, like a picture.

    My iMac running 10.7.5 crashes when I store a large amount of data, such as copying and pasting a picture. It also has started being painfully slow to open the first page in Safari. Here is some information a program gathered on my computer. -Thanks!
    Problem description:
    iMac 10.7.5.  Crashes when copy & paste large amounts and slow to first open first web page then back to normal speed
    EtreCheck version: 2.1.8 (121)
    Report generated April 13, 2015 3:05:27 PM EDT
    Download EtreCheck from http://etresoft.com/etrecheck
    Click the [Click for support] links for help with non-Apple products.
    Click the [Click for details] links for more information about that line.
    Hardware Information: ℹ️
        iMac (27-inch, Late 2009) (Technical Specifications)
        iMac - model: iMac10,1
        1 3.33 GHz Intel Core 2 Duo CPU: 2-core
        8 GB RAM
            BANK 0/DIMM0
                2 GB DDR3 1067 MHz ok
            BANK 1/DIMM0
                2 GB DDR3 1067 MHz ok
            BANK 0/DIMM1
                2 GB DDR3 1067 MHz ok
            BANK 1/DIMM1
                2 GB DDR3 1067 MHz ok
        Bluetooth: Old - Handoff/Airdrop2 not supported
        Wireless: Unknown
    Video Information: ℹ️
        ATI Radeon HD 4670 - VRAM: 256 MB
            iMac 2560 x 1440
    System Software: ℹ️
        Mac OS X 10.7.5 (11G63) - Time since boot: 14 days 7:47:18
    Disk Information: ℹ️
        ST31000528AS disk0 : (1 TB)
            disk0s1 (disk0s1) <not mounted> : 210 MB
            Macintosh HD (disk0s2) / : 999.35 GB (777.11 GB free)
            Recovery HD (disk0s3) <not mounted>  [Recovery]: 650 MB
        OPTIARC DVD RW AD-5680H 
    USB Information: ℹ️
        Sunplus Innovation Technology. USB to Serial-ATA bridge 1 TB
            disk2s1 (disk2s1) <not mounted> : 210 MB
            Time Machine Backups (disk2s2) /Volumes/Time Machine Backups : 999.86 GB (143.50 GB free)
        Apple Inc. Built-in iSight
        Apple Internal Memory Card Reader
        Apple Computer, Inc. IR Receiver
        HP Deskjet 9800
        Apple Inc. BRCM2046 Hub
            Apple Inc. Bluetooth USB Host Controller
    Kernel Extensions: ℹ️
            /Library/Extensions
        [loaded]    org.virtualbox.kext.VBoxDrv (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxNetAdp (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxNetFlt (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxUSB (4.2.18) [Click for support]
            /Library/Parallels/Parallels Service.app
        [loaded]    com.parallels.kext.prl_hid_hook (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_hypervisor (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_netbridge (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_vnic (7.0 15107.796624) [Click for support]
            /System/Library/Extensions
        [loaded]    com.parallels.kext.prl_usb_connect (7.0 15107.796624) [Click for support]
    Startup Items: ℹ️
        HP IO: Path: /Library/StartupItems/HP IO
        ParallelsTransporter: Path: /Library/StartupItems/ParallelsTransporter
        VirtualBox: Path: /Library/StartupItems/VirtualBox
        Startup items are obsolete in OS X Yosemite
    Launch Agents: ℹ️
        [running]    com.hp.devicemonitor.plist [Click for support]
        [loaded]    com.hp.help.tocgenerator.plist [Click for support]
        [loaded]    com.parallels.desktop.launch.plist [Click for support]
        [loaded]    com.parallels.DesktopControlAgent.plist [Click for support]
        [running]    com.parallels.vm.prl_pcproxy.plist [Click for support]
    Launch Daemons: ℹ️
        [loaded]    com.adobe.fpsaud.plist [Click for support]
        [loaded]    com.hikvision.iVMS-4200.plist [Click for support]
        [loaded]    com.microsoft.office.licensing.helper.plist [Click for support]
        [running]    com.parallels.desktop.launchdaemon.plist [Click for support]
    User Launch Agents: ℹ️
        [loaded]    com.adobe.ARM.[...].plist [Click for support]
        [running]    com.akamai.single-user-client.plist [Click for support]
        [loaded]    com.google.keystone.agent.plist [Click for support]
        [not loaded]    org.virtualbox.vboxwebsrv.plist [Click for support]
    User Login Items: ℹ️
        iTunesHelper    UNKNOWN  (missing value)
        GrowlHelperApp    Application  (/Users/[redacted]/Library/PreferencePanes/Growl.prefPane/Contents/Resources/GrowlHelperApp.app)
        Microsoft AU Daemon    Application  (/Applications/Microsoft AutoUpdate.app/Contents/MacOS/Microsoft AU Daemon.app)
        Air Mouse Server    UNKNOWN  (missing value)
        CrossOver CD Helper    UNKNOWN  (missing value)
        Pages    UNKNOWN  (missing value)
        Dropbox    Application  (/Applications/Dropbox.app)
        Wondershare Helper Compact    Application  (/Users/[redacted]/Library/Application Support/Helper/Wondershare Helper Compact.app)
        AdobeResourceSynchronizer    Application Hidden (/Applications/Adobe Reader.app/Contents/Support/AdobeResourceSynchronizer.app)
        HP Scheduler    Application  (/Library/Application Support/Hewlett-Packard/Software Update/HP Scheduler.app)
    Internet Plug-ins: ℹ️
        JavaAppletPlugin: Version: 14.9.0 - SDK 10.7 Check version
        FlashPlayer-10.6: Version: 16.0.0.305 - SDK 10.6 [Click for support]
        QuickTime Plugin: Version: 7.7.1
        AdobePDFViewerNPAPI: Version: 11.0.10 - SDK 10.6 [Click for support]
        Flash Player: Version: 16.0.0.305 - SDK 10.6 Outdated! Update
        AdobePDFViewer: Version: 11.0.10 - SDK 10.6 [Click for support]
        SharePointBrowserPlugin: Version: 14.4.8 - SDK 10.6 [Click for support]
        Google Earth Web Plug-in: Version: 6.0 [Click for support]
        Silverlight: Version: 4.0.60531.0 [Click for support]
        iPhotoPhotocast: Version: 7.0
    Safari Extensions: ℹ️
        1-ClickWeather
        Reload Button
    3rd Party Preference Panes: ℹ️
        Akamai NetSession Preferences  [Click for support]
        Flash Player  [Click for support]
        Growl  [Click for support]
        MacFUSE  [Click for support]
    Time Machine: ℹ️
        Time Machine not configured!
    Top Processes by CPU: ℹ️
             2%    WindowServer
             1%    prl_disp_service
             0%    fontd
             0%    AdobeReader
             0%    ODSAgent
    Top Processes by Memory: ℹ️
        120 MB    mds
        112 MB    AdobeReader
        94 MB    WindowServer
        94 MB    Finder
        69 MB    loginwindow
    Virtual Memory Information: ℹ️
        5.44 GB    Free RAM
        1.78 GB    Active RAM
        464 MB    Inactive RAM
        901 MB    Wired RAM
        1.64 GB    Page-ins
        0 B    Page-outs

    I believe that insufficient RAM may be the source of some of your problems. If you have somewhere between 4 and 8 GB of RAM, you will experience smoother computing. 3 GB doesn't seem right, so you might want to learn more by going to this site:
    http://www.crucial.com/store/drammemory.aspx
    I don't know what's happening with your optical drive, but it seems you use your drive quite a bit. In that case, look into a lens cleaner for your machine. It's inexpensive and works quite well.
    I hope you'll post here with your results!

  • iTunes upgrade, large amount of music cannot be located

    After an iTunes upgrade, a large amount of my collection shows 'The Song "..." could not be used because the original file could not be found. Would you like to locate it?'
    If, before I play the song, I 'Get Info' and look at the path (Alt-Print Screen), then attempt to play the song (I'll get the !) and then locate it, it will play. If I then compare the path to the print-screen path, they are identical.
    Thanks for your time.

    1. Find a song that is going to be missing - not hard, since it's a lot more than 50% of my library.
    2. Select the song (don't play it) and 'Get Info' from its properties.
    3. Since I cannot see the 'Get Info' window and interact with iTunes at the same time, I make a copy of the window for future viewing with Print Screen.
    4. Close 'Get Info' and try to play the song. Its status will now have the all too familiar '!' and iTunes will now be limited in what results it will let me see.
    5. Locate the song's file as iTunes suggests.
    6. Once located, select it in the playlist and 'Get Info'.
    7. Compare with the contents of the copy buffer.
    8. They are the same exact path names.
    These are not full albums or artists; my collection has always been maintained by iTunes. So I have large portions of albums 'missing'. I don't use other programs to tag or add album art to my collection. None of my podcast tracks seem to have been impacted.

  • Create new table, 1 column has a large amount of data, column type?

    Hi, sure this is a noob question but...
    I exported a table from SQL Server and want to replicate it in Oracle.
    It has 3 columns. One of the columns has the configuration for reports, so each field in the column has a large amount of text in it, some upwards of 30k characters.
    What kind of column do I make this in Oracle? I believe VARCHAR has a 4k limit.
    Creation of the table...
    CREATE TABLE "RPT_BACKUP"
    ( "NAME" VARCHAR2(1000 BYTE),
      "DATA",  -- this is the column that needs a type for the large report text
      "USER" VARCHAR2(1000 BYTE) )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER DEFAULT DIRECTORY "XE_FTP" ACCESS PARAMETERS ( records delimited BY newline
      skip 1 fields terminated BY ',' OPTIONALLY ENCLOSED BY '"' MISSING FIELD VALUES
      ARE NULL ) LOCATION ( 'REPORT_BACKUP.csv' ))
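    For what it's worth, text longer than 4,000 bytes is normally stored in a CLOB column in Oracle. A minimal sketch (the table and column names are only illustrative, and this defines an ordinary heap table rather than an external one, so the CSV contents would still have to be loaded into it separately):
    CREATE TABLE rpt_backup_clob (      -- illustrative name, not the asker's actual table
      report_name  VARCHAR2(1000 BYTE),
      report_data  CLOB,                -- CLOB holds character data well beyond the 4000-byte VARCHAR2 limit
      report_user  VARCHAR2(1000 BYTE)
    );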

    OK, tried that, but now when I do a select * from the table I get:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-30653: reject limit reached
    ORA-06512: at "SYS.ORACLE_LOADER", line 52
    29913. 00000 - "error in executing %s callout"
    *Cause:    The execution of the specified callout caused an error.
    *Action:   Examine the error messages and take appropriate action.

  • DSS problems when publishing large amount of data fast

    Has anyone experienced problems when sending large amounts of data using the DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data. One publishes approximately 50 items at a rate of 50 ms, another about 40 items with a 100 ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms), so that is one item on the DSS for about 250 ms. But this data is not seen in my main GUI window that reads the DSS URL.
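    (As a rough, back-of-the-envelope reading of the rates quoted above: 50 items every 50 ms is about 1,000 item updates per second, and 40 items every 100 ms adds roughly another 400 per second, so the DSS is handling on the order of 1,400 updates per second before the command/answer traffic is counted.)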
    My questions are
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can DSS be unstable if loaded too much?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled the MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Can this be a result of my large application and the heavy load on the DSS? (See attached picture.)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Professional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated with VISA serial comm. It looks so neat and it's
    > fantastic what it's supposed to do for a developer, but sometimes one
    > runs into trouble very deep.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230kBaud. (They do not necessarily need to stream all at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or -
    > for test purposes - a Keyspan serial-2-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at
    VISA.
    Some programs have some issues on serial adapters but run fine on a
    regular serial port. We've had that problem recently.
    Best, Mark

  • Extremely Slow USB 3.0 Speeds When Transferring Large Amounts of Video

    Hi there,
    I am transferring large amounts of footage (250 GB to 1.75 TB chunks) from 5x 5400 rpm 2 TB drives to 5x 7200 rpm 2 TB drives simultaneously (via 6x USB 3.0 connections and 4x SATA III, with copy/paste in Explorer), and the transfer speeds are incredibly slow. Initially the speeds show up as quite fast (45-150 MB/s+), but then they slow down to around 3 MB/s.
    The drives have not been manually defragmented, but the vast majority of the files on each are R3D video files.
    I am wondering if the number of drives and the amount of data being used/sent is what is causing such slow speeds, or if there might be another culprit. I would be incredibly appreciative to learn of any solutions to increase speed significantly. Many thanks...
    Specs:
    OS: Windows 7 Professional
    Processor: i7 4790k
    RAM: 32GB
    GPU: Nvidia 970 GTX

    If the USB ports are all on the same controller, they share its resources, so the transfer rate with 6 ports in use would be at most 1/6th of the transfer rate with 1 USB port, even if we disregard the overhead. Add that overhead to the equation and the transfer rate goes down even further. Now take into account the fact that you are copying from slow 5400 RPM disks that effectively max out at around 80 MB/s with these chunks and have high latency, add the OS overhead, and these transfer rates do not surprise me.
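    To put rough, assumed numbers on that (ballpark figures, not measurements): if a single USB 3.0 connection sustains somewhere around 400 MB/s in practice, six simultaneous transfers sharing one controller get roughly 400 / 6 ≈ 65 MB/s each before overhead; and since a 5400 rpm source drive tops out near 80 MB/s sequential and drops sharply once its heads have to seek between several large files at once, per-transfer rates in the single-digit MB/s range are plausible.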

  • Query about clustering unrelated large amounts of data together vs. keeping it separate.

    I would like to ask the talented enthusiasts who frequent the developer network to tell me if I have understood how LabVIEW deals with clusters. A generic description of a situation involving clusters and what I believe LabVIEW does is shown below. An example of this type of situation, generating the Fibonacci sequence, is attached to illustrate what I am saying.
    A description of the general situation:
    A cluster containing several different variables (mostly unrelated) has one or two of these variables unbundled for immediate use and then the modified values bundled back into the cluster for later use.
    What I think LabVIEW does:
    As the original cluster is going into the unbundle (to get the original variable values) and the bundle (to update the stored variable values), a duplicate of the entire cluster is made before picking out the individual values chosen to be unbundled. This means that if the cluster also contains a large amount of unrelated data, then processor time is wasted duplicating this data.
    If on the other hand this large amount of data is kept separate then this would not happen and no processor time is wasted.
    In the attached file the good method does have the array (large amount of unrelated data) within the cluster and does not use the array in more than one place, so it is not duplicated. If tunnels were used instead, I believe at least one duplicate is made.
    Am I correct in thinking that this is the behaviour LabVIEW uses with clusters? (I expected LabVIEW to duplicate only the variable values chosen in the unbundle code object. As this choice is fixed at compile time, it would seem to me that the compiler should be able to recognise that the other cluster variables are never used.)
    Is there a way of keeping the efficiency of using many separate variables (potentially ~50) whilst keeping the ease of using a single cluster variable over using separate variables?
    The attachment:
    A VI that generates the Fibonacci sequence (the I32 used wraps at about the 44th value, so values at that point and later are wrong) is attached. The calculation is iterative, using a for loop. Two variables are needed to perform the iteration, and these are stored in a cluster (and passed from iteration to iteration within the cluster). To provide the large amount of unrelated data, a large array of reasonably sized strings is provided.
    The bad way is to have the array stored within the cluster (causing massive overhead). The good way is to have the array separate from the other pieces of data, even if it passes through the for loop (no massive overhead).
    Try replacing the array shift registers with tunnels in the good case and see if you can repeat my observation that using tunnels causes overhead in comparison to shift registers whenever there is no other reason to duplicate the array.
    I am running LabVIEW 7 on Windows 2000 with sufficient memory so that the page file is not used in this example.
    Thank you all very much for your time and for sharing your LabVIEW experience,
    Richard Dwan
    Attachments:
    Fibonacci_test.vi 71 KB

    > That is an interesting observation you have made and seems to me to be
    > quite inexplicable. The trick is interesting but not practical for me
    > to use in developing a large piece of software. Thanks for your input
    > - I think I'll be contacting technical support for an explanation
    > along with some other anomalies involving large arrays that I have
    > spotted.
    >
    The deal here is that the bundle and unbundle nodes must be very careful
    when they are swapping elements around. This used to make copies in the
    normal cases, but that has been improved. The reason that the sequence
    affects it is that it affects the algorithm so that it orders the
    element movement so that the algorithm succeeds in avoiding a copy.
    Another, more obvious way is to use a regular bundle and unbundle, not
    the named variety. These tend to have an easier time in the algorithm also.
    Technically, I'd report the diagram to tech support to see if the named
    bundle/unbundle case can be handled as well. In the meantime, you can
    leave the data unbundled, as in the faster version.
    Greg McKaskle

  • Sending large amounts of data spontaneously

    In my normal experience with the internet connection, the amount of data sent is about 50 to 80% of that received, but occasionally Firefox starts transmitting large amounts of data spontaneously; what it is, I don't know, and where it's going to, I don't know. For example, today the AT&T status screen showed about 19 MB received and about 10 MB sent after about an hour of on-line time. A few minutes later, I looked down at the status screen and it showed 19.5 MB received and 133.9 MB sent, and the number was steadily increasing. Just before I went on line today, I ran a complete scan of the computer with McAfee and it reported nothing needing attention. I ran the scan because a similar effusion of sending data spontaneously had happened yesterday. When I noticed the data pouring out today, I closed Firefox and it stopped. When I opened Firefox right afterward, the transmission of data did not recommence. My first thought was that my computer had been captured by the bad guys and now I was a robot, but McAfee says not to worry. But should I worry anyway? What's going on - or, not having a good answer to that now, how can I find out what's going on? And how can I make it stop, unless I'm seeing some kind of maintenance operation that Mozilla or Microsoft is subjecting me to?

    Instead of using URLConnection, open a Socket to the server port (80, probably) and send a POST HTTP request followed by the data. You may then (optionally) receive data from the server to check that the servlet is OK. This is the same protocol as URLConnection uses, but you have control over when the data is actually sent...
    Socket sock = new Socket(getHost(), 80);   // getHost() stands in for your server's host name
    DataOutputStream dos = new DataOutputStream(sock.getOutputStream());
    dos.writeBytes("POST /servletname HTTP/1.0\r\n");   // the request line needs a path and an HTTP version
    dos.writeBytes("Content-Type: text/plain\r\n");  // optional, but good if you know it
    dos.writeBytes("Content-Length: " + lengthOfData + "\r\n");  // again optional, but good if you can know it without caching the data first
    dos.writeBytes("\r\n");   // gotta have a blank line before the data
      // send the data now, e.g. with dos.write(...)
    DataInputStream dis = new DataInputStream(sock.getInputStream());  // optional, if you want to receive
      // receive any feedback from the servlet if you want
    dis.close();
    dos.close();
    sock.close();
    I'm guessing that URLConnection caches the data so it can fill in "Content-length".
