Uploading classical music (large files)

Hi everyone.
I'm having trouble uploading classical music from my MacBook to the iPhone via iTunes. It uploads all my contemporary music, but won't bring the larger classical files across.
I can see the classical albums in my iTunes playlists, and I have the option to select them when I do a manual sync with the iPhone (there's too much music on my Mac to do a complete sync with the small iPhone), but nothing comes across.
Not sure what to do and would appreciate any help!
cheers
B
PS: Not sure if this is relevant, but when I open Finder and go to the iTunes folder, under the sub-folder "iTunes Music", none of the classical music folders are there (though they're definitely in the library, as I can play them on the Mac).

I think it's possible. You can use npm modules like "ftp", "ssh2", and so on.
I often use the ssh2 module, as in the code below:
(function () {
  var Connection = require('ssh2');   // ssh2 exposes the SSH client constructor
  var conn = new Connection();
  conn.on('ready', function() {
    conn.sftp(function(err, sftp) {
      if (err) throw err;
      // fastPut streams the local file to the remote path over SFTP
      sftp.fastPut('<local file>', '<remote file>', function(err) {
        if (err) throw err;
        conn.end();
      });
    });
  }).connect({
    host: '<server address>',
    port: 22,
    username: '<user name>',
    password: '<password>'
  });
})();
Ten

Similar Messages

  • Upload and Process large files

    We have a SharePoint 2013 on-premises installation and a business application that provides an option to copy local files into a UNC path, with some processing logic applied before copying them into a SharePoint library. The current implementation is:
    1. The user opens the application and clicks the "Web Upload" link in the left navigation. This opens a custom \Layouts page to select the upload file and its properties
    2. The user specifies the file details and chooses a zip file from their local machine
    3. The Web Upload page's submit action will
         a. Call a WCF service to copy the zip file from the local machine to a preconfigured UNC path
         b. Create a list item to store its properties along with the UNC path details
    4. A timer job executes at a periodic interval to
         a. Query the list for items that are NOT yet processed and find the path of the zip file folder
         b. Unzip the selected file
         c. Loop over the unzipped file content and push it into the SharePoint library
         d. Update the list item in the "Manual Upload List"
    Can someone suggest a different design approach that manages the large file outside of the SharePoint context? Something like
       1. Some option to initiate the file copy from the user's local machine to the UNC path when they submit the layouts page
       2. Instead of timer jobs, an external service that grabs data from the UNC path at periodic intervals and pushes it into SharePoint (see the sketch below).
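    To make option 2 concrete, here is a rough sketch of the kind of standalone service I have in mind (Java; the UNC path, polling interval, and the uploadToSharePoint placeholder are all illustrative, not our actual code):
    import java.nio.file.DirectoryStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Standalone poller: scans the UNC drop folder at a fixed interval and
    // hands each zip to an upload routine, keeping the work outside SharePoint.
    public class UncDropPoller {
        public static void main(String[] args) {
            Path dropFolder = Paths.get("\\\\fileserver\\webupload\\incoming"); // example UNC path
            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            scheduler.scheduleWithFixedDelay(() -> {
                try (DirectoryStream<Path> zips = Files.newDirectoryStream(dropFolder, "*.zip")) {
                    for (Path zip : zips) {
                        uploadToSharePoint(zip); // unzip, push into the library, mark the list item processed
                    }
                } catch (Exception e) {
                    e.printStackTrace();         // log and keep polling
                }
            }, 0, 5, TimeUnit.MINUTES);          // poll every 5 minutes (illustrative)
        }

        static void uploadToSharePoint(Path zip) {
            // placeholder for the CSOM/REST upload and list item update
        }
    }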

    Hi,
    According to your post, my understanding is that you want to upload and process files for a SharePoint 2013 server.
    Here are some suggestions for your reference:
    1. We can create a service to process the uploaded file and copy the files to the UNC folder.
    2. Create an upload-file visual web part and call the file-processing service.
    Thanks,
    Dennis Guo
    TechNet Community Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]

  • OutOfMemory while uploading/downloading a large file

    Hello,
    We have recently been experiencing OutOfMemory errors in production. We find that whenever there is reasonable load and a user tries to download a 20MB zip file served from WebLogic, the server throws an OutOfMemoryError.
    How does WebLogic handle large file upload/download requests? Is it possible to crash the WebLogic server by initiating a huge file upload (on the order of 200MB)?
    Will this be kept in memory?
    Likewise for downloads: what is the maximum size of file that can be served by WebLogic? Is there any memory implication (other than the response buffer size)? Will the entire file be cached in memory by WebLogic when a download request gets served?
    Please help!
    Thanks!
    Dheepak.

    Hi Dheepak, are you using a special servlet to download the zip? The default servlet buffers only buffer_size bytes at a time and flushes as soon as that is exceeded. The default buffer size is around 8k, so you should see multiple flushes of 8k each.
    The file is not kept in memory or cached in any way by the server itself, but I can imagine a custom servlet/filter doing that.
    For uploads, WebLogic just creates an InputStream over the underlying socket InputStream and returns it to the user. So again there is no caching of data other than the internal buffers, which again aren't more than 8k.
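    If you do write a custom download servlet, streaming in small chunks keeps memory use flat no matter how big the zip is. A minimal sketch (the file path and content type are just for illustration):
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Streams the file in 8k chunks; heap use stays constant regardless of file size.
    public class ZipDownloadServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("application/zip");
            byte[] buf = new byte[8192];                               // matches the ~8k default response buffer
            InputStream in = new FileInputStream("/data/export.zip");  // illustrative path
            OutputStream out = resp.getOutputStream();
            try {
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);                              // the container flushes as its buffer fills
                }
            } finally {
                in.close();
            }
        }
    }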
    Hope that helps
    Nagesh

  • Splitting large files for upload

    I need to upload a very large file, 9.8GB, to an FTP site. This takes a hugely long time with my bandwidth and invariably fails somewhere before it finishes. Does anyone know if there is software that will take a large file like this and split it up, perhaps into disk images, so that the pieces can be uploaded separately and then reassembled?
    Thanks very much.

    If you perform a web search there are some file splitters available for the Mac; I don't use them myself.
    Hopefully you will find a suitable one.
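    If you're comfortable running a little code, the split-and-reassemble idea is also simple enough to sketch yourself. A rough Java example (the file name and ~1GB part size are illustrative; each part may overshoot the target by up to one 64KB buffer). Reassembling is just concatenating the parts in order:
    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    // Cuts a big file into roughly 1GB parts that can be uploaded separately.
    public class FileSplitter {
        public static void main(String[] args) throws IOException {
            File source = new File("backup.dmg");   // example input file
            long partSize = 1024L * 1024 * 1024;    // target ~1GB per part
            byte[] buf = new byte[64 * 1024];
            InputStream in = new BufferedInputStream(new FileInputStream(source));
            OutputStream out = null;
            try {
                long written = partSize;            // forces the first part file to open
                int part = 0, n;
                while ((n = in.read(buf)) != -1) {
                    if (written >= partSize) {      // start the next part
                        if (out != null) out.close();
                        out = new BufferedOutputStream(
                                new FileOutputStream(source.getName() + ".part" + part++));
                        written = 0;
                    }
                    out.write(buf, 0, n);
                    written += n;
                }
            } finally {
                if (out != null) out.close();
                in.close();
            }
        }
    }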

  • CF8.01 Large Files Upload issue

    We are having an issue with posting large files to the server through CFFile. Our server is running on Windows 2003 R2 SP2 with 2GB of RAM. The average upload size is 800MB, and we may run into multiple simultaneous uploads at that size. So we have adjusted the "Maximum size of post data" to 2000 MB and the "Request Throttle Memory" to 5000 MB in the ColdFusion admin settings, hoping to allow up to 5 simultaneous uploads.
    However, when we tried to launch two 800MB uploads at the same time from different machines, only one upload got through. The other one returned a "Cannot connect to the server" error after a few minutes. No errors can be found in the W3C log or the ColdFusion logs (coldfusion-out.log and exception.log), but it is reported in HTTPErr1.log with the following message:
    2008-04-18 08:16:11 204.13.x.x 3057 65.162.x.x 80 HTTP/1.1
    POST /testupload.cfm - 1899633270 Timer_EntityBody DefaultAppPool
    Can anyone shed some light on it? Thanks!

    quote:
    Originally posted by: Newsgroup User
    Don't forget that your web server (IIS, Apache, etc.) can have upload throttles as well.
    We did not throttle our IIS to limit the upload/download bandwidth.
    Is there a maximum limit for the "Request Throttle Memory" setting? Can we set it over the available physical RAM size?

  • GRC 10 - Job Sync Legacy System large files

    When I run the job GRAC_REPOSITORY_OBJECT_SYNC to sync the files from the legacy system, the file "USER_PERMISSION.txt" is about 231 MB in size, and the sync job is canceled with the error "TSV_TNEW_PAGE_ALLOC_FAILED". Basis has verified the allocated memory.
    We checked that SAP Note "1593704 - legacy file upload issue for large files" is already applied in SP05.
    Server: SAP GRC AC v10:
    [Config My System|http://www.2shared.com/file/rhXVfRcX/My_system.html]
    Could you help me with some information to solve the problem?
    Grateful,
    Inacio

    Hello Inacio,
    I know that this is a very old post, but have you managed to solve this issue?
    I've found this note, which is included in SP10:
    Note 1741277 - Repository Role sync ends with TSV_TNEW_PAGE_ALLOC_FAILED
    Have you changed memory parameters in order to solve it?
    Thanks!
    Diego.

  • I have a large classical music collection. As I uploaded it into the iMac, needless duplication occurred in the genre classifications. Some compositions are separated by movement and placed in the "LP-Classical" genre, but in duplicated genres. How do I combine those?

    Question: I have a large classical music collection which I have uploaded into my iMac computer. In the process, many of the compositions were placed in needlessly duplicated genre categories (a sonata turns up in three SEPARATE "LP-Classical" categories, with all 3 movements separated into different "LP-Classical" categories, one atop the other). How does one eliminate the foolish duplication and recombine the three movements into one composition? I am desperate.
    Normal editing techniques do not work for this problem! Thank you.


  • How do I get my iPod classic 160GB to upload my music to iTunes in my new Mac Mini OS X 10.9.5?

    How do I get my iPod classic 160GB to upload my music to iTunes in my new Mac Mini OS X 10.9.5?

    That would be via Home Sharing.
    Make sure it's enabled in iTunes (File > Home Sharing) and on the Apple TV (Settings > Computers).
    You would then access it on the Apple TV using the computer icon. Your computer would need to be on and running iTunes.

  • BT Cloud - large file ( ~95MB) uploads failing

    I am consistently getting upload failures for any files over approximately 95MB in size. This happens with both the web interface and the PC client.
    With the web interface, the file upload gets to a percentage that would be around the 95MB mark, then fails, showing a red icon with an exclamation mark.
    With the PC client, the file gets to the same percentage, equating to approximately 95MB, then resets to 0% and repeats this continuously. I left my PC running 24/7 for 5 days, and this resulted in around 60GB of upload bandwidth being used just trying to upload a single 100MB file.
    I've verified this on two PCs (Win XP, SP3), one laptop (Win 7, 64-bit), and also my work PC (Win 7, 64-bit). I've also verified it with multiple different types and sizes of files. Everything from 1KB to ~95MB uploads perfectly, but anything above this size (I've tried 100MB, 120MB, 180MB, 250MB, 400MB) fails every time.
    I've completely uninstalled the PC client, done a Windows roll-back, and reinstalled, but this has had no effect. I also tried completely wiping the cloud account (deleting all files and disconnecting all devices) and starting from scratch a couple of times, but no improvement.
    I phoned technical support yesterday and had a BT support rep remote-control my PC, but he was completely unfamiliar with the application, and after fumbling around for over two hours he had no suggestion other than waiting longer to see if the failure would clear itself!
    Basically I suspect my Cloud account is just corrupted in some way and needs to be deleted and recreated from scratch by BT.  However I'm not sure how to get them to do this as calling technical support was futile.
    Any suggestions?
    Thanks,
    Elinor.
    Solved!
    Go to Solution.

    Hi,
    I too have been having problems uploading a large file (362MB) for many weeks now, and as this topic is marked as SOLVED I wanted to let BT know that it isn't solved for me.
    All I want to do is share a video with a friend and thought that BT cloud would be perfect!  Oh, if only that were the case :-(
    I first tried web upload (as I didn't want to use the PC client's Backup facility) - it failed.
    I then tried the PC client Backup... after about 4 hrs of "progress" it reached 100% and an icon appeared. I selected it and tried to share it by email, only to have the share fail with no link. Cloud backup thinks it's there, but there are no files in my Cloud storage!
    I too spent a long time on the phone to Cloud support, during which the tech took over my PC. When he began trying to do completely inappropriate and irrelevant things, such as cleaning up my temporary internet files and cookies, I stopped him.
    Together we did successfully upload a small file, and sharing that was successful; trouble is, it's not that file I want to share!
    Finally he said he would escalate the problem to next level of support.
    After a couple of weeks of hearing nothing, I called again and went through the same farce again with a different tech.  After which he assured me it was already escalated.  I demanded that someone give me some kind of update on the problem and he assured me I would hear from BT within a week.  I did - they rang to ask if the problem was fixed!  Needless to say it isn't.
    A couple of weeks later now and I've still heard nothing and it still doesn't work.
    Why can't Cloud support at least send me an email to let me know they exist and are working on this problem.
    I despair of ever being able to share this file with BT Cloud.
    C'mon BT Cloud surely you can do it - many other organisations can!

  • OOM happens inside Weblogic 10.3.6 when application uploads large files

    Our Oracle Fusion BI Apps application is uploading large files (100+ MB) onto Oracle Cloud Storage. This application works properly when run outside the WebLogic server. When it is deployed on Fusion Middleware WebLogic 10.3.6, we get this OOM error during upload of large files:
    java.lang.OutOfMemoryError: allocLargeObjectOrArray: [B, size 268435472
        at jrockit/vm/Allocator.allocLargeObjectOrArray(JIZ)Ljava/lang/Object;(Native Method)
        at jrockit/vm/Allocator.allocObjectOrArray(Allocator.java:349)[optimized]
        at weblogic/utils/io/UnsyncByteArrayOutputStream.resizeBuffer(UnsyncByteArrayOutputStream.java:59)[inlined]
        at weblogic/utils/io/UnsyncByteArrayOutputStream.write(UnsyncByteArrayOutputStream.java:89)[optimized]
        at com/sun/jersey/api/client/CommittingOutputStream.write(CommittingOutputStream.java:90)
        at com/sun/jersey/core/util/ReaderWriter.writeTo(ReaderWriter.java:115)
        at com/sun/jersey/core/provider/AbstractMessageReaderWriterProvider.writeTo(AbstractMessageReaderWriterProvider.java:76)
        at com/sun/jersey/core/impl/provider/entity/InputStreamProvider.writeTo(InputStreamProvider.java:98)
        at com/sun/jersey/core/impl/provider/entity/InputStreamProvider.writeTo(InputStreamProvider.java:59)
        at com/sun/jersey/api/client/RequestWriter.writeRequestEntity(RequestWriter.java:300)
        at com/sun/jersey/client/urlconnection/URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:213)
        at com/sun/jersey/client/urlconnection/URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
    It looks like WebLogic is using its default HTTP handler; switching to the Sun HTTP handler via the startup JVM option "-DUseSunHttpHandler=true" solves the OOM issue.
    It seems that instead of streaming the file content through a fixed-size byte array, the whole file is being buffered into memory during upload.
    Is it possible to solve this OOM by changing a setting of the WebLogic HTTP handler, without switching to the Sun HTTP handler, since there are many other applications deployed on this WebLogic instance?
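    One workaround we are experimenting with (a sketch assuming the Jersey 1.x client shown in the stack trace; the 8 KB chunk size is illustrative) is enabling chunked transfer encoding so the client streams the request body instead of buffering the whole entity:
    import com.sun.jersey.api.client.Client;
    import com.sun.jersey.api.client.config.ClientConfig;
    import com.sun.jersey.api.client.config.DefaultClientConfig;

    // With a chunk size set, URLConnectionClientHandler uses
    // setChunkedStreamingMode instead of building the entity in memory.
    public class ChunkedClientFactory {
        public static Client create() {
            ClientConfig config = new DefaultClientConfig();
            config.getProperties().put(ClientConfig.PROPERTY_CHUNKED_ENCODING_SIZE, 8192);
            return Client.create(config);
        }
    }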
    We are concerned whether there will be any impact on performance or any other issue.
    Please advise; your response is highly appreciated.
    Thanks!

    Hi,
    If you have a backup, restore the below file and then try to start WebLogic:
    \Oracle\Middleware\user_projects\domains\<domain_name>\config\config.lok
    Thanks,
    Sharmela

  • FTP and HTTP large file ( 300MB) uploads failing

    We have two IronPort Web S370 proxy servers in a WCCP transparent proxy cluster. We are experiencing problems with users who upload large video files, where the upload will not complete. Some of the files are 2GB in size, but most are in the hundreds-of-megabytes range. Files of less than a hundred megabytes seem to work just fine. Some users are using the FTP proxy and some are using HTTP methods, such as YouSendIt.com. We have tried explicit proxy settings with some improvement, but it varies by situation.
    Is anyone else having problems with users uploading large files and having them fail?  If so, any advice?
    Thanks,
       Chris

    Have you got any maximum sizes set in the IronPort Data Security Policies section, under Web Security Manager?
    Thanks
    Chris

  • Beach Ball of Death on OSX while trying to upload large files(updated)

    Hi,
    I did a search on this and while I got lots of hits none on this issue.
    We have a web portal that allows customers to upload files to a DAM server; we control the server.
    The issue is not the back end. The back end is PHP and all set up correctly: we have 2GB of RAM assigned to PHP, upload_max_filesize is set to 1.8GB, and max post size is set accordingly.
    The Flex app loads, the user selects a large file, say 1.6GB (we do a file size check to make sure it's below the limit), and clicks the upload button; the beach ball of death shows, and a little while later the file uploads.
    Great so far. Now the user tries another file of the same size: beach ball of death, and the script times out (we capture this and end gracefully). If the user restarts Safari, then the file will upload just fine.
    It seems you can upload a large file the first time, and subsequently small files below 1.2GB, but you cannot upload multiple large files.
    It seems like some sort of memory leak. I was looking at converting this upload app to send files via FTP (looking to increase the file size ceiling to 4GB via FTP), but if the Flash player needs to process/load the files first, then I don't think this will work.
    The code is a bit involved, but in the end it just calls FileReference.upload(), then the beach ball appears.
    Any ideas? Player version 10.0.32.18 debug.
    UPDATED 09_17_09
    It appears to be a memory leak in the player when in Safari: the file is read, but the memory is not freed up after completion. Firefox frees the used memory after upload and you can continue. Not sure if it's an Apple or an Adobe issue.
    However, why bother reading the file first? Can we have an HTTP stream file upload in the web Flash player like AIR has? That way the player would not care about the file size, and the limitation would reside on the server, which we can deal with.
    Message was edited by: flashharry!


  • I have uploaded an old iTunes library by mistake - is there any way that I can go back to my previous library? Unable to do a system restore, and I don't know where to find an up-to-date music/playlist file.

    I have uploaded an old iTunes library by mistake - is there any way that I can go back to my previous library? I am unable to do a system restore, and I don't know where to find an up-to-date music/playlist file. I was originally trying to recover a playlist that vanished after I had updated one of the iPads; I did an online search and followed the instructions, only to discover that I had loaded an iTunes library from 2 years ago. Stupidly I thought that it had updated itself, and didn't realise I had to do it. Any help would be appreciated.

    See Empty/corrupt iTunes library after upgrade/crash. Hopefully you have a more recent library file that you can restore.
    tt2

  • Upload very large file to SharePoint Online

    Hi,
    I tried uploading very large files to SharePoint Online via the REST API using Java. Uploading files with a size of ~160 MB works well, but if the file size is between 400 MB and 1.6 GB I either receive a SocketException (connection reset by peer) or an error from SharePoint telling me "the security validation for the page is invalid".
    So first of all I want to ask how to work around the security validation error. As far as I understand, the token which I have added to the X-RequestDigest header of my HTTP POST request seems to have expired (by default it expires after 1800 seconds). Uploading such large files is time consuming, so I can't change the token in my header while uploading the file. How can I extend the expiration time of the token (if that is possible)? Could it help to send POST requests with the same token continuously while uploading the file, and thereby prevent the token from expiring? Is there any other strategy for uploading such large files, which need much more than 1800 seconds to upload?
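    For reference, here is a rough sketch of how I could fetch a fresh form digest between chunks (the /_api/contextinfo endpoint returning FormDigestValue is standard SharePoint REST; the cookie handling and regex parsing here are my simplifications):
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Scanner;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // POSTs to /_api/contextinfo and pulls FormDigestValue out of the JSON
    // reply; fetching a fresh digest before each chunk keeps X-RequestDigest valid.
    public class DigestRefresh {
        static String fetchDigest(String siteUrl, String authCookie) throws Exception {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(siteUrl + "/_api/contextinfo").openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Accept", "application/json;odata=verbose");
            conn.setRequestProperty("Cookie", authCookie); // auth cookie obtained elsewhere
            conn.setDoOutput(true);
            conn.getOutputStream().close();                // contextinfo takes an empty body
            InputStream in = conn.getInputStream();
            try {
                String body = new Scanner(in, "UTF-8").useDelimiter("\\A").next();
                Matcher m = Pattern.compile("\"FormDigestValue\"\\s*:\\s*\"([^\"]+)\"").matcher(body);
                return m.find() ? m.group(1) : null;       // a real client would use a JSON parser
            } finally {
                in.close();
            }
        }
    }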
    Additionally, any thoughts on the socket exception? It happens quite sporadically: sometimes after uploading 100 MB, and other times after I have already uploaded 1 GB.
    Thanks in advance!

    Hi,
    Thanks for the reply. The reason I'm looking into this is so users can migrate their files to SharePoint Online/Office 365. The max file limit is 2GB, so I thought I would try to cover this.
    I've looked into that link before, and when I try to use those endpoints, i.e. StartUpload, I get the following response:
    {"error":{"code":"-1, System.NotImplementedException","message":{"lang":"en-US","value":"The method or operation is not implemented."}}}
    I can only presume that these endpoints have not been implemented yet, even though the documentation says they are only available for Office 365, which is what I am using.
    Also, the other strange thing is that I can actually upload a file of 512MB to the server, as I can see it in the web UI. The problem I am experiencing is getting a response; it hangs on this line until the timeout is reached.
    WebResponse wresp = wreq.GetResponse();
    I've tried flushing and closing the stream, and SendChunked, but still no success.

  • Is it best to upload HD movies from a camcorder to iMovie or iPhoto? iMovie gives the option for very large file sizes - presumably it is best to use this file size because the HD movie is then at its best quality?

    Is it best to upload an HD movie to iPhoto or iMovie? iMovie seems to store the movie in much larger files - presumably this is preferable, as a larger file means better quality?

    Generally it is, if you're comparing identical compressors and resolutions, but there's something else happening here. If you're worried about quality degrading, check the original file details on the camera (or card), either with Get Info or by opening the file in QuickTime and showing its info. You should find the iPhoto version (reveal it in the Finder) is a straight copy. You can't really increase the image quality of a movie by increasing file size (barring a few tricks), but Apple editing products create a more "scrubbable" intermediate file, which is quite large.
    Good luck and happy editing.
