Import hangs for (very) large file

Hi All,
We're importing a file that is converted to a single data column in the BefFileImport script.
So I have a flat file with 250,000 rows and 4 data columns, and I convert it to a file with 1,000,000 rows and 1 data column.
In the inbox I can see this process takes about 12 minutes, which is long but not a problem for us.
After that, the FDM web client (11.1.1.3) keeps showing "processing, please wait..." (even after a day).
We saw that there was no activity on the server, so we closed the web client, logged on again, and the import action was successful.
So apparently, because the import step takes so long, the client never shows a success message and just keeps hanging.
Any suggestions how to solve this?
thanks in advance,
Marc

The only thing I am aware of that would cause this is a timeout registry key inserted on the workstation, as noted in the following Microsoft KB. If you have this in place, remove it, reboot, and then test the import again.
http://support.microsoft.com/kb/181050

Similar Messages

  • Is it best to upload HD movies from a camcorder to iMovie or iPhoto.  iMovie gives the option for very large file sizes - presumably it is best to use this file size because the HD movie is then at its best quality?

    Is it best to upload HD movies to iPhoto or iMovie?  iMovie seems to store the movie in much larger files - presumably this is preferable, as a larger file means better quality?

    Generally it is, if you're comparing identical compressors and resolutions, but there's something else happening here.  If you're worried about quality degrading, check the original file details on the camera (or card), either with Get Info or by opening in QuickTime and showing info. You should find the iPhoto version (reveal in Finder) is a straight copy.  You can't really increase the image quality of a movie (barring a few tricks) by increasing file size, but Apple editing products create a more "scrub-able" intermediate file, which is quite large.
    Good luck and happy editing.

  • Tips or tools for handling very large file uploads and downloads?

    I am working on a site that has a document repository feature. The documents are stored as BLOBs in an Oracle database, and for reasonably sized files it's no problem to stream the files out directly from the database. For file uploads I am using the Struts module to get them on disk and am then putting the BLOB in the database.
    We are now being asked to support very large files of 250MB+. I am concerned about problems I've heard of with HTTP not being reliable for files over 256MB. I'd also like a solution that would give the user a status bar and allow for restarts of broken uploads or downloads.
    Does anyone know of an off-the-shelf module that might help in this regard? I suspect an ActiveX control or Applet on the client side would be necessary. Freeware or Commercial software would be ok.
    Thanks in advance for any help/ideas.

    Hi. There is nothing wrong with HTTP handling 250MB+ files (per se).
    However, connections can get reset.
    Consider offering the files via FTP. Most FTP clients are good about resuming transfers.
    Or if you want to keep using HTTP, try supporting chunked encoding. Then a user can use something like 'GetRight' to auto resume HTTP downloads.
    Hope that helps,
    Peter
    http://rimuhosting.com - JBoss EJB/JSP hosting specialists
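    The resume-by-Range idea Peter mentions can be sketched in Python (the URL and filename are placeholders; this assumes the server honors Range requests with a 206 Partial Content response):

```python
import os
import urllib.request

def range_header(resume_from):
    """Value for the HTTP Range header when resuming at byte offset resume_from."""
    return "bytes=%d-" % resume_from

def resume_download(url, dest, chunk_size=64 * 1024):
    """Download url to dest, resuming from any partial file already on disk."""
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url)
    if start:
        # Ask only for the remainder; a Range-aware server replies 206.
        req.add_header("Range", range_header(start))
    mode = "ab" if start else "wb"
    with urllib.request.urlopen(req) as resp, open(dest, mode) as out:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
```

    A client like GetRight does essentially this on every retry: stat the partial file, then request the missing tail.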

  • Upload very large file to SharePoint Online

    Hi,
    I tried uploading very large files to SharePoint Online via the REST API using Java. Uploading files of ~160 MB works well, but if the file size is between 400 MB and 1.6 GB I either receive a SocketException (connection reset by peer) or an error from SharePoint telling me "the security validation for the page is invalid".
    So first of all I want to ask how to work around the security validation error. As far as I understand, the token I added to the X-RequestDigest header of the HTTP POST request seems to have expired (by default it expires after 1800 seconds). Uploading such large files is time-consuming, so I can't change the token in my header while uploading the file. How can I extend the expiration time of the token (if that is possible)? Could it help to continuously send POST requests with the same token while uploading the file, and thereby prevent the token from expiring? Is there any other strategy for uploading files that need much more than 1800 seconds?
    Additionally, any thoughts on the socket exception? It happens quite sporadically: sometimes after uploading 100 MB, other times after I have already uploaded 1 GB.
    Thanks in advance!

    Hi,
    Thanks for the reply. The reason I'm looking into this is so users can migrate their files to SharePoint Online/Office 365. The max file limit is 2 GB, so I thought I would try to cover this.
    I've looked into that link before, and when I try to use those endpoints, i.e. StartUpload, I get the following response:
    {"error":{"code":"-1, System.NotImplementedException","message":{"lang":"en-US","value":"The
    method or operation is not implemented."}}}
    I can only presume that these endpoints have not been implemented yet, even though the documentation says they are only available for Office 365, which is what I am using.
    Also, the other strange thing is that I can actually upload a 512 MB file to the server, as I can see it in the web UI. The problem I am experiencing is getting a response; it hangs on this line until the timeout is reached.
    WebResponse wresp = wreq.GetResponse();
    I've tried flushing and closing the stream, and setting SendChunked, but still no success.
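    For what it's worth, the chunked endpoints are meant to sidestep the 1800-second digest expiry: each slice is its own POST, so each request can carry a freshly fetched X-RequestDigest. A minimal sketch of how slices map onto StartUpload/ContinueUpload/FinishUpload (the 10 MB chunk size is my assumption, and a file that fits in one slice would normally use a plain upload instead):

```python
def plan_chunks(file_size, chunk_size=10 * 1024 * 1024):
    """Return (endpoint, offset, length) triples for a chunked SharePoint upload.

    First slice -> StartUpload, middle slices -> ContinueUpload, final
    slice -> FinishUpload. Because every slice is a separate request, each
    can be sent with a newly requested X-RequestDigest token.
    """
    plan, offset = [], 0
    while offset < file_size:
        length = min(chunk_size, file_size - offset)
        is_last = offset + length >= file_size
        if offset == 0:
            endpoint = "StartUpload"
        elif is_last:
            endpoint = "FinishUpload"
        else:
            endpoint = "ContinueUpload"
        plan.append((endpoint, offset, length))
        offset += length
    return plan
```

    No single request then runs anywhere near 1800 seconds, so the token never expires mid-transfer.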

  • Unable to copy very large file to eSATA external HDD

    I am trying to copy a VMWare Fusion virtual machine, 57 GB, from my MacBook Pro's laptop hard drive to an external, eSATA hard drive, which is attached through an ExpressPort adapter. VMWare Fusion is not running and the external drive has lots of room. Disk Utility finds no problems with either drive. I have excluded both the external disk and the folder on my laptop hard drive that contains my virtual machine from my Time Machine backups. At about the 42 GB mark, an error message appears:
    The Finder cannot complete the operation because some data in "Windows1-Snapshot6.vmem" could not be read or written. (Error code -36)
    After I press OK to remove the dialog, the copy does not continue, and I cannot cancel the copy. I have to force-quit the Finder to make the copy dialog go away before I can attempt the copy again. I've tried rebooting between attempts, still no luck. I have tried a total of 4 times now, exact same result at the exact same place, 42 GB / 57 GB.
    Any ideas?

    Still no breakthrough from Apple. They're telling me to terminate the VMWare processes before attempting the copy, but had they actually read my description of the problem first, they would have known that I already tried this. Hopefully they'll continue to investigate.
    From a correspondence with Tim, a support representative at Apple:
    Hi Tim,
    Thank you for getting back to me, I got your message. Although it is true that at the time I ran the Capture Data program there were some VMWare-related processes running (PID's 105, 106, 107 and 108), this was not the case when the issue occurred earlier. After initially experiencing the problem, this possibility had occurred to me so I took the time to terminate all VMWare processes using the activity monitor before again attempting to copy the files, including the processes mentioned by your engineering department. I documented this in my posting to apple's forum as follows: (quote is from my post of Feb 19, 2008, 1:28pm, to the thread "Unable to copy very large file to eSATA external HDD", relevant section in >bold print<)
    Thanks for the suggestions. I have since tried this operation with 3 different drives through two different interface types. Two of the drives are identical - 3.5" 7200 RPM 1TB Western Digital WD10EACS (WD Caviar SE16) in external hard drive enclosures, and the other is a smaller USB2 100GB Western Digital WD1200U0170-001 external drive. I tried the two 1TB drives through eSATA - ExpressPort and also over USB2. I have tried the 100GB drive only over USB2 since that is the only interface on the drive. In all cases the result is the same. All 3 drives are formatted Mac OS Extended (Journaled).
    I know the files work on my laptop's hard drive. They are a VMWare virtual machine that works just fine when I use it every day. >Before attempting the copy, I shut down VMWare and terminated all VMWare processes using the Activity Monitor for good measure.< I have tried the copy operation both through the finder and through the Unix command prompt using the drive's mount point of /Volumes/jfinney-ext-3.
    Any more ideas?
    Furthermore, to prove that there were no file locks present on the affected files, I moved them to a different location on my laptop's HDD and renamed them, which would not have been possible if there had been interference from vmware-related processes. So, that's not it.
    Your suggested workaround, to compress the files before copying them to the external drive, may serve as a temporary workaround but it is not a solution. This VM will grow over time to the point where even the compressed version is larger than the 42GB maximum, and compressing and uncompressing the files will take me a lot of time for files of this size. Could you please continue to pursue this issue and identify the underlying cause?
    Thank you,
    - Jeremy

  • Exporting to iTunes results in a very large file

    I have been making music on GarageBand and when I export it to iTunes, it becomes a very large file. This prevents me from posting my music on my website or emailing it. What can I do to make the file smaller?
    imac G5   Mac OS X (10.4.5)   1.9 GHz PowerPC G5

    You should convert your file to an MP3 in iTunes:
    - Make sure that in the import options of iTunes, MP3 is selected.
    - Mark your song in iTunes and select "convert to MP3" in the menu.
    - This creates a new file in iTunes with the same title. If you don't know which is which, apple-I will tell you if it's an AIFF or an MP3, and the file size.

  • Sorting very large file

    Hi,
    I've a very large file (1.3 GB, more than 10 million records) that I need to sort. What is the best possible way to do that without using a database?
    Thanks,
    Gary

    Just a suggestion:
    If you create a sorted red/black tree of record numbers (i.e. offsets in the data file), you could get an equivalent sort as follows (probably without a huge amount of memory...):
    Here's the strategy:
    Create a comparator that compares two Long objects. It performs a seek in the data file, retrieves the records pointed to by those two Longs, and compares them using whatever record comparator you were going to use.
    Create a sorted list based on that comparator (a tree-based list will probably be best)
    Run a for loop going from 0 to the total records in the data file, create Long objects for each value, and insert that value into the list.
    When you are done, the list will contain the record numbers in sorted order.
    Create an iterator for the list
    For each element in the list, grab the corresponding data file record from the source file and write it to the destination file.
    voila
    Some optimizations may be possible (and highly desirable, actually):
    When you read records from the file, place them into an intermediate cache (linked list?) of limited length (say 1000 records). When a record is requested, scan the cache for that record ID - if found, move it to the top of the list, then return the associated data. If it is not found, read from the file, then add to the top of the list. If the list has more than X records, remove the last item from the list.
    The upper pieces of the sort tree are going to get hammered, so having a cache like this is a good idea.
    Another possibility: instead of inserting the records in order (i.e. 0 through X), it may be desirable to insert them in pseudo-random order. I seem to remember that there was a really nice algorithm for doing this (without repeating any records) in an issue of Dr. Dobb's from a while back. You need a string of random numbers between 0 and X without repeats (like dealing from a card deck).
    If your data is already in relatively sorted order, the cache idea will not provide significant benefit without randomizing the inputs.
    One final note: try the algorithm without the cache first and see how bad it is. The OS's disk caching may provide sufficient performance. If you encapsulate the data file (i.e. an object that has a getRecordBytes(int recNumber) method), then it will be easy to implement the cache later if it is needed.
    I hope this provides some food for thought!
    - K
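    K's offset-index strategy (minus the cache) can be sketched in Python; newline-delimited records and the file names are my assumptions:

```python
import functools

def index_sort(src_path, dest_path):
    """Sort a newline-delimited file by sorting byte offsets, not record data.

    Only the list of offsets lives in memory; every comparison seeks into
    the file and reads the two records being compared, as described above.
    """
    # Pass 1: record the byte offset of every line.
    offsets, pos = [], 0
    with open(src_path, "rb") as f:
        for line in f:
            offsets.append(pos)
            pos += len(line)

    with open(src_path, "rb") as f:
        def record_at(off):
            f.seek(off)
            return f.readline()

        # The comparator re-reads both records from disk on every comparison.
        def compare(a, b):
            ra, rb = record_at(a), record_at(b)
            return (ra > rb) - (ra < rb)

        offsets.sort(key=functools.cmp_to_key(compare))

        # Pass 2: write records out in sorted-offset order.
        with open(dest_path, "wb") as out:
            for off in offsets:
                out.write(record_at(off))
```

    Without a cache every comparison costs two seeks; the LRU cache described above would sit inside record_at.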

  • Very large files checkin to DMS/content server

    Dear experts,
    We have DMS with content server up and running.
    However, there is a problem with very large files (up to 2GB!) - these files cannot be checked in due to extremely long upload times and maybe GUI limitations.
    Does anyone have suggestions how we could get very large files into the content server ? I only want to check them in (either via DMS or directly to the content server) and then get the URL back.
    best regards,
    Johannes

    Hi Johannes,
    unfortunately there is a limit of about 2 GB for files on the Content Server. Please note that such large files will cause very long upload times. If possible, I would recommend splitting the files into smaller parts and trying to check those in.
    From my point of view it is not recommended to put files directly on the Content Server, because this could lead to inconsistencies on the Content Server.
    Best regards,
    Christoph
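    If splitting is the route taken, a streaming split can be sketched in Python (part naming and sizes are arbitrary choices of mine; the parts can be reassembled with cat, copy /b, or the reverse loop):

```python
def split_file(path, part_size, block_size=1024 * 1024):
    """Split path into numbered parts of at most part_size bytes each.

    Copies in block_size pieces so no part is ever held in memory whole.
    Returns the list of part filenames (path.part000, path.part001, ...).
    """
    parts, index = [], 0
    with open(path, "rb") as src:
        while True:
            first = src.read(min(block_size, part_size))
            if not first:
                break  # source exhausted
            part_name = "%s.part%03d" % (path, index)
            with open(part_name, "wb") as dst:
                dst.write(first)
                written = len(first)
                while written < part_size:
                    chunk = src.read(min(block_size, part_size - written))
                    if not chunk:
                        break
                    dst.write(chunk)
                    written += len(chunk)
            parts.append(part_name)
            index += 1
    return parts
```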

  • Dreamweaver 8  hangs on opening large files

    I have a Windows XP SP2 Dell computer with 1 GB RAM, and every time I try to open a fairly large PHP file to edit, the Dreamweaver application hangs or freezes on me. When I open Task Manager, the CPU resources being used are split nearly 50/50 between the hung Dreamweaver app and "System Idle Process", with the memory usage of the hung Dreamweaver application gradually increasing.
    Has anyone here experienced anything like this? And if so, did you find a solution short of reformatting and reinstalling everything?
    I appreciate any and all help and/or suggestions!

    I notice the problem with any files over 100 KB; unfortunately, all of the files that I CAN open without problems are under 10-15 KB. So in all likelihood I may be having the same issues with files smaller than 100 KB (but larger than 15 KB) as well. I'll try your suggestions and see what happens.
    The funny thing is, these same files open just fine with Dreamweaver 7 (on a different but identical machine). I'm thinking it may be time to "rebuild" the computer that's running Dreamweaver 8. By "rebuild" I mean back up everything, reformat the hard drive, and then reinstall everything again. I normally try to do that once a year or so - and it's due....
    quote:
    Originally posted by: Newsgroup User
    G Evans,
    How large is the file that you are having a problem with? You may be seeing a combination of problems, such as DW being slow to open very large files and DW hanging on some markup that it does not recognize.
    Try making a copy of your file with a .htm file extension and see if that opens, to check whether the translation of the PHP markup is causing a problem. Next, try breaking your file into smaller files to either fix or isolate the problem.
    Hope this helps,
    Randy

  • How can we suggest a new DBA OCE certification for very large databases?

    How can we suggest a new DBA OCE certification for very large databases?
    What web site, or what phone number can we call to suggest creating a VLDB OCE certification.
    The largest databases that I have ever worked with were barely over 1 trillion bytes.
    Some people have told me that the work of a DBA totally changes when you have a VERY LARGE DATABASE.
    I could guess that maybe some of the following configuration topics might be on it:
    * Partitioning
    * Parallelism
    * Bigger block size (DSS vs. OLTP)
    * etc.
    Where could I send in a recommendation?
    Thanks Roger

    I wish there were some details about the OCE data warehousing.
    Look at the topics for 1Z0-515. Assume that the 'lightweight' topics will go (like Best Practices) and that there will be more technical topics added.
    Oracle Database 11g Data Warehousing Essentials | Oracle Certification Exam
    Overview of Data Warehousing
      Describe the benefits of a data warehouse
      Describe the technical characteristics of a data warehouse
      Describe the Oracle Database structures used primarily by a data warehouse
      Explain the use of materialized views
      Implement Database Resource Manager to control resource usage
      Identify and explain the benefits provided by standard Oracle Database 11g enhancements for a data warehouse
    Parallelism
      Explain how the Oracle optimizer determines the degree of parallelism
      Configure parallelism
      Explain how parallelism and partitioning work together
    Partitioning
      Describe types of partitioning
      Describe the benefits of partitioning
      Implement partition-wise joins
    Result Cache
      Describe how the SQL Result Cache operates
      Identify the scenarios which benefit the most from Result Set Caching
    OLAP
      Explain how Oracle OLAP delivers high performance
      Describe how applications can access data stored in Oracle OLAP cubes
    Advanced Compression
      Explain the benefits provided by Advanced Compression
      Explain how Advanced Compression operates
      Describe how Advanced Compression interacts with other Oracle options and utilities
    Data integration
      Explain Oracle's overall approach to data integration
      Describe the benefits provided by ODI
      Differentiate the components of ODI
      Create integration data flows with ODI
      Ensure data quality with OWB
      Explain the concept and use of real-time data integration
      Describe the architecture of Oracle's data integration solutions
    Data mining and analysis
      Describe the components of Oracle's Data Mining option
      Describe the analytical functions provided by Oracle Data Mining
      Identify use cases that can benefit from Oracle Data Mining
      Identify which Oracle products use Oracle Data Mining
    Sizing
      Properly size all resources to be used in a data warehouse configuration
    Exadata
      Describe the architecture of the Sun Oracle Database Machine
      Describe configuration options for an Exadata Storage Server
      Explain the advantages provided by the Exadata Storage Server
    Best practices for performance
      Employ best practices to load incremental data into a data warehouse
      Employ best practices for using Oracle features to implement high performance data warehouses

  • Very large file upload 2 GB with Adobe Flash

    Hello, anyone know how can I upload very large files with Adobe Flash or how to use SWFUpload?
    Thanks in Advance, All help will be very appreciated.

    1. Yes.
    2. I'm getting an error message from PHP:
    if( $_FILES['Filedata']['error'] == 0 ){ 
        if( move_uploaded_file( $_FILES['Filedata']['tmp_name'], $uploads_dir.$_FILES['Filedata']['name'] ) ){ 
            echo 'ok'; 
            exit(); 
        }
        echo 'error';    // Overhere
        exit();
    }

  • How do I control read start position in a very large file where start byte position may be larger than I32 (+/- 2^31)?

    Using LabView, I am trying to read a very large file which may be on the order of 2^32 bytes. I need to be able to step into the file at a byte position which may be greater than the I32 limit set by the read file.vi. Are there any options to the read file.vi or a method of circumventing this limitation?

    I'm not sure, but I think you can manage the "pos mode" in the "seek" sub-VI.
    The "pos mode" lets you choose the position to which the number of bytes you want to move is added.
    I think you can add an I32 number relative to "initial" in "pos mode", and later set "pos mode" to "current" to add another value. That way, you can move more than 2^31 bytes from the initial position.
    I hope you understand my idea; I haven't tried it, but I think it would work.
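    The same trick translated to Python, where a plain file object stands in for the LabVIEW file refnum (the per-step limit is simulated with a small value so the idea is visible on a small file):

```python
import os

def seek_large(f, target, step_limit=2**31 - 1):
    """Seek to absolute byte offset target using steps of at most step_limit.

    One absolute seek (LabVIEW pos mode "start"), then relative seeks
    (pos mode "current") until the remaining distance is covered.
    """
    step = min(target, step_limit)
    f.seek(step, os.SEEK_SET)       # absolute move, within the I32 limit
    remaining = target - step
    while remaining > 0:
        step = min(remaining, step_limit)
        f.seek(step, os.SEEK_CUR)   # relative move from the current position
        remaining -= step
    return f.tell()
```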

  • Web Service to transmit a file (a very large file)

    Greetings,
    Is is possible to use a web service to transmit a file, sometimes a very large file? If so, what are the security implications of doing such a thing?
    Thanks,
    Charles

    Quick answer: it depends... Did you try using Google to look up examples of doing this? Here is a URL to try: http://www.google.com/#sclient=psy&hl=en&source=hp&q=web+service+transfer+file&pbx=1&oq=web+service+transfer+file&aq=f&aqi=g1g-j3g-b1&aql=&gs_sm=e&gs_upl=1984731l1997518l0l1997874l25l19l0l1l1l0l304l3879l0.7.10.1l18l0&bav=on.2,or.r_gc.r_pw.&fp=7caaed5eb0414d97&biw=1280&bih=871
    Thank you,
    Tony Miller
    Webster, TX
    Never Surrender Dreams!
    JMS
    If this question is answered, please mark the thread as closed and assign points where earned.

  • How do I share a very large file?

    How do I share a very large file?

    Do you want to send a GarageBand project or the bounced audio file?  Sending an audio file is not critical, but if you want to send the project, use "File > Compress" to create a .zip file of the project before you send it.
    If you have a Dropbox account,  I'd simply copy the file into the Dropbox "public" folder and mail the link. Right-click the file in the Dropbox, then choose Dropbox > Copy Public Link. This copies an Internet link to your file that you can paste anywhere: emails, instant messages, blogs, etc.
    2 GB on Dropbox are free.     https://www.dropbox.com/help/category/Sharing

  • I inadvertently created a very large file. Not needing it, I sent it to the Trash. However, the deletion process never completes. Any suggestions on how to delete it?

    I inadvertently created a very large file on my hard drive. Not needing it, I sent it to the Trash. However, the deletion process never completes. Any suggestions on how to delete it? I re-installed the OS but the file was still there, taking up needed space.

    Command-click all of the files you want to delete and don't actually move them to the Trash until you've gone through the entire folder.
