Upload and process large files

We have a SharePoint 2013 on-premises installation and a business application that lets users copy local files to a UNC path, where some processing logic is applied before the files are copied into a SharePoint library. The current implementation is:
1. The user opens the application and clicks the “Web Upload” link in the left navigation. This opens a custom \Layouts page for selecting the upload file and its properties.
2. The user specifies the file details and chooses a web ZIP file from their local machine.
3. The Web Upload page submit action will
     a. call a WCF service to copy the ZIP file from the local machine to a preconfigured UNC path
     b. create a list item to store the file's properties along with the UNC path details
4. A timer job executes at a periodic interval to
     a. query the list for items that are NOT yet processed and find the path of the ZIP file folder
     b. unzip the selected file
     c. loop over the unzipped content and push it into the SharePoint library
     d. update the list item in the “Manual Upload List”
Can someone suggest a different design approach that manages the large files outside of the SharePoint context? Something like:
   1. An option to initiate the file copy from the user's local machine to the UNC path when the layouts page is submitted
   2. Instead of timer jobs, an external service that grabs data from the UNC path at periodic intervals and pushes it into SharePoint (see the sketch below).
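Something along these lines could cover item 2: a minimal sketch (the paths, site URL and library title are placeholders, not values from this post) of a console worker, run as a Windows Service or Scheduled Task, that polls the UNC drop folder, unzips each package, and streams the contents into the library with CSOM. The tracking-list update is omitted for brevity.

    // Sketch only: all paths, the site URL and the library title are assumptions.
    using System;
    using System.IO;
    using System.IO.Compression;            // ZipFile (System.IO.Compression.FileSystem assembly)
    using Microsoft.SharePoint.Client;

    class UncToSharePointWorker
    {
        const string DropFolder   = @"\\fileserver\WebUploadDrop";     // preconfigured UNC path (placeholder)
        const string SiteUrl      = "http://sharepoint/sites/portal";  // placeholder
        const string LibraryTitle = "Documents";                       // placeholder

        static void Main()
        {
            foreach (string zipPath in Directory.EnumerateFiles(DropFolder, "*.zip"))
            {
                // Unzip outside of SharePoint, in a temporary working folder.
                string workDir = Path.Combine(Path.GetTempPath(), Path.GetFileNameWithoutExtension(zipPath));
                ZipFile.ExtractToDirectory(zipPath, workDir);

                using (var ctx = new ClientContext(SiteUrl))
                {
                    ctx.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
                    List library = ctx.Web.Lists.GetByTitle(LibraryTitle);
                    ctx.Load(library.RootFolder);
                    ctx.ExecuteQuery();

                    foreach (string filePath in Directory.EnumerateFiles(workDir, "*", SearchOption.AllDirectories))
                    {
                        // SaveBinaryDirect streams the upload, so large files are never held fully in memory
                        // and do not go through the normal CSOM request size limit.
                        string targetUrl = library.RootFolder.ServerRelativeUrl + "/" + Path.GetFileName(filePath);
                        using (FileStream fs = System.IO.File.OpenRead(filePath))
                        {
                            Microsoft.SharePoint.Client.File.SaveBinaryDirect(ctx, targetUrl, fs, true);
                        }
                    }
                }

                Directory.Delete(workDir, true);
                System.IO.File.Delete(zipPath);   // or move the zip to an archive folder instead
            }
        }
    }

Because the worker runs outside the SharePoint process, the heavy unzip and copy work never competes with the web front end or a timer job.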

Hi,
According to your post, my understanding is that you want to upload and process files for a SharePoint 2013 server.
The following suggestions are for your reference:
1. Create a service that processes the uploaded file and copies the files to the UNC folder.
2. Create a file-upload visual web part that calls the file-processing service (a sketch follows below).
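For point 2, a rough sketch of what the upload handler behind such a visual web part might do; here the handler writes to the UNC share directly rather than through a WCF service, and the control ID, UNC path, list name and column names are placeholders, not confirmed values:

    // Sketch only: code-behind of a visual web part user control with an
    // <asp:FileUpload ID="FileUploadControl"> and a button wired to UploadButton_Click.
    using System;
    using System.IO;
    using System.Web.UI;
    using System.Web.UI.WebControls;
    using Microsoft.SharePoint;

    public partial class WebUploadUserControl : UserControl
    {
        const string DropFolder   = @"\\fileserver\WebUploadDrop";   // placeholder UNC path
        const string TrackingList = "Manual Upload List";

        protected FileUpload FileUploadControl;   // normally declared in the .ascx markup

        protected void UploadButton_Click(object sender, EventArgs e)
        {
            if (!FileUploadControl.HasFile) return;

            // Stream the upload straight to the UNC share so the large file never
            // touches the SharePoint content database at this stage.
            string target = Path.Combine(DropFolder, Path.GetFileName(FileUploadControl.FileName));
            using (FileStream fs = File.Create(target))
            {
                FileUploadControl.PostedFile.InputStream.CopyTo(fs);
            }

            // Record the drop so the external processing service knows what to pick up.
            SPListItem item = SPContext.Current.Web.Lists[TrackingList].AddItem();
            item["Title"]     = FileUploadControl.FileName;
            item["UncPath"]   = target;   // assumed custom column
            item["Processed"] = false;    // assumed custom column
            item.Update();
        }
    }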
Thanks,
Dennis Guo
TechNet Community Support
Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
[email protected]

Similar Messages

  • Processing Large Files using Chunk Mode with ICO

    Hi All,
    I am trying to process large files using an ICO. I am on PI 7.3 and am using the new PI 7.3 feature to split the input file into chunks.
    I know that we cannot use mapping while using chunk mode.
    While trying this, I noticed the following:
    1) I created a Data Type, Message Type and interfaces in the ESR and used them in my scenario (no mapping was defined); the sender and receiver data types were the same.
    Result: the scenario did not work. It created only one chunk file (a .tmp file) and terminated.
    2) I used a dummy interface in my scenario and it worked fine.
    Please confirm whether we should always use dummy interfaces in the scenario when using chunk mode in PI 7.3, or whether there is something I am missing.
    Thanks in Advance,
    - Pooja.

    Hello,
    According to this blog:
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    The following limitations apply to chunk mode in the File Adapter.
    As per the screenshots in that blog, the split never considers the payload; it is just a binary split. So the following limitations would apply:
    Only for File Sender to File Receiver
    No Mapping
    No Content Based Routing
    No Content Conversion
    No Custom Modules
    You are probably doing content conversion, which is why it is not working.
    Hope this helps,
    Mark
    Edited by: Mark Dihiansan on Mar 5, 2012 12:58 PM

  • Process large file using BPEL

    My project has a requirement to process a large file (10 MB) all at once. In the project, the file adapter reads the file and then calls 5 other BPEL processes to perform 10 different validations before delivering the data to an Oracle database. I can't use the debatching feature of the adapter because of the header and detail record validation requirement. I did some performance tuning (e.g. audit level to minimum, logging level to error, JVM size to 2 GB, etc.) as per the performance tuning described in the Oracle BPEL user guide. We are using a 4-CPU, 4 GB RAM IBM AIX 5L server. I observed that the Receive activity at the beginning of each process is taking a lot of time, while the other, transient parts of the processes perform as expected.
    Following are statistics for receive activity per BPEL process:
    500KB: 40 Sec
    3MB: 1 Hour
    Because we have 5 BPEL processes, a lot of time is wasted in the receive activity.
    I haven't tried 10 MB so far because of the poor performance figures for the 3 MB file.
    Does anyone have an idea how to improve the performance of the initial receive activity of a BPEL process?
    Thanks
    -Simanchal

    I believe the limit in SOA Suite is 7 MB if you want to use the full payload and perform some kind of orchestration. Otherwise you need to do some kind of debatching, which you stated will not work.
    SOA Suite is not really designed for this kind of use case, as it needs to process the file in memory; when any transformation occurs, the message can grow by 3 to 10 times. If you are writing to a database, why can't you read the rows one by one?
    If you want to perform this kind of action, have a look at ODI (Oracle Data Integrator). I also believe that OSB (AquaLogic) can handle files up to 200 MB, so this can be an option as well, but it may require debatching.
    cheers
    James

  • File adapter  reading and writing large files

    Hi, we are getting an error when trying to process large files (80 to 100 MB) using file adapters. We need to read the inbound files and write them to another folder on another server. The error we are getting is out of memory. Thanks.

    Hi,
    Use an asynchronous process or a checkpoint() so that you can see your instance before it times out.
    --Khaleel                                                                                                                                                                                                                           

  • Upload and Reading Excel File in Web Dynpro

    Hi all,
    I have a requirement in my application (in 04s) where I need to upload an Excel file from a client through a Web Dynpro application (using the FileUpload UI element), read the entire content of that Excel file in Web Dynpro, and process the data accordingly.
    The format of the excel is fixed and pre-defined.
    I went through a lot of blogs, but could not find a direct and exact solution to this requirement.
    Please help me.
    Looking forward to your contribution
    Thank you,
    Gita KC.

    Reading Excel Sheet from Java without using any Framework
    Enhanced File Upload - Uploading and Processing Excel Sheets
    Reading Multiple Sheets of Excel Sheet from Java
    nikhil

  • Problem while processing large files

    Hi
    I am facing a problem while processing large files.
    I have a file that is around 72 MB. It has more than 100,000 (1 lakh) records. XI is able to pick up the file if it has 30,000 records. If the file has more than 30,000 records, XI picks up the file (and deletes it once picked up), but I don't see any information under SXMB_MONI: no error, success, or processing status. It simply picks up and ignores the file. If I process these records separately, it works.
    How can I process this file? Why is XI simply ignoring the file? How can I solve this problem?
    Thanks & Regards
    Sowmya.

    Hi,
    XI picks up the file based on the maximum processing limit as well as the memory and resource consumption of the XI server.
    Processing a 72 MB file is on the higher side. It increases the memory utilization of the XI server, and processing may fail at the maximum point.
    You should divide the file into small chunks and allow multiple instances to run. It will be faster and will not create any problems.
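    As an illustration of the chunking idea only (this is an addition, not something from the thread), a small sender-side utility along these lines could split the flat file by record count before XI picks it up; the 30,000-record figure and the paths are assumptions taken from the numbers mentioned in the question:

        // Sketch: split a large flat file into chunk files of at most N records each.
        using System;
        using System.IO;

        class FileSplitter
        {
            static void Main()
            {
                const int recordsPerChunk = 30000;                       // assumed safe record count
                const string inputFile = @"C:\outbound\bigfile.txt";     // placeholder
                const string outputDir = @"C:\outbound\chunks";          // placeholder (XI source directory)

                Directory.CreateDirectory(outputDir);
                int chunkIndex = 0, lineCount = 0;
                StreamWriter writer = null;

                // File.ReadLines streams the file line by line, so the 72 MB file is never fully in memory.
                foreach (string line in File.ReadLines(inputFile))
                {
                    if (lineCount % recordsPerChunk == 0)
                    {
                        if (writer != null) writer.Dispose();
                        chunkIndex++;
                        writer = new StreamWriter(Path.Combine(outputDir, string.Format("bigfile_{0:D3}.txt", chunkIndex)));
                    }
                    writer.WriteLine(line);
                    lineCount++;
                }
                if (writer != null) writer.Dispose();
            }
        }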
    Refer
    SAP Network Blog: Night Mare-Processing huge files in SAP XI
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
    Processing huge file loads through XI
    File Limit -- please refer to SAP note: 821267 chapter 14
    File Limit
    Thanks
    swarup
    Edited by: Swarup Sawant on Jun 26, 2008 7:02 AM

  • Processing large files on Mac OS X Lion

    Hi All,
    I need to process large files (a few GB) from a measurement. The data files contain lists of measured events. I process them event by event, and the result is relatively small and does not occupy much memory. The problem I am facing is that Lion "thinks" that I want to use the large data files again later and puts them into the cache (inactive memory). The inactive memory grows while reading the data files up to the point where the whole memory is full (8 GB on a MacBook Pro mid-2010) and it starts swapping a lot. That of course slows down the computer considerably, including the process that reads the data.
    If I run the "purge" command in Terminal, the inactive memory is cleared and the machine becomes more responsive again. The question is: is there any way to prevent Lion from pushing running programs out of memory into swap at the cost of a useless hard drive cache?
    Thanks for suggestions.

    It's been a while, but I recall using the "dd" command ("man dd" for info) to copy specific portions of data from one disk, device or file to another (in 512-byte increments).  You might be able to use it in a script to fetch parts of your larger file as you need them, and dd can read from and/or write to standard input/output, so it's easy to get data and store it in a temporary container like a file or even a variable.
    Otherwise, if you can afford it, and you might with 8 GB of RAM, you could try to disable swapping (paging to disk) altogether and see if that helps...
    To disable paging, run the following command (in one line) in Terminal and reboot:
    sudo launchctl unload -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist
    To re-enable paging, run the following command (in one line) in Terminal:
    sudo launchctl load -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist
    Hope this helps!

  • How do I upload and attach the file to Service Request

    [This thread was migrated from the On Demand Developer Forum in the old Siebel Community]
    drangineni
    New Contributor
    Hi,
    I am trying to upload a file and add it to a Service object as an attachment. I would greatly appreciate it if you could provide any code or a sample...
    Thanks.
    Daya
    Product: CRM OnDemand
    10-21-2006 10:58 AM
    Re: How do I upload and attach the file to Service Request...
    BigSlick
    Valued Contributor
    On Demand doesn't support adding attachments via web services. One solution I've seen, depending on the scenario, is to create a web link field on an object that is based on a custom text field. Your web service can populate that custom field, and the web link can generate a dynamic link to the file in On Demand's UI. However, this depends on where the attachment is located and whether the user needs some sort of firewall access.
    Hope this helps
    -BigSlick
    10-23-2006 11:43 AM

    Yes, this still holds true.
    Bardo

  • How to upload  and download a files into AL11 directory in ABAP

    Hi,
                   How do I upload and download files to the AL11 directory in ABAP?
    thanks
    Moderator message: please search for available information/documentation.
    Edited by: Thomas Zloch on Mar 21, 2011 9:18 AM

    You should try one of these forums for an answer to your question:
    http://swforum.sun.com/jive/forum.jspa?forumID=116
    http://community.java.net/netbeans
    http://linux.java.net

  • Apex application file -upload and download a file.

    hi
    I'm having an issue with an application I created; it's about uploading and downloading a file in the application. The application is working and I was able to upload and download a file, but I have no idea where the file is stored in the application database. I tried to search for the file; it's something to do with apex_application_files and I can't find it.
    Any idea where the file is stored?
    Thanks
    nivesh

    Dear nivesh!
    If you upload a file into an APEX application, the file is temporarily stored in the APEX_APPLICATION_FILES table. If you close your current application page, the APEX_APPLICATION_FILES table will be cleared. You should create your own table to store files in a BLOB column. I've created an example of uploading images into an APEX application on apex.oracle.com. If you want to have a look at it, please use the following credentials:
    Workspace: flo_demo
    Username: dev_null
    Password: password
    Application: 61811
    Yours sincerely
    Florian W.

  • Upload and plot text files

    Hi
    I would like to upload several text files and plot them as overlays in LabVIEW.
    Uploading and plotting one file is not a problem, but I want to load several and see them overlaid in a single plot.  Presumably I need a looping routine that loads each file independently, puts the data into an array, and then plots each of the arrays on the X-Y graph.  I just can't figure out a way of doing it.  Can anybody help?

    Hi.
    Here is a small example (LV7.1)
    Message Edited by EVS on 09-05-2005 05:01 PM
    Jack
    Win XP
    LabVIEW 6.1, 7.0, 7.1, LabWindows/ CVI 7.1
    Let us speak Russian
    Attachments:
    Add PlotXY.vi ‏56 KB
    Clip_7.jpg ‏64 KB

  • Using Windows 8.1 and copying large files over USB 3.0 to a Seagate Backup Plus USB hard drive - I get the following error:

    Using Windows 8.1 and copying large files over USB 3.0 to a Seagate hard drive - I get the following error:
    Error 0x80070079: The semaphore timeout period has expired
    I am using Windows 8.1 with a 3-4 TB USB Seagate drive that uses Microsoft's BitLocker. It seems I can copy small files without issues; however, copying large files and directories causes this error. Connectivity is USB 3.0 from a Dell Latitude E6520 to the Seagate drive.
    Can anyone share a fix or a methodology to determine one?

    Hi Joe,
    Did the error go away when you turned off BitLocker on this drive?
    If no, please refer to this similar thread's solution to resolve:
    https://social.technet.microsoft.com/Forums/windows/en-US/c3fc9f5d-c073-4a9f-bb3d-b7bb8f893f78/error-0x80070079-the-semaphore-timeout-period-has-expired?forum=itprovistanetworking
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • How can I use the "Correct camera distortion" filter and process multiple files in PSE 11?

    How can I use the "Correct camera distortion" filter and process multiple files in PSE 11?

    Did you check the help pages for Correct Camera Distortion and Process Multiple Files?
    Correct Camera Distortion: http://helpx.adobe.com/photoshop-elements/using/retouching-correcting.html#main-pars_heading_5
    Process multiple files: http://help.adobe.com/en_US/photoshopelements/using/WS287f927bd30d4b1f89cffc612e28adab65-7fff.html#WS287f927bd30d4b1f89cffc612e28adab65-7ff6

  • How to upload and download any file from plsql through weblogic server

    hi all,
    How can I upload and download any file from PL/SQL through a WebLogic server? I am using Oracle 10g Express Edition and JBoss.
    Thanks and Regards,
    MSORA

    Hi Bala,
    For a Windows server you can use VNC (virtual network connection), which opens a session on your desktop; you can then drag and drop files from there and vice versa. For a Linux box you can use WinSCP, which opens a session with an interface to your desktop. In both cases you can upload and download files very easily, just as we drag and drop items on a simple PC; we use the same technique.
    Bye,
    Vamshi

  • Upload and download the file same name but different extension from the document library.

    HI,
    I am using the Client Object Model (Copy.asmx) to upload and download files from the document library.
    I have a mandatory File ID for each document.
    I tried to upload the document (KIF53.txt) with File ID (KIF53); it uploaded successfully.
    Then I tried to upload the document (KIF53.docx) with File ID (KIF53); it uploaded the file but did not set the File ID in the column.
    Please find the screenshot below for reference.
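    One way to make sure the column is stamped on every upload (a sketch only, assuming CSOM rather than Copy.asmx; the site URL, library title, file path and the internal column name "FileID" are guesses) is to set the field on the file's list item right after the upload:

        // Assumes: using Microsoft.SharePoint.Client;
        using (var ctx = new ClientContext("http://sharepoint/sites/test"))        // placeholder URL
        {
            var fileInfo = new FileCreationInformation
            {
                Url = "KIF53.docx",
                Content = System.IO.File.ReadAllBytes(@"C:\temp\KIF53.docx"),      // placeholder path
                Overwrite = true
            };
            List library = ctx.Web.Lists.GetByTitle("Documents");                  // placeholder library title
            Microsoft.SharePoint.Client.File uploaded = library.RootFolder.Files.Add(fileInfo);

            // Stamp the custom column on the newly uploaded file's list item.
            ListItem item = uploaded.ListItemAllFields;
            item["FileID"] = "KIF53";   // assumed internal name of the File ID column
            item.Update();
            ctx.ExecuteQuery();
        }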

    Thanks, Ashish. I tried that.
    My requirement is to create a folder and subfolder in a SharePoint document library. If they already exist, leave them; otherwise create the new folder and subfolder in the document library using the client-side object model.
    I am able to check for the parent folder,
    but I am not able to check for the subfolder in the document library.
    How do I check for the subfolder in the document library?
    Here is the code for checking whether the folder already exists:
    private string IsFolderExist(string InputFolderName)
    {
        string retStatus = false.ToString();
        try
        {
            ClientContext context = new ClientContext(Convert.ToString(ConfigurationManager.AppSettings["DocumentLibraryLink"]));
            context.Credentials = CredentialCache.DefaultCredentials;
            List list = context.Web.Lists.GetByTitle(Convert.ToString(ConfigurationManager.AppSettings["DocumentLibraryName"]));

            // Query for any item whose folder path (FileDirRef) matches the requested folder.
            CamlQuery camlQueryForItem = new CamlQuery();
            camlQueryForItem.ViewXml = string.Format(@"<View Scope='RecursiveAll'>
                <Query>
                  <Where>
                    <Eq>
                      <FieldRef Name='FileDirRef'/>
                      <Value Type='Text'>{0}</Value>
                    </Eq>
                  </Where>
                </Query>
              </View>", "/sites/test/hcl/" + InputFolderName);

            ListItemCollection listItems = list.GetItems(camlQueryForItem);
            context.Load(listItems);
            context.ExecuteQuery();

            retStatus = (listItems.Count > 0).ToString();
        }
        catch (Exception ex)
        {
            retStatus = "X02";   // error marker
        }
        return retStatus;
    }
    thanks
    Sundhar 
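    In case it is useful, one common way to test a nested folder directly with CSOM (a sketch only, not from the original post; the server-relative URL below is a placeholder) is to load it by its full server-relative URL and treat a "file not found" ServerException as "does not exist":

        // Assumes: using Microsoft.SharePoint.Client;
        private bool SubFolderExists(ClientContext context, string serverRelativeUrl)
        {
            try
            {
                Folder folder = context.Web.GetFolderByServerRelativeUrl(serverRelativeUrl);
                context.Load(folder);
                context.ExecuteQuery();        // throws ServerException if the folder does not exist
                return true;
            }
            catch (ServerException)
            {
                return false;
            }
        }

        // Example: SubFolderExists(context, "/sites/test/hcl/ParentFolder/SubFolder");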
