Problem uploading a file with too long a filename in a Content Area - ORA-01401

Hi All
I have a customer who is trying to upload a file with a very long filename into a content area.
If the filename is 80 characters or fewer, it works fine.
But if the filename is longer than 80 characters, the following error occurs:
ORA-01401: inserted value too large for column
DAD name: portal30
PROCEDURE : PORTAL30.wwv_add_wizard.edititem
URL : http://sbastida-us:80/pls/portal30/PORTAL30.wwv_add_wizard.edititem
I checked the table WWV_DOCUMENT, and it has a column named FILENAME which is defined as VARCHAR2(350).
If I run a query on this table, the FILENAME column stores the filename of the uploaded file.
Is this a new bug? Should I file a bug for it?
Do you have any idea about this issue?
Thanks in advance,
Catalina

Catalina,
The following restrictions apply to the names of documents that you are uploading into the repository:
Filenames must be 80 characters or less.
Filenames must not include any of these characters: \ / : * ? < > | " % # +
Filenames can include spaces and any of these characters: ! @ ~ & . $ ^ ( ) - _ ` ' [ ] { } ; =
Regards,
Jerry
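For anyone who wants to catch this before Portal rejects the upload, here is a minimal client-side sketch in plain Java (not part of the Portal API; the 80-character limit and the forbidden-character list are taken from Jerry's reply above):

import java.util.regex.Pattern;

public class PortalFilenameCheck {

    // Characters Portal does not accept in document names (from the restrictions above).
    private static final Pattern FORBIDDEN = Pattern.compile("[\\\\/:*?<>|\"%#+]");

    /** Returns null if the name is acceptable, otherwise a message describing the problem. */
    public static String validate(String filename) {
        if (filename == null || filename.length() == 0) {
            return "Filename is empty";
        }
        if (filename.length() > 80) {
            return "Filename is longer than 80 characters (" + filename.length() + ")";
        }
        if (FORBIDDEN.matcher(filename).find()) {
            return "Filename contains a forbidden character (\\ / : * ? < > | \" % # +)";
        }
        return null;   // acceptable
    }

    public static void main(String[] args) {
        System.out.println(validate("quarterly report (final).doc"));   // prints null - OK
        System.out.println(validate("draft#3.doc"));                    // flags the '#'
    }
}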

Similar Messages

  • Upload very large file to SharePoint Online

    Hi,
    I tried uploading very large files to SharePoint Online via the REST API using Java. Uploading files with a size of ~160 MB works well, but if the file size is between 400 MB and 1.6 GB I either receive a SocketException (Connection reset by peer) or an error
    from SharePoint telling me "the security validation for the page is invalid".
    So first of all I want to ask how to work around the security validation error. As far as I understand, the token which I have added to the X-RequestDigest header of the HTTP POST request seems to have expired (by default it expires after 1800 seconds).
    Uploading such large files is time consuming, so I can't change the token in my header while uploading the file. How can I extend the expiration time of the token (if that is possible)? Could it help to send POST requests with the same token
    continuously while uploading the file and thereby prevent the token from expiring? Is there any other strategy for uploading such large files, which need much more than 1800 seconds to upload?
    Additionally, any thoughts on the socket exception? It happens quite sporadically: sometimes after uploading 100 MB, and other times after I have already uploaded 1 GB.
    Thanks in advance!

    Hi,
    Thanks for the reply. The reason I'm looking into this is so users can migrate their files to SharePoint Online/Office 365. The max file limit is 2 GB, so I thought I would try and cover that.
    I've looked into that link before, and when I try to use those endpoints, i.e. StartUpload, I get the following response:
    {"error":{"code":"-1, System.NotImplementedException","message":{"lang":"en-US","value":"The
    method or operation is not implemented."}}}
    I can only presume that these endpoints have not been implemented yet, even though the documentation says they are only available for Office 365, which is what I am using.
    Also, the other strange thing is that I can actually upload a file of 512 MB to the server, as I can see it in the web UI. The problem I am experiencing is getting a response; it hangs on this line until the timeout is reached:
    WebResponse wresp = wreq.GetResponse();
    I've tried flushing and closing the stream, and setting SendChunked, but still no success.
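    On the expiring security validation: the form digest can be re-requested from the /_api/contextinfo endpoint while the upload is still running. The sketch below is only an outline in Java (it assumes authentication cookies/credentials are handled elsewhere, and it uses a crude regex instead of a real JSON parser); whether a single long-running POST can pick up a refreshed digest is a separate question - chunked uploads, where each chunk is a new request with a fresh digest, are the usual answer when the StartUpload/ContinueUpload/FinishUpload endpoints are available.

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Scanner;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class DigestRefresher {
        private final String siteUrl;        // e.g. https://tenant.sharepoint.com/sites/dev (placeholder)
        private volatile String digest;      // latest X-RequestDigest value

        public DigestRefresher(String siteUrl) { this.siteUrl = siteUrl; }

        public String currentDigest() { return digest; }

        /** POSTs to /_api/contextinfo and stores the FormDigestValue from the response. */
        public void refresh() throws Exception {
            HttpURLConnection c = (HttpURLConnection) new URL(siteUrl + "/_api/contextinfo").openConnection();
            c.setRequestMethod("POST");
            c.setRequestProperty("Accept", "application/json;odata=verbose");
            c.setDoOutput(true);
            c.getOutputStream().close();     // empty request body
            try (InputStream in = c.getInputStream();
                 Scanner s = new Scanner(in, StandardCharsets.UTF_8.name()).useDelimiter("\\A")) {
                String body = s.hasNext() ? s.next() : "";
                Matcher m = Pattern.compile("\"FormDigestValue\"\\s*:\\s*\"([^\"]+)\"").matcher(body);
                if (m.find()) digest = m.group(1);
            }
        }

        /** Refresh every 20 minutes, well inside the default 1800-second lifetime. */
        public ScheduledExecutorService start() {
            ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor();
            exec.scheduleAtFixedRate(() -> {
                try { refresh(); } catch (Exception e) { e.printStackTrace(); }
            }, 0, 20, TimeUnit.MINUTES);
            return exec;
        }
    }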

  • JspSmartUpload  uploading too many files

    Has anyone else had a problem with too many files being uploaded?
    My code is as follows:
    <FORM NAME="frmAddDoc" ACTION="process_add.jsp" METHOD="post" ENCTYPE="multipart/form-data">
    <%@ include file="attachments.html"%>
    and attachments.html code specifies two files to be uploaded as follows:
    <TR>
    <TD CLASS="formFieldLabel">Source Document</TD>
    <TD><INPUT TYPE="File" NAME="sourceDocFileName" SIZE="12" CLASS="fileField"></TD>
    <TD><IMG SRC="../shared/images/shim_trans.gif"></TD>
    </TR>
    <TR>
    <TD CLASS="formFieldLabel">Published Document</TD>
    <TD><INPUT TYPE="File" NAME="docFileName" SIZE="12" CLASS="fileField"></TD>
    <TD><IMG SRC="../shared/images/shim_trans.gif"></TD>
    </TR>
    In process_add.jsp there is a call to the UploadBean.upload method which is as follows:
    public UploadResult upload(PageContext pageContext, String CSVFileTypesAllowed) {
        StringBuffer statusMsg = new StringBuffer();
        int uploadedFilesCount = 0;
        // Upload initialization
        uploadHelper.initialize(pageContext);
        uploadHelper.setAllowedFilesList(CSVFileTypesAllowed);
        uploadHelper.upload();
        uploadedFilesCount = uploadHelper.save(uploadTempDir, uploadHelper.SAVE_PHYSICAL);
        return uploadResult;
    }
    This works most of the time but occasionally four files are uploaded instead of two (i.e. uploadedFilesCount and uploadedFiles.getCount() are both 4 when they should be 2).
    (UploadHelper = new SmartUpload)
    Has anybody any ideas what could be causing this?
    I have been chasing this "bug" for a week now and it has me beat.
    Can anybody help?
    Thanks

    Hi
    Try looking at the jspSmartUpload documentation; most people here (at least 98%) do not use these third-party tools but rather develop their own, so it might be difficult for you to get an answer to your post.
    Swaraj
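    If you do end up rolling your own upload handling, as Swaraj suggests, Apache Commons FileUpload makes it easy to see exactly how many file parts arrived in a request, which would at least show whether the duplication happens on the client or the server. A rough servlet-side sketch (commons-fileupload 1.3-style API; the temp directory is a placeholder):

    import java.io.File;
    import java.io.IOException;
    import java.util.List;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.commons.fileupload.FileItem;
    import org.apache.commons.fileupload.disk.DiskFileItemFactory;
    import org.apache.commons.fileupload.servlet.ServletFileUpload;

    public class AddDocumentServlet extends HttpServlet {
        protected void doPost(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            try {
                ServletFileUpload upload = new ServletFileUpload(new DiskFileItemFactory());
                List<FileItem> items = upload.parseRequest(request);

                int fileCount = 0;
                for (FileItem item : items) {
                    // Count only real file parts that actually carry a filename.
                    if (!item.isFormField() && item.getName() != null && item.getName().length() > 0) {
                        fileCount++;
                        // Strip any client-side path, then save to a temp directory (placeholder path).
                        item.write(new File("/tmp/uploads", new File(item.getName()).getName()));
                    }
                }
                response.getWriter().println("Files received: " + fileCount);
            } catch (Exception e) {
                throw new ServletException(e);
            }
        }
    }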

  • Upload a large file to OCS10g workspace

    Hi,
    I have a workspace on OCS10g and the quota of the workspace is 2GB.
    But when I upload a large file - about 100 MB - errors occur.
    Undoubtedly, it works well in the case of small files.
    I have already set the parameter IFS.DOMAIN.MEDIA.CONTENTTRANSFER.ContentLimit to 0.
    That means there is no limit on uploads to a workspace.
    Is there any other prerequisite to upload a large file to a workspace?
    Regards,
    Kitae

    Have you checked the Apache Timeout directive? It might need to be larger. There are no other general tricks or parameters to tweak that our dev team is aware of. Please file a metalink tar for this and detail the error message. I'm sure Oracle Support Services will be able to help you mine the logs to find out what is going on.
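    For reference, the directive mentioned above is set in httpd.conf; something like the following (the value is only an example, the default is 300 seconds) gives a slow 100 MB upload room to finish before Apache gives up:

    # httpd.conf - allow long-running uploads to complete before the server times out
    Timeout 1200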

  • Problem while processing large files

    Hi
    I am facing a problem while processing large files.
    I have a file which is around 72 MB. It has more than 100,000 records. XI is able to pick up the file if it has 30,000 records. If the file has more than 30,000 records, XI picks up the file (once it picks it up, it deletes the file), but I don't see any information under SXMB_MONI - no error, no success, no "processing". It simply picks up and ignores the file. If I process these records separately, it works.
    How do I process this file? Why is it simply ignoring the file? How can I solve this problem?
    Thanks & Regards
    Sowmya.

    Hi,
    XI picks up the file based on the maximum processing limit as well as the memory and resource consumption of the XI server.
    Processing a 72 MB file is on the high side. It increases the memory utilization of the XI server, which may fail once it hits that maximum.
    You should divide the file into small chunks and allow multiple instances to run (a sketch of such a splitter follows at the end of this reply). It will be faster and will not create any problems.
    Refer
    SAP Network Blog: Night Mare-Processing huge files in SAP XI
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
    Processing huge file loads through XI
    File Limit -- please refer to SAP note: 821267 chapter 14
    File Limit
    Thanks
    swarup
    Edited by: Swarup Sawant on Jun 26, 2008 7:02 AM
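    If splitting the source file outside XI is an option, a small utility like the sketch below can produce the smaller files that the sender channel then picks up one by one (plain Java; one record per line is assumed, and the chunk size and file names are only placeholders):

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;

    public class FileSplitter {
        public static void main(String[] args) throws IOException {
            String source = "orders_full.txt";    // the 72 MB input file (placeholder name)
            int recordsPerChunk = 30000;          // the size XI handled comfortably

            BufferedReader in = new BufferedReader(new FileReader(source));
            String line;
            int lineNo = 0, chunkNo = 0;
            BufferedWriter out = null;
            while ((line = in.readLine()) != null) {
                if (lineNo % recordsPerChunk == 0) {
                    if (out != null) out.close();
                    // start a new chunk file every recordsPerChunk lines
                    out = new BufferedWriter(new FileWriter("orders_part_" + (++chunkNo) + ".txt"));
                }
                out.write(line);
                out.newLine();
                lineNo++;
            }
            if (out != null) out.close();
            in.close();
            System.out.println("Wrote " + chunkNo + " chunk files from " + lineNo + " records");
        }
    }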

  • Problems Uploading a Pages file in html format including pictures to web pa

    I have problems uploading a Pages file in HTML format to a web page.
    The file itself is downloaded, but the images are not. How can I upload a Pages file as HTML including the pictures or illustrations?

    I just bought a MacBook as well as the iWork software. I am having a few problems. I am a student, and when I try to print out documents (originally Word documents) it puts them automatically into TextEdit instead of Pages. Also, the pages get all screwed up, things are missing, etc. How do I make Pages my default, and do I need to export, import, or do something else so that these documents show up perfectly in Pages?
    A second question: when I try to print out slideshows in Keynote (originally from PowerPoint), the handouts print on the little edge of the paper horizontally rather than the long way. I am hoping these are easy adjustments.
    I would appreciate any help anyone can give me.
    Thank you!

  • Uploading a text file from webi filter area as part of the query condition

    Post Author: balasura
    CA Forum: Publishing
    Requirement: Uploading a text file from the WebI filter area as part of the query condition.
    Hi, I have a pressing requirement which I am not sure is available in BO XI. Can someone help me, please? I am using BO XI R2, WebI. I am generating an ad-hoc report, and when I give a filter condition for the report, the condition should be uploaded from a .txt file. In the current scenario we have a LOV, but a LOV can hold only a small number of values; my requirement is just like a LOV, except that the list of values will be available in a text file (which could run to 2000 or 2500 rows). I would like to upload these 2500 values as a flat text file to build the query and generate the report. Is this possible in BO XI?
    For example:
    Select * from Shipment Where "Shipment id = 'SC4539' or Shipment id = 'SC4598'"
    The "where" condition (filter), which holds the shipment ids, should be loaded from the .txt file so that it becomes part of the filter condition. The content of the .txt file could be this:
    shipment.txt
    ===============
    SC4539
    sc2034
    SC2343
    SC3892
    . . . . etc., up to 2500 shipment ids
    I will be very glad if someone could provide a solution. Thanks in advance. - Bala

    Hi Ron,
    This user does not have access to Tcode ST01.
    The user executed Tcode SU53 immediately after the authorization failure to see the authorization objects. The 'Authorization obj' field is blank, and under the description it says 'The last Authorization check was successful' with a green tick mark.
    Any further suggestions, PLEASE.
    Thanks.
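    Back to the original question about loading filter values from a text file: WebI itself has no such upload, but outside BO the same effect can be had by reading shipment.txt and batching the ids into IN lists against the database directly. A rough JDBC sketch (connection details are placeholders; the 1000-item batch size is there because Oracle caps IN lists at 1000 expressions):

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.List;

    public class ShipmentFilterFromFile {
        public static void main(String[] args) throws Exception {
            List<String> ids = Files.readAllLines(Paths.get("shipment.txt"));  // ~2500 ids, one per line
            ids.removeIf(s -> s.trim().isEmpty());
            int batchSize = 1000;                                              // Oracle IN-list limit

            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pw")) {   // placeholder connection
                for (int start = 0; start < ids.size(); start += batchSize) {
                    List<String> batch = ids.subList(start, Math.min(start + batchSize, ids.size()));
                    // Build "select * from shipment where shipment_id in (?, ?, ...)" for this batch.
                    StringBuilder sql = new StringBuilder("select * from shipment where shipment_id in (");
                    for (int i = 0; i < batch.size(); i++) sql.append(i == 0 ? "?" : ",?");
                    sql.append(")");
                    try (PreparedStatement ps = con.prepareStatement(sql.toString())) {
                        for (int i = 0; i < batch.size(); i++) ps.setString(i + 1, batch.get(i).trim());
                        try (ResultSet rs = ps.executeQuery()) {
                            while (rs.next()) {
                                // process each matching shipment row here
                            }
                        }
                    }
                }
            }
        }
    }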

  • File handling in Content areas

    Where is it determined what happens when you select a file in a content area folder? If I want to only be able to download the file and not try to read it, is there a setting that may be changed to allow this?
    Thanks,
    Chris

    Chris,
    We just do whatever the browser is configured to do with the file type. There are no special settings available in Portal.
    Regards,
    Jerry
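    Portal itself has no switch for this, but for anyone serving the file through their own code, the standard way to force a download rather than inline display is the Content-Disposition header. A minimal servlet sketch (paths and parameter names are placeholders; a real implementation must validate the requested name against path traversal):

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class DownloadServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            Path file = Paths.get("/content/docs", req.getParameter("name"));  // placeholder location
            resp.setContentType("application/octet-stream");                   // generic binary type
            // "attachment" tells the browser to save the file rather than render it inline.
            resp.setHeader("Content-Disposition", "attachment; filename=\"" + file.getFileName() + "\"");
            resp.setContentLengthLong(Files.size(file));                       // Servlet 3.1+
            try (InputStream in = Files.newInputStream(file); OutputStream out = resp.getOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) > 0) out.write(buf, 0, n);
            }
        }
    }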

  • Uploading Very Large Files via HTTP

    I am developing some classes that must upload files to a web server via HTTP and multipart/form-data. I am using Apache's Tomcat FileUpload library contained within the commons-fileupload-1.0.jar file on the server side. My code fails on large files or large quantities of small files because of the memory restriction of the VM. For example, when uploading a 429 MB file I get this exception:
    java.lang.OutOfMemoryError
    Exception in thread "main"
    I have never been successful in uploading, regardless of the server-side component, more than ~30 MB.
    In a production environment I cannot alter the client's VM memory settings, so I must code my client classes to handle such cases.
    How can this be done in Java? This is the method that reads in a selected file and immediately writes it to the output stream to the web resource, as referenced by bufferedOutputStream:
    private void write(File file) throws IOException {
      byte[] buffer = new byte[bufferSize];
      BufferedInputStream fileInputStream = new BufferedInputStream(new FileInputStream(file));
      // read in the file
      if (file.isFile()) {
        System.out.print("----- " + file.getName() + " -----");
        while (fileInputStream.available() > 0) {
          if (fileInputStream.available() >= 0 &&
              fileInputStream.available() < bufferSize) {
            buffer = new byte[fileInputStream.available()];
          }
          fileInputStream.read(buffer, 0, buffer.length);
          bufferedOutputStream.write(buffer);
          bufferedOutputStream.flush();
        }
        // close the file's input stream
        try {
          fileInputStream.close();
        } catch (IOException ignored) {
          fileInputStream = null;
        }
      } else {
        // do nothing for now
      }
    }
    The problem is, the entire file, and any subsequent files being read in, are all being packed onto the output stream and don't begin actually moving until close() is called. Eventually the VM gives way.
    I require my client code to behave no differently than a typical web browser when uploading or downloading a file via HTTP. I know of several commercial applets that can do this; why can't I? Can someone please educate me or at least point me to a useful resource?
    Thank you,
    Henryiv

    Are you guys suggesting that the failures I'm experiencing in my client code are a direct result of the web resource's (servlet) caching of my request (files)? Because the exception that I am catching is on the client machine and is not generated by the web server.
    trumpetinc, your last statement intrigues me. It sounds as if you are suggesting having the client code and the servlet code open sockets and talk directly with one another. I don't think our customers would like that too much.
    Answering your first question:
    Your original post made it sound like the server is running out of memory. Is the out of memory error happening in your client code?
    If so, then the code you provided is a bit confusing - you don't tell us where you are getting the bufferedOutputStream - I'll just assume that it is a properly configured member variable.
    OK - so now, on to what is actually causing your problem:
    You are sending the stream in a very odd way. I highly suspect that your call to
    buffer = new byte[fileInputStream.available()];
    is resulting in a massive buffer (fileInputStream.available() probably just returns the size of the file).
    This is what is causing your out of memory problem.
    The proper way to send a stream is as follows:
         static public void sendStream(InputStream is, OutputStream os, int bufsize)
                     throws IOException {
              byte[] buf = new byte[bufsize];
              int n;
              while ((n = is.read(buf)) > 0) {
                   os.write(buf, 0, n);
              }
         }
         static public void sendStream(InputStream is, OutputStream os)
                     throws IOException {
              sendStream(is, os, 2048);
         }
    The simple implementation with the hard-coded 2048 buffer size is fine for almost any situation.
    Note that in your code, you are allocating a new buffer every time through your loop. The purpose of a buffer is to have a block of memory allocated that you then move data into and out of.
    Answering your second question:
    No - actually, I'm suggesting that you use an HttpURLConnection to connect to your servlet directly - no need for special sockets or ports, or even custom protocols.
    Just emulate what your browser does, but do it in the applet instead.
    There's nothing that says you can't send a large payload to an HTTP servlet without multipart MIME encoding it. That just happens to be what browsers do when uploading a file using a standard HTML form tag.
    I can't see that a customer would have anything to say on the matter at all - you are using standard ports and standard communication protocols... unless you are not in control of the server-side implementation and they've already dictated that you will MIME-encode the upload. If that is the case, and they really do support uploads of huge files like this, then their architect should be encouraged to think of a more efficient upload mechanism (like the one I describe) that does NOT MIME-encode the file contents.
    - K
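    To make the HttpURLConnection suggestion concrete: combined with chunked streaming mode, the connection never buffers the whole request body in memory, which is what is killing the client today. A minimal sketch (the target URL is a placeholder, and the multipart framing is deliberately omitted - this posts the raw bytes):

    import java.io.BufferedInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class StreamingUploader {
        public static void upload(File file, String target) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/octet-stream");
            // Stream the body in 8 KB chunks instead of buffering it all before sending.
            conn.setChunkedStreamingMode(8192);

            try (InputStream in = new BufferedInputStream(new FileInputStream(file));
                 OutputStream out = conn.getOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) > 0) {
                    out.write(buf, 0, n);
                }
            }
            // Reading the response completes the exchange.
            System.out.println("Server responded: " + conn.getResponseCode());
            conn.disconnect();
        }
    }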

  • WRT110 router - FTP problem - Timeouts on large files

    When I upload files via FTP, if the transfer of a file takes more than 3 minutes, my FTP program doesn't move on to the next file in the queue once the previous one has finished.
    The problem occurs only when I connect to the internet behind the WRT110 router.
    I have tried the FileZilla, FTPRush and FlashFXP FTP clients for the uploads.
    In FileZilla, sending keep-alives doesn't help;
    it only works in FlashFXP, but I have a trial version.
    On FileZilla's site I can read this:
    http://wiki.filezilla-project.org/Network_Configuration#Timeouts_on_large_files
    Timeouts on large files
    If you can transfer small files without any issues, but transfers of larger files end with a timeout, a broken router and/or firewall exists between the client and the server and is causing a problem.
    As mentioned above, FTP uses two TCP connections: a control connection to submit commands and receive replies, and a data connection for actual file transfers. It is the nature of FTP that during a transfer the control connection stays completely idle.
    The TCP specifications do not set a limit on the amount of time a connection can stay idle. Unless explicitly closed, a connection is assumed to remain alive indefinitely. However, many routers and firewalls automatically close idle connections after a certain period of time. Worse, they often don't notify the user, but just silently drop the connection.
    For FTP, this means that during a long transfer the control connection can get dropped because it is detected as idle, but neither client nor server are notified. So when all data has been transferred, the server assumes the control connection is alive and it sends the transfer confirmation reply. Likewise, the client thinks the control connection is alive and it waits for the reply from the server. But since the control connection got dropped without notification, the reply never arrives and eventually the connection will time out.
    In an attempt to solve this problem, the TCP specifications include a way to send keep-alive packets on otherwise idle TCP connections, to tell all involved parties that the connection is still alive and needed. However, the TCP specifications also make it very clear that these keep-alive packets should not be sent more often than once every two hours. Therefore, with added tolerance for network latency, connections can stay idle for up to 2 hours and 4 minutes.
    However, many routers and firewalls drop connections that have been idle for less than 2 hours and 4 minutes. This violates the TCP specifications (RFC 5382 makes this especially clear). In other words, all routers and firewalls that drop idle connections too early cannot be used for long FTP transfers. Unfortunately, manufacturers of consumer-grade routers and firewalls do not care about specifications; all they care about is getting your money (and only deliver barely working lowest-quality junk).
    To solve this problem, you need to uninstall affected firewalls and replace faulty routers with better-quality ones.
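    FileZilla's advice aside, some FTP client libraries can send NOOPs on the control connection during a long transfer, which keeps many routers from dropping it. A sketch using Apache Commons Net (host, credentials and paths are placeholders):

    import java.io.FileInputStream;
    import java.io.InputStream;
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;

    public class KeepAliveFtpUpload {
        public static void main(String[] args) throws Exception {
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com");
            ftp.login("user", "password");
            ftp.enterLocalPassiveMode();
            ftp.setFileType(FTP.BINARY_FILE_TYPE);
            // Send a NOOP on the control connection every 60 seconds while a transfer is running.
            ftp.setControlKeepAliveTimeout(60);

            try (InputStream in = new FileInputStream("/data/bigfile.zip")) {
                boolean ok = ftp.storeFile("bigfile.zip", in);
                System.out.println("Upload " + (ok ? "succeeded" : "failed: " + ftp.getReplyString()));
            }
            ftp.logout();
            ftp.disconnect();
        }
    }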

  • Problem uploading a big tsv file

    It might be a dumb question but here is my problem:
    I'm trying to upload a tsv file to Oracle XE (10g) using the HTML interface, but I keep getting this error:
    ORA-20001: Unable to create collection: ORA-06502: PL/SQL: numeric or value error: character string buffer too small
    Error creating collection using loaded data.
    Uploading only a couple of records (lines) works just fine.
    It seems that one value is too big for its field type, but as far as I understand it, the uploader is supposed to adjust the size according to the data and/or prompt for changes.
    The size of the file I'm trying to upload is ~22 MB.
    I'm pretty sure there is a straightforward solution that I'm not aware of.
    Anyone?
    Thanks,
    Oren

    I tried to post it there, but for some reason it wouldn't let me post - "Start a new thread" (as well as "Reply") just refreshes the page. I tried it with different machines and different browsers with no success. I even tried deleting all Oracle forum related cookies.
    I have no problem posting in other forums.
    Any suggestions or solutions?
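    One way to see which column is blowing the limit before retrying the load: scan the file and report the longest value per column, then size (or trim) the columns accordingly. A quick sketch (assumes tab-separated values with a header row; the file name is a placeholder):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class TsvColumnWidths {
        public static void main(String[] args) throws IOException {
            BufferedReader in = new BufferedReader(new FileReader("export.tsv"));
            String headerLine = in.readLine();
            if (headerLine == null) { in.close(); return; }     // empty file, nothing to scan
            String[] header = headerLine.split("\t", -1);
            int[] max = new int[header.length];

            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split("\t", -1);
                for (int i = 0; i < fields.length && i < max.length; i++) {
                    max[i] = Math.max(max[i], fields[i].length());
                }
            }
            in.close();

            for (int i = 0; i < header.length; i++) {
                System.out.println(header[i] + ": longest value is " + max[i] + " characters");
            }
        }
    }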

  • Problem when loading pdf files from Shared Content

    When I load pdf files from Shared Content, I got the following problem: "The selected document could not be retrieved, please try uploading the document again."
    Anyone knows this?
    Thank you very much in advance.

    I didn't migrate the program; I installed it from the original installer,
    i.e. I first installed InDesign from a backup, then uninstalled it and
    reinstalled it clean from Adobe.
    Which plug-in or utility converts a page from InDesign to PDF?

  • Privileges problem when transferring large files

    This just happened, and I don't know what changed. I cannot transfer large files from my MacBook Pro to my Mac Pro desktop, either over the network or via a USB hard drive. I get the infamous "You do not have sufficient privileges" message. The files start copying, and after about 1 GB or so, the message pops up. Small files are no problem; there's no restriction on those. It does not matter which disk I am trying to copy to.
    I thought it was just a problem with iMovie files and folders (these get big fast), but it seems to be an issue with any large file. I have repaired permissions on both disks, I have made sure that the unlock key is unlocked and that all the boxes are checked "Read & Write".
    And this problem just arose today. Nothing has changed, so far as I know.

    I assume I am always logged in as administrator because I have never logged in as anything else. It also does not matter what folder, or even which disk drive, I am trying to transfer from. If the file is above a certain size, it stops transferring and I get the error.
    What interests me is that this is not the usual case where I immediately get the "insufficient privileges" error when trying to transfer a forbidden folder or file - I've seen that before. Here, about a gig of data transfers over first, and then the error pops up.

  • InDesign CS5 causing problems with InCopy assignment file but not content files

    I'm having a strange InDesign/InCopy CS5 issue (Mac OSX 10.6.6; all system & Adobe updates are current).  In my InDesign doc I have one assignment file containing two content files. All of the InCopy pieces work fine in InDesign (i.e., I can check out/in, edit, etc.).  When I go to open the assignment file in InCopy, InCopy starts to open  it but then crashes. If I try to open each content file directly in  InCopy, no problem. I've tried deleting and recreating new assignments  in InDesign, but they all have the same result in InCopy.
    I think this actually is a problem with the InDesign file because if I export the InDesign file to IDML, then try to open the IDML file, InDesign crashes! I've trashed my InDesign preferences, made sure all cross-references are up to date, etc. Anybody have any ideas what else to look for in my InDesign file that might cause this problem?
    Thanks!
    Andrea

    Thanks John!
    1) Here are the first 10-ish lines from the crash report that is generated when I try to open the IDML file:
    Thread 0 Crashed:  Dispatch queue: com.apple.main-thread
    0   ???                               0xa0c266f0 _XHNDL_trapback_instruction + 0
    1   com.adobe.InDesign.Indexing       0x20778192 GetPlugIn + 341394
    2   com.adobe.InDesign.Indexing       0x20779009 GetPlugIn + 345097
    3   PublicLib.dylib                   0x0129228b CScriptProvider::AccessProperties(IScriptRequestData*, IScript*) + 571
    4   com.adobe.InDesign.Scripting      0x1f4befe8 GetPlugIn + 166440
    5   com.adobe.InDesign.Scripting      0x1f4c2e31 GetPlugIn + 182385
    6   com.adobe.InDesign.INXCore        0x20641444 GetPlugIn + 45220
    7   PublicLib.dylib                   0x0142c9e8 CScriptDOMElement::GetMultipleAttributes(K2Vector<IDType<ScriptID_tag>, K2Allocator<IDType<ScriptID_tag> > > const&, adobe::version_1::vector<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue> > >&) + 280
    8   PublicLib.dylib                   0x0142bd38 CScriptDOMElement::InsertProperties(adobe::version_1::vector<KeyValuePair<IDType<ScriptID _tag>, ScriptData>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, ScriptData> > >&, adobe::version_1::vector<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue> > > const&, short, short) + 1336
    9   PublicLib.dylib                   0x0142beab CScriptDOMElement::SetSimpleAttributes(adobe::version_1::vector<KeyValuePair<IDType<Scrip tID_tag>, DOMAttributeValue>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue> > > const&, short) + 91
    10  PublicLib.dylib                   0x0142c0d3 CScriptDOMElement::SetAttributes(adobe::version_1::vector<KeyValuePair<IDType<ScriptID_ta g>, DOMAttributeValue>, adobe::version_1::capture_allocator<KeyValuePair<IDType<ScriptID_tag>, DOMAttributeValue> > > const&) + 51
    11  PublicLib.dylib                   0x0142a2ea CScriptDOMElement::SetAttribute(IDType<ScriptID_tag>, DOMAttributeValue const&) + 202
    2) That works. Moved all pages (frame threading was preserved) to a new doc, exported to IDML, and opened the new IDML file just fine. Woo hoo! One of my editors won't be pleased that all of the tracked changes have disappeared, but at least the file functions now.
    Wish I knew what caused the problem in the first place so we could (hopefully) prevent this from happening again. Any ideas?
    Thanks again for your help!
    Cheers,
    Andrea

  • How to zip files in a content area

    Hi!!
    We have a lot of documents in a content area, and the number of documents goes up every day. The question is whether we can zip these documents, or something similar.

    If you're asking about compressing the files in the repository, or adding a bunch of items to a zip file, neither of these features is currently available.
    You would have to download the files, add them to a zip on the desktop, and then upload the new zip file.
    Regards,
    Jerry
    PortalPM
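    For the manual workaround Jerry describes, the zipping step itself is easy to script once the files are on the desktop. A small sketch using java.util.zip (the folder and archive names are placeholders):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    public class ZipFolder {
        public static void main(String[] args) throws IOException {
            Path folder = Paths.get("downloaded_docs");     // files downloaded from the content area
            Path zipFile = Paths.get("content_area.zip");   // the archive to upload back

            List<Path> regularFiles;
            try (Stream<Path> walk = Files.walk(folder)) {
                regularFiles = walk.filter(Files::isRegularFile).collect(Collectors.toList());
            }
            try (ZipOutputStream zip = new ZipOutputStream(Files.newOutputStream(zipFile))) {
                for (Path p : regularFiles) {
                    // store each file under its path relative to the folder being zipped
                    zip.putNextEntry(new ZipEntry(folder.relativize(p).toString().replace('\\', '/')));
                    Files.copy(p, zip);
                    zip.closeEntry();
                }
            }
            System.out.println("Created " + zipFile);
        }
    }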
