BLOB Size in portal

Hi,
I have a portal form that contains a BLOB field. Sometimes it returns an error while uploading a file:
Error: An unexpected error occurred: ORA-01401: inserted value too large for column (WWV-16016)
I know the maximum size of a BLOB is about 4 GB, and my file is only about 100 KB.
Any suggestions?
Thanks,
Shahram

Which version of Portal are you using?
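ORA-01401 is raised for character columns, not for the LOB itself, so the value that is too large is most likely a long filename or MIME type being written to a VARCHAR2 column alongside the BLOB. A quick way to check the column widths (a sketch; MY_DOC_TABLE is a hypothetical stand-in for the form's base table):
SELECT column_name, data_type, data_length
FROM user_tab_columns
WHERE table_name = 'MY_DOC_TABLE'
AND data_type LIKE 'VARCHAR%';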

Similar Messages

  • How to modify the blob size, or how to set the size?

    I want to know how to modify the BLOB size, or how to set the size.
    What is the default size of a BLOB?
    Thanks in advance.

    The BLOB datatype can hold binary data with a maximum size of 4 GB.
    When you store a 10 KB file, the database will only use about 10 KB to store it (depending on block size, etc.).
    If you want to limit the BLOB size, you can do something like this:
    SQL> create materialized view t_mv refresh fast on commit
    2 as select id, dbms_lob.getlength(x) len from t;
    Materialized view created.
    SQL> alter table t_mv add constraint t_mv_chk check (len < 100);
    Table altered.
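    Note that the materialized view above enforces a maximum length (and fast refresh on commit also requires a materialized view log on T); it does not change the 4 GB capacity of the column. An alternative sketch, assuming the same table T(ID, X BLOB), enforces the limit in a trigger instead:
    CREATE OR REPLACE TRIGGER t_blob_len_chk
    BEFORE INSERT OR UPDATE OF x ON t
    FOR EACH ROW
    BEGIN
      -- reject BLOBs longer than 100 bytes, matching the check above
      IF dbms_lob.getlength(:new.x) > 100 THEN
        raise_application_error(-20001, 'BLOB value exceeds 100 bytes');
      END IF;
    END;
    /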

  • Define Block Blob size in GB

    Hi all,
    I have used the following code to define the block blob size in MB and then download the file. It works fine.
    protected void btn_download_Click1(object sender, EventArgs e)
    {
        Button btndownloadrow = (Button)sender;
        GridViewRow row = (GridViewRow)btndownloadrow.NamingContainer;
        Label lblfilename = (Label)row.FindControl("lblGrid_filename");
        string downloadfile = lblfilename.Text;
        AccountFileTransfer = CloudStorageAccount.Parse("DefaultEndpointsProtocol=http;AccountName=" + ACCOUNTNAME + ";AccountKey=" + ACCOUNTKEY);
        if (AccountFileTransfer != null)
        {
            BlobClientFileTransfer = AccountFileTransfer.CreateCloudBlobClient();
            ContainerFileTransfer = BlobClientFileTransfer.GetContainerReference(CONTAINER);
            ContainerFileTransfer.CreateIfNotExist();
        }
        // Use the blob reference directly; building a new CloudBlockBlob from the
        // bare URI would drop the storage credentials.
        CloudBlockBlob blockBlob = ContainerFileTransfer.GetBlockBlobReference(downloadfile);
        long blobSize = 551L * 1024 * 1024; // block blob size of 551 MB
        int blockSize = 1024 * 1024;        // chunk size of 1 MB
        Response.Clear();
        Response.ContentType = "APPLICATION/OCTET-STREAM";
        string disHeader = "Attachment; Filename=\"" + blockBlob.Name + "\"";
        Response.AppendHeader("Content-Disposition", disHeader);
        // Open the stream once and copy it out chunk by chunk; reopening it on
        // every iteration would restart the read at offset 0 each time.
        using (var blobStream = blockBlob.OpenRead())
        {
            for (long offset = 0; offset < blobSize; offset += blockSize)
            {
                if (offset + blockSize > blobSize)
                    blockSize = (int)(blobSize - offset);
                byte[] buffer = new byte[blockSize];
                int read = blobStream.Read(buffer, 0, buffer.Length);
                if (read <= 0)
                    break;
                Response.OutputStream.Write(buffer, 0, read);
                Response.Flush();
            }
        }
        Response.End();
    }
    The problem I am facing is that when I try to define the block blob size in GB, I get an overflow error. I am trying to download a file of around 3 GB, using this:
      var blobSize = 3558 * 1024 * 1024; // trying to define a block blob size of around 3 GB; this is where I get the overflow error
    Could you please help me define the block blob size in GB so that I can download the file from Azure block blob storage?
    Thanks.
    Thanks.

    Hi,
    Thanks for sharing the solution for avoiding the overflow error; it will be very helpful for other community members with similar questions. If you run into difficulties in future programming, you are welcome to post in the forums again.
    Best Regards,
    Jambor
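    For the record, the overflow comes from 32-bit arithmetic: 3558 * 1024 * 1024 is evaluated as an int expression and exceeds Int32.MaxValue (about 2.1 GB) before it is ever assigned. A minimal sketch of the likely fix (an assumption about the solution the poster found, not a quote of it):
    long blobSize = 3558L * 1024 * 1024; // ~3.5 GB; the L suffix forces 64-bit (long) arithmetic
    int blockSize = 1024 * 1024;         // 1 MB chunks still fit comfortably in an int
    The loop offset should likewise be a long, as it already is in the code above.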

  • Blob sizes in MySQL

    MySQL supports different blob sizes. I'm told that Kodo does not support
    blobs in MySQL, but that I can override this by extending the MySQLDictionary
    class. This is what I was told:
    package com.xyz;
    import java.sql.*;
    import kodo.jdbc.schema.*;
    import kodo.jdbc.sql.*;
    public class CustomMySQLDictionary extends MySQLDictionary {
        protected String appendSize(Column col, String typeName) {
            if (col.getType() == Types.BLOB && col.getSize() > 0)
                return "MEDIUMBLOB"; // or whichever sized blob type you need
            return super.appendSize(col, typeName);
        }
    }
    Plug your dictionary into Kodo with:
    kodo.jdbc.DBDictionary: com.xyz.CustomMySQLDictionary
    In your metadata, set the size of your field with:
    <field name="blobField">
        <extension vendor-name="kodo" key="jdbc-size" value="xxx"/>
    </field>
    I have done this with a couple of minor changes. It almost does what I
    need it to do.
    Basically, I want to create an ANT script which will enhance the necessary
    files, create my database, and then dump the schema to a file (so that I
    can have a SQL script to run later). I have everything working with the
    exception of creating the database. The database gets created;
    however, the column where I specified a sized blob still gets created as a
    plain blob, and I want a different blob size.
    The above code helped me with dumping the schema to a file, and it dumps the
    data as I would expect.
    So how do I get the database created properly? Also, is there a way to
    automatically generate the schema without actually creating the database
    first?

    Mike Krell wrote:
    [the original post, quoted verbatim, is trimmed here]
    After doing some further investigation, the call "col.getType()" returns
    a Types.VARBINARY and not a Types.BLOB. So I changed my code to test for
    this instead and also test for the column size to indicate the appropriate
    BLOB size. Is this correct?
    It appears that a LONGBLOB is coming back as a "LONG VARBINARY"; even
    though I am testing for this, I cannot get a LONGBLOB. Why?
    Also, prior posts indicate that BLOBs are not supported in Kodo and MySQL.
    I'm confused as to what this means because in the src files that you
    delivered with your product, I see references to BLOB, MEDIUMBLOB, etc.
    The file I'm referring to is kodo.jdbc.sql.MySQLDictionary. So tell me
    again why blobs aren't supported?
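    Based on the observations above, the dictionary check probably needs to look for the VARBINARY types Kodo actually reports rather than Types.BLOB. A sketch (it assumes the same Kodo Column/DBDictionary API used earlier in this thread; the size cut-offs are MySQL's documented maxima for BLOB and MEDIUMBLOB):
    protected String appendSize(Column col, String typeName) {
        int type = col.getType();
        if ((type == Types.VARBINARY || type == Types.LONGVARBINARY)
                && col.getSize() > 0) {
            if (col.getSize() > 16777215)  // beyond MEDIUMBLOB's 16 MB limit
                return "LONGBLOB";
            if (col.getSize() > 65535)     // beyond BLOB's 64 KB limit
                return "MEDIUMBLOB";
            return "BLOB";
        }
        return super.appendSize(col, typeName);
    }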

  • TopLink with MySQL: problem with blob size

    I'm using TopLink with a MySQL database. I want to store some data in a blob.
    In MySQL there are different sizes of blobs (in my case I need a MEDIUMBLOB),
    but if I create the schema for the database with JPA/TopLink, it always creates a column of type BLOB.
    I can explicitly tell the database to use a MEDIUMBLOB with this:
    @Column(columnDefinition="MEDIUMBLOB")
    But by doing this I of course tie my program to MySQL, as this data type is not known to other databases.
    Does anybody know a more elegant solution for setting the blob size?
    For example, with Hibernate it can be done this way:
    @Column(length=666666)

    Looks like you are using JPA, and in JPA you would set the columnDefinition to the type that you want, e.g.
    @Lob
    @Column(name="BLOBCOL", columnDefinition="MEDIUMBLOB")
    byte[] myByteData;
    As you mentioned, this does introduce a dependency on the database. However, you can always either put the Column metadata in XML, or override it with something else later in XML.
    The length attribute was intended and specified for use with Strings, but I guess there is no reason why it couldn't be used for BLOBs as well (please enter an enhancement request, or submit the code to add the feature to EclipseLink). Be aware, though, that doing so at this stage introduces a dependency on the provider to support a feature not defined by the spec.
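    For reference, the XML override mentioned above might look like this in orm.xml (a sketch; the field and column names come from the example, the entity class is illustrative):
    <entity class="com.example.Document">
        <attributes>
            <basic name="myByteData">
                <column name="BLOBCOL" column-definition="MEDIUMBLOB"/>
                <lob/>
            </basic>
        </attributes>
    </entity>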

  • BLOB Size limitations

    Is there a means to get past the 4 GB BLOB size limitation? Are there compression routines that interMedia offers?
    Any suggestions would be much appreciated.
    Thanks.

    What version of Oracle are you working with? And how are you planning to store the data?
    If you are using BFILE LOBs, then the media is stored outside the database and the size is limited to 4GB in 10.2. If you are using BLOB storage (inside the database) then the limit is up to 128 TB in size.
    I assume that this is video data? interMedia will not be able to process any images that large, though I expect it will be able to extract properties from the header without any problem.
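    For context, the in-database limit the reply mentions scales with the block size: the maximum LOB size is roughly (4 GB - 1) * db_block_size, which is where the 128 TB figure comes from with 32 KB blocks. A quick check of your block size (a sketch):
    SELECT value FROM v$parameter WHERE name = 'db_block_size';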

  • Any way to change blob size -- do I need to?

    I'm tinkering with remote blob storage in my dev environment, trying a lot of different options to see how it really works. One thing I noticed is that when I dragged a file to a document library and it got stored under my filestream filename in the file system (instead of in the mdf file of the provider database), the files it created seemed to be about 61 KB each. So if I dragged in a 5 MB PDF file, there would be about 85 of these small blob files. That seemed a little small and inefficient to me, so I thought I could do something to increase the size of the blob files and reduce their number. I tried to increase the filestreammaxsizeinlineblob parameter of the RBS.MSI installation from 61140 to 102400 (and I tried doing this in more ways than I want to count). When I did that, nothing seemed to end up in the file system. Then I started poking around in the database, saw the blob_size column of the table mssqlrbs_filestream_data_1.rbs_filestream_data_1, and nothing there was larger than 65641. So it seems no data is stored in the file system when I mandate a filestreammaxsizeinlineblob of 102400, which makes sense looking at the aforementioned table, because everything was formed at a much lower size than the threshold I instituted. My two questions are: 1) is there any way to increase the blob_size so it's larger and there are fewer files on the file system, or is that hardcoded into SharePoint? And 2) is the 61-64 KB size just fine, and is what appears to be lots of files to me really nothing for the server to handle? Frankly, the performance I got retrieving documents from BLOBs in my test environment was quite good, but I was wondering what would happen when it gets actual use in production and tons of these files are floating around on the file system.
    *This is a different issue than setting the minimumblobstoragesize in PowerShell -- I know how to do that.*

    1) Yes you can.
    2) It's fine, don't bother changing it.
    The thing you're seeing at work is called shredded storage. It effectively allows SharePoint and SQL to store only the updates to large files. Part of this involves shredding files into smaller chunks so it can identify which bits have changed. Because you're externalising BLOBs, you see these shredded files on the disk.
    RBS is nowhere near as useful in 2013 as it used to be, and for the majority of cases I'd advise against using it. It might actually be causing worse performance: you take a small but measurable hit whenever you use an externalised BLOB, which is fine for large, slow files but very counterproductive for small bits of data that are best kept in the database.
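    On question 1: the chunk size is governed by SharePoint's shredded storage, not by RBS. In SharePoint 2013 it can reportedly be changed through the content service's FileWriteChunkSize property; a sketch follows, and since this is a farm-wide setting, treat it as an assumption to verify before touching production:
    $service = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
    $service.FileWriteChunkSize = 1MB   # default is about 64 KB, matching the sizes observed above
    $service.Update()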
    Thanks for the quick response, Alex! If you have time, could you please briefly explain why RBS isn't as useful in 2013 (my setup is SP2013 SP1 with SQL 2012 SP1), and what the threshold would be where it *would* be useful (i.e. number of files in the document storage DB, total document storage size, etc.)?

  • How to change the size of the portal desktop title after user login

    hi, guys,
    I've got a question: I want to change the title of the desktop, but I found that this title is limited by the portal; the maximum length is 20.
    In the pf_desktop_instance table the instance_title size is 20, so I changed it to 200, but it still fails. Has anyone handled this issue before?
    I am using the DVT tool of Oracle WebLogic Portal.
    Thank you so much.
    regards,
    aris

    I found this function in editTitle.js:
         titleSave: function(newValue, oldValue) {
              var url = this.getUpdateUrl();
              this.oldValue = oldValue;
              var content = samples.dvt.Manager.getContentObject();
              content.title = newValue;
              alert(content.title);
              dojo.xhrPost({
                   url:      url,
                   mimetype: "text/html",
                   content:  content,
                   load:     dojo.hitch(this, this.titleSaveLoad),
                   error:    dojo.hitch(this, this.titleSaveError)
              });
         }
    So, does that mean the content is truncated by the DVT Java code?
    If yes, where can I get the DVT source code? Thank you.
    If no, then how can I fix this?
    Thank you.

  • SOLUTION - How to display a BLOB content in Portal Report

    Courtesy of Kelly Wong:
    1. A record of the file information is inserted into the
    PORTAL30.WWDOC_DOCUMENT table every time you insert a record into the BLOB
    field of a portal form that you created.
    [The fact that an uploaded file appears in the WWDOC_DOCUMENT table is actually a side-effect of Forms file upload; this may change in a future version. However, this example gives a pretty good solution to the problem - Dmitry]
    2. If you describe the PORTAL30.WWDOC_DOCUMENT table, you will find
    that its columns include: NAME, FILENAME, MIME_TYPE, BLOB_CONTENT,
    CREATOR, etc.
    3. I created a PL/SQL procedure that takes a NAME parameter.
    The code of the procedure is as follows:
    CREATE OR REPLACE PROCEDURE get_url
    (v_filename IN VARCHAR2)
    IS
    url VARCHAR2(100);
    BEGIN
    url := '/pls/portal30/docs/' || v_filename;
    portal30.wwv_redirect.url(p_url => url);
    END;
    4. I then created a portal report, selecting the NAME, FILENAME, MIME_TYPE,
    and CREATOR fields from PORTAL30.WWDOC_DOCUMENT. (Remember, no BLOB_CONTENT
    field is selected!) My select statement is:
    SELECT '<A HREF="scott.get_url?v_filename=' || name || '">' || filename || '</A>' filename,
    name, mime_type, creator
    FROM portal30.wwdoc_document
    WHERE creator = 'KELLY'
    You can see that I am passing in "NAME" instead of "FILENAME" to the
    get_url procedure, because it needs the "NAME" info to display the
    file.
    Actually, the content of the NAME column is something like "563.TXT",
    and the content of the FILENAME column is "F15675/IPORTAL_LOCAL.TXT".
    The hyperlink can be on either the NAME or the FILENAME field, as long as
    you pass the content of "NAME" into the procedure.
    And it should be fairly easy to substring the FILENAME to show only
    "IPORTAL_LOCAL.TXT" if the client doesn't like to see the number
    portion (see the sketch after this post).
    So when I click on the link, I am able to see my file in a new
    browser window. The only drawback in this scenario is that if there is a link
    from your document to a portal component (form/report), it may not link
    to the form/report. If links in the document point to other
    documents (html, txt, etc.), there is no problem; they just open the
    document in another browser window.
    Please feel free to give me any comment on this and post it to the forum
    if needed. I just don't want to search for the original questions in
    the forum and attach the reply to it.
    Regards;
    Kelly.
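    As an illustration of the substring suggestion above (a sketch; it assumes FILENAME values of the form "F15675/IPORTAL_LOCAL.TXT", as described):
    SELECT SUBSTR(filename, INSTR(filename, '/') + 1) AS display_name
    FROM portal30.wwdoc_document;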

    Does this method also work for Portal 3.0.9?
    I can't get it to work.
    Is there a way to put a link to download the content of a BLOB field inside a report in version 3.0.9, which comes with iAS 1.0.2.2?
    Thanks in advance,
    Mauro

  • How can we limit uploading file size in portal.

    We have a feature where customers can upload their files. We use portal forms for uploading: the form that has the file type is posted to a stored procedure, which in turn calls wwv_things.saveitem.
    We need to restrict the file size and file type of the uploaded files. I can restrict the file type with JavaScript, but I wanted to find out whether Oracle Portal has any way to restrict the upload file size.
    Thanks and regards,
    Sunil.

    Currently, this is not available. If you think this would be good to have by default, please log a development request.
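    As a server-side workaround, the stored procedure could reject oversized files before calling wwv_things.saveitem. A sketch (p_blob and the 1 MB limit are illustrative; adapt them to the parameters the form actually posts):
    IF dbms_lob.getlength(p_blob) > 1048576 THEN
        raise_application_error(-20001, 'File exceeds the 1 MB upload limit');
    END IF;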

  • BLOB size more than doc size.

    Hi All,
    We are migrating PDF documents to an Oracle database. The table into which we are inserting the BLOBs is in the tablespace USER01.
    When we compare the free bytes available before and after the migration, it shows that about 5.23 GB of tablespace has been occupied by
    inserting 1,930 BLOBs totalling 447 MB in size.
    Do BLOB datatypes occupy this much space? Am I doing something wrong in calculating the size occupied?
    Is there a more efficient way to store the BLOBs in the Oracle database?
    Any help will be appreciated.
    Thanks,
    Rana

    Hi Daniel,
    Thanks for the response; here is the table structure:
    CREATE TABLE RCONTENT
    (
    DOC_ID INTEGER NOT NULL,
    PROD_ID VARCHAR2(60 BYTE) NOT NULL,
    INFO VARCHAR2(20 BYTE),
    DOC_TYPE VARCHAR2(20 BYTE),
    DOC_CONTENT BLOB
    );
    This happens in both 9.2.0.4 and 10gR1.
    Thanks,
    Rana.
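    The discrepancy is usually explained by how LOBs are stored: each BLOB column gets its own LOB segment, space is allocated in CHUNK-sized pieces, and extra space is retained for read consistency (PCTVERSION/RETENTION). To see where the space actually went, measure the LOB segment directly (a sketch; run it as the owning user):
    SELECT segment_name, segment_type, bytes/1024/1024 AS mb
    FROM user_segments
    WHERE segment_name IN (SELECT segment_name
                           FROM user_lobs
                           WHERE table_name = 'RCONTENT');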

  • Input calendar not displaying appropriate size in Portal

    We are on SAP GUI ECC6 and Portal 7.0. We haven't implemented CRM.
    I have designed an ABAP Web Dynpro application. The application has an input field of type DATE. When I view the calendar using transaction SE80 via the GUI, the size is 100% correct.
    I have also created an iView for this application, and when I view the calendar through the Portal the format is wrong and only the current month is displayed.
    Aragorn also posted the same problem during October last year, but the question has not been answered.
    Any help will be appreciated.
    Regards,
    Margariet

    I changed the Supply Portal Stylesheet from Yes to No and it fixed the problem.
    Margariet

  • Reduce BLOB size to 60k

    I have a .jpg saved in a table as a BLOB.
    I want to reduce the size of the BLOB so that when the .jpg is extracted, it has a file size of 60 KB or less.
    Physical (x,y) size is irrelevant.
    Ideally,
    1. Select the BLOB
    2. Put it in a temporary table
    3. Select it from the temporary table
    4. Reduce it in size.
    5. Resave it to the temporary table.
    The original BLOB will remain unchanged.
    It should stay as a BLOB throughout the process (ideally).
    Thanks in advance.
    Any help appreciated.
    John

    Is there a reason that you aren't using interMedia for this? If you want to operate on an image in the database, OrdImage is the data type to use. Otherwise, you'll have to code your own JPG compression routines which seems less than ideal.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC
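    If interMedia is available, its relational interface can write a recompressed copy of the image into a second BLOB, which fits the temporary-table flow described above. A sketch only: the processCopy call and its argument order are an assumption to verify against the interMedia reference for your release, and the table, column, and quality setting are all illustrative:
    DECLARE
        src  BLOB;
        dest BLOB;
    BEGIN
        SELECT photo INTO src FROM photos WHERE id = 1;  -- hypothetical source table
        DBMS_LOB.CREATETEMPORARY(dest, TRUE);
        -- Recompress the JPEG; tune compressionQuality down until the copy is under 60 KB.
        ORDSYS.ORDImage.processCopy(src, 'fileFormat=JFIF compressionQuality=30', dest);
        -- dest now holds the smaller image; insert it into the temporary table here.
        DBMS_LOB.FREETEMPORARY(dest);
    END;
    /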

  • Max file size for Portal Upload

    Hi,
    Our portal is EP6 SP2 Patch 5. Our users are trying to upload a 200 MB document and it is failing. Does any of you know what the size limitation is?
    Any help would be a great help.
    Ravi

    I don't know what the limit is, but for files that size it is probably better to try to use WebDAV. Go into the details of the folder you want to upload to, then Properties; one of the properties is Access Links, and one of those is a WebDAV link. Copy that. Then in Windows Explorer, choose Map Network Drive -> Web Folder and enter that URL. You will have to log on using your portal user ID / password, but then the portal folder will appear in My Network Places, so you can just copy files in that way. I've copied loads of files like that.
    Paul

  • Displaying Blob PDF in Portal

    Hi all,
    I searched online but I haven't been able to find an adequate answer. I have a WSRP portlet, and I'd like to show a PDF document that is stored as a BLOB, based on which documents a portal user has access to. I tried setting the content type of the page to application/pdf and using the response's OutputStream to write the bytes out, but it gives a generic error about the markup. Do you think it would be possible to get this working within portal?
    Another option, if the first one doesn't work, is to call a servlet. In JDeveloper I added a servlet and tried calling it, but I haven't been successful. Do you have any references to documents that show how to call a servlet in portal via a JSP?
    Thanks!

    Hi,
    A WSRP portlet can only generate output in HTML.
    This means you cannot generate PDF (or any other non-HTML) output.
    As you point out yourself, the workaround is to develop a servlet which generates your PDF output.
    In your portlet you can add a link to the servlet.
    If your servlet runs on a different tier than your portal tier, you need to add a rewrite rule or ProxyPass in order to access it.
    Best regards,
    Marcel.
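    For the servlet route Marcel suggests, a minimal sketch of streaming a BLOB-stored PDF follows; every name in it (the JNDI data source, table, and columns) is illustrative rather than taken from the thread:
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.sql.DataSource;

    public class PdfServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            try {
                // Hypothetical JNDI data source; adjust to your container's configuration.
                DataSource ds = (DataSource)
                        new InitialContext().lookup("java:comp/env/jdbc/docsDS");
                Connection con = ds.getConnection();
                try {
                    PreparedStatement ps = con.prepareStatement(
                            "SELECT doc_content FROM documents WHERE doc_id = ?");
                    ps.setString(1, req.getParameter("id"));
                    ResultSet rs = ps.executeQuery();
                    if (!rs.next()) {
                        resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                        return;
                    }
                    resp.setContentType("application/pdf");
                    // Stream the BLOB to the browser in 8 KB chunks.
                    InputStream in = rs.getBinaryStream(1);
                    OutputStream out = resp.getOutputStream();
                    byte[] buf = new byte[8192];
                    for (int n; (n = in.read(buf)) > 0; ) {
                        out.write(buf, 0, n);
                    }
                } finally {
                    con.close();
                }
            } catch (NamingException e) {
                throw new ServletException(e);
            }
        }
    }
    The portlet then only needs to render an <A HREF> pointing at the servlet's URL (proxied as described above if it runs on a different tier).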
