Compression to .DV? Is it possible?

Hey ya!
I have read the compression threads and wasn't able to find this in the manual, so here goes. What I normally do in iMovie is share my project as a .DV file and then go into Squeeze and compress it.
Now in Final Cut Pro, when I click Export > Using Compressor, choose the advanced format conversion, and check DV NTSC, the file wants to save as a MOV, and I noticed it switches from 16:9 to 4:3 automatically. I shot this in 16:9. Also, it compresses it to a big file, around 500 MB; with Squeeze I can get it down to 8 MB or so.
What should I do?
Thanks for listening (or reading),
Joey

Uhmmm... well, I finally figured out how to save it to .DV, but when I compressed that, it killed the quality.
So I saved it as DivX and it's only a 30 MB file. I want to post that file online, so I'd like to shrink it down without killing the quality. The size is 480x360 = 30 MB.
Here you go: http://www.pumatalk.com/spots/PumaTalk_Tennis.divx
Does anyone have any suggestions on how to shrink it to a smaller size without really killing the quality?
Thanks folks,
Joey
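A quick aside on the numbers here, since they confuse a lot of people: DV is a fixed-rate, intraframe editing format, not a delivery format, so a .DV export is big no matter what the content is. As a rough check, assuming standard DV25 at about 3.6 MB per second (roughly 13 GB per hour):
    3.6 MB/s x 140 s ≈ 500 MB
so a 500 MB .DV file is a clip of a bit over two minutes, and squeezing it to 8 MB means cutting the data rate by a factor of about 60, which is why heavy compression hurts the quality so much.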

Similar Messages

  • Compress cube request is not possible

    Hi Experts,
    When I compress a cube request, the system prompts that there is no valid request to compress, but there are actually more than 30 requests in the cube.
    Has anyone had this issue, and how did you solve it?
    By the way, I use BI 7.0, transfer data with DTPs, and several different DataSources provide data to this cube.
    Thanks.

    Hi,
    Thanks.
    There is no red request and everything is OK; I can use this cube's data to feed other cubes and BEx reports without any issue. But when I try to improve system performance by compressing requests, the system prompts that there is no valid request.

  • How to delete a compressed request ID in the InfoCube?

    Hi BW Gurus,
    One InfoPackage uploads its request into three data targets; it failed with a duplicate-records error. I was able to delete the bad request in two of the data targets, but I'm stuck on the third because the request there is green and has already been compressed.
    The reason I am unable to delete the request in that data target is that it has already been compressed and rolled up.
    I tried selective deletion based on the same request ID, and that succeeded, but the request is still present in the data target. I checked in LISTCUBE and found no data for that request ID.
    Now one more question: I brought the request down to status 'not OK' in the monitor and did the selective deletion. Could this cause data to go missing in that particular data target? Please advise.
    Can anyone help with this? Thanks in advance.
    Venkat.

    Hi Venkat,
    There is one way to delete a compressed request, but it is possible only if you still have the request in the PSA.
    If the request is in the PSA, do a reverse posting. This nullifies the particular bad request by changing the plus sign to minus and the minus sign to plus for all the records that went into the cube for that request.
    Regards,
    N Ganesh

  • Compressing a half-hour show for the web is crashing FCP

    Hi,
    I've searched the forums but haven't found the answer to my question so I'm posting it here.
    I have a client who wants the entire film I made for them (it's about 25 minutes) put on their web site at 25MB in size. I did several tests and, based on :30 of video, came up with parameters that would make the whole thing 55MB. But when I actually went to export, it crashed both times I tried.
    Any suggestions on another way to compress?  Is this even possible to do?
    Here is what I've done after my sequence was fully rendered:
    Compress using Quicktime Conversion, Options
    Video - H.264
    Frame Rate 15fps
    Key frame: Automatic
    Data Rate: Restrict to 226 kbps
    Compressor Quality: High
    Size: 320x240 Letterbox
    Audio: AAC 32kHz
    I've had some issues in the past with Compressor doing strange things to my video, so I was avoiding Compressor, but would be open to suggestions using it at this point. (I'm running FCP 7.0)
    Thanks for your help!

    Hi
    Have just tried this: send the timeline from FCP to Compressor and apply the following from the selections available:
         Web > Download > QuickTime 6 Compatible > MPEG 4 100Kbps
    25 minutes comes out at a file size of approx 18.8 MB.
    The picture looks fairly good; quick movement scans a little, but steady pictures look and sound fine.
    Hope this helps
    PJ
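    For anyone wanting to predict these sizes up front: file size is just the combined audio + video data rate multiplied by the duration. A rough check of PJ's result (assuming the 100Kbps preset is the total rate, which the preset name does not state):
        100 kbit/s x 1500 s = 150,000 kbit ≈ 18.3 MB
    which matches the approx 18.8 MB observed. Working backwards, the original 25MB target over 25 minutes allows a combined rate of roughly 25 x 8 x 1024 / 1500 ≈ 136 kbit/s.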

  • How to compress .MOV files?

    So, I have a monster 4GB folder full of .MOV files shot on an iPhone 4 that I need to compress small enough to fit on two or three 700MB discs, or a 2GB flash drive. However, I don't really know how to do it. I used Archive Utility to zip it, and the folder is still the same size as the original, so I assume that doesn't truly "compress" things. Is it possible to compress in iMovie (and if so, how), or will I need freeware to do it? Or is it even possible at all?

    Is it possible to compress in iMovie (and if so, how), or will I need freeware to do it? Or is it even possible at all?
    The basic answer here is that any converter app that exports the audio + video data at a lower combined total data rate will produce a smaller file. However, there is a limit to how far you can decrease the data rates and still produce usable output. In general, decreasing the data rate will decrease quality unless you also decrease the dimensions of the targeted encoding matrix and/or otherwise modify the distribution of data per unit of time. In most cases the basic strategy is to use a high-efficiency codec export option that allows the user to manually modify the export settings to produce the most compact file with a level of quality the user "can live with." How low you can go will usually depend on the preferences of the specific user. Since your iPhone files are already highly compressed, my personal recommendation in your case would be to move up to higher-density optical media (e.g., DVD) or break down and spend the $10-$20 it would take to upgrade to a 16 GB or 32 GB drive.
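    To put rough numbers on that (assuming the clips are typical iPhone 4 720p recordings at around 10 Mbit/s, which is an assumption, not something stated above):
        4 GB ≈ 32,000 Mbit; 32,000 Mbit / 10 Mbit/s ≈ 3,200 s, i.e. roughly 53 minutes of footage
    Re-encoding that footage at about 5 Mbit/s would roughly halve the folder to ~2 GB, at a visible but often acceptable quality cost.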

  • Does Pages always compress (JPEG) TIFF files in docs exported to PDF?

    Title say it all really.
    I would like to export a Pages 09 document, which contains high-res tiff images, to a PDF.  I do not want the images to have any compression applied.  Is this possible?

    Yes, it is. PDF is a kind of image format, but not in the way that JPEG, GIF or TIFF are. Here is what Wikipedia says about it.
    Portable Document Format (PDF) is a file format used to represent documents in a manner independent of application software, hardware, and operating systems.[2] Each PDF file encapsulates a complete description of a fixed-layout flat document, including the text, fonts, graphics, and other information needed to display it.
    For what purpose do you want your Pages document to be exported to PDF?

  • Move and Compress from 10.2.0.4 to 11.2.0.3

    All:
    I have a question regarding best practice for moving data between databases.
    I have about 7 different schemas that currently reside in a 10.2.0.4 database. I need to take a snapshot of these schemas as part of a migration project and move them into a new 11.2.0.3 database, where they will be kept as historical data and no longer updated.
    I would like to move this data into the new 11.2.0.3 DB and have it compressed as part of the process. A simple Data Pump export from 10.2 and import into 11.2, into a tablespace whose default compression is set to OLTP, does not seem to result in the newly created objects being compressed; I am assuming the existing table definitions are overriding the tablespace defaults.
    Any guidance with regard to going from 10.2 normal to 11.2 compressed in the fewest steps possible would be greatly appreciated.
    Thanks,
    Bill

    Please clarify whether you are referring to standard compression (http://docs.oracle.com/cd/E11882_01/server.112/e25494/tables002.htm#CJAGFBFG) or advanced compression (http://docs.oracle.com/cd/E11882_01/license.112/e10594/options.htm#DBLIC142) - the latter requires separate licensing.
    HTH
    Srini
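    Assuming Srini's licensing question is settled, here is a minimal sketch of the two usual routes; the directory, dump file, schema, and object names are invented for illustration:
        $ impdp system DIRECTORY=dp_dir DUMPFILE=hist.dmp SCHEMAS=schema1 \
            TRANSFORM=segment_attributes:n
        -- segment_attributes:n strips each table's own storage/segment clauses,
        -- so new tables pick up the tablespace's default compression on create.

        -- Alternative: import as-is, then rebuild each table compressed:
        SQL> ALTER TABLE schema1.big_table MOVE COMPRESS FOR OLTP;
        SQL> ALTER INDEX schema1.big_table_pk REBUILD;
    Note that MOVE leaves dependent indexes UNUSABLE, hence the rebuild, and that COMPRESS FOR OLTP requires the Advanced Compression option; plain basic COMPRESS only compresses direct-path loaded data.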

  • Setting the JAR compression Level from a .DEPLOY file

    I am trying to find a straightforward method to use the best compression level possible for the jar files created from JDeveloper (10.1.3). I do have the "Compressed" checkbox checked.
    Currently it is creating jar files that are not compressed at the best level possible (the files it creates are about 8 megs). If I open the jar file in an external compression tool and then re-compress it as a jar, I am able to get a file almost half the size (4-5 megs). And yes, this jar file still works flawlessly in all of our environments. (It is used as a Java Web Start file.)
    Is there any way to adjust the compression level that JDeveloper uses to compress the jar?
    Thanks,
    Mike Hess

    Hi Mike,
    I had a look at the code, and unfortunately there seems to be no way to set the compression level. It's using a ZipOutputStream with the default compression level, and deployment does not expose any user-accessible way to override this (I was hoping to find some kind of "hidden" setting, but no joy).
    I have filed bug 4567053 to track this (it should be visible on Metalink within around 24 hours). In the meantime, perhaps using Ant or External Tools is an option? You could probably use either to post-process the jar the way you're doing it manually now, if the initial assembly of the jar performed by the deployment feature is tricky to capture in a build file.
    Thanks,
    Brian
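    To flesh out the post-processing idea Brian mentions: a minimal sketch (file names are hypothetical) that rewrites an existing jar with the deflater at maximum compression, which is essentially what the external tool is doing. Recent Ant versions also expose a compression level on the <zip>/<jar> tasks, which would do the same thing declaratively.

        import java.io.*;
        import java.util.zip.*;

        public class RecompressJar {
            public static void main(String[] args) throws IOException {
                // Rewrites args[0] into args[1], re-deflating every entry
                // at maximum compression (the default level is 6).
                ZipInputStream in = new ZipInputStream(
                        new BufferedInputStream(new FileInputStream(args[0])));
                ZipOutputStream out = new ZipOutputStream(
                        new BufferedOutputStream(new FileOutputStream(args[1])));
                out.setLevel(Deflater.BEST_COMPRESSION);
                byte[] buf = new byte[8192];
                ZipEntry entry;
                while ((entry = in.getNextEntry()) != null) {
                    // Copy each entry by name only, so sizes and CRCs are
                    // recomputed for the new compression level.
                    out.putNextEntry(new ZipEntry(entry.getName()));
                    int n;
                    while ((n = in.read(buf)) > 0) {
                        out.write(buf, 0, n);
                    }
                    out.closeEntry();
                }
                in.close();
                out.close();
            }
        }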

  • What is MMR compression? Is it supported for TIFF through JAI?

    Please give a brief idea of MMR compression. I want to compress a TIFF image using MMR compression through JAI; is it possible?
    If not, please suggest alternate methods to compress a TIFF using MMR.

    Hi,
    I have the same doubt. If anybody knows, please tell me: how do you compress a TIFF file with MMR?
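    Since this never got a real answer: MMR (Modified Modified READ) is the two-dimensional coding scheme of CCITT Group 4 / ITU-T T.6 fax compression, and TIFF carries it as its Group 4 compression type. JAI's TIFF encoder does expose it. A minimal sketch, assuming the com.sun.media.jai codec classes are on the classpath and the input has already been reduced to a 1-bit bilevel image (Group 4 only codes bilevel data):

        import java.awt.image.RenderedImage;
        import java.io.BufferedOutputStream;
        import java.io.FileOutputStream;
        import java.io.OutputStream;
        import javax.media.jai.JAI;
        import com.sun.media.jai.codec.ImageCodec;
        import com.sun.media.jai.codec.ImageEncoder;
        import com.sun.media.jai.codec.TIFFEncodeParam;

        public class MmrTiff {
            public static void main(String[] args) throws Exception {
                // args[0] = source image (must be bilevel), args[1] = output .tif
                RenderedImage src = JAI.create("fileload", args[0]);
                TIFFEncodeParam param = new TIFFEncodeParam();
                // COMPRESSION_GROUP4 is CCITT T.6, i.e. MMR coding.
                param.setCompression(TIFFEncodeParam.COMPRESSION_GROUP4);
                OutputStream out = new BufferedOutputStream(new FileOutputStream(args[1]));
                ImageEncoder enc = ImageCodec.createImageEncoder("tiff", out, param);
                enc.encode(src);
                out.close();
            }
        }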

  • iMovie 11 compression options lost?

    Hi everybody !
    Yesterday I was trying to export my iMovie project as I usually do, using QuickTime.
    When I opened the video options to choose the compressor to use, I noticed some options were missing (e.g. "None", to export without using compression).
    How is that possible?
    Before the Lion update I'm sure this option was available (I always export my projects without compression).
    Please... help me...
    Thanks everybody,
    Stefano.

    Hi, thanks for your answer.
    Unfortunately I cannot see them.
    The only options available now are:
    1) Animation
    2) Apple Intermediate Codec
    3) DV - PAL
    4) DV/DVCPRO - NTSC
    5) DVCPRO - PAL
    6) DVCPRO50 - NTSC
    7) DVCPRO50 - PAL
    8) H.264
    9) PHOTO JPEG
    10) VIDEO MPEG-4
    After "H.264" I had "None", to export clips without compression.
    I'm pretty sure this happened after Lion OS update.
    Also "Apple Pixlet Video", "JPEG 2000" and "PNG" codecs were available before Lion OS.
    Could it be an issue about Quicktime ?

  • Seven hours to compress HDV to H.264 - to be expected?

    I'm looking to buy a desktop to replace my laptop as my primary video editing/rendering tool, but first I want to check that I'm already doing everything I can to make my video compression run as quickly as possible on the hardware I have.
    For example: I had 43GB of HDV video, all of it fairly high motion sports clips, and Compressor took a little over seven hours to compress it all into H.264 clips at 800kbps, totaling 1.3GB. I'm ready to buy a very powerful machine in order to drastically cut this compression time down, but I wanted to see how much of this is because of software limitations, as opposed to hardware limitations.
    Thanks.

    r daws,
    I switched from a MacBook Pro to a Mac Pro 2.26 Nehalem. The simple answer to your question is that it will speed up your compressions dramatically if you know how to set it up. I would estimate a factor of 10. That's right, it will take about 1/10 the time on a new Mac Pro vs a MacBook Pro, if you set it up right.
    Now that is just using Compressor. Rendering in FCP is faster too, but not by as big a factor; my experience is roughly 40-80% faster.
    With many Apple apps, like iLife, you won't notice a ton of difference.
    It all has to do with the number of cores and the utilization of those cores.
    If you set up Compressor to use Qmaster, you can set up a QuickCluster of 6-8 cores, and then Compressor splits the videos up and runs 6 different instances (separate copies of the program), which greatly speeds up the process.
    Also, using multiple internal hard drives makes it much more convenient, and faster as well.
    All said and done, it is hard to describe how happy I am with the speed difference between the Mac Pro and my MacBook Pro. I'd never go back.
    But the long answer is that 7 hours to compress could be either really great or really horrible depending on the settings.
    It takes me roughly 4-5 minutes to compress a seven-minute video using the default settings on DVD-90 Best Quality. When I change the frame controls to produce a higher-quality vid, the same sequence takes 45 minutes.
    Time is all dependent on the settings.
    Ken
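    As a rough yardstick on whether seven hours is great or horrible (assuming standard HDV at about 25 Mbit/s, which the original post doesn't state):
        43 GB x 8 ≈ 344,000 Mbit; 344,000 Mbit / 25 Mbit/s ≈ 13,800 s ≈ 3.8 hours of footage
    So seven hours of encoding is a bit under 2x real time, which is unremarkable for single-instance H.264 on a laptop; the big wins Ken describes come from running several Compressor instances in parallel across the extra cores.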

  • Can't compress a video small enough

    I'm a FC Express user and want to submit videos to a website that requires them to be 2 minutes long, 240x320, no bigger than 7 megs, and in .wmv format. I tried saving a video as a 240x320 QuickTime file compressed as much as possible, then converting it with the Flip4Mac QuickTime plug-in, again compressing as much as possible, to get a .wmv file, but it is still too big at 17 megs. Does anyone have any suggestions? Any help would be greatly appreciated. -Charlie Neuman, San Marcos, CA

    1-pass CBR means it looks at the video once and uses a Constant Bit Rate. I don't think you can do anything else when exporting from FCE directly. If you use something like Episode, made by Telestream (who make Flip4Mac), you can use 2-pass VBR. It looks at the media on the first pass to assess how it can compress it better, and then does the compression on the second pass. VBR stands for Variable Bit Rate: the bit rate can fluctuate between, say, 300 and 500 kbps depending on the complexity of the video. With 2-pass VBR and a little reduction of the bit rate, perhaps only at one end of that range, you'll get your 7MB.
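    The budget arithmetic backs this up: 7 MB over two minutes is
        7 x 8 x 1024 = 57,344 kbit; 57,344 kbit / 120 s ≈ 478 kbit/s combined
    so with, say, 32-64 kbit/s for audio, the video has to average roughly 410-445 kbit/s, comfortably inside the 300-500 range mentioned above.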

  • Edit and output video without modifying any pixels?

    That subject line sounds cryptic, I'm sure, but I need to assemble numerous snippets of animated pixel art into what should become a somewhat lengthy animation. Final Cut should perform no internal resizing and no compression prior to output; all input files are 240x160 without exception. Stills are PNG to take advantage of the Alpha channel, but the animated clips are uncompressed QuickTime with a single, unique RGB value reserved for the background for chroma-keying (all layering is to be all-or-nothing with regard to the Alpha/keying; there is no partial transparency at play).
    All I need is to compile these clips together, layer a few elements on top of each other, and tweak the timing here and there. Beyond that, Final Cut should leave everything exactly like it found it, down to the pixel. (I'll worry about the output format and compression later.)
    Is this possible (and easy) in Final Cut Express? In my disastrous trials with Adobe Premiere Elements (v.1), Premiere insisted on internally resizing the clips for editing, which fuzzed up the images and defeated the chroma keying (as if the fuzziness wasn't bothersome enough). It's possible this was due to user error, but as far as I can tell, nothing can prevent Premiere from reprocessing everything into its native DV dimensions, which is quite destructive toward pixel art. Anyway, I'm frustrated enough with Premiere that I'm willing to purchase Final Cut, if it can meet these requirements easily.
    Thanks much for any information you can provide. Oh, and if Final Cut could possibly output the entire sequence as an animated GIF, that'd be fantastic, but I tend to think it'd be a little too heavy-handed (even though my total palette is well below 256 colors) because it's used to crunching fancy true-color videos instead of this basic pixel art.

    After some further research, it seems my best approach for this project will be to compose my small QuickTime clips with all layering done beforehand, then assemble them with a utility such as QuickTime Pro or MPEG Streamclip, as discussed here:
    http://discussions.apple.com/thread.jspa?messageID=7813920&#7813920
    MPEG Streamclip, in fact, seems to work quite well in my initial tests. If the clips are numbered sequentially, dropping them into the application assembles them beautifully. And it's free!
    Thanks again to those who replied.

  • 2GB OR NOT 2GB - FILE LIMITS IN ORACLE

    Product: ORACLE SERVER
    Date written: 2002-04-11
    2GB OR NOT 2GB - FILE LIMITS IN ORACLE
    ======================================
    Introduction
    ~~~~~~~~~~~~
    This article describes "2Gb" issues. It gives information on why 2Gb
    is a magical number and outlines the issues you need to know about if
    you are considering using Oracle with files larger than 2Gb in size.
    It also looks at some other file-related limits and issues.
    The article has a Unix bias, as this is where most of the 2Gb issues
    arise, but there is information relevant to other (non-Unix) platforms.
    Articles giving port-specific limits are listed in the last section.
    Topics covered include:
    Why is 2Gb a Special Number ?
    Why use 2Gb+ Datafiles ?
    Export and 2Gb
    SQL*Loader and 2Gb
    Oracle and other 2Gb issues
    Port Specific Information on "Large Files"
    Why is 2Gb a Special Number ?
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Many CPUs and system call interfaces (APIs) in use today use a word
    size of 32 bits. This word size imposes limits on many operations.
    In many cases the standard APIs for file operations use a 32-bit signed
    word to represent both file size and current position within a file (byte
    displacement). A 'signed' 32-bit word uses the topmost bit as a sign
    indicator, leaving only 31 bits to represent the actual value (positive or
    negative). The largest positive number that can be represented in 31 bits
    is 0x7FFFFFFF hexadecimal, which is +2147483647 decimal.
    This is ONE less than 2Gb.
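    A two-line Java illustration of that boundary (any 32-bit signed arithmetic behaves the same way, whatever the language):
        public class TwoGbLimit {
            public static void main(String[] args) {
                System.out.println(Integer.MAX_VALUE);     // 2147483647 = 0x7FFFFFFF
                System.out.println(Integer.MAX_VALUE + 1); // wraps to -2147483648
            }
        }
    A file position held in such a word cannot reach byte 2147483648; the value simply wraps negative.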
    Files of 2Gb or more are generally known as 'large files'. As one might
    expect problems can start to surface once you try to use the number
    2147483648 or higher in a 32bit environment. To overcome this problem
    recent versions of operating systems have defined new system calls which
    typically use 64-bit addressing for file sizes and offsets. Recent Oracle
    releases make use of these new interfaces but there are a number of issues
    one should be aware of before deciding to use 'large files'.
    What does this mean when using Oracle ?
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    The 32bit issue affects Oracle in a number of ways. In order to use large
    files you need to have:
    1. An operating system that supports 2Gb+ files or raw devices
    2. An operating system which has an API to support I/O on 2Gb+ files
    3. A version of Oracle which uses this API
    Today most platforms support large files and have 64bit APIs for such
    files.
    Releases of Oracle from 7.3 onwards usually make use of these 64bit APIs
    but the situation is very dependent on platform, operating system version
    and the Oracle version. In some cases 'large file' support is present by
    default, while in other cases a special patch may be required.
    At the time of writing there are some tools within Oracle which have not
    been updated to use the new APIs, most notably tools like EXPORT and
    SQL*LOADER, but again the exact situation is platform- and version-specific.
    Why use 2Gb+ Datafiles ?
    ~~~~~~~~~~~~~~~~~~~~~~~~
    In this section we will try to summarise the advantages and disadvantages
    of using "large" files / devices for Oracle datafiles:
    Advantages of files larger than 2Gb:
    On most platforms Oracle7 supports up to 1022 datafiles.
    With files < 2Gb this limits the database size to less than 2044Gb.
    This is not an issue with Oracle8 which supports many more files.
    In reality the maximum database size would be less than 2044Gb due
    to maintaining separate data in separate tablespaces, some of which
    may be much less than 2Gb in size.
    Fewer files to manage for smaller databases.
    Fewer file handle resources required.
    Disadvantages of files larger than 2Gb:
    The unit of recovery is larger. A 2Gb file may take between 15 minutes
    and 1 hour to backup / restore depending on the backup media and
    disk speeds. An 8Gb file may take 4 times as long.
    Parallelism of backup / recovery operations may be impacted.
    There may be platform specific limitations - Eg: Asynchronous IO
    operations may be serialised above the 2Gb mark.
    Since handling of files above 2Gb may need patches, special configuration
    etc., there is an increased risk involved compared with smaller files.
    Eg: On certain AIX releases Asynchronous IO serialises above 2Gb.
    Important points if using files >= 2Gb
    Check with the OS Vendor to determine if large files are supported
    and how to configure for them.
    Check with the OS Vendor what the maximum file size actually is.
    Check with Oracle support if any patches or limitations apply
    on your platform , OS version and Oracle version.
    Remember to check again if you are considering upgrading either
    Oracle or the OS in case any patches are required in the release
    you are moving to.
    Make sure any operating system limits are set correctly to allow
    access to large files for all users.
    Make sure any backup scripts can also cope with large files.
    Note that there is still a limit to the maximum file size you
    can use for datafiles above 2Gb in size. The exact limit depends
    on the DB_BLOCK_SIZE of the database and the platform. On most
    platforms (Unix, NT, VMS) the limit on file size is around
    4194302*DB_BLOCK_SIZE.
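    For example, assuming the common 8K block size:
        4194302 x 8192 = 34,359,721,984 bytes, i.e. just under 32Gb
    With a 2K block size the ceiling is correspondingly just under 8Gb.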
    Important notes generally
    Be careful when allowing files to automatically resize. It is
    sensible to always limit the MAXSIZE for AUTOEXTEND files to less
    than 2Gb if not using 'large files', and to a sensible limit
    otherwise. Note that due to <Bug:568232> it is possible to specify
    a value of MAXSIZE larger than Oracle can cope with, which may
    result in internal errors after the resize occurs. (Errors
    typically include ORA-600 [3292])
    On many platforms Oracle datafiles have an additional header
    block at the start of the file so creating a file of 2Gb actually
    requires slightly more than 2Gb of disk space. On Unix platforms
    the additional header for datafiles is usually DB_BLOCK_SIZE bytes
    but may be larger when creating datafiles on raw devices.
    2Gb related Oracle Errors:
    These are a few of the errors which may occur when a 2Gb limit
    is present. They are not in any particular order.
    ORA-01119 Error in creating datafile xxxx
    ORA-27044 unable to write header block of file
    SVR4 Error: 22: Invalid argument
    ORA-19502 write error on file 'filename', blockno x (blocksize=nn)
    ORA-27070 skgfdisp: async read/write failed
    ORA-02237 invalid file size
    KCF:write/open error dba=xxxxxx block=xxxx online=xxxx file=xxxxxxxx
    file limit exceed.
    Unix error 27, EFBIG
    Export and 2Gb
    ~~~~~~~~~~~~~~
    2Gb Export File Size
    ~~~~~~~~~~~~~~~~~~~~
    At the time of writing most versions of export use the default file
    open API when creating an export file. This means that on many platforms
    it is impossible to export a file of 2Gb or larger to a file system file.
    There are several options available to overcome 2Gb file limits with
    export such as:
    - It is generally possible to write an export > 2Gb to a raw device.
    Obviously the raw device has to be large enough to fit the entire
    export into it.
    - By exporting to a named pipe (on Unix) one can compress, zip or
    split up the output.
    See: "Quick Reference to Exporting >2Gb on Unix" <Note:30528.1>
    - One can export to tape (on most platforms)
    See "Exporting to tape on Unix systems" <Note:30428.1>
    (This article also describes in detail how to export to
    a unix pipe, remote shell etc..)
    Other 2Gb Export Issues
    ~~~~~~~~~~~~~~~~~~~~~~~
    Oracle has a maximum extent size of 2Gb. Unfortunately there is a problem
    with EXPORT on many releases of Oracle such that if you export a large table
    and specify COMPRESS=Y then it is possible for the NEXT storage clause
    of the statement in the EXPORT file to contain a size above 2Gb. This
    will cause import to fail even if IGNORE=Y is specified at import time.
    This issue is reported in <Bug:708790> and is alerted in <Note:62436.1>
    An export will typically report errors like this when it hits a 2Gb
    limit:
    . . exporting table BIGEXPORT
    EXP-00015: error on row 10660 of table BIGEXPORT,
    column MYCOL, datatype 96
    EXP-00002: error in writing to export file
    EXP-00002: error in writing to export file
    EXP-00000: Export terminated unsuccessfully
    There is a secondary issue reported in <Bug:185855> which indicates that
    a full database export generates a CREATE TABLESPACE command with the
    file size specified in BYTES. If the filesize is above 2Gb this may
    cause an ORA-2237 error when attempting to create the file on IMPORT.
    This issue can be worked around by creating the tablespace prior to
    importing, specifying the file size in 'M' instead of in bytes.
    <Bug:490837> indicates a similar problem.
    Export to Tape
    ~~~~~~~~~~~~~~
    The VOLSIZE parameter for export is limited to values less than 4Gb.
    On some platforms it may be only 2Gb.
    This is corrected in Oracle 8i. <Bug:490190> describes this problem.
    SQL*Loader and 2Gb
    ~~~~~~~~~~~~~~~~~~
    Typically SQL*Loader will error when it attempts to open an input
    file larger than 2Gb with an error of the form:
    SQL*Loader-500: Unable to open file (bigfile.dat)
    SVR4 Error: 79: Value too large for defined data type
    The examples in <Note:30528.1> can be modified for use with SQL*Loader
    with large input data files.
    Oracle 8.0.6 provides large file support for discard and log files in
    SQL*Loader but the maximum input data file size still varies between
    platforms. See <Bug:948460> for details of the input file limit.
    <Bug:749600> covers the maximum discard file size.
    Oracle and other 2Gb issues
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~
    This section lists miscellaneous 2Gb issues:
    - From Oracle 8.0.5 onwards 64bit releases are available on most platforms.
    An extract from the 8.0.5 README file introduces these - see <Note:62252.1>
    - DBV (the database verification file program) may not be able to scan
    datafiles larger than 2Gb reporting "DBV-100".
    This is reported in <Bug:710888>
    - "DATAFILE ... SIZE xxxxxx" clauses of SQL commands in Oracle must be
    specified in 'M' or 'K' to create files larger than 2Gb otherwise the
    error "ORA-02237: invalid file size" is reported. This is documented
    in <Bug:185855>.
    - Tablespace quotas cannot exceed 2Gb on releases before Oracle 7.3.4.
    Eg: ALTER USER <username> QUOTA 2500M ON <tablespacename>
    reports
    ORA-2187: invalid quota specification.
    This is documented in <Bug:425831>.
    The workaround is to grant users UNLIMITED TABLESPACE privilege if they
    need a quota above 2Gb.
    - Tools which spool output may error if the spool file reaches 2Gb in size.
    Eg: sqlplus spool output.
    - Certain 'core' functions in Oracle tools do not support large files -
    See <Bug:749600> which is fixed in Oracle 8.0.6 and 8.1.6.
    Note that this fix is NOT in Oracle 8.1.5 nor in any patch set.
    Even with this fix there may still be large file restrictions as not
    all code uses these 'core' functions.
    Note though that <Bug:749600> covers CORE functions - some areas of code
    may still have problems.
    Eg: CORE is not used for SQL*Loader input file I/O
    - The UTL_FILE package uses the 'core' functions mentioned above and so is
    limited by 2Gb restrictions in Oracle releases which do not contain this fix.
    <Package:UTL_FILE> is a PL/SQL package which allows file IO from within
    PL/SQL.
    Port Specific Information on "Large Files"
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Below are references to information on large file support for specific
    platforms. Although every effort is made to keep the information in
    these articles up-to-date it is still advisable to carefully test any
    operation which reads or writes from / to large files:
    Platform             See
    ~~~~~~~~             ~~~
    AIX (RS6000 / SP)    <Note:60888.1>
    HP                   <Note:62407.1>
    Digital Unix         <Note:62426.1>
    Sequent PTX          <Note:62415.1>
    Sun Solaris          <Note:62409.1>
    Windows NT           Maximum 4Gb files on FAT,
                         theoretical 16Tb on NTFS
                         ** See <Note:67421.1> before using large files
                            on NT with Oracle8
                         *2 There is a problem with DBVERIFY on 8.1.6 -
                            see <Bug:1372172>

    I'm not aware of a packaged PL/SQL solution for this in Oracle 8.1.7.3 - however it is very easy to create such a program...
    Step 1
    Write a simple Java program like the one listed:
    import java.io.File;

    public class fileCheckUtl {
        // Return 1 if the named file exists, 0 otherwise
        // (an int maps more easily to a PL/SQL NUMBER than a boolean).
        public static int fileExists(String fileName) {
            File x = new File(fileName);
            if (x.exists())
                return 1;
            else
                return 0;
        }
        // Command-line test harness.
        public static void main(String[] args) {
            int i = fileCheckUtl.fileExists(args[0]);
            System.out.println(i);
        }
    }
    Step 2 - Load this into the Oracle database using loadjava:
    loadjava -verbose -resolve -user user/pw@db fileCheckUtl.java
    The output should be something like this:
    creating : source fileCheckUtl
    loading : source fileCheckUtl
    creating : fileCheckUtl
    resolving: source fileCheckUtl
    Step 3 - Create a PL/SQL wrapper for the Java Class:
    CREATE OR REPLACE FUNCTION FILE_CHECK_UTL (file_name IN VARCHAR2) RETURN NUMBER AS
    LANGUAGE JAVA
    NAME 'fileCheckUtl.fileExists(java.lang.String) return int';
    Step 4 Test it:
    SQL> select file_check_utl('f:\myjava\fileCheckUtl.java') from dual
      2  /
    FILE_CHECK_UTL('F:\MYJAVA\FILECHECKUTL.JAVA')
    ---------------------------------------------
                                                1

  • Problem with unstable and slow SWFs

    Good evening,
    Some of my users are reporting problems with the SWF output of my intensive Captivate file, which includes significant software simulation. I knew that these files were going to be resource-intensive due to the nature of the project, but I felt comfortable with system requirements of at least 2 GB RAM and a mid-range processor.
    Unfortunately, some of my users who meet these criteria are reporting problems, including text animations not playing in full and slow slide transitions. I cannot replicate the problem on any of my machines.
    I have been trying to compress images as much as possible and otherwise reduce the file size. What else can be done? What are the top five things I can do to remedy this problem?
    My presentation is about 50 slides and the SWF output is about 4.5 MB.
    Thanks

    I've seen this type of thing many times, especially in corporate environments.
    My top suggestions:
    If you've taken all reasonable steps to make your project as web-friendly as possible by this time, don't make any more changes to your project until you've exhausted the possibilities below.  It is most likely NOT your project that is at fault.
    Find out more details from the users that are complaining.  If at all possible, actually go and sit with them to watch the behaviour they're talking about.  Don't believe everything you hear.
    Check that users are on the correct version of Flash Player. For example, Flash Player 10 can play significantly better than Flash 9.
    Ask the user to check defragmentation on their PC.  Most users DON'T know how to do this and never defragment. As a result, I have seen PCs in corporate IT environments that had not been defragmented in years.  Their hard drive showed up mostly red when the fragmentation analysis was done.  This lack of PC housekeeping can contribute to many issues, especially with Flash or multimedia playback.  Ask the user to defragment 2 or 3 times to be on the safe side.
    Watch the CPU on Task Manager while the PC is playing the Captivate presentation.  The performance monitor should tell you if the CPU is struggling and unable to decode the content fast enough.
    Try to have another user log onto the same machine using a different profile and see if they experience the same issues.  This can tell you whether the issue is due to a corrupted user profile.  Conversely, have the same user log onto a different machine, creating a new profile for themselves on that PC to see if they experience the same issue.  If it seems to be their profile at fault, they may need to get some IT dude to blow away their profile and set it up afresh.
    Check whether or not the issue could be due to server latency.  This is especially the case where LMSs are involved.  Users will often experience latency issues such as slow transitions from one quiz slide to the next because the course module has to wait for the LMS to respond before it can move forward.  If this is occurring for a significant number of users, you can try using the optional SCORM template that only sends tracking data at the end of the module instead of all the way through.
    In my experience, if 95% of your user base is playing the content without issue, you don't need to change anything.  Chances are that whatever is going wrong for the few that are complaining about poor playback wouldn't necessarily improve no matter what you did to your project.  In the vast majority of cases, you need to isolate what exactly is causing the issue for these users and correct it ON THEIR END.
    Rod Ward
    www.infosemantics.com.au
    UPDATE: I just realised that a lot of this information would probably help others, so I created a blog post documenting these steps: http://www.infosemantics.com.au/debug_slow_playback

  • Animated Gif Color Problem

    Hi all,
    I want to make a small animated GIF with my lovely software, After Effects.
    The problem I face is that when I render my work, I get an animated GIF with bad colors (as if I had exported with 256 colors, for example).
    I set MILLIONS OF COLORS in the export settings, and I always get POOR color depth.
    I can see the true colors in the After Effects work panels; only when I put the composition in the render queue and export as Animated GIF do I get the poor colors.
    PLEASE HELP ME...
    Thanks

    As pointed out by the others, the limitation is in the file format. You will never exceed the 256-color barrier. This is further complicated by how AE determines the color palette. This usually happens on the first few frames, 'cos naturally the other frames do not exist yet and AE doesn't know about them. If there is considerable change in the color palette over time, this cannot be accommodated. Therefore you'd really do a lot better by following the advice provided by the others. In addition to Rick's tips, I recommend you work with the perceptual dithering mode when saving your GIFs. This usually gives the "smoothest" result, but may shift the colors ever so slightly. I also recommend not using transparency with animated GIFs. It has a severe impact on file size and compression quality, as well as possibly the playback performance in your browser. If you know the background color of your web page, it should be part of the file.
    Mylenium
