Elements 9: failure to read large files

I recently purchased a Nikon D5300. I was previously a D50 user and always saved and converted RAW NEF files with Elements 9. The D5300 saves 24 MB NEF files, and Elements 9 will not read these files. I changed to saving RAW and JPEG files, and it would not read those either. It will read lower-resolution JPEG files. Is there an update to Elements 9 to allow reading larger files?
Bob Jacobson
[email protected]

Similar Messages

  • File Adapter Not Reading Large Files

    Dear Experts,
    Environment:
    OS: Linux
    JDeveloper: 11.1.1.6
    SOA: 11.1.1.6
    WebLogic: 10.3.6
    JDK: Sun
    Allocated RAM: 16 GB
    We are currently in the UAT phase, and we are facing an issue reading a large file. Below is the design of the service:
    FileAdapter(Read) --> BPEL (business logic, using FlowN) --> FileAdapter(Write CSV), JMS Adapter (AQ JMS topics)
    The failure occurs at read time itself. The file adapter reads the XML file, but the receive activity receives the input data as
    xmlDocKey:1C135990067411E3BFA6B5087B629F9DI
    I really couldn't understand the error. Even when I tried reading in Opaque format, I still ended up with the same error.
    To narrow it down, I created a mediator and tried reading the file; in that case I was able to read files up to 15 MB without any error. I also tried "read as attachment" in the BPEL component and was able to read attachments up to 7 MB, but this hurts performance.
    Could someone please let me know why the file adapter is giving an xmlDocKey rather than the XML content to the input variable?
    Regards,
    Tarak

    Can you check your BPEL properties in EM?
    Go to soa-infra > right-click > SOA Administration > BPEL Properties.
    Increase Dispatcher System Threads to 10, Dispatcher Invoke Threads to 60, and Dispatcher Engine Threads to 90.
    Click on "More BPEL Configuration Properties" and increase DispatcherMaxRequestDepth from 600 to 1000.
    Bounce the server and try again.
    If it still fails, find the threshold by increasing the file size until it fails again.

  • How do I fix the failure to read PDF files?

    How do I fix the failure to read PDF files (many old and some new) with the new Acrobat Reader 11?

    What is your operating system? What exactly do you mean by "failure to read"; do you get an error message?

  • Reading large file with JCA Adapter in OSB

    Hello,
    We are searching for a solution for reading a large file (>50 MB) from a network drive and delivering it to a queue via OSB 11gR4 (10.3.4). The problem is reading the file with the JCA File Adapter: it seems that it cannot handle files as large as ours. The documentation provides a way to bypass the file size limitation by using chunked read, but that seems to require a BPEL process, which is not possible in our environment. Does anyone know of a way to implement this without a BPEL process?
    Our use case:
    read file from network drive -> transfer with OSB -> deliver to MQ
    Options other than the JCA File Adapter can be considered, if anyone can advise...

    If it's a plain routing use case and no message processing is required, then you may simply use OSB's File transport instead of the JCA adapter. Create a messaging-type proxy service and select the request message type as "binary". Also enable content streaming (disk buffer, compression).
    From the OSB Dev Guide -
    Oracle JCA Adapter for FTP and Files – Attachments (large payload support), pre- and post-processing of files, using a re-entrant valve for processing ZIP files, content streaming, and file chunked read are not supported with Oracle Service Bus.
    http://download.oracle.com/docs/cd/E17904_01/doc.1111/e15866/jca.htm#BABBICIA
    You may also refer -
    Reading huge flat file in OSB 11gR1
    Regards,
    Anuj

  • UTL_FILE.get_line won't read large files?

    I am trying to read a large fixed-length flat file. If I cut the file down to something really small, it reads it, but as a single line. If I try to read a larger file (> 32k), I get a READ_ERROR. I am pretty sure it has to do with the end-of-line marker, but I saw nothing about that in the UTL_FILE documentation. This is on UNIX, with a newline character after each record: a standard UNIX flat file.
    Any ideas on what to do?
    Thanks in advance
    Matt
    [email protected]
    my code:
    DECLARE
      -- declarations were missing from the post; names come from the code below,
      -- types and sizes are assumed
      std_file     UTL_FILE.FILE_TYPE;
      hdr_text     VARCHAR2(32767);
      tran_text    VARCHAR2(32767);
      std_rowcount PLS_INTEGER := 0;
    BEGIN
      BEGIN
        std_file := UTL_FILE.FOPEN('&4','&1','r',32767);
      EXCEPTION
        WHEN UTL_FILE.INVALID_PATH THEN
          RAISE_APPLICATION_ERROR(-20011,'Invalid Path for STD file, &4/&1');
        WHEN OTHERS THEN
          RAISE_APPLICATION_ERROR(-20014,'Other Error trying to open STD file, &4/&1');
      END;
      IF UTL_FILE.IS_OPEN(std_file) = FALSE THEN
        RAISE_APPLICATION_ERROR(-20015,'Could not open STD file, &4/&1');
      END IF;
      -- READ STD FILE HEADER
      BEGIN
        UTL_FILE.GET_LINE(std_file,hdr_text);
      EXCEPTION
        WHEN UTL_FILE.INVALID_FILEHANDLE THEN
          RAISE_APPLICATION_ERROR(-20017,'STD read file handle not valid');
        WHEN UTL_FILE.INVALID_OPERATION THEN
          RAISE_APPLICATION_ERROR(-20018,'STD read invalid operation error');
        WHEN UTL_FILE.READ_ERROR THEN
          RAISE_APPLICATION_ERROR(-20019,'STD read error');
        WHEN NO_DATA_FOUND THEN
          RAISE_APPLICATION_ERROR(-20020,'STD read no data found');
        WHEN VALUE_ERROR THEN
          RAISE_APPLICATION_ERROR(-20021,'STD read value error');
      END;
      -- PROCESS TRANSACTIONS
      LOOP
        BEGIN
          tran_text := NULL;
          UTL_FILE.GET_LINE(std_file,tran_text);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT; -- EOF
          WHEN VALUE_ERROR THEN
            RAISE_APPLICATION_ERROR(-20010,'STD record too long.');
        END;
        std_rowcount := std_rowcount + 1;
      END LOOP;
      UTL_FILE.FCLOSE(std_file);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        RAISE_APPLICATION_ERROR(-20001,'No Data Found.');
      WHEN UTL_FILE.INVALID_PATH THEN
        RAISE_APPLICATION_ERROR(-20002,'Invalid Path ');
      WHEN UTL_FILE.INVALID_MODE THEN
        RAISE_APPLICATION_ERROR(-20003,'Invalid Mode ');
      WHEN UTL_FILE.INVALID_OPERATION THEN
        RAISE_APPLICATION_ERROR(-20004,'Invalid Operation ');
    END;
    /

    We are still hung up on this. I tried implementing the code from Steve's XML book but still haven't resolved it.
    The CLOB is being created via XSU; see below. The new char[8192] appears to force the output file to 8K
    with trailing characters on small CLOBs, but adds a carriage return every 8K on larger ones.
    As usual, any input is appreciated. Does anyone know of a good Java forum like this one?
    Thanks
    PROCEDURE BuildXml(v_return OUT INTEGER, v_message OUT VARCHAR2, string_in VARCHAR2, xml_CLOB OUT NOCOPY CLOB) IS
      queryCtx   DBMS_XMLQuery.ctxType;
      Buffer     RAW(1024);
      Amount     BINARY_INTEGER := 1024;
      Position   INTEGER := 1;
      sql_string VARCHAR2(2000) := string_in;
    BEGIN
      v_return := 1;
      v_message := 'BuildXml completed successfully.';
      queryCtx := DBMS_XMLQuery.newContext(sql_string);
      xml_CLOB := DBMS_XMLQuery.getXML(queryCtx);
      DBMS_XMLQuery.closeContext(queryCtx);
    EXCEPTION WHEN OTHERS THEN
      v_return := 0;
      v_message := 'BuildXml failed - '||SQLERRM;
    END BuildXml;
    create or replace and compile java source named sjs.write_CLOB as
    import java.io.*;
    import java.sql.*;
    import oracle.sql.*;
    public class write_CLOB {
        public static void pass_str_array(oracle.sql.CLOB p_in, java.lang.String f_in)
                throws java.sql.SQLException, IOException {
            File target = new File(f_in);
            FileWriter fw = new FileWriter(target);
            BufferedWriter out = new BufferedWriter(fw);
            Reader is = p_in.getCharacterStream();
            char buffer[] = new char[8192];
            int length;
            while ((length = is.read(buffer)) != -1) {
                // write only the characters actually read; writing the whole buffer
                // is what produced the trailing characters described above
                out.write(buffer, 0, length);
            }
            is.close();
            out.close(); // close the BufferedWriter so it flushes before the file closes
        }
    }
    /

  • Reading large files -- use FileChannel or BufferedReader?

    Question --
    I need to read files and get their content. The issue is that I have no idea how big the files will be. My best guess is that most are less than 5 KB, but some will be huge.
    I have it set up using a BufferedReader, which is working fine. It's not the fastest thing (using readLine() and StringBuffer.append()), but so far it's usable. However, I'm worried that if I need to deal with large files, such as a PDF or other binary, BufferedReader won't be so efficient if I read line by line. (And will I run into issues trying to put a binary file into a String?)
    I found a post that recommended FileChannel and ByteBuffer, but I'm running into a java.lang.UnsupportedOperationException when trying to get the byte[] from the ByteBuffer.
    File f = new File(binFileName);
    FileInputStream fis = new FileInputStream(f);
    FileChannel fc = fis.getChannel();
    // Get the file's size and then map it into memory
    int sz = (int)fc.size();
    MappedByteBuffer bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, sz);
    fc.close();
    String contents = new String(bb.array()); //code blows up
    Thanks in advance.
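    For what it's worth, a MappedByteBuffer is not backed by an accessible array, which is why bb.array() throws UnsupportedOperationException; copying the bytes out with get() avoids that. A minimal sketch (file name hypothetical, platform default charset assumed):
    import java.io.FileInputStream;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    public class MappedRead {
        public static void main(String[] args) throws Exception {
            FileInputStream fis = new FileInputStream("data.bin"); // hypothetical file
            FileChannel fc = fis.getChannel();
            int sz = (int) fc.size(); // assumes the file fits in one int-sized buffer
            MappedByteBuffer bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, sz);
            fc.close(); // the mapping stays valid after the channel is closed
            byte[] bytes = new byte[sz];
            bb.get(bytes); // copy out; array() is unsupported on mapped buffers
            String contents = new String(bytes); // platform default charset
            System.out.println(contents.length() + " chars read");
        }
    }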

    If all you are doing is reading data, I don't think you're going to get much faster than InfoFetcher.
    You are welcome to use and modify this class, but please don't change the package or take credit for it as your own work.
    InfoFetcher.java
    ==============
    package tjacobs.io;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.ArrayList;
    import java.util.Iterator;
    /**
     * InfoFetcher is a generic way to read data from an input stream (file, socket, etc.).
     * InfoFetcher can be set up with a thread so that it reads from an input stream
     * and reports to registered listeners as it gets more information. This vastly
     * simplifies the process of always re-writing the same code for reading from an
     * input stream.
     * <p>
     * I use this all over.
     */
    public class InfoFetcher implements Runnable {
        // Note: TimeOut, IOUtils, InputStreamListener, InputStreamEvent and
        // PartialReadException are companion classes in the tjacobs.io package,
        // not included in the post.
        public byte[] buf;
        public InputStream in;
        public int waitTime;
        private ArrayList mListeners;
        public int got = 0;
        protected boolean mClearBufferFlag = false;

        public InfoFetcher(InputStream in, byte[] buf, int waitTime) {
            this.buf = buf;
            this.in = in;
            this.waitTime = waitTime;
        }

        public void addInputStreamListener(InputStreamListener fll) {
            if (mListeners == null) {
                mListeners = new ArrayList(2);
            }
            if (!mListeners.contains(fll)) {
                mListeners.add(fll);
            }
        }

        public void removeInputStreamListener(InputStreamListener fll) {
            if (mListeners == null) {
                return;
            }
            mListeners.remove(fll);
        }

        public byte[] readCompletely() {
            run();
            return buf;
        }

        public int got() {
            return got;
        }

        public void run() {
            if (waitTime > 0) {
                TimeOut to = new TimeOut(waitTime);
                Thread t = new Thread(to);
                t.start();
            }
            int b;
            try {
                while ((b = in.read()) != -1) {
                    if (got + 1 > buf.length) {
                        buf = IOUtils.expandBuf(buf);
                    }
                    int start = got;
                    buf[got++] = (byte) b;
                    // grab whatever else is already available in one bulk read
                    int available = in.available();
                    //System.out.println("got = " + got + " available = " + available + " buf.length = " + buf.length);
                    if (got + available > buf.length) {
                        buf = IOUtils.expandBuf(buf, Math.max(got + available, buf.length * 2));
                    }
                    got += in.read(buf, got, available);
                    signalListeners(false, start);
                    if (mClearBufferFlag) {
                        mClearBufferFlag = false;
                        got = 0;
                    }
                }
            } catch (IOException iox) {
                throw new PartialReadException(got, buf.length);
            } finally {
                buf = IOUtils.trimBuf(buf, got);
                signalListeners(true);
            }
        }

        private void setClearBufferFlag(boolean status) {
            mClearBufferFlag = status;
        }

        public void clearBuffer() {
            setClearBufferFlag(true);
        }

        private void signalListeners(boolean over) {
            signalListeners(over, 0);
        }

        private void signalListeners(boolean over, int start) {
            if (mListeners != null) {
                Iterator i = mListeners.iterator();
                InputStreamEvent ev = new InputStreamEvent(got, buf, start);
                //System.out.println("got: " + got + " buf = " + new String(buf, 0, 20));
                while (i.hasNext()) {
                    InputStreamListener fll = (InputStreamListener) i.next();
                    if (over) {
                        fll.gotAll(ev);
                    } else {
                        fll.gotMore(ev);
                    }
                }
            }
        }
    }
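    A minimal usage sketch for the class above (the file name is hypothetical); readCompletely() runs the fetch synchronously on the calling thread:
    import java.io.FileInputStream;
    import java.io.InputStream;
    import tjacobs.io.InfoFetcher;
    public class FetchDemo {
        public static void main(String[] args) throws Exception {
            InputStream in = new FileInputStream("big.dat"); // hypothetical file
            // start small; InfoFetcher grows the buffer as data arrives
            InfoFetcher fetcher = new InfoFetcher(in, new byte[8192], 0); // waitTime 0: no timeout thread
            byte[] data = fetcher.readCompletely();
            System.out.println("read " + data.length + " bytes");
            in.close();
        }
    }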

  • Adobe Reader 8 fails to read large files

    I have an annoying issue: I installed Adobe Reader 8 recently.
    It reads small files (less than 2 MB) without any problem, but when I try to open files larger than 2 MB, it hangs, taking all other processes with it, and I have to open Task Manager to terminate it.
    Why is this?

    I have a fairly new Windows Vista machine with Adobe Reader 8. It will not open large e-mails near the 5 MB range; I know they are mostly pictures and videos. I have Adobe Reader 8, Adobe Media Player, and Adobe Photoshop on my desktop. I tried downloading Adobe Reader 9, but I don't see it on the desktop. What can I do to solve this? I am a retired pharmacist, and my computer knowledge is between beginner and intermediate.
    Thank you,
    [email protected]

  • Need help using JTextArea to read large file

    Hi, here is the deal.
    I've got a large file (about 12 MB) of raw data (all of the numbers are basically doubles or ints) which was written with ObjectOutputStream.writeInt/writeDouble (I say this to make clear the file has no ASCII whatsoever).
    I do the file reading on a SwingWorker thread, where I read the info from the file in the same order I originally wrote it.
    I need to convert it to a String and display it in a JTextArea. It starts working; however, at one point (56% to be exact, since I know exactly the number of values I need to read) it stops working. The program doesn't freeze (probably because the other worker thread froze), and I get no exceptions (even though I'm catching them) and no errors.
    Does anyone have any idea what the problem could be?
    Thank you very much in advance.
    PS: I don't know if it matters, but I'm using ObjectInputStream with the readInt/readDouble functions to get the values and then turning them into Strings and adding them to the JTextArea.

    I can put up the code. I don't have it with me right now, but I'll do it later. Thank you.
    Second, I need to debug a function approximation that uses a method that has to manage that many numbers. If I don't put the numbers into a txt file and read it, there is no way I will know where the problems are, if any. And yes, I can look at the txt file and figure out problems; it's not that hard.
    What I'll try to do is write directly to a regular txt file instead of the JTextArea (a sketch of that approach follows).
    Thank you for your help, and I'll post back with the code and results.
    PS: I don't know what profiling is, would you mind telling me?
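    A minimal sketch of that "write to a text file instead of the JTextArea" approach, assuming the file was written as alternating writeInt/writeDouble pairs (the file names and the pairing are hypothetical):
    import java.io.*;
    public class DumpBinary {
        public static void main(String[] args) throws IOException {
            ObjectInputStream in = new ObjectInputStream(
                    new BufferedInputStream(new FileInputStream("data.bin"))); // hypothetical input
            PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter("dump.txt")));
            try {
                while (true) {
                    // assumes the writer alternated writeInt and writeDouble
                    int i = in.readInt();
                    double d = in.readDouble();
                    out.println(i + "\t" + d);
                }
            } catch (EOFException eof) {
                // readInt/readDouble signal end of stream with EOFException
            } finally {
                in.close();
                out.close();
            }
        }
    }
    If the text file comes out complete, that would suggest the stall was on the Swing side (appending to the JTextArea) rather than in the reading code.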

  • Technique to read large files (20-200 MB) using IO

    Hi,
    Can anyone send me code to read large XML files, say 20-200 MB, from the network or the local machine? When I use normal IO, it takes a huge amount of time.
    I need an optimized or buffered approach.
    Thanks and Regards
    Amit

    I am sorry; I don't know why you are so rude. I am not giving you homework assigned to me. I am just a new learner, and I also don't like spoon-feeding, but I am a C++ programmer who is just learning Java.
    I am getting the error
    Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    when using Xerces 2.9.0 and JDK 1.5.
    The XML file size is approx 1 GB.
    I am using
    try {
        File file = new File(m_szFileName);
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        DocumentBuilder db = dbf.newDocumentBuilder();
        Document doc = db.parse(file);
        doc.getDocumentElement().normalize();
        m_rootNode = doc.getDocumentElement();
    } catch (Exception e) {
        e.printStackTrace();
    }
    The error occurs at
    Document doc = db.parse(file);
    I think this loads the whole file into the heap, and the heap is not sufficient. The code runs if I use a file of up to 5 MB.
    I want to know whether it uses some buffering mechanism internally or whether I need to do that externally; if so, I am asking for sample code.
    I also found solutions like
    java -Xms<initial heap size> -Xmx<maximum heap size>
    Defaults are:
    java -Xms32m -Xmx128m
    but I don't think increasing the heap is a proper solution; it seems like the wrong approach to grow the heap to match the file, especially when I don't know the maximum size of the XML. It can be up to 4 GB, and I may also need to read a saved Event Viewer XML file, whose maximum size is 4 GB.
    I also came across a bug report:
    http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6536111
    for JRE 1.6, but I think I am using JRE 1.5 because my JDK version is 1.5.
    Can you guide me?
    Thanks in advance
    amit3281
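    Since DOM builds the entire document in memory, a streaming parser keeps the memory footprint flat no matter how large the file is. A minimal SAX sketch (the file name and the "record" element are hypothetical placeholders):
    import java.io.File;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;
    public class CountElements {
        public static void main(String[] args) throws Exception {
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            final int[] count = {0};
            // SAX delivers one event at a time; the document is never held in memory
            parser.parse(new File("big.xml"), new DefaultHandler() {
                public void startElement(String uri, String localName,
                                         String qName, Attributes attrs) {
                    if (qName.equals("record")) { // hypothetical element of interest
                        count[0]++;
                    }
                }
            });
            System.out.println(count[0] + " <record> elements found");
        }
    }
    The handler-based style works on JDK 1.5 with the bundled parser, so it fits the environment described above.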

  • Read Large Files Using BPEL File Adapter

    Hi,
    I have a scenario where large text files are to be read and sent to a third party, with the MTOM policy attached. Files of 3 MB or less are polled, but files greater than 3 MB are not retrieved. How can I resolve the issue? Do I have to break the file up and send the data in pieces? If so, how?
    Thanks
    Ranga

    You have to use the streaming feature of the JCA file adapter to handle huge files.
    You can go through the following link:
      http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#CIAHHEBF

  • Elements 10 will not read RAW files from my new Lumix G5

    All my RAW files taken with Lumix G1 are fine but the RAW files from my new camera (Lumix G5) won't open. What should I do?
    Looking at help in E10, it says ACR is version 6.7.0.339
    Elements 10 is version 10.0 (20110831.m.17215)
    Any ideas?

    You need at least ACR 7.2 for those raw files, and ACR 7 is not compatible with PSE 10. So you have three choices:
    1. Upgrade to PSE 11 (not the Mac App Store version, which is stuck at 7.1). If you are considering this, I would wait a month or so, since Adobe historically releases a new version every Sept/Oct.
    2. Use the Panasonic software (SilkyPix?) to convert the raw files and send the resulting images to PSE for further processing.
    3. Download the free Adobe DNG Converter, which can create DNG files that are compatible with your version of ACR.

  • How to read large files (20-30 MB) using File/FTP adapters.

    Hi All,
    I am using JDev 10.1.3.3 and SOA 10.1.3.3.
    In my project I sometimes get files of 20 MB to 30 MB,
    and the data in the files is in binary form.
    How can I read these types of files?
    I think batching is not possible because the data is binary.
    Please help me out...
    Regards
    PavanKumar.

    Hi Anirudh,
    the tips provided in the technotes are not suitable for my process structure.
    In my case the data will be in binary format.
    First I need to read the payload, and later in my process I have to archive it using a web service.

  • Streaming basics - reading large file from RESTful API

    Hey all -
    My command of PowerShell is pretty rudimentary, though I have written a number of scripts. In a recent example, I was trying to redirect the output of my firm's RESTful API (a small .txt file) into a file using | Out-File <path and file>. That worked just fine. However, when I tried the same with a docx or pdf file, I ended up with a small file that was unopenable.
    I gather what I need to do is stream the content to a file, but I can't quite see how that is done in PowerShell. The files might be many MB in size and take upwards of 30 seconds to transfer.
    Any basic guidance on how to approach this would be appreciated. Thanks -
    Paul

    Oh so close!
    I'm getting hung up on how to create and pass the session object itself.
    My first command,
    Invoke-RestMethod -Uri "https://test.lofco.com/v1/ILCGLServices/session?method=CREATE&client=DEV-TEST&xml=<createSession><apiKey>9588060291......</apiKey><email>...</email><password>...</password><endOtherSessions>T</endOtherSessions></createSession>" -SessionVariable s -OutFile 'c:\LofCo\Demos\API Code Samples\XML\Output\createsession_response.xml'
    returns successfully; I can see the session ID in the output file. I was hoping it was captured in the variable s via the -SessionVariable approach, but when I run the second command:
    Invoke-RestMethod -Uri "https://test.lofco.com/services/workspaces/entry?client=DEV-TEST&workspaceId=226691&method=CREATE&xml=<workspaceEntryRequest><acceptSplash>T</acceptSplash></workspaceEntryRequest>" -WebSession $s -Method Post -OutFile 'C:\LofCo\Demos\API Code Samples\XML\Output\enterexchange_response.xml'
    I get a 403 error indicating the session has expired or is otherwise invalid. I'm clearly confused as to how to capture this session and pass it along...
    Any pointers on that piece? And thanks for your previous assistance.
    Paul
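    PowerShell specifics aside, the underlying fix is to stream the raw bytes to disk instead of capturing them as text. For illustration only, here is the same pattern in Java's HttpClient (Java 11+; the URL and file name are hypothetical):
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Path;
    public class StreamDownload {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.com/api/report.pdf")) // hypothetical endpoint
                    .build();
            // ofFile streams the body straight to disk with no text decoding,
            // which is what binary content (docx, pdf) requires
            HttpResponse<Path> response = client.send(
                    request, HttpResponse.BodyHandlers.ofFile(Path.of("report.pdf")));
            System.out.println("saved to " + response.body());
        }
    }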

  • Large file with fstream with WS6U2

    I can't read large files (>2 GB) with the STL fstream. Can anyone do this, or is this a limitation of the WS6U2 fstream classes? I can read large files with low-level C functions.

    I thought that WS6U2 meant Forte 6 Update 2. As for more information: the OS is SunOS 5.8, and the file system is NFS-mounted from an HP-UX 11.00 box and is largefile-enabled. My belief is that fstream does not implement access to large files, but I can't be sure.
    Anyway, I'm not sure what you mean by the compiler's access to the OS largefile support, but as I mentioned before, I can read more than 2 GB with open() and read(). My problem is with fstream; my belief is that fstream must be largefile-enabled. Any ideas?

  • Along the lines of How To Load Large Files

    I have some mainframe extract files loaded onto a Solaris drive, between 1 and 4 GB each, to be used in an initial load of a data warehouse. I can't even open a file with file sizes that large. (We're running JDK 1.2.2; not sure if that matters.) I'm using this statement -
    bufReader = new BufferedReader(new InputStreamReader(new FileInputStream(fileName)));
    This is the error I get on that statement -
    java.io.FileNotFoundException: /wrhs_data/export.13.aug02 (Value too larg)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:68)
    at com.cofinity.importer.MyFileReader.open(MyFileReader.java:40)
    at com.cofinity.importer.MyFileReader.main(Compiled Code)
    The statement works fine on files less than 220 MB. It breaks somewhere between 220 and 804 MB.
    From the error message it seems that the underlying native call can't handle opening such a large file. I've searched for the "Value too larg" sub-message and found nothing. I tried eliminating the BufferedReader and just using the InputStreamReader, but I received the same error.
    Does anyone know how Java can read large files in the 1 to 4 GB range? (I suppose I could use something like Informatica to split the files up, but our disk space is at a premium.) Any help would be greatly appreciated.
    Thanks,
    Steve

    Well, it appears to fail in open(). I tried your code on a binary file of 25739135241 bytes (23.9+ gibibytes) on AIX and it did just fine, so it may be something in the runtime. Try upgrading to a newer JDK/SDK; failing that, use the OS to stream in your data:
    BufferedReader br = new BufferedReader(
            new InputStreamReader(System.in)
    );
    And just pipe/redirect your file to your Java process's standard in. (A runnable sketch follows.)
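    A runnable sketch of that workaround (class name hypothetical); the shell opens the large file, so the JVM's open() never sees it:
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    public class StdinCount {
        public static void main(String[] args) throws IOException {
            // the file arrives via the shell redirect, bypassing FileInputStream.open()
            BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
            long lines = 0;
            while (br.readLine() != null) {
                lines++;
            }
            System.out.println(lines + " lines");
        }
    }
    Run it as, for example: java StdinCount < /wrhs_data/export.13.aug02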
