Digest a large file using SHA1

Hi,
Is it possible to hash a 1 GB file with the MessageDigest class? The update methods of MessageDigest only take a byte[] as an argument. I guess reading the entire file (of size 1 GB) into a byte array and passing that to MessageDigest is not a good idea. Instead of doing that, is there any other way / API support for feeding a FileInputStream into a MessageDigest? It would be good if it took care of intelligent hashing of large files.
I tried multiple update() calls (on the MessageDigest object) and it takes 80 seconds to digest a single 1.4 GB file. I believe that is a huge amount of time to hash one file. On my system I have billions of files that need to be hashed. Please suggest any other mechanism to do it more quickly.

Dude, your worry is well-founded - but it isn't a Java problem. What people are trying to tell you is, you can't do what you want, because you don't have enough computing power. Let's do the math, shall we? You give a figure of "100 terabytes daily". Let's assume you want to limit that to "in eight hours", so you can run your backup and still leave time to actually deal with the files.
So - you want to process 100,000,000,000,000 bytes / (8*60*60) s, which is roughly 3.5 GB/s. On a 3 GHz machine, that leaves you less than one instruction cycle per byte to load the data from tape (!!!), run the SHA-1 algorithm, and write the results.
What's the maximum streaming capacity, in MB/sec, of your tape system?
I don't think there's ANY reasonable way of doing this serially. You could manage it with a processing farm - with 100 GB tapes, assign one tape to a single CPU, and then your "terabytes" of storage are handleable: each machine only has to sustain about 35 MB/s. You just need 100 machines to get the job done.
In any event - your speed issue is almost certainly NOT Java's problem.
Grant
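
For the API part of the question: you don't have to build a 1 GB byte[]. Wrapping the stream in java.security.DigestInputStream (or calling update(byte[], int, int) once per buffer) keeps only one small buffer in memory, and at 80 s for 1.4 GB (about 17 MB/s) the bottleneck is more likely disk throughput than the digest itself. A minimal sketch (the 64 KB buffer size is an arbitrary choice):

    import java.io.*;
    import java.security.*;

    public class FileHash {
        // Streams the file through the digest; memory use stays at one buffer.
        public static byte[] sha1(File f) throws IOException, NoSuchAlgorithmException {
            MessageDigest md = MessageDigest.getInstance("SHA-1");
            InputStream in = new DigestInputStream(
                    new BufferedInputStream(new FileInputStream(f)), md);
            try {
                byte[] buf = new byte[64 * 1024];
                while (in.read(buf) != -1) {
                    // DigestInputStream feeds every byte read into the digest.
                }
            } finally {
                in.close();
            }
            return md.digest();
        }
    }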

Similar Messages

  • Processing Large Files using Chunk Mode with ICO

    Hi All,
    I am trying to process large files using ICO. I am on PI 7.3 and I am using a new PI 7.3 feature to split the input file into chunks.
    And I know that we cannot use mapping while using Chunk Mode.
    While trying, I noticed the points below:
    1) I created a Data Type, Message Type and Interfaces in ESR and used them in my scenario (no mapping was defined); the sender and receiver data types were the same.
    Result: the scenario did not work. It created only one chunk file (.tmp file) and terminated.
    2) I used a Dummy Interface in my scenario and it worked fine.
    So, please confirm whether we should always use Dummy Interfaces in a scenario while using Chunk Mode in PI 7.3, or is there something that I am missing?
    Thanks in Advance,
    - Pooja.

    Hello,
    According to this blog:
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    The following limitations apply to chunk mode in the File Adapter. As per the screenshots there, the split never considers the payload; it's just a binary split. So the following limitations apply:
    Only for File Sender to File Receiver
    No Mapping
    No Content Based Routing
    No Content Conversion
    No Custom Modules
    Probably you are doing content conversion; that is why it is not working.
    Hope this helps,
    Mark
    Edited by: Mark Dihiansan on Mar 5, 2012 12:58 PM

  • Trying to transfer a large file using new ipod touch.

    I am trying to transfer a large file (17 GB) using my new iPod touch. I got a new laptop and I am trying to get some things onto it. I was able to do this last time with my old iPod classic: it would let me copy/paste the file or drag it. But with the iPod touch I cannot do either of those. Any help will be appreciated. Thank you very much.
    Message was edited by: usagisailormoon

    ah really. ok thank you very much.

  • How to Expire Large Files using File Server Resource Manager

    Is there a way to expire large files over 2 GB that have not been accessed in 2 years?
    I see under the File expiration options that I can expire files that have not been Created, Modified, or Accessed for a certain amount of time.
    Thanks,
    Eddie

    Hi Eddie,
    FSRM can help report large files and can also help move old files to a folder, but I have not found a way to combine the two in a single process.
    Instead how about using Robocopy?
    You can run robocopy /min:xxx /minlad:xxx <source> <target>.
    /MIN:n :: MINimum file size - exclude files smaller than n bytes.
    /MINLAD:n :: MINimum Last Access Date - exclude files used since n.
    (If n < 1900 then n = n days, else n = YYYYMMDD date).
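    For example, to move files over 2 GB that have not been used in two years (paths hypothetical; 2 GB = 2147483648 bytes, 2 years is roughly 730 days, /mov deletes source files after copying, /s recurses into subdirectories):

    robocopy D:\Shares\Data E:\Archive /s /mov /min:2147483648 /minlad:730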
    Please remember to mark the replies as answers if they help and un-mark them if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • Process large file using BPEL

    My project has a requirement to process a large file (10 MB) all at once. In the project, the file adapter reads the file, then calls 5 other BPEL processes to do 10 different validations before delivering to the Oracle database. I can't use the adapter's debatching feature because of the header and detail record validation requirement. I did some performance tuning (e.g. audit level to minimum, logging level to error, JVM size to 2 GB, etc.) as specified in the Oracle BPEL user guide. We are using a 4-CPU, 4 GB RAM IBM AIX 5L server. I observed that the Receive activity at the beginning of each process takes a lot of time, while the other transient processes behave as expected.
    Following are statistics for receive activity per BPEL process:
    500KB: 40 Sec
    3MB: 1 Hour
    Because we have 5 BPEL processes, a lot of time is wasted in the Receive activity.
    I didn't try 10 MB so far, because of the poor performance figures for the 3 MB file.
    Does anyone have any idea how to improve the performance of the initial Receive activity of a BPEL process?
    Thanks
    -Simanchal

    I believe the limit in SOA Suite is 7 MB if you want to use the full payload and perform some kind of orchestration. Otherwise you need to do some kind of debatching, which you stated will not work.
    SOA Suite is not really designed for your kind of use case, as it needs to process this file in memory; when any transformation occurs it can increase this message between 3 and 10 times. If you are writing to a database, why can't you read the rows one by one?
    If you want to perform this kind of action, have a look at ODI (Oracle Data Integrator). I also believe that OSB (AquaLogic) can handle files up to 200 MB, so this can be an option as well, but it may require debatching.
    cheers
    James

  • I am having trouble receiving large files using a T1 line.

    I can receive large files (3-10 MB) using any connection other than T1. Is there a setting I need to change on my MacBook Pro?

    Obviously you're talking about a setting in a program?
    Maybe double NAT?
    It would help if you gave a description of the program you're using.
    What's your result on http://www.speedtest.net

  • Reading large files -- use FileChannel or BufferedReader?

    Question --
    I need to read files and get their content. The issue is that I have no idea how big the files will be. My best guess is that most are less than 5 KB but some will be huge.
    I have it set up using a BufferedReader, which is working fine. It's not the fastest thing (using readLine() and StringBuffer.append()), but so far it's usable. However, I'm worried that if I need to deal with large files, such as a PDF or other binary, BufferedReader won't be so efficient if I do it line by line. (And will I run into issues trying to put a binary file into a String?)
    I found a post that recommended FileChannel and ByteBuffer, but I'm running into a java.lang.UnsupportedOperationException when trying to get the byte[] from ByteBuffer.
    File f = new File(binFileName);
    FileInputStream fis = new FileInputStream(f);
    FileChannel fc = fis.getChannel();
    // Get the file's size and then map it into memory
    int sz = (int)fc.size();
    MappedByteBuffer bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, sz);
    fc.close();
    String contents = new String(bb.array()); //code blows up
    Thanks in advance.
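
    For later readers: the exception on the last line is expected. A MappedByteBuffer is not backed by an accessible byte array, so array() always throws UnsupportedOperationException. A minimal fix is to copy the bytes out explicitly:

    // Instead of bb.array(), copy the mapped bytes into a heap array.
    byte[] data = new byte[bb.remaining()];
    bb.get(data);
    String contents = new String(data); // meaningful for text files only, not PDFs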

    If all you are doing is reading data, I don't think you're going to get much faster than InfoFetcher.
    You are welcome to use and modify this class, but please don't change the package or take credit for it as your own work.
    InfoFetcher.java
    ==============
    package tjacobs.io;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.ArrayList;
    import java.util.Iterator;
    /**
     * InfoFetcher is a generic way to read data from an input stream (file, socket, etc).
     * InfoFetcher can be set up with a thread so that it reads from an input stream
     * and reports to registered listeners as it gets more information. This vastly
     * simplifies the process of always re-writing the same code for reading from an
     * input stream.
     * <p>
     * I use this all over.
     */
    // Note: TimeOut, IOUtils, InputStreamListener, InputStreamEvent and
    // PartialReadException are helper classes from the same tjacobs.io
    // library, not shown in the post.
    public class InfoFetcher implements Runnable {
        public byte[] buf;
        public InputStream in;
        public int waitTime;
        private ArrayList mListeners;
        public int got = 0;
        protected boolean mClearBufferFlag = false;

        public InfoFetcher(InputStream in, byte[] buf, int waitTime) {
            this.buf = buf;
            this.in = in;
            this.waitTime = waitTime;
        }

        public void addInputStreamListener(InputStreamListener fll) {
            if (mListeners == null) {
                mListeners = new ArrayList(2);
            }
            if (!mListeners.contains(fll)) {
                mListeners.add(fll);
            }
        }

        public void removeInputStreamListener(InputStreamListener fll) {
            if (mListeners == null) {
                return;
            }
            mListeners.remove(fll);
        }

        public byte[] readCompletely() {
            run();
            return buf;
        }

        public int got() {
            return got;
        }

        public void run() {
            if (waitTime > 0) {
                TimeOut to = new TimeOut(waitTime);
                Thread t = new Thread(to);
                t.start();
            }
            int b;
            try {
                while ((b = in.read()) != -1) {
                    if (got + 1 > buf.length) {
                        buf = IOUtils.expandBuf(buf);
                    }
                    int start = got;
                    buf[got++] = (byte) b;
                    // Read whatever else is already available in one bulk call.
                    int available = in.available();
                    if (got + available > buf.length) {
                        buf = IOUtils.expandBuf(buf, Math.max(got + available, buf.length * 2));
                    }
                    got += in.read(buf, got, available);
                    signalListeners(false, start);
                    if (mClearBufferFlag) {
                        mClearBufferFlag = false;
                        got = 0;
                    }
                }
            } catch (IOException iox) {
                throw new PartialReadException(got, buf.length);
            } finally {
                buf = IOUtils.trimBuf(buf, got);
                signalListeners(true);
            }
        }

        private void setClearBufferFlag(boolean status) {
            mClearBufferFlag = status;
        }

        public void clearBuffer() {
            setClearBufferFlag(true);
        }

        private void signalListeners(boolean over) {
            signalListeners(over, 0);
        }

        private void signalListeners(boolean over, int start) {
            if (mListeners != null) {
                Iterator i = mListeners.iterator();
                InputStreamEvent ev = new InputStreamEvent(got, buf, start);
                while (i.hasNext()) {
                    InputStreamListener fll = (InputStreamListener) i.next();
                    if (over) {
                        fll.gotAll(ev);
                    } else {
                        fll.gotMore(ev);
                    }
                }
            }
        }
    }
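
    A usage sketch (assuming the tjacobs.io helper classes noted above are on the classpath; the file name and the 8 KB starting buffer are arbitrary):

    // Hypothetical usage: read an entire file through InfoFetcher.
    InfoFetcher fetcher = new InfoFetcher(new java.io.FileInputStream("big.bin"), new byte[8 * 1024], 0);
    byte[] data = fetcher.readCompletely();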

  • Downloading via XHR and writing large files using WinJS fails with message "Not enough storage is available to complete this operation"

    Hello,
    I have an issue that some users are experiencing but I can't reproduce myself on my laptop. What I am trying to do is grab a file (a zip file) via XHR. The file can be quite big, like 500 MB. Then I want to write it to the user's storage.
    Here is the code I use:
    DownloadOperation.prototype.onXHRResult = function (file, result) {
        var status = result.srcElement.status;
        if (status == 200) {
            var bytes = null;
            try {
                bytes = new Uint8Array(result.srcElement.response, 0, result.srcElement.response.byteLength);
            } catch (e) {
                try {
                    Utils.logError(e);
                    var message = "Error while extracting the file " + this.fileName + ". Try emptying your windows bin.";
                    if (e && e.message) {
                        message += " Error message: " + e.message;
                    }
                    var popup = new Windows.UI.Popups.MessageDialog(message);
                    popup.showAsync();
                } catch (e2) { }
                this.onWriteFileError(e);
                return;
            }
            Windows.Storage.FileIO.writeBytesAsync(file, bytes).then(
                this.onWriteFileComplete.bind(this, file),
                this.onWriteFileError.bind(this)
            );
        } else if (status > 400) {
            this.error(null);
        }
    };
    The error happens at this line:
    bytes = new Uint8Array(result.srcElement.response, 0, result.srcElement.response.byteLength);
    with the description "Not enough storage is available to complete this operation". The user has only a C drive with plenty of space available, so I believe the error message given by IE might be a little misleading. Maybe in some situations Uint8Array can't handle such a large file? The program fails on an ASUSTek T100TA but not on my laptop (a standard one).
    Can somebody help me with that? Is there a better way to write a downloaded binary file to disk without passing through a Uint8Array?
    Thanks a lot,
    Fabien

    Hi Fabien,
    If Uint8Array works fine on the other computer, it should not be a problem with the API; instead it could be a setting or configuration issue with IE.
    Actually, using XHR for a 500 MB zip file is not suggested. Based on the documentation ("How to download a file"), XHR wraps an XMLHttpRequest call in a promise, which is not a good approach for big downloads; please use Background Transfer instead, which is designed to receive big items.
    A simple search on the Internet suggests the "not enough storage" error is a known issue when using XMLHttpRequest:
    http://forums.asp.net/p/1985921/5692494.aspx?PRB+XMLHttpRequest+returns+error+Not+enough+storage+is+available+to+complete+this+operation, however I'm not familiar with how to solve XMLHttpRequest issues.
    --James

  • Out of memory when converting large files using a Web service call

    I'm running into an out-of-memory error on the LiveCycle server when converting a 50 MB Word document with a Web service call. I've already tried increasing the heap size, but I'm at the limit for the 32-bit JVM on Windows. I could upgrade to a 64-bit JVM, but it would be a pain and I'm trying to avoid it. I've tried converting the 50 MB document using the LiveCycle admin and it works fine; the issue only occurs when using a web service call. I have a test client, and the memory spikes when it's generating the web service call, taking over a gig of memory. I assume it takes a similar amount of memory on the receiving end, which is why LiveCycle is running out of memory. Does anyone have any insight on why passing a 50 MB file requires so much memory? Is there any way around this?
    -Kelly

    Hi,
    You are correct that a complete 64-bit environment would solve this. The problem is that you get the out-of-memory error when the file is written to memory on the server. You can solve this by creating an interface which stores large files on the server's hard disk instead, which allows you to convert files as large as LC can handle without any memory issue.

  • Read Large Files Using BPEL File Adapter

    Hi,
    I have a scenario where large files in text format are to be read and sent to a 3rd party. An MTOM policy has to be attached. Files of size 3 MB or less are polled, but files greater than 3 MB are not retrieved. How can I resolve the issue? Do I have to break up the file and send the data? If so, how?
    Thanks
    Ranga

    You have to use the streaming feature of the JCA file adapter for handling huge files.
    You can go through the following link:
      http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#CIAHHEBF

  • Uploading large files using nio in http client

    Hi,
    I'm developing a multithreaded Swing client which needs to be capable of uploading and downloading large (100 MB) files to servlets via HTTP.
    Downloading has presented no problem, but writing to the OutputStream of a URLConnection, whilst fine for small files, gave me 'out of memory' errors on large ones. (As I understand it, the whole stream is buffered before sending and it just runs out of room.)
    So I replaced the URLConnection with a SocketChannel, and after writing the http header, used a loop to write chunks of data from a FileChannel.
    I needed to monitor any incoming data to truncate the upload if the servlet gives a premature response indicating an error condition, so I set the SocketChannel to non-blocking and put a read into the loop.
    Immediately I got a bunch of Swing exceptions and the frame was unable to render itself, I guess due to thread starvation problems.
    I currently have this clunky code that seems to work ok:
    int percent = 0;
    long total = 0L;
    long chunk = 1024 * 8;
    while (total < length) {
        chunk = ((length - total) < chunk) ? length - total : chunk;
        long ii = fc.transferTo(total, chunk, socketChannel);
        total += ii;
        yield();
        socketChannel.configureBlocking(false);
        if (socketChannel.read(buffer) > 0) break;
        socketChannel.configureBlocking(true);
    }
    What I can't figure out is why Swing objects if I permanently set the SocketChannel to non-blocking rather than just whilst I read it, and why I get the following exception from a polling thread that runs concurrently:
    'No buffer space available (maximum connections reached?)'
    Anyone got any ideas or a better way to do it?
    Chris

    "I assume that the OutputStream obtained from URLConnection is internally buffered and URLConnection waits until the whole stream has been input before inserting the initial Content-Length header. Only when it knows the complete length is the content sent."
    Hmm, this is quite unfortunate, especially since it is not necessary: HTTP allows the close of the data stream to serve as the end-of-data marker (Content-Length: is an optional header; in its absence the receiver simply reads all of the data until the sender closes the stream).
    "Has anyone else had experience of sending large (100 MB) files through Java HTTP clients on small PCs?"
    I've sent large-ish (~1-10 MB) payloads via URLConnection but not as large as yours. Interesting issue to figure out... I am out of ideas for the moment.
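
    For later readers: the buffering described above can be avoided without dropping down to raw sockets. HttpURLConnection.setChunkedStreamingMode tells the connection to send chunked transfer-encoding instead of buffering the whole body to compute Content-Length. A minimal sketch (the target URL and the 8 KB chunk size are placeholder choices, and the servlet must accept chunked requests):

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ChunkedUpload {
        public static void upload(File file, String target) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            // Stream the body in 8 KB chunks; no Content-Length header is needed.
            conn.setChunkedStreamingMode(8 * 1024);
            InputStream in = new FileInputStream(file);
            OutputStream out = conn.getOutputStream();
            try {
                byte[] buf = new byte[8 * 1024];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            } finally {
                out.close();
                in.close();
            }
            System.out.println("Server responded: " + conn.getResponseCode());
        }
    }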

  • How to send large files using web service

    hello everyone,
    I am new to this forum, so please pardon me if I post some silly problem...
    I have created one service which sends a file when the client (JSP) requests it. I am using JBOSS as my server. The purpose of this application is that when the client requests some file, the service sends it... and most of the time we need to send only PDFs and PPTs...
    The problem is, this service sends txt and java files of any size easily, but when I tried sending PDF and PPT files I got an xml.SAXParseException...
    I thought this error was because of some characters, but how do I fix it?
    I am working on Linux.
    code snippet is:
    import java.io.*;
    public class MyHelloService {
        public String file_size(String name) {
            String s = "";
            try {
                System.out.println("name recived is :::::::::::" + name);
                FileInputStream in = new FileInputStream(name);
                int size = in.available();
                System.out.println("FILE SIZE IS:::::" + size);
                byte[] sendata11 = new byte[size];
                int i = in.read(sendata11);
                System.out.println(new String(sendata11));
                s = new String(sendata11);
            } catch (Exception e) {
                System.out.println("EXCEPTION IN JWS:::" + e);
                s = "nofilefounderror";
            }
            return s;
        }
    }
    Please tell me what I am doing wrong and how to fix this.
    And one more thing: can I send a byte array from a web service? I tried but couldn't do it... so I am reading everything into a single byte array and then converting it to a String...
    Is it possible to send the file in chunks? If yes, how?
    Waiting for the reply... please reply as soon as possible...
    Rashi

    hi,
    I am sending a file from the server to the client, i.e. the client requests a file and the service sends it back... there is no socket connection... I am using JBOSS and Apache Axis.
    Please help me out...
    Rashi
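
    For later readers: the SAXParseException is almost certainly raw binary bytes from the PDF/PPT landing in the XML response, where invalid characters break the parser. A common workaround (a sketch, not the poster's actual setup; java.util.Base64 needs Java 8, while older Axis installations used an equivalent such as Apache commons-codec) is to Base64-encode the bytes so the payload is plain ASCII; the client decodes with Base64.getDecoder().decode(...), and splitting the byte array into fixed-size slices before encoding would give the chunked transfer asked about above.

    import java.io.*;
    import java.util.Base64;

    public class FileService {
        // Returns the file content as a Base64 string so it survives an XML envelope.
        public String fetchFile(String name) throws IOException {
            File f = new File(name);
            byte[] data = new byte[(int) f.length()];
            DataInputStream in = new DataInputStream(new FileInputStream(f));
            try {
                in.readFully(data);
            } finally {
                in.close();
            }
            return Base64.getEncoder().encodeToString(data);
        }
    }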

  • Processing large file using Debatching - SAX Exception

    Hi,
    I have a large XML file (about 20 MB) to be processed. I implemented the debatching feature, and in the file adapter I defined "publish messages in batches" as 500.
    When I run the process, I expected to see several instances in the console. But I see one instance, not on the Instances page but under Perform Manual Recovery, and nothing seems to be happening.
    Do I need to do anything here? Can anyone please help me?
    Thanks
    -Prapoorna
    Edited by: p123 on Jun 29, 2009 3:07 PM

    The file is 20 mb.
    Sample xml file is as shown below. I have several attendance_row tags between time_and_attendance.
    <time_and_attendance>
    <attendance_row><oracle_person_id>110758</oracle_person_id>
    <absence_reason>Work Abroad</absence_reason>
    <action_type>A</action_type>
    <date>01/04/2009</date>
    <total_hours>8.6</total_hours>
    <last_update_date>16/06/2009 12:35:47</last_update_date>
    </attendance_row>
    <attendance_row><oracle_person_id>110758</oracle_person_id>
    <absence_reason></absence_reason>
    <action_type>W</action_type>
    <date>01/04/2009</date>
    <total_hours>0</total_hours>
    <last_update_date>16/06/2009 12:35:47</last_update_date>
    </attendance_row>
    <attendance_row><oracle_person_id>110758</oracle_person_id>
    <absence_reason>Work Abroad</absence_reason>
    <action_type>A</action_type>
    <date>02/04/2009</date>
    <total_hours>8.6</total_hours>
    <last_update_date>16/06/2009 12:35:47</last_update_date>
    </attendance_row>
    </time_and_attendance>
    Here is the schema file
    <?xml version="1.0" encoding="UTF-8" ?>
    <!--This Schema has been generated from a DTD. A target namespace has been added to the schema.-->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" targetNamespace="http://TargetNamespace.com/ReadFile" xmlns="http://TargetNamespace.com/ReadFile" nxsd:version="DTD" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd">
    <xs:element name="oracle_person_id" type="xs:string"/>
    <xs:element name="total_hours" type="xs:string"/>
    <xs:element name="action_type" type="xs:string"/>
    <xs:element name="last_update_date" type="xs:string"/>
    <xs:element name="absence_reason" type="xs:string"/>
    <xs:element name="attendance_row">
    <xs:complexType>
    <xs:sequence>
    <xs:element ref="oracle_person_id"/>
    <xs:element ref="absence_reason"/>
    <xs:element ref="action_type"/>
    <xs:element ref="date"/>
    <xs:element ref="total_hours"/>
    <xs:element ref="last_update_date"/>
    </xs:sequence>
    </xs:complexType>
    </xs:element>
    <xs:element name="date" type="xs:string"/>
    <xs:element name="time_and_attendance">
    <xs:complexType>
    <xs:sequence>
    <xs:element maxOccurs="unbounded" ref="attendance_row"/>
    </xs:sequence>
    </xs:complexType>
    </xs:element>
    </xs:schema>
    Thanks
    -Prapoorna
    Edited by: p123 on Jun 29, 2009 3:49 PM
    Edited by: p123 on Jun 29, 2009 7:23 PM
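
    One thing worth checking (an observation from the posted files, not something confirmed in the thread): the schema declares targetNamespace="http://TargetNamespace.com/ReadFile", but the sample instance carries no namespace at all, and the translator used for debatching typically rejects instances whose root element is not in the schema's namespace. The root element would need to look like:

    <time_and_attendance xmlns="http://TargetNamespace.com/ReadFile">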

  • How to zip a LARGE file using zipOutputStream

    Hi,
    I am trying to zip a file which is over 50MB, and I got an "Out-of-Memory" error.
    Here is part of the java program:
    public static String Zipper(String Path, String ZipName, String FilestoZip, boolean DeleteFile) {
        /* This function archives a given subdirectory and then deletes the
           archived files.
           Parameters: Path => Full path of the zipped file subdirectory
                       ZipName => Name of the zip file to create
                       FilestoZip => List of files to zip.
           Return: Message status if completed or not */
        try {
            Msg = "COMPLETE:Creating zip file " + ZipName;
            ZipOutputStream zip = new ZipOutputStream(new FileOutputStream(ZipName));
            zip.setMethod(ZipOutputStream.DEFLATED);
            zip.setLevel(Deflater.BEST_COMPRESSION);
            File file = new File(FilestoZip);
            FileInputStream in = new FileInputStream(file);
            byte[] bytes = new byte[in.available()]; // whole file buffered in memory
            in.read(bytes);
            in.close();
            ZipEntry entry = new ZipEntry(file.getName());
            entry.setTime(file.lastModified());
            zip.putNextEntry(entry);
            zip.write(bytes);
            zip.closeEntry();
            zip.close();
        } catch (Exception e) { Msg = e.getLocalizedMessage(); }
    }
    This program works fine with small files, but when the file size gets bigger, it produces an "out-of-memory" error.
    Does anybody know how to work around this?
    I appreciate your help.
    Michelle

    I modified the program as follows:
    ZipEntry entry = new ZipEntry(file.getName());
    entry.setTime(file.lastModified());
    zip.putNextEntry(entry);
    byte[] bytes = new byte[1024];
    int len;
    while ((len = in.read(bytes)) > 0) {
        zip.write(bytes, 0, len);
    }
    And now it is working. I can zip a 125 MB file.
    Thank you for your help.
    Michelle

  • Reading and Writing large Excel file using JExcel API

    hi,
    I am using JExcelAPI for reading and writing Excel files. My problem is that when I read a file with 10000 records and 95 columns (file size about 14 MB), I get an out-of-memory error and the application crashes. Can anyone tell me if there is any way I can read large files using JExcelAPI through streams or in any other way? Jakarta POI also shows this behaviour.
    Thanks in advance

    Sorry, when the out-of-memory error occurs no stack trace is printed, as the application crashes. But I will quote some lines from JProfiler where the problem occurs:
    reader = new FileInputStream(new File(filePath));
    workbook = Workbook.getWorkbook(reader);
    sheet = workbook.getSheet(0); // here the out-of-memory error occurs
    JProfiler tree:
    jxl.Workbook.getWorkBook
        jxl.read.biff.File
            jxl.read.biff.CompoundFile.getStream
                jxl.read.biff.CompoundFile.getBigBlockStream
    Thanks
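
    If the file won't fit in memory with JExcelAPI or POI's usermodel, one commonly cited alternative for .xls files is Apache POI's event API, which streams records instead of building the whole workbook in memory. A minimal sketch (file name hypothetical; each record is handed to the listener and then discarded):

    import java.io.FileInputStream;
    import org.apache.poi.hssf.eventusermodel.HSSFEventFactory;
    import org.apache.poi.hssf.eventusermodel.HSSFListener;
    import org.apache.poi.hssf.eventusermodel.HSSFRequest;
    import org.apache.poi.hssf.record.Record;
    import org.apache.poi.poifs.filesystem.POIFSFileSystem;

    public class StreamXls {
        public static void main(String[] args) throws Exception {
            POIFSFileSystem fs = new POIFSFileSystem(new FileInputStream("big.xls"));
            HSSFRequest request = new HSSFRequest();
            request.addListenerForAllRecords(new HSSFListener() {
                public void processRecord(Record record) {
                    // Inspect each record here; nothing else is kept in memory.
                }
            });
            new HSSFEventFactory().processWorkbookEvents(request, fs);
        }
    }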
