Generating MD5 hash value for any specific flat file

Hi experts,
I am developing a program that generates flat files, and I also need to generate an MD5 hash value for each of those files. My question is: how can I generate the MD5 hash value for the generated .txt files?
Thanks in advance
Shabir

You can use the function modules
MD5_CALCULATE_HASH_FOR_CHAR for text data
MD5_CALCULATE_HASH_FOR_RAW for binary data
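
If you also need to double-check the value outside SAP (for example on the system that receives the files), a minimal Java sketch for hashing a generated .txt file could look like the one below; the class name and the file name output.txt are just placeholders.

import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;

public class FlatFileMd5 {
    public static void main(String[] args) throws Exception {
        // "output.txt" is just a placeholder for one of the generated flat files
        byte[] content = Files.readAllBytes(Paths.get("output.txt"));
        byte[] digest = MessageDigest.getInstance("MD5").digest(content);

        // Render the 16 digest bytes as 32 lower-case hex characters
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        System.out.println(hex);
    }
}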

Similar Messages

  • Issue while generating MD5 if there are any special characters in the CLOB

    Hi Guys,
    I need your help...
    We are using dbms_crypto.hash to generate the MD5 hash value for a CLOB that will be sent to other feeder systems. The feeder systems validate the file by generating the MD5 hash value on their end and compare with the MD5 generated in Oracle and accept the file only if the MD5 hash value matches.
    The MD5 hash matches between both systems except when there are special characters in the CLOB. Somehow, MD5 is prefixing a character before each special character and calculating the MD5 value on that. In our case the special characters are ®, §, ™, ©....
    This is a high priority issue for us and it is holding our UAT.
    We are using Oracle 11.2.0.2.0.
    Your response is really appreciated.
    Best Regards
    Gupta

    Welcome to the forum!
    Whenever you post provide your 4 digit Oracle version.
    >
    We are using dbms_crypto.hash to generate the MD5 hash value for a CLOB that will be sent to other feeder systems. The feeder systems validate the file by generating the MD5 hash value on their end and compare with the MD5 generated in Oracle and accept the file only if the MD5 hash value matches.
    The MD5 hash matches between both systems except when there are special characters in the CLOB. Somehow, MD5 is prefixing a character before each special character and calculating the MD5 value on that. In our case the special characters are ®, §, ™, ©....
    >
    See if my reply in this thread addresses your problem.
    Re: MD5 HASH computed from DBMS_CRYPTO does not match .NET MD5
    The fix in almost all of these cases is to understand that the Crypto package always converts everything to AL32UTF8 before hashing.
    ALWAYS.
    ALWAYS.
    ALWAYS.
    Unless the 'other' hash generators (substitute any generator you wish) do it the Oracle way you will NEVER get a match.
    NEVER.
    NEVER.
    NEVER.
    There are no workarounds.
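
    To make that concrete, here is a small, hedged Java sketch (the sample text is made up). The "extra character" that appears before ®, §, ™ or © is most likely the UTF-8 lead byte: DBMS_CRYPTO converts the CLOB to AL32UTF8 (Oracle's UTF-8) before hashing, so ® becomes the two bytes C2 AE, and so on. If the feeder systems also hash the UTF-8 bytes of the text, the digests should line up.

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;

    public class Utf8Md5 {
        public static void main(String[] args) throws Exception {
            // Made-up sample containing the special characters mentioned above
            String text = "sample ® § ™ ©";

            // Hash the UTF-8 bytes, mirroring DBMS_CRYPTO's conversion of CLOBs
            // to AL32UTF8, rather than the platform-default character set.
            byte[] utf8 = text.getBytes(StandardCharsets.UTF_8);
            byte[] digest = MessageDigest.getInstance("MD5").digest(utf8);

            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            System.out.println(hex);
        }
    }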

  • The selected signed file could not be authenticated. The file might have been tampered with or an error might have occured during download. Please verify the MD5 hash value against the Cisco Systems web site

    I am trying to load any 9.0.3 firmware on my UCM 5.0.4.2000-1 server. Every newer firmware I load throws the following error. I have verified the MD5 is correct and also downloaded the file several times with the same result. I can load the same firmware file on another UCM server and it loads fine. Any ideas?
    Thanks in advance!
    Error Message:
    The selected signed file could not be authenticated. The file might have been  tampered with or an error might have occured during download. Please verify the  MD5 hash value against the Cisco Systems web site:  9b:b6:31:09:18:15:e7:c0:97:9f:e6:fe:9a:19:94:99
    Firmware File: cmterm-7970_7971-sccp.9-0-3.cop.sgn
    UCM version: 5.0.4.2000-1

    Thanks for your reply. We have a lab environment where I maintain UCM 5.0, 5.1, 6.0, 6.1, 7.0, 7.1 and 8.0 servers, each running the latest released firmware, for our QA testing team. I have downloaded and installed the latest device packages, but I find that if I try to install any firmware newer than 8.3.1 on either 5.0.4 or 6.0, I start getting MD5 hash authentication errors. It looks like the 9.0.3 firmware should work on UCM 5.0 and 6.0, so I am lost as to why I can't seem to update the firmware for any phone model past version 8.3.1 on 5.0 or 6.0, while 5.1 and 6.1 work without issues. Maybe it is just a bug. I mostly wanted to see if anyone else has experienced this or if it is just me.

  • DBMS_CRYPTO MD5 hash value does not match 3rd party MD5 free tool

    Hello,
    I am using Oracle Version: 11.2.4.
    I have a problem where the MD5 value from DBMS_CRYPTO does not match the hash value from 3rd party MD5 free tool (MD5 Checksum Calculator 0.0.5.58 or WinMD5Free v1.20) and also the MD5 hash value calculated by an ingestion tool where I am transferring files to. The MD5 hash value that the ingestion tool calculates is the same as the 3rd party MD5 free tools I have. This occurs only on some of the XML files that I generate using XSQL(xmlserialize, xmlagg, xmlelement, etc.) and DBMS_XSLPROCESSOR on a Linux OS. The XML files are transferred from the Unix OS to my Windows 7 OS via filezilla.
    I found a thread on this forum that had a similar issue, so I copied/pasted the Java functions. They are listed below (both are the same except for the character set):
    create or replace java source named "MD5_UTF_8" as
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;
    import java.sql.Clob;
    import java.sql.Blob;
    public class MD5_UTF_8 {
    private static final byte [] hexDigit = {
    '0','1','2','3','4','5','6','7','8','9','A','B','C','D','E','F'
    /** Converts a byte array to a hex string
    * Returns an empty string if the byte array is null
    public static final String toHexString(byte [] bytes) {
    if (bytes == null) return new String("");
    StringBuffer buf = new StringBuffer(bytes.length * 2);
    for (int i = 0; i < bytes.length; i++) {
    buf.append((char) hexDigit[((bytes >>> 4) & 0x0F)]);
    buf.append((char) hexDigit[(bytes & 0x0F)]);
    return buf.toString();
    // Convert Hex String to Byte Array
    public static final byte[] byteArrayFromHexString(String str) {
    byte[] bytes = new byte[str.length() / 2];
    for (int i = 0; i < bytes.length; i++)
    bytes = (byte) Integer.parseInt(str.substring(2 * i, 2 * i + 2), 16);
    return bytes;
    public static String getMD5HashFromClob(Clob inhalt) throws Exception{
    MessageDigest algorithm;
    StringBuffer hexString;
    String s = null;
    String salida = null;
    int i;
    byte[] digest;
    String tepFordigest = inhalt.getSubString(1L, (int)inhalt.length());
    try {
    algorithm = MessageDigest.getInstance("MD5_UTF_8");
    algorithm.reset();
    algorithm.update(tepFordigest.getBytes("UTF-8"));
    digest = algorithm.digest();
    s = toHexString(digest);
    } catch (java.security.NoSuchAlgorithmException nsae) {
    s = "No es posible cifrar MD5";
    return s;
    sho err
    alter java source "MD5_UTF_8" compile
    sho err
    CREATE OR REPLACE FUNCTION get_md5_UTF_8_CLOB(inhalt CLOB) RETURN VARCHAR2 DETERMINISTIC
    AS LANGUAGE JAVA
    name 'MD5_UTF_8.getMD5HashFromClob(java.sql.Clob) return java.lang.String';
    create or replace java source named "MD5" as
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;
    import java.sql.Clob;
    import java.sql.Blob;
    public class MD5 {
    private static final byte [] hexDigit = {
    '0','1','2','3','4','5','6','7','8','9','A','B','C','D','E','F'
    /** Converts a byte array to a hex string
    * Returns an empty string if the byte array is null
    public static final String toHexString(byte [] bytes) {
    if (bytes == null) return new String("");
    StringBuffer buf = new StringBuffer(bytes.length * 2);
    for (int i = 0; i < bytes.length; i++) {
    buf.append((char) hexDigit[((bytes >>> 4) & 0x0F)]);
    buf.append((char) hexDigit[(bytes & 0x0F)]);
    return buf.toString();
    // Convert Hex String to Byte Array
    public static final byte[] byteArrayFromHexString(String str) {
    byte[] bytes = new byte[str.length() / 2];
    for (int i = 0; i < bytes.length; i++)
    bytes = (byte) Integer.parseInt(str.substring(2 * i, 2 * i + 2), 16);
    return bytes;
    public static String getMD5HashFromClob(Clob inhalt) throws Exception{
    MessageDigest algorithm;
    StringBuffer hexString;
    String s = null;
    String salida = null;
    int i;
    byte[] digest;
    String tepFordigest = inhalt.getSubString(1L, (int)inhalt.length());
    try {
    algorithm = MessageDigest.getInstance("MD5");
    algorithm.reset();
    algorithm.update(tepFordigest.getBytes());
    digest = algorithm.digest();
    s = toHexString(digest);
    } catch (java.security.NoSuchAlgorithmException nsae) {
    s = "No es posible cifrar MD5";
    return s;
    sho err
    alter java source "MD5" compile
    sho err
    CREATE OR REPLACE FUNCTION get_md5_CLOB(inhalt CLOB) RETURN VARCHAR2 DETERMINISTIC
    AS LANGUAGE JAVA
    name 'MD5.getMD5HashFromClob(java.sql.Clob) return java.lang.String';
    I created the above Java functions and added calls to them in my package to see what hash values they would produce, but I am getting "ORA-29532: Java call terminated by uncaught Java exception: java.nio.BufferOverflowException". The XML is about 60 MB.
    Package code snippets:
    declare
    l_hash raw(2000);
    l_checksum_md5 varchar2(2000);
    l_checksum_md5_utf_8 varchar2(2000);
    Begin
    t_checksum := lower(RAWTOHEX(dbms_crypto.hash(src=>l_clob,typ=>dbms_crypto.hash_md5)));
    l_hash := get_md5_CLOB (l_clob);
    l_checksum_md5 := lower(rawtohex(l_hash));
    l_hash := get_md5_UTF_8_CLOB (l_clob);
    l_checksum_md5_UTF_8 := lower(rawtohex(l_hash));
    Please help,
    Thank You in advance
    Don
    Edited by: 972551 on Nov 21, 2012 12:18 PM
    Edited by: sabre150 on Nov 21, 2012 11:06 PM
    Moderator action : added [code ] tags to format properly. In future please add them yourself.

    >
    I have a problem where the MD5 value from DBMS_CRYPTO does not match the hash value from 3rd party MD5 free tool (MD5 Checksum Calculator 0.0.5.58 or WinMD5Free v1.20) and also the MD5 hash value calculated by an ingestion tool where I am transferring files to. The MD5 hash value that the ingestion tool calculates is the same as the 3rd party MD5 free tools I have.
    I found a thread on this forum that also had a similar issue so I copy/paste the java functions.
    >
    And in that thread (Re: MD5 HASH computed from DBMS_CRYPTO does not match .NET MD5) I provided the reason why DBMS_CRYPTO may not match hashes produced by other methodologies.
    I have no idea why you copied and posted all of the Java code the other poster and I provided, since that code has NOTHING to do with the problem you say you are having. The other poster's question was how to write Java code that would produce the same result as DBMS_CRYPTO.
    You said your problem was understanding why DBMS_CRYPTO 'does not match the hash value from 3rd party MD5 free tool ...', and I answered that in the other thread.
    >
    The Crypto package always converts everything to AL32UTF8 before hashing so if the .NET character set is different the hash will likely be different.
    See DBMS_CRYPTO in the PL/SQL Packages and Types doc
    http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_crypto.htm
    If you look at the spec header for the DBMS_CRYPTO package it shows this note:
    -- Prior to encryption, hashing or keyed hashing, CLOB datatype is
    -- converted to AL32UTF8. This allows cryptographic data to be
    -- transferred and understood between databases with different
    -- character sets, across character set changes and between
    -- separate processes (for example, Java programs).
    If your 3rd party MD5 free tools (MD5 Checksum Calculator 0.0.5.58 or WinMD5Free v1.20) do not use the AL32UTF8 character set then the hashes will likely be different. You can't modify DBMS_CRYPTO, so if the hashes need to match you need to use 3rd party tools that either use the correct character set or can be configured to use it.
    The problem in the other thread was how to WRITE Java code that uses the correct character set and I showed that OP how to do that.
    So unless you are writing your own Java code all of that code you copied and pasted is useless for your use case.
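
    On the java.nio.BufferOverflowException with the roughly 60 MB CLOB: materialising the whole CLOB with getSubString and then calling getBytes on the resulting String is the likely culprit. Below is a hedged sketch of one alternative that streams the CLOB through a UTF-8 encoder into the digest (mirroring DBMS_CRYPTO's AL32UTF8 conversion). It is illustrative only, not code from the referenced thread, and it would still be loaded as a Java stored procedure in the same way as the sources above.

    import java.io.OutputStream;
    import java.io.OutputStreamWriter;
    import java.io.Reader;
    import java.io.Writer;
    import java.nio.charset.StandardCharsets;
    import java.security.DigestOutputStream;
    import java.security.MessageDigest;
    import java.sql.Clob;

    public class ClobMd5 {
        /** Streams the CLOB in character chunks, encodes to UTF-8 and feeds the digest. */
        public static String md5Hex(Clob clob) throws Exception {
            MessageDigest md = MessageDigest.getInstance("MD5");
            // DigestOutputStream updates the digest with every byte written to it;
            // the wrapped stream simply discards those bytes.
            OutputStream sink = new OutputStream() {
                @Override public void write(int b) { /* discard */ }
                @Override public void write(byte[] b, int off, int len) { /* discard */ }
            };
            try (Reader reader = clob.getCharacterStream();
                 Writer writer = new OutputStreamWriter(new DigestOutputStream(sink, md),
                                                        StandardCharsets.UTF_8)) {
                char[] buf = new char[8192];
                int n;
                while ((n = reader.read(buf)) != -1) {
                    writer.write(buf, 0, n);   // the encoder handles surrogate pairs across chunks
                }
            }
            StringBuilder hex = new StringBuilder();
            for (byte b : md.digest()) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }
    }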

  • Calculating hash values for really big files

    I am using the following code to calculate the hash values of files
    public static String hash(File f, String algorithm)
                throws IOException, NoSuchAlgorithmException {
            if (!f.isFile()) {
                throw new IOException("Not a file");
            }
            RandomAccessFile raf = new RandomAccessFile(f, "r");
            byte b[] = new byte[(int) raf.length()];
            raf.readFully(b);
            raf.close();
            MessageDigest messageDigest = MessageDigest.getInstance(algorithm);
            messageDigest.update(b);
            return toHexString(messageDigest.digest());
        }
    Now the problem is, for really big files, 100 MB or over, I get an OutOfMemoryError.
    I have used the -Xms and -Xmx options to increase the JVM heap size, and ultimately made it work. However, I think this is lame and there is also a limit to the -Xmx option.
    Is there any other way I can calculate the hash values of these really big files?
    Thanks a lot in advance.

    Why do you open the file that way? Why do you load the whole file into memory at once?
    I would do it like this:
    FileInputStream fis = new FileInputStream(f);
    MessageDigest messageDigest = MessageDigest.getInstance(algorithm);
    byte[] buffer = new byte[1000];
    int read;
    while ((read = fis.read(buffer)) != -1) {
        messageDigest.update(buffer, 0, read);   // hash only the bytes actually read
    }
    fis.close();
    return toHexString(messageDigest.digest());
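
    For what it's worth, here is a hedged sketch of another option: java.security.DigestInputStream, which updates the digest as the file is read, so the whole file never sits in memory. The buffer size and hex formatting below are arbitrary choices.

    import java.io.BufferedInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.security.DigestInputStream;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    public class BigFileHash {
        public static String hash(File f, String algorithm)
                throws IOException, NoSuchAlgorithmException {
            MessageDigest md = MessageDigest.getInstance(algorithm);
            try (InputStream in = new DigestInputStream(
                    new BufferedInputStream(new FileInputStream(f)), md)) {
                byte[] buf = new byte[8192];
                while (in.read(buf) != -1) {
                    // reading drives the digest; the bytes themselves are discarded
                }
            }
            StringBuilder hex = new StringBuilder();
            for (byte b : md.digest()) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }
    }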

  • Invalid character value for cast specification using SSIS? Please assist

    SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80004005 Description: "Invalid character value for cast specification". There was an error with input column "INV_DT" (627) on input "OLE DB Destination Input" (332). The column status returned was: "The value could not be converted because of a potential loss of data.".
    SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE DB Destination Input" (332)" failed because error code 0xC0209077 occurred, and the error row disposition on "input "OLE DB Destination Input" (332)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
    SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE DB Destination" (319) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (332). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
    The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
    SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

    Hi Jony2422,
    From the error message, we can infer that the issue is caused by the input column "INV_DT" failing to convert to the corresponding output column in the OLE DB Destination.
    Based on my research, the exact cause is that the data type of the input column "INV_DT" can be converted to the data type of the corresponding output column, but some of the data in the input column "INV_DT" cannot be converted to that data type. We can verify the issue in the Advanced Editor dialog box of the OLE DB Destination.
    To avoid this issue, please make sure the input column can be converted to the output column in the OLE DB Destination, including all the values in the input column. For more details, please refer to Visakh's suggestions.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • [Microsoft][ODBC SQL Server Driver]Invalid character value for cast specifi

    [Microsoft][ODBC SQL Server Driver]Invalid character value for cast specification
    Can you help me with this error?
    This is the problematic query:
    Search QUERY :
    SELECT DISTINCT theK_files.fileid, theK_files.name, theK_files.title, theK_files.description,
           theK_users1.name AS AuthorName, theK_users1.surname AS AuthorSurname,
           [theK_file-extensions].name AS Extension, theK_files.published, theK_files.updated
    FROM theK_users
    INNER JOIN [theK_users-to-groups] ON theK_users.userid = [theK_users-to-groups].userid
    INNER JOIN theK_usergroups ON [theK_users-to-groups].groupid = theK_usergroups.groupid
    INNER JOIN [theK_groups-to-files] ON theK_usergroups.groupid = [theK_groups-to-files].groupid
    INNER JOIN theK_files ON [theK_groups-to-files].fileid = theK_files.fileid
    INNER JOIN theK_users theK_users1 ON theK_files.authorid = theK_users1.userid
    INNER JOIN [theK_file-extensions] ON theK_files.extensionid = [theK_file-extensions].extensionid
    WHERE theK_users.userid = '1' AND theK_files.extensionid = '1'
      AND ( theK_files.name LIKE '%a%' OR theK_users1.name LIKE '%a%' OR theK_files.description LIKE '%a%' )

    Are these columns character/string/varchar?
    theK_users.userid = '1'
    theK_files.extensionid = '1'
    If they are number columns, change them to:
    theK_users.userid = 1
    theK_files.extensionid = 1
    Jamie

  • Create hash value for clob column ?

    Hi,
    does anybody know a way to calculate a hash value for a clob column (9i) ?
    DBMS_UTILITY.GET_HASH_VALUE could only handle varchar2(4000).
    Thank you!
    Regards
    Michael
    Message was edited by:
    mseiwert

    I can't reproduce it on my 10.2.0.4.0. CTL file:
    load data
    INFILE *
    Replace into table samp
    fields terminated by ","
    trailing nullcols
    (
    no,
    col1 Char(100000000),
    col2 Char(100000000) enclosed by '"' and '"'
    )
    BEGINDATA
    1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
    2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"
    Loading:
    SQL> Create table samp
      2  (
      3  no number,
      4  col1 clob,
      5  col2 clob
      6  );
    Table created.
    SQL> host sqlldr scott/tiger control=c:\temp\samp.ctl log=c:\temp\samp.log
    SQL> select * from samp
      2  /
            NO
    COL1
    COL2
             1
    asdf
    assasadsdsdsd"sfasdfadf"sdsdsa,ssfsf
             2
    sfjass
    dksadk,kd,ss"dfdfjkdjfdk"sasfjaslaljs
            NO
    COL1
    COL2
    SQL>
    SY.

  • Logical path for getting a Flat file from application server

    Hi All,
    We have loaded some .csv files to the application server. What is the logical path we have to mention on the InfoPackage scheduler screen? Please guide me on how to give the path for getting a flat file from the application server.
    Thanks,
    Sairam.

    Hi Sairam,
    I hope you know the location where you saved the files on the application server.
    Now if you go to the InfoPackage and click on the "External Data" tab, you will see radio buttons for
    1) Client Workstation
    2) Application Server
    Choose the second radio button; then, in the field "Name of the File", you can use the F4 help and browse the AL11 transaction through this option. You can then choose the file.
    Hope this helps
    Regards,
    Praveen.

  • Can i able to put filter for my source flat file?

    Hi all,
    Please help me with the ODI best practice.
    My source is a flat file and I want to apply a filter.
    Can I apply a filter to my flat file source? If yes, please help me with the best practice for applying the filter.
    Regards
    Suresh

    Hi,
    If you try to create the filter at Model --> Datastore --> Filter --> Insert Condition,
    it will not work for the File technology; you will get "Invalid Format Description".
    But you can specify a filter in the interface.
    Just drop the column(s) from your flat file datastore onto the canvas and then specify the filter condition.
    Thanks,
    Sutirtha

  • Custom Report for the Stock and Stock value for a specific date

    Hi SAP Gurus,
    Is there any SAP standard t-code, or any logic, to get the transactions (additions to inventory, for example purchases, and subtractions, for example sales) for a particular material in a plant, together with the total stock and the total stock value at the time that transaction happened?
    Our system is R/3 4.7.
    I looked at MB5B, MBCE, MBCA, MC44, MB51 and some other standard t-codes but could not find the total stock value at the time the transaction happened.
    The history tables MBEWH and MARDH are only updated after the month-end closing procedures, right? Which means the inventory value changes every month if the material has price control "S".
    Thank you,
    -Harter

    Hi Harter,
    Unfortunately, you cannot see the stock value and stock quantity on a specific date in a single t-code. As you yourself have pointed out, we have to make use of the history tables MBEWH and MARDH for the month-wise stock quantity and value. Along with that you should also make use of the table MBEW to take the current stock quantity and value. So the total stock on a particular date will be:
    Stock quantity = MBEWH value up to the previous month (for the specific valuation class, period, etc.) + MBEW value for the present date.
    But this will not work out if you want to find the stock quantity and stock value as of a past date. For past data, only month-wise data is available. For that you can refer to MC.1 and similar reports.

  • Looking for sha-1 hash values for the Windows 10 Enterprise x64 & x86 ISO's

    When I downloaded the ISOs, I either missed the hash values or they weren't there. I still can't find them. I'd sure like to validate the ISOs before trying to install.
    Can anyone point me to their locations or post them here?
    Mahalo

    Maybe aboodi86 just ran a sha1 generator on the downloads.
    For what it's worth, I got the exact same value listed above for my x64 download, so it's legit.  I didn't download the x86 version.
    -Noel
    Detailed how-to in my eBooks:  
    Configure The Windows 7 "To Work" Options
    Configure The Windows 8 "To Work" Options

  • Putaway Block in Storage Bin for any specific Material Code

    Hi Experts,
    We have a problem. We have allowed mixed storage and addition to existing stock in the storage type. Our requirement is that we want to block one bin, aa-01-01, for a specific material code xxxxx, but we want to allow putaway of other material codes into that bin.
    How can we achieve this? Can we use LS07 or some other solution? Please suggest at the earliest.
    Regards
    Ishu

    Hi,
    In the standard system, you can reserve one specific bin for a specific material with the putaway strategy "Fixed bin" by maintaining this bin in the material master. It is unclear why you have set "mixed storage allowed" and yet require material-specific putaway in a bin.
    You can change the "Mixed storage" indicator to blank, which means mixed storage is not allowed:
           all storage units in a storage bin must contain the same material and batch (one quant).
    If you don't want to change any of the current settings, you'll need to use a user exit and maintain a custom table in which you keep the specific bin for the material for the warehouse. After the standard code has determined the putaway bin, you can check whether it conflicts with an entry in your custom table.
    Hope this'll help.
    Regards,
    Sudhir

  • Ora_hash - Same hash value for different inputs (Urgent)

    Hi,
    I am trying to use ora_hash to join between tables, but I noticed that in some cases the ora_hash function generates the same result for different input values.
    select ora_hash('oomeroe03|6NU3|LS006P|7884|1|17-JUL-13 13.18.22.528000|0005043|'),ora_hash('GSAHFFXTK|GCQ3|A6253S|12765|1|17-JUL-13 17.26.26.853000|0136423|')
    from dual
    Output value : 1387341941
    Oracle version is 11gR2.
    Thanks

    Why would anyone limit the hash distribution to three buckets ?
    However, one must understand that the default seed is 0.  So one input repeated gets the same hash value unless the seed is changed.
    SQL> select ora_hash(rn) , rn
      2  from
      3  (Select rownum as rn from dual connect by level < 11)
      4  order by rn;
    ORA_HASH(RN)         RN
      2342552567          1
      2064090006          2
      2706503459          3
      3217185531          4
       365452098          5
      1021760792          6
       738226831          7
      3510633070          8
      1706589901          9
      1237562873         10
    10 rows selected.
    SQL> l
      1  select ora_hash(rn) , rn
      2  from
      3  (Select rownum as rn from dual connect by level < 11)
      4* order by rn
    SQL> /
    ORA_HASH(RN)         RN
      2342552567          1
      2064090006          2
      2706503459          3
      3217185531          4
       365452098          5
      1021760792          6
       738226831          7
      3510633070          8
      1706589901          9
      1237562873         10
    10 rows selected.
    SQL> /
    ORA_HASH(RN)         RN
      2342552567          1
      2064090006          2
      2706503459          3
      3217185531          4
       365452098          5
      1021760792          6
       738226831          7
      3510633070          8
      1706589901          9
      1237562873         10
    10 rows selected.
    SQL>
    Hemant K Chitale

  • Plan hash value for two queries!

    Hi,
    DB : Oracle 11g (11.2.0.3.0)
    OS: RHEL 5
    I have two questions:
    1. Can two queries have the same plan hash value? I mean, I have the two queries below:
    SELECT /*+ NO_MERGE */ MIN(payor.next_review_date)
    FROM payor
    WHERE payor.review_complete = 0
    AND payor.closing_date IS NULL
    AND payor.patient_key = 10;
    and
    SELECT MIN(payor.next_review_date)
    FROM payor
    WHERE payor.review_complete = 0
    AND payor.closing_date IS NULL
    AND payor.patient_key = 10;
    When I reviewed the execution plan for both queries, the plan hash value remained the same. Does that mean the execution plans for both queries are the same? If yes, then how does Oracle understand or change the execution plan based on the hint? If no, then what does the plan hash value represent?
    2. If the execution plan with the hint and without the hint is the same for a given query, except for the number of rows and bytes, does that mean the query that scans fewer rows and bytes is better?
    Thanks in advance
    -Onkar

    Hi,
    there are two different things. One is the EXPLAIN PLAN, which is how the optimizer thinks the query will be executed. It contains estimates of cost, cardinalities etc. There is also the EXECUTION PLAN. It contains all of this information about the optimizer estimates, but on top of that it also contains information about actual I/O incurred, actual cardinalities, actual timings etc.
    So if a hint changes the optimizer estimates but the plan stays the same, then the impact on the query's performance is zero.
    If the actual numbers are changing, this is probably also unrelated to the hint (e.g. you can have fewer physical reads because more blocks are found in the buffer cache the second time you run the query, or you do less work because you don't have to parse the statement, etc.).
    Actually, most optimizer hints don't affect optimizer estimates; rather, they try to get the optimizer to use a certain access method or a certain join order etc. regardless of the cost. So you must be talking about hints such as cardinality, dynamic_sampling etc. If that's not the case, please clarify, because it would mean that something odd is going on here (e.g. an INDEX hint may work or it may fail to work, but if it fails, the optimizer estimates shouldn't change).
    Best regards,
    Nikolay
