Huffman compression problem

I am trying to get a character count from an ASCII file, so I read it byte by byte and store the bytes in an array.
I'm using a priority queue (heap) to store the characters and their frequencies in objects, for the Huffman tree.
My question is: what is the easiest way to count the character frequencies so that I can put them into the priority queue?
Should I insert into the PQ once I have a character count for the entire file, or while I am going through the file character by character?
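
For what it's worth, one simple way to do the counting (a rough sketch with made-up names, assuming plain ASCII input): read the whole file first, accumulate the counts in an int[256] indexed by the byte value, and only then insert one node per character that actually occurs into the priority queue. Counting first is easier than inserting while you read, because inserting on the fly would force you to find and update existing nodes in the heap for every character.

import java.io.FileInputStream;
import java.io.IOException;
import java.util.PriorityQueue;

public class FrequencyCount {
    // "Node" is a hypothetical stand-in for your own tree-node class;
    // only the two fields used here are assumed.
    static class Node implements Comparable<Node> {
        final char ch;
        final int freq;
        Node(char ch, int freq) { this.ch = ch; this.freq = freq; }
        public int compareTo(Node other) { return Integer.compare(freq, other.freq); }
    }

    public static PriorityQueue<Node> countAndQueue(String filename) throws IOException {
        int[] freq = new int[256];                   // one slot per possible byte value
        try (FileInputStream in = new FileInputStream(filename)) {
            int b;
            while ((b = in.read()) != -1) {          // read byte by byte
                freq[b]++;                           // count occurrences
            }
        }
        PriorityQueue<Node> pq = new PriorityQueue<>();
        for (int i = 0; i < freq.length; i++) {
            if (freq[i] > 0) {                       // only characters that actually occur
                pq.add(new Node((char) i, freq[i]));
            }
        }
        return pq;
    }
}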

I know it's a spoiler and a give-away, but it's Saturday and I have to
make up for what happened the last couple of days. Have a look:

import java.util.HashMap;
import java.util.Map;
import java.util.TreeSet;
abstract class HuffmanNode implements Comparable {
     private static int seq= 0;
     protected int freq;
     protected int code;
     public int getFreq() { return freq; }
     public int getCode() { return code; }
     protected void incFreq() { freq++; }
     public HuffmanNode(int freq) { this.freq= freq; code= seq++; }
     abstract protected void add(int b);
     public int compareTo(Object obj) {
          HuffmanNode that= (HuffmanNode)obj;
          if (this.freq == that.freq)
               return this.code-that.code;
          else
               return this.freq-that.freq;
     }
}
class LeafNode extends HuffmanNode {
     private char chr;
     private int len;
     public LeafNode(char chr, int freq) { super(freq); this.chr= chr; }
     protected void add(int bit) {
          code&= (1<<len)-1;
          code<<= 1;
          code|=  bit;
          len++;
     }
     public String toString() { return "["+chr+":"+code+","+len+"]"; }
}
class CombNode extends HuffmanNode {
     private HuffmanNode lft;
     private HuffmanNode rgt;
     public CombNode(HuffmanNode lft, HuffmanNode rgt) {
          super(lft.getFreq()+rgt.getFreq());
          this.lft= lft; lft.add(0);
          this.rgt= rgt; rgt.add(1);
     }
     protected void add(int bit) { lft.add(bit); rgt.add(bit); }
     public String toString() { return "["+lft+","+rgt+"]"; }
}
public class Huffman {
     private Map table= new HashMap();
     public void add(char c) {
          Character key= new Character(c);
          HuffmanNode node= (HuffmanNode)table.get(key);
          if (node == null) {
               node= new LeafNode(c, 0);
               table.put(key, node);
          }
          node.incFreq();
     }
     public HuffmanNode build() {
          TreeSet tree= new TreeSet(table.values());
          while (tree.size() >= 2) {
               HuffmanNode lft= (HuffmanNode)tree.first(); tree.remove(lft);
               HuffmanNode rgt= (HuffmanNode)tree.first(); tree.remove(rgt);
               tree.add(new CombNode(lft, rgt));
          }
          return (HuffmanNode)tree.first();
     }
}

Note that I carefully removed all comments (just to give you something
to study) and also note how I cleverly (ahem) left the sorting business
to the Collection stuff (me being a lazy bum). Simply create a Huffman
object, add all your characters to it and let the object build
the Huffman tree. Note that this is not an industrial strength Huffman
compression implementation by far, for reasons that are beyond the
scope of your question.
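
For completeness, a minimal driver sketch showing how the class above can be used (the file name is just a placeholder):

public class HuffmanDemo {
    public static void main(String[] args) throws java.io.IOException {
        Huffman huffman = new Huffman();
        // "input.txt" is a placeholder; read whatever file you are compressing.
        java.io.FileReader in = new java.io.FileReader("input.txt");
        int c;
        while ((c = in.read()) != -1) {
            huffman.add((char) c);           // count every character
        }
        in.close();
        // build() returns the root; its toString() prints [char:code,len] per leaf.
        System.out.println(huffman.build());
    }
}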
kind regards,
Jos

Similar Messages

  • File Compression Problem

    Hello all,
    I'm having a strange file compression problem with CS5 on our new Mac Pro.  We purchased the Mac Pro to scan and process images, but the JPEGs and GIFs we create from this computer are much larger than they should be when closed (e.g. images that should be compressed to 6KB are reading as 60KB, and the file size is often smaller when opened than closed). Furthermore, anytime we use these image files in other programs (e.g. Filemaker Pro) the inflated file size will carry over.  What's even more puzzling is that the same files that are reading as 60KB on our Mac Pro will read correctly as 6KB from a PC.  Similarly, if we embed these images -- that were created on the Mac Pro -- into Filemaker from a PC, the image file size is correct.  We cannot use the compressed files we create on our Mac Pro because the inflated file size will be passed on to whatever application we use on the Mac Pro (except for Photoshop).
    We have been processing images for years on a PC and haven't had any troubles with this.   We were thinking for a while that the problem was with the Mac operating system, but after many calls with expert Apple advisers it seems like Photoshop for Mac has to be the issue.  We have already tried reformatting and re-indexing the hard drive, and at this point there is nothing else that can be done from Apple's end.  The last expert I spoke with at Apple said that it sounds like the way Photoshop for Mac compresses files and how Mac reads file sizes is very different from the way Photoshop for PC compresses files and how Windows reads file sizes.  If he was correct, and there is no work-around, we'll have no other choice and will have to return our Mac.
    Has anyone else experienced this before?  The experts at Apple were thoroughly confused by the problem and so are we.
    Thanks,
    Jenny

    This has nothing to do with compression.
    Macintosh saves more metadata, and more previews than Windows - that's one part.
    Macintosh shows the size of the file on disk, including wasted space due to the disk block size - that's another part. (look at the byte count, not the size in K or MB).
    When you upload the files to a server, or use them in most programs, that extra metadata is lost and the file sizes should be identical.
    I can't believe that your advisors spent any time on such a trivial issue...
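
    If you want to compare the real sizes directly, look at the exact byte count rather than the rounded-up figure; a minimal sketch (the path is just a placeholder):

    import java.io.File;

    public class ByteCount {
        public static void main(String[] args) {
            // Placeholder path; point this at one of the exported JPEGs/GIFs.
            File f = new File("image.jpg");
            // length() reports the exact number of bytes of file data,
            // not the rounded-up size on disk.
            System.out.println(f.getName() + ": " + f.length() + " bytes");
        }
    }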

  • Any help appreciated - iTune compression problems

    Hi there,
    Just bought a second-hand iPhone 3G yesterday that has to last me 5 months before my contract is up and I can hopefully get a 5 if it's out (touch wood).
    Transferred some music from iTunes to my iPhone, and on the phone speakers it sounds fine; however, when I put my headphones into the jack it's incredibly distorted and sounds like it's suffered far too much compression.
    I'm guessing this is possibly a headphone-jack-related problem as opposed to a compression issue? Tried converting files to AAC but it made no difference.
    Any help appreciated!
    Thanks
    Dan

    FYI - the iPhone includes a single external speaker. The grill on the opposite side that looks the same is a microphone.
    If this were a compression or format issue, the same would occur when playing the music via the speaker. Using earbuds has no effect on the compression or format. More than likely this is a headphone-jack-related problem, or there could be a problem with the earbuds being used.

  • JAI CCIT4 compression problem

    I have experienced a problem when trying to write out a TIFF image using the group 4 compression method (CCITT4): it inverts the colors of the output file. Inverting the image before writing, or going through the image and inverting it manually, is too slow. Could anyone please help?
    Thanks, Wesley.

    The Java Twain package has been updated since, the name has changed to Morena. Now Morena is available at http://www.gnome.sk together with a tutorial and many examples.
    For illustration:
    Using Morena, pictures from the scanner/camera are acquired in the same way as a file is opened from the disk:
    image=Toolkit.getDefaultToolkit().createImage(TwainManager.getDefaultSource());
    There are more than 100 methods available in the Morena to get and set scanner capabilities. Below is an example how to call some typical of them:
    TwainSource twainSource= TwainManager.getDefaultSource();
    twainSource.setVisible(false);     // hide the Twain user interface and manage scanning directly from Java
    twainSource.setXResolution(100);     // setting DPI
    twainSource.setYResolution(100);     // setting DPI
    twainSource.setPixelType(TwainSource.TWPT_RGB);     // setting the color model
    System.err.println("getPixelType=" + twainSource.getPixelType());     // getting the actual color model
    twainSource.setContrast(300);     // setting contrast
    image=Toolkit.getDefaultToolkit().createImage(twainSource);
    Analyzing an acquired image is straightforward. The developer needs to create his own class implementing the Java ImageConsumer interface and start producing data from the scanner/camera directly into the new class. E.g. if his class is named "myImageConsumer", it receives data via the following code:
    TwainSource twainSource= TwainManager.getDefaultSource();
    twainSource.startProduction(myImageConsumer);
    Data are sent via ImageConsumer interface and can be parsed in the byte/int array.
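
    The Morena code above covers acquisition rather than the inversion from the original question. For the "inverting manually is too slow" part, one much faster route than per-pixel getRGB/setRGB loops is to flip the raster's backing bytes in bulk. A minimal sketch, assuming the decoded image is backed by a DataBufferByte (typical for bilevel or grayscale TIFFs):

    import java.awt.image.BufferedImage;
    import java.awt.image.DataBufferByte;

    public class InvertBilevel {
        // Inverts the image in place by flipping every bit of the backing
        // byte buffer; one pass over the raw bytes instead of a per-pixel loop.
        public static void invert(BufferedImage img) {
            byte[] data = ((DataBufferByte) img.getRaster().getDataBuffer()).getData();
            for (int i = 0; i < data.length; i++) {
                data[i] ^= 0xFF;
            }
        }
    }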

  • [Solved] Compression Problem with 7z + Peazip

    I am trying to compress a file with 7zip, but it doesn't succeed.
    The file has a size of 4.4GB and after compression it comes to about 3.8GB.
    If I choose to split the file to 1 DVD size, it will succeed with a *.7z.001; this is why I know the compressed size. But if I choose a single volume, it will give me this error:
              7-Zip [64] 9.20  Copyright (c) 1999-2010 Igor Pavlov  2010-11-18
              p7zip Version 9.20 (locale=en_US.UTF-8,Utf16=on,HugeFiles=on,4 CPUs)
              Scanning
              Creating archive /home/xxx/xxx/file.7z
              Compressing  file.iso
    System error:
    E_FAIL               
    The command is:
    /usr/lib/peazip/res/7z/7z a -t7z -m0=LZMA -mmt=on -mx5 -md=16m -mfb=32 -ms=2g -w '/home/xxx/xxx/file.7z' '/home/xxx/xxx/file.iso'
    The code which will succeed with split file is:
    /usr/lib/peazip/res/7z/7z a -t7z -m0=LZMA -mmt=on -mx5 -md=16m -mfb=32 -ms=2g -v4699324416 -w '/home/xxx/xxx/file.7z' '/home/xxx/xxx/file.iso'
    The Ark compressor also crashes when trying to compress this file, and so does fileroller. This is why I guess the problem lies in 7z, but I don't actually know for sure, because if I use peazip and try to compress the file to a single volume *.zip, it will also crash. But on the other hand I think zip compression is also done by 7z.
    Compression ratio is not relevant; it will crash even on store.
    It doesn't crash if I use the compressor to create a *.pea, which I don't want to use.
    The compression will succeed with other files, but not with all. Of 11 files I tried to compress, it succeeded with 6 and failed with 5. All files are 4.4GB before compression.
    My file system is ext4. I tried to copy the files to a different hdd, also on ext4, and another one with ntfs, no change.
    I am out of ideas about what I should do; I can only guess the problem is with 7zip somewhere.
    Last edited by alocacoc (2014-03-06 06:39:50)

    OK, when I compress with peazip to *.7z, it fills up the directory /tmp, which is 2GB (half of my RAM size, I guess), until it is full. But when I do a split file (to produce a *.7z.001 file with a split limit of 4.4GB, which also produces only one file), it will not use /tmp.
    So maybe the problem is that for a single *.7z it wants to cache to /tmp, while for a split file it caches to the destination directory.
    This is presumably also why ark and fileroller crashed before, since I kept the peazip error window open and it won't release its temp file in /tmp until it is closed, and the whole system reacted strangely during this time.
    Maybe I have to wait for an update on Peazip or 7zip to see if the problem will be fixed.
    Last edited by alocacoc (2013-11-24 22:38:51)

  • Spoken Podcast compression problem

    For the podcast I host I've always edited the audio in Soundtrack Pro and then imported the AIFF audio file into iTunes to compress using the AAC Spoken Podcast preset. This worked great for years but ever since iTunes 9 was released the audio sounds garbled after compression. The other AAC presets still produce excellent sounding audio it's just the Spoken Podcast preset that seems to be broken. I submitted a bug report but wanted to know if anyone else has had this problem or if they know of a fix.

    Thank you Arun,
    There was no selective deletion on the cube. Roll-up is included before the compress process in the chain.
    Please find the info below; from the table we can understand that the DM is going fine.
    The latest request in the cube is 1109217 and the last compressed request is 1094040.
    Please find the table result for the cube:
    QUALOK            1.109.217
    ROLLUP_RSVD_DUAL  1.109.217
    ROLLUP            1.109.217
    COMPR_AGGR        1.007.194
    DMEXIST           1.109.217
    DMALL             1.094.040
    TMSTMP_ROLLUP     20.120.215
    Could you please tell me, from the above analysis, what I can do?
    Thanks
    Satya

  • DVCPRO HD Compression problem

    Hello
    I've got some great looking DVCPRO HD 720p 59.94 footage. I export it as is from FCP.
    When I let DVD Studio Pro compress it to an SD DVD (NTSC 23.98 fps), it looks blocky and terrible.
    I've seen some posts about this around, but can't seem to understand the solution...
    Thanks
    Alex

    Well, can I burn a DVD at 59.94?
    In Compressor, any setting I use looks terrible. I tried converting the DVCPRO HD footage to 10-bit Uncompressed in case that was the problem (changing the size to 1280x720), but it didn't work either. I can turn every single setting (including Frame Controls) in Compressor up to best or the highest bit rate, and my preview looks like bad JPEG compression...
    Do you think the frame rate could be the issue?
    Thanks
    Alex

  • Lightroom - edit in PS TIFF compression problem

    Hello everybody,
    When I send my image to Photoshop from Lightroom via the Edit In function, after I save it (Ctrl+S) in PS the output TIFF file is always uncompressed, regardless of the compression setting in Lightroom.
    I noticed it because the sizes of the TIFFs are rather huge. I checked the TIFF properties in Explorer and it indeed says uncompressed. Every other setting in Lightroom's Edit In PS preferences seems to work. The behavior is the same with DNGs and JPEGs, but only when Lightroom hands the original over to PS. It doesn't occur when Lightroom renders its own copy and hands that over to PS afterwards.
    A workaround is to use Save As in PS, but anyway..
    Am I missing something, or is it a bug? Thanks for your answer.
    Lightroom 3.6 64-bit, Photoshop CS 5.1 64-bit, Win 7. I also tried it with the LR 4 beta and it's the same (using the Open Anyway option due to the incompatibility). TIFFs always come back uncompressed as well.

    I totally agree. This is easy to replicate.   I am running LR4 and PS5 on an IMac.
    Open a 20MB DNG file in PS5 from Lightroom and then save via Command-S. TIFF file size is 150MB and compression is at level 1.
    Open a 20MB DNG file directly in PS5 and then use Save As and choose ZIP compression. TIFF file size is 36MB and compression is level 8.
    This is a long standing problem!  Is it fixed in PS6?

  • DV PAL Quicktimes compression problem

    I am putting out DV PAL QuickTimes, but I notice that in QT Pro you can tick a button in properties that makes the compression better and the picture looks less compressed. I want to know if there is a way to export them out of Final Cut so that this setting is already ticked and we get the highest quality straight out of Final Cut, rather than having to recompress and save it out of QT Pro. Has anyone come across this before - is this a Final Cut problem, or is this just a monitoring problem in other programmes? There is a big difference in quality when you tick the button in QT Pro.

    Jen:
    As Trevor said, export your movie as a QuickTime Movie with default settings and you'll get a full-quality copy of your timeline that you can use as a source for any further encoding, i.e. MPEG-2 to use in DVDSP.
    There is a big difference in quality when you tick the button in QT pro.
    I think you refer to the check button named High Quality you can access when opening the movie in QT > Window > Show Movie Properties > Video Track.
    When you check that option you get a better playback quality, right?
    If you mean that setting, it's only a playback setting; all your quality is inside your QT movie, and you are not recompressing it. The low-quality playback default comes from the time when computers did not have enough power to play back the DV stream at full quality.

  • HD for web compression problem

    I'm trying to compress some HD footage shot on an HVX200 for the web and am having a problem with jagged edges. I'm somewhat new to HD, but this problem resembles interlacing artifacts from my previous DV days. Straight lines aren't straight, they're jagged and the edges around people with white shirts on dark backgrounds have a similar jagged/digital characteristic. Here's how my project is set up.
    Shot on HVX200 at 24pn. Edited in FCP and my sequence settings are 960X720 HD (16:9) frame size. Pixel Aspect Ratio of HD (960X720) and QuickTime Video Settings set to Compressor: DVCPRO HD 720p60 at 100% quality.
    When I save a self-contained QuickTime movie like this and view it on my monitor, no jagged edges. It looks exactly as I'd like it to look after it's compressed. However, when I pull it into Compressor and try to encode it with h.264, it comes out with the jagged edges. I have the following settings in my Compressor summary:
    Format: QT
    Width: 480
    Height: 360
    Pixel Aspect Ratio: Square
    Field Output: Same as source
    Frame Controls On (Retiming: Fast, Resize filter: linear filter, deinterlace filter: best)
    I've tried what feels like all of the various combinations of sequence and compressor settings. Obviously, I've not tried them all. How would I render this out for high quality web display?

    Compressing as MP4 instead, I was able to get this to work by tweaking some of the settings in there. The only problem now is an enormous file size that I have to work to get down, but whatever my issue was, it seems to have been corrected.

  • MPEG 2 compression problem

    Hi, I was wondering if someone could please give me some advice on MPEG encoding in Premiere Pro CS3. I'm using the standard MPEG 2 DVD preset when exporting my movie (which is made up of PNGs 1024 x 576); the problem is it's just compressing my movie way too much.
    Even when I shrink a single PNG image down to 720 by 405 it still doesn't look as bad as when converted to MPEG format. Even when I export it to MOV from After Effects it's still not as compressed, but I need to keep the MPEG format for a DVD.
    My movie is only 1 min 40 sec, roughly 50MB. I've got 2 other things to go on the DVD which come to about 800MB all up, so I've still got 3.9GB unused which I'd rather use to maximise the resolution of my movie.
    I could increase the bit rate, but apparently most DVD players can't handle higher than 9; surely there must be some way I can get better picture quality?

    I'm using the standard MPEG 2 DVD
    That would be an interlaced preset, which can wreck any progressive source.
    made up of PNGs 1024 x 576
    That is a square-pixel aspect ratio proportion for widescreen. DVD can't handle that. It'll require 720 x 576 using a 1.4 pixel aspect ratio, so the export is doing some conversion here. Better if your source was also 720 x 576 with a 1.4 PAR.

  • Video Compression Problems

    Does anyone have any idea of some compression settings I could use for a video on my website? The current file I have on there now takes way too long to load, so I want to compress it somehow, but it also still has to be able to play on an iPod and I would also like to preserve the quality as much as possible. Also, it still has to have auto-play capabilities. Any suggestions?

    I would say it's a QuickTime problem. Not an iWeb one. You would have the same questions with a webpage of your own.
    Better ask in a QT forum.
    http://discussions.apple.com/category.jspa?categoryID=122

  • Cube Compression problems

    Hi
    We are having problems on our BW system when trying to compress requests.
    At the moment we're compressing a single request at a time, but more often than not we end up using secondary logs, and when those are all used up, the job is cancelled, in which case a database rollback happens. During rollback, the system is unavailable for 2-3 hours, so batch jobs are put on hold.
    This is not ideal.
    Is there a way to cancel a compress job so that a database rollback doesn't happen?
    Also, should we reasonably expect a compress of 13 million records to complete in an acceptable amount of time, or is this unreasonable?
    Basis have already given us the maximum amount of logs, so we can't ask for more. We will also look at breaking the requests up into smaller pieces.
    Regards,
    Andrew

    Hi Andrew,
    We had similar issues in the past with regard to compression; the following activities are worth mentioning:
    1. In the past, compression was mainly handled via a process chain on a daily basis. This caused many issues, so we split the compression jobs into parallel runs; however, this did not resolve our issues. Then,
    2. We executed compression only on weekends. To make this happen we created a condition in the process chain, with PSA deletions happening on a 2-3 week time frame (best practice).
    3. 10 million records can easily be compressed, but the job will dump if it is more than that and Basis have to roll back, which is time consuming. Extending memory is another solution, but it is system dependent.
    4. Minimize business impact by not scheduling compression jobs during working hours, as there is a direct impact on the reporting process.
    Hope this helps
    WIDYL

  • Cube Compression Problem

    In the middle of a cube compression that was running in the background, the job short dumped.
    I know why the job short dumped. It was due to an invalid SQL statement, presumably because I tried to compress too many requests.
    The problem is that the requests that I asked to be compressed in the cube are marked as "compressed". How do I "uncheck" those requests that I know are not compressed?

    Wardell,
    Requests get compressed one by one when you compress requests - hence if you are compressing 10 requests then it starts from request 1. If the job ends halfway then the requests that have been compressed till then stay compressed. It does not mean that the request has an incorrect compression status. If the request was getting compressed / scheduled for compression - you will get a clock icon against the request.
    Also to double check - you refresh the screen and go to the collapse tab and see if the release button is not greyed out.
    And go back to RSA1 and refresh tree and then go back to the manage tab and see the status.

  • Huffman Code problem

    I'm working on a Huffman implementation, but I'm getting stumped when I try to write both the buffered code and the header values to files (Compress.cmp and Compress.hea).
    Here's the error:
    // I've numbered the sections that relate to the error message...
    null
    java.lang.NullPointerException
    at Compress.Output(Compress.java:96)
    at Compress.preOrderTraversal(Compress.java:90)
    at Compress.preOrderTraversal(Compress.java:79)
    at Compress.preOrderTraversal(Compress.java:79)
    at Compress.huff(Compress.java:63)
    at Test.main(Test.java:8)
    Exception in thread "main"
    my code:
    1. import java.io.*;
    public class Compress {
        public static void huff(String filename) throws IOException, FileNotFoundException {
            final int BEGIN = 0;
            final int END = 256;
            final int ARR = 257;
            int frequency[] = new int[ARR];
            int chr = 0;
            int currPos = 0;
            TreeNode tNode;
            TreeNode temp1, temp2, temp3, root;
            PriorityQueue pq = new PriorityQueue();
            DataOutputStream dos = new DataOutputStream(new FileOutputStream("Compress.hea"));
            FileReader fr = new FileReader(filename);
            // Read characters/bytes from the FileInputStream and establish the relative frequencies of each character
            while (chr >= 0){
                currPos = chr;
                if (currPos > END || currPos < BEGIN)
                    currPos = 0;
                frequency[currPos]++;
                chr = fr.read();
            }
            fr.close();
            // Creating a TreeNode with node frequencies of the input stream
            for (int i = 1; i < ARR; i++){
                if (frequency[i] > 0){
                    tNode = new TreeNode(i, frequency, true);
                    pq.insert(tNode);
                }
            }
            //Building huffman tree
            while (pq.size() > 1){
                // Removing two items from the priorityQueue
                temp1 = (TreeNode)pq.remove();
                temp2 = (TreeNode)pq.remove();
                // Creating a new TreeNode with the addition of these two frequency values
                temp3 = new TreeNode(1, temp1.freq + temp2.freq, false);
                // placing the new node back into the priority queue
                if ((temp1.compareTo(temp2)) == -1 || (temp1.compareTo(temp2)) == 0){
                    temp3.left = temp1;
                    temp3.right = temp2;
                } else if ((temp1.compareTo(temp2)) == 1 || (temp1.compareTo(temp2)) == 0){
                    temp3.left = temp2;
                    temp3.right = temp1;
                }
                pq.insert(temp3);
            }
            root = (TreeNode)pq.remove();
            String InCode = "";
            // make a traversal call - so as to assign 0 and 1 bit values to the huffman code
    63.     preOrderTraversal (root, InCode);
            fr.close();
            for (int j = 0; j <= frequency.length; j++){
                dos.writeByte(frequency[j]);
            }
        }
        private static void preOrderTraversal (TreeNode t, String huffCode) throws IOException, FileNotFoundException {
            // Performing the preOrderTraversal of the created Huffman tree
            // for each value on either the left or right of the tree; a 1 or 0 will be appended
            final int ARR = 257;
            String code[] = new String[ARR];
            if (!t.leaf){
                preOrderTraversal(t.left, huffCode + "1");
                preOrderTraversal(t.right, huffCode + "0");
            }
            if (t.leaf)
                code[t.symbol] = huffCode;
            int c;
            FileReader fr = new FileReader(new File("testFile.txt"));
            while ((c = fr.read()) != -1){
                System.out.println(code[c]);
    90.         Output(code[c]);
            }
            fr.close();
        }
        private static void Output(String s) throws IOException{
    96.     for (int k = 0; k < s.length(); k++){
                Output(s.charAt(k));
            }
        }
        private static void Output(char bit) throws IOException{
            int buffer = 0;
            BufferedWriter out = new BufferedWriter(new FileWriter(new File("Compress.cmp")));
            int bufferLength = 0;
            bufferLength++;
            buffer = 2 * buffer;
            if (bit == '1')
                buffer++;
            if (bufferLength == 8){
                out.write(buffer);
                buffer = 0;
                bufferLength = 0;
            }
        }
    }
    I'm running a test file that calls the huff method:
    8. Compress.huff("testFile.txt");

    It's kinda hard to read your code without it being formatted (read [url http://forum.java.sun.com/faq.jsp#format]here)
    First, realize that the first element of code is code[0], not code[1]
    96. for (int k = 0; k < s.length();k++){
    The only Object in this line which could be null is 's' which is passed as a parameter from here.
    90. Output(code[c]);
    This means that some element of array 'code' is null.
    First, let's find out which one is null by inserting a line before this one.
    System.out.println("Outputting code["+c+"]="+code[c]);
    90. Output(code[c]);
    Now, let's say you found out that element 97 is null. Why didn't it get set? Let's go back up to this statement
    code[t.symbol] = huffCode;
    and insert a statment before it
    System.out.println("Setting code["+t.symbol+"]="+huffcode);
    This will allow you to see which elements are being set.
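
    Following on from that: in the posted code, code[] is a local variable of preOrderTraversal, so every recursive call starts with a fresh, empty array, and the file-reading loop runs inside the traversal before most codes have been assigned. A rough sketch of the usual restructuring (not from this thread; it keeps the original names and assumes the same TreeNode and Output methods) is to fill one shared table during the traversal and encode the file in a separate pass afterwards:

    // Shared code table, filled once by the traversal below.
    private static String[] code = new String[257];

    private static void preOrderTraversal(TreeNode t, String huffCode) {
        if (t.leaf) {
            code[t.symbol] = huffCode;               // record this leaf's code
        } else {
            preOrderTraversal(t.left, huffCode + "1");
            preOrderTraversal(t.right, huffCode + "0");
        }
    }

    private static void encodeFile(String filename) throws IOException {
        // Separate pass: by now every character that occurs has a code.
        FileReader fr = new FileReader(filename);
        int c;
        while ((c = fr.read()) != -1) {
            Output(code[c]);
        }
        fr.close();
    }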
