Hit area size issues within an MC

Hi all! I have a small problem. I have an MC with code attached to it
that plays a small animation inside the MC: a thumbnail of a car with a
mask over the top that expands to show the rest of the car. Now, the hit
area of the MC starts off as just the thumbnail (small), but when the
expansion happens, the hit area grows to the same size as the car (big).
This does not work for me, not one little bit, as I have other images in
a row. As you can see, I have code in place so that the MC being rolled
over is brought to the front (overlapping the image below).
In short, I need the hit area to stay small. How can I do
this?? I am more than happy to send the .fla to anyone that needs
to see what I mean.
Thanks all!

So, I don't know if you're like me, but I hate not knowing whether a
problem has been solved or not, so for the benefit of everyone else
having the same issue: I fixed the problem and I'm replying to myself
with the fix.
To solve the problem of setting a hit state on a different MC,
this is what I have done.
I have a movieClip whose animation starts on rollOver and plays in
reverse on rollOut, while at the same time the image stays in front of
the movieClips to its left or right (the animation is a mask layer over
a large picture, expanding from a small square to show the whole image).
The rollOver/rollOut effect, and setting that movieClip to rest on top,
was easy to set up with this code:
onClipEvent (enterFrame) {
    if (this.hitTest(_root._xmouse, _root._ymouse, true)) {
        this.nextFrame();
    } else {
        this.prevFrame();
    }
}
on (rollOver) {
    _root.x += 2;
    _root.img_01_Mc.swapDepths(_root.x);
}
Great! But this is where my issue came into effect: the hit
area grew with the animation, making the whole effect look rubbish,
as you would have to move all the way out of the movieClip to get to
the next one. The way around this is to create a movieClip on top of
the "animated movieClip", set its Alpha to 0%, and then attach this
code:
onClipEvent (enterFrame) {
    if (this.hitTest(_root._xmouse, _root._ymouse, true)) {
        _root.img_01_Mc.nextFrame();
    } else {
        _root.img_01_Mc.prevFrame();
    }
}
on (rollOver) {
    _root.x += 2;
    _root.img_01_Mc.swapDepths(_root.x);
}
Sorry this is so long, but I appreciate it when people give a
"layman's" explanation for the "n00bz" out there.
So with that said, a breakdown of the code:
"if (this.hitTest(_root._x..."
"this." refers to: the movieClip with the Alpha set to 0%,
to which the code is attached.
"hitTest" refers to: if the mouse pointer is on the
movieClip then start the animation "img_01_mc". Which is called
into effect with "_root.img_01_Mc.nextFrame();" this plays the
animation forwards.
"else {" refers to: if the mouse pointer is no longer over
the movieClip (with alpha 0%) then play the animation backwards by
calling this part of the code into effect
"_root.img_01_Mc.prevFrame();", and the the animation begins to
play in reverse.
The last part of the code is simple.
on (rollOver) {
    _root.x += 2;
    _root.img_01_Mc.swapDepths(_root.x);
}
This just tells the movieClip "img_01_Mc" to sit on top of the
other movieClips to its left or right while the mouse is on the hit
area.
All this said and done, it might be a little hard to follow
without seeing what I'm talking about. So if you want the .fla, just
ask and I will be more than happy to send it to anyone that would
like it!
Happy "Flashing"
All the best,
Satrop

Similar Messages

  • Windows Update Helps with File Size Issues?

    I'm just wondering if anybody has recently noticed an
    improvement related to the file size issue variously reported
    throughout the forums?
    I ask because our IT folks distributed Windows updates on two
    days last week, and since those updates were applied I have
    not experienced the freakishly large file sizes and the related
    performance issues in Captivate. Unfortunately I don't have any of
    the details of what patch(es) were installed, as it was part of our
    boot script one morning and I didn't even realize it was updating
    until I received the Reboot Now or Later alert.
    Anyway, I was curious because I have experienced significant
    performance improvement since then.
    Rory

    If you are using a remote workflow ... designers are sending off-site editors InCopy Assignment packages (ICAPs) .... then they need to create assignments in order to package them for the remote InCopy user. So there's no need to split up a layout into smaller files or anything.  An assignment is a subset of the INDD file; multiple assignments -- each encompassing different pages or sections -- are created from the same INDD file.
    When the designer creates the assignment, have them turn off "Include original images in packages"; that should keep the file size down.
    Or -- like Bob said -- you can avoid the whole remote workflow/assignment package rigamarole altogether by just keeping the file in a project folder in the Dropbox folder on the designer's local hard drive, and have them share the project folder with the editors. In that workflow, editors open the INDD file on their local computer and check out stories, just as though they were opening them from a networked file server.
    I cover how the InCopy Dropbox workflow works in a tutorial video (within the Remote Workflows chapter) on Lynda.com here:
    http://www.lynda.com/tutorial/62220
    AM

  • Paper Size issues with CreatePDF Desktop Printer

    Are there any known paper size issues with PDFs created using Acrobat.com's CreatePDF Desktop Printer?
    I've performed limited testing with a trial subscription, in preparation for a rollout to several clients.
    Standard paper size in this country is A4, not Letter.  The desktop printer was created manually on a Windows XP system following the instructions in document cpsid_86984.  MS Word was then used to print a Word document to the virtual printer.  Paper Size in Word's Page Setup was correctly set to A4.  However the resultant PDF file was Letter size, causing the top of each page to be truncated.
    I then looked at the Properties of the printer, and found that it was using an "HP Color LaserJet PS" driver (self-chosen by the printer install procedure).  Its Paper Size was also set to A4.  Word does override some printer driver settings, but in this case both the application and the printer were set to A4, so there should have been no issue.
    On a hunch, I then changed the CreatePDF printer driver to a Xerox Phaser, as suggested in the above Adobe document for other versions of Windows.  (Couldn't find the recommended "Xerox Phaser 6120 PS", so chose the 1235 PS model instead.)  After confirming that it too was set for A4, I repeated the test using the same Word document.  This time the result was fine.
    While I seem to have solved the issue on this occasion, I have not been able to do sufficient testing with a 5-PDF trial, and wish to avoid similar problems with the future live users, all of which use Word and A4 paper.  Any information or recommendations would be appreciated.  Also, is there any information available on the service's sensitivity to different printer drivers used with the CreatePDF's printer definition?  And can we assume that the alternative "Upload and Convert" procedure correctly selects output paper size from the settings of an uploaded document?
    PS - The newly-revised doc cpsid_86984 still seems to need further revising.  Vista and Windows 7 instructions have now been split.  I tried the new Vista instructions on a Vista SP2 PC and found that step 6 appears to be out of place - there was no provision to enter Adobe ID and password at this stage.  It appears that, as with XP and Win7, one must configure the printer after it is installed (and not just if changing the ID or password, as stated in the document).

    Thank you, Rebecca.
    The plot thickens a little, given that it was the same unaltered Word document that first created a letter-size PDF, but correctly created an A4-size PDF after the driver was changed from the HP Color Laser PS to a Xerox Phaser.  I thought that the answer may lie in your comment that "it'll get complicated if there is a particular driver selected in the process of manually installing the PDF desktop printer".  But that HP driver was not (consciously) selected - it became part of the printer definition when the manual install instructions were followed.
    However I haven't yet had a chance to try a different XP system, and given that you haven't been able to reproduce the issue (thank you for trying), I will assume for the time being that it might have been a spurious problem that won't recur.  I'll take your point about using the installer, though when the opportunity arises I might try to satisfy my cursed curiosity by experimenting further with the manual install.  If I come up with anything of interest, I'll post again.

  • Smartcardio ResponseAPDU buffer size issue?

    Greetings All,
    I’ve been using the javax.smartcardio API to interface with smart cards for around a year now, but I’ve recently come across an issue that may be beyond me. My issue is that whenever I’m trying to extract a large data object from a smart card, I get a “javax.smartcardio.CardException: Could not obtain response” error.
    The data object I’m trying to extract from the card is around 12KB. I have noticed that if I send a GETRESPONSE APDU after this error occurs I get the last 5 KB of the object but the first 7 KB are gone. I do know that the GETRESPONSE dialogue is supposed to be sent by Java in the background where the responses are concatenated before being sent as a ResponseAPDU.
    At the same time, I am able to extract this data object from the card whenever I use other APDU tools or APIs, where I have oversight of the GETRESPONSE APDU interactions.
    Is it possible that the ResponseAPDU runs into buffer size issues? Is there a known workaround for this? Or am I doing something wrong?
    Any help would be greatly appreciated! Here is some code that will demonstrate this behavior:
    // test program
    import java.io.*;
    import java.util.*;
    import javax.smartcardio.*;

    public class GetDataTest {
        public GetDataTest() {}

        public static void main(String[] args) {
            Card card = null;
            try {
                byte[] aid = {(byte) 0xA0, 0x00, 0x00, 0x03, 0x08, 0x00, 0x00};
                byte[] biometricDataID1 = {(byte) 0x5C, (byte) 0x03, (byte) 0x5F, (byte) 0xC1, (byte) 0x08};
                byte[] biometricDataID2 = {(byte) 0x5C, (byte) 0x03, (byte) 0x5F, (byte) 0xC1, (byte) 0x03};
                // get the first terminal
                TerminalFactory factory = TerminalFactory.getDefault();
                List<CardTerminal> terminals = factory.terminals().list();
                CardTerminal terminal = terminals.get(0);
                // establish a connection with the card
                card = terminal.connect("*");
                CardChannel channel = card.getBasicChannel();
                // select the card app (helper not shown in this post)
                select(channel, aid);
                // verify pin (helper not shown in this post)
                verify(channel);
                /*
                 * Trouble occurs here: the error occurs only when extracting a large
                 * data object (~12 KB) from the card. It works fine on other data
                 * objects, e.g. with biometricDataID2 (~1 KB) but not with
                 * biometricDataID1 (~12 KB).
                 */
                // send a "GET DATA" command
                System.out.println("GETDATA Command");
                ResponseAPDU response = channel.transmit(new CommandAPDU(0x00, 0xCB, 0x3F, 0xFF, biometricDataID1));
                System.out.println(response);
            } catch (Exception e) {
                System.out.println(e);
            } finally {
                if (card != null) {
                    try {
                        card.disconnect(false);
                    } catch (CardException ce) {
                        System.out.println(ce);
                    }
                }
            }
        }
    }
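    One possible workaround, sketched here only as an assumption (the thread itself notes that manually sending a GET RESPONSE returned the remaining 5 KB), is to drive the GET RESPONSE (INS 0xC0) dialogue yourself and concatenate the pieces instead of relying on the stack to do it. The class and method names below (GetResponseHelper, drainResponse) are my own illustration; only CardChannel.transmit(), CommandAPDU and ResponseAPDU are actual javax.smartcardio API.
        // Hedged sketch: keep issuing GET RESPONSE while the card reports 0x61xx
        // ("more data available") and concatenate everything that comes back.
        // Assumes "channel" is an open CardChannel and "first" is the ResponseAPDU
        // returned by the initial GET DATA command.
        import java.io.ByteArrayOutputStream;
        import javax.smartcardio.*;

        final class GetResponseHelper {
            static byte[] drainResponse(CardChannel channel, ResponseAPDU first) throws CardException {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                out.write(first.getData(), 0, first.getData().length);
                int sw1 = first.getSW1();
                int sw2 = first.getSW2();
                while (sw1 == 0x61) {
                    int le = (sw2 == 0x00) ? 256 : sw2; // fall back to 256 if the card does not say how much is left
                    ResponseAPDU next = channel.transmit(new CommandAPDU(0x00, 0xC0, 0x00, 0x00, le));
                    out.write(next.getData(), 0, next.getData().length);
                    sw1 = next.getSW1();
                    sw2 = next.getSW2();
                }
                return out.toByteArray();
            }
        }
    Whether this helps depends on the reader and the card's protocol (T=0 vs T=1), so treat it as something to try rather than a confirmed fix.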

    Hello Tapatio,
    I was looking for a solution to my problem and I found your post; I hope you can answer.
    I am a beginner in card development. I am using javax.smartcardio, and I can select the file I want to use,
    but the problem is: I can't read from it, and I don't know exactly how to use the hex codes.
    I'm working with a CCID Smart Card Reader as the card reader and a PayFlex smart card:
    // getHexString() helper not shown in this post
    try {
        TerminalFactory factory = TerminalFactory.getDefault();
        List<CardTerminal> terminals = factory.terminals().list();
        System.out.println("Terminals: " + terminals);
        CardTerminal terminal = terminals.get(0);
        if (terminal.isCardPresent())
            System.out.println("card present");
        else
            System.out.println("card absent");
        Card card = terminal.connect("*");
        CardChannel channel = card.getBasicChannel();
        ResponseAPDU resp;
        // this part selects the DF
        byte[] b = new byte[]{(byte) 0x11, (byte) 0x00};
        CommandAPDU com = new CommandAPDU((byte) 0x00, (byte) 0xA4, (byte) 0x00, (byte) 0x00, b);
        resp = channel.transmit(com);
        System.out.println("Result: " + getHexString(resp.getBytes()));
        // this part selects the Data File
        b = new byte[]{(byte) 0x11, (byte) 0x05};
        com = new CommandAPDU((byte) 0x00, (byte) 0xA4, (byte) 0x00, (byte) 0x00, b);
        System.out.println("CommandAPDU: " + getHexString(com.getBytes()));
        resp = channel.transmit(com);
        System.out.println("Result: " + getHexString(resp.getBytes()));
        // my attempt at reading -- the problem is that I don't know how to
        // build a CommandAPDU to read from the file
        byte[] b1 = new byte[]{(byte) 0x11, (byte) 0x05};
        com = new CommandAPDU((byte) 0x00, (byte) 0xB2, (byte) 0x00, (byte) 0x04, b1, (byte) 0x0E);
        System.out.println("CommandAPDU: " + getHexString(com.getBytes()));
        resp = channel.transmit(com);
        System.out.println("Result: " + getHexString(resp.getBytes()));
        card.disconnect(false);
    } catch (Exception e) {
        System.out.println("error " + e.getMessage());
    }
    // read record : 00 A4 ....
    If you know how to do it, I'm waiting for your answer.
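    For what it's worth, here is a hedged sketch of how a plain ISO 7816 READ RECORD command is usually built with javax.smartcardio. READ RECORD is a "case 2" command (no command data, only an expected length), so it takes an Le value rather than a data array; the class name, record number, P2 value and length below are illustrative assumptions and depend on how the PayFlex card lays out its files.
        import javax.smartcardio.*;

        final class ReadRecordExample {
            // Hedged sketch: READ RECORD (INS 0xB2) from the currently selected file.
            // Record number, P2 and Le are placeholders; adjust to the card's layout.
            static ResponseAPDU readRecord(CardChannel channel, int recordNumber) throws CardException {
                int p2 = 0x04;             // "record number given in P1, current EF" (ISO 7816-4)
                int expectedLength = 0x0E; // Le: how many bytes you expect back
                CommandAPDU cmd = new CommandAPDU(0x00, 0xB2, recordNumber, p2, expectedLength);
                return channel.transmit(cmd);
            }
        }
    Called as readRecord(channel, 1) right after the SELECT that picks the data file; check getSW() on the response for 0x9000 before trusting the data.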

  • Recording File Size issue CS 5.5

    I am using CS 5.5, a Blackmagic UltraStudio Pro through USB 3.0, being fed by a Roland HD video switcher. Everything is set for 720p 60 fps (59.94) and the Blackmagic is using Motion JPEG compression. I am trying to record our sermons live onto a Windows 7 machine with an Nvidia GeForce GTX 570, 16 GB of RAM and a 3 TB internal RAID array (3 drives). It usually works great, but more often now, when I push the stop button in the capture window, the video is not processed and becomes unusable. Is it a file size issue or what? I get nervous when my recording goes longer than 50 minutes. Help

    Jim, thank you for the response. I have been away and busy but am
    getting caught up now.
    I do have all drives formatted as NTFS. My problem is so sporadic that I
    cannot get a pattern down. This last Sunday recorded fine, so we will see
    how long it lasts. Thanks again.

  • Swap size issues-Unable to install DB!!

    Unable to install the DB. At the end of the install it fails due to a swap size issue. FYI:
    [root@usr~]# df -h
    Filesystem            Size  Used Avail Use% Mounted on
    /dev/hda2             5.9G  5.9G     0 100% /
    /dev/hda3             3.0G  848M  2.0G  31% /tmp
    /dev/hda5              34G   12G   21G  37% /refresh
    /dev/hda1              99M   12M   83M  12% /boot
    tmpfs                 3.9G     0  3.9G   0% /dev/shm
    [root@usr~]#
    Please help me...Thanks

    You can increase your swap space. I have also faced the same issue.
    Just try: http://www.thegeekstuff.com/2010/08/how-to-add-swap-space/
    ~J

  • Unable to install DB (swap size issues)

    Unable to install the DB. At the end of the install it fails due to a swap size issue. FYI:
    [root@usr~]# df -h
    Filesystem            Size  Used Avail Use% Mounted on
    /dev/hda2             5.9G  5.9G     0 100% /
    /dev/hda3             3.0G  848M  2.0G  31% /tmp
    /dev/hda5              34G   12G   21G  37% /refresh
    /dev/hda1              99M   12M   83M  12% /boot
    tmpfs                 3.9G     0  3.9G   0% /dev/shm
    [root@usr~]#
    Please help me...Thanks

    I tried with dd if=/dev/hda3 of=/dev/hda5 count=1024 bs=3097152
    Now the o/p is below ...
    [root@user/]# df -h
    Filesystem            Size  Used Avail Use% Mounted on
    /dev/hda2             5.9G  5.9G     0 100% /
    /dev/hda3              35G   33G  345M  99% /tmp
    /dev/hda5              34G   12G   21G  37% /refresh
    /dev/hda1              99M   12M   83M  12% /boot
    tmpfs                 3.9G     0  3.9G   0% /dev/shm
    [root@user/]# cd /tmp/

  • I purchased an iPhone 3 and had photos on it. Started having issues within 30 days, so I was able to replace it. Moved photos to a backup on iCloud. Did not restore the new phone from the backup. How can I save the new photos and retrieve the old ones?

    I purchased an iPhone 3 and had taken some photos on it. I started having issues within 30 days, so I was able to replace it. I created an iCloud account and saved those photos as a backup on iCloud. I did not restore my new phone from the backup, because I did not realize I needed to restore it and not just activate it. How can I save the new photos, apps, and other stuff, but retrieve the old photos?

    If all you want is your old photos, you could back up your new phone using iTunes, then restore it from your iCloud backup, import the old photos to your computer, then restore it from the backup you made in iTunes, returning your newer data to your phone. If you want your old photos on your phone, sync them there from your computer using iTunes. The process would look like this:
    1. Connect your phone to your computer; when it appears in iTunes on the left sidebar, right-click on it and select Transfer Purchases; right-click again and select Backup.
    2. Disconnect from your computer and go to Settings>General>Reset>Erase All Content and Settings to return it to new.
    3. Go through the activation setup, choosing to Restore from iCloud backup. Make sure your phone is plugged into a charger and has access to wi-fi, as this can take hours to complete.
    4. When complete, without connecting your phone to your computer, open iTunes and go to Preferences. On the Devices tab check "Prevent...from syncing automatically".
    5. Now import the old photos to your computer (see http://support.apple.com/kb/HT4083).
    6. When done, open iTunes, right-click on the name of your phone on the left sidebar, select Restore from Backup, and choose the backup you made in step 1.
    7. Go to the Photos tab, select the photos you want to sync to your phone, and sync.

  • TS1646 I use an overseas credit card for purchases but iTunes is insisting on a credit card issued within NZ, where I live?

    Why can I not use an overseas credit card for purchases? My account information is insisting on a credit card issued within my country of residence?

    iTunes doesn't allow a credit card that was issued by a bank in another country (possibly because it's a way of confirming that you are able to buy from a country's store) - you will only be able to use a credit card issued by a New Zealand bank. Or you could use iTunes gift cards as your payment method, as far as I can see they are available in New Zealand.

  • [HELP] Error creating property: file. Please verify the Property size is within the Repository limits.

    I am using portal 8.1
    I was trying to upload a file (.doc) to the repository and encountered this error:
    com.bea.content.RepositoryRuntimeException: Error creating property: file. Please
    verify the Property size is within the Repository limits.
    TIA

    Hi,
    hopefully this may help towards a solution.
    Are you using the Loader EJB ?
    1) Are you publishing to a root node within the "BEA repository" ?
    -> does this work ?
    Do you have code that reads something like:
    loader.loadDirectory("firstNode");
    loader.loadFile("firstNode", myBinaryValue, myBytes, myProperties);
    -> Having banged our heads off a lot of walls on this, we think that it is only when you try to publish to a sub-node within the repository that problems appear.
    2) Try publishing to a sub-node in this manner:
    loader.loadDirectory("firstNode/secondNode"); // Ensure exist in Repository
    String contentNodeName = "firstNode/secondNode/myContentNode";
    loader.loadFile(contentNodeName, myBinaryValue, myBytes, myProperties);
    --- Drop us a line as well as replying to this group (as I don't check daily etc.). Myself and a colleague, Niamh Fitzpatrick, have been looking at this. Thanks Niamh.
    Hopefully this will work.

  • What am I doing wrong - size issue

    Need some advice...as usual!
    I designed a site on my 17" monitor and set the margins to 0
    and table to 100%. However when I look at it on different screens
    it looks rubbish...a big gap at the bottom of the page with the
    design cramped at the top. I wonder if someone would mind looking at
    my code and see what's wrong with it. The site was designed using
    various techniques including css for nav bars, tables, and
    fireworks elements. the site can be viewed at:
    www.shelleyhadler.co.uk/nerja.html
    thanks for your help. Shell

    >Re: What am I doing wrong - size issue.
    Several things...
    First, your 17" monitor has nothing to do with web page layout. What resolution is your monitor set to? It could be 800 pixels wide or 1280 pixels... wouldn't that make a difference? That aside, screen resolution and size are irrelevant anyway. What counts is the size that your viewers have their web browser window set at.
    I have a pretty large monitor, set to a very high resolution, so I open several windows at once and size them so I can see the content in all. Sometimes my browser is full screen and sometimes it's shrunk down to less than 600 x 800. Your web site needs to accommodate that. Every web viewer out there is different and likes the way they have their screens set up. So you need to be flexible and your site needs to be flexible.
    Next, don't design in Fireworks and import to Dreamweaver. Fireworks is a superb web-ready graphics and image processing program. The authors (mistakenly) threw in some web authoring stuff that works very poorly. Design your pages using Dreamweaver. Learn HTML markup and CSS styling to arrange it, then use Fireworks to create graphics to support your content.
    Along the way, be aware of the different browsers in use. Internet Explorer is the most popular (or at least most in use) simply by virtue of the Microsoft market share, but it is also the least web compliant (by virtue of the Microsoft arrogance), so some things that work there (like your green bands) won't on other browsers, and vice versa.
    That said... graphically, your site looks great. You have a good eye for composition and simple clean design. You just need to learn to use HTML to your best advantage to create some really nice looking and nicely working sites.
    "shelleyfish" <[email protected]> wrote in
    message
    news:[email protected]...
    > Need some advice...as usual!
    >
    > I designed a site on my 17" monitor and set the margins
    to 0 and table to
    > 100%. However when I look at it on different screens it
    looks
    > rubbish...big
    > gap at the bottom of the page with the design cramp at
    the top. I wonder
    > if
    > someone would mind looking at my code and see what's
    wrong with it. The
    > site
    > was designed using various techniques including css for
    nav bars, tables,
    > and
    > fireworks elements. the site can be viewed at:
    >
    > www.shelleyhadler.co.uk/nerja.html
    >
    > thanks for your help. Shell
    >

  • WMV and Disk Size issues

    So I am a pretty avid Encore user and I have come into some issues lately and could use some help.
    Background-
    I filmed a 14 hour conference on SD 16:9 mini dv
    I captured 14 hours with Premiere as .AVI - I edited the segments and exported as .AVI
    I used Media Encoder to convert the files to NTSC Progressive Widescreen High Quality (.m2v)   - Reduced the file size drastically
    I then used Media Encoder to convert the .m2v files to .wmv files - Reducing the conference size to 5.65 GB in total.
    I then imported the .wmv into Encore - my issues begin.
    At first, Encore CS4 imported the .wmv files without a problem, however the disk size (of 5.65 GB) registered in Encore as around 13 GB??? Why is that? The .wmv files only consume 5.65 GB on my hard drive. Where is this file size issue coming from?
    So then Encore CS4 gets upset that I have exceeded the 8.5 GB DL disk size and crashes...
    I reopen the program and try to import my .wmv files again (forgot to save like an idiot). 3 of 8 .wmv files import and then Encore starts giving me decoder errors saying I cannot import the rest of the .wmv files...
    Can anyone help me with this issue? I'm quite confused by all of this. Things seemed to work fine (sorta) at first and now Encore is pissed.
    I want to get this 14 hour conference on 1 DL DVD and I thought it would be as simple as getting the files reduced to a size suitable for an 8.5 GB disc. Why is my way of thinking incorrect?
    Thanks for any help,
    Sam
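    As a rough, hedged sanity check (the figures below are approximate and independent of Encore), the arithmetic shows the average bitrate that 14 hours of video would have to fit into on an 8.5 GB dual-layer disc. DVD-Video uses MPEG-2, so Encore re-encodes whatever source it is given, which is why the size of the .wmv files does not predict the size of the final disc.
        // Hedged back-of-the-envelope calculation: maximum average bitrate for
        // 14 hours of footage on a ~8.5 GB dual-layer DVD.
        public class DvdBitrate {
            public static void main(String[] args) {
                double discBits = 8.5e9 * 8;   // disc capacity in bits (approx.)
                double seconds = 14 * 3600;    // 14 hours of footage
                double mbps = discBits / seconds / 1e6;
                System.out.printf("Max average bitrate: %.2f Mbit/s%n", mbps); // roughly 1.3 Mbit/s
            }
        }
    That is far below the bitrates DVD-compliant MPEG-2 is normally encoded at, which is why squeezing the whole conference onto one disc is so hard.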

    ssavery wrote:
    Thanks everyone for your help.
    I'm still not giving up.... It's become an obsession at this point. My uncle does this with kids' movies for his children. He'll download and compress entire seasons of children's shows and put them all on one DVD, which he then plays in a DVD player for them to watch. I'm currently trying to get ahold of him....
    Thanks for the help
    Sam
    I've done this as well for shows from the '80s and such that are never to see the light of day again... I use VSO Software's "ConvertXtoDVD v4". I ONLY use this for archival purposes, for Xvid or WMV or stuff that Encore would throw fits over. The menus are all mainly default stock stuff, but for these projects I'm not concerned about menus or specific navigation, I just need to get the job done. I can squeeze around 15 hrs of 720x480 on one DL (it compresses the ever-living daylights out of the video... but for most of the source footage it really doesn't matter at that point, it's mostly all VHS archives I use with this program anyway). If you just absolutely HAVE to have a 1-disker, you could check that app out, burn it and see how it looks.
    Edited to add: to really squeeze crap in, you can also use a DVDFab program (any version should do... older ones are cheaper). Make a disc image with ConvertX; if you have a lot of footage it may push it beyond the normal boundary of a DVD-DL and fail the burn. Then you can just import the disc image into DVDFab and choose to burn it to a DVD-DL, and it may compress it by about 3-7% more to fit it. I would NEVER use this method EVER for a client... but if you are just hell-bent on doing 1 disc, try these 2 apps out. It may work out if you can live with the compression.
    If you do try this, I recommend this workflow: open Premiere with your first captured AVI, set up your chapters how you want them or whatever, then save each chapter or lecture or segment as its own AVI. Then import all those separately into ConvertX and set it up to play one after the other when each segment ends. [I can't confirm this 100%, because I usually drop in already-compressed files... but if for some reason it doesn't want to work out, then I would suggest dropping in the mts files instead.] (If, say, you want a new "movie" for each lecture instead, and have chapters per movie, that can be done too... but it's more work, and I can expound later if need be.) To save time on encoding, set up the menu to be the "minimalist" menu; it's strictly text. Then just create an ISO. If you do the full thing, I can almost guarantee you'll have to use DVDFab to burn to disc, because it'll probably be about 5-8% overburn.

  • User DB Size issues

    We are currently in a "legal hold" status with the GW system, meaning
    we cannot run any kind of expire/reduce options on a system wide
    basis. Needless to say the user mailboxes have grown exponentially.
    Mostly, I am concerned with the user DB sizes.
    I am curious when I should become alarmed, meaning, at what size would
    system/mailbox stability become an issue.
    We have no idea when we will be relieved of the hold status and I am
    looking for any suggestions on how best to manage the system going
    forward.
    Any advice would be appreciated.
    Mark

    In article <[email protected]>, Mark wrote:
    > I am getting up in the 100+MB sizes.
    So you should be good for a while, just need to watch a few things to
    make sure you aren't hitting limits; CPU usage (add more if needed) and
    I/O (migrate to a faster SAN/servers if needed)
    > Fortunately/unfortunately I am
    > in an industry where we will just continue to purchase more space, so
    > that won't be an issue (can you guess!)
    I can think of a few industries that 'suffer' from that problem of a
    lack of limits. It can be sad how that can lead to strange behaviours
    that can really hurt when limits hit.
    > We have had the archiving discussion many times pre and post our
    > current legal hold status. Executive managment won't go for it.
    Since cost savings/ROI isn't a consideration here, look at the legal
    side. Can you extract all the necessary messages in the time frame the
    courts require? May cases have been lost due to not being able to pull
    up all the relevant emails in a timely basis, and that is the main
    reason for many of these archival products. Retain queries that take
    minutes would be weeks or months (especially with your db sizes) if
    done manually, and courts often demand the discovery results within a
    month.
    Perhaps an OstermanResearch whitepaper may help you, "Convincing
    Decision Makers of the Critical Need for Archiving" available at
    http://www.ostermanresearch.com/down...nd_Related_Issues
    "When subpoenaed for information, the responding party has a maximum of
    30 days to respond according to Rule 34 of the FRCP."
    Andy Konecny
    KonecnyConsulting.ca in Toronto
    Andy's Profile: http://forums.novell.com/member.php?userid=75037

  • Will Adobe ever fix the UI text font size issues in CS6 apps?

    Having recently installed CS6 on my new Windows computer, I find some serious issues with font sizes in the user interface (and not only in Dreamweaver, but in several--but not all--of the applications). You'd think that with a 27-inch screen, font sizes wouldn't be a problem. And you'd be wrong.
    The irony is that some text is properly sized. Commands in the menus, for example. But the menu names are not. The text of items in the File tab is correctly sized, but the name of the File tab is not. In the Manage Sites dialog box, the title bar text is properly sized, but the text within, including button labels, is not. In almost all of these cases, text is just plain way too small to read. It looks like the programmers made a HUGE boo-boo: they sized text in pixels, not only a no-no in web design, but a no-no in application design.
    Because on a 27-inch screen with a resolution of 1920x1080, UI text in Dreamweaver is all plenty large and readable, while on a 27-inch screen with a resolution of 2560x1440, UI text all over the application is too tiny to read--and made worse by putting the text on a gray background, reducing contrast, yet another design no-no.
    CC is NOT an option--and properly should not be for any occasional user. Do not even think of suggesting it as a "solution."
    This is clearly a bug (did Adobe do any testing on higher-resolution screens before releasing CS6?), and one I wonder if it is unique to Windows systems (as Apple has had similarly high-resolution screens for its displays and iMac lines for quite some time now). Or is it uniquely a Windows 8 issue?
    Finally, is there a workaround at all? I can find nothing in the Settings dialog box that would control the size of UI text. Perhaps there is a text-based configuration file that could be modified to change the size of UI text. (Yeah, having configured autoexec.bat, config.sys, win.ini, and similar files back in the day, I do know my way around configuration files--if they are still used.)

    twritersf wrote:
    Having recently installed CS6 on my new Windows computer, I find some serious issues with font sizes in the user interface (and not only in Dreamweaver, but in several--but not all--of the applications). You'd think that with a 27-inch screen, font sizes wouldn't be a problem. And you'd be wrong.
    (and not only in Dreamweaver, but in several--but not all--of the applications).
    What made you think that you should complain to Adobe about a Windows display problem? Adobe and Microsoft are two different companies competing against each other.
    However, you might have a display problem, and the best way to resolve this is by going to:
    Control Panel >> Display
    Follow the on-screen instructions.
    Please note, you go to Control Panel from within Windows itself, NOT from DW.
    Hope this gets you started, or you can ask in the Windows Forum on the Microsoft website.

  • Smart Objects - File size issues

    Hey All,
    The Question: Not sure if this question has been answered elsewhere. But when using a nested smart object (meaning a smart object within a smart object) Photoshop CS5 doesn't display the correct file size (at bottom left) or seem to account for the nested smart object file size.  Is there a "setting" I’m missing to accurately display what the true file size is?
    The Problem: Using multiple nested smart objects, I have reduced the size of my image to 260x200 for web export. Photoshop CS5 won't let me save a file that appears to be only 3 MB, claiming it's over 2 GB. See image below.
    Really not sure what to do about this; the company I work for makes lots of changes, so using smart objects is necessary for my workflow. But it also seems to be slowing me down, with issues like this to figure out, and it is problematic when it comes to saving all the work I have been doing.
    Thanks for the help

    FentonDesigns wrote:
    when using a nested smart object (meaning a smart object within a smart object) Photoshop CS5 doesn't display the correct file size (at bottom left) or seem to account for the nested smart object file size.  Is there a "setting" I’m missing to accurately display what the true file size is?
    One thing you might have missed is that Photoshop is not a file editor, it's a document editor. The sizes Photoshop displays are related to how much RAM it is using for the document's data, how efficiently that RAM is being used, etc. File sizes vary all over the place; they depend on the number of pixels in an image, the format, whether there are layers or no layers, compression, transparency. There is no way Photoshop could even guess at file sizes.
    Another thing is that not all smart object layers are created the same, and their sizes may be far different than you think.
    Smart object layers have a basic format. There is an embedded object, there is a composite pixel rendering of the embedded object that is used for the layer's pixels, and there is a transform associated with the layer's rendered pixels.
    Anything Photoshop supports can be an embedded object. These objects are copies of the original object, for example a copy of a RAW file where the ACR settings are stored in the file copy's metadata. An embedded object might be a copy of a PSD file that has thousands of layers. In any case, Photoshop renders pixels for the embedded object's composite view and uses these rendered pixels as the smart object layer's pixels. These pixels cannot be changed within the document.
    However, the embedded object can be opened, worked on and changed. If the change is committed, Photoshop will update the embedded object, render the updated object's composite view and replace the layer's pixels.
    Smart object layer pixels can only be acted on in the document, not changed with paint etc. For example, the transform associated with the smart object layer sizes and positions the layer's rendering over the canvas. The layer's actual size may be larger or smaller than the canvas size and have a different aspect ratio than the canvas. For example, if you place an image that is larger than the document's canvas size, one of Adobe Photoshop's preferences (on by default) has Place resize large images to fit within the current document's canvas size; the transform associated with that placed layer then causes the rendering of the layer's pixels to fit within the canvas.
    Though an embedded object may contain thousands of layers, the actual object may be much smaller than you think, because PSD files are compressed and the embedded object may be compressed. Also, while the embedded object may contain vector layers, when a smart object layer is transformed the layer is transformed using interpolation like a raster layer, because all that is being transformed is the pixels Photoshop rendered for the embedded smart object. The only way to work on the embedded smart object's layers is to open the smart object and work on the object itself.
