Photo too large for drop zone

Often when I drag a photo into the drop zone of a main menu, the photo goes in but is displayed so large that much of the subject is cropped out. Anyone have a way of shrinking the photo down so that the entire photo can be seen in the menu?

I have found that the size of the photo is rarely the problem. The problem is simply getting the portion of the photo you are interested in to show. The drop zone areas are small; therefore, you are generally placing photos of people taken from distances of 5 to 10 feet. The faces are the key to being satisfied with the photo in the drop zone (or the car, or the motorcycle, etc.). FORTUNATELY, you can drop the photo in the zone, then place your cursor on the photo and move it around so that exactly the area of the photo you care about will show! By moving the photo, a crop effectively occurs without the loss of any part of the photo. It is like looking at a 4x6" photo and placing a 1x3" frame on it: move the frame around to get what you want and VOILA! Hope this enables you to accomplish your drop zone objective.

Similar Messages

  • Images too large for Drop zone in Template

    Is there a way I can resize images after I drop them in a drop zone? For some reason, all my images are too large for the drop zones, even after making them smaller in Photoshop.
    Thanks!

    I'm fairly new to DVD SP too and wondering the same thing.
    Shouldn't we be able to edit the image mask?
    It's kinda worthless to just accept what DVDSP decides to do with the photo.

  • PSE 6 slide show photos too large for TV screen

    I have been collecting, scanning and saving family photos for over a year to create a slide show for my 81 yr. old Dad. I create a slide show with PSE 6, save to a CD and play on the TV. No matter what pixel size I resize the photos to, they are still too large for the TV screen - not just one TV screen but different ones I try. I lose heads at the top, feet at the bottom, people on the sides. I have been researching but have not found the answer to solve my problem. If anyone has the answer or some suggestions I would be very appreciative.  Thanks.
    Scrap Grannie

    Well, first of all, choose the right resolution and frame rate...
    Are you working in NTSC (30 fps), or PAL (25 fps)...?
    Use one of the common DV formats;
    if you choose HD 1080i, then your TV must be able to recognize that format,
    otherwise images get cut off...

    Format          Rectangular Size    Square Size
    DV NTSC         720x480             720x540
    DV PAL          720x576             768x576
    HDV 720p        NA                  1280x720
    HDV 1080i       NA                  1920x1080

    Stay with your chosen format throughout your whole project...
    Save your photos in a still-image format (PNG, TIFF or PSD), and import those files into your video editor.
    Depending on the operating system (Mac or Windows), one has many choices...
    http://en.wikipedia.org/wiki/List_of_video_editing_software
    After editing, import those files into your chosen authoring software for burning;
    it will take care of the appropriate compression.
    (I would recommend burning to DVD, not CD; one will achieve much better quality...)
    Regards
    Nolan

  • HT1229 My iPhoto Library (version 8.1.2) is 280GB (greater than 50% of my 500GB total storage memory on my iMac.  It was too large for me to drag it to a new hard drive so the Apple geniuses did it for me.  However they did not delete the Library from my

    My iPhoto Library (version 8.1.2) is 280GB (greater than 50% of my 500GB total storage memory on my iMac.  It was too large for me to drag it to a new hard drive so the Apple geniuses did it for me.  However they did not delete the Library from my iMac (that's my responsibility).  I dragged it to Trash and when it started to move I clicked over to the new hard drive to confirm it had indeed been copied.  I became nervous when I didn't see among the few files on this otherwise empty new hard drive anything that resembled a 280GB Library so I cancelled the migration to trash.
    How can I be sure that my iPhoto library has been copied and that all my "metadata" survived intact?

    the new backup drive
    I thought the new drive would be your data drive to host the iPhoto library. Do you also use it for TimeMachine backups?
    I am unable to search either in email or in the Finder.  I AM able to search within iPhoto though, thankfully
    Spotlight may still be busy rebuilding its index.
    You could try to rebuild the Spotlight index if you see no progress:
    Spotlight: How to re-index folders or volumes
    I hope other frequent posters will drop in. I have not used iPhoto 8.x in a long time.

  • Why is Encore 5.1 AUTO mode building files just a bit too large for DVDs?

    I've been doing the same videos for a few years using Encore 1.5 and more recently Encore 5.1 (with Premiere 5.5).  The videos are typically football games with short 2-25 secs of motion menus.  If I put two games per DVD, the total running time has been up to 2 hours.  I've made dozens of these with Premiere writing AVI files (standard 720x480 video) and writing in Encore 1.5 using auto settings.  With Premiere CS4 I started using the dynamic link to Encore CS4 and still ran beautiful DVDs on auto transcoding.
    Now, when using the same standard video files in Premiere 5.5, and using dynamic link to Encore 5.1, on auto transcode settings, the image files or transcode files (depending on how I order the build) keep coming out 150 - 300 MBs too large for the DVDs (typically Taiyo Yuden or Verbatim, burning to Plextor/Pioneer drives). (Windows-7-X64 Intel D975XBX2 MB.)
    To see if I was going nuts, I recently saved a job as two AVI files (same length as the sequences sent through dynamic link to Encore CS5.1) and did the same DVD setup in Encore CS4 but importing the AVI files as timelines.  The chapter points didn't convert for Encore CS4 so I had to install manually. The end result, though, was a DVD image that worked fine.
    So I then went to the Encore CS5 build window and set the DVD size to manual, dropping down from 4.7GBs to 4.25.  The result fit, but then left too much free space (though the vids still looked OK). 
    I know by other discussions that others are having issues with Encore CS5.1 - or maybe it is related to Premiere 5.5 sequences.  Recently burned a set with 22 secs motion menu and 1.4 hours of video and had no problems on auto.  This leads me to believe either that Encore is underestimating the transcode sizes or there is a problem with writing DVDs at or near two hours in length. 
    Anybody have answers, suggestions?
    Thanks,
    Doug A

    Thanks, Giorgio.
    I do use ImgBurn quite a bit, and even the new version 2.5.6.0, which allows for truncating and overburning, wouldn't write to either the Plextor burners or the Pioneer 205 (Blu-ray in DVD mode).
    I just ran another image build setting the disc size (for the burn) at 4.5GBs rather than the automatic 4.7GBs.  This helped and gave me a final image size of 4.33GBs - still not large enough to maximize the disc and the compressed code for better quality, but in this case, a tradeoff for just getting the job done.
    I hope Adobe releases a fix for this problem, if indeed it turns out to be a bug.  Also, while I'm on the bandwagon, the motion previews in Encore 5.1 do seem to preview in very low res.  I'll need to check whether there is a setting for this, but I never experienced it before 5.1.
    Regards,
    Doug A

  • File too large for volume format?

    I'm trying to copy a 4.6GB .m4v file from my hard disk to an 8GB USB flash drive.
    When I drag the movie in the finder, I get a dialogue telling me the file "is too large for the volume's format". That's a new one for me. If the flash drive has 8GB available, and the file is only 4.6GB in size, how is that too large? Does it have to do with the way the flash drive is formatted? It's presently formatted as MS DOS FAT 32. Is that the problem? Do I need to reformat as HFS+?  Or is it some other problem?

    The flash drive is pre-formatted FAT32, which has a maximum allowable file size of 4 GB (2^32 - 1 bytes), so your 4.6 GB file cannot be copied. Change the flash drive format for OS X:
    Drive Partition and Format
    1. Open Disk Utility in your Utilities folder.
    2. After DU loads select your hard drive (this is the entry with the mfgr.'s ID and size) from the left side list. Click on the Partition tab in the DU main window.
    3. Under the Volume Scheme heading set the number of partitions from the drop down menu to one. Click on the Options button, set the partition scheme to GUID then click on the OK button. Set the format type to Mac OS Extended (Journaled.) Click on the Apply button and wait until the process has completed.
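    The FAT32 limit comes from the file system recording each file's size in a 32-bit field, which caps any single file at 2^32 - 1 bytes (just under 4.3 decimal GB). A quick sketch of the arithmetic behind the refused copy:

    ```java
    public class Fat32Limit {
        public static void main(String[] args) {
            // FAT32 records a file's size in an unsigned 32-bit field
            long maxBytes = (1L << 32) - 1;       // 4,294,967,295 bytes
            long movieBytes = 4_600_000_000L;     // the 4.6 GB .m4v file

            System.out.println(maxBytes);              // prints "4294967295"
            System.out.println(movieBytes > maxBytes); // prints "true" -> copy is refused
        }
    }
    ```

    Reformatting to HFS+ (Mac OS Extended) removes this per-file cap, which is why the steps above solve the problem.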

  • Code too large for try statement

    I have a jsp that I am getting the following exception on.
    [ServletException in:/jsp/DLG_PTEN.jsp] Unable to compile class for JSP C:\Documents and Settings\ebsgam\My Documents\IBM\wsappdev51\workspace\StnWeb\.metadata\.plugins\com.ibm.etools.server.core\tmp0\cache\localhost\server1\DefaultEAR\StnWeb.war\jsp\_DLG_5F_PTEN.java:13576: code too large for try statement } catch (Throwable t) { ^ C:\Documents and Settings\ebsgam\My Documents\IBM\wsappdev51\workspace\StnWeb\.metadata\.plugins\com.ibm.etools.server.core\tmp0\cache\localhost\server1\DefaultEAR\StnWeb.war\jsp\_DLG_5F_PTEN.java:1165: code too large for try statement try { ^ 2 errors '
    I am currently running this on WebSphere Application Developer 5. First off, I'm hoping that this error will go away when it is deployed to Tomcat. But I wanted to see if anyone has had this problem and knows of a tweak or workaround for it.
    The JSP in question is not the longest one I have... but it has a large number of drop-down list boxes, using JSTL <select><option> tags, and the drop-downs have around 40 or 50 items in them.

    In WebLogic 8.1, I've met the same error, and solved the problem by adding a parameter to weblogic.xml, like below:
      <jsp-descriptor>
         <jsp-param>
              <param-name>noTryBlocks</param-name>
              <param-value>true</param-value>
         </jsp-param>
      </jsp-descriptor>
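    Another workaround, where restructuring the page is possible, is to split the oversized JSP into fragments so each generated servlet method stays under the JVM's 64 KB bytecode-per-method limit that causes "code too large". A minimal sketch (the fragment names here are hypothetical, not from the original post):

    ```
    <%-- DLG_PTEN.jsp: move groups of drop-downs into their own fragments --%>
    <jsp:include page="dropdownsPartA.jsp" />
    <jsp:include page="dropdownsPartB.jsp" />
    ```

    Each included fragment compiles to its own servlet, so no single generated method (or its try/catch exception table) grows past the limit.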

  • On apple tv why are photos too large on slideshow

    I set up album in my iphoto, 
    selected it on apple tv
    why are some photos too large and chopping off top of photos?

    When you first sync photos to your iPod, or everytime you add new photos, iTunes goes through a process of optimizing your photos for the iPod before actually sync'ing. You can see it countdown on the screen. The results are kept in the "iPod Photo Cache" folder.
    I was under the impression that this optimizing was more a process of collecting 25 thumbnails and putting them together in a single file, so that the iPod can show 25 pictures at once with the option to select one. Are you sure the thumbnails are actually altered in any way?

  • Time Machine backs up one MacBook, but "backup is too large" for second

    We have two MacBooks running OS 10.5.8 and both are connected wirelessly to Time Capsule. My daughter's laptop backs up fine. But my wife's laptop fails. It backed up one time when I first installed it, but never since. The error message I get is that the backup is too large for the backup volume -- 71.8 GB required but only 7.4 GB available.
    My daughter's machine has a lot of photos and iTunes files on it, so it shows 141 out of 150 GBs are used up. The Time Capsule, which I understood to have 1 TB actually shows only 7.4 GB out of 463 GB are available.
    And actually, when I try to manually run Time Machine on my wife's machine I get the really long Preparing message, and then the Time Machine Error message about not enough volume.
    Should a machine with 150 GB capacity really be taking so much space on Time Capsule, and why the difference between the 1 TB advertised capacity and the <500GB of memory that actually appears to be there?
    Does any of this suggest I need to restore the external or internal drives?

    Steve Hasler wrote:
    We have two MacBooks running OS 10.5.8 and both are connected wirelessly to Time Capsule. My daughter's laptop backs up fine. But my wife's laptop fails. It backed up one time when I first installed it, but never since. The error message I get is that the backup is too large for the backup volume -- 71.8 GB required but only 7.4 GB available.
    Does it make sense that it's trying to back up about 60 GB (Time Machine adds 20% for temporary workspace) from that Mac?
    If so, see if it has the +Warn when old backups are deleted+ box checked in Time Machine Preferences > Options.
    Click the Time Capsule in a Finder sidebar. If the Finder is in +Column View,+ you should see something like this:
    (screenshot of the Finder column view omitted)
    except there should be two +sparse bundles+ listed in the third column. Right-click each one in turn, select +Get Info,+ and see what's shown for Size. Report back with the numbers. Also tell us if you have any other data on your TC, besides the 2 Mac's backups.
    My daughter's machine has a lot of photos and iTunes files on it, so it shows 141 out of 150 GBs are used up. The Time Capsule, which I understood to have 1 TB actually shows only 7.4 GB out of 463 GB are available.
    And actually, when I try to manually run Time Machine on my wife's machine I get the really long Preparing message, and then the Time Machine Error message about not enough volume.
    That sounds like it's doing a "deep traversal," perhaps after a long time since the last backup, where it has to compare everything on the Mac to the backups, to figure out what needs to be backed-up.
    Should a machine with 150 GB capacity really be taking so much space on Time Capsule
    Sure. Time Machine will use up all the space available to it, then begin deleting the oldest backup(s) when it needs room for new ones.
    and why the difference between the 1 TB advertised capacity and the <500GB of memory that actually appears to be there?
    It does sound like you have a 500 GB Time Capsule. The original ones were 500 GB; I'm not sure when the 1 TB version came out, and now you can get them with 2 TB. How old is it, and why do you think it's a 1 TB model?

  • HT3275 Message: This backup is too large for the backup volume?

    Time Machine no longer backs up.  I keep getting this message (This backup is too large for the backup volume) though I have excluded and deleted a huge number of documents, photos, etc.  Help!  Help!

    Eunice19 wrote:
    Time Machine no longer backs up.  I keep getting this message (This backup is too large for the backup volume) though I have excluded and deleted a huge number of documents, photos, etc.  Help!  Help!
    Do you have the Warn when old backups are deleted box checked in Time Machine Preferences > Options?  If so, just remove the check and run another backup.  Then Time Machine will delete your oldest backup(s) to make room for the new one.
    If not, see #C4 in Time Machine - Troubleshooting, probably the pink box there.  If that doesn't clear it up, post back with the amount of data being backed up (i.e., how much is on your internal HD and any others being backed up), how large your backup drive is, and whether there's anything else on it (if so, how much?).
    A screenprint of the message might help, too.

  • I have a 500 GB hard drive and a 1TB Time Capsule running on a MacBook Pro.  It was all working well until the MacBook went in for a repair a week or so ago.  Since then, TC will not perform a backup;  instead, it says the backup is too large for the disk

    Since having my MacBook Pro repaired (for a video problem) Time Capsule returns the following message:  "This backup is too large for the backup disk. The backup requires 428.08 GB but only 192.14 GB are available."
    I notice that there is also a new sparse bundle.
    Since TC has my ONLY backup (going back about 4 years) I am reluctant to wipe it and start over fresh as I am afraid of losing files. 
    Is there a way of dealing with this?
    I am using Snow Leopard 10.6.8

    The repair shop likely replaced a major circuit board on your MacBook Pro, so Time Machine thinks that you have a "new" computer and it wants to make a new complete backup of your Mac.
    You are going to have to make a decision to either add another new Time Capsule....or USB drive to your existing Time Capsule....and in effect start over with a new backup of your Mac and then move forward again.
    For "most" users, I think this is probably the best plan because you preserve all your old backups in case you need them at some point, and you start over again with a new Time Capsule so you have plenty of room for years of new backups.
    Or, as you have mentioned, you have the option of erasing the Time Capsule drive and starting all over again. The upside is that you start over and have plenty of room for new backups. The downside is that you lose years of backups.
    Another option....trying to manually delete old backups individually....is tricky business....and very time consuming. To get an idea of what is involved here, study this FAQ by Pondini, our resident Time Capsule and Time Machine expert on the Community Support area. In particular, study the pink box.
    http://web.me.com/pondini/Time_Machine/12.html
    Once you look through this, I think you may agree that this type of surgery is not for the faint of heart.  I would suggest that you consider this only if one of the other options just cannot work for you.

  • Cannot decrypt RSA encrypted text : due to : input too large for RSA cipher

    Hi,
    I am in a fix trying to decrypt this RSA encrypted String ... please help
    I have the encrypted text as a String.
    This is what I do to decrypt it using the Private key
    - Determine the block size of the Cipher object
    - Get the array of bytes from the String
    - Find out how many block sized partitions I have in the array
    - Decrypt the exact block-sized partitions using the update() method
    - Now it's easy to find out how many bytes remain (using the % operator)
    - If the remaining bytes is 0 then simply call the 'doFinal()'
    i.e. the one which returns an array of bytes and takes no args
    - If the remaining bytes is not zero then call the
    'doFinal(byte [] input, int offset, in inputLen)' method for the
    bytes which actually remained
    However, this doesn't work, and it is making me go really crazy.
    Can anyone point out what's wrong? Please.
    Here is the (childish) code
    Cipher rsaDecipher = null;
    // The initialization stuff for rsaDecipher goes here:
    // the rsaDecipher Cipher is using 256 bit keys,
    // nothing is specified regarding padding,
    // and I am using BouncyCastle
    String encryptedString;
    // read in the string from the network --
    // this string is encrypted using an RSA public key generated earlier;
    // I have to decrypt this string using the corresponding private key
    byte[] input = encryptedString.getBytes();
    int blockSize = rsaDecipher.getBlockSize();
    int outputSize = rsaDecipher.getOutputSize(blockSize);
    byte[] output = new byte[outputSize];
    int numBlockSizedPartitions = input.length / blockSize;
    int numRemainingBytes = input.length % blockSize;
    boolean hasRemainingBytes = numRemainingBytes > 0;
    int offset = 0;
    StringBuffer buf = new StringBuffer();
    // decrypt each full block-sized chunk with update()
    for (int i = 0; i < numBlockSizedPartitions; i++) {
      output = rsaDecipher.update(input, offset, blockSize);
      offset += blockSize;
      buf.append(new String(output));
    }
    if (hasRemainingBytes) {
      // This is exactly where I get the "input too large for RSA cipher",
      // which is suffixed with ArrayIndexOutOfBounds
      output = rsaDecipher.doFinal(input, offset, numRemainingBytes);
    } else {
      output = rsaDecipher.doFinal();
    }
    buf.append(new String(output));
    // After having reached this point, will it be wrong if I assume that I
    // have the properly decrypted string???

    Hi,
    I am in a fix trying to decrypt this RSA encrypted String ... please help
    You're already broken at this point.
    Repeat after me: ciphertext CANNOT be safely represented as a String. Strings have internal structure - if you hand ciphertext to the new String(byte[]) constructor, it will eat your ciphertext and leave you with garbage. Said garbage will fail to decrypt in a variety of puzzling fashions.
    If you want to transmit ciphertext as a String, you need to use something like Base64 to encode the raw bytes. Then, on the receiving side, you must Base64-DEcode back into bytes, and then decrypt the resulting byte[].
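    A minimal sketch of that round trip, using the standard java.util.Base64 (the names and sample bytes here are illustrative, not from the original post):

    ```java
    import java.util.Arrays;
    import java.util.Base64;

    public class CiphertextTransport {
        public static void main(String[] args) {
            // Pretend these are raw RSA ciphertext bytes (arbitrary values here)
            byte[] ciphertext = {(byte) 0x8f, 0x00, (byte) 0xe2, 0x41, (byte) 0xff};

            // Sender: encode the raw bytes into a transport-safe ASCII string
            String wire = Base64.getEncoder().encodeToString(ciphertext);

            // Receiver: decode back to the exact original bytes, then decrypt those
            byte[] restored = Base64.getDecoder().decode(wire);
            System.out.println(Arrays.equals(ciphertext, restored)); // prints "true"
        }
    }
    ```

    The key point is that encode/decode is lossless on arbitrary bytes, which `new String(byte[])` and `String.getBytes()` are not.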
    Second - using RSA as a general-purpose cipher is a bad idea. Don't do that. It's slow (on the order of 100x slower than the slowest symmetric cipher). It has a HUGE block size (governed by the keysize). And it's subject to attack if used as a stream-cipher (IIRC - I can no longer find the reference for that, so take it with a grain of salt...) Standard practice is to use RSA only to encrypt a generated key for some symmetric algorithm (like, say, AES), and use that key as a session-key.
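    A hedged sketch of that session-key pattern using the standard JCE API (the key sizes and PKCS1 padding are illustrative choices, not taken from the original post):

    ```java
    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.util.Arrays;

    public class HybridSketch {
        public static void main(String[] args) throws Exception {
            // RSA key pair (2048-bit) -- stands in for the poster's key pair
            KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
            kpg.initialize(2048);
            KeyPair pair = kpg.generateKeyPair();

            // Random AES session key: this small key is all RSA should encrypt
            KeyGenerator kg = KeyGenerator.getInstance("AES");
            kg.init(128);
            SecretKey session = kg.generateKey();

            // Wrap (RSA-encrypt) only the session key with the public key
            Cipher rsa = Cipher.getInstance("RSA/ECB/PKCS1Padding");
            rsa.init(Cipher.WRAP_MODE, pair.getPublic());
            byte[] wrapped = rsa.wrap(session);

            // Receiver unwraps with the private key; the bulk payload
            // would then be encrypted/decrypted with AES, never with RSA
            rsa.init(Cipher.UNWRAP_MODE, pair.getPrivate());
            SecretKey recovered = (SecretKey) rsa.unwrap(wrapped, "AES", Cipher.SECRET_KEY);
            System.out.println(Arrays.equals(session.getEncoded(), recovered.getEncoded())); // prints "true"
        }
    }
    ```

    Since the wrapped key is a single RSA block, the block-by-block update()/doFinal() juggling from the question disappears entirely.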
    At any rate - the code you posted is broken before you get to this line: byte[] input = encryptedString.getBytes(); -- go back to the encrypting end and make it stop treating your ciphertext as a String.
    Grant

  • Mac Mini display is too large for screen on only one user account

    Okay, so I left my two-year-olds alone for a minute playing the "alphabet game" on my Mac Mini. They only had the keyboard, no mouse, but managed to muck up my display, leaving me a bit frustrated.  The screen is now too large for my Samsung display. The only way to see everything (dock, top bar, etc.) is to move my mouse arrow to the edge of the display and watch it roll back onto the page.  I've checked the settings there and they are fine. The MacBook Pro plugs right in and shows the proper resolution. So I then wondered if another account on the Mac Mini would do the same thing. I logged out of my admin account and into another, and everything looks just dandy. I log back into my admin account and it's too large and blurry again. The resolution is set correctly at 1920x1080 at 60 Hz.
    What button did they push on my keyboard that would do this and how do I get it back?? Aargh! Thanks all!

    Ha, figured it out myself from another discussion forum finally. Thought I'd share in case anyone else runs into this. They must have hit "Zoom" by hitting the "Control" key and the scroll buttons at the same time.
    Resolution:
    You can zoom out by holding down the Option and Command buttons on the keyboard and, while you hold them down, pressing the - key. 

  • Page header plus page footer too large for the page in crystal report 2008.

    Hi,
    When we select print preview and print after entering parameters, it shows the error: "page header plus page footer too large for the page. Error in File.rpt: page header or page footer longer than page." It also does not show the print layout format; if I connect another printer, it shows the layout design. Sometimes it shows letter format, and if I print, it defaults to landscape even though we set up a default of 10x12 inches for that particular printer. Please guide me on how we can solve these issues.
    regds,
    samapth

    This is a really hard post to read. See if you can take a bit of time to reword it, but one thing I do understand is that you are getting this error:
    page header plus page footer too large for the page.
    Typically, you can trust that if the error is thrown, it is true; this is not one of those errors that says one thing and means another. I suspect that you have some field(s) in the header(s) that grow depending on the data. If there is too much data or the data is too long (a text field, for example), the error will be thrown. To resolve this, see if placing the field(s) into a group footer / header will help.
    Ludek

  • ERROR : OpenDoc CR to PDF - File is too large for attachment.

    We are getting the following error in 3.1 using an OpenDoc call when we call a large Crystal Report to PDF format...
    Error : 52cf6f8f4bbb6d3.pdf File is too large for attachment.
    It runs OK from BOE when given parameters that returned 44 pages. (PDF = 139 KB)
    We get the error on a parameter-set that returns 174 pages when run via CR Desktop or as a SCHEDULED Instance. (PDF = 446 KB).
    Client application can't use the SDKs to SCHEDULE Instances - only configured for OpenDoc calls.....
    The BOE server is running on SOLARIS - and it's is a 2 Server CMS-Cluster.
    The problem is SPORADIC, so I am thinking the issue is related to a specific setting on one of the servers.
    Any thoughts on where to start looking...?

    The problem is _not_ with the number of rows returned - it is an issue with the size of the PDF file that it is trying to move.
    Found a possible WINDOWS solution on BOB - need to find if there is an equivalent for SOLARIS...
    Check the dsws.properties on web server D:\Program Files\Business Objects\Tomcat55\webapps\dswsbobje\WEB-INF\classes
    See if you can change any parameter to remove size limitation.
    #Security measure to limit total upload file size
    maximumUploadFileSize = 10485760
