Gradual increase of effects

Is it possible to gradually increase or decrease an effect on a track? Say I want a compressor effect (or whatever) to start at a certain time in the track and stop at another time, or to gradually alter the amount of the effect. How do I make that happen?

Have you heard about automation? One of the most important features in Logic is the automation of plug-ins and volume/send levels.
Check your PDF manual and enter "track automation" in its search function. If you don't find it or don't understand it, come back and we will help.

Similar Messages

  • Increase After Effects System text for high resolution monitor

    I just purchased a high-resolution monitor.  The panels and system text (render queue, project panel, etc.) in After Effects are now just about microscopic.
    Is there any way to adjust the default text size on all the panels?

    No, unfortunately not. This stuff is part of Adobe's specific UI libraries and cannot be changed. The main application menu should respect your settings, though, as should file dialogs and a few text input fields, if that's any consolation. If possible, simply define a user preset in your graphics card settings that switches screen resolution and system DPI when launching AE or any other Adobe app. Good monitors have good scaling chips, so it should not look all too shabby, and you can always go back to full resolution for pixel precision when needed...
    Mylenium

  • Actual vs Effective PPI

    When I click a specific link in the Links panel, the Info section shows two items I don't know how to tell apart. Can anyone give me a good explanation of the difference between actual and effective ppi? Which one is important, which one do you really need to pay attention to, and why? I understand that a picture needs to be at least 240 ppi for print. I've read several articles and just don't really understand.

    Actual resolution is the resolution as you would view it in an image editor like Photoshop.
    Effective resolution is the scaled resolution. If you scale the picture smaller in InDesign, you're "squishing" the pixels into a smaller area, increasing the effective resolution. If you scale the picture larger in InDesign, you're spreading the pixels over a larger area, decreasing the effective resolution.
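The actual-to-effective relationship above is just division by the scale factor; a minimal sketch, assuming uniform scaling (the function name is hypothetical):

```python
def effective_ppi(actual_ppi: float, scale_percent: float) -> float:
    """Effective ppi of a placed image at the given InDesign scale.

    Scaling down packs the pixels into a smaller area, raising the
    effective resolution; scaling up spreads them out, lowering it.
    """
    return actual_ppi * 100.0 / scale_percent

# A 300 ppi scan placed at 50% doubles the effective resolution:
print(effective_ppi(300, 50))   # 600.0
# Enlarged to 200%, the same scan drops to 150 effective ppi:
print(effective_ppi(300, 200))  # 150.0
```

So for print, the effective ppi is the one to check, since that is the density that actually reaches the output device.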

  • Increasing song volume...

    I import my CDs into iTunes and sometimes notice that some songs are recorded at a very low level. An example is 'Careless Whispers' off the Wham! album. There's an option to increase volume in the preferences, but I'm not sure how that works. Once a song is imported into iTunes, can its volume be increased? Thanks for feedback

    ok, I did that, but it sounds the same. Does this work on ANY song, regardless of its original source? Many of my iTunes library songs came off CDs. Should it still work, and will the volume increase still be effective if I burn it back onto another CD? Thanks

  • Mac Mini slow when accessing hd - NVidia MCP79 SATA Driver Problem?

    I bought a Mac Mini (Early 2009) a while ago. Unfortunately I have a problem concerning speed and responsiveness which I couldn't solve even with a lot of googling.
    I first noticed the problem when I tried to play a video file from the hard disk. The video does not play as smoothly as it should. It stutters from time to time, like a short interruption: the picture freezes for half a second and then replays the lost frames very fast. These freezes do not occur periodically; they seem to appear randomly. The weird thing is that CPU usage is very low and there is enough free RAM and disk space.
    So I don't think it's a resource problem. I noticed that these freezes do not only affect video playback. The whole system is kind of unresponsive, and that's the case whenever there is some sort of HD or DVD access. When I try to copy a huge file the freezes increase rapidly. The more HD access, the more unresponsive the system becomes, until it cannot be used anymore.
    All these effects reminded me of a faulty DMA/UDMA setting on Windows machines. So I tried to check the transfer mode for my Mac's HD controller, but I didn't find any place to check that. And I don't really think it is running in PIO mode, because that would result in high CPU usage.
    I played around a little to understand what's going on. This is what I found out:
    When I play a video from the HD it stutters. The stuttering increases if I copy a file from or to the HD. Copying a file from the DVD-ROM increases the effect as well. It is not influenced by a copy from AND to an external USB HD. And none of these effects appear under Windows Vista, no matter how much I access the HD.
    So I come to the conclusion that there must be something wrong with the SATA controller driver, which is an NVidia MCP79. Unfortunately I have no idea what I can do to solve this problem, and I couldn't find any advice on the internet.
    So I would appreciate any suggestions.

    Ok, I think I was jumping to conclusions. Disabling journaling didn't remove the problem. It just seemed so because my problem is much less pronounced under Leopard. Bad luck, I guess.
    kmac1036, to answer your questions: for sure I don't have warranty anymore. Maybe I should have mentioned that I upgraded my Mac to 3 GB RAM. Furthermore, I replaced the original Hitachi HD with a Western Digital 500 GB model.
    Somehow that may be connected to my problem. I've read that many people have problems with the WD HD and those beachballing issues with SN. (I do too.)
    But my performance problem when accessing the HD also existed with the Hitachi. In fact, the Mini was extremely slow with the original 1 GB RAM. I think putting additional memory into the machine helped because it reduced HD accesses.
    Ok, I am not an expert. But if I had to guess, I would say the following seems to happen: there may be an operation in the HD driver that takes longer than it should. Because of that it holds on to CPU cycles and prevents the kernel from continuing to schedule. That would explain why the freezes are not connected to high CPU usage, and why the system seems to catch up very fast after a freeze.

  • Material valued at MAP to be goods-receipted free of charge

    Hi,
    I have one material with both value and quantity in inventory stock, using MAP (moving average price).
    Current inventory quantity:
    10 ea, current MAP is 1000 US$.
    What if the material will be goods-receipted free of charge?
    My question: should I suggest movement type 501 with inventory value,
    or movement type 511 without inventory value?
    Either way, the quantity of the material will increase and affect the MAP.
    So which way is the best guidance or answer for the customer?

    501 - this is without a PO; the free goods go to inventory and the MAP decreases, while the stock value remains the same. Here you will not have a reference for how the quantity was received.
    511 - if you use this, the free goods are added to inventory and the MAP decreases, while the stock value remains the same. Here you will have a reference to the PO.
    So you can suggest that the customer use 511 to keep the reference.
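The MAP effect described above is simple division of value by quantity; a minimal sketch using the numbers from the question (the function name is hypothetical):

```python
def map_after_free_receipt(stock_value: float, qty: float, free_qty: float) -> float:
    """Free-of-charge goods add quantity but no value, so the MAP drops."""
    return stock_value / (qty + free_qty)

# 10 ea valued at 1000 US$ gives a MAP of 100 US$/ea.
# After receiving 5 ea free of charge, the value stays at 1000 US$:
print(round(map_after_free_receipt(1000.0, 10, 5), 2))  # 66.67
```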

  • Third cloud option: Adobe Creative {Production, Design & Web, ..} Cloud

    One of the things which makes Creative Cloud less attractive to an enthusiast like me is the fact that I can use all applications if I get Creative Cloud (first option), but I don't need all applications. On the other hand, I use too many of them to make individual applications (second option) interesting (and I think I would lose the lovely Dynamic Link).
    I use a couple of video-related applications (Premiere Pro, Audition, Photoshop and increasingly After Effects), but I have no intention of using Acrobat, InDesign and Dreamweaver. In essence it comes down to why one would buy Production Premium and not the Master Suite. I opted for the former and I haven't had second thoughts. A third option, the suites returning in the cloud package, could be even more interesting to those using the other suites. Those are even less expensive, but I have to admit I haven't done the math on whether those people should get the individual applications instead.
    Another totally new option would be to do the same as, for example, Red Giant, who allow you to compile your own bundle. They have applications and multiple suites (without overlap), and with those you can compile your own deal. The more applications and suites you add, the more discount you get on the entire deal.

    Well, "outdated" is not much of a statement. It depends primarily on what system you are using and how long you want to keep it. CS6 runs fine on Windows and will for a few more years; only on the Mac is it slowly getting grim, because Apple doesn't care...
    Mylenium

  • Extrapolate data into a graph

    Thank you for taking a look at my question.
    I am having a hard time finding out how to extrapolate data into a graph. I am working on a business forecasting document and would like to create a break even analysis by charting the fixed, semi-fixed, and variable costs associated with revenues derived from sales represented.
    My fixed costs are fixed, and used for things like leases, advertising, operating expenses and office supplies.
    My semi-fixed costs are proportional to the number of sales we make, covering incremental outside services and personnel requirement costs.
    My variable costs are directly related to the number of sales made and cover the cost of making and delivering our product.
    I have my spreadsheet worked out to where I can input new sales numbers and see the effect on revenue. My question is: can I create a graphical representation of this relationship?
    I would like to avoid having to create a chart of data with static values (such as 0 sales, 50 sales, 100 sales, 150 sales).
    I would rather be able to define the areas where I input my sales numbers, define the output total costs/revenues, and allow the graphical representation to extrapolate this data across a set range.
    Thank you again for taking a look, input is well appreciated,
    Jordan

    Ian,
    Thanks for the response - unfortunately I am not looking to use a bubble graph, rather to figure out how to have numbers use an equation to graph two variables (costs and revenues) when sales are between 1 and n.
    Here is a screenshot of an example break-even point; please notice the way each additional sale affects both semi-fixed costs and resulting revenue:
    Due to processing of the total sales number prior to it being returned, I cannot create a table between 1 and n to calculate all values beforehand. Instead I am attempting to designate this golden-colored cell to be my n sales value (on x-axis), then set a minimum and maximum range allowing Numbers.app to graph the resulting outputs (total costs and total sales).
    Barring this capability existing, I believe I will need to utilize AppleScript to insert the range of values (1-n) and copy the resulting numerical output to a new table, then create the chart from this data set. The only issue here is that I am as-yet unaware of how to create such an AppleScript.
    Thank you for your input,
    Jordan

  • Best ways to get rid of dust/greasy spots from front element

    Hello guys! Could you please share your experience with taking care of the front element of your L lenses? What's the best way to remove dust spots, fingerprints etc. without scratching the front element or damaging any of the coating?
    So far, I've found people recommending a Lenspen and soft microfiber for regular glasses. What would you recommend?
    Thanks in advance and happy holidays to all of you!
    Karren

    Hi Karen,
    Get yourself a bulb blower, some quality lens cleaning fluid, microfiber cloths and a Lens Pen. 
    First thing to do is get as much dust off as possible with the bulb blower. It also might help to use a very soft, anti-static brush (note: if there are oils on the lens, the brush will become contaminated and will need to be cleaned or it will just spread the oils around).  Alternatively, a low-powered vacuum might be useful, too, just be careful to avoid too strong or too direct suction. I often just have the vacuum running nearby, to draw away fine particles as they are dislodged with a bulb blower or brush.
    The whole point is to remove as much of the dust grit as possible. Some dust can be hard particles that might scratch. But dust specks can become adhered to the surface, too. So you may need to gently use a dry microfiber cloth too. The cloth will become contaminated with particles, and if there is any oil that will contaminate it too. So either replace or clean the cloth fairly often.
    Once you have particles off as best possible, to remove fingerprints and other oils use a clean microfiber cloth slightly moistened with a quality lens cleaning fluid (I've used Zeiss and other fluids, shop around).  Dry with another section of the microfiber cloth, or a different cloth. EDIT: Yes, you can fog the surface a little with your breath instead of using lens cleaning fluids. I do that all the time when I'm in a hurry, I have to admit. However, it really isn't a good idea. You can end up breathing things onto the surface you really don't want there!
    The only time to use a Lens Pen is after all the above is done. It's used to remove light haze that's often left behind by cleaning fluids and to give the lens a final polishing. Do this gently and only after any and all grit has been removed, to avoid risking any scratches. A lens that's been given this final polishing with a Lens Pen will be much more resistant to new dust settling and adhering to the surface.
    Other things that can be helpful include optical cleaning swabs such as Pec Pads. A Speck Grabber is a precise tool that can be used to remove individual particles.
    I will not use common tissues, whether they are designated for lens cleaning or not. Most papers are made with wood pulp, and wood has minerals in it, which can be hard enough to cause micro-scratches in glass or optical coatings. Modern lenses have hardened coatings to make them more resistant, but look at older lenses with plain glass or softer coatings and you'll often find "cleaning marks".... very fine scratches in the glass or coatings, likely from using paper and/or not getting dust specks off first.
    I also don't recommend using Q-Tips or other common "cotton buds".... those shed fine threads that can get stuck in places you don't want them. Granted, it's more risky using those inside a camera than on a lens, but still, there are better things to use.
    There are swabs especially made for cleaning optics and cameras, and the Pec Pads mentioned above, for example. They are made from rags, not from wood pulp.
    A lot of the same things can be used for sensor cleaning, by the way.
    Prevention is the best course... Use a lens hood and be careful handling your lenses to prevent getting fingerprints on the optics. Cap your lenses when not in use. Vacuum out your camera bag occasionally. Use common sense precautions when switching lenses. Though I personally don't like to use them all the time, a "protection" filter might be wise to use in some situations, such as in very dusty conditions or near the ocean, to prevent salt spray settling onto the lens (it's sort of "greasy" when it dries... can leave salt residue too).
    And, you might be surprised how little some dust on a lens affects your images. Don't be obsessive about cleaning them. The most likely thing dust or oils might do is increase flare effects. A lens hood that reduces oblique light striking the lens can help a lot with that.
    EDIT: Yikes!  It's almost scary how similar Tim and I answered your question, each coming up with almost the same thing independently.
    Alan Myers
    San Jose, Calif., USA
    "Walk softly and carry a big lens."
    GEAR: 5DII, 7D(x2), 50D(x3), some other cameras, various lenses & accessories
    FLICKR & PRINTROOM 

  • Does HDV improve SD/DVD quality

    Hallo,
    I want to switch to HDV for personal use only, but can't afford to buy an HDV camera and a big HD television at the same time. So I have to decide which one to buy first. So there are some questions maybe someone can answer.
    So far I have been recording DV 4:3 PAL with a Canon MVX2i and a Panasonic GS400, and I have been happy with the results. Last year I made some recordings in 16:9, but in my opinion the quality with both cameras is not very good, especially with shots where the camera is not really stable. The picture seems to fall apart a little. I do realize the pixels are stretched, and with the same number of pixels it can't have the same quality as 4:3. But on (cable) television there is sometimes a broadcast that doesn't seem to have this "falling apart" effect (a lot do), while it uses the same MPEG (720x540) stream as the others.
    So my first question: if I buy one of the cheap Canon or Sony HDV cameras and shoot HDV 16:9 footage, edit as HDV and convert it straight to MPEG-2 for a regular DVD with Compressor, would this give me a noticeable quality improvement compared to DV 16:9 to DVD? It doesn't have to be sharper, but does it give a more solid picture?
    Second question: does the above workflow decrease the effect of those bent, moving lines you see, for instance on brick walls, while the camera is moving slightly, caused by interference between the lines of the chips and the wall?
    Third and last question: somewhere in this forum (I can't find it anymore) I read something about more artifacts around moving objects in HDV, like in low-rate MPEG, compared to DV. Is this true?
    Thank you for your response.

    So my first question: If I buy one of the cheap Canon
    or Sony HDV cameras and shoot HDV 16:9 footage, edit
    as HDV and convert it straight to mpeg2 for a regular
    DVD with Compressor, would this give me a noticeable
    quality improvement compared to DV 16:9 to DVD?
    Absolutely not, assuming you are comparing good quality cameras of either format. HDV has a noisier compression scheme, and you will also lose definition (compared to starting SD) when you scale the HDV down for the DVD.
    You will get somewhat more definition horizontally with HDV compared to anamorphic SD, but only if the SD is displayed anamorphically rather than letterbox. I think you would notice other problems before you would notice this improvement.
    It doesn't have to be more sharp but does it give a more
    solid picture?
    No. Again look at the quality of your SD camera.
    Second question: does the above workflow decrease the
    effect of those bent, moving lines you see, for
    instance on brick walls, while the camera is moving
    slightly, caused by interference between the lines
    of the chips and the wall?
    It would tend to increase that effect, which again with a decent camera is almost non-existent.
    Third and last question: Somewhere (I can't find it
    anymore) in this forum I read something about more
    artifacts around moving objects in HDV like in low
    rate mpeg compared to DV. Is this true?
    Yes, HDV has motion artifacts from interframe compression, and DV doesn't. DV does have compression artifacts but all compression occurs within each frame.
    If you do go HDV, be sure you understand the effect it will have on your ability to capture, to edit, to monitor, and to work with any particular camera or flavor of HDV. It is never a simple out-of-the-box solution and often requires more $$ than one thought.
    Don't get me wrong here: HDV has lots of promise and can be very useful and improve quality in many situations. But for SD-only output it isn't going to help and will most likely hurt.

  • Heavy use of public key

    I need to encrypt a lot of (maybe millions of) pieces of data (each 32 bytes long, random looking) using a PublicKey. Each piece must be encrypted separately (since I need the ability to decrypt each one on its own).
    This way it is possible to obtain many plaintext-ciphertext pairs using my program. Is it too risky? Of course, cracking the program would reveal the PublicKey anyway, so maybe my whole problem is just stupid...
    I found different approaches, e.g., encrypting just a couple of random SecretKeys this way, which would then be used to encrypt the pieces of data mentioned above. The storage overhead would be small and the computation could even be faster (using a symmetric cipher instead of RSA), but it's more complicated to maintain... and somehow losing the encrypted random SecretKeys would be a disaster.

    I tried to figure out how it works....
    import javax.crypto.*;
    import javax.crypto.spec.*;
    import java.security.*;
    import java.security.spec.*;
    import java.util.Base64;

    public class PBEEncryptDataString {
        static public class EncryptionException extends Exception {
            private EncryptionException(String text, Exception chain) {
                super(text, chain);
            }
        }

        private static final String PROVIDER = "BC";
        private static final String ALGORITHM = "PBEWITHSHA-1AND192BITAES-CBC-BC";

        public PBEEncryptDataString(String passphrase, byte[] salt, int iterationCount, String characterEncoding) throws EncryptionException {
            assert(passphrase != null);
            assert(passphrase.length() >= 6);
            assert(salt != null);
            // note: the original range check (6..20) contradicted the 1000 passed in main
            assert(iterationCount > 0);
            assert(characterEncoding != null);
            try {
                PBEParameterSpec params = new PBEParameterSpec(salt, iterationCount);
                KeySpec keySpec = new PBEKeySpec(passphrase.toCharArray());
                SecretKey key = SecretKeyFactory.getInstance(ALGORITHM, PROVIDER).generateSecret(keySpec);
                this.characterEncoding = characterEncoding;
                this.encryptCipher = Cipher.getInstance(ALGORITHM, PROVIDER);
                this.encryptCipher.init(javax.crypto.Cipher.ENCRYPT_MODE, key, params);
                this.decryptCipher = Cipher.getInstance(ALGORITHM, PROVIDER);
                this.decryptCipher.init(javax.crypto.Cipher.DECRYPT_MODE, key, params);
            } catch (Exception e) {
                throw new EncryptionException("Problem constructing " + this.getClass().getName(), e);
            }
        }

        synchronized public byte[] encrypt(String dataString) throws EncryptionException {
            assert dataString != null;
            try {
                byte[] dataStringBytes = dataString.getBytes(characterEncoding);
                return this.encryptCipher.doFinal(dataStringBytes);
            } catch (Exception e) {
                throw new EncryptionException("Problem encrypting string", e);
            }
        }

        synchronized public String decrypt(byte[] encryptedDataStringBytes) throws EncryptionException {
            assert encryptedDataStringBytes != null;
            try {
                byte[] dataStringBytes = this.decryptCipher.doFinal(encryptedDataStringBytes);
                return new String(dataStringBytes, characterEncoding);
            } catch (Exception e) {
                throw new EncryptionException("Problem decrypting string", e);
            }
        }

        public static void main(String[] args) {
            try {
                Security.addProvider(new org.bouncycastle.jce.provider.BouncyCastleProvider());
                final byte[] salt =
                    {0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, };
                PBEEncryptDataString dataStringEncryptAgent = new PBEEncryptDataString("The Password - make it fairly long so that there is lots and lots of entropy", salt, 1000, "UTF-8");
                // Get the dataString to encrypt from the command line
                String dataString = (args.length == 0) ? "The quick brown fox jumps over the lazy dog." : args[0];
                System.out.println("Data string ....................[" + dataString + "]");
                // Encrypt the data
                byte[] encryptedDataStringBytes = dataStringEncryptAgent.encrypt(dataString);
                // java.util.Base64 replaces the obsolete sun.misc.BASE64Encoder
                System.out.println("Encoded encrypted data string ..[" + Base64.getEncoder().encodeToString(encryptedDataStringBytes) + "]");
                // Decrypt the data
                String recoveredDataString = dataStringEncryptAgent.decrypt(encryptedDataStringBytes);
                System.out.println("Recovered data string ..........[" + recoveredDataString + "]");
            } catch (Exception e) {
                e.printStackTrace(System.out);
            }
        }

        private String characterEncoding;
        private Cipher encryptCipher;
        private Cipher decryptCipher;
    }
    Here, to show the basic approach, I have used a fixed IV, but one can use a random IV and write it as a prefix to the output so that it can be used in the decryption process. Schneier has shown that this in no way compromises the security of the approach BUT, of course, it does increase the effective length of the encrypted data.

  • White balance selections, Quick Develop and Develop...let's talk

    One thing that isn't well-known is that the adjustments in Quick Develop and the adjustments in Develop are *fundamentally different* from each other.
    Develop is basically designed to provide *absolute* adjustments to RAW images. It can be used for JPEGs too, but in that case, the white balance setting is not absolute as it's already been corrected in-camera. So the WB settings are meaningless since you don't know where to begin and the temperatures are removed for the same reason. You just start where you are and get warmer or cooler.
    Quick Develop is designed to provide *relative* adjustments to all images. What this means is, if you have many images selected and hit the right exposure arrow ">" you'll increase the effective exposure of all of those images from wherever they are now to that plus 1/3 stop. If you hit the ">>" you'll get 1 stop increments.
    Since QD is always making *relative* adjustments, it may be reasonable to assume that you have a known, corrected starting point from which to make relative adjustments. That may be why the daylight and other options are available in QD. I personally think this is a bad assumption as you don't know what corrections have been applied in-camera. Thus, I think those options shouldn't appear in QD just like they don't in Develop.
    Comments?
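The relative vs. absolute distinction above can be sketched as a toy model (not Lightroom's actual code; the numbers are illustrative exposure offsets in stops):

```python
def quick_develop_bump(exposures, stops=1/3):
    """Relative: every image moves by the same offset from wherever it is."""
    return [e + stops for e in exposures]

def develop_set(exposures, value=0.0):
    """Absolute: every image is forced to the same value."""
    return [value for _ in exposures]

images = [-1.0, 0.0, 0.5]          # three images at different exposures
print(quick_develop_bump(images))  # spacing between images is preserved
print(develop_set(images, 0.3))    # [0.3, 0.3, 0.3] - all identical
```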

    The relative vs. absolute behaviour of the QD vs. Develop modules is something I'm not familiar enough with LR to dispute. However, isn't the point of LR's non-destructive nature that all actions on an image are relative to the initial baseline, whether it is a RAW or JPG (or other)?
    Granted, applying a preset WB set of values to a JPG does not make a lot of sense in the development of a repeatable workflow, but if a set of WB values improves a JPG image, who cares? It's all non-destructive and I can reset at any time.
    I tend not to like operating in applications that are too stringent in keeping me walled into the "right" way of doing things, particularly based on the super-intelligent decision of the software developer who knows how it is supposed to work. (I have license to berate this group because it is my vocation.) Changing the user interface from one module to another (if they both do the same thing), or from one file type to another, is not a good thing. Consistency is paramount to user friendliness. If indeed the QD and Develop modules work differently, then it is expected that the controls be labeled differently to express this difference. If a selection is detrimental to the image or will cause application issues, it is better to leave the selection visible (for consistency) and disable it (gray it out) to keep it from being selected.
    The abbreviation of "White Balance:" to "WB:" in the application is also bad form. It is jargony and does not assist new users' understanding. Screen real estate is scarce and I understand the inclination to do this, but there is a cost.
    - Morey

  • Soft Focus Images From Canon EOS 1D Mark IIn

    I am seeing soft-focus issues with my RAW CR2 files coming from the above camera body. I accidentally opened a file in Canon's Digital Photo Professional and it was razor sharp compared to Aperture. A few historical facts: I have had the camera body calibrated to my multiple L lenses several times. I am using the latest firmware for the camera at 1.1.2 and the latest version of the OS at 10.4.10, Aperture 1.5.4, and DPP at 3.0.2. I have used another identical body model with one of my lenses with the same results. I have viewed this issue on several Macs and displays. I have done controlled sample shots using a digital imaging box where the camera is mounted firmly, with a two-second timer to avoid camera shake and controlled lighting, with the same results. This all leads me to how Aperture is handling the CR2 file. Note I am talking about viewing only at 100% magnification. There are NO adjustments done at this point, which should rule out the RAW processing engine. Like you all, I have moved my entire VERY LARGE library and complete workflow to Aperture and now have serious doubts whether I can use this awesome application. Has anyone else experienced this? I have tried Apple directly, other forums, other photographers, etc. with no real answer. Since this is my first post, I cannot figure out how to attach screen captures so you can clearly see the difference. I can send them via PM if required. All of your help is greatly appreciated!
    Cheers,
    David

    What do you mean sharp "out of the camera?"
    This is a RAW file? Note that JPEG files will almost always be sharper out of the camera than RAW files. JPEG files have post-processing applied out of camera (color space applied, sharpening applied), while RAW files do NOT.
    It is very likely you have different settings enabled in DPP and Aperture. With Aperture you can do sharpening in the RAW conversion step, or via edge sharpening... and doing a comparison on "sharpness" before you have applied this really makes no sense.
    NO RAW files are sharp out of camera (well, unless they're from an MF digital back) because all DSLRs have an anti-aliasing filter in them which purposely blurs images. This is done to eliminate artifacting and to increase the effective resolution of the camera (note that a 12 MP digital camera actually has only 6 million green, 3 million blue, and 3 million red photosites). It does not have 12 million RGB-capable sensors, so recovering full color requires some math in camera.
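The photosite arithmetic above is easy to check: a Bayer mosaic's 2x2 cell holds two green, one red and one blue site. A minimal sketch (illustrative only; the function name is hypothetical):

```python
def bayer_counts(total_mp: float) -> dict:
    """Split a sensor's megapixel count per the Bayer 2x2 pattern (half
    green, a quarter red, a quarter blue)."""
    return {"green": total_mp / 2, "red": total_mp / 4, "blue": total_mp / 4}

print(bayer_counts(12))  # {'green': 6.0, 'red': 3.0, 'blue': 3.0}
```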

  • If I scale my InDesign doc to one-eighth should I scan at 8x resolution?

    I have a 26'x7.5' banner I'm creating out of 3"x3" post-it notes.
    The post-it notes have handwriting on them, and I want to print the banner "life-sized" so each post-it note shows up as a 3"x3" square on the final 26'x7.5' banner.
    I've been scanning the post-it notes at 300 dpi and then I place them into my InDesign document, which is scaled down to one-eighth its final output size, so that it's easier to work with, i.e. it's 3.25-feet in InDesign right now, rather than 26-feet.
    1- If I scale my InDesign document down to one-eighth its output size (3.25 feet rather than 26 feet), should I increase the dpi of each image from 300 to 2400 dpi so that the resolution of the final output will remain 300 dpi?
    2- Does InDesign "link" the images and keep their resolution no matter if I scale the document down so that I can work with it more easily?
    When placing the post-it note images into the document scaled at one-eighth its size, I have to select Auto Fit and then decrease the height and width so they'll fit.
    3- Does this decrease the resolution from 300 dpi?
    Thanks!

    And so if I understand correctly, the image which is linked within an InDesign document will not be changed as I scale my design - it will merely increase in effective dpi as I scale down the design and decrease in effective dpi as I scale up?
    It's different than embedded images in Photoshop, right?
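The follow-up above boils down to one division: an image placed in a layout built at one-eighth size is effectively enlarged 8x at output, so the scan must carry 8x the dpi. A minimal sketch (the function name is hypothetical):

```python
def required_scan_dpi(output_dpi: float, layout_scale: float) -> float:
    """layout_scale is layout size / final output size (1/8 here)."""
    return output_dpi / layout_scale

# To keep 300 dpi on the full-size banner from a one-eighth layout:
print(required_scan_dpi(300, 1/8))  # 2400.0
```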

  • When importing a raster into illustrator, how to create stroke around it?

    Hi, thanks for reading this. I have an image of a lemon on a transparent background. I used a high-fidelity Live Trace and want to create a stroke around the entire lemon.
    When I select the lemon, it selects every single path, even the ones inside. Is there a way I can just create a 20 pt stroke border around the lemon easily?
    thanks!

    Trace a path around it, either in Photoshop or Illustrator; you can then either mask the object (and thus be able to put it on a background of varying sorts) or stroke the path and use it as you want.
    Or, if you mean the lemon is on a transparent background and it's in a format like PSD where the edges of the document don't exist, only the edges of the non-transparent objects, then you can try a cheat: add a drop shadow or outer glow with no blurring, set to 0 offset on both axes. You can try adding extra drop shadows or outer glows to increase the effect, although I am dubious that its outcome will be anything like you want.
    If you cut this lemon out in PS, did you use a path to do that? If so, you can copy that path into Illustrator and stroke it as mentioned above.
    Regards
    Paul
