Working with large-scale AI files

What is the best way to work with large-scale (wall mural) files that contain many gradients, shadows, etc.? Any manipulation takes considerable time to redraw. Files are customer-created, and we then manipulate them in-house. We have some fairly robust machines (Mac Pro towers, 14GB RAM, RAID scratch disk). I would guess there is a way to work in a mode that does not render effects, allowing for faster manipulation? Any help would be great and would considerably reduce our time.
First post, so apologies if I did something wrong with the question or title.
THX!

In a perfect world, customers would create their Illustrator artwork with the size and scale of the final image in mind. It's very difficult to get customers (who often have no formal graphic design training and are self-taught in Illustrator and Photoshop) to think about the basic rule of designing for the output device, a Graphics 101 sort of thing.
Something like a large wall mural, especially one reproduced outdoors, can get by just fine with Illustrator raster effects settings of 72ppi or even less. Many users will have a 300ppi setting applied. 300ppi is fine for a printed page viewed up close; it's sheer overkill for large-format use.
Mind you, Adobe Illustrator has a 227" x 227" maximum artboard size, so anything bigger than that has to be designed at a reduced scale with appropriate raster effects settings applied. For example, I'll design a 14' x 48' billboard at 1" = 1' scale. A 300ppi raster effects setting is fine in that regard: when blown up in a large-format printing RIP, the raster-based effects have an effective 25ppi resolution, but that's good enough for a huge panel viewed from speeding vehicles on city streets or busy highways.
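The scale arithmetic in that billboard example can be sketched as a one-liner (a hedged illustration of the math, not an Adobe formula):

```python
def effective_ppi(raster_effects_ppi, design_in, final_in):
    """Resolution the raster effects end up with once the RIP enlarges
    the artwork, e.g. a 1" = 1' design is blown up 12x at output."""
    return raster_effects_ppi / (final_in / design_in)

# 14' x 48' billboard designed at 1" = 1' with a 300ppi raster
# effects setting: a 12x enlargement leaves the effects at 25ppi.
print(effective_ppi(300, 1, 12))  # 25.0
```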
Outside of that, the render speed of vector-based artwork depends on its complexity. One gotcha I watch for is objects with large numbers of anchor points (anything above roughly 4,000). Past a certain point the RIP may process only part of the object or disregard it entirely.

Similar Messages

  • ADT Not Working With Large Amount Of Files

    Hi,
    I posted a similar issue in the Flash Builder forum; it seems these tools are not meant for anything too big. Here is my problem: I have an application that runs on the iPad just fine, but I cannot seem to package the resource files it uses, such as swf, xml, jpg, and flv. There are a lot of these files, around 1,000. This app won't be on the App Store; it's just used internally at my company. I tried to package it using Flash Builder, but it couldn't handle it, so I have now tried the adt command-line tool. The problem now is that it runs out of memory:
    Exception in thread "main" java.lang.OutOfMemoryError
            at java.util.zip.Deflater.init(Native Method)
            at java.util.zip.Deflater.<init>(Unknown Source)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:428)
            at com.adobe.air.ipa.IPAOutputStream.addFile(IPAOutputStream.java:338)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:273)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:247)
            at com.adobe.air.ADTOutputStream.addFile(ADTOutputStream.java:367)
            at com.adobe.air.ipa.IPAOutputStream.addFile(IPAOutputStream.java:161)
            at com.adobe.air.ApplicationPackager.createPackage(ApplicationPackager.java:67)
            at com.adobe.air.ipa.IPAPackager.createPackage(IPAPackager.java:163)
            at com.adobe.air.ADT.parseArgsAndGo(ADT.java:504)
            at com.adobe.air.ADT.run(ADT.java:361)
            at com.adobe.air.ADT.main(ADT.java:411)
    I tried increasing the Java heap size to 1400m, but that did not help. What is disturbing is why, in this day and age, the adt tool is written solely for 32-bit Java runtimes, when you can no longer buy a 32-bit computer. Almost everything is 64-bit now. Being able to use a 64-bit Java runtime would eliminate this.
    Has anyone else had to deal with packaging a large number of files within an iPad app?
    thanks

    Hi,
    Yes, the bug number is 2896433, logged against Adobe AIR. I also logged one against Flash Builder for basically the same issue, FB-31239. In the tests I have done here, both issues are the result of including a large number of files in the package. By reducing the number of files included, these bugs do not appear.
    I would like to try out the new SDK, but it won't be available for a few more weeks, correct?
    The main reason I want to include all these files is because the iPads will not always have a wireless connection.
    thx
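    Since the reply above confirms the bugs disappear when fewer files are included per package, one workaround is to split the asset list into smaller groups and package or deliver them in stages. A minimal sketch of the grouping step only; the file names and batch size are hypothetical, and how the batches are then shipped (several packages, or copying assets on first run) is up to you:

```python
def batch_assets(paths, batch_size):
    """Split a long asset list into smaller groups so each packaging
    pass handles fewer files (the workaround described above)."""
    return [paths[i:i + batch_size] for i in range(0, len(paths), batch_size)]

# ~1,000 resource files, grouped 200 at a time:
assets = [f"assets/clip_{n}.flv" for n in range(1000)]
batches = batch_assets(assets, 200)
print(len(batches))  # 5
```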

  • How do I work around large AIR Installer file for app updates?

    I develop AIR apps which often come with many included files (pdf, doc, swf, images, etc.). These get quite large; the total AIR installer size is measured in gigabytes. Distribution is via DVD. Currently I create sub-folders under my project and place the assets there. Works fine.
    The rub comes when I want to issue program updates. The AIR update process would be great if I could just update my main program swf, but what I'm finding is that even for a minor code update I still have to create the same multi-gigabyte AIR installer.
    If I were using a Visual Studio installer, I'd just include some sort of run-once step that would copy all the assets after the installer completed. Then, when building an update, it would skip the copy-all part. But I can't see any way to do something similar with the AIR installer.
    So, to summarize, I'd like to be able to issue updates to my main AIR SWF without including all the assets again.  Any thoughts?
    Thanks in advance,
    Greg.

    No, I don't have any recommendations. You would need to sign up for the AIR redistribution program. When you do this you gain access to some tools that help the process. http://www.adobe.com/products/air/runtime_distribution1.html
    The only warning is that you will have to create an installer for each of your target platforms (unless you find an installer product that is also cross-platform, of course).
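    The run-once copy step Greg describes from Visual Studio installers could look something like this on first launch. This is a Python sketch of the logic only (an AIR app would do this in ActionScript), and every path and the marker-file name are hypothetical:

```python
import shutil
from pathlib import Path

def copy_assets_once(dvd_assets, install_dir, marker=".assets_copied"):
    """Copy the bundled assets on first launch only, so later program
    updates can ship without the multi-gigabyte asset payload."""
    install_dir = Path(install_dir)
    flag = install_dir / marker
    if flag.exists():
        return False  # assets already in place; an update leaves them alone
    shutil.copytree(dvd_assets, install_dir / "assets", dirs_exist_ok=True)
    flag.touch()  # record that the one-time copy has happened
    return True
```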

  • Working with  large Avi (DivX) File - PE 2

    Hi there
    I'm trying to use PE2 to edit & generate scene markers in a 1.24GB avi file.
    The problem I'm having is that frames are very slow to update on the timeline, making it very difficult to work with, probably due to the file size and my low system specs.
    Any thoughts on how to best tackle this? Would it be best to split the avi file into more manageable pieces? If so, how would I do that?
    Any suggestions would be much appreciated.
    Thanks & regards
    Stephen

    Export the DivX clip as a DV-AVI file using File>Export>Movie. Then replace the DivX clip on the timeline with the DV-AVI clip.

  • Encore CS5 constantly locks up/freezes working with large H.264 file

    This is driving me mad. I'm trying to author my first Blu-ray with this program, a single-layer output with H.264/AVC video. I've encoded the video elsewhere and I have a 21GB .264 file. It's a BR-legal file, but as soon as I import it into Encore CS5 the fun and games begin. First, it takes a good five minutes to import, during which Encore appears to have crashed. It does finally import, and under 'Blu Ray transcode' it says 'Don't transcode', so it's a legal file. However, if I then try to do anything with the file in Encore, like adding it to a timeline, the whole program locks up again. Ten minutes later it's added as a timeline, but navigating that timeline to add chapters is impossible; the program locks up (Not Responding) if I try to move the slider or do pretty much anything. It makes authoring a BR with this file practically impossible.
    My PC spec is as follows:
    Core i5 2500k 3.3Ghz
    8GB DDR3 1600mhz RAM
    120GB Crucial SSD boot drive with Encore installed on
    500GB Hard Disk & 2TB Hard Disk
    Windows 7 64-Bit Home Premium
    Is there anything I can do to actually make Encore usable with this large file and finally finish authoring a disc? I've tried loading the file from all three drives and it's the same. The media cache is currently on my 2TB drive because I want to save space on my SSD and minimize writes to it.

    Taking 5 minutes to import is normal.  Encore has to index the entire file so it knows where the keyframes are.  Don't try to do anything while that's happening.
    Once in, however, Encore should respond normally.  At this point, I'm drawing a blank as to why it isn't.

  • Applying Oil Paint Filter to Large Scale Images

    I need to apply the effects available from the Oil Paint filter to very large, 80MB images. The filter works exactly as I need it to on small images, but not at large scale. A comment I heard in a Lynda.com video on the Oil Paint filter mentioned that it does not work well on large images. However, I REALLY need this, even if I have to pay someone to write a program that can do it! Does anyone know if/how I can get the filter to work on large images, or whether there is a third-party plug-in that provides the same results? Having this filter work on large-scale images could make or break a business idea I have, so finding a solution is extremely important to me.

    What's the problem you're having with applying it to an 80 MB image?  Is it that the effects don't scale up enough?
    Technically it can run on large images if you have the computer resources. I've just successfully applied it to an 80 MB image, and with the sliders turned all the way up it looks pretty oil-painty, though it kind of drops back into a realistic-looking photo when zoomed out.
    If it's just that the sliders can't go high enough, given that it's a very abstract look you're trying to achieve, have you considered applying it at a downsized resolution, then upsampling, then maybe applying it again?
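    For what it's worth, that downsize-filter-upsample workaround can be prototyped outside Photoshop. The sketch below works on a plain grid of pixel values and uses a simple mode filter as a crude stand-in for Oil Paint (which has no public API); it's only meant to show the shape of the pipeline:

```python
from collections import Counter

def downsample(img, factor):
    """Keep every factor-th pixel in each direction."""
    return [row[::factor] for row in img[::factor]]

def upsample(img, factor):
    """Nearest-neighbour enlargement back to the working size."""
    rows = [[px for px in row for _ in range(factor)] for row in img]
    return [row[:] for row in rows for _ in range(factor)]

def mode_filter(img, radius=1):
    """Most common value in each pixel's neighbourhood: a crude
    painterly stand-in for the real Oil Paint filter."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            window = [img[j][i]
                      for j in range(max(0, y - radius), min(h, y + radius + 1))
                      for i in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = Counter(window).most_common(1)[0][0]
    return out

def painterly_at_reduced_size(img, factor=2):
    """Downsize, filter, then upsample: the workaround suggested above."""
    return upsample(mode_filter(downsample(img, factor)), factor)
```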
    Oh, and by the way, Oil Paint has been removed from Photoshop CC 2014, so if you're planning a business based on automatically turning people's large photos into oil paintings, you should assume you'll be stuck running the older Photoshop version.
    -Noel

  • Working with large Artboards/Files in Illustrator

    Hello all!
    I'm currently designing a full size film poster for a client. The dimensions of the poster are 27" x 40" (industry standard film poster).
    I am a little uncertain in working with large files in Illustrator, so I have several problems that have come up in several design projects using similar large formats.
    The file size is MASSIVE. This poster uses several large, high-res images that I've embedded. I didn't want them to pixelate, so I made sure they were high quality. After embedding all these images, along with the vector graphics, the entire .ai file is 500MB. How can I reduce this file size? Can I do something with the images to make the .ai file smaller?
    I made my artboard 27" x 40" - the final size of the poster. Is this standard practice? Or when designing for a large print format, are you supposed to make a smaller, more manageable artboard size, and then scale up after to avoid these massive file sizes?
    I need to upload my support files for the project, including .ai and .eps - so it won't work if they're 500MB. This would be good info to understand for all projects I think.
    Any help with this would be appreciated. I can't seem to find any coherent information pertaining to this problem that seems to address my particular issues. Thank you very much!
    Asher

    Hi Asher,
    It's probably those high-res images you've embedded. First, make sure your images are only as large as you need them. Second, a solution would be to use linked images while you're working instead of embedding them in the file.
    Here is a link to a forum with a lot of great discussion about this issue, to get you started: http://www.cartotalk.com/lofiversion/index.php?t126.html
    And another: http://www.graphicdesignforum.com/forum/archive/index.php/t-1907.html
    Here is a great list of tips that someone in the above forum gave:
    -Properly scale files. Don't take a 6'x6' file and use the scaling tool to make it 2'x2'. Instead, scale it to 2'x2' in Photoshop and re-import it. Make a rule: anything over 20%, bring it back into Photoshop for rescaling.
    -Check resolutions. 600dpi is not going to be necessary for such-and-such printer.
    -Delete unused art. Sloppy artists may leave old, unused images underneath another image. The old one is not being used, but it still takes up space, unnecessarily inflating your file.
    -Choose to link instead of embed. This is your choice; either way you still have to send a large file, but linking is often fewer total MB than embedding. Linking also works well with duplicated images: multiple uses point to one original, whereas embedding would make copies.
    -When you are done, use compression software like ZIP or SIT (StuffIt):
    http://www.maczipit.com/
    Compression can reduce file sizes a lot, depending on the files.
    This business deals with a lot of large files. Generally people use FTP to send large files, or plain old CDs. Another option is segmented compression: something like WinRAR/MacRAR or DropSegment (part of StuffIt Deluxe) compresses files, then breaks them up into smaller, manageable pieces. This way you can break a 50MB file into, say, 10 x 5MB pieces and send them 5MB at a time.
    http://www.rarlab.com/download.htm
    *Make sure your client knows how to uncompress those files. You may want to link them to the site to download the software.
    Good luck!
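    The segmented-compression idea in those tips (minus the compression itself) boils down to splitting a file into fixed-size pieces. A sketch, assuming the recipient rejoins the parts in order, e.g. with `cat file.zip.* > file.zip`:

```python
from pathlib import Path

def split_file(src, chunk_size=5 * 1024 * 1024):
    """Break a large deliverable into fixed-size pieces (5MB by
    default, matching the example above) for sending in stages."""
    src = Path(src)
    parts = []
    with src.open("rb") as f:
        # read chunk_size bytes at a time until EOF
        for n, blob in enumerate(iter(lambda: f.read(chunk_size), b"")):
            part = src.with_name(f"{src.name}.{n:03d}")
            part.write_bytes(blob)
            parts.append(part)
    return parts
```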

  • Speed up Illustrator CC when working with large vector files

    Files that are mainly raster, up to 350 MB, run fast in Illustrator CC, while vector files of 10 MB are a pain in the *bleep* (e.g. zooming and panning). When opening a file it seems to freeze around 95% for a few minutes. Memory usage goes up to 6 GB; processor usage 30-50%.
    Are there ways to speed things up while working with large vector files in Illustrator CC?
    System:
    64 bit Windows 7 enterprise
    Memory: 16 Gb
    Processor: Intel Xeon 3.7 GHz (8 threads)
    Graphics: nVidia Geforce K4000

    Files with large numbers of vector points will strain the fastest of computers, but any speed increase we can get you can save you lots of time.
    Delete any unwanted stray points using Select > Object > Stray Points.
    Optimize performance | Windows
    Did you draw this yourself? Is the file as clean as can be? Are there any repeated paths underneath your art, left over from Live Trace or stock art sites, that don't need to be there?
    Check Control Panel > Programs and Features, sort by recently installed, and uninstall anything suspicious.
    Sorry, there will be no short or single answer to this. As per the previous poster, using layers effectively and working in Outline mode when possible might be the best you can do.

  • Photoshop CS6 keeps freezing when I work with large files

    I've had problems with Photoshop CS6 freezing on me and giving me RAM and scratch disk alerts/warnings ever since I upgraded to Windows 8. This usually only happens when I work with large files; however, once I work with a large file, I can't seem to work with any file at all that day. Today I received my first error in which Photoshop says it has stopped working. I thought that posting the event info about the error might help someone help me. The log info is as follows:
    General info
    Faulting application name: Photoshop.exe, version: 13.1.2.0, time stamp: 0x50e86403
    Faulting module name: KERNELBASE.dll, version: 6.2.9200.16451, time stamp: 0x50988950
    Exception code: 0xe06d7363
    Fault offset: 0x00014b32
    Faulting process id: 0x1834
    Faulting application start time: 0x01ce6664ee6acc59
    Faulting application path: C:\Program Files (x86)\Adobe\Adobe Photoshop CS6\Photoshop.exe
    Faulting module path: C:\Windows\SYSTEM32\KERNELBASE.dll
    Report Id: 2e5de768-d259-11e2-be86-742f68828cd0
    Faulting package full name:
    Faulting package-relative application ID:
    I really hope to hear from someone soon, my job requires me to work with Photoshop every day and I run into errors and bugs almost constantly and all of the help I've received so far from people in my office doesn't seem to make much difference at all.  I'll be checking in regularly, so if you need any further details or need me to elaborate on anything, I should be able to get back to you fairly quickly.
    Thank you.

    Here you go Conroy.  These are probably a mess after various attempts at getting help.

  • Large Scale Digital Printing Guidelines

    Hi,
    I'm trying to get a better handle on the principles and options for creating the best large and very large-scale prints from digital files. I'm more than well versed in the basics of Photoshop and color management, but there remain some issues I've never dealt with.
    It would be very helpful if you could give me some advice about this issue that I've divided into four levels.  In some cases I've stated principles as I understand them.  Feel free to better inform me.  In other cases I've posed direct questions and I'd really appreciate professional advice about these issues, or references to places where I can learn more.
    Thanks a lot,
    Chris
    Level one – Start with the maximum number of pixels possible.
    Principle: Your goal is to produce a print without interpolation at no less than 240 dpi.  This means that you need as many pixels as the capture device can produce at its maximum optical resolution.
    Level two – Appropriate Interpolation within Photoshop
    Use the Photoshop Image Size box with the appropriate interpolation setting (Bicubic Smoother) to increase the image size up to the maximum size of your ink jet printer.
    What is the absolute minimum resolution that is acceptable when printing up to 44”?
    What about the idea of increasing your print size in 10% increments? Does this make a real difference?
    Level three - Resizing with vector-based applications like Genuine Fractals?
    In your experience, do these work as advertised, and do you recommend them for preparing files to print larger than the Epson 9900?
    Level four – Giant Digital Printing Methods
    What are the options for creating extremely large digital prints?
    Are there websites or other resources you can point me to where I can learn more about this?
    How do you prepare files for very large-scale digital output?
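    A couple of the numbers in Levels one and two can be sanity-checked with simple arithmetic. The sketch below also shows what a '10% increments' schedule would look like; whether stepped interpolation actually beats a single Bicubic Smoother resize is debated, so treat it as an experiment rather than a rule:

```python
def required_pixels(print_inches, ppi=240):
    """Pixels needed on a side for uninterpolated output (Level one)."""
    return round(print_inches * ppi)

def stepped_sizes(start_px, target_px, step=1.10):
    """The '10% increments' schedule from Level two: enlarge in small
    repeated steps instead of one big jump, stopping at the target."""
    sizes = [start_px]
    while sizes[-1] < target_px:
        sizes.append(min(round(sizes[-1] * step), target_px))
    return sizes

# A 44" side at 240ppi needs 44 * 240 = 10,560 px.
print(required_pixels(44))         # 10560
print(stepped_sizes(8000, 10560))  # [8000, 8800, 9680, 10560]
```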

    While what you say may be true, it is not always the case. I would view a 'painting' as more than a 'poster' in terms of output resolution, at least in the first stages of development. Definitely get the info from your printer, then plan to use your hardware/software setup to give you the most creative flexibility. In other words, work as big as you can (within reason; see the previous statement) to give yourself the most creative freedom. Things like subtle gradations and fine details benefit from more pixels, and with the right printer they can be transferred to hard copy at higher resolutions (a photo-quality inkjet will take advantage of 600ppi) if that's what you're going for.
    Additionally, it's much easier to downscale than to wish you had a bigger image after 100 hours of labor...

  • ELearning for Large Scale System Change

    Our next project is training for a large scale company wide system upgrade. Our users are very dependent on the software that will be replaced and several departments will need to learn their job anew. Any suggestions on how to use eLearning to increase adoption and comprehension of the new software?

    Hi Lincoln,
    I've worked on a number of large-scale IT change projects for international clients. I can make a few suggestions, some Captivate related, some more general eLearning related.
    On projects like this I tend to produce three types of training: face-to-face, interactive tutorials/simulations & job aids. Ideally the three are planned together, allowing you to create a single instructional design plan. You want people to be introduced to the system, learning to be reinforced and then everybody to be supported.
    The face-to-face training usually contains lots of show and tell, where the users are shown how to do tasks and then have a go at doing this themselves. Ideally a number of small tasks are shown and repeated, then they are all brought together by getting the learners to follow realistic scenarios that require all of the discrete tasks to be performed together. I find that lots of training doesn't integrate deeply with people's real-life jobs and the use of real world scenarios helps to improve retention and performance. I have made materials where the show-and-tell pieces have been pre-recorded, which you can do with Captivate.
    The interactive tutorials are usually used as follow-on material to the face-to-face modules, allowing learners to go through simulations with guidance when they do something unexpected, though sometimes there is no face-to-face training and the interactive tutorials have to deliver all of the teaching. Sometimes these include sections that merely show users and ask questions afterwards. Sometimes they become very complex branching scenarios. I usually build all of this in Captivate, though I do find it quite buggy and frustrating at times.
    Finally, I build small job aids. These are very specific and show how to do a well-defined function, such as changing an existing customer's address. Sometimes these are Captivate movies, sometimes they are PDF files, often they are implemented as both. They can be embedded in and/or linked from the system help screens and FAQs, as well as used in support responses and post-training emails. The movies tend to be short and sweet: 30-120 seconds long.
    In an ideal world the number of job aids grows rapidly after implementation in response to user support requests, though in reality I often have to anticipate what users are going to find difficult and create all of these prior to launch.
    If you are going to use Captivate for your project, then I suggest that you test, test and test again before agreeing what you will deliver. It's a great bit of software, in theory, but it is quite buggy. I'm working on a project with CP6.1 and I'm having lots of audio sync problems and video corruption issues publishing my final work as MP4 videos.
    In terms of effort, my rule of thumb is 20% planning, 60% design and scripting and 20% implementation.
    I hope this helps,
    David

  • Typical/Common large-scale ACE deployment or designs?

    I am deploying several ACE devices and GSS devices to facilitate redundancy and site load balancing at a couple of data centers.  Now that I have a bunch of experience with the ACE and GSS, are there typical or common ACE deployment methods?  Are there reference designs?  I have been looking, and haven't really found any.
    Even if they are not Cisco 'official' methods, I'm wondering how most people, particularly those who deploy a lot of these or deploy them with large-scale systems, typically do it.
    I'm using routed mode (not one-arm mode) and I'm wondering if most people use real server (in my case, web servers) with dual-NICs to support connectivity to back-end systems?  Or do people commonly just route it all through the ACE?
    Also, how many VIPs and real servers have been deployed in a single ACE 4710 device?  I'm trying to deploy about 700 VIPs with about 1800 Real Servers providing content to those VIPs.
    How do people configure VIPs, farms, and sticky? I'm looking for how someone who wants to put a large number of VIPs and real servers into the ACE would succeed at doing it. I have attempted to add a large number in the 'global' policy-map, but that uses too many PANs.
    I have tried a few methods myself and have run into the limit on Policy Action Nodes (PANs) in the ACE device. Has anyone else hit this issue? Any tips or tricks on how to use PANs more conservatively?
    Any insight you can share would be appreciated.
    - Erik

    As far as I can see from your requirements, I suggest you create one EAR file for your portal and one EAR file per module.
    The EAR file for your portal is the main application, and the EAR files of your modules are shared libraries that contain the taskflows. These taskflows can be consumed in the portal application.
    This way, you can easily deploy one module without needing to redeploy the main application or the other modules.
    It also lets you divide your team of developers so everybody can work on a separate module without interfering.
    On a side note: once you have deployed your main application, if you later create a new module, you have to register that module with your application, so you will need to redeploy your portal; but if you update an existing module, you won't need to redeploy your portal.
    As for security, all your modules will inherit the security model of your portal application.

  • Using Smart Objects in large scale documents.

    I'm creating a large-scale banner (12' by 8') which includes a photographic background image with vector-based logo elements and typography on top of it. I am creating the document in Photoshop, and wanted to know whether Smart Objects are the best solution for placing the logo elements and typography; I don't know how they will be affected when the image is scaled for printing.

    In that case a Vector Smart Object (File > Place or pasting the Illustrator content) should work fairly well.
    One thing to keep in mind is that in Illustrator, raster effects (Drop Shadows, Filters, Color Meshes, …) have a resolution determined under Effect > Document Raster Effects Settings.
    If that resolution is low, upscaling in Photoshop may reveal noticeable pixels.
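    Working backwards from that warning, you can estimate the Document Raster Effects Settings value needed before placing the artwork. The 50ppi target below is an assumed large-format figure, not an Adobe recommendation; confirm the real requirement with your printer:

```python
def raster_effects_setting(target_output_ppi, enlargement):
    """Document Raster Effects resolution needed in Illustrator so the
    effects still reach the target ppi after the Vector Smart Object
    is scaled up in Photoshop."""
    return target_output_ppi * enlargement

# Banner enlarged 4x from the Illustrator file, aiming for ~50ppi:
print(raster_effects_setting(50, 4))  # 200
```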

  • Large scale displays

    I have a project that requires creating an 8'x26' (one-way see-thru) vinyl window film for display.
    The sign will include both text and large-scale stock photo images. I have never developed anything that large for print before, and I'm concerned about pixelation or blurred images. Does anyone have a suggestion on what size images and what file format to use for the final production art?

    Like Monika, I recommend talking to the print vendor first; that goes for any job. They will tell you what they need. I can also recommend you work primarily in an .ai file for construction of the window display.
    Look at it as a few stages of construction: Stage 1 = .ai file; Stage 2 = .eps file; Stage 3 = PDF file. The reason I recommend doing it this way is that you can do all of your layer effects, typography, and image placement in the .ai file (I refer to it as your "live" file). Work at 25% of final size (in your case, 24"h x 78"w). Make sure your document color space is CMYK (and your image files too). Set the document raster effects resolution to High if it isn't already. Place images at 100% in the scaled version at 300 to 400ppi (depending on print vendor requirements). There should not be any visible pixelation or fuzziness at the final size (again, depending on print vendor capabilities).
    Save the original .ai as a copy to EPS (this will flatten the file). In the EPS, convert the text to outlines (do not forget to keep the original .ai file with live text elements in case you need to edit the text in the future). Double-check all of your overprints and make sure white elements knock out. Distill (or Print > Save as PDF) the EPS to PDF using the High Quality Print or Press setting, leaving color unchanged.
    Use this info as a guide. Final file requirements are to be determined by the print vendor.
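    The resolution bookkeeping for a 25% working file is worth making explicit. This is a sketch of the arithmetic only; what counts as acceptable final ppi depends on the vendor and the viewing distance:

```python
def final_ppi(placed_ppi, design_scale):
    """Effective resolution at full size when the artwork is built at
    a reduced scale (design_scale=0.25 for a 25% layout)."""
    return placed_ppi * design_scale

# 300-400ppi images placed in the quarter-scale file print at
# 75-100ppi at full size, typical for a display viewed at a distance.
print(final_ppi(300, 0.25), final_ppi(400, 0.25))  # 75.0 100.0
```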

  • Illustrator needs a Relative mode switch or ability to set a Working directory for each AI file!

    Hi guys,
    I thought I would post here because I've found myself wanting this feature more and more over the past year.
    This can apply to most of Adobe's applications, but it would be great to see some file management functionality built into the applications themselves. This could then be further expanded in Bridge/Version Cue to incorporate more applications into a working project, but at the very least the application itself should be able to control its own root location.
    On so many occasions I need Illustrator's file linking to be relative, NOT absolute, which it currently is. The only relative capability it has is when ALL the linked assets are in the same local directory as the AI file itself, and this, as everyone knows, breaks the first rule of proper file management.
    Many of the projects I work on use many AI files, sometimes with over 50 linked assets in a variety of formats from a variety of clients (locations). I cannot easily move such a project folder around without the painful task of relinking all the data, and this creates a massive issue.
    Ideally, I'd like an approach similar to Autodesk's Maya and Max, where you can set project directories. All Save As and other operations then default to that working directory for the file, and all linked data uses the project's working directory as its root, allowing the project to move from location to location unaffected.
    I'm sure this would benefit many other designers who work on projects with large amounts of data that may not always live in the same spot. We work with so many different people during a production that mobility is a must, and at the moment we are finding it very difficult.
    As always, thanks for reading.
    Cheers
    Nick.
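    The project-root behavior Nick is asking for boils down to resolving links against a movable root instead of storing absolute paths. A sketch of that resolution rule (all paths here are hypothetical):

```python
from pathlib import Path

def resolve_link(project_root, link):
    """Resolve a linked asset against a movable project root. Absolute
    links pass through untouched; relative ones follow the project
    folder wherever it is copied."""
    link = Path(link)
    return link if link.is_absolute() else Path(project_root) / link

print(resolve_link("/Volumes/Jobs/PosterA", "links/logo.eps"))
```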

    Hello,
    First command
    So what does the wiki mean by: "DEVPATH sets the physical device. You can determine this by executing the command
    readlink -f /sys/class/hwmon/hwmon-device/device | sed -e 's/^\/sys\///'"
    The physical device of what, and what output am I supposed to get: nothing, a list showing where the symlinks lead, something else? In my case I got no output; whether that is right, I do not know, since I do not understand what the command does.
    Second command
    DEVNAME: Sets the name of the device. Try:
    sed -e 's/[[:space:]=]/_/g' /sys/class/hwmon/hwmon-device/device/name
    Does it mean like this: if during my pwmconfig I used hwmon1, which was coretemp, and hwmon2, which was nct6775, should I do:
    sed -e 's/[[:space:]=]/_/g' /sys/class/hwmon/hwmon1-device/device/name coretemp
    sed -e 's/[[:space:]=]/_/g' /sys/class/hwmon/hwmon2-device/device/name nct6775
    And will that direct every occurrence of hwmon1 and hwmon2 to the correct sensor chip?
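    For what it's worth: in the wiki text, `hwmon-device` is a placeholder for your actual node (hwmon1, hwmon2, ...), the first pipeline prints a path (so getting no output suggests the placeholder wasn't substituted), and the second sed command reads the chip's `name` file; it does not take the chip name as a trailing argument. A Python sketch of what the two commands compute (the /sys paths are examples):

```python
import os
import re

def strip_sys(resolved):
    """The sed step of the first pipeline: drop a leading '/sys/'."""
    return resolved[len("/sys/"):] if resolved.startswith("/sys/") else resolved

def devpath(hwmon):
    """What `readlink -f ... | sed` computes: the hwmon node's
    symlink target with the leading /sys/ removed."""
    return strip_sys(os.path.realpath(f"/sys/class/hwmon/{hwmon}/device"))

def devname(raw_name):
    """The second command: the contents of the chip's `name` file
    with spaces and '=' replaced by underscores."""
    return re.sub(r"[\s=]", "_", raw_name.strip())
```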
    Regards
    Martin
    Last edited by onslow77 (2015-01-23 21:46:04)
