How to produce large files (100 meters and more)?

We are currently working on an architecture project for which we have to produce printed curtains/screens. The longest curtain
is about 180 meters (600 ft). The pattern runs across the whole 180 meters (color gradients, among other things), which makes it hard to split the files into pieces.
Our printing company suggested that we produce everything at a 1:10 scale (1 meter becomes 10 centimeters) and at 600 dpi; they will scale it up with
their RIP unit and print it at 1:1. The final print resolution is therefore 60 dpi.
As a start we are doing a 25 meter x 2.5 meter file - which is already super hard on the computers (all new Macs with 16 GB RAM and Adobe Creative Cloud).
Is there any way of working as with video - with online and offline files - so that we work at a smaller scale and then repeat all the steps on the larger images?
We want to get our workflow optimized, as we have to produce 3000 meters of image.
Any ideas?
Thanks
Georg Kettele

First thing that comes to mind is to ask:  Do you really need that high a resolution?  Are people really going to be looking at actual print detail from just a few meters away on a project that's nearly 200 meters wide?  In other words, are you actually creating fine image detail that matters at the 60 ppi scale?
The second thing that occurs to me is that  a typical computer with just 16GB of RAM doesn't seem a good fit for working on gargantuan images.  You've avoided doing the math that matters for the readers here, so I'll do it:
60 pixels/inch * 39.37 inches/meter * 180 meters == 425196 pixels width
Assuming your full length print will also be 2.5 meters:
60 pixels/inch * 39.37 inches/meter * 2.5 meters == 5906 pixels height
So your proposed image at this scale will be 425196 x 5906 pixels == 2.5 gigapixels.  Assuming you're using 8 bits/channel mode RGB, that's 7.5 gigabytes to store the image exactly once in RAM.
Keeping in mind you'll need to actually WORK on this image in Photoshop, you'll need at least 10x that RAM (for History states, room to run various tools, etc.) and a LOT of super fast disk storage to be able to swap data to the scratch file.  Numbers like 96 GB of RAM and several TB of SSD array storage space come to mind to even begin to think about working on such a huge image. 
Any image that takes 7.5 gigabytes in RAM is going to be slow to work on no matter what computer you have today.  And if you're hoping to work on an even taller image than 5906 pixels, then you start to get into the realm of computers that haven't been invented yet.
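The arithmetic above is easy to check in a few lines. A quick sketch (the height figure 60 * 39.37 * 2.5 = 5905.5 is rounded up to 5906, as in the reply; decimal gigabytes are used, matching the 7.5 GB figure):

```java
public class PrintSize {
    public static void main(String[] args) {
        // 60 ppi * 39.37 inches/meter * 180 meters, as in the reply above
        long width = Math.round(60 * 39.37 * 180);   // 425196 px
        long height = 5906;                          // 60 * 39.37 * 2.5 = 5905.5, rounded up
        long pixels = width * height;                // about 2.5 gigapixels
        double gigabytes = pixels * 3.0 / 1e9;       // 8 bits/channel RGB = 3 bytes/pixel
        System.out.printf("%d x %d px, %.1f gigapixels, %.1f GB per in-RAM copy%n",
                width, height, pixels / 1e9, gigabytes);
    }
}
```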
Third thing I'm thinking of is that you may have to divide it into smaller chunks and deal with manually aligning things you want to match up.
You could start working at a MUCH smaller scale, lay the groundwork for the layout using things (e.g., paths/shapes) that scale up nicely, then divide the image up for the finish work.
-Noel

Similar Messages

  • How to upload large file with http via post

    Hi guys,
Does anybody know how to upload a large file (>100 MB) from an applet to a servlet over HTTP via the POST method? Thanks in advance.
    Regards,
    Mark.

    Hi SuckRatE
Thanks for your reply. Could you give me some client-side code to upload a large file? I use URL to connect to the server, and it throws an out-of-memory exception. Part of the client code is below:
    // connect to the servlet
    URL theServlet = new URL(servletLocation);
    URLConnection servletConnection = theServlet.openConnection();
    // inform the connection that we will send output and accept input
    servletConnection.setDoInput(true);
    servletConnection.setDoOutput(true);
    // don't use a cached version of the URL connection
    servletConnection.setUseCaches(false);
    servletConnection.setDefaultUseCaches(false);
    // specify the content type of the data we will send
    servletConnection.setRequestProperty("Content-Type", "application/octet-stream");
    // send the file to the servlet
    OutputStream outStream = servletConnection.getOutputStream();
    FileInputStream filein = new FileInputStream(largeFile);
    // BufferedReader in = new BufferedReader(
    //         new InputStreamReader(servletConnection.getInputStream()));
    // System.out.println("tempCurrent = " + in.readLine());
    byte[] abyte = new byte[2048];
    int cnt;
    while ((cnt = filein.read(abyte)) > 0) {
        outStream.write(abyte, 0, cnt);
    }
    filein.close();
    outStream.flush();
    outStream.close();
    Regards,
    Mark.
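The out-of-memory error usually comes from URLConnection buffering the entire request body before sending it. If your JDK is reasonably modern, HttpURLConnection.setChunkedStreamingMode turns that buffering off. A rough sketch (the URL is a placeholder and error handling is omitted):

```java
import java.io.*;
import java.net.*;

public class ChunkedUpload {

    // Copy everything from in to out in fixed-size chunks.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) > 0) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://example.com/upload").openConnection(); // placeholder URL
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        // The key line: stream the body instead of buffering it all in memory.
        conn.setChunkedStreamingMode(8192);
        try (OutputStream out = conn.getOutputStream();
             InputStream in = new FileInputStream(args[0])) {
            System.out.println("Sent " + copy(in, out) + " bytes");
        }
        System.out.println("Server responded: " + conn.getResponseCode());
    }
}
```

Chunked transfer encoding requires that the servlet side accept it, but it keeps client memory use constant regardless of file size.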

  • How to strip a file name and add it to array

Hello, I am new to LabVIEW, and I need to know how to strip a file name and add it to an array. For example, from the filename joe.csv, take "joe" and add it to an array with other inputs. A visual depiction would be very helpful.
    Attachments:
    channel read and error check R03_v9(JHK).vi ‏101 KB

    What does your code attachment have to do with your question?
    What are "other inputs"?
I assume you have an array of strings. You can strip the filename using Get File Extension. You can build an array using Build Array. Wire the existing array to one input and the string to a second input.
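For anyone reading along in a text language rather than LabVIEW, the same two steps (strip the extension, build up the array) look like this; the class and method names here are just illustrative:

```java
import java.util.*;

public class StripName {
    // Return the file name without its extension ("joe.csv" -> "joe").
    static String stripExtension(String fileName) {
        int dot = fileName.lastIndexOf('.');
        return dot > 0 ? fileName.substring(0, dot) : fileName;
    }

    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        names.add(stripExtension("joe.csv"));   // append each base name to the array
        names.add(stripExtension("data.txt"));
        System.out.println(names);              // [joe, data]
    }
}
```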
    LabVIEW Champion . Do more with less code and in less time .

  • How to upload a file which has more than 999 line item  through BDC ?

    Hello Techards
    Hi to all
Can anybody tell me how to upload a file which has more than 999 line items through BDC for transaction F-02?
    Thanks in advance.
    Shovan

    Hello Shovan,
    You split it up to post two accounting documents with the help of a "suspense" a/c.
    Say, you have to post the following line items below:
    line 1 - dr. - GL a/c X - $1000
    line 2 - cr. - GL a/c Y - $1
    line 3 - cr. - GL a/c Y - $1
    ...
    line 1001 - cr. - GL a/c Y - $1
    You cannot post the above as a single doc in SAP (because of technical reasons), so you need to break it up into 2 documents as below:
    Doc1
    line 1 - dr - GL a/c X - $1000
    line 2 - cr - GL a/c Y - $1
    line 3 - cr - GL a/c Y - $1
    ...
    line 998 - cr - GL a/c Y - $1
    line 999 - cr - SUSPENSE a/c - $3
    Doc2
    line 1 - dr - SUSPENSE a/c - $3
    line 2 - cr - GL a/c Y - $3
    Note that there is no incorrect impact on accounting as first we credit suspense a/c by $3 and next we debit the same suspense a/c by $3 as a result the effect is nil. Similarly, we credit $997 to GL a/c Y (which is less by $3) in the first doc which is compensated by the second doc by crediting the shortfall of $3.
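The balancing logic above can be sanity-checked with a little arithmetic (a sketch in Java; the 999-line limit and the $1000/$1 amounts are taken straight from the example):

```java
public class SuspenseSplit {
    public static void main(String[] args) {
        int maxLines = 999;          // SAP document line-item limit
        int debit = 1000;            // line 1: dr GL a/c X $1000
        int creditLines = 1000;      // 1000 credits of $1 against GL a/c Y

        // Doc1: the debit line, as many $1 credits as fit, and a suspense credit.
        int creditsInDoc1 = maxLines - 2;        // 997 (room after the dr and suspense lines)
        int suspense = debit - creditsInDoc1;    // $3 parked on the suspense a/c
        // Doc2: dr suspense $3, cr GL a/c Y for the remaining credits.
        int creditsInDoc2 = creditLines - creditsInDoc1;   // $3 still owed to GL a/c Y

        System.out.println("Doc1 suspense credit: $" + suspense);
        System.out.println("Doc2 clears: $" + creditsInDoc2);
        // Net suspense impact is nil and total credits still equal the debit:
        System.out.println("Balanced: " + (creditsInDoc1 + creditsInDoc2 == debit));
    }
}
```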
    Hope this helps,
    Cheers,
    Sougata.

  • How to transfer large files(1GB) to pc

    How to transfer large files(1GB) to pc

Or possibly alternatively, and if really desperate, upload it to a file distribution service like Fileserve or Rapidshare and then download it from the other machine (and then delete the upload).  Many of these services have a free mode, although they may have size limitations and they certainly throttle the download speed (Rapidshare may be one of the worst, Fileserve one of the best).  Personally I never tried something the size of 3GB.  What I do see is that stuff that large is generally broken up into multiple files to get around the size limitations, to be glued together when all the parts are downloaded.
Just throwing this "out there" as an alternative, but as I said, you probably would need to be really desperate to go this route.

  • Transfer of large files to and from Egnyte failing...

    One of my clients uses Egnyte for file management. For a typical job I will usually be required to download 5GB of files and upload 1.5GB.
    However, when at home, transfer of large files to and from Egnyte will often fail. (On download, Chrome gives the error message: "connection failed". Uploading, Egnyte's error message is: "HTTP error").
    I have three machines at home. Two Macs (running Yosemite and Lion) and a PC running Windows 7. I've had no luck with any of them on any browser but when using other people's broadband I have no problem at all (using my MacBook).
    I have no firewalls running. Yes, I've turned everything on-and-off-again. So that leaves me to think that the problem lies with my BT Homehub 4 router. But why would my router be botching the transfer of large files? I've switched the router's firewall off, tried adding my Mac to DMZ (whatever that is) but that seems to be the most I can do. Ethernet is no different to wireless.
I've not noticed this problem when using other file transfer sites (like WeTransfer).
    What's going on?
    Please help!

    From my own experience (I admin a few gaming servers and often get disconnections from them in the middle of monitoring operations) and based on other users experiences here on the forums I suspect BT have been having some core infrastructure issues which can lead to A) intermittent packet loss B) extended packet delay - both of which can cause servers to assume a 'failure' and disconnect or suspend upload/download.
I don't know what package you are on from BT (I'm on Infinity 2). As it's Hogmanay, I'm the one that drew the short straw to keep cheaters off our servers, so I'm a bit intoxicated and may not make total sense at the moment.
    https://community.bt.com/t5/BT-Infinity-Speed-Connection/BT-Infinity-issues-for-the-last-few-days/td...
    ^^ this thread illustrates issues that people have been having over the last few weeks.
This probably won't help - but it might make you aware that you aren't alone in ONGOING issues.
    Happy New Year !

  • How are the .7z files used and installed for photoshop elements and premier elements?

    How are the .7z files used and installed for photoshop elements and premier elements?

    Hi,
    You can try to extract them to a folder and then try to install using the setup file.
Or you can try to download the .exe file (small file) for the software you have and then run the .exe file; it will start the installation. Both the .exe and .7z files for the application should be in the same location.
    Download Photoshop Elements products | 9, 8, 7
    Download Photoshop Elements products | 11, 10
    Download Premiere Elements products | 11, 10
    Download Premiere Elements products | 9
    *** You can either download the .exe file from the above download links or you can also download both .exe (File 2 of 2) and .7z (File 1 of 2) from the above links.

  • Need your suggestions - how to display large file in ASCII and HEX

    Hello,
    I want to create an application which can read in a large file and switch between displaying ASCII and HEX (formatted a particular way). There are two problems here that I'm not quite sure how to solve.
1. How to switch dynamically between ASCII and HEX. Should the HEX formatter be in the document of the JTextArea (or equivalent), or somewhere in the view? If it's in the view then where? I'd rather not read in the file more than once.
    2. How to do some kind of paging scheme for huge files. I'd like to read in part of the file and display it, then when the user scrolls to another area, read in that part of the file.
    Thanks!
    Jeff

You can iterate over all the characters in the String using String.charAt, cast the chars to ints, and call Integer.toHexString(...).
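For the ASCII-to-hex part, a minimal sketch of the conversion (the paging design for huge files is a separate problem; this only shows the per-character formatting):

```java
public class HexView {
    // Render a string as space-separated two-digit hex values of its characters.
    static String toHex(String s) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < s.length(); i++) {
            if (i > 0) sb.append(' ');
            sb.append(String.format("%02x", (int) s.charAt(i)));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toHex("Hi!"));  // 48 69 21
    }
}
```

Keeping a conversion like this in the view (and re-rendering the currently visible page on demand) avoids reading the file more than once.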

  • How to do backup large files quickly and compress the backup without taking up much space

I am a member of a large business company, and there is a large amount of data stored on our company computers. To protect the data, I want to back up my important work documents, pictures, spreadsheets, etc. You may say the built-in backup tool can solve
this problem, but the data is substantial, and I want a simple, fast, and safe way to back it up. I also want a tool that can compress my large data. Could you help me? Any free software?

    To save disk space on a volume, you could enable NTFS compression on the folder.
To back up a folder, you can use the Windows built-in functionality. You could also copy the files to another location using a tool that only copies changes (like the built-in robocopy or the Microsoft-developed SyncToy).
http://en.wikipedia.org/wiki/List_of_backup_software lists third-party backup software, some of which is free. The main advantages of using real backup software are that you can have complex schedules
(differential backups, the grandfather-father-son principle) and easier management and monitoring.
The 'previous versions' feature is a good way to retain multiple versions of files without bloating storage requirements. It is not a real backup solution, but it often aids in developing a backup strategy that is both flexible and safe.
    MCP/MCSA/MCTS/MCITP

  • How to list long file names and paths longer than 260 characters in vbscript?

    Hi there,
    I have looked in different posts and forums, but couldn't find an answer to my question.
    So, here it is... imagine the small piece of code attached below. I need to iterate and rename recursively (I didn't include the recursive part of the code in order to keep things simple) through files and folders in a user-selected path. I thought that by using the VBScript FileSystemObject I would be able to do so.
    Unfortunately, it seems the Files method does not detect files whose path is longer than the OS's 260 (or 256?) character limit. How could I do it? Please don't tell me to use something different from VBScript :-)
    Thanks in advance!
    Set FSO = CreateObject("Scripting.FileSystemObject")
    Set objFolder = FSO.GetFolder(Path)
    Set colFiles = objFolder.Files
    For Each objFile In colFiles
      Wscript.Echo "File in root folder: " & objFile.Name
    Next

Yes, that describes the problem, but not the solution - short of changing the names of the folders. There is another solution: map a logical drive to the folder so that the path to subfolders and files is shortened. Here is a scripted approach that I use to map folders with long names to a drive letter and then open a console window at that location ...
    Dim sDrive, sSharePath, sShareName, aTemp, cLtr, errnum
    Const sCmd = "%comspec% /k color f0 & title "
    Const bForce = True
    if wsh.arguments.count > 0 then
      sSharePath = wsh.arguments(0)
      aTemp = Split(sSharePath, "\")
      sShareName = aTemp(UBound(aTemp))
      cLtr = "K"
      with CreateObject("Scripting.FileSystemObject")
        Do While .DriveExists(cLtr) and UCase(cLtr) <= "Z"
          cLtr = Chr(ASC(cLtr) + 1)
        Loop
      end with
      sDrive = cLtr & ":"
    Else
      sDrive = "K:"
      sSharePath = "C:\Documents and Settings\tlavedas\My Documents\Script\Testing"
      sShareName = "Testing"
    end if
    with CreateObject("Wscript.Network")
      errnum = MakeShare(sSharePath, sShareName)
      if errnum = 0 or errnum = 22 then
        .MapNetworkDrive sDrive, "\\" & .ComputerName & "\" & sShareName
        CreateObject("Wscript.Shell").Run sCmd & sDrive & sShareName & "& " & sDrive, 1, True
        .RemoveNetworkDrive sDrive, bForce
        if not errnum = 22 then RemoveShare(sShareName)
      else
        wsh.echo "Failed"
      end if
    end with
    function MakeShare(sSharePath, sShareName)
    Const FILE_SHARE = 0
    Const MAX_CONNECT = 1
      with GetObject("winmgmts:{impersonationLevel=impersonate}!\\.\root\cimv2")
        with .Get("Win32_Share")
          MakeShare = .Create(sSharePath, sShareName, FILE_SHARE, MAX_CONNECT, sShareName)
        end with
      end with
    end function
    sub RemoveShare(sShareName)
    Dim cShares, oShare
      with GetObject("winmgmts:{impersonationLevel=impersonate}!\\.\root\cimv2")
        Set cShares = .ExecQuery _
          ("Select * from Win32_Share Where Name = '" & sShareName & "'")
        For Each oShare in cShares
          oShare.Delete
        Next
      end with
    end sub
The SHARE that must be created for this to work is marked as allowing only one connection (the calling procedure).  The mapping and share are automatically removed when the console window is closed.  In the case cited in this posting, this approach could be applied using the subroutines provided, so that the user's script creates the share/mapping, performs its file operations, and then removes the mapping/share.  Ultimately, it might be more useful to change the folder structure, as suggested in the KB article, since working with the created folders and files will still be problematic unless the drive mapping is in place.
Tom Lavedas

  • Large File Reading and Processing

    Hi:
    Suppose I have a file much larger than my computer's memory e.g. say on a Windows XP system, the memory is 256 MB RAM and the file size is 1 GB on disk. Now if I want to create a File object and read from this file, is this possible to do? If yes, how about the performance?
    I understand that this is more of an Operating System question but nevertheless it affects the Java process.
    Thanks!
    Rahul.

    If you mean, you plan to read the whole file into memory, then:
    Theoretically possible, sure. The process would be much bigger (in terms of memory used) than the amount of memory available, which means that it would start swapping out to disk, and that would reduce performance, although to what extent depends on XP's virtual memory implementation, which I know nothing about.
But if all you mean is that you want to create a File object and read bits of the file, it won't make any difference. File represents a name or description of a file, not the file itself (you can create File objects for files that don't exist). So it doesn't need to read the whole file into memory just to create a File object. You can easily create a File object for the huge file, and then read it in line by line, throwing away old lines as you go, and you won't use up very much memory at all. (Unless, of course, there's some horrible weird artifact of XP that I'm not aware of.)
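The line-by-line reading described above might look like this (a sketch; pass your own file path on the command line):

```java
import java.io.*;

public class LineByLine {
    // Read a stream one line at a time; only the current line is held in memory,
    // so the source can be far larger than RAM.
    static long countLines(Reader source) throws IOException {
        long lines = 0;
        try (BufferedReader br = new BufferedReader(source)) {
            String line;
            while ((line = br.readLine()) != null) {
                lines++;   // process the line here, then let it be garbage-collected
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        // Pass the path of the huge file on the command line, e.g. "huge.dat".
        System.out.println(countLines(new FileReader(args[0])) + " lines read");
    }
}
```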

  • Iphoto - how are the photo files managed and stored ?

I am one of those who migrated from the PC to this wonderful world of the Mac, so please excuse my PC-based way of thinking.
I have many thousands of photos (approx. 8,000+).
I used to organize my photos on an external hard disk, where I would create a folder to hold a group of photos; for example, a folder labeled "Vacation in New York" would hold all the photos for that event.
Now on my iMac, I can import all the photos in that folder and create an event in iPhoto called "Vacation in NY".
The only question I have is: where do those files physically go? Where are those files with their names and extensions? What folder or directory do they go to once I import them into iPhoto?
    I need to know this so that I could get a back up and know how to manage the files.
    Also...
    I would like to know how can I close all events at once.
    Thank you in advance for all the explanations......

    The files are copied into the iPhoto Library which is in your Pictures Folder.
    Most Simple Back Up
    Drag the iPhoto Library from your Pictures Folder to another Disk. This will make a copy on that disk.
    Slightly more complex:
Use an app that will do incremental backups. This is a very good way to work. The first time you run the backup, the app will make a complete copy of the Library. Thereafter it will update the backup with the changes you have made. That makes subsequent backups much faster. Many of these apps also have scheduling capabilities: set it up and it will do the backup automatically. Examples of such apps: Chronosync or DejaVu, but there are many others. Search on MacUpdate.
    By way of explanation:
    The iPhoto Library  is a Package File. This is simply a folder that looks like a file in the Finder. This is to protect the iPhoto library because many users were inadvertently corrupting their library by browsing through it with other software or making changes in it themselves.
    Want to look inside?
    Go to your Pictures Folder and find the iPhoto Library there. Right (or Control-) Click on the icon and select 'Show Package Contents'. A finder window will open with the Library exposed.
Standard Warning: Don't change anything in the iPhoto Library Folder via the Finder or any other application. iPhoto depends on the structure as well as the contents of this folder. Moving things, renaming things, deleting them or otherwise making changes will prevent iPhoto from working and could even cause you to damage or lose your photos.
    To use an editor with iPhoto: You can set Photoshop (or any image editor) as an external editor in iPhoto. (Preferences -> General -> Edit Photo: Choose from the Drop Down Menu.) This way, when you double click a pic to edit in iPhoto it will open automatically in Photoshop or your Image Editor, and when you save it it's sent back to iPhoto automatically. This is the only way that edits made in another application will be displayed in iPhoto.
    To use the files in other apps, upload them etc
    There are many, many ways to access your files in iPhoto:   You can use any Open / Attach / Browse dialogue. On the left there's a Media heading, your pics can be accessed there. Command-Click for selecting multiple pics.
    You can access the Library from the New Message Window in Mail:
    There's a similar option in Outlook and many, many other apps.  If you use Apple's Mail, Entourage, AOL or Eudora you can email from within iPhoto.
    If you use a Cocoa-based Browser such as Safari, you can drag the pics from the iPhoto Window to the Attach window in the browser.
    If you want to access the files with iPhoto not running:
    For users of 10.6 and later:  You can download a free Services component from MacOSXAutomation  which will give you access to the iPhoto Library from your Services Menu.
    Using the Services Preference Pane you can even create a keyboard shortcut for it.
    For Users of 10.4 and 10.5 Create a Media Browser using Automator (takes about 10 seconds) or use this free utility Karelia iMedia Browser
    Other options include:
    Drag and Drop: Drag a photo from the iPhoto Window to the desktop, there iPhoto will make a full-sized copy of the pic.
    File -> Export: Select the files in the iPhoto Window and go File -> Export. The dialogue will give you various options, including altering the format, naming the files and changing the size. Again, producing a copy.
Show File:
a. On iPhoto 09 and earlier: Right- (or Control-) click on a pic and in the resulting dialogue choose 'Show File'. A Finder window will pop open with the file already selected.
    b: On iPhoto 11 and later: Select one of the affected photos in the iPhoto Window and go File -> Reveal in Finder -> Original. A Finder window will pop open with the file already selected.

  • How to Expire Large Files using File Server Resource Manager

Is there a way to expire large files over 2 GB that have not been accessed in 2 years?
    I see under the File expiration options that I can expire files that have not been Created, Modified, or Accessed for a certain amount of time.
    Thanks,
    Eddie

    Hi Eddie,
FSRM can help report large files and can also help move old files to a folder, but I did not find a way to combine them in a single process.
    Instead how about using Robocopy?
    You can run robocopy /min:xxx /minlad:xxx <source> <target>.
    /MIN:n :: MINimum file size - exclude files smaller than n bytes.
    /MINLAD:n :: MINimum Last Access Date - exclude files used since n.
    (If n < 1900 then n = n days, else n = YYYYMMDD date).
    Please remember to mark the replies as answers if they help and un-mark them if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • Large file size and fuzzy type

I'm new at using FCE2 and composed my first short 4-minute video. It includes a still image, a layered PSD file (which I discovered was quite handy), a few clips, and type using the type generator. I exported it to a QuickTime movie. Three observations: I was floored by the nearly 1GB size, the fuzzy type, and the fact that the file on the hard drive says it's an FCE movie, complete with the little FCE icon by the file name. I was expecting to see a QuickTime icon with the file name and the type of file being a QuickTime movie. Is all this normal? I'm very disappointed in the fuzzy type. Oh, also, the still image became blurry. Why?? Just so you know, the still image was a special file, 640x480 with a pixel aspect ratio of D4/D16 Anamorphic.
    I saved the project under a new name and redid it taking out the Photoshop image, removing it from the bin also, and the new movie exported even larger, over a gig in size. Huh??

    Thanks for the reply Tom. After I posted the first time I went to the Finder, Get Info, and I saw that I could change the 'Opens with' to QuickTime and therefore it became a QuickTime file. About that Anamorphic business I read a 'how to' on dealing with images before bringing them into video. The tip says in the 'New' file dialogue box to choose 640x480 size and in the pull down menu at the bottom where you can choose the 'Pixel Aspect Ratio' it was suggest to use that Anamorphic setting. I did it but it certainly didn't look right but I went with it.
Again after I posted I looked at the Format of one of the clips and saw the size to be 720x480; Compressor is DV/DVCPRO-NTSC, Pixel Aspect is NTSC-CCIR 601, Anamorphic field is blank. I'm running Photoshop CS2. So I went back there and created a new blank file to use as a template for dealing with stills, but this time I used the Preset pull-down menu and chose NTSC DV 720x480 with guides, and the Pixel Aspect Ratio automatically loaded D1/DV NTSC (0.9). I clicked OK and voilà, the blank file looks exactly like the Canvas in FCE. I haven't played with a still with this new setting but I will try it on the little project I'm working on.
    As for viewing it, I am looking at it on my Mac flat screen. I went into QuickTime Preferences and checked the box for high quality, thank you. Thanks for reassuring me on the file size.
    I also don't know what "D4/D16 Anamorphic" means.
    I don't understand the fuzzy type. I'm aware these are 72 ppi files and video is not resolution dependent but rather pixel dependent. Computer monitors display at 72 ppi, televisions are higher. I have yet to complete the process of burning a DVD and playing it back on a TV. Maybe that's where I'll see the type showing up sharper.
    At any rate, just dealing with this itty bitty project tells me I have a lot to learn about video, never mind learning about how to use FCE as well.

Along the lines of How To Load Large Files

    I have some mainframe extract files loaded onto a Solaris drive that are between 1 and 4 GB to be used in an initial load of a data warehouse. I can't even open a file with file sizes that large. (We're running JDK 1.2.2 - not sure if that matters.) I'm using this statement -
    bufReader = new BufferedReader(new InputStreamReader(new FileInputStream(fileName)));
    This is the error I get on that statement -
    java.io.FileNotFoundException: /wrhs_data/export.13.aug02 (Value too larg)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:68)
    at com.cofinity.importer.MyFileReader.open(MyFileReader.java:40)
    at com.cofinity.importer.MyFileReader.main(Compiled Code)
    The statement works fine on files less than 220 MB. It breaks somewhere between 220 and 804 MB.
From the error message it seems that the underlying native call can't handle opening such a large file. I've searched for the "Value too larg" sub-message and found nothing. I tried eliminating the BufferedReader and just using the InputStreamReader, but I received the same error.
    Does anyone know how Java can read large files in the 1 to 4 GB range? (I suppose I could use something like Informatica to split the files up, but our disk space is at a premium.) Any help would be greatly appreciated.
    Thanks,
    Steve

Well, it appears to fail in open(). I tried your code on a binary file of size 25739135241 bytes (23.9+ gibibytes) on AIX and it did just fine, so it may be something in the runtime. Try upgrading to a newer JDK/SDK; failing that, use the OS to stream in your data:
BufferedReader br = new BufferedReader(
        new InputStreamReader(System.in));
And just pipe/redirect your file to your Java process's standard in.
