3D files / size limit in CS5?

Does anyone know if there's a limit on the quantity or size of 3D files within a Photoshop document? What would any limit depend on, e.g. VRAM?
Running Photoshop CS5 Extended 64-bit, all updates, on a dual Xeon, 12GB RAM, 64-bit Win 7, NVidia Quadro FX 3800 (1GB), with a Raptor drive with 50GB free used as a dedicated scratch disk. PS is set to allocate over 9GB of RAM, OpenGL is enabled and set to Normal, the 3D settings allocate 100% of VRAM (990MB), and rendering is set to OpenGL. You'd expect this to perform admirably and handle most tasks.
Background:
Creating a PSD website design file with 3 x 3D files embedded, one 'video' animation file linked, a few smart objects (photos), and the rest shapes and text with a few masks etc. Nothing unusual other than maybe the video and 3D files. The file size is 500MB, which isn't unusual, as I've worked on several 800MB files at the same time, all open in the same workspace. The PC handles that without any problems.
Introducing the 3D files and video seems to have hit an error or a limit of some sort, but I can't seem to pinpoint what's causing it or how to resolve it.
Problem:
I have the one 500MB file I've been working on open. When I try to open any ONE file or create a new one, I get the following error: "Could not complete the command because too many files were selected for opening at once". I've tried with 3D files, other PSD files, JPEGs, anything that can be opened in PS, all with the same message. Only one PSD file is open, and I'm only trying to open one more file or create a new file from scratch.
I've also had a similar error: "Could not complete your request because there are too many files open. Try closing some windows & try again". I've rebooted and opened only PS, and still get the same errors.
I tried removing the video file and saving a copy; that doesn't work. I removed some of the 3D files and saved a copy, and then it sometimes allows me to open more files. I tried leaving the 3D files in and reducing lighting (no textures anyway) and rendering without ray tracing; still no effect. Rasterising the files allowed more files to be opened. I'm working across a network, so I tried using local files, which made no difference. The only thing that seems to make a difference is removing or rasterising some of the 3D files.
Has anyone had similar problems with what seems to be a limit, either on the quantity of 3D files, or maybe a complexity limit, or something else to do with 3D file limits? Does anyone know of upgrades that might help? I've checked free RAM and that's at 7GB, using about a 10GB swap file. I've opened 5 documents at the same time of over 700MB each and it's not caused problems, so I can only think the limit is with the GPU with regards to 3D. I can't get that any higher than 990MB, which I'd assume would be enough anyway if that were the problem. I've played about with preferences to lower the 3D settings, but no use.
Does anyone have any idea what's limiting it and causing the error message above? Is it even a CS5 limit or a Win 7 64-bit limit?
Any ideas greatly appreciated. Thanks all.

Thanks for your comments Mylenium. I originally thought it might be VRAM, but at 1GB (still quite an acceptable size from what I can tell; I'd expect it to handle more than 3 x 3D files) I dismissed it, as the complexity of the files seemed quite low for that to be the cause. I'm still not completely convinced it's the VRAM, though, because of the error message it gives, and I've tried more complex 3D models, and more of them, and it works fine with those. It seems odd that it won't let me create a new document either. I'd like to get a 6GB card, but that's a bit out of the budget range at the moment.
Do you know of a way to "optimise" 3D files so they take up less VRAM, for example reducing any unwanted textures, materials, vertices or faces within PS, in a similar fashion to how Illustrator can reduce the complexity / number of shapes / points etc.? I can't ask the client, as they don't have the time, or I'd do this. Does rendering quality make a difference, or changing to a smart object? It doesn't seem to from what I've tried.
Re: using a dedicated 3D program, I'd be reluctant to lose the ability to rotate / edit / draw onto / light objects within Photoshop now that I have a taste for it, and go back to just using 3D renderings; otherwise I'd go down the route suggested and use a dedicated 3D package. Thanks for the suggestion though.

Similar Messages

  • LabView RT FTP file size limit

    I have created a few very large AVI video clips on my PXIe-8135RT (LabVIEW RT 2014). When I try to download these from the controller's drive to a host laptop (Windows 7) with FileZilla, the transfer stops at 1GB (the file size is actually 10GB).
    What's going on? The file appears to be created correctly, and I can even use AVI2 Open and AVI2 Get Info to see that the video file contains the frames I stored. Reading up about LVRT, there is nothing but older information claiming the file size limit is 4GB, yet the file was created at 10GB using the AVI2 VIs.
    Thanks,
    Robert

    As usual, the answer was staring me right in the face. FileZilla was reporting the size in an odd manner, and the file was actually 1GB. The VI I used was failing. After fixing it, it failed at 2GB with error -1074395965 (AVI max file size reached).

  • HT4863 How can I increase the file size limit for outgoing mail. I need to send a file that is 50MB?

    How can I increase the file size limit for outgoing mail. I need to send a file that is 50MB?

    You can't change it, and I suspect few email providers would allow a file that big. Consider uploading it to a service like Dropbox, then emailing the link so the recipient can download it.

  • FILE and FTP Adapter file size limit

    Hi,
    Oracle SOA Suite ESB related:
    I see that there is a file size limit of 7MB for transfers using the File and FTP adapters, and that debatching can be used to overcome this. I also see that debatching can be done only for structured files.
    1) What can be done to transfer unstructured files larger than 7MB from one server to the other using FTP adapter?
    2) For structured files, could someone help me in debatching a file with the following structure.
    000|SEC-US-MF|1234|POPOC|679
    100|PO_226312|1234|7130667
    200|PO_226312|1234|Line_id_1
    300|Line_id_1|1234|Location_ID_1
    400|Location_ID_1|1234|Dist_ID_1
    100|PO_226355|1234|7136890
    200|PO_226355|1234|Line_id_2
    300|Line_id_2|1234|Location_ID_2
    400|Location_ID_2|1234|Dist_ID_2
    100|PO_226355|1234|7136890
    200|PO_226355|1234|Line_id_N
    300|Line_id_N|1234|Location_ID_N
    400|Location_ID_N|1234|Dist_ID_N
    999|SSS|1234|88|158
    I would need the complete data in a single file at the destination for each file in the source. If there are as many files as the number of batches at the destination, I would need the output file structure to be as follows:
    000|SEC-US-MF|1234|POPOC|679
    100|PO_226312|1234|7130667
    200|PO_226312|1234|Line_id_1
    300|Line_id_1|1234|Location_ID_1
    400|Location_ID_1|1234|Dist_ID_1
    999|SSS|1234|88|158
    Thanks in advance,
    RV
    Edited by: user10236075 on May 25, 2009 4:12 PM
    Edited by: user10236075 on May 25, 2009 4:14 PM
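    The batching rule in question 2 (keep the 000 header and 999 trailer on every output, with each 100/200/300/400 group forming one batch) can be sketched outside the adapter. A hypothetical Python illustration of the split, based only on the record layout shown above:

```python
def debatch(lines):
    """Split a 000...999 framed file into batches, one per 100-record group,
    each wrapped with the original header (000) and trailer (999)."""
    header, trailer = lines[0], lines[-1]
    batches, current = [], []
    for line in lines[1:-1]:
        rec_type = line.split("|", 1)[0]
        if rec_type == "100" and current:   # a new 100 record starts a new group
            batches.append([header] + current + [trailer])
            current = []
        current.append(line)
    if current:
        batches.append([header] + current + [trailer])
    return batches
```

    Each batch produced this way has exactly the shape of the desired output file: header, one PO group, trailer.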

    Ok Here are the steps
    1. Create an inbound file adapter as you normally would. The schema is opaque, set the polling as required.
    2. Create an outbound file adapter as you normally would, it doesn't really matter what xsd you use as you will modify the wsdl manually.
    3. Create a xsd that will read your file. This would typically be the xsd you would use for the inbound adapter. I call this address-csv.xsd.
    4. Create a xsd that is the desired output. This would typically be the xsd you would use for the outbound adapter. I have called this address-fixed-length.xsd. So I want to map csv to fixed length format.
    5. Create the xslt that will map between the 2 xsd. Do this in JDev, select the BPEL project, right-click -> New -> General -> XSL Map
    6. Edit the outbound file partner link wsdl, setting the jca operations as the doc specifies; this is my example.
    <jca:binding  />
            <operation name="MoveWithXlate">
          <jca:operation
              InteractionSpec="oracle.tip.adapter.file.outbound.FileIoInteractionSpec"
              SourcePhysicalDirectory="foo1"
              SourceFileName="bar1"
              TargetPhysicalDirectory="C:\JDevOOW\jdev\FileIoOperationApps\MoveHugeFileWithXlate\out"
              TargetFileName="purchase_fixed.txt"
              SourceSchema="address-csv.xsd" 
              SourceSchemaRoot ="Root-Element"
              SourceType="native"
              TargetSchema="address-fixedLength.xsd" 
              TargetSchemaRoot ="Root-Element"
              TargetType="native"
              Xsl="addr1Toaddr2.xsl"
              Type="MOVE">
          </jca:operation>
    7. Edit the outbound header to look as follows
        <types>
            <schema attributeFormDefault="qualified" elementFormDefault="qualified"
                    targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/file/"
                    xmlns="http://www.w3.org/2001/XMLSchema"
                    xmlns:FILEAPP="http://xmlns.oracle.com/pcbpel/adapter/file/">
                <element name="OutboundFileHeaderType">
                    <complexType>
                        <sequence>
                            <element name="fileName" type="string"/>
                            <element name="sourceDirectory" type="string"/>
                            <element name="sourceFileName" type="string"/>
                            <element name="targetDirectory" type="string"/>
                            <element name="targetFileName" type="string"/>                       
                        </sequence>
                    </complexType>
                </element> 
            </schema>
        </types>
    8. The last trick is to have an assign between the inbound header and the outbound header partner link that copies the headers. You only need to copy the sourceDirectory and sourceFileName.
        <assign name="Assign_Headers">
          <copy>
            <from variable="inboundHeader" part="inboundHeader"
                  query="/ns2:InboundFileHeaderType/ns2:fileName"/>
            <to variable="outboundHeader" part="outboundHeader"
                query="/ns2:OutboundFileHeaderType/ns2:sourceFileName"/>
          </copy>
          <copy>
            <from variable="inboundHeader" part="inboundHeader"
                  query="/ns2:InboundFileHeaderType/ns2:directory"/>
            <to variable="outboundHeader" part="outboundHeader"
                query="/ns2:OutboundFileHeaderType/ns2:sourceDirectory"/>
          </copy>
        </assign>
    You should be good to go. If you just want pass-through, then you don't need the native format; set it to opaque, with no XSLT.
    cheers
    James

  • S1000 Data file size limit is reached in statement

    I am new to Java and was given the task of troubleshooting a Java application that was written a few years ago and is no longer supported. The Java application creates database files in the user's directory: diwdb.properties, diwdb.data, diwdb.lproperties, diwdb.script. The purpose of the application is to open a zip file and insert the files into a table in the database.
    The values that are populated in the diwdb.properties file are as follows:
    #HSQL Database Engine
    #Wed Jan 30 08:55:05 GMT 2013
    hsqldb.script_format=0
    runtime.gc_interval=0
    sql.enforce_strict_size=false
    hsqldb.cache_size_scale=8
    readonly=false
    hsqldb.nio_data_file=true
    hsqldb.cache_scale=14
    version=1.8.0
    hsqldb.default_table_type=memory
    hsqldb.cache_file_scale=1
    hsqldb.log_size=200
    modified=yes
    hsqldb.cache_version=1.7.0
    hsqldb.original_version=1.8.0
    hsqldb.compatible_version=1.8.0
    Once the database file gets to 2GB it brings up the error message 'S1000 Data file size limit is reached in statement (Insert into <tablename>......
    From searching on the internet it appeared that the parameter hsqldb.cache_file_scale needed to be increased, and 8 was a suggested value.
    I have the distribution files (.jar & .jnlp) that are used to run the application. And I have a source directory that was found that contains java files. But I do not see any properties files to set any parameters. I was able to load both directories into NetBeans but really don't know if the files can be rebuilt for distribution as I'm not clear on what I'm doing and NetBeans shows errors in some of the directories.
    I have also tried to add parameters to the startup url: http://uknt117.uk.infores.com/DIW/DIW.jnlp?hsqldb.large_data=true?hsqldb.cache_file_scale=8 but that does not affect the application.
    I have been struggling with this for quite some time. Would greatly appreciate any assistance to help resolve this.
    Thanks!

    Thanks! But where would I run the sql statement. When anyone launches the application it creates the database files in their user directory. How would I connect to the database after that to execute the statement?
    I see the create table statements in the files I have pulled into NetBeans in both the source folder and the distribution folder. Could I add the statement there before the table is created in the jar file in the distribution folder and then re-compile it for distribution? OR would I need to add it to the file in source directory and recompile those to create a new distribution?
    Thanks!

  • 4GB File Size Limit in Finder for Windows/Samba Shares?

    I am unable to copy a 4.75GB video file from my Mac Pro to a network drive with an XFS file system using the Finder. Files under 4GB can be dragged and dropped without problems. The drag and drop method produces an "unexpected error" (code 0) message.
    I went into Terminal and used the cp command to successfully copy the 4.75GB file to the NAS drive, so obviously there's a 4GB file size limit that Finder is imposing?
    I was also able to use Quicktime and save a copy of the file to the network drive, so applications have no problem, either.
    XFS file system supports terabyte size files, so this shouldn't be a problem on the receiving end, and it's not, as the terminal copy worked.
    Why would they do that? Is there a setting I can use to override this? Google searching found some flags to use with the mount command in Linux terminal to work around this, but I'd rather just be able to use the GUI in OS X (10.5.1) - I mean, that's why we like Macs, right?

    I have frequently worked with 8 to 10 gigabyte capture files in both OS 9 and OS X, so any limit does not seem to be in QT or in the Player. 2GB limits would perhaps be something left over from pre-OS 9 versions of your software, as there was a general 2GB limit in those earlier versions of the operating system. I have also seen people refer to 2GB limits in QT for Windows, but never in OS 9 or later Mac OS.

  • Is there a Maximum file size limit when combining pdf files

    I am trying to combine 8 PDF files created with Internet Explorer using Acrobat XI Standard. The process completes, but when opening the file not all 8 individual files have been combined; only the first two are there, although the process shows all eight being read and combined.
    I can open and view each of the original 8 individual files and read the content without any issue. The largest of the eight files is 15,559KB and the smallest is 9,435KB.
    If anyone can assist, that would be most appreciated.

    Hi Tayls450,
    This should not have happened as there is no maximum file size limit when combining PDFs.
    Please let me know if there are some password protected PDFs also?
    Also, I would suggest you choose the 'Repair Acrobat Installation' option from the Help menu.
    Regards,
    Anubha

  • Folio file size limit

    In the past there were file size limits for folios and folio resources. I believe folios and their resources could not go over 100MB.
    Is there a file size limit for a folio and its resources? What is the limit?
    What happens if the limit is exceeded?
    Dave

    Article size is currently capped at 2GB. We've seen individual articles of up to 750MB in size but as mentioned previously you do start bumping into timeout issues uploading content of that size. We're currently looking at implementing a maximum article size based on the practical limitations of upload and download times and timeout values being hit.
    This check will be applied at article creation time and you will not be allowed to create articles larger than our maximum size. Please note that we are working on architecture changes that will allow the creation of larger content but that will take some time to implement.
    Currently, to get the best performance you should have all your content on your local machine (not on a network drive), shut down any extra applications, and get the fastest machine you can, with the most memory. The time it takes to bundle an article and get it uploaded to the server is directly proportional to the article's size.

  • WebUtil File Transfer - file size limit

    Does anybody know what the file size limit is if I want to transfer it from the client to the application server using WebUtil? The fileupload bean had a limit of 4 Mb.
    Anton Weindl

    WebUtil is only supported for 10g. The following is added to the release notes:
    When using the Oracle Forms WebUtil file transfer function, you must take into consideration performance and resource issues. The current implementation is that the size of the Forms application server process will increase in correlation with the size of the file being transferred. This, of course, has minimal impact with file sizes of tens or hundreds of kilobytes. However, transfers of tens or hundreds of megabytes will impact the server-side process. This is currently tracked in bug 3151489.
    Note that current testing up to 150MB has shown no specific limits for transfers between the database and the client using WebUtil. Testing file transfers from the client to the application server has shown no specific limit up to 400Mb; however, a limit of 23Mb has been identified for application server to client file transfers.

  • Lite 10g DB File Size Limit

    Hello, everyone !
    I know that Oracle Lite 5.x.x had a database file size limit of 4MB per db file. There is a statement in the Oracle® Database Lite 10g Release Notes that the db file size limit is 4GB, but it is "... affected by the operating system. Maximum file size allowed by the operating system". Our company uses Oracle Lite on the Windows XP operating system. XP allows file sizes of more than 4GB. So the question is: can the 10g Lite db file size exceed the 4GB limit?
    Regards,
    Sergey Malykhin

    I don't know how Oracle Lite behaves on PocketPC, because we use it on the Win32 platform. But under Windows, when the .odb file reaches the max available size, the Lite database driver reports an I/O error after the next write operation (sorry, I just don't remember the exact error message number).
    Sorry, I'm not sure what you mean by "configure the situation" in this case ...

  • TFS Preview source control file size limit

    Is there a file size limit for adding large files to a TFS Preview project's source control..?   I would like to add some files to source control that are in the 20-30Mb range and the requests keep aborting.

    Hi Erich,
    Thank you for your post.
    I tested the issue by adding .zip files to TFS Azure version control at sizes including 10MB, 20MB, 100MB and 2GB, and I could add them to version control properly without any problem.
    In order to narrow down the issue, here are some things I want to clarify with you:
    1. What's the error message when TFS Azure blocks the check-in process? Could you share a screenshot?
    2. Which client tool are you using? VS10 or VS11? Or other tools?
    3. Try to reproduce this in a different network environment and see what happens.
    4. Capture a Fiddler trace, which can help to check the issue.
    Thanks,
    Lily Wu [MSFT]
    MSDN Community Support | Feedback to us

  • Anybody know how to increase the plugin file size limit in Photoshop CS6 to greater than 250 mb?

    Can anyone tell me if it is possible to increase the plugin file size limit in Photoshop CS6 to greater than 250MB, and how to do it? Can plugins running in PS CC handle larger file sizes than CS6?

    Wow, thanks for getting back to me!!
    I am running the latest version of the HDR Soft Photomatix Tone Mapping Plug-In (version 2.2) in Photoshop CS6 on a fully loaded solid state MacBook Air. When I attempt to process files exceeding 250MB with the plugin, I get an error message and the plugin will not work. The plugin works fine with anything south of 250MB. I have also optimized the performance settings in CS6 for large file sizes.
    The standalone version of HDR Soft's Photomatix Pro easily processes files well in excess of 300MB.
    I have contacted Photomatix support and they say that 250MB is simply the max file size that Photoshop will allow a plugin to run with.
    So is there any setting that I’m overlooking in Photoshop CS6 that will allow me to process these large files with the plugin? Or if there is indeed a file size limit for plugin processing in CS6 is the limit higher in CC?
    Thanks in advance for your help.

  • File size limit on export doesn't.

    I have an export option for a photo forum I post to, where the size limit is 200KB. I have set 200KB, 600x600 JPG as the export limits, and I routinely get files much bigger than that, up to 260KB.
    I can set the export dimension limit to 500x500, and still get files over 200kb. Today, after a file went to 260kb at 600x600, I set the dimension limit to 500x500, and the exported file size did not change: at 500x500, it was still a 260kb file. If I go through and tweak compression by hand, on every photo, or set a file size limit of 150kb, I can get under 200kb per file.
    I've tried files where I set the limit from 200, to 190, to 180, to 170, and see no change in the exported file (still well over 200 in size), and then when I set it to 160 it jumps down to 150-160 in size. If the file was only 210kb on export with the 200kb limit, setting it to 180kb will generally get the file under 200kb.
    Really, the file size limit does not do what it purports to do. It can clearly get the file below that size if I direct it by hand every time: I haven't specified anything that would artificially bulk the file up, like a quality setting that would be incompatible with the file size limit. It's free to compress the file as needed to meet the limit.
    What I want it to do is meet the limit I set for it. I shouldn't have to dance around. I have this export setting for the express purpose of quickly producing under-200kb files, and I sort of expect Lightroom to manage it.

    I've had this problem too, but it isn't the only one when creating small jpeg files from LR.
    There is something seriously amiss in the export module. I also create a lot of 600 pixel wide/high files and not only are the file sizes far too high, but the quality is poor. I have two workarounds for this, both of which add a little time to the job, but make a big difference.
    First is to export my files as full size jpegs (which I do anyway,) LR does a good job with these. Then get another programme to batch process these to give me the small sRGB files I also need!
    Second is to use LR's web module and create a basic html site for a batch of images in a folder in a temp directory at the precise size I want. This has the advantage that I can add a watermark. Then just rename and move the folder containing the images from the web folder that has been created to where I want them, followed by deleting the rest of the web folder.
    Working at low quality (38%) from the Export module gave me a file size for one image of 455KB. So then I told it to export at a max of 200KB, and it came out at 565KB.  Using the web module with quality set at 70 gave a higher quality result and a file size of 105KB!
    The problem seems to be worse on images where I've done quite a bit of work using local adjustments - rather as if they are actually performing these on the small jpeg and re-saving each time. Certainly something going very wrong - just like it was in LR2.x and I think it must be a logical error as presumably the web module uses the same library to create jpegs.
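    The behaviour both posters want, lowering quality until the output fits under the cap, is straightforward to script outside LR. A minimal sketch of the search loop, using zlib's compression level as a stand-in for JPEG quality (a real workflow would re-encode the image at each quality step; the function name is mine):

```python
import zlib

def compress_under_limit(data: bytes, limit: int):
    """Try settings from lowest to highest compression until the output
    fits under `limit` bytes; return (setting, output) or None if none fit."""
    for level in range(1, 10):   # 1 = fastest/largest ... 9 = smallest
        out = zlib.compress(data, level)
        if len(out) <= limit:
            return level, out
    return None
```

    The point is that the exporter always has a knob it can turn until the target is met, which is exactly what the posters expected the 200KB limit setting to do.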

  • Increase Project File Size limit?

    When I'm recording a longer song with real instruments, occasionally I will be told I can't record because I'm close to the 2GB file size limit. How do you increase the file size limit, or just get rid of it altogether?
    Thanks

    I didn't know there was a size limit. I have Projects that are over 2 GB.

  • Is there a file size limit for images in keynote on a mac?

    Is there a file size limit for images in keynote on a mac?

    If you right click on the file (navigate to it using Finder, or if it is in iCloud you should open Keynote, File -> Open, and you should see a list of your iCloud keynotes) then press command+i, you'll pull up the info tag, which will tell you the file size.
    Keynote also has an option to reduce file size (File -> Reduce File Size), but I do not recommend it at all because it will severely degrade the quality of your pictures.
    It's also possible to reduce an individual image's file size in your Keynote by right clicking it and selecting Reduce Image File Size. This is more time-consuming but allows you to see if the degraded photo quality is still good enough, and you can always undo it.
