Too many files in root directory. How to split them up?

Hi,
I made my site http://isuckatgolf.net in Dreamweaver CS3. Having no idea what I was doing when I started five years ago, I uploaded all images and pages directly to the root folder/directory. My host, GoDaddy.com, just sent me a notice that I'm only allowed 1024 folders/files in one directory. I have almost 7000 in the main site and 1500 in the database for the sub-domain/forum (an SMF forum). According to GoDaddy this is a rule of theirs, but it isn't posted anywhere they could point me to; the only published limit is a maximum of 500,000 files total in my unlimited shared hosting account.
I'm looking for a new host that can handle the site the way it is set up today. My question is, IF I have to divide these files up into subfolders or directories (whatever they're technically called), is it possible, and is there an "easy" way to do it in Dreamweaver? Obviously manually moving 7000 files will take some time. But can it be done, and do you think it will be necessary no matter who hosts the site?
Lastly, if I do find a host that is OK with the way it is, should I direct DW going forward to put future images into an images folder of some kind? I never created one; right now all the .jpg, .gif and .html files are just listed in the root. (I didn't know how to do it the right way when I started!)
Any help would be appreciated.
Thanks!
Ken

It does not make any difference where your files are stored, as long as the path to each file is correct. Nancy's idea is correct, but you can also have sub-folders within the sub-folders. For example, I have an images folder, and within it I have folders that correspond to pages within my site. My images folder might look like this:
+images
     +index
          +thumbs
          +slides
     +gallery
          +holidays2012
               +thumbs
               +images
          +holidays2013
               +thumbs
               +images
          +golf
               +images
etc, etc.
That way it is easy to discover if there are any redundant images.
The same goes for the pages. You may have a products page with a couple of sub-pages. The directory would then be:
index.html
aboutus.html
products.html
contactus.html
+products
     fencing_material.html
     sport_clothing.html
     solar_panels.html
etc, etc
As I said, it really does not matter where the files are, with the exception of the index page, which must always be in the root directory.
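If you do end up moving thousands of images by hand, rather than letting Dreamweaver do it (moving files inside Dreamweaver's Files panel prompts it to update every page that links to them), the clean-up boils down to rewriting each src path that used to point at the root. Purely as an illustration, here is a minimal, hedged sketch of that idea in Java; the local site path and folder names are placeholders, and Dreamweaver's own link updating is the safer route.

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class FixImagePaths {
    public static void main(String[] args) throws IOException {
        Path siteRoot = Paths.get("C:/sites/mysite");    // placeholder: local copy of the site
        Path imagesDir = siteRoot.resolve("images");     // placeholder: where the images were moved

        // Collect the name of every image that now lives in /images.
        List<String> imageNames;
        try (Stream<Path> images = Files.list(imagesDir)) {
            imageNames = images.map(p -> p.getFileName().toString()).collect(Collectors.toList());
        }

        // Rewrite src="picture.jpg" to src="images/picture.jpg" in every root-level page.
        try (Stream<Path> pages = Files.list(siteRoot)) {
            for (Path page : pages.filter(p -> p.toString().endsWith(".html")).collect(Collectors.toList())) {
                String html = new String(Files.readAllBytes(page), StandardCharsets.UTF_8);
                for (String name : imageNames) {
                    html = html.replace("src=\"" + name + "\"", "src=\"images/" + name + "\"");
                }
                Files.write(page, html.getBytes(StandardCharsets.UTF_8));
            }
        }
    }
}

This is deliberately naive (it only catches exact src="name" matches), so run it against a backed-up local copy of the site, never the live one.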
Good luck 

Similar Messages

  • Access 2010 table has too many fields for web database - how to split into two web-compatible tables?

    Hello, 
    I'm in the process of converting an Access 2010 database into a web database and I'm having some trouble. I have a table which has 236 fields, which is more than the 220 field limit for web-compatible tables. I have tried to split this table into two tables
    with a one-to-one relationship, but web tables can only use lookups as relationships. I've tried to connect the tables with a lookup and then synthesize a one-to-one relationship by using data macros but I'm not having much luck.
    I realize that 236 fields is a lot, but it must be set up this way because each field represents a task and is a yes/no box to verify that the task has been completed - and the records are the different employees for whom the tasks need to be completed.
    Could someone please help me figure out a way to make this table web compatible?
    Thank you, 
    Ryan

    Hi,
    I found that you've cross-posted the question on our Answers forum. Are you satisfied with the reply there?
    http://answers.microsoft.com/en-us/office/forum/office_2010-access/access-2010-table-has-too-many-fields-for-web/06ee81ea-24ab-48b8-9b8f-0ed08a868bac
    Regards,
    George Zhao
    TechNet Community Support

  • Anyone know how to fix error message: word cannot complete too many files open

    anyone know how to fix error message: word cannot complete too many files open
    thanks
    Ronny

    First, take a look at this thread, toward the end of the second page:
    https://discussions.apple.com/thread/1449787?start=0&tstart=0
    Your answer is there, somewhere.
    Unfortunately, you've not provided enough system information to solve this. (The only info is that you use a MacBook Pro.) This issue may have been addressed in a prior version. Update your Support Communities profile to include your current hardware, system, and devices.

  • Photoshop CS6 error "could not complete ... too many files open. Try closing........." with 3D psd?

    I'm working in a .psd file with multiple 3D layers. The file size totals 356 MB.
    When opening the file, I get this error: "could not complete your request because there are too many files open. Try closing some windows and try again." This doesn't make any sense to me, as I was not experiencing any chugging while the file was open, and I am on a relatively fast system.
    Win 7 64bit
    Intel i7-2760QM @ 2.4GHz (8 CPU)
    16 gigs ram
    Nvidia GTX 560M (2752mb)

    Having the same issue on the Mac!
    Worked on a file yesterday, fairly big, but nothing that I haven't handled before... Only difference was the 3DS file I imported has over 270 textures.
    I think there is a limit on how many textures can be imported, essentially corrupting the file on save. It would be a lot more helpful if Photoshop warned us that the 3D model is too complicated before we import it and save our work expecting to finish it off the next day.
    I tried the same model again today, did nothing other than position it and render it, then saved; same error on opening: "Could not complete your request because there are too many files open. Try closing some windows and try again."
    If anyone at Adobe would like to have a look at the file, I can upload the smaller version that I did today somewhere or send it via YouSendIt.
    Cheers.

  • Final Cut Express keeps stopping and saying "Too many files open"

    Final Cut Express keeps stopping and saying "Too many files open". I don't see how this is the case as I only have the one project open...

    Try here:
    http://www.fcpbook.com/Misc1.html
    Al

  • Photoshop CS5.1 Extended: Could not complete your request because there are too many Files Open

    I saved a file last night and now this morning I get an error message stating: "Photoshop CS5.1 Extended: Could not complete your request because there are too many Files Open. Try closing some windows and try again." I've rebooted the system and made sure Photoshop was the only program running and that there were no other files open prior to trying to open my file. I increased Photoshop's RAM usage, checked the OpenGL, 3D and VRAM settings, cleared my temp files, and am at a loss for how to open this file.
    System Specs:
    Win 7 64
    6 core I7 processor
    32gb Ram
    3GB VRAM
    600gb free space on 7200 rpm drive
    File specs:
    1 3D object replicated maybe 5 times (yes, I should have rasterized them, and will do that to 4 of them if I get the file open)
    Maybe 8 textures
    Not sure how many total layers.
    Is there any way to get Photoshop to reopen this file?

    No, more VRAM is useless. I had the same thought as you, so I tried a friend's computer with 3 GB of VRAM and 8 GB of RAM (mine has 1 GB of VRAM and 16 GB of RAM). Still got the same error message. Sorry guys, but our files are lost until Adobe fixes the error in Photoshop's code. I hope in the future they will either fix this error or state a clear limitation; it is really a shame that such a dangerous problem exists and has persisted for more than three Photoshop generations.
    Still, you may be able to recover at least a flat image of your latest file by trying to open it as a Smart Object. I started my work over by doing that. Not a lot, but better than nothing.
    However, for my new file I am trying something different: each time I think a 3D file is final, I do not rasterize it but make it into a Smart Object and save that new Smart Object into its own separate .psd file. If any change is needed I can still edit all of my 3D objects by opening them separately. So far it works, and the main file is lighter with few to no 3D objects inside, but I will not say it is a 100% no-risk method yet.
    One last piece of advice for everyone: before closing the file you are working on, save it as a copy (File > Save As... and check "As a Copy" in the options below), then, with your main file still open, try to open the copy. You may need a lot of memory if your file is heavy, but if the dreaded error message appears when you try to open the copy, remove or rasterize some 3D objects in your main file. Save again as a copy and try to open the copy until you no longer get the message.

  • Too Many Files Opened

    All,
    I am running on Fedora Core 10 (2.6.27.21-170.2.56.fc10.x86_64) with Java 1.6.0_12-b04 (32-bit and 64-bit get the same issue) and am getting a "java.io.FileNotFoundException: ... (Too many open files)" error when running my unit tests. I have been very careful about placing things in a try { } finally { stream.close(); } block and can't figure out what is actually going wrong.
    The unit tests actually work fine on Windows XP, which points me to either Java or Linux itself.
    After doing some research online I found some articles and postings:
    - http://lj4newbies.blogspot.com/2007/04/too-many-open-files.html
    - http://www.mail-archive.com/[email protected]/msg15750.html
    - http://www.linuxforums.org/forum/redhat-fedora-linux-help/73419-we-facing-too-many-open-files-regularly.html
    However, is there anything else I can do within Java itself (a VM argument, coding "best practices" I may be missing, etc.) to combat this?
    My Linux configuration, based on those articles, is as follows:
    $ cat /proc/sys/fs/file-max
    564721
    $ ulimit -a
    core file size          (blocks, -c) 0
    data seg size           (kbytes, -d) unlimited
    scheduling priority             (-e) 0
    file size               (blocks, -f) unlimited
    pending signals                 (-i) 55296
    max locked memory       (kbytes, -l) 32
    max memory size         (kbytes, -m) unlimited
    open files                      (-n) 1024
    pipe size            (512 bytes, -p) 8
    POSIX message queues     (bytes, -q) 819200
    real-time priority              (-r) 0
    stack size              (kbytes, -s) 10240
    cpu time               (seconds, -t) unlimited
    max user processes              (-u) 1024
    virtual memory          (kbytes, -v) unlimited
    file locks                      (-x) unlimited

    tjacobs01 wrote:
    "The unit tests actually work fine on Windows XP, which points me to either Java or Linux itself."
    OR another app on your Linux machine has got a lot of files open,
    OR you installed Java as yourself and not system-wide and you don't have privileges to open the files,
    OR your code has a bug somewhere that doesn't affect running it on Windows.
    99.999% of the time, the problem is with you, not with Java or Linux.

    This actually seems to be it. I rebooted the system into console mode to keep X from loading. I then reset the JDK to be set up by root (I use the self-extracting, non-RPM version). When I ran the test suite via Ant there, it did not show the problem. Therefore, I'm fairly convinced it is actually some other application eating up too many file handles.
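    For anyone who hits the same symptom from their own code: on Java 6 the safest closing idiom is one stream per try/finally with a null check, so a failure while opening or closing one stream can never leak another (on Java 7+ a try-with-resources block does the same thing more cleanly). A minimal sketch, with a placeholder file name:
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class LeakSafeRead {
        // One resource per try/finally: even if readLine() or close() throws,
        // the descriptor is released and nothing is leaked across test runs.
        static String firstLine(String path) throws IOException {
            BufferedReader in = null;
            try {
                in = new BufferedReader(new FileReader(path));
                return in.readLine();
            } finally {
                if (in != null) {
                    in.close();
                }
            }
        }

        public static void main(String[] args) throws IOException {
            System.out.println(firstLine("input.txt")); // placeholder file name
        }
    }
    On the Linux side, "lsof -p <pid>" against the test JVM (or any other suspect process) lists exactly which descriptors are being held, which quickly shows who is really using up the 1024 allowed by "ulimit -n".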

  • 8 GB iPod touch updated to 4.2.1, problems syncing apps and music; gives multiple errors: device times out, internal device error, network times out, duplicate file, too many files open. iTunes diagnostics OK. What to do?

    I have an 8 GB iPod touch which has had problems trying to play games and downloading the 4.2.1 update. I went to the Apple Store, where they kindly downloaded the software update, but now it can't sync and generates multiple error messages: device times out, internal device error, network times out, duplicate file, too many files open, etc. I have the latest iTunes download and the diagnostic test is OK. What can I do?


  • IPod won't sync "too many files currently open"

    Every time I try to connect my iPod there is an error that says "cannot sync iPod, too many files currently open." I'm not sure what this means because I only have iTunes open, and it still will not sync. Please help! Thanks!

    I had the same problem. I just turned off my shared playlists and restarted iTunes; it worked after that.

  • Sync fails in iTunes 9.0 & 9.01: "too many files open" and error -6999

    I've been trying to sync my iPod (touch, 1st gen) for the last couple of weeks but have been getting errors ever since going to iTunes 9.0.
    First it blew away my library and now says "Previous iTunes Libraries" at the top. I can live with that (after going through and reloading about two-thirds of the album artwork).
    But now am getting two errors every time I try to sync. One is, "Attempting to copy to the disk '[my iPod's name]' failed. There are too many files open currently."
    Elsewhere I saw it suggested for this error that I turn off sharing, but, alas, sharing isn't turned on.
    The second error message is, "The iPod '[my iPod's name]' cannot be synced. An unknown error occurred (-6999)." and I can't find any mention by Apple about what to do about this.
    HELP. Please.

    There has been some advice to turn off "sharing" to stop getting the "too many files open" error message, but, as noted in my question, sharing was already turned off in the iTunes Preferences.
    But, it turns out, of course, that there is another kind of sharing in the 9.x iTunes -- the one for sharing libraries on your local network. Don't know why they are different because the function seems the same, but I noticed that there was still an icon for "home sharing" or "house sharing" (little picture of a house) on the library/playlist list in iTunes.
    I right clicked on the house icon and found another checkbox switch there. I turned that one off too (Why Apple?) and, voila, the "too many files" error has (apparently) gone away.
    Hope this helps some others out there.
    Apparently you have to turn off sharing in two different places. (Again; why is that, Apple?)

  • What does too many http redirects mean and how do I fix it

    Does anyone know what the **** "Too many http redirects" means, and how the **** do I fix it ??

    500 errors in the HTTP cycle
    Any client (e.g. your Web browser or our CheckUpDown robot) goes through the following cycle when it communicates with the Web server:
    Obtain an IP address from the IP name of the site (the site URL without the leading 'http://'). This lookup (conversion of IP name to IP address) is provided by domain name servers (DNSs).
    Open an IP socket connection to that IP address.
    Write an HTTP data stream through that socket.
    Receive an HTTP data stream back from the Web server in response. This data stream contains status codes whose values are determined by the HTTP protocol. Parse this data stream for status codes and other useful information.
    This error occurs in the final step above when the client receives an HTTP status code that it recognises as '500'. Frank Vipond. September 2010.
    Fixing 500 errors - general
    This error can only be resolved by fixes to the Web server software. It is not a client-side problem. It is up to the operators of the Web server site to locate and analyse the logs which should give further information about the error.
    Fixing 500 errors - CheckUpDown
    Please contact us (email preferred) whenever you encounter 500 errors on your CheckUpDown account. We then have to liaise with your ISP and the vendor of the Web server software so they can trace the exact reason for the error. Correcting the error may require recoding program logic for the Web server software, which could take some time.
    http://www.checkupdown.com/status/E500.html
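    The write-up above covers 500 errors in general; the "too many HTTP redirects" message itself just means the client followed a chain of 3xx responses that never settled on a final page and gave up. If you can run code against the address, a rough sketch like the one below (the URL is only a placeholder) prints each hop so a redirect loop is easy to spot:
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RedirectTrace {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://example.com/"); // placeholder: the address that loops
            // Follow redirects by hand, up to 10 hops, printing each status and URL.
            for (int hop = 0; hop < 10; hop++) {
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setInstanceFollowRedirects(false);
                int status = conn.getResponseCode();
                System.out.println(status + "  " + url);
                if (status < 300 || status >= 400) {
                    break; // not a redirect, so the chain ends here
                }
                String location = conn.getHeaderField("Location");
                if (location == null) {
                    break; // a 3xx with no Location header: nothing more to follow
                }
                url = new URL(url, location); // resolve relative Location values
            }
        }
    }
    In practice a redirect loop is usually fixed on the server side (rewrite rules or application logic sending the browser round in circles), though clearing the site's cookies is worth trying first, since some loops only trigger for a particular stale session.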

  • JspSmartUpload  uploading too many files

    Has anyone else had a problem with too many files being uploaded?
    My code is as follows:
    <FORM NAME="frmAddDoc" ACTION="process_add.jsp" METHOD="post" ENCTYPE="multipart/form-data">
    <%@ include file="attachments.html"%>
    and attachments.html code specifies two files to be uploaded as follows:
    <TR>
    <TD CLASS="formFieldLabel">Source Document</TD>
    <TD><INPUT TYPE="File" NAME="sourceDocFileName" SIZE="12" CLASS="fileField"></TD>
    <TD><IMG SRC="../shared/images/shim_trans.gif"></TD>
    </TR>
    <TR>
    <TD CLASS="formFieldLabel">Published Document</TD>
    <TD><INPUT TYPE="File" NAME="docFileName" SIZE="12" CLASS="fileField"></TD>
    <TD><IMG SRC="../shared/images/shim_trans.gif"></TD>
    </TR>
    In process_add.jsp there is a call to the UploadBean.upload method which is as follows:
    public UploadResult upload(PageContext pageContext, String CSVFileTypesAllowed) {
        StringBuffer statusMsg = new StringBuffer();
        int uploadedFilesCount = 0;
        // Upload initialization
        uploadHelper.initialize(pageContext);
        uploadHelper.setAllowedFilesList(CSVFileTypesAllowed);
        uploadHelper.upload();
        // Save the uploaded files to disk and count them
        uploadedFilesCount = uploadHelper.save(uploadTempDir, uploadHelper.SAVE_PHYSICAL);
        return uploadResult; // uploadResult is built elsewhere in the bean (not shown)
    }
    This works most of the time but occasionally four files are uploaded instead of two (i.e. uploadedFilesCount and uploadedFiles.getCount() are both 4 when they should be 2).
    (UploadHelper = new SmartUpload)
    Has anybody any ideas what could be causing this?
    I have been chasing this "bug" for a week now and it has me beat.
    Can anybody help?
    Thanks

    Hi
    Try looking at the jspSmartUpload documentation; most people here (at least 98%) don't use these third-party tools but rather develop their own, so it might be difficult for you to get an answer to your post.
    Swaraj
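    One more thought, in addition to the documentation suggestion above: it may help to log exactly what SmartUpload thinks it received on the requests that report four files, e.g. right after uploadHelper.upload() in the bean. The sketch below assumes the com.jspsmart.upload API exposes Files.getCount(), Files.getFile(int), File.getFieldName(), File.getFileName() and File.isMissing(); treat those names as assumptions and check them against the jspSmartUpload version actually in use.
    import com.jspsmart.upload.File;
    import com.jspsmart.upload.Files;
    import com.jspsmart.upload.SmartUpload;

    public final class UploadDebug {
        // Hedged debugging sketch: dump the upload collection so a doubled form
        // submission or an unexpectedly counted empty field shows up in the log.
        static void logReceivedFiles(SmartUpload uploadHelper) {
            Files files = uploadHelper.getFiles();
            System.out.println("SmartUpload saw " + files.getCount() + " file field(s)");
            for (int i = 0; i < files.getCount(); i++) {
                File f = files.getFile(i);
                System.out.println("  field=" + f.getFieldName()
                        + " file=" + f.getFileName()
                        + " missing=" + f.isMissing());
            }
        }
    }
    If the log shows four entries when the form only has two file fields, the extra entries point to a duplicated form submission or a duplicated include of attachments.html rather than to the save() call itself.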

  • Mailbfr - rsync issue too many files??

    Hey, so basically we have an IMAP server with around 350 mail accounts.
    We use mailbfr to back up the mail database and the users' email, but mailbfr is not completing successfully. The error I get in the log that gets emailed to me is this:
    rsync: connection unexpectedly closed (8 bytes received so far) [sender]
    rsync error: error in rsync protocol data stream (code 12) at /SourceCache/rsync/rsync-24/rsync/io.c(359)
    mailbfr was aborted. The process was NOT completed successfully.
    I'm guessing, since we didn't have this issue a couple of months ago, that the mail store has grown too big; it's around 190 GB at the moment, so perhaps too many files for rsync to back up correctly?
    Any ideas on what we can use to back up this much data?
    Cheers for any help
    Calum

    While I can't exclude the large number of files being the issue (I have seen rsync handle larger amounts, but that doesn't mean a particular combination of size/number couldn't break rsync), I somehow have the feeling it's a different issue.
    Calum, if you need to further investigate this, drop me an e-mail or catch me online.
    That said, with this amount of users, I'd consider splitting things over at least a couple of servers.

  • CS3 Camera Raw Too Many Files Open at the Same time alert!!!

    Please help me. I keep getting this error message in Camera Raw that says there are too many files open at the same time - but I only have 100 open:( Please help - I was getting this error with CS2 and thought upgrading to CS3 would fix it and it didn't!!!

    > "10 or 100 - you can quickly go through a stack of images in ACR and make any desired changes you want. Whether making the same or similar adjustment to similar files, or making radically different adjustments to different images as appropriate".
    I've done this with far more than 100! I think my maximum is 425 raw files, invoking ACR from Bridge without Photoshop even loaded, and it worked well. (I've also done 115 JPEGs in order to crop them under extreme time constraints).
    It can be very slick. For example, if I use a ColorChecker a number of times in a shoot, it is easy to select just the set (perhaps 100 or so) that a particular ColorChecker shot applies to and set the WB for all of them.
    Furthermore, in case people don't know, you can set ratings on raw images while many of them are open in ACR. (Just click under the thumbnail). It isn't as powerful as Lightroom, but it is not to be dismissed.
    I suspect that it is possible to apply sensor-dust-healing to lots of images in the same way, and certainly it is easy to apply presets based on various selections.
    Perhaps with AMG (Adobe Media Gallery) it will be sensible to use the above capability to process 100s of raw files, then create a set of web pages for the best of them, in not much more time than it would have taken in Lightroom. I judge that Lightroom is the "proper" tool for the job (perhaps after 1.1!), but Bridge+ACR can go a long way.

  • Too many files on 10.4 Desktop - Finder won't load

    2.66 DualCore Intel Xeon + OSX 10.4.10 + 2Gb DDR2 DIMM + 230Gb ST3250820AS + OPTIARC DVD RW AD-7170A
    Despite recurrent warnings, a relative kept leaving too many files on the Desktop, up to 300 files (JPGs, downloaded archives). Last week, after attempting to move/copy them to a folder, OSX crashed, and now Finder won't load; after password screen, menu bar and Dock appear, the 'MyDay' app (from Entourage) starts bouncing, and system freezes. Sometimes it goes back to password screen, and from there again to a frozen, empty finder with dead mouse pointer. Other times the spinning clock or wheel remain spinning forever.
    Tried DiskUtility, repaired disk and permissions, reset PRAM, no dice.
    Tried to boot off the 10.4.10 DVD, but it won't let me run OS X from the disc (I have a vague recollection of doing this once, but...).
    Started in SafeBoot mode, finder froze at same stage.
    Re-Installed OSX 10.4.10 in Archive and Install mode, same freeze at same stage.
    The only explanation for the crash, as no updates or new app installs happened, is that there are, again, too many loose files on the desktop, and the system crashes as it tries to load the Finder. I'm hauling the machine over ASAP to try to access his disk via FireWire target disk mode from my own Intel Mac, but I worry the same thing will happen on my system - that it will balk at too many files.
    Is there a way to try to fix this without resorting to clean install (needless to say, last backup was months ago)? If I install Leopard on that disk (it comes out fine in Disk Utility) will the same thing happen as soon as it tries to load the Finder? Is there a way to boot 10.4 off a CD/DVD?
    Thanks for any info or tips!

    Welcome to the forums!
    Make your relative write this out 1000 times:
    Performance tip: Keep the Desktop clutter-free (empty, if possible)
    Mac OS X's Desktop is the de facto location for downloaded files, and for many users, in-progress works that will either be organized later or deleted altogether. The desktop can also be gluttonous, however, becoming a catch-all for files that linger indefinitely.
    Unfortunately - aside from the effect of disarray it creates - keeping dozens or hundreds of files on the Desktop can significantly degrade performance. Not necessarily because the system is sluggish with regard to rendering the icons on the desktop and storing them in memory persistently (which may be true in some cases), but more likely because keeping an excessive number of items on the Desktop can cause the windowserver process to generate reams of logfiles, which obviously draws resources away from other system tasks. Each of your icons on your desktop is stored as a window in the window server, not as an alias. The more you have stored, the more strain it puts on the window server. Check your desktop for unnecessary icons and clear them out.
    Keeping as few items as possible on the Desktop can prove a surprisingly effective performance boon. Even creating a single folder on your Desktop and placing all current and future clutter inside, then logging out and back in can provide an immediately noticeable speed boost, particularly for the Finder.
    And it is why Apple invented 'Stacks' for Leopard.
