Folder Structure & Dates query

Hey all,
I'm new to the forum but have been using Lightroom for a couple of years to organise and tweak my photos as an amateur.  I use predominantly a 5dmk2 for professional video and amateur photography.  I'm on a Mac Pro running Lightroom 3.  I have 4 internal drives in the Mac Pro, and one is a dedicated Lightroom drive, where I import all my photos via Lightroom.  This is my master catalog, which I back up. The photos date back to 2003 and are all displayed by date in the correct folders etc. Lovely and organised.
I also use a MacBook Pro, and occasionally have to merge catalogs.  I merge catalogs if I have tweaked the files on the MacBook and want to add them to my master catalog complete with edits.  On occasions when I simply want the photos to be part of the master catalog, and to tweak them from there, I have simply imported the photos by connecting the MacBook via FireWire so it mounts as a disk.
What I have (on the drive) is largely how I want it: all photos on the drive organised by month and year as they were taken, as per the metadata for each image.  There are, however, a few folders that both annoy me from an OCD point of view and also do not fall into the correct date folders, so they cannot be easily found and organised into collections etc.
The problem folders look like this:
Screenshot: http://i556.photobucket.com/albums/ss3/DaveBaum/Screenshot2013-10-11at124017.png
The opened example folders shown in the image don't relate to dates at all, so the 2012 > 08 > 27 is not August 27th 2012, just a folder with a load of images in it taken over a variety of dates.
Can anyone help me understand:
1 - why this happens
2 - how I can re-organise them into the correct, uniform folder structure used for 95% of the images, without having to manually check the date for each image, move it into a new folder, and delete the current folders when they are empty?
I want all images to be filed by the date they were taken, and to be added to the folder structure in a uniform manner. I have tried the "move" import option, but as they are all present on the same drive, this doesn't work.
Life is arguably too short to worry about this sort of thing, but it has been bugging me for months!  And it's making it difficult to organise images from multiple sources; I think that's what is causing me the issue.
Many thanks in advance.
Dave

Thanks for your reply, Rob; it gave me an idea.
I have gone into the Finder, found the folders that don't match and are out of sync with the correct dates, copied all media from within them to a new location, and selected the "move" option on import. I've removed the link to them in Lightroom and am re-importing them.
Consequently, on re-importing them, Lightroom is rebuilding a folder structure in the Finder and all seems to be well with the world again.
Thanks very much
Dave

Similar Messages

  • Creating folder Structure for Query based taxonomy

    Hello friends,
    I am confused as to where I create a folder hierarchy for taxonomy. Do I need to create a new repository for the folder structure? help.sap.com mentions using the Category Browser iView for creating it, but I am unable to find this iView in EP6 SP12. Can someone please guide me as to where I create a taxonomy folder structure hierarchy?
    Your help is appreciated.
    Thanks,
      G.G

    Hey G.G.,
    Yes, I'd love to.
    So, as stated above, directly below the "Taxonomies" folder you'll see the names of the existing taxonomies. There is no "New" option at this level, because a new taxonomy can only be created under "System Administration" -> "System Configuration" -> "Knowledge Management" -> "Index Administration". After you create a "TREX Search and Classification" or "TREX Classification" index, you can go to the index menu "Taxonomies" and choose "New". This will automatically create a new folder in the "Taxonomies" view.
    Now, if you go one level further, to the second subfolder under "Taxonomies", you should have the "New" option. If not, you'll have to check the permission settings. Please go to "Content Administration" -> "KM Content" -> "Taxonomies" and then on the menu "Details" -> "Settings" -> "Permissions". You might need to adjust permissions as suggested by SAP in SAP Note 599425:
    Role:"ContentManager"=FULL CONTROL;
    Group:"Everyone"=READ.
    Please check that the already existing subfolders have inherited the permission changes.
    Hope this solves your problem,
    Robert

  • Script to copy data to new folder structure

    I am working on a project where I need to migrate many TBs of data into a new folder structure.
    I have exported one of the folders, including all sub-folders, with TreeSize and exported the result to Excel.
    So the source folder structure is in column A, which has several thousand lines, and I have put the destination folders in column B.
    Now I would like to script the data copy: for every line, the source folder is listed in column A and the destination folder in column B.
    This has to include all contained sub-folders and files, of course.
    Has anyone got an idea what the script should look like?
    I can export the Excel sheet to CSV, but then my knowledge unfortunately stops.

    Win32 Device Namespaces
    The "\\.\" prefix will access the Win32 device namespace instead of the Win32 file namespace. This is how access to physical disks and volumes is accomplished directly, without going through the file system, if the API supports this type of access. You can access many devices other than disks this way (using the CreateFile and DefineDosDevice functions, for example).
    https://msdn.microsoft.com/en-us/library/windows/desktop/aa365247(v=vs.85).aspx
    ¯\_(ツ)_/¯
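
    For the scripting question itself, here is a minimal sketch in Java (matching the other code in this thread); it assumes a CSV file named mappings.csv where each line is "sourceFolder,destinationFolder" — both names are placeholders of mine, not from the thread:

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.List;
    import java.util.stream.Stream;

    // Reads "source,destination" lines from a CSV and mirrors each
    // source tree (all sub-folders and files) under its destination.
    public class CsvTreeCopy {
        public static void main(String[] args) throws IOException {
            List<String> lines = Files.readAllLines(Paths.get("mappings.csv"));
            for (String line : lines) {
                String[] cols = line.split(",", 2);
                if (cols.length < 2) continue; // skip malformed lines
                Path src = Paths.get(cols[0].trim());
                Path dst = Paths.get(cols[1].trim());
                // Walk the source tree; directories are visited before
                // their contents, so targets always have a parent.
                try (Stream<Path> paths = Files.walk(src)) {
                    for (Path p : (Iterable<Path>) paths::iterator) {
                        Path target = dst.resolve(src.relativize(p));
                        if (Files.isDirectory(p)) {
                            Files.createDirectories(target); // keep folder structure
                        } else {
                            Files.copy(p, target, StandardCopyOption.REPLACE_EXISTING);
                        }
                    }
                }
            }
        }
    }

    For TBs of data, robocopy (as discussed in the next thread) is the more robust tool; the sketch mainly shows the column-A-to-column-B loop.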

  • Migrate data to new folder structure

    I need to migrate a lot of data to a new folder structure; one of the things I will have to deal with is file paths that are too long.
    To copy the data I have written a PowerShell script which uses robocopy to copy everything.
    Of course I want as little disruption in this process as possible, so what can I do to prevent issues with long file paths?
    Is there an easy way to modify my script to detect issues with long file paths?
    What would be the way to prevent copy errors during the robocopy action and fix possible issues before starting?

    There are utilities out there (free and paid) that you can run which will give you a report on which folder paths are too long. What I did was take the results in the report and email them to an admin assistant or some other key person in each department, and they would fix the problem. Once they were all done, I would do the migration.
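
    If it helps, here is a small sketch (in Java, to match the other code in this thread; the root folder D:/data is a placeholder) that produces such a report by flagging every path longer than the classic Windows MAX_PATH limit of 260 characters:

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.stream.Stream;

    // Walks a folder tree and prints every absolute path longer than
    // MAX_PATH, so the offending folders can be fixed before migration.
    public class LongPathReport {
        public static void main(String[] args) throws IOException {
            final int MAX_PATH = 260;
            try (Stream<Path> paths = Files.walk(Paths.get("D:/data"))) {
                paths.map(p -> p.toAbsolutePath().toString())
                     .filter(s -> s.length() > MAX_PATH)
                     .forEach(s -> System.out.println(s.length() + "  " + s));
            }
        }
    }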

  • SharePoint List Filter to narrow a folder structure

    Hello,
    The company I work for has a document library containing folders for each of their hundreds of clients.  The current process is to load all of the folders in alphabetical order so that users can scroll through and find the correct client relatively quickly.  This takes longer than it should, considering how many folders need to load.
    I was trying to utilize the SharePoint List Filter to pull the Title column, so the users could filter on the name and we wouldn't need to pull all of the data each time the library was opened, but it seems that the Title and Name columns don't hold their connection to the list.  Each time I attempt to set the connections and then use the filter, I get the "This filter is not connected" error.
    Is it possible to use this web part in this manner (or is there another web part that will work), or should I suggest moving to metadata instead of a folder structure if they want speed of use?
    Thank you in advance!

    Hi,
    Based on your description, my understanding is that you want to filter the document library by the Title column, so that users can filter based on the name and don't need to pull all of the data each time the library is opened.
    Refer to the following steps:
    1. Open your document library -> choose Export to Excel.
    2. In Excel, click Insert PivotTable and create a pivot table.
    3. Use Report Filters to filter the Title column.
    4. Save the workbook and upload it to a document library.
    5. Create a new page in SharePoint and add the Excel Web Access web part, configuring it to display your workbook (or add the Excel Web Access web part in your document library and configure it to display your workbook). Now you can filter the document library by the Title column.
    Here is a link about how to Create a pivot table in Excel 2010:
    http://www.techonthenet.com/excel/pivottbls/create2010.php
    Here is a link about how to use Report filters:
    http://www.gcflearnfree.org/excel2010/20.5
    Besides, you can refer to the following blog:
    http://consulting.risualblogs.com/blog/2014/09/10/filtering-excel-webparts-in-sharepoint-using-query-string-parameters/
    Best Regards,
    Lisa Chen

  • What is the best way to explore a hierarchical folder structure?

    Hello,
    I need to access and navigate a hierarchical folder structure hosted in an MS SQL Server database. In particular there is a root folder containing several folders; each child folder contains further nested folders or documents.
    For each item I need to retrieve the folder details (name, path, etc.) and the document details (title, author, etc.) that are retrievable from the DB fields. Afterwards I will use these data to create a semantic web ontology using the Jena API.
    My question is about the best way to proceed.
    A colleague of mine suggested using the "WITH" command of SQL Server to create a linked list to navigate the structure easily, executing just one query rather than several (one for each level of the nested loops). However, that solution will only work with the SQL Server database, while my goal is a more general solution.
    Can someone help me?
    Thank you in advance,
    Francesco

    My goal is to create a documents library ontology, retrieving from each element of the hierarchy (folder or document) some data (title, parent, etc.) and using it to "label" the ontology resources.
    I will use a little of both approaches in the following way:
    1) I make just ONE query on the folder table to get, for each folder, its path (e.g. root/fold1/fold2/doc1.pdf), its ID and its ParentID, and ONE on the documents table to get the containerID, title, etc.
    2) I create as many Folder objects as there are retrieved records, plus a HashTable where the KEY = Folder.ParentID value and the VALUE = Vector<Folder>. I then add each object to the Vector for its ParentID. In this way I have a Vector containing all the folders that are children of the same parent folder, and I do the same with a HashTable keeping the documents contained in a specific folder.
    3) I extract from the HashTable the root folder (whose ParentID is always "0" and whose ID is "1"), then the method appendChild() is invoked (see code):
         public static void appendChild(int ID, Resource RES) {
              Vector<Folder> currFold = table.get(ID);
              for (int i = 0; i < currFold.size(); i++) {
                   // Extract the child and create the relative resource
                   Folder child = currFold.get(i);
                   Resource newRES = createResource(child, RES); // resource creation not shown in the post
                   if (table.containsKey(child.getID())) {
                        appendChild(child.getID(), newRES);      // depth-first recursion
                   }
              }
         }
    In this way I go deep into the hierarchical structure using a depth-first ("left most") procedure. I made a test and the output is correct. However, this must be done for folders about 4 levels deep (around 30 folders in total), which also contain documents, to create the documents library of one project. Then I must process around 20 projects to build such a documents library representation for each of them.
    By the way, I do not have to maintain the HashTable content after I have created the docs library ontology. Hence I use just one HashTable for ALL the projects and flush it after I finish the loop for one project, in order to save resources.
    My question is: is my approach right, or could I improve it in some way?
    Thank you for every suggestion/comment.
    Francesco
    Edited by: paquito81 on May 27, 2008 8:15 AM
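    To make the two-pass idea described above concrete, here is a self-contained sketch (Folder here is a stand-in record of mine, and the sample rows replace the real SQL result set):

    import java.util.*;

    // Pass 1 groups the folders by ParentID; the recursion then walks
    // the tree depth-first, which is where each ontology resource
    // would be created.
    public class TreeBuildSketch {
        record Folder(int id, int parentId, String name) {}

        static Map<Integer, List<Folder>> byParent = new HashMap<>();

        public static void main(String[] args) {
            // In the real code these rows come from the single query
            List<Folder> rows = List.of(
                    new Folder(1, 0, "root"),
                    new Folder(2, 1, "fold1"),
                    new Folder(3, 2, "fold2"));
            for (Folder f : rows) {
                byParent.computeIfAbsent(f.parentId(), k -> new ArrayList<>()).add(f);
            }
            appendChild(0, ""); // the root's ParentID is always 0
        }

        static void appendChild(int parentId, String path) {
            for (Folder child : byParent.getOrDefault(parentId, List.of())) {
                String childPath = path + "/" + child.name();
                System.out.println(childPath);      // create the resource here
                appendChild(child.id(), childPath); // depth-first recursion
            }
        }
    }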

  • Folder Structure Archiving: exporting folder structure + content

    Hi,
    I would like to copy part of an existing folder structure (with its contents) from a production UCM 11g instance (I'm using Folder_g) to a development instance (using a tar of the newly created archive).
    I'm trying to use the Folder Structure Archiving component.
    I'm using the guide:
    http://docs.oracle.com/cd/E21764_01/doc.1111/e10792/c06_migration.htm#CHDFHEEI
    Now, I successfully executed step 6. of "8.10.3.1 Creating a Folder Structure Archive".
    However, if I list the newly created archive on the filesystem, all I can see is the archive.hda file (without references to the selected folders) and no content.
    Was the content exported? If yes, where can I find it? If not, how can I export it?
    What I expected was a "regular" archive as produced by the Archiver tool.
    Am I missing something?
    Any help will be greatly appreciated.
    Thank you.
    Best regards.
    Simone.

    Hi Simone,
    I was also facing the same issue. You can try one thing: create an archive folder in your source UCM server through Archiver. In the Export Data tab specify the custom query as xCollectionId = <Your Folder Collection Id>. You can get the collection ID of your folder by viewing the preview without specifying any query and then selecting the extra column 'Folder' to show in the table. In this way you can export the content for the selected folder.
    After that, go to the Folder Structure Archive tool, select the archive you have just created, and then select the folder structure you want to archive.
    This is not the official way, but you can try it.
    Thanks,
    Sachin

  • Library creating a new duplicate folder structure

    I am using 2.0 on a Mac. Recently I moved all my data files, including photos and Lightroom catalogs, from an external drive to an internal drive on my Mac Pro. I renamed the new drive with the same name as the external, renamed the external, and it became a backup. Lightroom continued to work just fine, seeing all the images with no problem. If I do a "Show in Finder" it shows me the images on the internal HD. However, when I import a new image, or even import an image or folder that has previously been imported, LR brings it in within a new folder structure that uses the same hard drive name.
    My actual hard drive folder structure for my data drive is something like HardDriveName/Graphics/Airshows/image.jpg with all my images in a Graphics folder and within that a series of folders and subfolders.
    Previously if I imported a new folder, LR would show HardDriveName/NewFolder and I have several folders of imported images with that structure. It did not include the /Graphics folder because I never imported the entire Graphics folder. Now however, after importing a folder, LR created a new folder structure in the Library with a duplicate of the HardDriveName and it included /Graphics.
    I went to the first folder structure and picked a small folder and made a change to an image to darken it a great deal. Then I imported the folder all over again and it imported it to the new folder structure but the dark version was not visible. I DO have automatically write xmp data checked.
    If I right-click on any image from either folder structure, LR shows me the image on the same hard drive where my originals are found.
    I don't want to have two folder structures, and apparently if I re-import all my images LR doesn't see the XMP data. I tried to tell LR to read the XMP data from the image I had changed after it came into the new folder, but it did not show up. If I manually tell LR to Save Metadata to File for a folder in my original structure, and then go to the same folder in the new structure and tell it to read metadata, it will show up in the new folder. I have one folder with 32,000+ images and thousands of other images.
    I would like LR to continue to bring images into my "old" folder structure but that may not be possible. Any help?
    Interestingly I have more than one data drive. This problem does not seem to be happening with my other data drive that I also swapped to an internal configuration.
    I put a pic of my folder structure at http://www.pbase.com/santa/image/101246417 to illustrate the issue. My harddrive name shows as Brandon.
    I spoke too soon. LR seems to be creating a new folder structure for two of my three hard drives and adding images properly to one of my three hard drives. All three were new drives with the names changed to match the old drives.

    Yes, I did update Camera Raw; I think LR 2.0 asked for that. I did a search on "duplicate" and found no answer to my problem. Anyone with a link to any useful information?

  • How to change location of files (folder structure to storing in iPhoto lib)

    I need help.
    I thought I was smart, but as is always the case, I find out that I am not.
    I did not trust iPhoto at first, so I retained my images in my 8 years of folder structure (by year, and then subfolders by month).
    My library is getting large, 7000-plus images, and when importing images (I do not have the setting for imported images to be stored in the iPhoto Library) and wanting to delete some, I find it more work than is necessary.
    Bottom line: can someone tell me if there is a way to change the setting (of course there is), but also to re-import, or move, my images that are already in iPhoto (but are really just previews referencing my folder structure) so that they will reside in the iPhoto Library?
    When I import, it finds duplicates. If I say do not import, then it does not go any further; but if I say "yes, import duplicates" then it brings them in as duplicates, which means I have two copies, but only one retains all of the keywords etc.
    Another bottom line: I want to retain my keywords in my current library, but move all images into the iPhoto Library instead of my folder structure. I can do this from this date forward, but I want to get the entire library to be this way.
    Help?

    read through this thread - it should help you
    http://discussions.apple.com/thread.jspa?threadID=1125431&tstart=45
    Larry Nebel

  • Automatic creation of KM folder structure from xml pattern

    Hi all,
    Is it possible to create a KM folder structure automatically, following the tree structure of an XML document?
    For example:
    an xml-document with following content:
    <item id="1" level="0">abc</item>
    <item id="2" level="1">aaa</item>
    <item id="3" level="1">bbb</item>
    <item id="4" level="2">ccc</item>
    <item id="5" level="0">def</item>
    I'd like to create a KM folder structure like this:
    - abc
    --- aaa
    --- bbb
    ----- ccc
    - def
    Does anyone have an idea how to implement this scenario in KM?
    Much obliged!
    Steffi

    Not currently, or at least not easily; there is always the programmatic approach.
    I would suggest you create an FSDB repository; this means you can then create your structure in the file system using standard desktop tools.
    After that you could do a mass copy back into the DB-based KM repository if you so wished.
    For your downstream systems you could always ICE the data across, meaning you don't have to create everything again.
    Haydn
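
    For the programmatic approach Haydn mentions, a minimal sketch might look like this (local directories stand in for the KM repository API, the file name structure.xml is a placeholder, and the items are assumed to be wrapped in a single root element so the document parses):

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.*;

    // Reads the level-based <item> list and recreates the nesting as
    // folders: each item becomes a child of the last item seen one
    // level above it.
    public class XmlToFolders {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new File("structure.xml"));
            NodeList items = doc.getElementsByTagName("item");
            String[] lastPathAtLevel = new String[16];
            for (int i = 0; i < items.getLength(); i++) {
                Element item = (Element) items.item(i);
                int level = Integer.parseInt(item.getAttribute("level"));
                String name = item.getTextContent().trim();
                String parent = (level == 0) ? "km-root" : lastPathAtLevel[level - 1];
                String path = parent + "/" + name;
                lastPathAtLevel[level] = path;
                new File(path).mkdirs(); // in KM this would be the folder-create call
            }
        }
    }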

  • SAP Netweaver Portal - Folder structure and report publishing strategy

    Hi gurus,
    I'm working on a project in which we are publishing BW reports in SAP NetWeaver Portal (prior to this, we were using the SAP BEx Browser), organizing them by Department (workset, 2nd level of navigation), Business Processes (workset, 3rd level of navigation) and "iView types" (transactions, queries, dashboards and so on). On the 1st level we have a workset simply called "Reports".
    For each PFCG role we have in BW (ABAP server), we created a portal role and defined which of the higher navigation tiers is visible to users (delta-linking the top-level navigation worksets to the portal role and setting visibility accordingly), and then we assigned portal roles to the appropriate group (PFCG role), so all users that have access to a certain PFCG role can access the corresponding portal role.
    For example:
    In BW (ABAP) we have a PFCG role, Sales PMR Analysis - Administrator, which grants access to Sales InfoProviders, queries and data for PMR analysis. A user that has access to it should be able to see:
    |Reports|
    |Sales Administration|
    v PMR Analysis
      > Web Queries
      > Transactions
      > Dashboards
      > Workbooks
    In the portal we create a workset that has "Reports" and, under it, all departments (Sales Administration, Financial, Services, Human Resources and so on) set as "not visible". We create a portal role called PMR Analysis and add the Reports/Departments workset, set Sales Administration to visible, and add worksets/folders named PMR Analysis, Web Queries, Transactions, etc., with all the corresponding iViews under them.
    If we assign this new Portal role to the Group of the corresponding PFCG role, the user will see exactly as the example from above.
    Pros: The users have a clean view of the reports they have access to.
    Cons: The users don't have a clear view of what is available in BW that they could request access to. There might be some useful report lost in one of the hundreds of BW roles that they don't know exists.
    One of the options would be to grant visibility to ALL departments, business processes and iViews in the portal and let PFCG security roles control user access, but that might be confusing: an overwhelming number of links makes it hard to find what the user needs, especially during first access. Not to mention possible security breaches.
    So, I'd like to know how other consultants are defining this kind of folder structure and report publishing strategy in Portal. Ideally we would like to be able to have:
    - Visibility of all existing reports, even those the user does not have access to (so they can learn that a report exists and request access).
    - Clean view of reports, segmented by department and business processes.
    - The possibility to search for reports, even those that the user does not have access to.
    I'd appreciate if you guys can share your experience on this.
    Thanks in advance.
    Leandro

    Cons: The users don't have a clear view of what is available in BW that they could request access to. There might be some useful report lost in one of the hundreds of BW roles that they don't know exists.
    This is not a refutation: your BW developers/consultants should be able to write reports that point you in the right direction in case of insufficient backend rights. EP is not designed to be familiar with the ABAP security concept and can't take any influence on it.
    One of the options would be to grant visibility to ALL departments, business processes and iViews in the portal and let PFCG security roles control user access, but that might be confusing: an overwhelming number of links makes it hard to find what the user needs, especially during first access. Not to mention possible security breaches.
    Yes, bad idea.
    - Visibility of all existing reports, even those the user does not have access to (so they can learn that a report exists and request access).
    As you already wrote yourself, this would not be good.
    - Clean view of reports, segmented by department and business processes.
    - The possibility to search for reports, even those that the user does not have access to.
    This is not covered by SAP; if you want to provide that, you'll have to develop it yourself.
    cheers

  • How do i get my folder structure in Lightroom to replicate my hard drive folder structure?

    My issue is that, post-import from Aperture, my folder structure in Lightroom does not match the folder structure on my hard drive. In particular, rather than being structured as on my hard drive:
    2015 (folder)
    -> 2015-01-30 (sub folder)
    -> 2015-02-15 (sub folder)
    -> 2015-03-03 (sub folder)
    All of the photo folders are listed individually as:
    2015-01-30
    2015-02-15
    2015-03-03
    In addition, a number of folders have been created with names of the form 01a2a201aff9c8c684928629ea41221c0fa5930ab7, where no folder of this form exists on my hard drive, even though the images contain the right date metadata.
    As I've got photos dating back to 2005 and a lot of different days of taking photos, this makes the Folders view unmanageable.
    I completed the Aperture migration in two stages (as I did not have sufficient space to duplicate the images in the move across to Lightroom):
    1.  Moved the Aperture original images from Referenced to Managed (with the file structure above).
    2.  Then used the Lightroom plug-in to import the library from Aperture (which successfully copied across the metadata, my folder structure and edits from Aperture).
    Does anyone know how to fix this issue?  In particular:
    1) Adding the 2015 folder level to the structure?
    2) Removing all of the 01a2a201aff9c8c684928629ea41221c0fa5930ab7 folders?
    Many thanks in advance,
    Joe

    In the "folders" pane, right-click on the folder you see and select "show parent folder". Repeat as needed.

  • Create an image / map of folder structure in Finder

    Hi there,
    I am reorganizing some data that I want to make available to others later on. To ease the transition for them, it would be extremely helpful if I could create a map of the folder structure. I am thinking of something like a mind map which shows a folder and then each subfolder underneath it. It doesn't need to show the actual files within those folders.
    Is there any way to have something like that created automatically? Or maybe some software (freeware would be best, of course)?
    Thanks a lot,
    Sebastian

    Duplicate post: http://forums.adobe.com/thread/1338701?tstart=0
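
    The linked duplicate aside, if a plain-text outline is enough, a small sketch like this (the starting path is a placeholder) prints the folder tree with indentation, which most mind-mapping tools can import:

    import java.io.File;

    // Prints an indented outline of a folder tree (folders only, no
    // files), suitable for pasting into a mind-mapping or outlining tool.
    public class FolderOutline {
        public static void main(String[] args) {
            print(new File("/Users/sebastian/Data"), 0);
        }

        static void print(File dir, int depth) {
            System.out.println("  ".repeat(depth) + dir.getName());
            File[] children = dir.listFiles(File::isDirectory);
            if (children == null) return; // not a directory or unreadable
            for (File child : children) {
                print(child, depth + 1);
            }
        }
    }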

  • Problem unzipping a file with a folder structure

    hi,
    How can I unzip a file and its folders, preserving the folder structure? When I unzip a file it creates problems. Please help me solve this.
    Here is my code:
    import java.io.*;
    import java.util.zip.*;

    public class MakeUnzip {
        final static int BUFFER = 2048;

        public static void main(String argv[]) {
            try {
                FileInputStream fis = new FileInputStream("D:/serverdata/dates.zip");
                ZipInputStream zis = new ZipInputStream(new BufferedInputStream(fis));
                ZipEntry entry;
                while ((entry = zis.getNextEntry()) != null) {
                    File target = new File("D:/clientdata/", entry.getName());
                    if (entry.isDirectory()) {
                        target.mkdirs();   // recreate the directory (and missing parents)
                        continue;          // nothing to extract for a directory entry
                    }
                    target.getParentFile().mkdirs(); // parent may not have its own entry
                    int count;
                    byte data[] = new byte[BUFFER];
                    BufferedOutputStream dest = new BufferedOutputStream(new FileOutputStream(target), BUFFER);
                    while ((count = zis.read(data, 0, BUFFER)) != -1) {
                        dest.write(data, 0, count);
                    }
                    dest.flush();
                    dest.close();
                }
                zis.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
    Please give me a solution.
    Thanks in advance

    Try this one and change it as you like:
    import java.util.zip.ZipFile;
    import java.util.zip.ZipEntry;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.util.Enumeration;
    import java.io.File;

    public class UnzipFile {
        private static void doUnzipFiles(String zipFileName) {
            try {
                ZipFile zf = new ZipFile(zipFileName);
                System.out.println("Archive: " + zipFileName);
                // Enumerate each entry
                for (Enumeration entries = zf.entries(); entries.hasMoreElements();) {
                    // Get the entry and its name
                    ZipEntry zipEntry = (ZipEntry) entries.nextElement();
                    if (zipEntry.isDirectory()) {
                        new File(zipEntry.getName()).mkdirs(); // recreate the folder
                    } else {
                        String zipEntryName = zipEntry.getName();
                        System.out.println(" inflating: " + zipEntryName);
                        // make sure the parent folder exists before writing
                        File parent = new File(zipEntryName).getParentFile();
                        if (parent != null) parent.mkdirs();
                        OutputStream out = new FileOutputStream(zipEntryName);
                        InputStream in = zf.getInputStream(zipEntry);
                        byte[] buf = new byte[1024];
                        int len;
                        while ((len = in.read(buf)) > 0) {
                            out.write(buf, 0, len);
                        }
                        // Close streams
                        out.close();
                        in.close();
                    }
                }
            } catch (IOException e) {
                e.printStackTrace();
                System.exit(1);
            }
        }

        public static void main(String[] args) {
            if (args.length != 1) {
                System.err.println("Usage: java UnzipFile zipfilename");
            } else {
                doUnzipFiles(args[0]);
            }
        }
    }

  • How can I make a folder structure for both PSE and Premiere Elements?

    I have recently bought bundled PSE 10 and Prem Elements 10 ... (but have been using PSE6 for years)
    I have lots of JPEG photos on an external drive in folders with a hierarchical structure, e.g. Pictures ... Friends ... 2011 ... 2011_12_02_Margate_seagulls.jpg. I also have loads of photos that are simply there waiting to be sorted and are in their native format, e.g. PIC_6120.jpg. Additionally I have .MOV files dumped from my JVC standard-def camera, some of which are custom named, others left as is. I now own a hi-def Lumix which takes great JPEGs and also video MTS files. After lots of searching on the net for the best way to get all this organised, I am at a standstill. I feel that I need an external folder system (i.e. one that does not rely 100% on Adobe and its categories and tags) because I have had catastrophic losses when using earlier Photoshop programs. I also see the benefit of using tags. So...
    Is there a simple (and best) way to order my photos and hi-def video while taking into account content I have already shot/videoed? I am happy to hear reasons and see examples why I should trust PSE/Premiere Elements, but I would also like to see if it makes sense to use folders. Ideally I would love a practical folder-structuring method that incorporates both video and still content. I have posted here because it is important that I organise my video better and have a flexible system that allows for drafts and reworking of my hi-def content.
    I have tried to find the info, but it is so piecemeal, and I have faith in the people on here. Any links and advice would be much appreciated.

    In generic terms there are three methods of filing just about anything:
    Structured by event
    Structured by type
    Unstructured
    Event might be occasion or date or a specific project. For this type of structure you would store all your assets related to that event, regardless of whether it is still, image or audio, in one place.
    Type would typically be by the type of asset: all your videos in one place, stills in another, audio in another, etc. In business, for example, in the eighties (when personal computing became prevalent in the workplace) staff would religiously keep all their word processor documents in one place, spreadsheets in another, PowerPoints in yet another, etc. Dreadfully inefficient.
    Unstructured would be a case of storing things wherever you liked and accessing them via tags or albums - but you've said you want to avoid this.
    Personally I let the application store things according to the defaults for the different import types but then tag them. Essentially I follow the 'unstructured' style. For anything major I create a project directory and move all the project assets into it (i.e. selective structure by event).
    If you search the Tips & Tricks forum I think Bill Hunt (who does this stuff for a living) describes his structured folder flow in great detail.
    Cheers,
    Neale
    Insanity is hereditary, you get it from your children
