Naming huge amounts of files after "undelete" / photorec / etc?

Hello!
Does someone already have a neat little script to automatically name a huge number of files in multiple fuzzy sorts of ways? For example, using ID3 tags, EXIF tags, or the first line / header of text documents of various formats... and sorting by file extension, MIME type, or file size if nothing else is available... and stuff? I'm only familiar with a few of the tools required and still failed miserably at getting them all together in a bash script.
If not: any hints on how to get started? I'm still missing a good way to deal with just about everything except for images, mp3 & plain text...
Thx!
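
Something like the following minimal sketch might be a starting point, assuming exiftool, id3v2 and file are installed; every path and naming pattern below is just a placeholder, not a finished solution:

    #!/bin/bash
    # Sketch only: rename recovered files based on whatever metadata is available.
    # exiftool, id3v2 and file are assumed to be installed; SRC/DEST are placeholders.
    SRC=./recovered
    DEST=./named
    mkdir -p "$DEST"

    find "$SRC" -type f | while IFS= read -r f; do
      mime=$(file -b --mime-type "$f")
      base=$(basename "$f")
      ext=${base##*.}
      case "$mime" in
        image/jpeg)
          # Use the EXIF timestamp if present.
          name=$(exiftool -s3 -d '%Y%m%d_%H%M%S' -DateTimeOriginal "$f" 2>/dev/null)
          ;;
        audio/mpeg)
          # Build "Artist - Title" from an ID3v2 tag if there is one.
          name=$(id3v2 -l "$f" | awk -F': ' '/^TPE1/ {a=$2} /^TIT2/ {t=$2} END {if (a && t) print a " - " t}')
          ;;
        text/plain)
          # First non-empty line, trimmed to something filesystem-friendly.
          name=$(head -c 4096 "$f" | grep -m1 . | tr -cd '[:alnum:] ._-' | cut -c1-60)
          ;;
        *)
          name=""
          ;;
      esac
      [ -z "$name" ] && name="${base%.*}"   # fall back: keep the recovered name, sort by MIME/size later
      mkdir -p "$DEST/$mime"
      cp -n "$f" "$DEST/$mime/${name}.${ext}"
    done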

I'm not sure if gprename does anything useful - if it does, I can't find it in the GUI.
easytag and krename are great, thanks! They're just not fit for so many files, and I'd also love it if there were a way to do it without choosing a pattern for everything manually (starting with the next time I need to do something like this).  Maybe I should have written this from the beginning: I have about 90M files... it might not be a one-off occasion (people don't learn -.-") and it's not supposed to be perfect: more like a quick & dirty automatic way of making it a little easier for the "end-user" to find his files in this huge mess. I don't care (much) if the script runs for days, as long as I don't have to interact with it, if the result is worth it.
So far, I've managed to reduce the number of files by about 99% with my script (duplicates, damaged files, very small files, files whose extensions or lines of text indicate they belonged to some program's cache or something), have them sorted by file extension, and give names to Word documents, mp3s that have an ID3 tag, and photos that have EXIF data available. Most problematic at the moment:
- Processing odt's at a reasonable speed. Extracting huge amounts to /tmp & grep/sed-ing stuff out of the XML sucks (me too stoopid for awk ) - see the sketch after this list.
- No idea how to get rid of all those files that were part of the Windows installation or some other program. Especially the huge number of videos that seem to have been part of tutorial / encyclopedia type programs and help pages.
- I'd love to try & get some info from a p2p network. I understand that, clever or not, many many people share their whole hard disk, so if I could figure out how to query by file hash or something for "most common file name" and "number of peers that share / want to download", I could get very far at removing more useless files and naming others.
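
For the odt problem, one way to skip the extract-to-/tmp step entirely is to stream just content.xml out of the zip archive and take the first bit of text as the name. A rough sketch, assuming unzip and standard GNU tools; all paths are placeholders:

    #!/bin/bash
    # Sketch: name each .odt after the first ~60 characters of its text content,
    # without unpacking the archive to /tmp. Paths are placeholders.
    mkdir -p ./named/odt
    for f in ./recovered/*.odt; do
      title=$(unzip -p "$f" content.xml 2>/dev/null \
                | sed -e 's/<[^>]*>/ /g' \
                | tr -s '[:space:]' ' ' \
                | tr -cd '[:alnum:] ._-' \
                | cut -c1-60)
      [ -n "$title" ] && cp -n "$f" "./named/odt/${title}.odt"
    done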
Last edited by whoops (2011-01-09 09:45:57)

Similar Messages

  • I used Migration Assistant to load a Time Machine backup onto a new Mac.  The first TM backup after that took some time, perhaps not surprising.  But the backups thereafter have all taken hours, with huge amounts of "indexing" time.  Time to reload TM?

    Does every backup require lots of indexing?  If so, the index may be damaged.
    Try Repairing the backups, per #A5 in Time Machine - Troubleshooting.
    If that doesn't help, see the pink box in #D2 of the same link.

  • I deleted pictures from my Camera Roll using my computer, since I didn't have enough space to back up my iPad in my 5 GB of space, but after deleting a huge amount of pictures the size of the Camera Roll is still the same as when I started.

    I deleted pictures from my Camera Roll using my computer, since I didn't have enough space to back up my iPad in my 5 GB of space, but after deleting a huge amount of pictures the size of the Camera Roll is still the same as when I started. Some of these same pictures are still in my Photo Stream. I deleted some of them - this time from the iPad itself - but the size of the Camera Roll remains the same. Should I not have removed them from the Camera Roll using the computer?

    The links below have instructions for deleting photos.
    iOS and iPod: Syncing photos using iTunes
    http://support.apple.com/kb/HT4236
    iPad Tip: How to Delete Photos from Your iPad in the Photos App
    http://ipadacademy.com/2011/08/ipad-tip-how-to-delete-photos-from-your-ipad-in-the-photos-app
    Another Way to Quickly Delete Photos from Your iPad (Mac Only)
    http://ipadacademy.com/2011/09/another-way-to-quickly-delete-photos-from-your-ipad-mac-only
    How to Delete Photos from iPad
    http://www.wondershare.com/apple-idevice/how-to-delete-photos-from-ipad.html
    How to: Batch Delete Photos on the iPad
    http://www.lifeisaprayer.com/blog/2010/how-batch-delete-photos-ipad
    (With iOS 5.1, use 2 fingers)
    How to Delete Photos from iCloud’s Photo Stream
    http://www.cultofmac.com/124235/how-to-delete-photos-from-iclouds-photo-stream/
     Cheers, Tom

  • Why is the Apple Store so expensive, and why can't Apple users access it for free after spending a huge amount on an Apple phone?

    The Apple store is correct. The warranty is not international, and Apple will not accept or return iPhones shipped from a different country.  You need to ship the phone to somebody in Hong Kong who can take it in to Apple for repair, or pay a third party repair shop in the Philippines to fix it.

  • Try to shrink datafiles after deleting huge amount of records

    Dear Sir;
    I have deleted more than 50 million records, along with the table's partitions for the previous year.
    Now I'm trying to shrink a datafile that has more than 10 GB of free space, but I am still unable to resize it due to
    ORA-03297: file contains used data beyond requested RESIZE value.
    How can we shrink these datafiles, or otherwise, what is the best way to delete a huge amount of records and reclaim the space on disk?
    Thanks and best regards
    Ali Labadi

    Hi,
    You could look at this article by Jonathan Lewis:
    http://jonathanlewis.wordpress.com/2010/02/06/shrink-tablespace
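    The usual pattern behind that article is to move the segments that still sit beyond the size you are asking for, then resize. A hedged sketch of what that can look like; the file id, block threshold, object names, datafile path and target size below are placeholders, not taken from this thread:

        #!/bin/bash
        # Sketch only: free the end of a datafile so ALTER DATABASE ... RESIZE succeeds.
        # Every identifier below (file id, block threshold, owner, table, index, path,
        # target size) is a placeholder to adapt to your own database.
        sqlplus -s "/ as sysdba" <<'SQL'
        -- 1) Which segments still occupy blocks beyond the size we want to shrink to?
        SELECT DISTINCT owner, segment_name, segment_type
          FROM dba_extents
         WHERE file_id = 7                      -- placeholder file id
           AND block_id + blocks > 1310720;     -- blocks above the 10 GB mark (8 KB blocks)

        -- 2) Rebuild those segments so their extents are re-allocated lower in the file.
        --    (For a partitioned table use ALTER TABLE ... MOVE PARTITION instead.)
        ALTER TABLE app_owner.big_table MOVE;
        ALTER INDEX app_owner.big_table_pk REBUILD;  -- indexes go UNUSABLE after a move

        -- 3) Now the resize should go through.
        ALTER DATABASE DATAFILE '/u01/oradata/ORCL/users01.dbf' RESIZE 10G;
        SQL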

  • How to move huge HD video files between external hard drives and defrag ext drive?

    I have huge high definition video files on a 2TB external hard drive (and its clone).  The external hard drive is maxed out.  I would like to move many of the video files to a new 3TB external hard drive (G-drive, and a clone) and leave a sub-group of video files (1+ TB) on the original external hard drive (and its clone).  
    I am copying files from original external drive ("ext drive A") to new external drive ("ext drive B") via Carbon Copy Cloner (selecting iMovie event by event that I want to transfer). Just a note: I do not know how to partition or make bootable drives, I see suggestions with these steps in them.
    My questions:
    1.)  I assume this transfer of files will create extreme fragmentation on drive A.  Should I reformat/re-initialize ext drive A after moving the files I want?  If so, how best to do this?  Do I use "Erase" within Disk Utility?  Do I need to do anything else before transferring files back onto ext drive A from its clone?
    2.) Do I also need to defrag if I reformat ext drive A? Do I defrag instead of or in addition to reformatting?  If so, how do I do this? I've read on these forums so many warnings and heard too many stories of this going awry.  Which 3rd party software to use?
    Thank you in advance for any suggestions, tips, advice.  This whole process makes me SO nervous.

    Here is a very good writeup on de-fragging in the OS environment that I borrowed
    From Klaus1:
    Defragmentation in OS X:
    http://support.apple.com/kb/HT1375  which states:
    You probably won't need to optimize at all if you use Mac OS X. Here's why:
    Hard disk capacity is generally much greater now than a few years ago. With more free space available, the file system doesn't need to fill up every "nook and cranny." Mac OS Extended formatting (HFS Plus) avoids reusing space from deleted files as much as possible, to avoid prematurely filling small areas of recently-freed space.
    Mac OS X 10.2 and later includes delayed allocation for Mac OS X Extended-formatted volumes. This allows a number of small allocations to be combined into a single large allocation in one area of the disk.
    Fragmentation was often caused by continually appending data to existing files, especially with resource forks. With faster hard drives and better caching, as well as the new application packaging format, many applications simply rewrite the entire file each time. Mac OS X 10.3 onwards can also automatically defragment such slow-growing files. This process is sometimes known as "Hot-File-Adaptive-Clustering."
    Aggressive read-ahead and write-behind caching means that minor fragmentation has less effect on perceived system performance.
    Whilst 'defragging' OS X is rarely necessary, Rod Hagen has produced this excellent analysis of the situation which is worth reading:
    Most users, as long as they leave plenty of free space available, and don't work regularly in situations where very large files are written and rewritten, are unlikely to notice the effects of fragmentation on either their files or on the drive's free space much.
    As the drive fills, the situation becomes progressively more significant, however.
    Some people will tell you that "OSX defrags your files anyway". This is only partly true. It defrags files that are less than 20 MB in size. It doesn't defrag larger files and it doesn't defrag the free space on the drive. In fact the method it uses to defrag the smaller files actually increases the extent of free space fragmentation. Eventually, in fact, once the largest free space fragments are down to less than 20 MB (not uncommon on a drive that has, say, only 10% free space left) it begins to give up trying to defrag altogether. Despite this, the system copes very well without defragging as long as you have plenty of room.
    Again, this doesn't matter much when the drive is half empty or better, but it does when it gets fullish, and it does especially when it gets fullish if you are regularly dealing with large files, like video or serious audio stuff.
    If you look through this discussion board you will see quite a few complaints from people who find that their drive gets "slow". Often you will see them say that they "still have 10 or 20 gigs free" or the like. On modern large drives, by this stage they are usually in fact down to the point where the internal defragmentation routines can no longer operate, where their drives are working like navvies to keep up with finding space for any larger files, together with room for "scratch files", virtual memory, directories etc etc etc. Such users are operating in a zone where they put a lot more stress on their drives as a result, often start complaining of increased "heat", etc etc. Most obviously, though, the computer slows down to a speed not much better than that of molasses. Eventually the directories and other related files may collapse altogether and they find themselves with next to unrecoverable disk problems.
    By this time, of course, defragging itself has already become just about impossible. The amount of work required to shift the data into contiguous blocks is immense, puts additional stress on the drive, takes forever, etc etc. The extent of fragmentation of free space at this stage can be simply staggering, and any large files you subsequently write are likely to be divided into many, many tens of thousands of fragments scattered across the drive. Not only this, but things like the "extents files", which record where all the bits are located, will begin to grow astronomically as a result, putting even more pressure on your already stressed drive, and increasing the risk of major failures.
    Ultimately this adds up to a situation where you can identify maybe three "phases" of mac life when it comes to the need for defragmentation.
    In the "first phase" (with your drive less than half full), it doesn't matter much at all - probably not enough to even make it worth doing.
    In the "second phase" (between, say, 50% free space and 20% free space remaining) it becomes progressively more useful, but, depending on the use you put your computer to, you won't see much difference at the higher levels of free space unless you are a serious video buff who needs to keep their drives operating as efficiently and fast as possible - chances are they will be using fast external drives over FW800 or eSATA to complement their internal HD anyway.
    At the lower end though (when boot drives get down around the 20% mark on, say, a 250 or 500 Gig drive) I certainly begin to see an impact on performance and stability when working with large image files, mapping software, and the like, especially those which rely on the use of their own "scratch" files, and especially in situations where I am using multiple applications simultaneously, if I haven't defragmented the drive for a while. For me, defragmenting (I use iDefrag too - it is the only third party app I trust for this after seeing people with problems using TechToolPro and Drive Genius for such things) gives a substantial performance boost in this sort of situation and improves operational stability. I usually try to get in first these days and defrag more regularly (about once a month) when the drive is down to 30% free space or lower.
    Between 20% and 10% free space is a bit of a "doubtful region". Most people will still be able to defrag successfully in this sort of area, though the time taken and the risks associated increase as the free space declines. My own advice to people in this sort of area is that they start choosing their new, bigger HD, because they obviously are going to need one very soon, and try to "clear the decks" so that they maintain that 20% free buffer until they do. Defragging regularly (perhaps even once a fortnight) will actually benefit them substantially during this "phase", but maybe doing so will lull them into a false sense of security and keep them from seriously recognising that they need to be moving to a bigger HD!
    Once they are down to that last ten per cent of free space, though, they are treading on glass. Free space fragmentation at least will already be a serious issue on their computers but if they try to defrag with a utility without first making substantially more space available then they may find it runs into problems or is so slow that they give up half way through and do the damage themselves, especially if they are using one of the less "forgiving" utilities!
    In this case I think the best way to proceed is to clone the internal drive to a larger external with SuperDuper, replace the internal drive with a larger one and then clone back to it. No-one down to the last ten percent of their drive really has enough room to move. Defragging it will certainly speed it up, and may even save them from major problems briefly, but we all know that before too long they are going to be in the same situation again. Better to deal with the matter properly and replace the drive with something more akin to their real needs once this point is reached. Heck, big HDs are as cheap as chips these days! It is mad to struggle on with sluggish performance, instability, and the possible risk of losing the lot, in such a situation.

  • HUGE amount of yellow "Other" space being taken up on iPhone 4S. Please help!

    I looked around the forum for a solution to the huge amount of yellow "other" space that is being eaten up on my iPhone 4S. Even after a restore, the yellow "other" space is taking up 7.1 GB of space on my phone, up from 6.5, which is what it was before the restore. What is going on? Usually I just erase some texts (I send a lot of pics to my hubby and mom), erase some voicemails, and it drops down to around 1 GB. But I've noticed that since iOS 6, that hasn't worked. 7.1 GB is ridiculous. Is there ANY way at all to get rid of it without having to erase absolutely everything about my phone and start over from scratch, including downloading 40 apps all over again?
    This seems to be a problem a lot of people have been having over the past few years, why has it never been addressed? There needs to be some sort of utility or something that cleans things up if they have problems with junk not getting deleted or corrupted files sticking around.

    Ok, so I appreciate the suggestion that it may be a corrupted file. But have you noticed that I am not the only person having this problem? Actually look. It is a valid problem, and I'm not the only one having it. So asking for Apple to fix it really isn't wrong.
    http://forums.macrumors.com/showthread.php?t=567122
    http://apple.stackexchange.com/questions/27483/why-does-my-iphone-use-so-much-other-space-after-upgrade-to-ios-5
    http://www.ipodrepublic.com/iphone/fixing-issue-other-files-iphone-memory/2010/03/31/
    Obviously the problem is not new. And it isn't huge, but it is inconvenient and annoying. How are files getting corrupted so frequently that they are taking up 8 GB of space? There has to be a cause. And before iOS 6, all I had to do was delete text messages and that would immediately fix it. Why has that changed?
    Also, it's a crime to say the word Google? Sorry it offends you, that's what I used to search.

  • How can we transfer huge amount of data from database server to xml format

    Hi guru,
    How can we transfer a huge amount of data from the database server to XML format?
    Regards,
    Subhasis

    Create the ABAP coding
    First we create the internal table TYPES and DATA definitions that we want to fill with the XML data. I have declared the table "it_airplus" to match the structure from the XML file definition, for a better overview, because it is a long XML definition (see the XSD file in the sample ZIP container from airplus.com).
    *the declaration
    * (t_ap_head and t_ap_details are declared analogously from the XSD; omitted here)
    TYPES: BEGIN OF t_sum_vat_sum,
             a_rate(5),
             net_value(15),
             vat_value(15),
           END OF t_sum_vat_sum.
    TYPES: BEGIN OF t_sum_total_sale,
             a_currency(3),
             net_total(15),
             vat_total(15),
             vat_sum TYPE REF TO t_sum_vat_sum,
           END OF t_sum_total_sale.
    TYPES: BEGIN OF t_sum_total_bill,
             net_total(15),
             vat_total(15),
             vat_sum TYPE t_sum_vat_sum,
             add_ins_val(15),
             total_bill_amount(15),
           END OF t_sum_total_bill.
    TYPES: BEGIN OF t_ap_summary,
             a_num_inv_det(5),
             total_sale_values TYPE t_sum_total_sale,
             total_bill_values TYPE t_sum_total_bill,
           END OF t_ap_summary.
    TYPES: BEGIN OF t_ap,
             head    TYPE t_ap_head,
             details TYPE t_ap_details,
             summary TYPE t_ap_summary,
           END OF t_ap.
    DATA: it_airplus TYPE STANDARD TABLE OF t_ap.
    *call the transformation
    * (l_xml_x1 holds the XML document; its declaration is omitted here)
    CALL TRANSFORMATION zfi_airplus
         SOURCE XML l_xml_x1
         RESULT xml_output = it_airplus.
    See the complete report: Read data from XML file via XSLT program
    Create XSLT program
    There are two options to create an XSLT program:
    Tcode: SE80 -> create/choose packet -> right click on it | Create -> Others -> XSL Transformation
    Tcode: XSLT_TOOL
    For a quick overview you can look at the SXSLTDEMO* programs.
    In this example we already use the three XSLT options explained later.
    As you can see, we define XSL and ASX (ABAP) tags to handle the ABAP and XML variables/tags. After "

  • I AM STUCK WITH MOZILLA FIREFOX / NOT SHOWN ON MY LAPTOP SCREEN / USING HUGE AMOUNT OF RAM IN GENERAL - CAN YOU HELP????

    Hi,
    I have copied these problems from your site, as they explain it better than I could; my problem is the same as theirs. Then there is my part, explaining what is really happening to my laptop.
    ''Mozilla/Firefox 4 will not SHOW on my computer SCREEN.
    Firefox won't load onto my computer. I have Windows 7, and each time I try to run the program to install it, it tells me I have another version still waiting to be installed that requires my computer to reboot. When I check "yes" to reboot, my computer reboots but still no Firefox. When I try to uninstall the old Mozilla program from my Control Panel, there are no Mozilla files or programs listed in my list of programs, but when I go directly to the Windows folder on the C: drive, there's a Mozilla program folder there. When I go into that folder to check, I attempt to open the uninstall program. I get an error message that says something to the effect of: "the version of Mozilla on this computer is not compatible with my OS." Something about it being an "86bit" program. My Firefox 3.6 worked fine 2 days ago. What's wrong??? ''
    '''Firefox 4 using huge amounts of RAM on sites with Java scripts
    I have been observing Firefox using huge amount of RAM when I am on sites that use Java Scripts to rotate images. I have tried a couple of different sites and have monitored the RAM usage. With Java script enabled Firefox 4 continues to grow its RAM usage by about 10MB a minute. I have had the usage hit as high as 1.5GB. As a comparison I have monitored the same sites in Internet explorer and have not seen the same issue. - Just to eliminate a site issue.
    Turning off Java Script solves the issue and eventually frees the RAM.
    I am using XP Pro, 3 GB RAM. '''''
    My experience with Mozilla Firefox:
    Here is what really happens when I install Mozilla Firefox on my laptop: you can see the file is there, but when I click the program to open it, it does not show on the screen. When I check with WTM (Windows Task Manager) it shows that it is running, and you can only watch the laptop go very slow until I have to reboot it. I cannot install any other Mozilla version :( because of the same problem and all the above.
    I really need Mozilla to work as soon as possible due to important deadlines I have pending, and I can only use Chrome, but it is not my ideal program. Can you help me, please?????
    Thank you
    Vitor Mendes

    If it opens in safe mode, you may have a problematic add-on. Try the procedure in the [[Troubleshooting extensions and themes]] article.

  • Insane XML Import, Huge Project, Duplicate file names work around...

    I planned on kicking off my journey of attempting to edit a massive multi-year documentary on FCPX with a nice introduction to the blog I'm going to keep about the experience, but I've run into a bit of a roadblock, or maybe a major speed bump at least, before even getting to that point. I wanted to share what is working as a workaround for me, and you guys can tell me how I'm doing it wrong.
    Ok, I will try to explain this as succinctly as possible. I'll write in a somewhat stream-of-consciousness way just to try and get through it quicker... Basically, after discovering the workaround below, I am now utterly confused about how FCPX handles the relationship between its own database of where media is stored, and the actual media itself. I have plenty of experience on FCPX now, probably done 30-40 pro commercial jobs on it over the last year since XML became doable, as I'm also a Resolve Colorist and all the FCPX projects were hardcore coloring product spots. For commercial work, I never needed to worry about splitting footage up over multiple Events. Everything, all in one, FCPX handled it no problem. (Well, the occasional beach ball, but that seems to be a thing with FCPX in general.)
    This will be my 10th feature documentary as an Editor. Every one before it was either on Avid's many flavors over the last 12 years or FCP Studio. When this new film came along, I made the decision a few months ago to use FCPX for a few reasons, but mostly because I'm insane and I like to try to mix it up for myself in a career that can get stale quick if you aren't willing to be that way. The film was shot over 2+ years, every shoot was multicam 5D (yes I know, looks great, but please kill me), I haven't done the math on length, but there are over 10,000 clips of video (this is actually medium in size compared to what I've dealt with before). It's 5D, so there's external audio for everything. FCPX's syncing is great, but I've learned that there's an unstated window of heads and tails that clips must fall within to sync properly with the nearby clips; if they are too far apart FCPX gives up. One shoot day could have 3 cams, 50 clips each, and 2 audio files to sync to. FCPX simply cannot handle this, so off to Plural Eyes they went, no problems.
    Ok, all this is relevant eventually, I swear! Again, in the past, all in one event, no problem. I tried for fun to bring all media into one Event on this film. It worked, but there is a 10+ second spinning beach ball for every single move you make, so that's no good. Ok, I'll separate the Events out to, let's say, each shoot month. Well that's dumb, in verite documentary, any shot could be the first, any shot could be the last, you need command over all searchable footage at all times. Shift selecting all events to search *****, and it actually takes longer for FCP to reload each event each time than it does to just have the one massive one. So no go there. Next harebrained idea... What if I make a new Event that is just Compound Clips of all the other Events' contents and do more with Markers and Favorites in logging than I was planning, to parse it all out? That way I'm working with, and FCPX is dealing with, 50-60 clips instead of 10,000+. Quick test: Cmd-A, Opt-G, boom, boom, boom, move all to a dedicated Event, hide the huge Event, BEHOLD, that works! FCPX chokes a little bit on the insane length of some of the clips, but searching and general performance are back on par!
    So you're saying to yourself, "Ok *********, sounds like you figured it out, what's the problem." Well to you I say, "Not so fast!" Remember, that was just a quick test using the media I had imported into the massive 10,000+ clip Event. To do this project properly, I am having to import Multicam synced XMLs from Plural Eyes. And this is where it all starts to fall apart. A little foreshadowing for your eager eyes: 10,000+ files all shot with multiple 5Ds over the course of years. What does that mean? Many, many duplicate file names!
    FCPX, as we all know, irritatingly imports XMLs as new Events, not into existing ones. This obviously takes a lot of burden off media management, because with a new Event comes a new database referencing its own version of the raw media. All well and good, and I'm betting it's starting to click for some of you advanced users where I'm finally going with this. So I have 50 or so XMLs to bring in, all done, no problem. Now I want to replicate that singular Event like I did with the Compound Clip test and have it all in one place to be my master as extensive logging begins, for easy searching once editing begins. Highlight the Events, click Merge Events. NOPE. I get a new "Kill Yourself Now" error (a term I coined for Out of Memory and General Error messages in FCP Legacy, meaning there ain't much you can do about it): "Two or more files have the same name. Change the names and try again, because I don't know what the **** to do with them." Ok, I made up that last part, but that's basically what it's saying. Just to take the variables out of the equation, this happens every which way you could try to get the clips together. Merge Events, dragging events on top of each other, dragging just the Multicam clip alone, nothing gets past that message. What's worse is that while Batch Renaming seems like a solution, the renames do not populate inside the created clips and there is no way to Batch Rename those. Renaming everything at the Finder level isn't so great because then I'd have to resync, and there's an offline/online thing going on here where the film has to be reconformed eventually.
    Basically, I've found that FCPX handles media management in completely different ways depending on whether you are importing into one Event yourself or doing essentially what is a new import with FCPX moving or merging Events. If you bring in all the media to one Event, on a macro level FCPX goes through file by file making aliases referencing the master file. If it hits a duplicate, it makes a parenthesis counter, and keeps going. And with the genius of FCPX metadata, that file name doesn't even matter, you can change it at the Finder level and FCPX will keep the link intact. BUT for some reason if you try to do this outside the realm of a single Event and combine files after the fact a different process takes over in creating this database system and can't handle the duplicates. I can't totally figure the reason other than it probably is scared to change the originally referenced alias to something else (which it shouldn't be scared of since Merge Events deletes the original anyway).
    This is where it gets INSANE and where I lose all understanding of what is going on, and believe me it was a delicate understanding to begin with. It comes in to play with the work around I figured out. I make the master Event with the 10,000+ clips. Then I import all the XMLs as dedicated Events. Now, I then drag the Multicam clips into the master Event, it WORKS! just takes it, no "Kill Yourself Now" error. Stranger still, now with the switched referenced Event, it even takes into account which aliased duplicate file name it's referencing in the Original Media folder. Somehow, it's taking into account the original file path and saying "Ok, I see 5 instances of MVI_5834.mov. Based on the original file path or maybe creation date or god knows what, it must be MVI_5834 (fcp3).mov." It connects perfectly. I can even remove the old XML imported Event with no problem. Crazier yet, I can now move those again to the dedicated Event I wanted originally that only contains those Multicam or Compound Clips.
    So instead of going straight from A to C, going from A to B to C works, even though that actually seems way more complicated. Why can't FCPX handle Merge Events and dragging clips the same way it handles media imported into a single Event? And weirder still, why can't FCPX handle the (fcp1,2,3...) appending in the same scenario? But if the appended links are already there, no problem. And for the love of god, it'd be nice to import XMLs into existing Events and make the correct referencing happen right from the get-go, which is really the source of all the above headache. I'd have no problem helping FCPX with a little manual pointing in the right direction, just like any other NLE needs.
    Ok, having said all of that crap above, my question is, have I missed something completely simple I should have done instead? Now that I have everything in place how I originally envisioned, I think I will still play around a little bit more to make sure FCPX is really going to be able to handle this project. I'm at a stage right now where going another direction is still an option, although the daredevil in me wants to make this work. Media management aside, once you start editing on an FCPX timeline, it's hard to go back to anything else. Apple is going to have to figure out some way to not need access to everything at all times to work fluidly, or big projects like this are never going to be practical in FCPX.
    Sorry for the long confusing post....

    I'm having the exact same problem, but I know nothing of ruby scripts. I've exhausted my resources with tech support, and after literally hours of work with them it seems I now am faced with either re-rating each individual song, or pointing iTunes to each song that it can't locate. Is yours a solution that could help me? How can I find out more about Ruby Scripts? Thanks for your help, hellolylo (or anyone else who might also be able to help...)
    Kenn

  • How do i transfer a compilation photo file from iPhoto consisting of canon(img) and lumix(p..) in the order i compiled them to a usb stick as at the moment it is putting all the lumix files after the canon files?

    I have a macbook pro 15 inch retina 16 gb 1600mhz 2.7intel i7 osx 10.8.5
    I have just put together an iPhoto album of wedding photos consisting of a compilation of images from my Canon (IMG files) and Lumix (P.. files). I want to know how to transfer them to a USB stick in the order I compiled them, as at the moment it is putting all the Lumix files after the Canon files, which is not how I want to present them to the newlyweds.

    Files are ordered by the file-viewer being used.  What viewer are you using, and what data field are you sorting by?
    In Finder, as in most applications, you can select the data field to sort by clicking a column header.  Clicking the selected header will reverse the sort.
    This is the Aperture forum, but what applies to Aperture is likely also true for iPhoto:  if you want to be able to sort your files in a specific order in a file-viewer, you should give them names that allow for sorting in that order when the name field is selected as the data field to sort by.  In Aperture, one creates new files by exporting, and one uses a File Naming Preset to name the files.  The File Naming Preset can start with an ordinal count or a sequence number.  Either of which will allow you to retain your order when viewing your files in a file viewer.
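    As a concrete example of that idea, a minimal shell sketch, assuming the photos have been exported and a text file (order.txt, a made-up name) lists them one per line in the compiled order; the zero-padded prefix makes any viewer that sorts by name show them in that order. The USB volume path is a placeholder:

        #!/bin/bash
        # Sketch: copy exported photos to the stick with a zero-padded counter prefix
        # so they sort in the compiled order. order.txt and the volume path are
        # placeholders for this example.
        n=1
        while IFS= read -r f; do
          printf -v prefix '%04d' "$n"
          cp "$f" "/Volumes/USBSTICK/${prefix}_$(basename "$f")"
          n=$((n + 1))
        done < order.txt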

  • Signature Scribble field not retaining image file after saving

    I have 2 issues with the dynamic PDF file I sent out:
         1) The add/remove controls are adding a random number of extra blank rows upon saving the PDF.
         2) The image files selected for the Signature Scribble field are not retained after saving. It also doesn't give an opportunity to actually draw in a signature.
    I even tried to use the Subform Instance Controls Insert/Remove/Move in my table, but I keep receiving the error "You have reached the maximum number of items allowed"
    I would appreciate any help you can give me! I have been farting around with this for too long.
    Thanks!
    Rhonda

    Thanks Pat.  Tried each, restarted computer each time; tried the "Accept" routine several days ago - it worked briefly and then stopped.  Uninstalled Reader and Flash as well.  Strange thing with Flash, one of my online broker programs provides streaming of CNBC.  When I clicked on the CNBC option to initiate CNBC, I got a message saying that I needed to load Flash.  I did, and it loaded again with no mention of Flash having already been installed.
    Using Internet Explorer works, as well as, clicking on the .pdf file within Windows Explorer (WE).  Expanding the window in WE makes it large enough to read.  I would still like to find a way to open the files as they are designed to open.
    I have done all of the Adobe updates as they have been presented.  Microsoft updates load automatically, and restart my computer in the process.  I do not look to see what the updates have been.  I figure if there was not a need, Microsoft would not make the updates.
    [private data removed]

  • Lost CR files after exporting to jpg

    I seem to have lost many original CR files after import > developing > export as jpg.  I had made selects to develop and then output images as lower rez jpgs.  When I went back to the original folders (on an external hd) those CR select files were missing / a gap in the photo sequence now exists and I cannot locate the originals regardless of searches.  How is this possible?

    Lightroom does not delete your photos unless you specifically tell Lightroom to do so, and this wouldn't happen in an Export operation anyway.
    So I don't know how this is possible, but I'm pretty sure it can't be blamed on Lightroom
    You need to:
    1) Check the Recycle Bin; if that doesn't work,
    2) restore the photos from backups (and if you don't have backups, why not?); if that doesn't work,
    3) you can try to download something called an undelete utility, install it, and see if it can find these photos (and if you are going to need this step, perform it immediately; do not do anything else on your computer).

  • How can I use Auto Import to name files after the folder they are dropped into?

    I'm camera tethered on a new Macbook Pro Retina, Lightroom 5, all latest software and OS. Using  EOS Utility to connect and drop into a DROP folder. When Lightroom Autoimports, I would like it to rename the file after the destination folder. Capture One has this feature and I could have sworn I saw a drop down menu on one of the naming tokens in LR5 about a month ago.  ANY IDEAS?

    The closest you can come is to create a File Naming Template that includes "Custom Text" as a part of the file name.  Then, you will have to change the Auto Import Settings to use this Template when auto-importing and manually type that folder name in every time you change folders.  There is no "Folder" token of which I am aware.
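    If the rename really has to happen automatically, one hedged workaround outside Lightroom is to let the camera software drop into a staging folder and have a small script prefix the files with the session/folder name before handing them to the folder Auto Import watches. Everything below (paths, the session name, the polling interval) is a placeholder sketch, not a Lightroom feature:

        #!/bin/bash
        # Sketch: prefix tethered files with a session name, then move them into the
        # folder that Lightroom Auto Import is pointed at. All paths are placeholders.
        # (A real version should make sure each file has finished copying before moving it.)
        STAGING="$HOME/Pictures/DROP_staging"   # EOS Utility drops files here
        WATCHED="$HOME/Pictures/DROP"           # Lightroom Auto Import watches this
        SESSION="session01"                     # stands in for the destination folder name

        while true; do
          for f in "$STAGING"/*; do
            [ -e "$f" ] || continue
            mv "$f" "$WATCHED/${SESSION}_$(basename "$f")"
          done
          sleep 2
        done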

  • Working with naming redundancy of MTS files

    I am new to working with MTS files in Prelude and eventually Premiere and have a few questions about the proper workflow.  I would be very grateful for any wisdom.
    first off background on the project:
    - We have .mts files from several different consumer Vixia cameras
    - The cameras used the same naming convention for each card (starting with 000.mts and working its way up).  This means we have several different .mts files with the same name.
    - The file structures were copied over directly from the cards, and are therefore unchanged.  However, I still can't work with the metadata for some of the clips... and therefore suspect I need to transcode some of them.
    My Questions:
    1.  Is it a problem that there is a redundancy in the naming of the .mts files, and that I will have several clips in the project with the same name?  All the clips are organized into a folder structure based on date and roll, so I'm not worried about the name redundancy when I'm actually editing.  I am worried that if the media gets disconnected, Adobe won't know which clip to look for, and may try to reconnect the wrong clip with the same name.  Does Adobe reconnect based only on file name, or does it also look at the path?
    I'm scared that if I change the file name, it will be hard to make sure the file name is changed across the folder structure, and therefore the native structure could be corrupted.
    What is a best-practice workflow for working with .mts files with the same name in Adobe?
    2.  A lot of the footage was shot by non-professionals, who didn't know to break up clips.  Therefore, we have a lot of "spanned" .mts files.  Is there any solution to spanned files?  I've read in a previous forum that there is not, and that the only way to avoid the issue is to transcode.  Is that information still true?
    3. If I do have to transcode, what is the recommended codec?  I usually go with ProRes 4444, but that codec's file size is way too big for the amount of footage we have to go through.  Any suggestions?
    I would be so grateful if anyone can help on these questions!!

    Working with these AVCHD files just involves a bit of care and proper project management,
    e.g. well-named source folders (on local hard drives) and well-named "rushes" folders in your Premiere project.
    Take care when importing / ingesting not to overwrite the files at any point.
    No need to rename or transcode source files.
    Relocating / relinking is not an issue (especially if you don't lose the files in the first instance).
    If the workflow involves additional pipelines, e.g. delivery via EDL to a color suite,
    ensure timecode and source files are clearly set up in a manageable way.
    BTW - if one is able to shoot with control of timecode, it's helpful.  e.g. Card 1 = 01:00:00:00, Card 2 = 02:00:00:00 etc...
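    For peace of mind about question 1, a small shell sketch (the footage path is a placeholder) that reports any .mts basenames occurring more than once across the copied card folders, so you know exactly which clips share a name before any relinking ever comes up:

        #!/bin/bash
        # Sketch: list duplicate .mts basenames across the copied card structure.
        # "/media/footage" is a placeholder for wherever the card copies live.
        find /media/footage -type f -iname '*.mts' -print \
          | awk -F/ '{ c[tolower($NF)]++ } END { for (n in c) if (c[n] > 1) print c[n], n }' \
          | sort -rn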
