Best strategy for dealing with a mixed library and missing photos

Looking at the package contents of my iPhoto Library, it appears that I have a "mixed" library (i.e. part managed and part referenced). I'm not sure how this happened--I originally created this library about three years ago when I first bought my Mac with iLife '06, then later upgraded to iLife '08 which I'm currently still using. Maybe the default setting was to have a referenced library in iPhoto '06? Don't know. I don't recall ever changing this setting.
Anyway, I was starting to manually recover some missing photos but it just occurred to me that iPhoto is simply updating the aliases to point to the photos I'm telling it to use, and not actually copying the photos into the library (I'm guessing that only happens when you import a photo). This is not what I want. I want my library to be completely managed. I understand that, as my current Preferences settings dictate, any new photos I import now will be copied into the iPhoto Library. But for the existing photos that I am able to recover, I don't want to have to keep the originals outside of iPhoto.
I read in another topic about an application called AliasHerder and I'm thinking of trying that. The problem is, I no longer have original copies of all of the missing photos. I can recreate the folder structure for some of them but not all. I'm not sure what this will do to my iPhoto Library (I have emailed the vendor for clarification). I'm wondering if I'll still end up with a mixed library containing aliases that point to non-existent files. Perhaps someone who has some experience with the tool could enlighten me.
Based on suggestions given to me in previous posts I made, I tried both rebuilding my iPhoto Library and using the iPhoto Library Manager tool. Neither of these produced the results I had hoped for. Am I better off just starting off from scratch and creating a whole new library? And if so, what's the best way to get all of the existing photos from my current library to the new one? Do I export them out and then import them into the new library? Is there any way to salvage events, albums, etc. from my existing library or will I have to recreate these in the new library?
Thanks in advance!

First check iPhoto's Advanced preference pane to make sure you're running a "managed" library.
Since you don't have the "source" photos available to relink to, your best bet, IMO, would be to continue using your current library and delete the missing photos from it whenever you come across them in day-to-day use. If you click on the thumbnail of a missing photo, drag it to the iPhoto trash and empty it, that will delete it from the library. This way you will retain all of your organizational efforts, i.e. albums, books, keywords, etc.
When you rebuilt with iPhoto Library Manager, what did the resulting library contain? If it contained your photos minus the missing ones, you could use it. You would retain your albums, keywords, faces, places and other metadata. You would lose any keepsakes, i.e. books, slideshows, cards, etc.
Or, you could start over as follows:
Creating a new library while preserving the Events from the original library.
1 - Move the existing library folder to the desktop
2 - Open the library package like this.
3 - Launch iPhoto and, when asked, select the option to create a new library.
4 - Drag the Originals folder from the iPhoto Library on the desktop into the open iPhoto window.
You will end up with all your photos (no missing photos) in the same events as in the original library. But there will be no albums, metadata or keepsakes. It's for this reason I made the original suggestion of continuing with your current library above.
OT
TIP: For insurance against the iPhoto database corruption that many users have experienced, I recommend making a backup copy of the Library6.iPhoto database file (iPhoto.Library for iPhoto 5 and earlier versions) and keeping it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean back up after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That ensures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
I've created an Automator workflow application (requires Tiger or later), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. There are versions that are compatible with iPhoto 5, 6, 7 and 8 libraries and Tiger and Leopard. Just put the application in the Dock and click on it whenever you want to back up the dB file. iPhoto does not have to be closed to run the application, just idle. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.
NOTE: The new rebuild option in iPhoto 09 (v. 8.0.2), "Rebuild the iPhoto Library Database from automatic backup", makes this tip obsolete.

Similar Messages

  • What's best strategy for dealing with 40+ hours of footage

    We have been editing a documentary with 45+ hours of footage and presently have captured roughly 230 gb. Needless to say it's a lot of files. What's the best strategy for dealing with so much captured footage? It's almost impossible to remember it all, and labeling it while logging it seems inadequate, as it is difficult to actually read comments in dozens and dozens of folders.
    Just looking for suggestions on how to deal with this problem for this and future projects.
    G5 Dual Core 2.3   Mac OS X (10.4.6)   2.5 g ram, 2 internal sata 2 250gb

    Ditto, ditto, ditto on all of the previous posts. I've done four long form documentaries.
    First I listen to all the sound bites and digitize only the ones that I think I will need. I will take in much more than I use, but I like to transcribe bites from the non-linear timeline. It's easier for me.
    I had so many interviews in the last doc that I gave each interviewee a bin. You must decide how you want to organize the sound bites. Do you want a bin for each interviewee, or do you want to do it by subject? That will depend on your documentary and subject matter.
    I then have b-roll bins. Sometimes I base them on location and sometimes I base them on subject matter. This last time I based them on location because I would have a good idea of what was in each bin by remembering where and when it was shot.
    Perhaps you weren't at the shoot and don't have this advantage. It's crucial that you organize your b-roll bins in a way that makes sense to you.
    I then have music bins and bins for my voice over.
    Many folks recommend that you work in small sequences and nest. This is a good idea for long form stuff. That way you don't get lost in the timeline.
    I also make a "used" bin. Once I've used a shot I pull it out of the bin and put it "away." That keeps me from repeatedly looking at footage that I've already used.
    The previous posts are right. If you've digitized 45 hours of footage you've put in too much. It's time to start deleting some media. Remember that when you hit the edit suite, you should be on the downhill slide. You should have a script and a clear idea of where you're going.
    I don't have enough fingers to count the number of times that I've had producers walk into my edit suite with a bunch of raw tape and tell me that they "want to make something cool." They generally have no idea where they're going and end up wondering why the process is so hard.
    Refine your story and base your clip selections on that story.
    Good luck
    Dual 2 GHz Power Mac G5   Mac OS X (10.4.8)  

  • Question: Best Strategy for Dealing with Large Files 1GB

    Hi Everyone,
    I have to build a UCM system for large files > 1GB.
    What is the best way to upload them (applet, check-in form, WebDAV)?
    Also, what is the best way to download them (applet, web form, WebDAV)?
    Any tips will be greatly appreciated
    Tal.

    Not sure what the official best practice is, but I prefer to get the file onto the server's file system first (file copy) and check it in from that path. This would require a customization / calling a custom service.
    Boris
    Edited by: user8760096 on Sep 3, 2009 4:01 AM

  • Best practice for dealing with Recordsets

    Hi all,
    I'm wondering what is best practice for dealing with data retrieved via JDBC as Recordsets without involving third-party products such as Hibernate etc. I've been told NOT to use RecordSets throughout my applications since they hold resources and are expensive. I'm wondering which collection type is best to convert RecordSets into. The apps I'm building are web-based, using JSPs as the presentation layer, beans and servlets.
    Many thanks
    Erik

    There is no requirement that DAO's have a direct mapping to Database Tables. One of the advantages of the DAO pattern is that the business layer isn't directly aware of the persistence layer. If the joined data is used in the business code as if it were an unnormalized table, then you might want to provide a DAO for the joined data. If the joined data provides a subsiduray object within some particular object, you might add the access method to the DAO for the outer object.
    eg:
    In a user permissioning system where:
    1 user has many userRoles
    1 role has many userRoles
    1 role has many rolePermissions
    1 permission has many rolePermissions
    ie. there is a many to many relationship between users and roles, and between roles and permissions.
    The administrator needs to be able to add and delete permissions for roles and roles for users, so the CRUD for the rolePermissions table is probably most useful in the RoleDAO, and the CRUD for the userRoles table in the UserDAO. DAOs can also call each other.
    During operation the system needs to be able to get all permissions for a user at login, so the UserDAO should provide a readPermissions method that does a rather complex join across the user, userRole, rolePermission and permission tables.
    Note that if the system I just described were done with LDAP, a hierarchical database or an object database, the userRoles and rolePermissions tables wouldn't even exist; these are RDBMS artifacts, since relational databases don't understand many-to-many relationships. This is a good reason to avoid providing DAOs that give access to those tables.
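    As a rough Java sketch of the shape being described (all table, column and class names here are hypothetical illustrations, not from the original post), a UserDAO could expose the four-table join as a single read method, so the business layer never sees the two join tables:

    ```java
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical DAO for the user/role/permission schema described above.
    // The business layer asks only for "permissions of a user"; the join
    // tables (userRoles, rolePermissions) stay hidden inside the DAO.
    class UserDAO {

        // One query spanning all four tables; the many-to-many artifacts
        // never surface as objects of their own.
        static final String READ_PERMISSIONS_SQL =
            "SELECT DISTINCT p.name "
          + "FROM users u "
          + "JOIN userRoles ur ON ur.userId = u.id "
          + "JOIN rolePermissions rp ON rp.roleId = ur.roleId "
          + "JOIN permissions p ON p.id = rp.permissionId "
          + "WHERE u.id = ?";

        private final Connection connection;

        UserDAO(Connection connection) {
            this.connection = connection;
        }

        // Returns the permission names for one user, e.g. at login time.
        List<String> readPermissions(long userId) throws SQLException {
            List<String> permissions = new ArrayList<>();
            try (PreparedStatement stmt =
                     connection.prepareStatement(READ_PERMISSIONS_SQL)) {
                stmt.setLong(1, userId);
                try (ResultSet rs = stmt.executeQuery()) {
                    while (rs.next()) {
                        permissions.add(rs.getString("name"));
                    }
                }
            }
            return permissions;
        }
    }
    ```

    The point of the sketch is the boundary, not the SQL details: callers get back a plain List of permission names and never touch a ResultSet or the join tables.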

  • What is best practice for dealing with Engineering Spare Parts?

    Hello All,
    I am after some advice regarding the process for handling engineering spare parts in PM. (We run ECC 5)
    Our current process is as follows:
    All materials are set up as HIBE's
    Each material is batch managed
    The Batch field is used for the Bin location
    We are now looking to roll out PM to a site that has in excess of 50,000 spare parts and want to make sure we use best practice for handling them. We are now considering using a basic WM setup to handle the movement of parts.
    Please can you provide me with some feedback on what you feel the best practice is for dealing with these parts?
    We are looking to set up a solution that will allow us to generate pick lists etc. and implement a scanning solution to move parts in and out of stores.
    Regards
    Chris

    Hi,
    I hope all the 50,000 spare parts are maintained as stock items.
    1. Based on the usage of those spare parts, try to define safety stock & define MRP as "Reorder Point Planning". By this, you can avoid petty cash purchases.
    2. By keeping the spare parts (at least critical components) in stock, planned maintenance as well as unplanned maintenance will not get delayed.
    3. By doing GI based on reservation, qty can be tracked against the order & equipment.
    As this question is MM & WM related, the MM & WM folks can give better clarity on this.
    Regards,
    Maheswaran.

  • Question: Best practices for dealing with multiple AM configurations

    Hello all,
    I have a project using ADF Business Components and ADF Faces. I would like to set up multiple configurations for the Application Modules to support the following scenarios:
    1). Local testing and debugging - using a connection defined in JDeveloper and AM Pooling turned off.
    2). Testing and debugging on an application server - using a JDBC Data Source and AM Pooling turned off
    3). Production deployment - using a JDBC Data Source and AM Pooling turned on.
    It is no problem to create multiple AM configurations to reflect this scenario. In order for the web part of the application to use the correct configurations, the DataBindings.cpx file must specify the correct ones. I was thinking of having 3 different DataBindings.cpx files and changing the CpxFileName context-param in the web.xml file as needed.
    My questions:
    1). Does this make sense as an approach? It should be better than having to change a single AM configuration every time I deploy or test. Is there any easy way to keep multiple DataBindings.cpx files in sync, given that we may add new pages from time to time? Alternatively, can we do some type of "include" processing to include just the dataControlUsages section into a common DataBindings.cpx file?
    2). How would you manage the build-and-deploy process? For the most part, in JDev we would be using configuration #1. The only time to switch to configuration #2 or #3 would be to build an EAR file for deployment. Is this something that it would make sense to accomplish with ANT? I'm not an ANT expert at all. The ANT script would have "build-test-ear" and "build-prod-ear" targets which would swap in the correct web.xml file, recompile everything, build the EAR, then put the development web.xml file back. I'm relatively sure this is possible... comments?
    3). Is there some other recommended approach?
    I appreciate any insights from experience, or even just ideas or thoughts that I can test out.
    Best regards,
    John

    Hi K,
    Sorry for the long long delay in responding I've been traveling - and thanks for the e-mail tickler too...
    To answer your question in short, I do think that ANT is the right way to go; there is an extra ANT task called XMLTask that I was able to download and play with, and it seems it would make this manipulation of the cpx file (or the xcfg file, for that matter) pretty straightforward. I don't have any code to post; it's just in the conceptual stage for me right now. I didn't see anything magical in JDev 11 TP3 that solved this problem for me either.
    Having said all of that, it's more complicated than it might appear. In addition to the DataBindings.cpx file (stores, among other things, which AM configuration to use for each data control), it's certainly possible to programmatically access an AM (specifying the configuration either directly in the code or via a properties file/etc). I'm not sure what the most common use case for AM configurations is, but in my case, I have a Test configuration and a Prod configuration. The Test config, among other things, disables AM pooling. When I am developing/testing, I always use the Test config; in Production, I always use the Prod config. Perhaps the best way for me to do this would be to have an "Active" config and use ANT tasks to copy either Test or Prod to "Active." However, our Subversion repository is going to have a few complaints about this.
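    The copy-to-"Active" idea can be sketched with only core Ant tasks (every target and file name below is a hypothetical placeholder, and "compile"/"ear" stand in for whatever build targets the project already has):

    ```xml
    <!-- Hypothetical Ant targets: copy the desired config over the "active"
         DataBindings.cpx before building. Paths are assumptions. -->
    <target name="use-test-config">
        <copy file="config/DataBindings.test.cpx"
              tofile="src/DataBindings.cpx" overwrite="true"/>
    </target>

    <target name="use-prod-config">
        <copy file="config/DataBindings.prod.cpx"
              tofile="src/DataBindings.cpx" overwrite="true"/>
    </target>

    <!-- "compile" and "ear" are placeholders for the project's real targets -->
    <target name="build-test-ear" depends="use-test-config, compile, ear"/>
    <target name="build-prod-ear" depends="use-prod-config, compile, ear"/>
    ```

    The same copy-over-the-active-file approach works for swapping web.xml; editing individual elements in place instead of replacing whole files is where an extension like XMLTask comes in.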
    John

  • Best practices for dealing with Exceptions on storage members

    We recently encountered an issue where one of our DistributedCaches was terminating itself and restarting due to a RuntimeException being thrown from our code (see below). As usual, the issue was in our own code and we have updated it to not throw a RuntimeException under any circumstances.
    I would like to know if there are any best practices for Exception handling, other than catching Exceptions and logging them. Should we always trap Exceptions and ensure that they do not bubble back up to code that is running from the Coherence jar? Is there a way to configure Coherence so that our DistributedCaches do not terminate even when custom Filters and such throw RuntimeExceptions?
    thanks, Aidan
    Exception below:
    2010-02-09 12:40:39.222/88477.977 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=48): An exception (java.lang.RuntimeException) occurred reading Message AggregateFilterRequest Type=31 for Service=DistributedCache{Name=StyleCache, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=1021, BackupCount=1, AssignedPartitions=201, BackupPartitions=204}
    2010-02-09 12:40:39.222/88477.977 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=48): Terminating DistributedCache due to unhandled exception: java.lang.RuntimeException

    Bob - Here is the full stacktrace:
    2010-02-09 13:04:22.653/90182.274 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=47): An exception (java.lang.RuntimeException) occurred reading Message AggregateFilterRequest Type=31 for Service=DistributedCache{Name=StyleCache, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=1021, BackupCount=1, AssignedPartitions=205, BackupPartitions=204}
    2010-02-09 13:04:22.653/90182.274 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=47): Terminating DistributedCache due to unhandled exception: java.lang.RuntimeException
    2010-02-09 13:04:22.653/90182.274 Oracle Coherence GE 3.4.2/411 <Error> (thread=DistributedCache:StyleCache, member=47):
    java.lang.RuntimeException: java.lang.ClassNotFoundException: com.edmunds.vehicle.Style$PublicationState
         at com.edmunds.common.coherence.EdmundsEqualsFilter.readExternal(EdmundsEqualsFilter.java:84)
         at com.tangosol.io.pof.PortableObjectSerializer.initialize(PortableObjectSerializer.java:153)
         at com.tangosol.io.pof.PortableObjectSerializer.deserialize(PortableObjectSerializer.java:128)
         at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3284)
         at com.tangosol.io.pof.PofBufferReader.readAsObjectArray(PofBufferReader.java:3328)
         at com.tangosol.io.pof.PofBufferReader.readObjectArray(PofBufferReader.java:2168)
         at com.tangosol.util.filter.ArrayFilter.readExternal(ArrayFilter.java:243)
         at com.tangosol.io.pof.PortableObjectSerializer.initialize(PortableObjectSerializer.java:153)
         at com.tangosol.io.pof.PortableObjectSerializer.deserialize(PortableObjectSerializer.java:128)
         at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3284)
         at com.tangosol.io.pof.PofBufferReader.readAsObjectArray(PofBufferReader.java:3328)
         at com.tangosol.io.pof.PofBufferReader.readObjectArray(PofBufferReader.java:2168)
         at com.tangosol.util.filter.ArrayFilter.readExternal(ArrayFilter.java:243)
         at com.tangosol.io.pof.PortableObjectSerializer.initialize(PortableObjectSerializer.java:153)
         at com.tangosol.io.pof.PortableObjectSerializer.deserialize(PortableObjectSerializer.java:128)
         at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3284)
         at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2599)
         at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:348)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:4)
         at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
         at com.tangosol.coherence.component.net.message.requestMessage.distributedCacheRequest.partialRequest.FilterRequest.read(FilterRequest.CDB:8)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$AggregateFilterRequest.read(DistributedCache.CDB:4)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:117)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache.onNotify(DistributedCache.CDB:3)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:37)
         at java.lang.Thread.run(Thread.java:619)
    Caused by: java.lang.ClassNotFoundException: com.edmunds.vehicle.Style$PublicationState
         at java.lang.Class.forName0(Native Method)
         at java.lang.Class.forName(Class.java:169)
         at com.edmunds.common.coherence.EdmundsEqualsFilter.readExternal(EdmundsEqualsFilter.java:82)
         ... 25 more
    2010-02-09 13:04:23.122/90182.743 Oracle Coherence GE 3.4.2/411 <Info> (thread=Main Thread, member=47): Restarting Service: StyleCache
    Our code was doing something simple like:
    catch (Exception e) {
        throw new RuntimeException(e);
    }
    Would using the ensureRuntimeException call do anything for us here?
    Edited by: aidanol on Feb 12, 2010 11:41 AM
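    For reference, the "ensure" idiom that Coherence's Base.ensureRuntimeException follows is roughly this: rethrow a RuntimeException untouched, wrap anything else exactly once. A plain-Java analog of the idiom (a sketch of the idea, not the actual Coherence source):

    ```java
    // Plain-Java sketch of the "ensure" idiom: avoid double-wrapping
    // RuntimeExceptions while still preserving checked causes.
    final class Exceptions {

        private Exceptions() {
        }

        static RuntimeException ensureRuntimeException(Throwable t) {
            if (t instanceof RuntimeException) {
                return (RuntimeException) t;   // already unchecked: no extra wrapper
            }
            return new RuntimeException(t);    // keep the original as the cause
        }
    }
    ```

    So in the catch block quoted above, `throw ensureRuntimeException(e)` would at least avoid nesting one RuntimeException inside another; it would not by itself change how Coherence reacts to an exception that escapes your filter code.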

  • Best advice for dealing with awful service?

    Hey everyone!
    So my roommate and I have a problem with AT&T at least once a month, so often that they not only have to prorate our bill monthly, but we also have supervisors call us back for service follow-ups. Now we have the red light flashing once again for Broadband 1, and we've been trying EVERYTHING, of course unsuccessfully. Two questions: 1) Do we chalk it up to a faulty Broadband 1 cord and hope they come out and fix it? (We've had several "simple" issues like this that have required multiple AT&T servicemen to come out who are absolutely clueless or, my favorite, "don't have the right part.") Or 2) do we finally say forget it and just get rid of AT&T altogether? Talking to robots for 3 hours to resolve "simple" issues is a service I don't think anyone should be paying $100+ for every month. We're just absolutely frustrated with AT&T and overpaying so much for awful service. Hopefully on a forum like this we can get advice from people who've had similar problems (and can't ever find any decent customer service), including advice on switching to other providers that offer much better quality and service. We'd seriously appreciate any advice towards a better solution. Thanks!

      If you happen to have a 510/589 RG, IPv6 causes disconnects, rebooting and browsing problems.
    Disable IPv6 in the RG's Gui. http://192.168.1.254/
    Good luck

  • Best practice for dealing with Recordsets, JDBC and JSP?

    I've spent the last three years developing web apps using JSP, Struts and Kodo JDO for persistence. All of the content for the apps was created as Java objects using model classes and saved to an Oracle db. Thus, data retrieved from the db came back as instances of the model classes and was then put into Struts form beans, etc.
    I changed jobs last month and am now having to use Servlets with JDBC to retrieve records from db tables and returning it into Recordsets. Oh, and I can't use Struts in my JSPs either. I'm beginning to think that I had it easy at my previous job but maybe that's just because I was used to it.
    So here are my problems/questions:
    I have two tables with a one to many relationship that I need to retrieve data from, show in a jsp and be able to update eventually.
    So here's what I am doing:
    a) In a servlet, I use a SQL statement to join the tables and retrieve the results into a Recordset.
    b) I created a class with a bunch of String attributes to copy the Recordset data into, one Recordset row per each instance of the bean and then close the Recordset
    c) I then add the beans to an ArrayList and save the ArrayList into the session.
    d) Then, in the JSP, I retrieve the ArrayList from the session and iterate over each bean instance, printing the data out to the jsp. There are some logic statements to determine when not to print redundant data caused by the one to many join.
    e) I have not written the code to update the data yet but was planning on having separate jsps for updating the (one) table and the (many) table.
    Would most of you do something similar? Would you use one SQL statement to retrieve all of the data for display and use logic to avoid printing the redundant part of the data? Or would you have used separate SQL queries, one for each table? Would you have saved the results into something other than an instance of a bean class that represents one record in the RecordSet? Would you have had a bean class with attributes other than Strings - like had a collection attribute to hold the results from the "many" table? The way that I am doing everything just seems so cumbersome and difficult compared to using Struts and JDO before.
    Your help/opinion will be greatly appreciated!

    "Would you use one SQL statement to retrieve all of the data for display?" Yes.
    "And use logic to avoid printing the redundant part of the data?" No.
    I believe in minimising the number of queries. If it is a simple one-many join on a db table, then one query is better than one + n queries.
    However I prefer to store the objects in a bean class with attributes other than strings, i.e. one object with a collection attribute to hold the related "many" records.
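    A small sketch of that mapping (the OrderBean class and the use of plain string arrays to stand in for Recordset rows are illustrative assumptions; the grouping logic is the point): one parent bean per distinct parent key, each holding a collection of its child rows, so the redundant parent columns from the join collapse away.

    ```java
    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical bean: one object per "one"-side record, with a
    // collection attribute for the related "many"-side records.
    class OrderBean {
        final String orderId;
        final String customer;
        final List<String> items = new ArrayList<>();

        OrderBean(String orderId, String customer) {
            this.orderId = orderId;
            this.customer = customer;
        }
    }

    class RowMapper {
        // Each row is {orderId, customer, item}, as it would come back
        // from the single join query.
        static List<OrderBean> group(List<String[]> rows) {
            Map<String, OrderBean> byId = new LinkedHashMap<>();
            for (String[] row : rows) {
                OrderBean order = byId.computeIfAbsent(
                    row[0], id -> new OrderBean(id, row[1]));
                order.items.add(row[2]);   // redundant parent columns collapse here
            }
            return new ArrayList<>(byId.values());
        }
    }
    ```

    The JSP then iterates over the parent beans and, inside each, over its items list, with no "skip the redundant data" logic needed in the view.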
    Does the fact you are not using Struts mean that you have to use scriptlet code? (shudder)
    Or are you using JSTL, or other custom tags?
    How about tools like Ant? Junit testing?
    "The way that I am doing everything just seems so cumbersome and difficult compared to using Struts and JDO before."
    Anything different takes adjusting to. Sounds like you know what you're doing for the most part. I agree, in terms of best practices what you have described so far sounds like a step backwards from what you were previously doing.
    However I wouldn't go complaining about it too loudly, too quickly. If you're new on the block there's nothing like making a pain of yourself, and complaining how backwards their existing work is, to put your new workmates' backs up.
    Look on it as a challenge. Maybe discuss it quietly with a team leader, to see if they understand how much easier/better/less error prone such approaches can be?
    Struts, cumbersome as it can be, definitely has the advantage of pushing you to follow good MVC practice.
    Good luck,
    evnafets

  • Best strategy for working with popup dialog/PanelWindow (refbook)?

    In JDeveloper 11g I want to select a value from a reference book (lookup table) returned from the DB. This reference book must be shown in a popup dialog.
    I created a bounded task flow with a page fragment, in which I placed the query filter and a table of values to select from.
    1. I inserted the task flow as a region in a DIALOG component and placed "OK" and "Cancel" command buttons into the DIALOG's buttons facet.
    Problem: I can't get at the binding value (which contains the ID of the selected row) from the fragment's page definition.
    I tried setting the task flow's "Data Control Scope" to "shared", but this did not help.
    2. I placed the "OK" and "Cancel" command buttons into the page fragment and inserted the task flow as a region in a panelWindow component. Now I can get at the binding value from the fragment's page definition (of course), but I can't auto-close the panelWindow after pressing the buttons on the page fragment.
    Please help me solve this problem!

    Thank you for help, Frank!
    I have many reference books (reftables), each containing many records (two columns: ID, TITLE). I also have several main tables that refer to these reference books (maintable.ID_REF1 -> REF1.ID). On the page where I edit a record from one of the main tables, I can't use a simple LOV combobox because the reftable has too many records. I also can't use an ADF LOV input / ADF choice list, because the input field shows the ID of the referenced row (I want the TITLE column, but can't get it). So I have decided to use a modal dialog / panel window containing the region with the reftable (since each reftable may be used on many edit pages for the main tables).

  • What is the best approach for dealing with this issue?

    I have been advised by a Mac expert that the computer should either be left running except for extended periods, or, as I have been doing, shut down at the end of each day. Please explain the rationale for your response. Thanks

    On mactalk.com.au there was recently a thread on "When did you last shut down your Mac".
    A lot of people don't.
    Some people (which I think is mostly PC users converted to Mac, or old-timers who live in the past) do shut down daily.
    I myself have only shut my computer off when moving it physically, when there's an electrical storm, when I've been going on holidays, or when there's been some installation issue requiring a restart/coldstart. (I'm not counting "Restarts" for new software)
    OS X doesn't need to be shut down... though an occasional restart can be good for resetting things like virtual memory, etc.
    If you do leave it going, the CRON scripts only run if the computer is awake (at around 3 AM), so if you set the computer to sleep after a time, then it's no different from shutting down in that regard.
    If you don't mind the extra time waiting for a boot up... then it doesn't really matter. If like me you're impatient, then don't shut down.

  • Best strategy to deal with Compilations folder

    Hello,
    I just moved my music folder to an external USB drive and went back into Preferences, using the Advanced tab to change the location of the folder. iTunes worked for a while, then produced 263 single-song folders inside the Compilations folder.
    Now, I can attest to the fact that I always left "Part of a compilation" unchecked and selected No in the drop-down, each and every time I ripped a CD.
    How do I get all of these songs back into the folders they belong in?
    Thanks for any help.
    Wassim

    AskTom has a good article about this,
    http://asktom.oracle.com/tkyte/Mutate/index.html

  • ISO tools and techniques for dealing with bad doc libraries and lists

    The farm was upgraded from MOSS 2007 to SP 2010. There are a small number of sites that contain lists and libraries which are no longer accessible. I would like to delete these, but when I open the site in SP Designer 2010, it does not display any of the
    lists or libraries - good or bad. There is a left column entry called lists and libraries, but when I click on it, it says there are no items to show in this view. There are lists and libraries that work just fine.
    I have full control on the site. I have also tried using sharepoint admin accounts. SPD 2010 brings up the site. If I go into Site Settings > Site Lists and Libraries, the list and library that I want to delete are listed there for customization,
    but selecting either of them results in an error message.
    Is there a way, perhaps via PowerShell, that I could delete the list and library that are giving me problems?

    Hi,
    According to your description, these lists might have been corrupted during the upgrade from MOSS 2007 to SharePoint 2010.
    We can use PowerShell to delete these lists:
    #Get the Web
    $web = Get-SPWeb "<SharePoint-site-URL>"
    #Get the corrupted List
    $list = $web.lists["corrupted list name"]
    #Set the AllowDeletion Flag to True
    $list.AllowDeletion = $true
    $list.Update()
    #Delete the list
    $list.Delete()
    Read more:
    http://www.sharepointdiary.com/2013/07/force-delete-corrupted-sharepoint-list-using-powershell.html#ixzz2uIbSTbya
    Another two links about this for your reference:
    http://www.cdhtalkstech.com/2013/07/27/cleaning-up-corrupt-lists-in-sharepoint-2010/
    http://maxteo.wordpress.com/2012/08/25/delete-sharepoint-sitelist-using-powershell/
    Thanks
    Patrick Liang
    TechNet Community Support

  • Best way to deal with pics on computer and on iphone

    Hey Gang,
    I've got lots of pics of animals, houses, friends, celebs, yada yada.
    I have one huge PICS folder and have many subfolders there.
    My goals:
    a. iPhoto on my computer will show all pics of everything, even duds and bad pics
    b. Photos app on my iPhone shows a much smaller subset.
    Not sure how to organize this. I tried by simply duplicating the various albums and renaming them to "iphone friends" for example.
    This is quite laborious !
    Anyone found an elegant solution out there? Cheers / TIA

    You can use the Apple TV like an Airport Express. You have turn on iTunes in the ATV menu. Set up iTunes for Remote Speakers and then you can play Audible Books by running them in iTunes. They will play out of your remote speakers.

  • How is the best way to deal with duplicate photos

    I am using a new Retina 27" iMac, 16 GB RAM, OS X 10.10.1
    Aperture 3.6
    What is the best way to deal with duplicates that get into Aperture vaults?
    I have used Gemini and it finds duplicates, but I have no way of telling if the originals are still there.
    I don't want to go through 15,000 photos to try to find the duplicates.
    Thanks Charlie

    You mean one image in a vault and one in a library? Or duplicates in the same library?
    Photo Sweeper can scan several libraries or folders at the same time and display the duplicates side by side to let you pick which to keep.  You can define rules to mark photos for automatic deletion as well.
    http://overmacs.com/photosweeper.html
