Archive and metadata in Records Management

Hi,
Can you advise if it's possible to archive Digital Personnel File contents which are stored in Records Management? We've got a requirement to archive Digital Personnel File contents (HCM Processes and Forms (Adobe) and other stored office documents) together with metadata. Is it at all possible to define metadata in Records Management?
Best regards,
Jukka

Hi Jukka,
HCM Processes and Forms already uses RM to persist its data; I believe cases and documents are created in RM for these processes and forms.
Please have a look at the archiving solution for cases and records. It is available from the SDN page for RM; I have also posted a news item on http://www.sdn.sap.com/irj/sdn/nw-ecm announcing the upload of the archiving cookbooks.
The catch, as I see it, is that you will need to know how HCM stores the processes and forms in RM and then use the correct archiving solution for those objects.
Thanks & Regards,
Pragya

Similar Messages

  • Archiving documents in SAP Records Management

    Hello experts,
    Does anybody know how to archive documents in SAP Records Management? By default, they are stored somewhere in the SAP database. However, based on the SAP documentation, they can also be stored in an archive by using ArchiveLink. Where and how is this configuration done? Or do we need some coding here?
    I'm interested in two different solutions:
    1) Documents are stored on an archive server right from the start, so that I don't need to move the document content myself later.
    2) Moving existing Records Management documents to the archive. I know there is a program, SRM_KPRO_CONTENT_RELOCATION, for this purpose, but it doesn't seem to work without some archiving/RM configuration being done elsewhere first.
    We are using Records Management as part of Digital Personnel Files and the documents to be archived are completed Adobe forms.
    Best Regards,
    JV

    Hi JV,
    You need to define the repository in transaction OAC0. Based on my analysis, your documents are currently stored in the SAP database.
    If you are using a separate server, then configure the following in OAC0:
    Document area: SAP Records Management
    Storage type: RFC Archive
    Define the IP address and port, then go to CSADMIN and activate the certificate for the repository.
    If you are using the SAP database instead, make the changes in the same areas.
    These archived documents will then be saved in the content server instead of the SAP database.
    Hope this will help.
    Regards,
    Ravindra

  • Archive and delete sap event management objects

    Team, I need to configure the archiving objects and residence times to archive and delete EM objects. There are three steps of configuration, and I have never done this before.
    Can any one of you help me set up this configuration and identify the programs that need to be run to archive and delete these objects?
    I would really appreciate your help and a quick response on this.
    Define Archiving-Object-Specific Customizing
    Define Residence Times for Archiving or Deleting
    Define Settings for Deleting Document Flow
    Thank you!!!

    Hello Steffen,
    I was told that it would be possible to display archived and deleted event handlers using BAPI /SAPTRX/BAPI_EH_GET_DATA_GEN. I tested it, but could not verify this statement. SAP Help reads:
    "The archiving object /SAPTRX/A0 only supports a technical view in the Archive Explorer of the Archive Information System. This view is similar to the display in transaction SE16."
    It would be great if you could help clarify this. Thank you!
    Best regards,
    Philiipp

  • Records management and Document builder in SRM with PS

    Hi,
    We are implementing SRM 5.0 with PS (Public Sector). Can anyone share any experience or documentation about Records Management and Document Builder in SRM with PS?
    Thank you
    Sreedhar Vetcha

    Hi Chris,
    I am having the same issue as Patrizia while using the BSPs SRM_DEMO_RECORD and SRM_DEMO_BSPEXT to search, display, and perhaps later update the records, documents and notes that are stored in RM. I have read all the documentation described in your messages on how to customize a web-display BSP application using these two BSPs, SRM_DEMO_RECORD and SRM_DEMO_BSPEXT.
    I tested SRM_DEMO_RECORD with the <srm:element> tag embedded in the layout page, but it lists only the records that are associated with a given RMS_ID and SPS_ID, and I wasn't able to get it to display the associated documents and notes for a selected RM record. The comments on the iView page in the BSP suggest that custom code is needed for the initial request from the browser and for the callback event that returns notifications for activities occurring in RM. As you suggested to Patrizia, I wonder if you have some sample code that I could use to make these BSPs work before further enhancements are needed later. The other prerequisite settings, such as HTTP access and assigning the search interface class to the correct service provider element type using the GENSP_QUERY_EXT connection parameter, are all properly set at my customer. So I'd be really grateful if you could forward me the example code for these BSPs.
    Regards,
    Amy Lee
    SAP NetWeaver Consultant

  • Records Management not appearing in my folder/file properties

    I just did a fresh install of OCS_101200.tar.gz . I enabled records management using the steps described in http://download.oracle.com/docs/cd/B25553_01/content.1012/b25275/scenarios.htm#FLSAG129 . I can log into the /rm webapp and create file plans, records policies, etc. But when I right-click on any object in the Collaboration Suite web interface and click Properties, Records Management does not show up as a menu option. Is there an easy fix, or did I mess up the installation?

    Hi ckonstanski,
    As far as I can remember, you have to define at least one record category in the RM web client (http://your.OCS.com/rm) to make the Records tab visible in the CS web client.
    Kind regards,
    - Roland

  • Using Sharepoint as an archival and document management

    Hello, 
    I was asked by my manager to look for possible solutions to convert our archive paper file cabinets, which are full of student records, into digital format. We do not use SharePoint at the school yet, but from my research it seems that SharePoint does a good job at archiving and content management. Our ultimate goals are:
    1- To be able to scan the documents in the file cabinets and save them into folders on SharePoint. We want to be able to create a folder under a specific student's name and then scan all documents related to that student into that specific folder.
    2- To give permission to specific staff members to access and search for documents by student name.
    I wanted to ask if, and how, this is possible to achieve with SharePoint, given that we estimate around 120,000 paper documents to scan.
    I would much appreciate your help. 
    Thanks! 
    Nancy

    Hi
    There are many options to scan your documents and save them directly into SharePoint in a very effective manner: you can save directly from the scanner device, and you get versioning, searching, ordering and automatic workflow. You can also set a records management policy so that documents which have been stored for a very long time are automatically moved from one location to another, or removed. Apart from this, there are many nice content management features that you get out of the box with SharePoint.
    You can use the following applications along with SharePoint to scan your content; here are some of them.
    1. KnowledgeLake provides an affordable, platform-based solution for managing your company's unstructured content right within SharePoint. KnowledgeLake is an enterprise-scalable and comprehensive SharePoint solution for searching, viewing, securing, routing and annotating your company's mission-critical electronic content. The KnowledgeLake solution is fully integrated with SharePoint, so it is easy to implement, easy to administer, presents low risk, and is cost effective.
    2. ScanSnap and Microsoft SharePoint enable you to expand the benefits of your Enterprise Content Management (ECM) system right from your desk. Fujitsu bundles the KnowledgeLake 'Scan to Microsoft SharePoint' application with the scanning software, making it easy to onramp scanned documents directly to SharePoint. Scan to Microsoft SharePoint is a desktop application enabling users to easily index, store and retrieve any document to SharePoint 2003/2007/2010 or Office Live. It eliminates the conventional multi-step, manual process that standard SharePoint constructs require, replacing it with a fully integrated and automated solution.
    Here is a nice article where you can find some good details about automating content management:
    http://www.technologyfirst.org/magazine-articles/120-march-2013/823-bizdocshow-to-make-your-small-business-paperless-using-sharepoint-with-integrated-imaging-and-scanning.html
    Krishana Kumar http://www.mosstechnet-kk.com
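    For illustration only, here is a minimal sketch of the folder-per-student layout described above. It assumes SharePoint Online and the PnP.PowerShell module (neither is mentioned in the thread), and the site URL, library name, student name and scan path are made-up placeholders, so treat it as a concept sketch rather than a recommended implementation.
    # Assumption: SharePoint Online + PnP.PowerShell; all names below are placeholders.
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/StudentRecords" -Interactive
    # One folder per student inside a document library (here "Shared Documents/Students").
    $student = "Doe, Jane"
    Add-PnPFolder -Name $student -Folder "Shared Documents/Students"
    # Upload each scanned PDF for that student into their folder.
    Get-ChildItem "C:\Scans\DoeJane" -Filter *.pdf | ForEach-Object {
        Add-PnPFile -Path $_.FullName -Folder "Shared Documents/Students/$student"
    }
    # Permissions for specific staff would be set separately (for example by breaking
    # role inheritance on the student folder); search then finds documents by the
    # student name carried in the folder path.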

  • Error while trying to archive from Records Management

    Hello,
    I'm trying to figure out the best way to archive Digital Personnel File documents. Currently it looks like the best option would be the method described in the document [Archiving in SAP Records Management|https://cw.sdn.sap.com/cw/docs/DOC-22228]: using program SRM_KPRO_CONTENT_RELOCATION and archiving object SRMGSP.
    When I try to run this program in test mode, I get the following error:
    Problems while searching for the specified documents
    Message no. SRM_GENERIC_SP_ARC021
    There should be documents of the element type I'm trying to use, and I have configured the content category to be used with it (a content repository using ArchiveLink).
    Any ideas what might be causing the error? Thanks!
    - Jarmo

    Dear Jarmo,
    Please check if you have such documents by first searching for them in ORGANIZER for the given element type based on the last changed date. If you are able to retrieve results that way then this program should also work.
    Best Regards,
    Pragya

  • Difference btw SAP Records managements and SAP DMS?

    Can someone please tell me what the difference is between Records Management and SAP DMS?
    Thank you.

    Hi Vivek,
    perhaps some remarks on SAP Records Management. SAP Records Management (and SAP Case Management) is a powerful infrastructure for creating cross-component applications, since you can organize any SAP and non-SAP information objects as a record (a tree) in a role-based view.
    One classical application is the HR personnel record. You collect documents (archived documents as well as DMS documents, with versioning) together with personnel master data and other information. For each employee there is a record instance.
    Another application is a machine record for a machine producer. It is a collection of SD objects and engineering documents.
    A record can also be a business partner record, collecting information about the business partner and, for example, other records that refer to that business partner.
    Since SAP Records Management is designed as a service provider framework, you can integrate any kind of information object, both in SAP and in non-SAP systems.
    So SAP DMS services, too, can be wrapped as an SAP Records Management service provider and hence be used in SAP Records Management applications.
    The focus of SAP DMS is the handling of engineering documents and their behaviour in production. Based on SAP Knowledge Provider, it is more of a document management service.
    SAP Records Management stresses the aspect of building cross-component applications and their business flow. SAP already provides a lot of standard service providers, but you can also build your own service providers and hence integrate your own objects.
    For more information, see the documentation and get a first flavour by starting the transactions ORGANIZER and SRMREGEDIT.
    By the way, in contrast to SAP DMS, SAP Records Management is part of NetWeaver.
    Best regards
    Torsten

  • CFolders and Records Management, anyone?

    Hi all,
    Is anyone out there using cFolders together with Records Management?
    We want to be able to link a ‘package’ of collaborated DMS documents to a PO number and to set up an approval workflow when updated documents are retrieved.
    We are thinking about using Records Management to archive this, and I would like to know if anyone has any experience with this combination.
    Regards,
    Stephan Nilsson

    Hi Stephan,
    did you figure it out?
    I also have the same issue...
    Please let me know the solution, if you have it.
    Thanks in advance.
    Bye.

  • What is data archiving and DMS (Document Management System) in SAP

    What is data archiving and DMS (Document Management System) in SAP?
    Welcome to SCN. Before posting questions please search for available information here and in the web. Please also read the Rules of Engagement before further posting.
    Edited by: kishan P on Aug 31, 2010 1:06 PM

    Hi,
    Filtering at the IDoc Level
    Identify the filter object (BD59)
    Modify the distribution model
    Segment Filtering
    Specify the segments to be filtered (BD56)
    The Reduced IDoc Type
    Analyze the data.
    Reduce the IDoc type (BD53)
    Thanks and regards.

  • Records Management  add  and read note

    Hello
    We are in the process of migrating our paper archive from Excel sheets to Records Management. For the data migration of the paper records I found a BAPI. So far so good. I also have to add a note to the paper document, but for this I have found nothing. I tried it with the BAPIs for documents, with negative results. How can I create a note and attach it to the paper document?
    Any suggestions or code?
    Regards
    Andreas

    I assume you are using records and not cases. The easiest way is to create your own service provider like SRM_SP_NOTE; the problem with this is that it has no API, as far as I know. The other possibility is to use cases instead of records. There you can create notes with the case API.
    regards,
    Thomas

  • Is Informatica8.1.1 Data Analyzer and Metadata Manager is Mandatory

    Hi all,
    Are Informatica 8.1.1 Data Analyzer and Metadata Manager mandatory components for BI Applications installations?
    We are installing BI Applications with Oracle Application Server, but the above components are supported by JBoss/IBM WebSphere/BEA WebLogic.
    Let me know whether I can skip these two components during the Informatica installation.
    Thanks
    saran

    Hi,
    You don't require Data Analyzer and Metadata Manager for OBI Apps; use the custom installation and select PowerCenter only while installing Informatica.
    Regards
    Tarang Jain

  • RE: List Archive and Release Management

    Hi back.
    >> what procedures do people use to manage the development, testing
    and release of software?
    It's useful to have a separate repository for Development, QA, Production,
    etc. It's true that you have to 'import' between them, each time you
    propagate changes. It might seem like a pain to constantly export/import,
    but actually it's better that way.
    Firstly, you should only export to QA when you're absolutely sure it's ready
    for quality assurance, and this doesn't happen every day (or shouldn't).
    Second, the advantage of this is that you get a 'clean' import and compile
    each time, with no other baggage. This is especially critical when you
    migrate from QA to a production environment.
    There's no need to have a testing_verXX repository, just make a different
    workspace for each version of test.
    Just check out all the components, import, and integrate. The 'old'
    workspace will have a snapshot of the old release.
    * Don't Update that old workspace or you'll lose it!
    * Put an administrator password on the repos, to protect against
    inadvertent usage.
    * It's expensive to have many repositories running on your environment
    manager, so I recommend only one repos for each of development, QA and
    Deployment.
    * You might want to consider only bringing up the production repos as
    required, they hog RAM like crazy.
    It's critical to have a complete image of each version of every application
    you have in production. That way, you can recreate any problem.
    It's even useful to migrate your development repos every now and then. As
    you know, these repos beasties can grow quite large, and it's good to start
    from scratch on a regular basis.
    For example, every month or so, our core architecture team releases a new
    version of the core components. We all integrate, and we create a new repos
    from the core architecture stuff, and import all our stuff over the top of
    it. That way, we keep a loose coupling between sub-systems, and their stuff
    never depends on ours. Also, with a split development repos, you can locate
    the different teams on different intranets, or even on different sides of
    the world, all with relatively little fuss.
    Forté SCM hooks and SCCS: We use a Unix version control system called CMVC.
    Whenever we integrate, it checks out the pex files, exports them, and checks
    them back in.
    John Pianezze
    S1 Technologies (Asia Pacific)
    Melbourne, Australia
    -----Original Message-----
    From: Duncan Kinnear [SMTP:[email protected]]
    Sent: Wednesday, July 21, 1999 1:05 PM
    To: [email protected]
    Subject: List Archive and Release Management
    Hi folks!
    First of all, does anyone know what's going on with the list archive on SageIT? There doesn't seem to be a search facility anymore. Seems a bit weird when each message is appended with a little signature 'advertising' a searchable archive.
    Second (and this is the biggy!), what procedures do people use to manage the development, testing and release of software? I'm talking about how you keep them separate, how you do hot-fixes, how you identify installed versions at customer sites, etc.
    I've thought that we could do it with separate repositories for "current development", "testing_verXX", "release_verXY", etc., where we export from "development" and then import into "testing", and similarly for "release".
    Also, we are looking at using the Forté SCM hooks and SCCS on our Unix host to store historical versions of the projects/classes. But we somehow need to identify which version of a particular component is installed at the customer site. I had thought of defining a constant in each project/class called "SCCS_VER" which contains the SCCS keywords that get mapped to SCCS ID and date when put into SCCS. Then the constant could be used to display these values in a Windows "About" window.
    Any thoughts/opinions welcome.
    Cheers,
    Duncan Kinnear,
    McCarthy and Associates, Email: [email protected]
    PO Box 764, McLean Towers, Phone: +64 6 834 3360
    Shakespeare Road, Napier, New Zealand. Fax: +64 6 834 3369
    Providing Integrated Software to the Meat Processing Industry for over 10 years

    Under Forte 2 we used to export all projects, then import them into a make
    repository, then produce an app. Due to the time involved importing &
    exporting, and errors resulting from importing projects out of order, we
    switched to making copies of the dev repository and producing apps from
    that. For us, it used to take 2+ hours to export/import, as opposed to 5
    minutes for copying repository files. Unfortunately, we have found that
    unless we force-compile prior to deploying, we will get errors when
    attempting to compile partitions.
    Our current procedure:
    Shut down dev repos
    Copy to make env
    make a workspace that includes all projects
    force-compile
    update
    integrate
    make a new workspace for each app to make
    make each app
    export code
    remove unused workspaces
    clean repos
    -----Original Message-----
    From: Peter Sham (HTHK - Assistant Manager - Software Development, IITB)
    [mailto:[email protected]]
    Sent: Wednesday, July 21, 1999 3:31 AM
    To: John Pianezze; Duncan Kinnear
    Cc: [email protected]
    Subject: RE: List Archive and Release Management
    Hi,
    Like what you've explained, we keep different releases of our application in different workspaces. However, as a release gets larger, this "check-out-every-component-then-overwrite-by-import" procedure gets tougher. Sometimes it even creates garbage in the repository which corrupts my workspace.
    So I have an alternative proposal for this procedure (though I haven't really tested the idea), which is to create a new repository for each new release.
    What do you think?
    Regards,
    Peter Sham.

  • File naming, archiving and time management

    I've posted on this subject before, but I have a new twist that I'd like to get some feedback on.
    I usually import my photos, keeping the master (now called original) file name until the end of the calendar year.  At the end of the year, I like to change the original name for classification and archiving purposes.  By then, I've usually made all of the deletions for the year, so I feel comfortable renaming the photos with some sort of counter or index.  My preferred classification system is: "Custom Name"/"Image Date_"/"Counter" (0000).
    The problem that I'm experiencing is that it is impossible to rename my originals using this format without some inaccuracies if I try to name them all at once without readjusting the computer's internal time zone settings.  I live on the east coast, so if I have a photo shot at 10:30 pm PDT on 2011-03-14, it gets named with a date of 2011-03-15, which obviously isn't accurate for when that photo was shot.  Well, it is accurate based on East Coast Time, but I want the file to be renamed with the date that it was shot, where it was shot, not where my computer currently resides.  Of course, I could rename the batch of 2011 photos in segments, but that would mean multiple quits/reopens from Aperture in order to change the time zone appropriately.
    It seems that my only choices are to either rename my photos at the time of import using the correct time zone settings on my computer, or to not use this renaming format. Neither of these options is very appealing, since this renaming format is my preferred method.
    I guess my question is: does anyone have any insights or advice on either how to better work around this problem, or if not, other renaming methods that they like to use for archival and organizational purposes?  I know there are many to choose from, but I'm looking for something simple, which also provides direct information about the image, should I want to reference my Originals (which I do outside of Aperture from time to time).
    Thanks for adding to this discussion...
    mac

    Allen,
    SierraDragon wrote:
    mac-
    Personally I create a folder for each Project and copy pix from CF card into those folders. Then I import from the backup hard drive into Aperture using the folder name as the Project name.
    Usually each Project includes only one day or less, and I may have YYMMDD_JonesWed_A, YYMMDD_JonesWed_B, etc. for a large or multiday shoot. I do not let any Project contain more than ~400 Nikon D2x RAW+JPEG files.
    Projects are just that and never put into folders other than by month and/or year, just a forever chronological list. All organizing is done via Albums and Keywords. JonesWed_2011 is a keyword that can be an Album instantly when needed; bride is a keyword; wed is a keyword; flower is a keyword; etc.
    I use wedding just as an example. The process applies to all kinds of shoots.
    I use the 1-9999 Nikon auto-numbering of image files, and never rename image files except  sometimes during export. That way original image names can always be found across mass storage devices in the future independent of any application.
    -Allen
    SierraDragon wrote:
    Usually each Project includes only one day or less, and I may have YYMMDD_JonesWed_A, YYMMDD_JonesWed_B, etc. for a large or multiday shoot. I do not let any Project contain more than ~400 Nikon D2x RAW+JPEG files.
    Why do you keep the photo count in a project at around 400 files or so?  Is it detrimental to speed, or are there other considerations that have led you to work this way?
    SierraDragon wrote:
    Projects are just that and never put into folders other than by month and/or year, just a forever chronological list. All organizing is done via Albums and Keywords. JonesWed_2011 is a keyword that can be an Album instantly when needed; bride is a keyword; wed is a keyword; flower is a keyword; etc.
    So, you are saying that you sometimes put projects into folders by month and/or year?  Or, do you just keep all projects at the top level of the hierarchy?  The only folders I use are at the top of my hierarchy, and they are by year, 2002, 2003, 2004...2012.  I then keep all of my projects in the appropriate year.  I used to keep folders that were named things like, "Travel", "Occasions"..., but this became problematic when I had overlap, and images could fit in more than one designated folder.
    SierraDragon wrote:
    I use the 1-9999 Nikon auto-numbering of image files, and never rename image files except  sometimes during export. That way original image names can always be found across mass storage devices in the future independent of any application.
    It sounds as though you don't actually rename your images at all, but rather just keep the original names.  I don't like to do this because after deletions, it creates gaps in my sequence, and I also end up with multiple images with the same name.  I like for each image to have its own unique identifier by name.
    I'm considering importing the images using a version name, where the version is named by the image date.  I'll keep the original file name intact until the end of the year, and then, should I decide to rename my files, I could base my renaming system off of the version name.  This will automatically capture the date of the image without being reliant on my computer's time zone settings.
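    For what it's worth, here is a minimal sketch of that "rename by shoot-local date" idea done outside Aperture in PowerShell. It assumes the exported originals sit in a made-up folder, that the shoot's time zone is known per batch, and it uses each file's LastWriteTimeUtc as a stand-in for the real EXIF capture time (an assumption; reading EXIF would need an extra tool).
    # All paths and names are placeholders; LastWriteTimeUtc stands in for the EXIF capture time.
    $customName = "Yosemite"
    $shootZone  = [System.TimeZoneInfo]::FindSystemTimeZoneById("Pacific Standard Time")
    $counter    = 1
    Get-ChildItem "C:\Exports\Yosemite2011" -File | Sort-Object Name | ForEach-Object {
        # Convert the UTC timestamp into the zone where the photo was shot, so the
        # new name reflects the shoot-local date regardless of this computer's settings.
        $shotLocal = [System.TimeZoneInfo]::ConvertTimeFromUtc($_.LastWriteTimeUtc, $shootZone)
        $newName   = "{0}_{1:yyyy-MM-dd}_{2:0000}{3}" -f $customName, $shotLocal, $counter, $_.Extension
        Rename-Item -Path $_.FullName -NewName $newName
        $counter++
    }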

  • Messaging Records Management: loss of functionality between Managed Folders and Retention Policy

    In Exchange 2007 and 2010 you can use Managed Folders with Managed Content Settings to act on messages based on when they were moved into the folder. E.g.,
    New-ManagedContentSettings -Name DeleteJunk -FolderName ManagedJunk -MessageClass * -RetentionEnabled $true -RetentionAction DeleteAndAllowRecovery -AgeLimitForRetention 30 -TriggerForRetention WhenMoved
    will delete items 30 days after they're moved into the Junk E-Mail folder designated by the Folder Name "Managed Junk".
    However, Exchange 2010 introduces Retention Policies as an alternative to Managed Folders, and Exchange 2013 drops Managed Folders entirely. With a Retention Policy, it's impossible to specify a TriggerForRetention. See http://technet.microsoft.com/en-us/library/dd335226%28v=exchg.141%29.aspx
    and note that New-RetentionPolicyTag lacks -TriggerForRetention as a parameter. If you Get-RetentionPolicyTag | fl you can still see a TriggerForRetention value, but it's "WhenDelivered" and can't be changed.
    Both http://technet.microsoft.com/en-us/library/bb430780%28v=exchg.141%29.aspx (Exchange 2010) and http://technet.microsoft.com/en-us/library/bb430780%28v=exchg.150%29.aspx (Exchange 2013) describe a rather convoluted method used to determine the age of a message for retention purposes, but I don't know whether to believe those pages. It seems apparent, though, that basing a message's age on WhenDelivered can easily produce undesired results.
    Consider a Retention Policy Tag placed on the Junk E-Mail folder which will perform DeleteAndAllowRecovery on items older than 30 days. A message arrives on 01/01/15 and is classified as Junk. It immediately starts aging based on the date 01/01/15. The user finds the message, which was misclassified, and moves it to another folder. Since there's no way to set a TriggerForRetention based on WhenMoved, the message continues to age from 01/01/15. Ninety days later, while reviewing their mail, the user accidentally clicks the Junk button on the message. Because the message is more than 30 days old, it's deleted immediately.
    1) Am I correct? Or is there a way to use Retention Policies so that messages are acted on based on how long they've resided in their current folder?
    2) What do I have to do to get Microsoft to add the ability to change the TriggerForRetention on a Retention Policy Tag in Exchange 2013?
    Note, I'm not the first person to raise this issue. It's been discussed in a number of places including TechNet. Here are a few:
    http://social.technet.microsoft.com/Forums/exchange/en-US/82c01e6e-0184-4d25-b803-45a604ca0c68/retention-policy-tag-problem?forum=exchangesvrsecuremessaginglegacy
    http://www.shudnow.net/2010/04/08/exchange-2010-sp1-retention-policies/
    http://social.technet.microsoft.com/Forums/exchange/en-US/82c01e6e-0184-4d25-b803-45a604ca0c68/retention-policy-tag-problem

    By default there is no property on a message that identifies whether it was moved from one folder to another. Managed Folders used to stamp a new property, Moveddate, on the message when the content setting on the target folder was set to WhenMoved.
    WhenMoved functionality is not available with Retention Policies. You can raise your concern about making this functionality available by opening a ticket with support.
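    To make the contrast concrete, here is a short sketch using the cmdlets named above; the tag name "Junk-30d" is just an example. The retention-tag cmdlet simply has no -TriggerForRetention parameter, and the value visible on the tag is fixed.
    # Retention Policy Tag on the Junk E-Mail default folder: items age from delivery.
    New-RetentionPolicyTag -Name "Junk-30d" -Type JunkEmail -RetentionEnabled $true `
        -RetentionAction DeleteAndAllowRecovery -AgeLimitForRetention 30
    # Unlike New-ManagedContentSettings above, there is no -TriggerForRetention here.
    # The trigger is visible on the tag but fixed at WhenDelivered:
    Get-RetentionPolicyTag "Junk-30d" | Format-List Name,Type,AgeLimitForRetention,TriggerForRetention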

    Hello, Welcome again, well i have an issue. i started a new project in oracle forms, we have Template forms having common buttons like Add, Modify, Delete, Exit, Clear and Print which comes in all the forms as a Standard. So for this We use .PLL File