Inherit workflow and versioning to subfolders

Does anyone know if there is a way to inherit workflow and versioning settings from a parent folder to its subfolders? In the standard configuration we have to set both of these settings for every new subfolder we create (permissions are inherited).
thanks!

Hi again,
SAP has now introduced, with KMC SPS 14, the "Inheritance of the Versioning Settings When Creating New Folders"; see http://help.sap.com/saphelp_nw04/helpdata/en/3a/d60a15803111d5992f00508b6b8b11/frameset.htm for details.
Advantage: it is implemented now.
Disadvantage: it is not customizable in any way, and inheritance of ApprovalWorkflow, ManualOrdering and TimeDependentPublishing is still not supported.
Hope it helps
Detlev

Similar Messages

  • How to view and restore sharepoint 2013 designer workflows and how to redeploy with newer version to environments

    MCTS Sharepoint 2010, MCAD dotnet, MCPDEA, SharePoint Lead

    Hi,
    In SharePoint Designer 2010, we could not save a workflow as a template directly, except for reusable workflows.
    In SharePoint Designer 2013, however, we can save all workflow types as templates, and you can then import the workflow into the new environment.
    http://blogs.msdn.com/b/workflows_for_product_catalogs/archive/2012/11/02/deploying-a-workflow-on-a-different-server.aspx
    In SharePoint Designer 2013, every time we publish the workflow we get a new workflow version, and the old version is overwritten.
    So when you deploy the workflow in the environment, the workflow will be the newer version.
    Thanks,
    Jason
    Forum Support
    Jason Guo
    TechNet Community Support

  • Tapeless workflows and Sandy Bridge or other PC's: KISS or LOVE?

    Life used to be so simple when shooting video on a tape based camera. You shot your material, captured it for editing and stored your precious original footage on tape in a safe and dry place. Sure, it took time to capture, but the big advantage was that if you had a computer or drive failure, you would still have the original tape so everything could be recreated.
    Now with tapeless workflows we have the significant advantage of much faster import of the original footage. Connect the flash card or disk drive to the computer over USB and copy the data to a HDD on the computer, ready for editing. The data on the flash card or disk drive can then be erased, so you can reuse it for more shots. But, like Johan Cruyff has said repeatedly, every advantage has its drawback. In this case it simply means that you no longer have the original material to fall back on, in case of computer or drive failures. That is a very unpleasant and insecure feeling.
    The easy answer to that problem is backups. Backup of the original media, backup of projects and backup of exports. This often means a bundle of externals for backup, or NAS configurations. One thing is clear: it requires discipline to make regular backups, and it costs time as well as a number of disks. Four as a minimum: 1 for media, 1 for exports and at least 2 for projects. Note: This is excluding a backup drive for OS & programs.
    There are different backup strategies in use. Some say backup daily and use one disk for Monday, one for Tuesday, and so on. Others say one disk for the first backup, the second for the second backup, then the first again for an incremental backup, etc., and once weekly a complete backup on a third disk. Whatever you choose, be aware that the shelf life of a disk is far less than that of tape. There are horror stories everywhere about ball bearings getting stuck after some time, and without original tapes you had better be safe than sorry, so don't skimp on backups.
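    To make the rotation idea concrete, here is a hedged sketch in shell; all paths are hypothetical demo locations under /tmp, and a real setup would point SRC at the media folder and the DEST variables at the mounted backup drives:

```shell
#!/bin/sh
# Sketch of a two-disk alternating backup: even ISO weeks go to disk A,
# odd weeks to disk B. Paths below are demo stand-ins, not real mounts.
SRC=/tmp/demo_media
mkdir -p "$SRC" /tmp/backup_a /tmp/backup_b
printf 'clip data\n' > "$SRC/clip001.mov"

# Strip a leading zero so week numbers like "08" are plain decimal.
WEEK=$(date +%V | sed 's/^0*//')
if [ $((WEEK % 2)) -eq 0 ]; then
    DEST=/tmp/backup_a
else
    DEST=/tmp/backup_b
fi

# Copy the contents of the media folder onto the chosen backup disk.
cp -R "$SRC/." "$DEST/"
echo "backed up to $DEST"
```

    A real version would also rotate a third disk for the weekly full backup, but the week-parity test above is the whole trick.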
    What is the relevancy of all this? I thought this was about Sandy Bridge and other PC's.
    It is and let me try to explain.
    Card based cameras are for the most part DSLR and AVCHD type cameras, and we all know how much muscle is required to edit that in a convenient way. Adobe suggests in the system requirements to use raid configurations for HD editing and practice has shown that raid arrays do give a significant performance boost and improve responsiveness, making for a nicer editing experience. The larger the project and the longer the time-line, the more a raid array will help maintain the responsiveness.
    One thing you would not do is use a raid0 for projects, media and exports, even if you have backups. The simple reason is that the chance of disk failure multiplies with the number of disks in the raid0: two disks roughly double the chance of a failure, three disks triple it, four disks quadruple it, and so on.
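    A quick back-of-the-envelope check (my own illustration, assuming independent drives and a hypothetical 5% annual failure rate per drive) shows why: the chance that at least one of n drives fails is 1 - (1 - p)^n.

```shell
# P(at least one of n drives fails) = 1 - (1 - p)^n, with p = 0.05 assumed.
awk 'BEGIN {
    p = 0.05
    for (n = 1; n <= 4; n++)
        printf "%d drive(s): %.1f%% chance of at least one failure\n", \
               n, (1 - (1 - p)^n) * 100
}'
```

    For small p this is close to n times p, which is why adding disks to a raid0 roughly multiplies the risk.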
    Remember: Disaster always strikes when it is most inconvenient.
    Imagine you have been working all day on a project, you decide to call it a day and to make your daily backup, but then the raid fails, before you made your backup. Gone is all of today's work. Then take into consideration the time and effort it takes to restore your backups to the state it was in yesterday. That does not make you happy.
    Another thing to avoid is using a software or mobo based parity raid, for the simple reason that it is slooowww and puts a burden on the CPU that you want to use for editing, not housekeeping.
    For temporary or easily recreated files, like the page-file, media cache, media cache database and preview files, it is very much advised to use a raid0. It makes everything a lot snappier and if disaster strikes, so what? These are easily recreated in a short time.
    This was a general overview of what is required with tapeless workflows. Now let's get down to what this means in terms of system design.
    Two approaches or trains of thought:
    KISS: Keep it stupidly simple or LOVE: Laughing over video editing
    The first one, the most economic one, is to use a system with 3 or 4 disks internally and 4 or more backup disks.
    A typical disk setup can look like this:
    This is a perfectly sensible approach if one does not have large or complex projects, long time-lines and is willing to take the risk of occasionally losing a whole days work, between backups. Many hobbyists and consumers fall in this category.
    The KISS approach keeps it stupidly simple. The drawback is that there is no logical way to add more disks or storage. The discipline, diligence and effort required for regular backups make it far from a laughing matter. In fact it can quickly become a bore. Add to that the fact that the disk setup is simple but not very fast, so less suited for situations where lots of clips are involved, multi-cam is a regularly recurring situation or lots of video tracks are involved.
    A number of video editors want more from their system than the occasional platonic KISS, they want to really LOVE their system, which leads to the other train of thought.
    This is more costly than the KISS approach, but you all know a fiancée or wife is more costly and dear than the occasional kiss on the cheek by an old friend.
    Let's start with a typical disk setup. It may look like this:
    Two striking differences in comparison to the KISS approach:
    1. Much easier disk organization and more disks and thus more space.
    2. It requires a hardware raid controller, causing a higher investment cost. It is like an engagement ring. You don't get LOVE for free, one of the guiding principles of the oldest trade in the world.
    These are easy statements to make, but what are the benefits or advantages, that you would fall in LOVE with such a system, and what are the drawbacks? Think back to Johan Cruyff's adage.
    The only drawback is cost. The advantages are multiple, easier organization, more speed, more storage, snappier editing, no jerkiness, lesser requirements for regular backups and - this is the major benefit - hardly a chance of losing a day's work in case of a drive failure. Keep in mind that a parity raid keeps all your data intact in case of a drive failure, so lessens the need for up-to-date backups.
    We all know, we get what we pay for: "If you pay peanuts, you get monkeys. OTOH, if you pay money to monkeys, you get rich monkeys". But in this case you get what you pay for, a much better editing experience with a much easier workflow.
    Using a parity raid (be it raid 3/5/6/30/50/60) you get security and peace of mind: you are protected against losing precious media, you need not worry about when you last made a backup or whether today's editing may be lost, and you save valuable time and a lot of aggravation because of a much more responsive system.
    How does this all relate to Sandy Bridge and other PC's?
    First of all, the price difference between a Sandy Bridge / P67 platform and an i7-950+ / X58 platform is very small. Of course the new architecture is slightly more expensive than the older one, but the differences are small, almost not worth talking about.
    So what are the differences? Look below:
    The first thing to keep in mind is that the Sandy Bridge is the successor of the i7-8xx CPU and as such it is much more evolutionary than revolutionary. The CPU power has increased significantly over the i7-8xx due to new architecture and a smaller production process (32 nm), but in essence all the capabilities have remained unchanged. Same memory, same PCI-e lanes, same version, same L3 cache and no support for dedicated raid controllers.
    It is great that the processor performs much better than the older i7-8xx CPU's, almost achieving the level of the i7-9xx range of processors, but is still limited:
    The Sandy Bridge is unsuitable for anything more than a KISS system.
    Why? Because it lacks the PCI-e lanes required to accommodate more than a 16x PCI-e nVidia card with CUDA support to enable hardware MPE acceleration, and the integrated graphics are not supported by CS5.
    You may wonder if that is a bad thing. The plain and simple answer is NO. It is a great processor, it delivers great value for money, is a solid performer, but it has its limitations. Intel had a reason to position this CPU as a mid-level CPU, because that is what it is, a mid-level performer in comparison to what is to come.
    The term mid-level performer may seem strange when compared to the old generation of i7-9xx CPU's, because they perform almost equally well, but keep in mind that there is a generation difference between them.
    So what about the i7-9xx and X58 platform?
    It still is going strong. About the same performance as a Sandy Bridge, with only the much more expensive hexa-cores clearly in the lead, both performance and price wise. The quad cores deliver about the same value for money.  The main difference however is the platform that allows a dedicated raid controller to be installed, thus making it the platform of choice for those who want to go from a passing KISS to true LOVE.
    And what lies ahead?
    Sandy Bridge E on the Waimea platform (X68). Now that is revolutionary. More than double almost everything a processor can offer: double the cores, double the PCI-e lanes, triple the memory, more than double the L3 cache, increase the PCI-e support from 2.0 to 3.0, etc...
    This is why Intel calls this a high-end CPU / platform.
    So what now?
    If you prefer a KISS approach, choose either a Sandy Bridge/P67 or an i7-950+/X58 platform.
    If you wonder whether in the future you may need multi-cam more frequently, edit more complex projects and longer timelines or even progress to RED, look at KISS/LOVE solutions, meaning the i7-950+/X58.
    If you can't have downtime, time pressure is high, delivery dates to clients are critical or you edit highly complex projects, lots of multi-cam situations or lengthy time-lines, choose a LOVE solution, an i7-950+/X58 platform.
    If you have the time to wait till Q4/2011, Sandy Bridge E/Waimea looks to be worth the wait.
    Hope this gives you some more insight into recent and future developments and helps you make wise investment decisions.

    I'm upgrading from an AMD 3800+, cutting with Vegas 7 Pro. Usually shoot DSLR or HDV, sometimes P2, EX or RED. I have ridiculously cheap access to Macs, FCP/FCS, all kinds of software.
    I've been agonizing over this for the last month. I was originally hoping the UD7 mobo was the solution, read the thread about the NF200/PCIe issue a few days ago, http://www.dvinfo.net/forum/non-linear-editing-pc/489424-i7-980x-now-wait-sandybridge-2.html, and still decided to go for a 2600k.
    My preference is to treat my video footage the same way as my digital imagery: I make (at least) duplicate back ups of everything before reformatting the cards, never delete the back ups, and only worry about the day-to-day stuff at night. Unless I'm rendering or involved in other long processes, in which case I'll back up the work in process the next day. If I am under a really really tight deadline I might back up as I go.
    Yes, a RAID might make it easier, but I'm paranoid enough to prefer a slower, safer backup. You can always duplicate, and usually improve upon, a day's work, but you can never get back original footage you lost. I have only ever had one hard drive die on me (a few enclosures crapped out, though); it took a couple of (mostly unattended) hours to rectify. As a matter of fact, I've had far more loss/damage from tapes than from hard drives.
    I ordered the UD7, 2 F4s and 4 F3Rs, understanding I will probably want to upgrade to SBE when it comes out, or maybe next year. The 2600k/mobo/RAM will likely hold its value better than a 950/X58, likely because of the marketplace as much as merit.
    The UD7 / RAID card issue is in its early days, there may be a solution/mitigation. Probably not. But if I really really need a RAID card, then I probably really really need a 980, NAS, etc etc.
    But Harm still rocks!

  • How can I select all RAR archives in a folder and its subfolders?

    so that I can quickly move them all into the same folder to unarchive them. This is after I've decompressed a bunch of zips. Unless someone can tell me how to get 'The Unarchiver' to be less dumb and find the other pieces on its own. I am constantly getting audio and video in this stupid file arrangement, obviously the work of an evil Windows warlock.
    like this
    [folder]
    somefile.xyz.01
    [folder]
    somefile.xyz.02
    [folder]
    somefile.xyz.03
    and so on. It is driving me completely crazy because it seems like it should be such a natural thing to tell the OS- 'hey, select all the files all the way down into all the subfolders within this folder that are like this one.'
    I will set up a shrine and tell tales of your greatness throughout the land if you can help me figure this out.

    Here's what you do...
    open a console window at the root directory that contains all of the .zip files.
    After you open all of the zip files you'll have a directory full of subdirectories that contain your .rar files.
    Make sure your console window is pointed at the root. If you type "ls" (without the quotes) in the console you should get a directory listing showing all of the subfolders.
    Now type this command into the console:
    find ./ -iname "*.rar" -exec mv {} ./ \;
    This command will recursively find all of the .rar files within the parent directory and then move them to the root level of the parent directory.
    Now all of the .rar's will be at the same level and you can use your tool of choice to unrar them.
    OOPs, I meant terminal window, NOT console window.
    After I wrote this response I put together a little automator workflow and saved it as a service. Seems to work well. Just open automator and create a service. Specify that the service accepts folders from the finder. Then add a single component which is "run shell script". Set the shell script to pass input as arguments. The shell script should be set to contain this:
    for f in "$@"
    do
              # quote "$f" so folder names containing spaces work
              cd "$f" || continue
              find ./ -iname "*.rar" -exec mv {} ./ \;
    done
    Save with a service name: I use "unrar subfolders".
    Now you simply right click on the parent folder that contains subdirectories that have segments of .rar archives. Pick the service that you created and the magic will happen and all .rar files will be at the parent level. Make sure you right click the parent folder and not just one of the .rar subfolders, that won't work.
    This script can easily be modified to also do the unzip step and clean up afterwards but I like to keep my actions simple and short to avoid unintended consequences.
    Those of you following along might notice that the script in fact can operate on several folders so if you select a set of folders and right click it should process all of them in turn. I haven't tested that!
    The useful trick that can be used again and again is the "find ./ ########### -exec ####### ./ \;" sequence. This will recursively search from the current parent directory and execute another command on everything the search finds. The search can be as complex as you like, as can the execution step. Just make sure you really know which directory is the current parent, because you can easily wind up firing this from your account root or even system root and do nasty things!
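    As an illustration of the remark above about modifying the script, here is a hedged sketch that exercises the same find pattern on a made-up demo layout (the /tmp paths and file names are invented; a real run would start from your own parent folder):

```shell
#!/bin/sh
# Demo: build a fake parent folder with a .rar part buried in a subfolder,
# then pull every .rar up to the parent level with the find/mv trick.
PARENT=/tmp/demo_parent
rm -rf "$PARENT"
mkdir -p "$PARENT/disc1"
printf 'placeholder\n' > "$PARENT/disc1/somefile.part1.rar"

cd "$PARENT" || exit 1
# If the archives were still zipped, an unzip pass could go first, e.g.:
#   find ./ -iname "*.zip" -execdir unzip -o {} \;
# Then move every .rar up to the parent level, as in the original tip.
find ./ -iname "*.rar" -exec mv {} ./ \;
ls "$PARENT"
```

    After this runs, somefile.part1.rar sits at the parent level, ready for the unarchiver.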

  • IPhoto 08 and Photoshop - suggestions on workflow and managing PSD files

    Hi Everyone,
    I'm a long-time Photoshop user, and a recent convert to Mac (BTW - I love my new Mac), therefore I am also new to iPhoto 08. I must say that I actually enjoy how iPhoto manages my pictures, as it is not all that different from how I've been managing my pics manually for years (by year, by event, etc.). I've read the various discussion topics on how to set up and use Photoshop as an external editor from iPhoto, and have not had any problems up to now.
    I'm now looking for suggestions on workflow and managing my PSD files. I apologize if this is a little long, but I want to make sure I explain the problem clearly. Here's the scenario:
    - From iPhoto, I choose to edit an existing JPEG file in the external editor, Photoshop.
    - I perform my edits (including advanced edits with layers etc.)
    - I do both a "Save As..." and a "Save". That way, I "Save" the flattened JPEG back to my iPhoto library properly, and "Save As..." a PSD file in case I want to do further edits on the image later (days, weeks, months later).
    I'm looking for suggestions on where to put my PSDs. For now, I am saving the PSD to my desktop (using max compatibility), and then importing the PSD into my iPhoto library (into the same Event as the original JPEG). However, this leaves me with 2 copies of the same picture: 1 JPEG, and 1 PSD.
    The problem is, I now want to go back and do more tweaks on the PSD, the end result of which will make the JPEG version out-of-date. I can either
    1) Open both the PSD and JPEG versions at the same time, tweak the PSD, then copy and paste the flattened layers on top of the JPEG version and "Save"
    2) Open only the PSD, tweak, and save a copy as a JPEG onto my desktop and re-import into iPhoto.
    Although a little tedious, both options seem to work. Are there any other options? What do you suggest? I'm curious to see how others manage this.

    I don't have a solution for you (sorry) but I'd like to comment because I'm in a similar boat. However, instead of saving PSD files to the desktop, I've been advised by a professional photographer friend of mine to save these to TIFF format in Photoshop (I think he feels it's a more universal file format than PSD for archival purposes). The downside is the files are huge.
    I imported high resolution jpegs into iPhoto. I then use Photoshop to edit and save the flattened jpeg version back to iPhoto (thankfully, keywords are preserved this way). I recently realized that I need to save a copy of the edited file in a lossless format like TIFF or PSD. Unfortunately when I save the file to TIFF and import it back into iPhoto, keywords are lost. This is a drag.
    What I am trying to figure out is how I can retain my keyword info in the edited archival version. Any tips?
    I'm beginning to question whether I should import my photos in to Photoshop (or PS Bridge) first, save the original for archival purposes, do the majority of editing and save a TIFF/PSD file and then import this file into iPhoto for keywording, further editing, downsizing, etc.

  • How to automate the process of adding members into the planning workflow and assigning owners to it?

    Hi,
    We have a workflow XYZ. Every two to three days our entity structure is refreshed, so new entities come in.
    Every time, we have to manually add these new entities to the workflow and assign owners to them.
    Please let me know if there is any option to automate the process.
    Hyperion Planning version is 11.1.2.2 .
    Thanks.

    Hi, Vivek.
    Currently, ExportPDF can only handle 1 file at a time. Adobe Acrobat can do batch export to Excel, however.
    This idea has already been added to the ExportPDF Ideas list ("Export multiple..."). If you'd like to see this improvement to ExportPDF, please add your vote or comment here.
    Thanks.
    Dave

  • Workflow multiple versions in production

    Hi Experts,
    when I transport my workflow from development to production, multiple versions are getting created.
    E.g. in development I have version 0001, and production also has version 0001.
    I made some changes to version 0001 in development and transported it to production.
    Now a new version, 0002, is created in production.
    I did check the SAP help which says
    If a workflow definition is transported into another system,
    only the active version is transported.
    If the workflow definition exists in the target system with the same
    version number,  it is overwritten by the transported version if it has
    no workflows running. Otherwise, the transported workflow definition
    is saved with a free version number. The transported workflow
    definition becomes the active workflow definition in the target system
    but I checked the production server and the workflow is not running; still, multiple versions are getting created.
    any clue experts
    Regards,
    Umesh Chaudhari.

    Hi Umesh,
    Yes in your scenario you would get a new version created as an instance has already been started using the older version.  This state of affairs used to worry me from a QA perspective as well, but I have seen it now in many sites - both ones where I have built the workflows and ones where I have not - and I can state pretty conclusively that from an operational perspective it is not an issue.
    You could generate a new version in your DEV system whenever you transport, but I would not advise this as in this case it is possible to get your versions out of synch and actually deprecate your changes.  This is not only poor but proved very difficult to analyse (for my small brain anyway) at the site where this was attempted.
    In this case I am firmly of the opinion that I will let the transport system manage the versions in my production system.
    Kind Regards
    Gareth

  • Issue with Workflow template versions? Can any one suggest what to do.

    Hello All.
    Actually, in the quality system there is a workflow instance in error. I identified the error, fixed the issue with a new development, and transported it to quality. Now when I restart the errored instance in quality, it is not picking up the latest template with the additional steps I developed; instead, it is using the older version.
    For a fresh instance my newer version of the workflow is triggered; why not for the errored workflow?
    I used SWU_OBUF to synchronize the buffer, but no use. What could be the resolution?
    Please suggest.
    Regards
    Prasad.

    Hi Prasad
    Running, waiting, or in-error instances continue to run in the version in which they were started.
    That is why the version increments in QA/PRD when you move a change: so that any running, waiting, or errored instance can keep using the same version of the workflow.
    It will not pick up your changed version. Logically that would be incorrect: start in version A and end in version B. Say a workflow starts in a version which auto-approves a PO of value INR <= 100, goes into error, and is restarted in a changed version with an auto-approve limit of INR 1000. The PO will not be auto-approved, whereas on the day it was started the limit was only 100. So it makes sense to restart a workflow in the same version in which it was started; that is how the architecture has been designed.
    To conclude: if the steps went into error because of the data from your previous step (which you have now changed), see if you can change the container data and restart the work item from SWIA so that it starts with a new set of data. Otherwise, discuss with the business owners to cancel these workflows and start new ones.
    regards,
    Modak

  • Approval Workflow and Templates

    Hi
    When using approval workflow and templates, is there a workaround for the issue where we have to set approval and templates for each KM folder in a hierarchical structure?
    Would it be possible to set these settings on a top folder and have its subfolders inherit them?
    Regards
    Kay-Arne S. Aarlie

    Hi,
      You must configure the approval service for each folder; it is not inherited. But you can develop it yourself; at this link you can find the knowledge management and collaboration functions.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/kmc/knowledge management and collaboration developers guide.html
    Patricio.

  • Workflow active version

    In the workflow definition I have developed some workflows. Whenever I open a workflow, a warning message appears: 'You are not editing the active version'. Does this have any effect on the workflow?

    Hi,
    No, there is no defect in your workflow.
    A possible reason is that you switched back from the active version of the workflow to an inactive version and then edited the inactive version without activating it; you probably only saved the workflow.
    And since the workflow SWDD shows by default is the latest one created, you get the warning every time you open it.
    Possible solution:
    Activate the workflow, close SWDD, log off, and reopen SWDD. If you still see the message,
    create a sample workflow and activate it (making it the default), reopen SWDD, and then open your original workflow from there.
    Also, to see whether there is a configuration problem,
    go to transaction SWU3 (workflow administration) and check whether any cross mark is present in a node of the tree.

  • FM to read the different steps in workflow and the status at each step

    Hi All,
    Is there any FM in SAP which gives a detailed roadmap of the steps taken in a workflow and the different statuses or decisions at each step?
    I am looking for something like what you see when you click on "STARTED WORKFLOW" in the Business Workplace outbox,
    which shows the "steps in the process so far", along with the decision and the agents for each of them.
    Thanks,
    Charan.

    Hi,
      You can get the status per task; please check FM "SAP_WAPI_GET_WI_DELTA".
    Regards
    SM Nizamudeen

  • Error in creation of model and version

    Hi experts,
    While I am trying to create a Model and Version in the SAP demo systems, it shows the error "Live cache is not available".
    Should I copy a planning version to a new one, or should I contact the Basis team? It is also not allowing me to create a new LC connection.
    I am attaching screenshots. Please help.

    Hi All,
    The Basis team has configured liveCache, and we have started liveCache successfully.
    We are still facing a problem with the Version, stating that an error occurred in liveCache.
    Kindly help.
    Attaching a screenshot below.

  • Report for outstanding workflows and the agents responsible

    I need a report that shows all leave request workflows , with userid who created them and approver (agent), and approver's org unit and personnel number.
    is there a standard SAP report that can show that? or I need to develop one?
    I was looking at the SWI2_FREQ, this report shows the leave request workflows and userid who created them, but it does not report on the agent.
    Regards,
    Tiberiu

    Hi
    I think by making use of SAP_WAPI_WORKITEMS_TO_OBJECTS you can get all the work items for the leave requests, including who created them.
    Regards
    Pavan

  • Summary report to show all the software components and version installed

    We are using 64-bit Windows 2003, with Hyperion Planning and Essbase on 2 separate servers. Is there a way in Windows to get a summary report showing all the software components and versions installed, and whether each is the 32-bit or 64-bit version?
    Thanks!

    Refer steps here to delete SC file.:
    http://support.apple.com/kb/TS2363
    Then proceed to repair your QuickTime. START / CONTROL PANEL / ADD n REMOVE PROGRAMS / highlight QUICKTIME and click CHANGE then REPAIR.

  • Siebel & "BIP Report Generation" Workflow and Bookmark Syntax

    IHAC that wants to schedule the generation of a BIP report to run every weeknight and include a list of activities for the next day. We've utilized an OOTB IO and can successfully render the report using Sample XML in MS Word. Additionally, since they are only on 8.1.1, they do not currently have access to the latest scheduling capabilities. Therefore, we're trying to implement a repeating component to call a workflow to generate the report using the 'BIP Report Generation' workflow and the associated objects (from support posting 823360.1).
    Now to the questions:
    One of the parameters for the workflow is a 'Bookmark'. This parameter does not appear to handle various 'Siebel' querying functions like 'Today()' as part of the criteria. Can someone confirm this statement?
    Currently, I can imagine one high-level possible workaround. This would entail a revised WF that includes a prior step to generate the 'Bookmark' by retrieving a string representation of tomorrow's date and concatenating it with the rest of the criteria.
    Next question:
    Does anyone have any other possible alternatives? Maybe a calculated field in the BC/IC for 'Today() + 1' - but this could have performance implications.
    Thanks in advance for any help.

    Suggestions/comments?
    Bump.
