Archiving Workflow with SARA: WORKITEM_WRI and WORKITEM_DEL

I have been manually running the write and delete steps to archive completed workitems. I am now ready to set these jobs to run on a schedule through SM37. The write worked fine and created four archive files. However, the scheduled delete job only processed the first file. How can I define a scheduled job for the delete to use all files generated from the scheduled write job? I will not know how many archive files will be written. I just want the scheduled delete job to use them all.
Thanks

Hello Cliff,
The problem could be that your deletion job was stopped or terminated for one of various reasons.
You have to run the deletion program once again for the archive session number. You should get four deletion job logs and spool lists, one for each of the four archive files created.
Schedule delete job:
SARA > Delete > Select the Archive Session Number
Maintain Spool and Time > execute
Note: 1. The session number covers all the archive files created in that session.
2. There is no problem with your deletion program settings, because one archive file was already deleted successfully.
Thanks,
Ajay

Similar Messages

  • Workflow with item type and item key is in progress. Abort existing workflow

    Dear all,
    I'm using the code below, called from the submit button event, to launch the workflow. The workflow works fine when I submit the first time, but when I try to submit a second time from the same session it throws the error "Workflow with item type and item key is in progress. Abort existing workflow."
    This is where I am stuck; kindly send me any solution.
    I am using the below code in OAF:
    public void workflow(OAPageContext pageContext,
                         String headerId,
                         String empName,
                         String userName)
    {
        String wfItemType = "XXSample";
        String wfProcess = "XXSample";
        String wfItemKey = headerId;
        OANavigation wfClass = new OANavigation();
        // Create the workflow process
        wfClass.createProcess(pageContext, wfItemType, wfProcess, wfItemKey);
        System.out.println("Workflow created");
        wfClass.setItemAttrText(pageContext, wfItemType, wfItemKey,
                                "XXHDRID", headerId);
        wfClass.setItemAttrText(pageContext, wfItemType, wfItemKey,
                                "XXEMPNAME", empName);
        wfClass.setItemOwner(pageContext, wfItemType, wfItemKey, userName);
        // Start the workflow process
        wfClass.startProcess(pageContext, wfItemType, wfProcess, wfItemKey);
        System.out.println("Workflow started");
    }
    Thanks,
    Kumar.

    Item type and item key together form a unique record for the workflow session; you cannot have two instances of the workflow running for the same header id, in your case. Check whether a process already exists for the header id. If so, display a warning message without launching another workflow session; otherwise start the workflow process.
    Thanks
    Shree
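Shree's check can be sketched as a guard around the launch call. This is a simplified, self-contained illustration; the class and method names are hypothetical, and in a real OAF page you would test for an existing instance against the workflow tables rather than an in-memory set:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical guard: remembers which (item type, item key) pairs have been
// launched, so a second submit from the same session shows a warning instead
// of calling createProcess/startProcess again.
public class WorkflowGuard {
    private final Set<String> activeInstances = new HashSet<>();

    /** Returns true if the workflow may be launched, false if one already exists. */
    public boolean launchIfAbsent(String itemType, String itemKey) {
        String instanceId = itemType + ":" + itemKey;
        if (!activeInstances.add(instanceId)) {
            // A workflow with this item type and item key is already in
            // progress: warn the user instead of starting another instance.
            return false;
        }
        // Here the real code would call wfClass.createProcess(...) and
        // wfClass.startProcess(...) as in the snippet above.
        return true;
    }
}
```

In production the in-memory set would have to be replaced by a lookup against the workflow runtime tables, since another session may have launched the same item key.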

  • Tapeless workflow with FCP 5 and Canon XL H1

    Is it possible to establish a tapeless workflow with FCP 5 and a Canon XL H1 shooting in HD mode?
    We have to shoot many days of lectures and want to skip the import of the footage. The firestore FS-C can only record up to 7.5 hours in HD but we don't have the time to empty it overnight so it can be available the next day...
    We can connect a couple of 500 GB LaCie drives to a 17" MacBook Pro and want to use FCP to live-capture from the camera. Is that possible?
    Or did anyone try to do that with the Canon Console software (windows only) and then import its stream in FCP?
    Thanks

    No time to unload a drive when you have all night? Why would that be? Isn't this just a big drag and drop? Can't you let it start and go to dinner?
    If you shoot HDV you should also record it to tape, because disk drives can fail during the shoot.
    You can capture live from the camera's FireWire port, but again I'd shoot tape as well... use Capture Now, but also use the device control that works with that camera to get the timecode through too (probably FireWire HDV Basic).
    The camera also has an uncompressed output, but that would take a capture card and a fast disk array (faster than FireWire for sure) to capture the uncompressed HD.
    Jerry

  • Optimal Workflow with CS3, Lightroom and iPhoto

    I am using Lightroom and CS3 for my initial workflow, adding keyword tags, ratings and flags to my photos in LR. I would like to be able to work with these photos in LR and CS3 and transfer/copy the best of them into a "iPhoto" directory and import them into iPhoto with the CS3 work, meta data and keywords intact so I don't have to go through these changes again in iPhoto. I prefer CS3 for doing the work, but like the emailing, slideshow and iBook capabilities of iPhoto.
    Thanks in advance,
    Kevin

    Few people use both Lightroom and iPhoto: they do essentially the same thing but in different ways, and in the past iPhoto has found LR-processed photos unacceptable. They cannot operate on the same physical photos; you must export photos from LR and import them into iPhoto, since they both manage photos.
    If you want to use both, you will have to import the photos into each.
    Fortunately, external hard drives are cheap, and iPhoto is perfectly happy having its library on an EHD.
    LN

  • Workflows with SpeedGrade CC and Premiere CC

    Is there any new workflow related feature in the upcoming Speedgrade CC that hasn't been shown yet ?
    The new Lumetri Deep Color Engine looks like a fantastic idea, but unless I missed something you have to manually load each .look for each clip that gets a discrete color correction. That's not much of an issue for a multicam program, but for anything else that might have tens, hundreds or even thousands of different .look files to load onto their respective clips, I wonder how usable it is. Please correct me if I'm wrong and you have included some kind of auto .look linker.
    Also, is SpeedGrade CC still limited to importing .EDL, or is .XML import coming to the next version ?

    You can actually use the Lumetri effect on adjustment layers and on individual clips. That will allow for easily applying a production look to an entire project, or slicing your project into acts with their respective looks, or using it based on scenes.
    On the SpeedGrade side itself, we haven't changed the import structure for the version you're gonna get on June 17th.
    Cheers,
    Patrick

  • Panasonic AG-HPX500 workflow with FC - pros and cons?

    We are now shooting on Betacam SP, capturing to DV using a DSR-45P DVCAM deck.
    We edit in FCP 5.
    Now we are about to first upgrade our computers to MacBook Pro and Mac Pro and FCS 6.
    Next step is a new camcorder.
    We have been recommended to take a look at Panasonic AG-HPX500, which uses the P2 card.
    Have anyone of you excellent forum members started working with this camcorder?
    I understand that FCS 6 does have native support for the P2 card and if you have a MacBook Pro, you can plug the P2 card into the computer and copy the footage you want to use to a hard drive (or even edit directly on the P2 card).
    I'd be very glad if you'd share your opinions.
    We have this idea to start shooting and editing in HD and then downconvert to SD when creating the MPEG-2 for a SD DVD as output format. This would hopefully give the best quality to our MPEG-2 encoder.

    I've been using the P2 system for about 7 months and have to say it's great!
    I'm not sure what you mean about it not having native support in FCP. FCP 6 has all you need to get your material in and work with it using the Log and Transfer system within the app.
    I use a P2 reader to get the material in, and yes, you can use a laptop, but only the PowerBook type; the newer MacBook Pros have the smaller ExpressCard slot, so you can't transfer without a P2 viewer or reader. The good thing about the reader is it has 6 slots, so you can just set up your transfers and then go away for a bite, come back, and it's all done, rather than swapping cards etc.
    All in all a great workflow. The only problem I have experienced is a couple of cards I had became corrupted, although they played fine in the camera. I was able to record off the material via HD SDI and save it, it just wouldn't show up in the reader/viewer.
    The flexibility of the P2 system in terms of shooting formats, under and over cranking etc is fantastic, very much another step towards achieving filmic quality with digital equipment.
    I don't know if there are any real benefits to be had by shooting in HD and resolving down to SD. You'll end up just losing a lot of frame info in the transfer, and you'd be hard pushed to see any discernible difference between good quality SD and the route you propose.
    Having said that, investing in HD is a good move, as it will only get more widespread, and let's hope that the formats calm down a bit!

  • Strange problem with PP_ORDER with SARA

    Hello,
    We are facing a problem with archiving  of PP_ORDER object.
    We archived production orders 100000000096 to 100000000113, where:
    1. Order 100000000113 was the collective order for orders 1..104 to 1..112
    2. Order 100000000103 was the collective order for orders 1..96 to 1..102
    The subordinate orders were archived with SARA properly and deleted from the system. Unfortunately, orders 1..113 and 1..103 were not archived; they don't exist in the archive files.
    But when I enter 100000000113 in CO02, I get the message "Order is already deleted. Do you want to display the order?", and I can display it, even though it should have been archived but wasn't.
    Also, when I press the 'collective order overview' button in CO02 for order 100000000113, nothing happens.
    Any idea how to archive these two collective orders properly?
    If I now try another SARA PP_ORDER archiving run for orders 1..113 and 1..103, SARA searches for their subordinate orders, finds nothing, and archives nothing.
    Best regards,
    Michal

    The case is that the subordinate orders were archived with SARA properly and deleted from the system
    Check that the archiving sessions finished correctly in SARA -> Management and that the archive data is accessible on the storage system.
    Check in the Archive Explorer (SARI) that you can find the data, using the archive object and info structure.
    The data should not have been deleted if the archiving session had problems during the process.
    Regards
    Juan

  • User change attribute workflow with approval  problem

    Hello,
    I have a requirement to add account numbers to a user entry through a workflow with an approval process; the same user can have multiple account numbers. When the approver approves the user request, the account number is added to the user entity in LDAP.
    So I have created a Change Attribute workflow for the account number with these steps: initiate, Approval, Commit, Error_report.
    I am able to invoke this workflow through an IdentityXML call; from the OIM interfaces, approvers are able to approve, and the account number is persisted under the User entity.
    The problem: the request above goes through a staging system. When the user requests, it is in the initiate step; only when the approver approves the request is the info committed. So there are two stages here.
    When I submit two consecutive account number requests, both requests are in the initial stage. The approver approves the first request and it is persisted into the User entity. After that, the approver approves the second request and this account number overwrites the previous one. So here is the problem: the workflow is not adding the new account number; instead it replaces the last value in the list of account numbers for the User entity.
    I hope the above problem is understandable.
    It would be very helpful to find a solution to this.
    Thanks in advance,
    Srini.

    Thanks for the help. Having reinstalled OID/OAM a bunch of times to properly add our custom user object, nothing seems absurd. I tried running through your steps, but I'm still not getting the workflow button. I've customized create and delete workflows properly, but the change attribute is a mystery.
    I did the following:
    1) Selected a custom attribute in Attribute Access Control
    2) Changed its read access to Anyone
    3) Saved
    4) Changed its modify access to Anyone
    5) Saved
    6) Added a new Change Attribute workflow for the custom attribute
    7) Action #1: Request, added Anyone as participant and saved
    8) Action #2: External Action, selected attribute is the custom one
    9) Action #3: Commit
    10) Saved and enabled the workflow
    11) Restarted the Identity server
    12) Picked a user
    13) Opened his user profile
    14) Clicked Modify
    The custom attribute is still editable and has no Request a Change button.
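The overwriting described in the question is the classic difference between replace and append semantics on a multi-valued attribute. A minimal, self-contained sketch of the two behaviors (the class and method names are hypothetical, not the OIM/OID API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical user entity with a multi-valued "accountNumber" attribute.
public class UserEntity {
    private final Map<String, List<String>> attributes = new HashMap<>();

    // Replace semantics: the whole value list is overwritten. This is the
    // behavior observed when the second approval commits.
    public void replaceAttribute(String name, String value) {
        List<String> values = new ArrayList<>();
        values.add(value);
        attributes.put(name, values);
    }

    // Append semantics: the commit step reads the existing values and adds
    // the new one, so values from earlier approvals are preserved.
    public void appendAttribute(String name, String value) {
        attributes.computeIfAbsent(name, k -> new ArrayList<>()).add(value);
    }

    public List<String> get(String name) {
        return attributes.getOrDefault(name, new ArrayList<>());
    }
}
```

The fix, whatever the product-specific mechanism, is to make the commit step append to the existing value list rather than set it.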

  • Archive customer development with SARA

    We would like to archive our old, unused development objects so they don't occupy the database, disturb where-used lists, etc.
    It would be nice to have the possibility to recall such programs in some kind of read-only mode.
    Is there a way to archive development objects with transaction SARA or something similar?
    I haven't found any relevant archiving object for it. Do you have any hints?
    regards,
    Kris

    There is no SARA object. My solution follows Rob's suggestion: delete the unwanted objects and release the transport. This results in versions that you can later retrieve if necessary. I have written a small report for this, e.g. listing all programs in VRSD that are not present in TRDIR, with a one-click jump to standard version management. Other object types require checks in different tables, of course.
    You must take care that the version database remains intact, e.g. during system copies.
    Upside: very simple, using standard functionality.
    Downside: it is not integrated; you'd have to retrieve object dependencies manually by trial and error.
    Thomas
    P.S. db space can be neglected, it's more about where used lists, system upgrades, undocumented programs etc.
    Edited by: Thomas Zloch on Feb 4, 2011 6:48 PM
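Thomas's report boils down to a set difference: programs that exist in the version database (VRSD) but no longer in the program directory (TRDIR). The lookup logic can be sketched with stand-in collections (plain Java; the real report of course reads the SAP tables instead):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

public class DeletedProgramReport {
    // Returns the programs present in the version database but missing from
    // the program directory, i.e. deleted programs that can still be
    // retrieved through standard version management.
    public static List<String> deletedButVersioned(Set<String> vrsdPrograms,
                                                   Set<String> trdirPrograms) {
        List<String> result = new ArrayList<>();
        for (String program : vrsdPrograms) {
            if (!trdirPrograms.contains(program)) {
                result.add(program);
            }
        }
        return result;
    }
}
```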

  • Video Clip Archiving Workflow and Format

    I currently shoot a ton of video (and photos) with a 5d3 and I've ended up with a large and continually growing archive of video footage.
    I do a lot of photo editing, but I'm able to use Lightroom to clean up my collection to only keep the ones I need and prune the rest. Unfortunately I haven't been able to do the same for video. I edit in the creative suite (mainly Premiere Pro, but some AfterEffects), and I can easily identify the portions of the footage that make clips I'd like to keep.
    Is there a good workflow or file format to cut out and only save the clips I desire? I have lots of file space, but keeping around a ton of footage I will never use doesn't seem to be a good use of my space or time.
    Thanks a lot

    Adobe's Prelude program might help ... though not an ideal solution, it's sort of Bridge modified and re-conceptualized for video purposes. Its primary intent on creation was to allow "junior editors" to log vast amounts of, say, TV news footage, do a quick scrub to find the useful bits, and then add metadata/keywording to those while ingesting and also backing them up. If you do sub-clip things, it does transcode ... and it can do a pretty decent job of conversion to DNxHD or something equally "mezzanine" (as Jago says in the AdobeTV video on transcoding): something of low compression and high editing and archival use.
    Like Bridge, Prelude does NOT keep a database, so it's not really a DAM program in the way that Lightroom, Aperture, IdImager's "Photo Supreme", and several others are. I've put numerous requests on this in the Prelude queue using the feature/bug request form, and even talked with some of the folks on this program "live" while at NAB last year. Yes, they know it would be useful ... but it would be a TON of work to add database functionality to Prelude, and it isn't on the Must Do list of anybody working on the program.
    So ... Prelude can ingest, add metadata, transcode subclips and store them wherever. I know one person who figured he'd set up a folder structure that allowed him to "see" his shot types, something like: mountain streams; rivers; ocean; forest; general city views; traffic; highways; talking heads ... with "main" folders and subfolders. That way he can navigate to and browse "likely" folders as needed from PrPro.
    IdImager's PhotoSupreme is a very inclusive DAM program and it handles all file types ... stills as well as most video files. As it DOES have a database, once you've put metadata into a file and "shown" it to P-Supreme, it will be able to find it and show you where it is by keyword/meta search, whether it's "online" on a disc the computer can access live, or on a disc stored on a shelf. I don't know, but I doubt, that it can sub-clip or transcode video files.
    I worked with IdImager before it was bought out and transmogrified into PhotoSupreme. IdImager was a moderately easy DAM program ... which is to say, slightly less than rocket science. P-Supreme is much simpler to use, and has had much of the functionality of the original IdImager brought back.
    So ... you've options, though not necessarily the "one" you might wish. It's an imperfect world ... sigh ...
    Neil

  • What is your backup and/or archive workflow?

    I am a wedding photographer who has been using Ap for a few months now, and am trying to come up with a good backup system and a method to archive projects to save space on my MacBook Pro.
    Here is my simple workflow. I'd appreciate any recommendations for improving it.
    1. After a shoot I import all images directly into Aperture (managed). A typical wedding will yield 10-20 gigs of raw files (depends on number of shooters and how long).
    2. After importing, I update my Ap Vault on a 500gb external FW800 drive.
    3. As I work on the image the next few weeks I update my vault every day.
    4. When I'm completely done with the project I export the whole project to an "archives" folder on my external drive, and then delete it from my aperture library.
    5. I then update my vault again.
    6. Therefore, my vault only contains a backup of projects I'm currently working on (and are in my main aperture library), and I have a file with a growing number of individual projects that have been archived. If I want to access pictures from another project I have to re-import that project into my main library, and then delete it when I'm done.
    QUESTIONS:
    - Would it be easier for me to put all those archived projects into a second library for fast retrieval? How would you do that (I've noticed you can directly drag-and-drop them into the Aperture package...)?
    - Would you recommend also backing up to optical media, even though the projects are so large (20 GB)?
    - How can I use multiple vaults and/or libraries to make my life easier?
    Thanks in advance for your help!

    After all your advice, I think I will definitely start referencing my images on an external drive. I'll keep current projects on my local HDD, but after I'm done with one I'll move the referenced images to an external drive for archival purposes. I'll also use a combination of external drives and DVDs to back up (I'll never use solely optical media, as I've had to learn the hard way about the long-term integrity of optical discs....)
    When all is said and done, I imagine my workflow will work something like this:
    1. Shoot a wedding, importing all images into Aperture and referencing to all masters in a local folder.
    2. After the wedding, immediately backup the project (including referenced files) to a portable firewire drive.
    3. When I'm done working with the images, move the referenced files to an external "archive" drive.
    4. Export the project (including referenced files) to a double layer DVD and stick it in the clients folder with other paperwork, etc.
    5. Update my Aperture Vault (including referenced images) to a separate "backup" drive.
    This way I will have all my older images on an "archive" drive and linked to AP for easy reference. I'll also have DVD backups of each project, plus a huge backup drive with my AP vault with ALL projects and images. When this drive fills up, I'll buy a new one. Hard drives are cheap, and getting cheaper. Besides, that's just the cost of doing business.
    Hopefully future versions of AP will have integrated DVD burning, and the ability to span a project across multiple discs.

  • Export PDF Workflow with Applescript and CS3

    Hello,
    I am setting up some PDF workflow with Applescript.
    On a given moment, as my script runs and after getting some user-input answers to questions in some dialogs, my script tells InDesign CS3 to open the Export Adobe PDF window for the current document. I copied and pasted that small part of the script:
    tell application "Adobe InDesign CS3"
        tell document 1
            export format PDF type to "Macintosh_HD:Test01.pdf" using "somePreset" with showing options
        end tell
    end tell
    When you run this small part of my Applescript, InDesign opens the Export Adobe PDF window (as expected) waiting for me to click on "Export". That is exactly what I want, since the user is given here a last opportunity to change some values (for example page range, or spreads). When all is set, the user can click on Export to close the dialog and finish the script.
    Problem: I was hoping that the Adobe PDF preset "somePreset" would be selected in the first pull-down menu of the Export Adobe PDF window when this window is opened by the script. Unfortunately, the last-used preset is always selected by default. Any suggestions or help?
    Kind regards,
    Bertus Bolknak.

    My operators enter the page range and filename into a dialog box. Then I set those in the script. I use the Press Quality preset to start with and then set the changes I want into a export variable. I set things like bleed, marks, page range, etc.
    Here is an example:
    set theProps to properties of PDF export preset "[Press Quality]"
    try
        delete PDF export preset "Schmidt PDF"
    end try
    set theStyle to {name:"Schmidt PDF", acrobat compatibility:acrobat 7, bleed top:"0.125i", bleed bottom:"0.125i", bleed inside:"0.125i", bleed outside:"0.125i", page marks offset:"0.125i", include ICC profiles:Include None, effective PDF destination profile:use no profile, effective PDF X profile:"No Color Conversion"} & theProps
    make PDF export preset with properties theStyle
    set properties of PDF export preferences to theStyle
    set color bitmap sampling of PDF export preferences to none
    set grayscale bitmap sampling of PDF export preferences to none
    set page range of PDF export preferences to (item i of myPageList) as string
    export document 1 format PDF type to (PrinergyFolder & myJobNumFinal & "_" & VerCode & ".pdf") as Unicode text without showing options
    I am also doing this in Quark.

  • Mail workflow with variable recipient, subject, body and attachment

    Here's what I'm trying to solve:
    I do some payroll processing with Direct Deposit. A PDF file is created for each person's check stub, which I then e-mail to the person. Right now I have Excel generate a mailto: link which addresses the messages, inserts the subject and a brief note in the body. I then manually attach the PDF file and send.
    I'd like to be able to have a single workflow that I can drop the PDF file onto and have it mailed to the intended recipient. The filename can be setup to include any necessary info as it is generated by a very extensive VB script in Excel.
    But I can't figure out how to get the variable items into Automator.
    Thanks for any insight.
    Tim

    I was able to find a commercial product that works very well. MaxBulk Mailer from Max Programming allows you to import a list of recipients, merge with personalized message and designate an attachment (just image types and PDFs, not all types work) and send.
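Tim's idea of encoding the recipient and subject in the Excel-generated PDF filename needs a small parsing step before the send, whatever tool does the mailing. A sketch under an assumed naming convention of recipient_subject.pdf (both the convention and the class are hypothetical, for illustration only):

```java
// Hypothetical parser for filenames such as
// "jdoe@example.com_Check-Stub-May.pdf": everything before the first
// underscore is the recipient; the rest (minus the extension) is the subject,
// with hyphens standing in for spaces.
public class StubMailJob {
    public final String recipient;
    public final String subject;

    public StubMailJob(String fileName) {
        String base = fileName.endsWith(".pdf")
                ? fileName.substring(0, fileName.length() - 4)
                : fileName;
        int split = base.indexOf('_');
        if (split < 0) {
            throw new IllegalArgumentException("No recipient in: " + fileName);
        }
        recipient = base.substring(0, split);
        subject = base.substring(split + 1).replace('-', ' ');
    }
}
```

Since the Excel VB script already controls the filenames, it can emit whatever convention the parser expects; the parsed fields then feed the mail tool's recipient, subject, and attachment slots.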

  • Workflow Circuit between FCP and After Effects with HDV

    I am trying to set up a workflow circuit with FCP 5.1.1 and AE 7.0 using HDV. We are:
    1. building sequences with FCP
    2. exporting to AE
    3. adding in FX with AE
    4. putting those back into FCP
    The footage we are using is compressed with HDV 1080i60 in FCP.
    My problem is finding the correct way to export the HDV with FCP. I have tried using Quicktime Conversion but the only option I can find for uncompressed video didn't work when we tried to transfer it.
    I think the main problem is that the AE work is being done on PCs and the video that we have tried sending can't be read by PCs. The audio plays through fine but the video shown is nonexistent.
    What conversion should we use in FCP that allows us to keep the video uncompressed and in its best quality for use in AE? And what works best on the AE end of it?
    Can someone give some advice on steps that they would or have taken for a project of the same ilk.
    G5   Mac OS X (10.4.7)  

    What's your system setup? What FX exactly (e.g. color correction, or just random FX thrown in every now and then)? What went wrong when you tried using an uncompressed codec?
    1) If everything's on an external FAT32-formatted drive (fat chance, get it?), then I think you could export a QuickTime reference of your timeline (choose the option that ends up giving you a tiny file). Then plug the drive into your PC, load the QuickTime Reference (also known as QT Ref) into AE7 and FX away. Then export a QT with HDV 1080i60 specs and reimport into FCP.
    2) If you're only doing occasional FX (e.g. gun flashes, lasers from robots, etc.), and nothing that encompasses the entire length of the project (e.g. color correction), you could export a hashed-down QT Conversion of the timeline with whatever decent codec works (e.g. DV NTSC). Then import into AE7, lay down the video, and add your effects. But before you export your FX, turn OFF your original video layer so all your FX are on a transparent background. Export the clips with an alpha channel and reimport those into FCP.
    3) Ensure that the PC is using the latest version of quicktime. It may need to be upgraded to whatever version is on the Mac, and it may need to be registered as Pro.

  • SD and MM workflow with table

    Hi friends,
    Can anyone explain the SD and MM workflow with tables?
    Thanks in advance

    Here are the table flows:
    SD
    http://www.sapgenie.com/abap/tables_sd.htm
    MM
    http://www.sapgenie.com/abap/tables_mm.htm
    FI
    http://www.sapgenie.com/abap/tables_fi.htm
