Workflow Referencing another Workflow and logs

I have a requirement where the business would like a workflow to create the material master, but prior to actually creating the material they have a form that needs to be passed around and approved by many business units. Once all approvals are obtained, a material master can be created. What I would like is to create a custom BOR object and event to route the form and get approvals, but I would also like to somehow carry over or link this custom workflow with the SAP standard workflow for material master, so you can view the log of who approved what for this material. Any suggestions?
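What I have in mind, roughly, is that the final step of the custom approval workflow raises an event that the standard material master workflow is set up to listen to, and that the event container carries a reference back to the approval work item so the approval log can be reached from the material side. A minimal sketch of that step (the object type ZAPPRFORM, the event APPROVED and the container element APPROVAL_WF_ID are only illustrative placeholders, not standard names):

" Raised in the final step of the custom approval workflow, once all
" business units have approved the form.
data: lt_cont  type standard table of swr_cont,
      ls_cont  type swr_cont,
      lv_key   type c length 70,   " key of the custom form object
      lv_wi_id type sww_wiid,      " work item ID of the approval workflow
      lv_rc    type sy-subrc.

" Carry a pointer back to the approval workflow in the event container,
" so the standard material master workflow (and whoever reads its log)
" can find who approved what.
ls_cont-element = 'APPROVAL_WF_ID'.
ls_cont-value   = lv_wi_id.
append ls_cont to lt_cont.

call function 'SAP_WAPI_CREATE_EVENT'
  exporting
    object_type     = 'ZAPPRFORM'   " custom BOR object for the approval form
    object_key      = lv_key
    event           = 'APPROVED'    " custom event used as the triggering event
    commit_work     = 'X'
  importing
    return_code     = lv_rc
  tables
    input_container = lt_cont.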

Are you sure your post belongs in the DMS topic?

Similar Messages

  • What does the Stop Workflow and log... action do?

    I have used the stop workflow action in my SPD workflows but it doesn't seem to do what I thought it would.  It doesn't stop....it goes right on to the next step!
    I want to be able to say if a condition is true, stop the workflow, do not go any further, quit!
    If the condition is false, then go on to the next action or step.
    How do I get a workflow to completely stop even if there are more steps!
    Tearing hair out,
    Judy

    Hi Judy, 
    The Stop Workflow action really does stop the current workflow instance.
    The likely cause of your issue is that the condition does not get the value you expect.
    I suggest you put a Log to History List action before the condition to find out whether the condition value is correct or not,
    and then do some further troubleshooting.
    -lambert
    Posting is provided "AS IS" with no warranties, and confers no rights.

  • Best Practice Re: Workflows and Workflow Components

    Hello,
    As many know, CUP does not allow you to delete workflows and workflow components if they are referenced in a request somewhere. The solution I'm seeing here on the forums is to run a Request Delete Script from SAP. This is all well and good in a Sandbox environment, but we would not like to delete request information in Production.
    So my question is: what is best practice for managing workflows and their components in Production? As time goes on, our workflows will change, our components might change, we might need to add a stage here or there or switch an initiator, etc etc. However, if I create a new workflow or component, I can't delete the old one. So essentially there's potential to have a lot of junk workflow/component data sitting out there because it's referenced in an old request somewhere.
    Does anyone have any recommendations on how to manage this? Ex: If you change a workflow to add another stage, what are you doing with the old workflow since it's useless?
    Let me know if I need to clarify further.
    Thanks!!
    Jes

    f l,
    I'm not sure deleting keys from the registry is ever a best practice, however Xcelsius has listings in:
    HKEY_CURRENT_USER > Software > Business Objects > Xcelsius
    HKEY_LOCAL_MACHINE > SOFTWARE > Business Objects > Suite 12.0 > Xcelsius
    The current user folder holds temporary settings, such as how you've modified your interface.
    The local machine folder holds more important information.
    As always, it's recommended that you backup the registry and/or create a restore point before modifying or deleting any keys.
    As for directories, the only directory Xcelsius uses is the one you install to.  It also places some install logs in the temp directory, but they have no effect on the application.

  • Workflow and General Use Questions

    Hello,
    I'll apologize right off the bat for these novice questions because I'm sure the information is probably somewhere in the forum, I just haven't been able to find it. I just purchased Aperture after completing the demo as my library is getting too large to manage using standard file folders. I'm now trying to figure out the best practices for workflow and general use before I invest some serious time into importing and keywording all my pictures.
    1) Store files in their current location, or in the Aperture Library? It seems to me that once they are moved to the Aperture library, you can only access them from within Aperture. I'm thinking I would be better off leaving them in their current location. For one, if I want to quickly grab a picture as an attachment to an email or something, it seems easier to grab it from the standard folders. Second (and more important), I do not have room to keep all my pictures on my MacBook, thus most of them are stored on the Time Capsule.
    So... Keeping photos in their current location appears to be the best choice for me even though it adds an additional step every time I bring in new photos from my camera. Does this sound right?
    2) Is there a way to mark the photos that I have uploaded to my website (Smugmug)? Ideally, I would like to badge photos that have already been uploaded so I can quickly recognize them and ensure I'm not duplicating. I've considered using the rating, or keywords to indicate that a photo has been uploaded but both methods have disadvantages.
    3) Any suggestions for general workflow and organization resources (tutorials, books, websites, etc.)? I've looked at the videos on Apple's site but they obviously didn't get that detailed.
    Thanks for the help, sorry for the length.

    I recommend managing by reference, with Master image files stored on external hard drives (note that Aperture defaults to a Managed-Library configuration rather than a Referenced-Masters Library). This is especially important for iMacs and laptops with a single internal drive. The workflow described below, from an earlier post of mine, uses a Referenced-Masters Library.
    I feel pretty strongly that card-to-Aperture or camera-to-Aperture handling of original images puts originals at unnecessary risk. I suggest this workflow, first using the Finder (not Aperture) to copy images from CF card to computer hard drive:
    • Remove the memory card from the camera and insert it into a memory card reader. Faster readers and faster cards are preferable.
    • Finder-copy images from memory card to a labeled folder on the intended permanent Masters location hard drive.
    • Eject memory card.
    • Burn backup hard drive or DVD copies of the original images (optional but strongly recommended backup step).
    • Eject backup hard drive(s) or DVDs.
    • From within Aperture, import images from the hard drive folder into Aperture selecting "Store files in their current location." This is called "referenced images." During import is the best time to also add keywords, but that is another discussion.
    • Review pix for completeness (e.g. a 500-pic shoot has 500 valid images showing in Aperture).
    • Reformat memory card in camera, and archive originals off site on hard drives and/or on DVDs.
    Note that the "eject" steps above are important in order to avoid mistakenly working on removable media/backups.
    Also note with a Referenced-Masters Library that use of the "Vault" backup routine backs up the Library only, not the Masters. Masters should be separately backed up, IMO a good thing from a workflow and data security standpoint.
    Max out RAM in your MB and keep the internal drive less than 70% full.
    Good luck!
    -Allen Wicks

  • User decision history report, workflow summary log

    Hello SAP Workflow community,
    In our project there is a task to implement User Decision history/log management. I want to programmatically get (via ABAP, in order to be able to store this data in the DB too) the history of all user decisions (statuses of all User Decision work items) inside a specific workflow, and send a summary report to the workflow initiator/workflow administrator upon workflow completion.
    After some searching I found the complete set of relevant function modules under SAP_WAPI_WORKITEM_*, where you can get detailed information per work item, including their statuses, but I'm not sure that this is the best practice for implementing such a task.
    Of course, we can go «brute force» and build such a log manually: log each user decision, store it in a multiline container, and after that find some way to send it to the desired person. But I'm pretty sure that there must be another way, some standard, SAP-provided way to build such a summary report. Could you please point me in the relevant direction?
    The bottom line is: at the end of each workflow execution I want to send the workflow initiator a report with the list of all User Decisions, their statuses (approved/declined) and the username of whoever took the decision.
    Thanks.

    GOS -> Workflow -> Workflow overview should show the workflow log with the status of the workflow until now (work item name / start date & time / end date & time / decision / status / user name).
    Table SWWWIHEAD has the fields WI_CD (creation date), WI_AED (end date) and WI_AAGENT (actual agent).
    Table SWW_WI2OBJ has the link to the object.
    Table SWWLOGHIST has the work item history.
    You could use the way suggested by Karri, since the decision is hard to read directly. But for workflow reports I usually use BW; see Workflow reporting in BW – extractor improvements.
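    Building on the SAP_WAPI modules and the tables above, a minimal sketch of collecting the decisions of one workflow instance could look like the following (lv_top_wi_id, the generic decision task TS00008267 and the container element _WI_RESULT are assumptions you would adapt to your own decision tasks):

    data: lt_links     type standard table of sww_wi2obj,
          ls_link      type sww_wi2obj,
          ls_head      type swwwihead,
          lt_cont      type standard table of swr_cont,
          ls_cont      type swr_cont,
          lv_top_wi_id type sww_wiid.   " ID of the workflow instance

    " all work items that belong to this workflow instance
    select * from sww_wi2obj into table lt_links
      where top_wi_id = lv_top_wi_id.

    loop at lt_links into ls_link.
      clear: lt_cont, ls_cont.
      select single * from swwwihead into ls_head
        where wi_id = ls_link-wi_id.
      check ls_head-wi_rh_task = 'TS00008267'.   " keep only the user decision steps

      call function 'SAP_WAPI_READ_CONTAINER'
        exporting
          workitem_id      = ls_head-wi_id
        tables
          simple_container = lt_cont.
      read table lt_cont into ls_cont with key element = '_WI_RESULT'.

      " ls_head-wi_aagent = user who decided, ls_head-wi_aed = completion date,
      " ls_cont-value     = chosen decision alternative (e.g. 0001 = approved)
      " ... collect these into a summary table and send it in the last workflow step
    endloop.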

  • Tapeless workflows and Sandy Bridge or other PC's: KISS or LOVE?

    Life used to be so simple when shooting video on a tape based camera. You shot your material, captured it for editing and stored your precious original footage on tape in a safe and dry place. Sure, it took time to capture, but the big advantage was that if you had a computer or drive failure, you would still have the original tape so everything could be recreated.
    Now with tapeless workflows we have the significant advantage of much faster import of the original footage. Connect the flash card or disk drive to the computer over USB and copy the data to a HDD on the computer, ready for editing. The data on the flash card or disk drive can then be erased, so you can reuse it for more shots. But, like Johan Cruyff has said repeatedly, every advantage has its drawback. In this case it simply means that you no longer have the original material to fall back on, in case of computer or drive failures. That is a very unpleasant and insecure feeling.
    The easy answer to that problem is backups. Backup of the original media, backup of projects and backup of exports. This often means a bundle of externals for backup or NAS configurations. One thing is clear: it requires discipline to make regular backups and it costs time, as well as a number of disks. Four as a minimum: 1 for media, 1 for exports and at least 2 for projects. Note: this is excluding a backup drive for OS & programs.
    There are different backup strategies in use. Some say back up daily and use one disk for Monday, one for Tuesday, and so on. Others say one disk for the first backup, the second for the second backup, then the first again for an incremental backup, etc., and once weekly a complete backup on a third disk. Whatever you choose, be aware that the shelf life of a disk is far less than that of tape. There are horror stories everywhere about ball-bearings getting stuck after some time, and without original tapes you'd better be safe than sorry, so don't skimp on backups.
    What is the relevancy of all this? I thought this was about Sandy Bridge and other PC's.
    It is and let me try to explain.
    Card-based cameras are for the most part DSLR and AVCHD type cameras, and we all know how much muscle is required to edit that in a convenient way. Adobe suggests in the system requirements to use raid configurations for HD editing, and practice has shown that raid arrays do give a significant performance boost and improve responsiveness, making for a nicer editing experience. The larger the project and the longer the time-line, the more a raid array will help maintain the responsiveness.
    One thing you would not do is use a raid0 for projects, media and exports, even if you have backups. The simple reason is that the chance of disk failure multiplies by the number of disks in the raid0. Two disks double the chance of disk failure, three disks triple the chance, four disks quadruple the chance, etc.
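    To put an illustrative number on it: if a single drive has, say, a 2% chance of failing in a given year, a four-disk raid0 fails whenever any one of its disks does, so the chance becomes 1 - 0.98^4, roughly 7.8%, about four times the single-drive risk - and with raid0 a single failure takes the whole array with it.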
    Remember: Disaster always strikes when it is most inconvenient.
    Imagine you have been working all day on a project, you decide to call it a day and make your daily backup, but then the raid fails before you have made your backup. Gone is all of today's work. Then take into consideration the time and effort it takes to restore your backups to the state they were in yesterday. That does not make you happy.
    Another thing to avoid is using a software or mobo based parity raid, for the simple reason that it is slooowww and puts a burden on the CPU, which you want to use for editing, not housekeeping.
    For temporary or easily recreated files, like the page-file, media cache, media cache database and preview files, it is very much advised to use a raid0. It makes everything a lot snappier and if disaster strikes, so what? These are easily recreated in a short time.
    This was a general overview of what is required with tapeless workflows. Now let's get down to what this means in terms of system design.
    Two approaches, or trains of thought
    KISS: Keep it stupidly simple or LOVE: Laughing over video editing
    The first one, the most economic one, is to use a system with 3 or 4 disks internally and 4 or more backup disks.
    A typical disk setup can look like this:
    This is a perfectly sensible approach if one does not have large or complex projects or long time-lines, and is willing to take the risk of occasionally losing a whole day's work between backups. Many hobbyists and consumers fall into this category.
    The KISS approach keeps it stupidly simple. The drawback is that there is no logical way to add more disks or storage. The discipline, diligence and effort required for regular backups make it far from a laughing matter; in fact it can quickly become a bore. Add to that the fact that the disk setup is simple but not very fast, so it is less suited to situations where lots of clips are involved, multi-cam is a regularly recurring situation, or lots of video tracks are involved.
    A number of video editors want more from their system than the occasional platonic KISS; they want to really LOVE their system, which leads to the other train of thought.
    This is more costly than the KISS approach, but you all know a fiancée or wife is more costly and dear than the occasional kiss on the cheek by an old friend.
    Let's start with a typical disk setup. It may look like this:
    Two striking differences in comparison to the KISS approach:
    1. Much easier disk organization and more disks and thus more space.
    2. It requires a hardware raid controller, causing a higher investment cost. It is like an engagement ring. You don't get LOVE for free, one of the guiding principles of the oldest trade in the world.
    These are easy statements to make, but what are the benefits or advantages that would make you fall in LOVE with such a system, and what are the drawbacks? Think back to Johan Cruyff's adage.
    The only drawback is cost. The advantages are multiple: easier organization, more speed, more storage, snappier editing, no jerkiness, lesser requirements for regular backups and - this is the major benefit - hardly a chance of losing a day's work in case of a drive failure. Keep in mind that a parity raid keeps all your data intact in case of a drive failure, so it lessens the need for up-to-date backups.
    We all know we get what we pay for: "If you pay peanuts, you get monkeys. OTOH, if you pay money to monkeys, you get rich monkeys." But in this case you get what you pay for: a much better editing experience with a much easier workflow.
    Using a parity raid (be it raid 3/5/6/30/50/60) you get security and peace of mind: you are protected against losing precious media, you need not worry about when you last made a backup or whether today's editing may be lost, and you save valuable time and a lot of aggravation because of a much more responsive system.
    How does this all relate to Sandy Bridge and other PC's?
    First of all, the price difference between a Sandy Bridge / P67 platform and an i7-950+ / X58 platform is very small. Of course the new architecture is slightly more expensive than the older one, but the differences are small, almost not worth talking about.
    So what are the differences? Look below:
    The first thing to keep in mind is that the Sandy Bridge is the successor of the i7-8xx CPU and as such it is much more evolutionary than revolutionary. The CPU power has increased significantly over the i7-8xx due to new architecture and a smaller production process (32 nm), but in essence all the capabilities have remained unchanged. Same memory, same PCI-e lanes, same version, same L3 cache and no support for dedicated raid controllers.
    It is great that the processor performs much better than the older i7-8xx CPU's, almost achieving the level of the i7-9xx range of processors, but it is still limited:
    The Sandy Bridge is unsuitable for anything more than a KISS system.
    Why? Because it lacks the required PCI-e lanes to accommodate more than a 16x PCI-e nVidia card with CUDA support to enable hardware MPE acceleration, and the integrated graphics are not supported by CS5.
    You may wonder if that is a bad thing. The plain and simple answer is NO. It is a great processor, it delivers great value for money and is a solid performer, but it has its limitations. Intel had a reason to position this CPU as a mid-level CPU, because that is what it is: a mid-level performer in comparison to what is to come.
    The term mid-level performer may seem strange when compared to the old generation of i7-9xx CPU's, because they perform almost equally well, but keep in mind that there is a generation difference between them.
    So what about the i7-9xx and X58 platform?
    It still is going strong. About the same performance as a Sandy Bridge, with only the much more expensive hexa-cores clearly in the lead, both performance and price wise. The quad cores deliver about the same value for money.  The main difference however is the platform that allows a dedicated raid controller to be installed, thus making it the platform of choice for those who want to go from a passing KISS to true LOVE.
    And what lies ahead?
    Sandy Bridge E on the Waimea platform (X68). Now that is revolutionary. More than double almost everything a processor can offer: double the cores, double the PCI-e lanes, triple the memory, more than double the L3 cache, increase the PCI-e support from 2.0 to 3.0, etc...
    This is why Intel calls this a high-end CPU / platform.
    So what now?
    If you prefer a KISS approach, choose either a Sandy Bridge/P67 or an i7-950+/X58 platform.
    If you wonder whether in the future you may need multi-cam more frequently, edit more complex projects and longer timelines or even progress to RED, look at KISS/LOVE solutions, meaning the i7-950+/X58.
    If you can't have downtime, time pressure is high, delivery dates to clients are critical or you edit highly complex projects, lots of multi-cam situations or lengthy time-lines, choose a LOVE solution, an i7-950+/X58 platform.
    If you have the time to wait till Q4/2011, Sandy Bridge E/Waimea looks to be worth the wait.
    Hope this gives you some more insight into recent and future developments and helps you make wise investment decisions.

    I'm upgrading from an AMD 3800+, cutting with Vegas 7 Pro. Usually shoot DSLR or HDV, sometimes P2, EX or RED. I have ridiculously cheap access to Macs, FCP/FCS, all kinds of software.
    I've been agonizing over this for the last month; I was originally hoping the UD7 mobo was the solution, read the thread about the NF200/PCIe issue a few days ago, http://www.dvinfo.net/forum/non-linear-editing-pc/489424-i7-980x-now-wait-sandybridge-2.html - and still decided to go for a 2600k.
    My preference is to treat my video footage the same way as my digital imagery: I make (at least) duplicate backups of everything before reformatting the cards, never delete the backups, and only worry about the day-to-day stuff at night. Unless I'm rendering or involved in other long processes, in which case I'll back up the work in progress the next day. If I am under a really really tight deadline I might back up as I go.
    Yes, a RAID might make it easier, but I'm paranoid enough to prefer a slower, safer backup. You can always duplicate, and usually improve upon, a day's work, but you can never get back original footage you lost. I have only ever had one hard drive die on me (a few enclosures crapped out, though) - it took a couple of (mostly unattended) hours to rectify. As a matter of fact, I've had far more loss/damage from tapes than from hard drives.
    I ordered the UD7, 2 F4s and 4 F3Rs, understanding I will probably want to upgrade to SBE when it comes out, or maybe next year. The 2600k/mobo/RAM will likely hold its value better than a 950/X58, likely because of the marketplace as much as merit.
    The UD7 / RAID card issue is in its early days; there may be a solution/mitigation. Probably not. But if I really really need a RAID card, then I probably really really need a 980, NAS, etc. etc.
    But Harm still rocks!

  • Workflow and ABAP OO

    Hi,
    I have read the excellent blogs on Workflow and ABAP OO by Jocelyn Dart (/people/jocelyn.dart/blog/2007/07/16/referencing-bor-objects-in-abap-oo-classes). I have successfully created a BO (of type SIBFLPORB) in my class. However, I cannot figure out how to retrieve the work item container that holds the BO. I have used the macros before in workflow programming exits, but there I get the container from the work item context. I can't find the work item context in my class. Please provide me with some code samples of how I can retrieve the work item container that holds the BO.
    I am considering passing the work item ID to the method and reading the container via SAP_WAPI_READ_CONTAINER. If this is a good solution, then I see no point in having the BO as an attribute of the class. Is it a good solution?
    Thanks!
    Elvez

    I am probably missing something obvious here (or we are not even talking about the same thing), but I will still try to clarify my point.
    >I could of course just pass the necessary data as parameters to the class
    This is the part that I don't really understand. If you already have an instance of a class (= there is an object instantiated) which has the BO as an attribute, there shouldn't be any need to change the signature of any method(?). The only thing you need to enhance is the constructor of your class.
    Currently you create the BO instance as an attribute in your constructor method (isn't this correct?). Just create the attributes you need from the BO in your class, and "populate" them in your constructor. Or if it is a method that you need, just create a similar method (= you can copy the code) in your class.
    This all might be much easier than using the WF macros, which shouldn't be that difficult either - as Jocelyn says:
    "You will still need to use BOR macros in your ABAP OO classes if and only if you want to call BOR attributes and methods in your ABAP OO methods.
    If you need to do this then make sure you include the BOR macro definitions in your class by putting the special include program <cntn02> in the "Macros" section of your ABAP OO class."
    Regards,
    Karri
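    That said, if you do go the SAP_WAPI route you mention, a minimal sketch could look like this (the work item ID would be passed into your method, and 'ZMATERIAL' stands for whatever container element you actually need):

    data: lv_workitem_id type sww_wiid,   " passed into your method
          lt_container   type standard table of swr_cont,
          ls_container   type swr_cont.

    call function 'SAP_WAPI_READ_CONTAINER'
      exporting
        workitem_id      = lv_workitem_id
      tables
        simple_container = lt_container.

    read table lt_container into ls_container with key element = 'ZMATERIAL'.
    " ls_container-value now holds the element value in character form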

  • Workflow and WD integration

    Hi,
    We are implementing Workflow and Web Dynpro ABAP integration. An employee submits competences from the portal; this triggers an approval task to the manager. The manager approves the task from the worklist, and when he clicks on the task, a Web Dynpro application opens up where he has to approve or reject accordingly.
    We have approve and reject buttons in Web Dynpro and we have used the User Decision step type in Workflow to open the Web Dynpro application. The user decision step decisions are mapped to the Web Dynpro approve/reject actions.
    So we are able to successfully launch the Web Dynpro application and perform the actions. But the problem is that the rest of the workflow after the User Decision step is not getting executed. The workflow is getting stopped at this point.
    We have tried using the FMs SAP_WAPI_DECISION_COMPLETE and SAP_WAPI_WORKITEM_COMPLETE.
    Both these FMs complete the User Decision step but do not execute the rest of the workflow.
    Please help me if you have come across this scenario.

    Hello Archana !
             After the user decision step, you have maintained the step that updates the database as the immediate next step.
             Have you tested this step separately? If it has yielded successful results, check the binding for the step immediately after the user decision.
             Also, check in the workflow log whether the containers for the step that updates the database are populated with the required values.
             Is the method that updates the database a custom method?
    Regards,
    S.Suresh.
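    For reference, a minimal sketch of completing the decision from the Web Dynpro action handler; whether the next step then runs still depends on the binding checks above. The decision key '0001' (first alternative) and the variable names are illustrative, and do_commit = 'X' matters because the workflow runtime needs the commit to continue:

    data: lv_workitem_id type sww_wiid,   " work item ID passed to the WD application
          lv_rc          type sy-subrc.

    call function 'SAP_WAPI_DECISION_COMPLETE'
      exporting
        workitem_id  = lv_workitem_id
        decision_key = '0001'            " decision alternative (e.g. Approve)
        do_commit    = 'X'               " commit so the workflow continues with the next step
      importing
        return_code  = lv_rc.

    if lv_rc <> 0.
      " check message_lines / the workflow log (SWI1) for the reason
    endif.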

  • Reading user input from a form within a workflow and perform actions in workflow based on the input

    Sharepoint 2013
    Need to get input from a user based on some condition within a workflow and, based on the input received, continue with the workflow. It can be a form with a text box and a button to which I can redirect; when the user enters a value and clicks on the button,
    I should come back to the workflow and perform other processing. I should also be able to manually start this workflow from VS.
    Tried different approaches like initiation forms, the user input action of SP2010, etc.; all of these approaches either add some tasks to the task list or force me to click on the workflow link to get input from a user.
    Any suggestions on this?

    Hello
    Thanks for the code, but I don't need an array of beans. By the way, does this code make a bean and an ArrayList every time it's called?
    I was looking for something like this:
    <form action="myjsp.jsp" method="post">
    ...so after submitting, the result will go to the myjsp.jsp file, and in the myjsp.jsp file:
    <jsp:useBean id="value" class="myBean" />
    <jsp:setProperty name="value" ... />
    ...so every time I click the add button, the values will go to the myjsp.jsp file and that will set them in the JavaBean. This method uses two files, but I was looking for a way to do this in the same JSP file instead of sending it to another file.
    Cheers
    Ehsan

  • RAW PHOTO WORKFLOW AND BACKUP?

    I have a Canon 20D which is set to shoot both JPEGs and RAW photos. When I plug my card reader into my Mac, it automatically opens iPhoto 5 and then I download the images. I get two thumbnails; one says it is a RAW, the other a JPEG. But:
    1) I have no idea how to back up the RAW images to a DVD or to my external hard drive. And (see below for a related problem)
    2) I don't know how to work with the RAWs. I do own Photoshop, but I tried dragging the thumbnail into Photoshop, and it tells me it could not complete the task because it is the wrong kind of document. I had the same problem when just trying to drag those RAWs onto my desktop. Is this because I am just dragging thumbnails, and if that's the case, where do my RAWs reside in iPhoto?
    3) And, even if I can figure out how to get the RAWs into Photoshop, when I go to save them, how and where should I save them? And if I want to use the corrected RAW to send for printing, do I need to change to a different format? And should I save the new file in iPhoto?
    4) Last, I get the feeling that maybe the RAW photos I have imported from my cards into iPhoto (then deleted the originals) now no longer exist as RAWs anywhere. Does iPhoto not actually import the RAW, but just a RAW thumbnail?
    Sorry for all the questions that might seem elementary, but I have never had a digital SLR and I really need to establish a proper workflow and make sure to back up my most important photos.
    Many thanks for all suggestions!

    1. Click on the Finder icon on the Dock to open the Finder, open the Pictures folder, and you will see several dated folders (folders named as such: 2005-11-20, 2005-01-07, 2006-01-07, etc.). Each folder contains pictures you downloaded from your camera on those dates, but only photos you have not deleted.
    2. Down the bottom somewhere (in the Pictures folder) you will see iPhoto Library. Open this folder and you will see more folders, but dated as follows: 2004, 2005, 2006, etc. Open one of these folders, and you will see other folders, but numbered as follows: 01, 02, 03, 04, 05, 06, 11, 10, etc.
    Not every one of these numbered folders will have an "Originals" folder, but some will.
    Whatever you do, don't change the folder's names, or structure of anything in the iPhoto's Library. I have no idea if you should drag the original photos out of the Originals folder, but I would think that you can duplicate the photos, and drag only the copies into another folder on the desktop.
    Keep in mind that you can set the iPhoto Preferences so that Photoshop Elements is set as the photo editing application. If you set iPhoto to do that for you, when you double-click on any of the photos in the iPhoto window or display, Elements will automatically launch and open the photo for you. If the photo is a RAW or JPEG, Elements will show it as such at the bottom left corner. If you want to leave the RAW photo intact, just rename the copy you are working with before you save it (don't save any photo with its original name, to avoid changing the original). However, you can save an original that has been changed in a format such as TIFF, and in another folder, on a CD, another hard drive, etc. TIFF files are not compressed and take up lots of space on the hard drive, but the photo does not degrade. Every time you work with a JPEG image it degrades.
    Again, wait for others to respond to your post. They may have better ideas than mine.

  • Regd. workflow and incompletion procedures

    Hi,
    Can workflows or events be triggered from incompletion procedures? For example, when a sales order is incomplete and saved, a workflow should start which handles the process...
    Regards,
    Vijay

    Hi~ vijay.
    I'm not sure that there is a standard workflow for it,
    but I think it's possible.
    Why don't you try something like this:
    1) Make your own workflow template.
    2) Find some place where you can add code... I mean some kind of
       user exit.
       In the case of a sales order,
       edit the user exit USEREXIT_SAVE_DOCUMENT and
       check the fields of your incompletion procedure.
       For example:
    if vbap-taxm1 is initial or
       vbap-werks is initial.
      " ... add checks for the other fields of your incompletion procedure here
      call function 'SWE_EVENT_CREATE'
        exporting
          objtype = 'BUS2032'       " standard sales order object type
          objkey  = vbak-vbeln      " key of the sales order
          event   = 'ZINCOMPLETE'.  " your custom event that starts the workflow
    endif.
    But there are a lot of incompletion logs for sales orders, so whenever an order is created
    and saved with an incompletion log, it calls a workflow,
    and somebody gets a work item... I think it's a little bit cumbersome.
    In transaction V.02 you can check all the incomplete documents.
    Why don't you just use V.02?
    I hope this helps you.
    Kyung Woo

  • Workflow Process Log not displayed on UI

    The client recently performed an upgrade in their development environment from 7.0 to 8.0.0.4, and everything appears to be working properly; however, the Workflow Process Logs in the UI under Administration - Business Process\Workflow Process Log are not being displayed. All that we currently see are the old log files from the previous upgrade. We presently receive the Workflow Process log files in the Siebsrvr\log directory, but would like to know why the log files are not generating in the UI within the Workflow Process Log View. Is there some setting or parameter which needs to be set in order to get this activated again?

    Could you tell us what you did to fix this? That would help others who run into the same problem.
    Axel

  • Workflow and user decisions

    Hi Experts,
    I don't have much experience in workflow and I have to code something. At present we have a task which takes the user to transaction MIR4, and then the task is considered completed.
    The user wants to go to transaction MIR4; only if any changes are made and the Save button is explicitly clicked should the workflow task/item be considered closed. Otherwise the user should be prompted whether they want to complete or cancel.
    If it is a complete, then the document is saved; if not, the workflow task/item stays in his/her inbox.
    My doubts are:
    1. How can I prompt the user whether they want to complete or cancel? I am thinking of a User Decision; would that be right?
    2. How can I manage to keep the workflow task in the inbox? Should I do a loop back to itself?
    3. How can I find out whether the document was saved or not? I am thinking of change documents.
    Can you please advice?
    RS

    Hello Reena.
    I would like to attempt,
    1. Yes.
    2. No. With a user decision you generally get three options - (say) Accept, Reject, and Cancel, where Cancel keeps the work item in the inbox. So there is no need to use a loop.
    3. Yes and no. I suggest you use the concept of Change Documents, or alternatively use the event trace and check in the event log whether the material (say) was created or not.
    Hope this assists you in your decisions.
    Good Luck & Regards.
    Harsh Dave
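    If you go the Change Documents route from point 3, a minimal sketch of checking from ABAP whether anything was saved could look like this (the object class 'MATERIAL' and the variables are only illustrative; for the MIR4 invoice you would use the object class and key of that document):

    data: lt_cdhdr type standard table of cdhdr,
          lv_matnr type matnr,              " key of the changed object (illustrative)
          lv_objid type cdhdr-objectid.

    lv_objid = lv_matnr.

    call function 'CHANGEDOCUMENT_READ_HEADERS'
      exporting
        objectclass = 'MATERIAL'            " change document object class (illustrative)
        objectid    = lv_objid
      tables
        i_cdhdr     = lt_cdhdr
      exceptions
        others      = 1.

    if sy-subrc = 0 and lt_cdhdr is not initial.
      " at least one change document exists, i.e. the document was saved/changed
    endif.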

  • Workflow and Files Integration

    Hello all,
    I have successfully configured my files domain and run through the steps for integration with workflow and files. All seems to be OK, except there are no review processes showing up within files.
    I know in release 1 the demo workflows were loaded and tied into files.
    Anyone know how to get these into the files screens so I can demo the workflow portion of files?
    Thanks,
    Dennis

  • Workflow and Status tracking

    Dear All,
    We should implement some kind of WORKFLOW solution to support the business planning process developed in SAP BI Integrated Planning.
    We analyzed the Status and Tracking System in SEM-BPS, but it does not seem to be the best solution for us, and it does not support Integrated Planning (am I right?).
    What are the possible solutions for workflow-based process handling in an Integrated Planning solution, and what are the possible (even non-SAP) alternatives?
    Thanks in advance,
    Dezso

    Hello Ajay,
    When I execute the Web Interface, the following error appears:
    Unable to find:http://sdevsapbw:8002/sap/bc/bsp/sap/z100upsyal00001/z100upsyal00001.htm%3Fsap-client
    I'm no expert in this kind of solution, but our queries are executed on the web.
    I arrived on this project two weeks ago; I tried to execute the queries on my computer and I am unable to do so. The error is the same when executing the Web Interface for BPS.
    I compared my internet settings with another colleague's and they are the same.
    Do I have to do any configuration or install any Add-on?
    Thanks,
    João Arvanas
