Workflow and branching

Hi Experts!
I am creating a workflow for a material creation scenario. I am not using LDAP; all users and roles are defined in SAP MDM.
I have 10 plants and a two-level approver chain for each of 6 material types.
That means there are 6 material types with different approvers, and the approvers in turn differ across the 10 plants.
So I want to know which is the better way:
1. Maintaining 10 workflows, one per plant, and then branching for the various material types.
2. Creating one workflow that determines the plant and calls the other workflows from there.
If you have any other ideas, please share.
Thanks in advance!
Regards,
Ravz

Hello Ravz,
Your scenario is best suited to modeling outside MDM (possibly in BPM).
Modeling this in MDM will be more complicated from both an implementation and a maintenance point of view.
You will probably end up with a lot of customization if you decide to model it in MDM,
because you cannot dynamically assign approvers in an MDM workflow.
Also, you cannot access the modifier (i.e. the user) and its role details anywhere in MDM (assignments, validations, or workflows);
hence you need to keep this plant-to-approver mapping somewhere else.
Also consider that you cannot use check-in/check-out operations at the tuple or qualified level.
This will lead to dirty reads, because you have to save modifications on the original record before approval,
and you cannot revert the modifications in case of rejection.
Syndication of tuple- or qualified-level changes is another headache.
Designing this scenario in MDM will become a major maintenance nightmare after go-live.
One design option may solve most of the above problems.
That is: create a dedicated main table for plants.
This means you will have two main tables: one analogous to MARA and another representing MARC.
But this design goes against the packaged Material repository provided by SAP.
Also, the decision to create multiple main tables should be taken carefully, because it may be a performance hit.
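Since the plant-to-approver mapping has to live outside MDM, it can be as simple as external lookup data keyed by plant and material type. A minimal sketch in Python (all plant codes, material types, and user names are hypothetical):

```python
# Minimal sketch of an external plant-to-approver mapping kept outside MDM.
# Every plant code, material type, and user ID below is a hypothetical example.
APPROVERS = {
    # (plant, material_type) -> [level-1 approver, level-2 approver]
    ("1000", "FERT"): ["jsmith", "akumar"],
    ("1000", "ROH"):  ["blee", "akumar"],
    ("2000", "FERT"): ["mgarcia", "tchen"],
}

def get_approvers(plant: str, material_type: str) -> list:
    """Return the two-level approver chain for a plant/material-type pair."""
    try:
        return APPROVERS[(plant, material_type)]
    except KeyError:
        raise LookupError(
            "No approver chain maintained for %s/%s" % (plant, material_type)
        )
```

In practice this lookup would sit in a database or staging table that the workflow (or BPM process) queries when assigning the two approval steps, rather than being hard-coded.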
Regards,
Shaailesh.

Similar Messages

  • What are the fundamental procurement workflows and tasks?

    A discussion has begun in the Knowledge Management branch of SAP about the [Help documentation for SAP Sourcing|http://help.sap.com/saphelp_sourcing_wave7_p/helpdata/en/39/8ee60a3d74482e9e0111c46dc1e23d/frameset.htm]. Many feel that Help could better serve the needs of SAP Sourcing users if it were organized and written along the lines of the regular workflows and tasks performed by procurement professionals. Some questions arise from this discussion that would best be answered by the community of procurement professionals. That's you. Care to take a shot?
    What do you think about the idea of reorganizing Help according to procurement workflows?
    What exactly are the fundamental workflows and tasks you execute frequently toward the accomplishment of procurement goals?
    Take a look at the Table of Contents/navigation bar toward the left side of the SAP Sourcing Help page. Currently, its top-level topics are Getting Started, Workbench, Spend and Compliance, etc. If you could rename those top-level topics with the names of fundamental procurement workflows, what would you name them?
    Going down a level, what are the fundamental tasks you must execute within any of the fundamental procurement workflows mentioned above?
    Any input you'd give on these questions will help us improve how Help is organized and written as we go forward. It will be greatly appreciated.
    Stephen Muratore
    Information Developer
    SAP Sourcing


  • FM to read the different steps in workflow and the status at each step

    Hi All,
Is there any FM in SAP that gives a detailed roadmap of the steps taken in a workflow and the different statuses or decisions at each step?
I am looking for something like what you see when you click on "Started Workflow" in the Business Workplace outbox,
which shows the steps in the process so far, along with the decision and the agents for each of them.
    Thanks,
    Charan.

    Hi,
  You can get the status for each task. Please check FM "SAP_WAPI_GET_WI_DELTA".
    Regards
    SM Nizamudeen

  • Report for outstanding workflows and the agents responsible

I need a report that shows all leave request workflows, with the user ID that created them, the approver (agent), and the approver's org unit and personnel number.
Is there a standard SAP report that can show that, or do I need to develop one?
I was looking at SWI2_FREQ; this report shows the leave request workflows and the user ID that created them, but it does not report on the agent.
    Regards,
    Tiberiu

    Hi
I think by making use of SAP_WAPI_WORKITEMS_TO_OBJECTS you can get all the work items for the leave requests, including who created them.
    Regards
    Pavan

  • Siebel & "BIP Report Generation" Workflow and Bookmark Syntax

I have a customer that wants to schedule the generation of a BIP report to run every weeknight and include a list of activities for the next day. We've utilized an OOTB IO and can successfully render the report using Sample XML in MS Word. Additionally, since they are only on 8.1.1, they do not currently have access to the latest scheduling capabilities. Therefore, we're trying to implement a repeating component that calls a workflow to generate the report using the 'BIP Report Generation' workflow and the associated objects (from support posting 823360.1).
    Now to the questions:
One of the parameters for the workflow is a 'Bookmark'. This parameter does not appear to handle various Siebel query functions like 'Today()' as part of the criteria. Can someone confirm this?
Currently, I can imagine one high-level workaround. It would entail a revised WF that includes a prior step to generate the 'Bookmark' by retrieving a string representation of tomorrow's date and concatenating it with the rest of the criteria.
    Next question:
    Does anyone have any other possible alternatives? Maybe a calculated field in the BC/IC for 'Today() + 1' - but this could have performance implications.
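The workaround described above, building tomorrow's date outside the Bookmark and concatenating it into the criteria, can be sketched as follows; the field name and date format are hypothetical placeholders, not actual Siebel API calls:

```python
# Illustrative sketch: compute tomorrow's date as a string and splice it
# into the search criteria, instead of relying on Today() inside the Bookmark.
# The field name 'Planned Start' and the date format are assumed placeholders.
from datetime import date, timedelta

def build_bookmark(field: str = "Planned Start") -> str:
    tomorrow = date.today() + timedelta(days=1)
    return "'%s' = '%s'" % (field, tomorrow.strftime("%m/%d/%Y"))
```

The same idea, done in a prior workflow step, would hand the finished string to the 'Bookmark' parameter so the parameter itself never has to evaluate a function.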
    Thanks in advance for any help.


  • Tapeless workflows and Sandy Bridge or other PC's: KISS or LOVE?

    Tapeless workflows and Sandy Bridge or other PC's: KISS or LOVE?
    Life used to be so simple when shooting video on a tape based camera. You shot your material, captured it for editing and stored your precious original footage on tape in a safe and dry place. Sure, it took time to capture, but the big advantage was that if you had a computer or drive failure, you would still have the original tape so everything could be recreated.
    Now with tapeless workflows we have the significant advantage of much faster import of the original footage. Connect the flash card or disk drive to the computer over USB and copy the data to a HDD on the computer, ready for editing. The data on the flash card or disk drive can then be erased, so you can reuse it for more shots. But, like Johan Cruyff has said repeatedly, every advantage has its drawback. In this case it simply means that you no longer have the original material to fall back on, in case of computer or drive failures. That is a very unpleasant and insecure feeling.
The easy answer to that problem is backups: backup of the original media, backup of projects, and backup of exports. This often means a bundle of external drives or a NAS configuration for backup. One thing is clear: it requires discipline to make regular backups, and it costs time as well as a number of disks. Four as a minimum: 1 for media, 1 for exports and at least 2 for projects. Note: this is excluding a backup drive for OS & programs.
There are different backup strategies in use. Some say back up daily and use one disk for Monday, one for Tuesday, and so on. Others say one disk for the first backup, the second for the second backup, then the first again for an incremental backup, etc., and once weekly a complete backup on a third disk. Whatever you choose, be aware that the shelf life of a disk is far less than that of tape. There are horror stories everywhere about ball bearings getting stuck after some time, and without original tapes you'd better be safe than sorry, so don't skimp on backups.
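The two rotation schemes above can be sketched as a simple disk-picking helper; the disk labels are illustrative:

```python
# Sketch of the two rotation schemes described above.
# Scheme 1: one dedicated disk per weekday.
# Scheme 2: alternate two disks for incrementals, with every 7th backup
#           going as a full backup to a third disk.
from datetime import date

WEEKDAY_DISKS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

def disk_for_daily_rotation(d: date) -> str:
    """Monday's backup always goes to the Monday disk, and so on."""
    return WEEKDAY_DISKS[d.weekday()]

def disk_for_alternating(backup_number: int) -> str:
    """Odd backups to disk A, even to disk B, every 7th a weekly full."""
    if backup_number % 7 == 0:
        return "weekly-full"
    return "disk-A" if backup_number % 2 == 1 else "disk-B"
```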
    What is the relevancy of all this? I thought this was about Sandy Bridge and other PC's.
    It is and let me try to explain.
    Card based cameras are for the most part DSLR and AVCHD type cameras, and we all know how much muscle is required to edit that in a convenient way. Adobe suggests in the system requirements to use raid configurations for HD editing and practice has shown that raid arrays do give a significant performance boost and improve responsiveness, making for a nicer editing experience. The larger the project and the longer the time-line, the more a raid array will help maintain the responsiveness.
One thing you would not do is use a raid0 for projects, media and exports, even if you have backups. The simple reason is that the chance of disk failure multiplies by the number of disks in the raid0: two disks double the chance of failure, three disks triple it, four disks quadruple it, and so on.
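The multiplication of risk can be made concrete with a quick calculation; the 3% per-disk annual failure rate used below is an illustrative assumption, not a measured figure:

```python
# Probability that a raid0 array loses data, given an assumed per-disk
# failure probability. Any single disk failing kills the whole array.
def raid0_failure_prob(per_disk: float, disks: int) -> float:
    return 1 - (1 - per_disk) ** disks

# With an assumed 3% per-disk annual failure rate:
# 1 disk -> 3.0%, 2 disks -> ~5.9%, 4 disks -> ~11.5%
```

For small failure probabilities, 1 - (1 - p)^n is approximately n x p, which is the "multiplies by the number of disks" rule of thumb above.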
    Remember: Disaster always strikes when it is most inconvenient.
Imagine you have been working all day on a project; you decide to call it a day and make your daily backup, but then the raid fails before you have made the backup. All of today's work is gone. Then take into consideration the time and effort it takes to restore your backups to the state they were in yesterday. That does not make you happy.
Another thing to avoid is a software or mobo-based parity raid, for the simple reason that it is slooowww and puts a burden on the CPU, which you want to use for editing, not housekeeping.
    For temporary or easily recreated files, like the page-file, media cache, media cache database and preview files, it is very much advised to use a raid0. It makes everything a lot snappier and if disaster strikes, so what? These are easily recreated in a short time.
    This was a general overview of what is required with tapeless workflows. Now let's get down to what this means in terms of system design.
Two approaches, or trains of thought:
KISS: Keep It Stupidly Simple, or LOVE: Laughing Over Video Editing
    The first one, the most economic one, is to use a system with 3 or 4 disks internally and 4 or more backup disks.
    A typical disk setup can look like this:
This is a perfectly sensible approach if one does not have large or complex projects or long timelines, and is willing to take the risk of occasionally losing a whole day's work between backups. Many hobbyists and consumers fall into this category.
The KISS approach keeps it stupidly simple. The drawback is that there is no logical way to add more disks or storage. The discipline, diligence and effort required for regular backups make it far from a laughing matter; in fact it can quickly become a bore. Add to that the fact that the disk setup is simple but not very fast, so it is less suited to situations where lots of clips are involved, multi-cam is a regularly recurring situation, or lots of video tracks are used.
A number of video editors want more from their system than the occasional platonic KISS; they want to really LOVE their system, which leads to the other train of thought.
This is more costly than the KISS approach, but you all know a fiancée or wife is more costly and dear than the occasional kiss on the cheek from an old friend.
    Let's start with a typical disk setup. It may look like this:
    Two striking differences in comparison to the KISS approach:
    1. Much easier disk organization and more disks and thus more space.
    2. It requires a hardware raid controller, causing a higher investment cost. It is like an engagement ring. You don't get LOVE for free, one of the guiding principles of the oldest trade in the world.
    These are easy statements to make, but what are the benefits or advantages, that you would fall in LOVE with such a system, and what are the drawbacks? Think back to Johan Cruyff's adage.
    The only drawback is cost. The advantages are multiple, easier organization, more speed, more storage, snappier editing, no jerkiness, lesser requirements for regular backups and - this is the major benefit - hardly a chance of losing a day's work in case of a drive failure. Keep in mind that a parity raid keeps all your data intact in case of a drive failure, so lessens the need for up-to-date backups.
    We all know, we get what we pay for: "If you pay peanuts, you get monkeys. OTOH, if you pay money to monkeys, you get rich monkeys". But in this case you get what you pay for, a much better editing experience with a much easier workflow.
    Using a parity raid (be it raid 3/5/6/30/50/60) you get security, ease of mind that you are protected against losing precious media, that you need not worry about the last time you made a backup, that the editing you did today may be lost and you save valuable time editing and a lot of aggravation because of a much more responsive system.
    How does this all relate to Sandy Bridge and other PC's?
    First of all, the price difference between a Sandy Bridge / P67 platform and an i7-950+ / X58 platform is very small. Of course the new architecture is slightly more expensive than the older one, but the differences are small, almost not worth talking about.
    So what are the differences? Look below:
    The first thing to keep in mind is that the Sandy Bridge is the successor of the i7-8xx CPU and as such it is much more evolutionary than revolutionary. The CPU power has increased significantly over the i7-8xx due to new architecture and a smaller production process (32 nm), but in essence all the capabilities have remained unchanged. Same memory, same PCI-e lanes, same version, same L3 cache and no support for dedicated raid controllers.
    It is great that the processor performs much better than the older i7-8xx CPU's, almost achieving the level of the i7-9xx range of processors, but is still limited:
    The Sandy Bridge is unsuitable for anything more than a KISS system.
Why? Because it lacks the PCI-e lanes required to accommodate more than a 16x PCI-e nVidia card with CUDA support to enable hardware MPE acceleration, and the integrated graphics are not supported by CS5.
You may wonder if that is a bad thing. The plain and simple answer is NO. It is a great processor, it delivers great value for money and is a solid performer, but it has its limitations. Intel had a reason to position this CPU as a mid-level CPU, because that is what it is: a mid-level performer in comparison to what is to come.
    The term mid-level performer may seem strange when compared to the old generation of i7-9xx CPU's, because they perform almost equally well, but keep in mind that there is a generation difference between them.
    So what about the i7-9xx and X58 platform?
It is still going strong. About the same performance as a Sandy Bridge, with only the much more expensive hexa-cores clearly in the lead, both performance- and price-wise. The quad cores deliver about the same value for money. The main difference, however, is that the platform allows a dedicated raid controller to be installed, thus making it the platform of choice for those who want to go from a passing KISS to true LOVE.
    And what lies ahead?
    Sandy Bridge E on the Waimea platform (X68). Now that is revolutionary. More than double almost everything a processor can offer: double the cores, double the PCI-e lanes, triple the memory, more than double the L3 cache, increase the PCI-e support from 2.0 to 3.0, etc...
    This is why Intel calls this a high-end CPU / platform.
    So what now?
    If you prefer a KISS approach, choose either a Sandy Bridge/P67 or an i7-950+/X58 platform.
    If you wonder whether in the future you may need multi-cam more frequently, edit more complex projects and longer timelines or even progress to RED, look at KISS/LOVE solutions, meaning the i7-950+/X58.
    If you can't have downtime, time pressure is high, delivery dates to clients are critical or you edit highly complex projects, lots of multi-cam situations or lengthy time-lines, choose a LOVE solution, an i7-950+/X58 platform.
    If you have the time to wait till Q4/2011, Sandy Bridge E/Waimea looks to be worth the wait.
    Hope this gives you some more insight into recent and future developments and helps you make wise investment decisions.

    I'm upgrading from an AMD 3800+, cutting with Vegas 7 Pro. Usually shoot DSLR or HDV, sometimes P2, EX or RED. I have ridiculously cheap access to Macs, FCP/FCS, all kinds of software.
I've been agonizing over this for the last month, was originally hoping the UD7 mobo was the solution, read the thread about the NF200/PCIe issue a few days ago, http://www.dvinfo.net/forum/non-linear-editing-pc/489424-i7-980x-now-wait-sandybridge-2.html - and still decided to go for a 2600k.
    My preference is to treat my video footage the same way as my digital imagery: I make (at least) duplicate back ups of everything before reformatting the cards, never delete the back ups, and only worry about the day-to-day stuff at night. Unless I'm rendering or involved in other long processes, in which case I'll back up the work in process the next day. If I am under a really really tight deadline I might back up as I go.
Yes, a RAID might make it easier, but I'm paranoid enough to prefer a slower, safer backup. You can always duplicate, and usually improve upon, a day's work, but you can never get back original footage you lost. I have only ever had one hard drive die on me (a few enclosures crapped out, though); it took a couple of (mostly unattended) hours to rectify. As a matter of fact, I've had far more loss/damage from tapes than from hard drives.
I ordered the UD7, 2 F4s and 4 F3Rs, understanding I will probably want to upgrade to SBE when it comes out, or maybe next year. The 2600k/mobo/RAM will likely hold its value better than a 950/X58, as much because of the marketplace as merit.
The UD7 / RAID card issue is in its early days; there may be a solution/mitigation. Probably not. But if I really, really need a RAID card, then I probably really, really need a 980, NAS, etc.
    But Harm still rocks!

  • Difference between SAP Business Workflow and CRM Workflow

    Hello guys,
    Is there any difference between SAP Business Workflow and CRM Workflow?
    Are there others workflows types in SAP?
    In all SAP Workflow Courses (BIT600, BIT601, BIT603, BIT610) only was talked about SAP Business Workflow, and nothing about CRM Workflow.
    Thanks a lot!
    Kleber

    Hello Kleber,
There is not much of a difference between CRM workflows and SAP Business workflows.
Workflows are essentially part of the Basis (BC) component, and hence they are the same across all SAP products, as all these products contain Basis (BC) components.
To conclude, SAP Business workflows and CRM workflows are the same thing. However, when you work on CRM, the BOR objects belong to CRM and not to ECC. Concept-wise, there is no change at all. If you learn BIT100, you can also work on CRM workflows.
    Regards,
    Anand

  • IPhoto 08 and Photoshop - suggestions on workflow and managing PSD files

    Hi Everyone,
    I'm a long-time Photoshop user, and a recent convert to Mac (BTW - I love my new Mac), therefore I am also new to iPhoto 08. I must say that I actually enjoy how iPhoto manages my pictures, as it is not all that different from how I've been managing my pics manually for years (by year, by event, etc.). I've read the various discussion topics on how to set up and use Photoshop as an external editor from iPhoto, and have not had any problems up to now.
    I'm now looking for suggestions on workflow and managing my PSD files. I apologize if this is a little long, but I want to make sure I explain the problem clearly. Here's the scenario:
    - From iPhoto, I choose to edit an existing JPEG file in the external editor, Photoshop.
    - I perform my edits (including advanced edits with layers etc.)
    - I do both a "Save As..." and a "Save". That way, I "Save" the flattened JPEG back to my iPhoto library properly, and "Save As..." a PSD file in case I want to do further edits on the image later (days, weeks, months later).
    I'm looking for suggestions on where to put my PSDs. For now, I am saving the PSD to my desktop (using max compatibility), and then importing the PSD into my iPhoto library (into the same Event as the original JPEG). However, this leaves me with 2 copies of the same picture: 1 JPEG, and 1 PSD.
    The problem is, I now want to go back and do more tweaks on the PSD, the end result of which will make the JPEG version out-of-date. I can either
    1) Open both the PSD and JPEG versions at the same time, tweak the PSD, then copy and paste the flattened layers on top of the JPEG version and "Save"
    2) Open only the PSD, tweak, and save a copy as a JPEG onto my desktop and re-import into iPhoto.
    Although a little tedious, both options seem to work. Are there any other options? What do you suggest? I'm curious to see how others manage this.

I don't have a solution for you (sorry), but I'd like to comment because I'm in a similar boat. However, instead of saving PSD files to the desktop, I've been advised by a professional photographer friend of mine to save these in TIFF format from Photoshop (I think he feels it's a more universal file format than PSD for archival purposes). The downside is that the files are huge.
    I imported high resolution jpegs into iPhoto. I then use Photoshop to edit and save the flattened jpeg version back to iPhoto (thankfully, keywords are preserved this way). I recently realized that I need to save a copy of the edited file in a lossless format like TIFF or PSD. Unfortunately when I save the file to TIFF and import it back into iPhoto, keywords are lost. This is a drag.
    What I am trying to figure out is how I can retain my keyword info in the edited archival version. Any tips?
    I'm beginning to question whether I should import my photos in to Photoshop (or PS Bridge) first, save the original for archival purposes, do the majority of editing and save a TIFF/PSD file and then import this file into iPhoto for keywording, further editing, downsizing, etc.

  • I am new to workflow and I have and issue

    Hi Experts,
I am very new to workflow and I've got an issue that I don't know where to start with.
I am working with an application whose UI is Java and whose back end is SAP.
When a sales order is created in the UI, this creates an order record in SAP and triggers a business object, which in turn calls the function module SWE_EVENT_CREATE to initiate the workflow.
The workflow sends a mail in which the data appears wrong in preview mode (i.e. when you single-click on the message it appears at the bottom), but when you double-click on the message you see the correct info. Why is that?
    Any ideas??
    Neeraj

    Hi,
Neeraj, not just in that particular case but in every case: when you click once, the message appears at the bottom, and when you double-click on it, you get a pop-up window that lets you see the whole message.
So it's standard functionality, not a problem.
    Hope you understand.
    Regards,
    Purvesh.

  • Workflow and ABAP OO

    Hi,
I have read the excellent blogs on Workflow and ABAP OO by Jocelyn Dart (/people/jocelyn.dart/blog/2007/07/16/referencing-bor-objects-in-abap-oo-classes). I have successfully created a BO (of type SIBFLPORB) in my class. However, I cannot figure out how to retrieve the workitem container that holds the BO. I have used macros before in workflow programming exits, but there I get the container from a workitem context. I can't find the workitem context in my class. Please provide me with some code samples of how I can retrieve the workitem container that holds the BO.
I am considering passing the workitem ID to the method and reading the container via SAP_WAPI_READ_CONTAINER. If this is a good solution, then I see no point in having the BO as an attribute of the class. Is it a good solution?
    Thanks!
    Elvez

I am probably missing something obvious here (or we are not even talking about the same thing), but let me still try to clarify my point.
    >I could of course just pass the necessary data as parameters to the class
This is the part that I don't really understand. If you already have an instance of the class (i.e. there is an object instantiated) which has the BO as an attribute, there shouldn't be any need to change the signature of any method. The only thing you need to enhance is the constructor of your class.
Currently you create the BO instance as an attribute in your constructor method (isn't that correct?). Just create the attributes you need from the BO in your class, and populate them in your constructor. Or if it is a method that you need, just create a similar method (you can copy the code) in your class.
    This all might be much easier than using the WF macros, which shouldn't be that difficult either - as Jocelyn says:
    "You will still need to use BOR macros in your ABAP OO classes if and only if you want to call BOR attributes and methods in your ABAP OO methods.
    If you need to do this then make sure you include the BOR macro definitions in your class by putting the special include program <cntn02> in the "Macros" section of your ABAP OO class."
    Regards,
    Karri

  • SAP Workflow and GRC 10.1 Workflow

    Hi all,
    We are in the midst of upgrading our GRC system to 10.1, and some questions are coming up about the workflows. In short, are GRC workflows and SAP workflows the "same thing"? I.e., if someone outside of the upgrade project were to learn how to create/maintain workflows within GRC 10.1, would they be able to turn around, run the same transactions within an ECC 6.0 environment, and create SAP Business workflows? From what I have seen so far in my searching, they have the same basic principle but very different implementation/maintenance.
    Any documentation that you are aware of from SAP showing this would be helpful as well.
    Thanks

The basic mechanism is the same, but the GRC workflows are more fixed (they leave less room for the workflow to be changed), have built-in screens, and rely on the BRF engine to determine approvers, etc.
Workflows in ECC give you a lot more room for implementing the workflow however you wish (use a decision task, asynchronous tasks, develop your own custom approval screen, etc.); at times they require you to implement changes to the workflow object (set a user status, release a document, etc.) and don't usually use the BRF.
So I would have to say that the answer is no: someone who has learned how to implement the GRC workflows will not be able to turn around and immediately create workflows in ECC.

  • Workflow and General Use Questions

    Hello,
I'll apologize right off the bat for these novice questions, because I'm sure the information is probably somewhere in the forum; I just haven't been able to find it. I just purchased Aperture after completing the demo, as my library is getting too large to manage using standard file folders. I'm now trying to figure out the best practices for workflow and general use before I invest some serious time into importing and keywording all my pictures.
1) Store files in their current location, or in the Aperture Library? It seems to me that once they are moved to the Aperture library, you can only access them from within Aperture. I'm thinking I would be better off leaving them in their current location. For one, if I want to quickly grab a picture as an attachment to an email or something, it seems easier to grab it from the standard folders. Second (and more important), I do not have room to keep all my pictures on my MacBook, thus most of them are stored on the Time Capsule.
    So... Keeping photos in their current location appears to be the best choice for me even though it adds an additional step every time I bring in new photos from my camera. Does this sound right?
    2) Is there a way to mark the photos that I have uploaded to my website (Smugmug)? Ideally, I would like to badge photos that have already been uploaded so I can quickly recognize them and ensure I'm not duplicating. I've considered using the rating, or keywords to indicate that a photo has been uploaded but both methods have disadvantages.
    3) Any suggestions for general workflow and organization resources (tutorials, books, websites, etc.)? I've looked at the videos on Apple's site but they obviously didn't get that detailed.
    Thanks for the help, sorry for the length.

    I recommend to Manage by Reference with Master image files stored on external hard drives (note that Aperture defaults to a Managed-Library configuration rather than a Referenced-Masters Library). Especially important for iMacs and laptops with a single internal drive. The workflow as described below in an earlier post of mine uses a Referenced-Masters Library.
    I feel pretty strongly that card-to-Aperture or camera-to-Aperture handling of original images puts originals at unnecessary risk. I suggest this workflow, first using the Finder (not Aperture) to copy images from CF card to computer hard drive:
    • Remove the memory card from the camera and insert it into a memory card reader. Faster readers and faster cards are preferable.
    • Finder-copy images from memory card to a labeled folder on the intended permanent Masters location hard drive.
    • Eject memory card.
• Burn backup hard drive or DVD copies of the original images (an optional but strongly recommended backup step).
    • Eject backup hard drive(s) or DVDs.
    • From within Aperture, import images from the hard drive folder into Aperture selecting "Store files in their current location." This is called "referenced images." During import is the best time to also add keywords, but that is another discussion.
    • Review pix for completeness (e.g. a 500-pic shoot has 500 valid images showing in Aperture).
    • Reformat memory card in camera, and archive originals off site on hard drives and/or on DVDs.
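The copy-and-verify steps above can be sketched in code; the paths below are hypothetical (a real card would normally be mounted under /Volumes):

```python
# Minimal sketch of the Finder-copy-then-verify steps: copy everything from
# the card to a labeled Masters folder, then return the file count so it can
# be checked against the shoot (e.g. a 500-pic shoot should yield 500 files).
import shutil
from pathlib import Path

def copy_and_verify(card_dir: str, masters_dir: str) -> int:
    """Copy all files from the card to the Masters folder; return the count."""
    src, dst = Path(card_dir), Path(masters_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = 0
    for f in src.rglob("*"):
        if f.is_file():
            shutil.copy2(f, dst / f.name)  # copy2 preserves timestamps
            copied += 1
    return copied
```

Only after the count checks out (and backups are made) would the card be reformatted in the camera.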
    Note that the "eject" steps above are important in order to avoid mistakenly working on removable media/backups.
    Also note with a Referenced-Masters Library that use of the "Vault" backup routine backs up the Library only, not the Masters. Masters should be separately backed up, IMO a good thing from a workflow and data security standpoint.
    Max out RAM in your MB and keep the internal drive less than 70% full.
    Good luck!
    -Allen Wicks

  • Workflow and WD integration

    Hi,
We are implementing Workflow and Web Dynpro ABAP integration. An employee submits competences from the portal; this triggers an approval task to the manager. The manager approves the task from the worklist, and when he clicks on the task, a Web Dynpro application opens where he has to approve or reject accordingly.
We have approve and reject buttons in Web Dynpro, and we have used the User Decision step type in Workflow to open the Web Dynpro application. The User Decision step decisions are mapped to the Web Dynpro approve/reject actions.
So we are able to successfully launch the Web Dynpro application and perform the actions. But the problem is that the rest of the workflow after the User Decision step is not getting executed; the workflow stops at this point.
We have tried using the FMs SAP_WAPI_DECISION_COMPLETE and SAP_WAPI_WORKITEM_COMPLETE.
Both of these FMs complete the User Decision step but do not execute the rest of the workflow.
    Please help me if you have come across this scenario.

    Hello Archana!
             After the user decision step, you have maintained the step that updates the database as an immediate step.
             Have you tested this step separately? If it has yielded successful results, check the binding for the immediate step after the user decision.
             Also, check the workflow log to see whether the containers for the step that updates the database are populated with the required values.
             Is the method that updates the database a custom method?
    Regards,
    S.Suresh.
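
    For what it's worth, a rough, untested sketch of the decision-complete call as it might look in the Web Dynpro action handler (parameter names from memory; the decision key value and the work item variable are placeholders, so verify against the FM signature in SE37):

    ```abap
    * Sketch only: complete the user decision with an explicit decision
    * key so the workflow can evaluate the outcome and continue.
    * lv_workitem_id is assumed to be passed into the Web Dynpro application.
    DATA: lv_return   TYPE sy-subrc,
          lt_messages TYPE TABLE OF swr_messag.

    CALL FUNCTION 'SAP_WAPI_DECISION_COMPLETE'
      EXPORTING
        workitem_id   = lv_workitem_id
        decision_key  = '0001'      " outcome chosen in the decision step
        do_commit     = 'X'         " without a commit the WF may not continue
      IMPORTING
        return_code   = lv_return
      TABLES
        message_lines = lt_messages.

    IF lv_return <> 0.
      " Decision not completed - inspect lt_messages in the log/debugger
    ENDIF.
    ```

    One thing worth checking: SAP_WAPI_WORKITEM_COMPLETE completes the work item without setting a decision outcome, so the outcome branches after the user decision step may never fire. In that situation SAP_WAPI_DECISION_COMPLETE with the correct decision key (and a commit) is usually the call you want.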

  • Index in LOOP and BRANCH in Business Connector

    Hello,
    I use SAP Business Connector 4.7 and want to map a structure from an incoming XML file to BAPI_PO_CREATE1. I LOOP over /ORDER[0]/ORDER/ORDER_ITEM_LIST/ORDER_ITEM/ACCOUNTING_INFO/COST_CATEGORY_ID
    and BRANCH over
    /ORDER[0]/ORDER/ORDER_ITEM_LIST/ORDER_ITEM/ACCOUNTING_INFO/COST_CATEGORY_ID/*body
    But ORDER_ITEM has an index 0, 1, 2, 3 and so on. So it only works if I BRANCH over
    /ORDER[0]/ORDER/ORDER_ITEM_LIST/ORDER_ITEM[0]/ACCOUNTING_INFO/COST_CATEGORY_ID/*body
    for every index. But the index could change with every file.
    How can I BRANCH over an index?
    Thanks,
    Daniel

    LOOP AT itab.
      AT NEW stud.
        PERFORM xxxx.
      ENDAT.
    ENDLOOP.
    Refer SF
    Refer this link for Smartforms
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/d937ec90-0201-0010-0ca8-b6cb3b6dd1ef
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b590fd97-0301-0010-db90-d09ff93cee5a
    Subroutine in smartform
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/ccab6730-0501-0010-ee84-de050a6cc287
    Style and mailing the Smartform output
    https://sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/8fd773b3-0301-0010-eabe-82149bcc292e
    Table,Template,Loop and Command in Smartform
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/3c5d9ae3-0501-0010-0090-bdfb2d458985

  • Workflow and Shopping Cart

    Hi Colleagues,
    I want to know how we relate shopping carts to the workflows created for them.
    Example: When I create a shopping cart, an entry is created in CRMD_ORDERADM_H and workflows are created. Can you tell me which table to check the workflow in, and how (table links)?
    Regards,
    Rishav Surana

    Hi,
      You can get the WF details here:
    How to Get WorkItem ID
    Saravanan
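
    If it helps, one common way to get from a business object to its work items is the standard FM SAP_WAPI_WORKITEMS_TO_OBJECT, or reading the work-item-to-object link table SWW_WI2OBJ directly. A rough, untested sketch; BUS2121 (SRM shopping cart) and the key variable are illustrative, so check the object type and key format in your system:

    ```abap
    * Sketch only: find the work items linked to a shopping cart.
    * lv_objkey is assumed to hold the cart's object key (e.g. the GUID
    * taken from CRMD_ORDERADM_H for the cart in question).
    DATA: lt_worklist TYPE TABLE OF swr_wihdr.

    CALL FUNCTION 'SAP_WAPI_WORKITEMS_TO_OBJECT'
      EXPORTING
        objtype         = 'BUS2121'     " SRM shopping cart object type
        objkey          = lv_objkey
        top_level_items = 'X'           " only top-level workflow items
      TABLES
        worklist        = lt_worklist.

    * Alternatively, read the link table directly:
    * SELECT wi_id FROM sww_wi2obj INTO TABLE lt_wi_ids
    *   WHERE instid = lv_objkey.
    ```

    SWW_WI2OBJ holds the work item to object instance links, so filtering on the cart's instance ID should list the related workflow instances, which you can then inspect in the workflow log (SWI1/SWIA).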
