Best Practice Reuse - Post Processing

I have a Query Canvas that I would like to reuse across my different form modules. I created an object group, put the group into the object library, and keep all of the code in program units.
I want some "post" processing done that differs depending on which form the canvas is reused on. I first created a post_query procedure and called it as a hook, but when I use it on a form, it gets overwritten whenever I update the reusable object group.
I then thought I would simply call my post processing after calling the query procedure, but the "post" code executes BEFORE the query code, and I am completely confused as to why.
So my basic question is: how do you add "hooks" or pre and post procedures to your reusable code, and how do you prevent that code from being overwritten as you update your objects?
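The underlying pattern being asked about is a user-supplied callback: the reusable unit exposes a named hook that each consuming form implements outside the shared object, so refreshing the shared code never overwrites the per-form logic. Forms code is PL/SQL, but as a language-neutral illustration, here is a minimal sketch of the idea in Java (all names are hypothetical, not from the original post):

import java.util.function.Consumer;

// Minimal sketch of the hook pattern: the shared query logic is never edited
// per consumer; each form passes in its own post-processing callback instead.
final class ReusableQuery {
    static void executeQuery(Consumer<String> postQueryHook) {
        String result = "rows fetched";   // stand-in for the real query work
        postQueryHook.accept(result);     // the hook runs AFTER the query completes
    }
}

public class OrderForm {
    public static void main(String[] args) {
        // Form-specific post processing lives with the form, not in the shared
        // object, so updating ReusableQuery never overwrites it.
        ReusableQuery.executeQuery(result ->
                System.out.println("OrderForm post-processing on: " + result));
    }
}

In Forms terms, the analogous approach is to have the shared program unit call a form-level procedure with an agreed name; each form supplies its own body for that procedure, while the subclassed object group itself is never edited.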


Similar Messages

  • TechNet Wiki - Best Practice Blog Posts

    Lately, we've had some great blog posts about best practices on TechNet Wiki. So we're going to share them with you here...
    Wiki Life: Commenting on Comments... Care to Comment? - 10/16/14 by Ed Price
    How to write a great post on the Wiki - For Dummies - 10/12/14 by Gokan Ozcifci
    Wednesday - Wiki Life: The Importance of Longer, High-Quality Articles - 10/8/14 by Ed Price
    Wednesday - Wiki Life: 10 ways to become the most hated Wiki ninja on the planet - 10/1/14 by Peter Geelen
    Wiki Life: PowerShell PowerPack! - 9/17/14 by Matthew Yarlett
    The most unseen and unspoken TechNet Wiki roles: The mentor Role - 6/22/14 by Sandro Periera
    Wiki Life: Smart Tags - 6/18/14 by Matthew Yarlett
    Wiki Life: Ownership and Credibility - 6/11/14 by Matthew Yarlett
    Wiki Life: Best Practices for building TechNet Wiki Portals - 6/4/14 by Horizon Net
    Wiki life: Technet Wiki tagging, the ugly truth. - 5/29/14 by Peter Geelen
    Wiki Life: Getting too Personal! - 5/14/14 by Matthew Yarlett
    Wiki Life: YOU edited MY article??! - 4/30/14 by Matthew Yarlett
    Wiki Life: Are you right in making it a rite to write? - 4/16/14 by Matthew Yarlett
    Wiki Life - Alerts - 4/9/14 by Alan Carlos
    Wiki Life: Speling an gamma, it is umpotant? - 4/2/14 by Matthew Yarlett
    Wiki Life: How to Translate TechNet Wiki Articles - 4/2/14 by Horizon Net
    Wiki Life: Attention to Detail - 3/19/14 by Matthew Yarlett
    Wednesday - Wiki Life - Mobility - 3/12/14 by Alan Carlos
    Wiki Life: A Picture is Worth a 1000 Words - 3/5/14 by Matthew Yarlett
    Wiki Life: Cut'N'Paste - 2/19/14 by Matthew Yarlett
    Wiki Life: How to Join Leadership - 2/19/14 by Horizon Net
    Wiki Life: Featured Articles in the TechNet Wiki - 2/12/14 by Durval Ramos
    Wiki Life: Code.Format() - 2/5/14 by Matthew Yarlett
    Wiki Life: The CodePlex Corner - 2/5/14 by Horizon Net
    Did you know that we have a layout article? - 1/29/14 by Durval Ramos
    Wiki Life: Get to the point, keep it short! - 1/22/14 by Matthew Yarlett
    Wiki Life: Planning a Great Article - 1/8/14 by Matthew Yarlett
    Wiki Life: Best Practices for converting an MSDN / TechNet Forum thread into a Wiki Article!!! - 12/25/13 by Ed Price
    Wiki Life: Best Practices for Giving Credit - 12/18/13 by Horizon Net
    Wiki Life: How To Fix a Wiki Article TOC - 12/4/13 by Benoit Jester
    Wiki Life: How To Detect Missing Tags Without any Effort - 11/20/13 by Benoit Jester
    Wiki Life: How To Import an Microsoft Excel Spreadsheet Into a Wiki Article - 10/30/13 by Markus Vilcinskas
    Wiki Life: Cross Linking - 10/9/13 by Horizon Net
    Wiki Life: User Groups Portal - 10/2/13 by Horizon Net
    Ed Price, Azure & Power BI Customer Program Manager (Blog, Small Basic, Wiki Ninjas, Wiki)
    Answer an interesting question? Create a wiki article about it!

    Respected sensei Wiki Ninja,
    what else do you need to start a Wiki article?
    Put your signature into practice!
    So I kindly invite you all to continue your braindump over here:
    http://social.technet.microsoft.com/wiki/contents/articles/27905.technet-wiki-best-practices-blog-posts-articles.aspx
    Peter Geelen (Microsoft Belgium) - Premier Field Engineer Security & Identity
    [If a post helps to resolve your issue, please click the "Mark as Answer" or "Vote as helpful" button of that post. By marking a post as Answered or Helpful, you help others find the answer faster.]

  • Best Practice on Post Steps after 11.2.0.2.4 ORACLE RAC installation

    I finished the RAC 11.2.0.2 installation and patched it to 11.2.0.2.4. The database is also created.
    The nodes run Red Hat Linux and the storage is on ASM.
    Is there any good article or link regarding best-practice post-installation steps?
    Thanks in advance.

    Hi,
    I also want to know what kind of monitoring scripts I can set up as cron jobs to monitor for or detect any failures or problems?
    To monitor the Cluster (OS level):
    I suggest you use "CHM" (Cluster Health Monitor), a powerful tool that already comes with the Grid Infrastructure product.
    What do you need to configure? Nothing ... just use it.
    Cluster Health Monitor (CHM) FAQ [ID 1328466.1]
    See this example:
    http://levipereira.wordpress.com/2011/07/19/monitoring-the-cluster-in-real-time-with-chm-cluster-health-monitor/
    To monitor the Database:
    PERFORMANCE TUNING USING ADVISORS AND MANAGEABILITY FEATURES: AWR, ASH, and ADDM and Sql Tuning Advisor. [ID 276103.1]
    The purpose of this article is to illustrate how to use the new 10g manageability features to diagnose and resolve performance problems in the Oracle Database.
    Oracle 10g has powerful tools to help the DBA identify and resolve performance issues without the hassle of analyzing complex statistical data and extensive reports.
    Hope this helps,
    Levi Pereira
    Edited by: Levi Pereira on Nov 3, 2011 11:40 PM

  • General Discussion - Best practice to manage Process order

    Hi Experts,
    Which is the best practice to manage process orders?
    1. Quantity change - I can make quantity adjustments in both R3 and APO.
    2. Source change - I can make a version change from the order header. I can also make a source change in APO by selecting a different PPM. Which is the better option?
    3. Re-read master data - Is the best practice to re-read master data from R3 or from APO?
    I feel that for all the above scenarios, process orders should always be managed in R3. But I am still wondering why we have the same flexibility in APO too?
    Can

    Hello,
    we are just migrating from 4.6c to ECC 6.0 and I have a couple of workflows to adapt.
    For background steps I defined, in the corresponding BOR methods, an exception to be fired when no result is available (e.g. no mail address available). Normally, I defined them as temporary errors.
    I activated in the WI outcome section the line for this exception, and so the workflow processed this branch when the exception appeared. It worked fine.
    Now, in ECC 6.0, the same workflow gets stuck in the WI. The exception is fired (I can see it in the log as "Error message"), but the WI is still in status "in process". It doesn't continue with the error outcome branch.
    Is this a new logic in ECC 6.0? Do you have any idea what to do? I have used this logic a few dozen times in different methods and workflows, and it will give me a headache if I have to change everything ...
    Thank you!
    Best regards,
    Thomas

  • Best Practice Re-install Process?  (Xcelsius 2008 Engage)

    The main impetus of this post is now to request a best-practice reinstall process. E.g., what files/directories or registry keys should be deleted after uninstalling under Windows XP? I am running Xcelsius 2008 5.1.1.0 Build 12,1,1,344 (Windows XP SP2). I want to clean it up and reinstall completely, ideally without also reinstalling Office 2007.
    Earlier today, I had loaded up an otherwise healthy project and all of a sudden my edits were not taking. I could make the change in the Component Properties window, but the change was not reflected in the project display area.
    I had closed and reopened this project several times, killed both "Xcelsius" and "EXCEL" processes, and rebooted the machine a couple of times, all to no avail. I was totally stuck, it seems. Then it randomly corrected itself after the Xth close/reboot/reload. I'm past the issue now, it seems, but please consider this an official bug report. These issues are no longer a mere minor nuisance, and I am looking forward to a nice big update in the next Fix Pack release, which I assume is right around the corner (hint, hint).
    Thanks.
    Edited by: f l on Nov 17, 2008 9:20 PM

    f l,
    I'm not sure deleting keys from the registry is ever a best practice; however, Xcelsius has entries under:
    HKEY_CURRENT_USER > Software > Business Objects > Xcelsius
    HKEY_LOCAL_MACHINE > SOFTWARE > Business Objects > Suite 12.0 > Xcelsius
    The current user folder holds temporary settings, such as how you've modified your interface.
    The local machine folder holds more important information.
    As always, it's recommended that you backup the registry and/or create a restore point before modifying or deleting any keys.
    As for directories, the only directory Xcelsius uses is the one you install to.  It also places some install logs in the temp directory, but they have no effect on the application.

  • Best practices to modify process chain

    Hi,
    What is the best practice for modifying a process chain: directly in production, or in development and then transported back?
    Should query performance tuning settings, such as read modes and cache settings, be done in production?
    Thanks
    nikhil

    Hi Nikhil,
    The best practice for modifying process chains is to make the change in the development system and transport it to quality and production. But if you are making a simple change, like changing the scheduling time (changing the scheduled date and time by editing the variant), you can do it directly in production. If you are adding new steps to the process chain, it is better to make the change in dev and move it to prod.
    Also, once the change reaches production, you may need to open the chain in edit mode and activate it manually.
    It is a common issue for process chains containing delta DTPs that transports leave the chain saved in the modified (M) version and not in the active version.
    Query read mode and cache settings can be made in development and moved to production.
    But pre-filling the cache using broadcaster settings can be done directly in production.
    Also, creating cube aggregates to improve queries can be done directly in production rather than creating them in development and transporting to prod.
    Again, it depends on the project: in some projects changes are made directly in prod, while in others changes in prod are not allowed. I think it is always better to start the changes in dev and move them to production.
    Hope it helps,
    Thanks,
    Vinod-

  • Best practice for optimizing processing of big XMLs?

    All,
    What is the best practice when dealing with large XML files (say, a couple of MBs)?
    Instead of having to read the file from the file system every time a static method is run, what would be the best way for the program to read the file once and then keep it in memory, so that the next time it would not have to read and parse it all over again?
    Currently my code just reads the file in the static method, like this (abridged; the try/catch and the getDocument helper are omitted):
    public static String doOperation(String path, ...) throws Exception {
        String masterFile = path + "configfile.xml";
        Document theFile = (Document) getDocument(masterFile); // DOM parse on every call
        Element root = theFile.getDocumentElement();
        NodeList nl = root.getChildNodes();
        // ... operations on file
    }
    Optimization tips and tricks most appreciated :-)
    Thanks,
    David
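    One straightforward way to get the "parse once, keep it in memory" behavior David asks about is to memoize the parsed Document in a static field behind a synchronized lazy initializer. A minimal sketch, assuming the file never changes while the program runs (the ConfigCache name is hypothetical):
    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    // Minimal sketch: parse the XML once, then hand out the cached DOM.
    final class ConfigCache {
        private static Document cached;          // parsed on first use, then reused
        static synchronized Document get(String path) throws Exception {
            if (cached == null) {                // only the first call pays the parse cost
                cached = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder()
                        .parse(new File(path + "configfile.xml"));
            }
            return cached;
        }
    }
    doOperation() could then call ConfigCache.get(path) instead of re-parsing. Note that DOM objects are not guaranteed thread-safe, so if several threads read the cached Document concurrently, synchronize those reads as well.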

    The best practice for multi-megabyte XML files is not to have them at all.
    However, if you must, presumably you don't need all of the information in your XML repeatedly. Or do you? If you need a little bit of it here, then another little bit of it there, then yet another little bit of it later, then you shouldn't have stored your data in one big XML file.
    Sorry if that sounds unhelpful, but I'm having trouble imagining a scenario where you need all the data in an XML document repeatedly. Perhaps you could expand on your design?
    PC²
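    If the document really is too big to keep in memory at all, a pull parser avoids building a DOM entirely and reads only what is needed. A minimal StAX sketch that extracts the text of one element type while streaming (the element and file names are made up for illustration):
    import java.io.FileInputStream;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;
    // Minimal sketch: stream through a large XML file and pull out only the
    // text of <target> elements, never materializing the whole document.
    public class StreamScan {
        public static void main(String[] args) throws Exception {
            XMLStreamReader r = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new FileInputStream("big.xml"));
            while (r.hasNext()) {
                if (r.next() == XMLStreamConstants.START_ELEMENT
                        && r.getLocalName().equals("target")) {
                    System.out.println(r.getElementText()); // reads just this element
                }
            }
            r.close();
        }
    }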

  • What is the best practice dealing with process.getErrorStream()

    I've been playing around creating Process objects with ProcessBuilder. I can use getErrorStream() and getOutputStream() to read the output from the process, but it seems I have to do this on another thread. If I simply call process.waitFor() and then try to read the streams, that doesn't work. So I do something like this:
    final InputStream errorStream = process.getErrorStream();
    final StringWriter errWriter = new StringWriter();
    ExecutorService executorService = Executors.newCachedThreadPool();
    executorService.execute(new Runnable() {
        public void run() {
            try {
                IOUtils.copy(errorStream, errWriter, "UTF-8");
            } catch (IOException e) {
                getLog().error(e.getMessage(), e);
            }
        }
    });
    int exitValue = process.waitFor();
    getLog().info("exitValue = " + exitValue);
    getLog().info("errString =\n" + errWriter);
    This works, but it seems rather inelegant somehow.
    The basic problem is that the Runnable never completes on its own. Through experimentation, I believe that when the process is actually done, errorStream is never closed, or never gets an end-of-file. My current code works because when it goes to read errWriter it just reads what is currently in the buffer. However, if I wanted to clean things up and use executorService.submit() to submit a Callable and get back a Future, then a lot more code is needed because "IOUtils.copy(errorStream, errWriter, "UTF-8");" never terminates.
    Am I misunderstanding something, or is process.getErrorStream() just a crappy API?
    What do other people do when they want to get the error and output results from running a process?
    Edited by: Eric Kolotyluk on Aug 16, 2012 5:26 PM

    OK, I found a better solution:
    Future<String> errString = executorService.submit(new Callable<String>() {
        public String call() throws Exception {
            StringWriter errWriter = new StringWriter();
            IOUtil.copy(process.getErrorStream(), errWriter, "UTF-8");
            return errWriter.toString();
        }
    });
    int exitValue = process.waitFor();
    getLog().info("exitValue = " + exitValue);
    try {
        getLog().info("errString =\n" + errString.get());
    } catch (ExecutionException e) {
        throw new MojoExecutionException("proxygen: ExecutionException");
    }
    The problem I was having before seemed to be that the call to Apache's IOUtil.copy(errorStream, errWriter, "UTF-8") was not working right; it did not seem to be terminating on EOS. But now it seems to be working fine, so I must have been chasing some other problem (or non-problem).
    So, it does seem the best thing to do is read the error and output streams from the process on their own daemon threads, and then call process.waitFor(). The ExecutorService API makes this easy, and using a Callable to return a future value does the right thing. Also, Callable is a little nicer as the call method can throw an Exception, so my code does not need to worry about that (and the readability is better).
    Thanks for helping to clarify my thoughts and finding a good solution :-)
    Now, it would be really nice if the Process API had a method like process.getFutureErrorString() which does what my code does.
    Cheers, Eric
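    If the error and standard output can be merged, another option worth noting: ProcessBuilder.redirectErrorStream(true) folds stderr into stdout, so a single read loop on the calling thread drains everything and waitFor() can follow safely. A minimal sketch (the command is just an example):
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    public class RunAndDrain {
        public static void main(String[] args) throws Exception {
            // Merge stderr into stdout so one stream carries both, and the child
            // can never block on a full, unread error pipe.
            Process p = new ProcessBuilder("ls", "-l")
                    .redirectErrorStream(true)
                    .start();
            StringBuilder out = new StringBuilder();
            BufferedReader r = new BufferedReader(
                    new InputStreamReader(p.getInputStream(), "UTF-8"));
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');    // EOF arrives when the child exits
            }
            r.close();
            int exitValue = p.waitFor();          // safe: the pipe is already drained
            System.out.println("exitValue = " + exitValue + "\n" + out);
        }
    }
    The trade-off is that you lose the separation between the two streams, which is exactly what the Callable-per-stream approach above preserves.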

  • Best practices for posting video to my iWeb site?

    What's the best way to upload movies to my site so that visitors can watch them while sacrificing the least amount of quality, size and speed?
    As an example, the iPhone commercials on the Apple webpage start playing instantly, look great and when Large is selected are the perfect size.
    But I made a 1.5 minute iSight video in iMovie and saved it as a streaming QuickTime file, and it takes about 10 minutes to fully load once it's on my site. And I have a 10 Mbps internet connection.
    I also just pulled up a large movie trailer on iTunes and it looks great and started playing instantly.
    Is there a better way of posting video? Perhaps Flash or another codec would make more sense? Videos on YouTube play instantly, although the quality isn't always great.
    thanks!
    MacBook Pro   Mac OS X (10.4.9)  

    There is no absolute correct way. One way is shown in this demo page. It lets the page load without loading the movie, thus speeding up the page load a fair amount. The movie only has to load when the Play button is clicked, and then it opens in a separate window sized just for the movie. The code used is also shown on the page. This approach can allow you to use a larger QT file without affecting the loading of the page it's linked from.
    OT

  • Best practices for batch processing without SSIS

    Hi,
    The gist of this question is, in general: how should a C# client insert/update batches of records using stored procedures? The ideas I can think of are:
    1) Create one SP with a parameter of type XML, and pass say 100 records at a time, on one thread. The SP reads the XML as a table and does a single INSERT.
    2) Create one SP with many parameters that inserts one record. I can either build a big block of EXEC statements for, say, 100 records at a time, or call the SP one at a time, on one thread. Obviously this seems the slowest.
    3) A parallel-processing version of either of the above: pass 100 records at a time via the XML parameter, or the big block of EXEC statements, or one at a time, and use PLINQ to make multiple connections to the database.
    The records will be fairly wide, substantial records.
    Which scenario is likely to be fastest and avoid lock contention?
    (We are doing batch processing and there is not a current SSIS infrastructure, so it's manual: fetch data, call web services, update batches.  I need a batch strategy that doesn't involve SSIS - yet).
    Thanks.

    The "streaming" option you mention in your linked thread sounds interesting, is that a way to input millions of rows at once?  Are they not staged into the tempdb?
    The entire TVP is stored in tempdb before the query/proc is executed.  The advantage of the streaming method is that it eliminates the need to load the entire TVP into memory on either the client or server.  The rowset is streamed to the server
    and SQL Server uses the insert bulk method is to store it in tempdb.  Below is an example C# console app that streams 10M rows as a TVP.
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Collections;
    using System.Collections.Generic;
    using Microsoft.SqlServer.Server;

    namespace ConsoleApplication1
    {
        class Program
        {
            static string connectionString = @"Data Source=.;Initial Catalog=MyDatabase;Integrated Security=SSPI;";

            static void Main(string[] args)
            {
                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand("dbo.usp_tvp_test", connection))
                {
                    // The Structured parameter pulls rows from the enumerator below,
                    // streaming them to the server instead of buffering the full set.
                    command.Parameters.Add("@tvp", SqlDbType.Structured).Value = new Class1();
                    command.CommandType = CommandType.StoredProcedure;
                    connection.Open();
                    command.ExecuteNonQuery();
                    connection.Close();
                }
            }
        }

        // Generates 10M rows lazily, one SqlDataRecord at a time.
        class Class1 : IEnumerable<SqlDataRecord>
        {
            private SqlMetaData[] metaData = new SqlMetaData[1] { new SqlMetaData("col1", System.Data.SqlDbType.Int) };

            public IEnumerator<SqlDataRecord> GetEnumerator()
            {
                for (int i = 0; i < 10000000; ++i)
                {
                    var record = new SqlDataRecord(metaData);
                    record.SetInt32(0, i);
                    yield return record;
                }
            }

            IEnumerator IEnumerable.GetEnumerator()
            {
                throw new NotImplementedException();
            }
        }
    }
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

  • Best practice for invoice posting in SRM

    Dear expert,
    what is the best practice for posting an invoice? Is it in the SRM system or the MM system? In SRM, the system allows one to use an approval procedure for invoice posting, whereas in MM one cannot use an approval procedure to post an invoice. Is the case similar in the SRM-SUS and MM-SUS scenarios?
    I would appreciate your feedback on the industry practice.
    thanks and regards,
    Ranjan

    Ranjan,
    As a SAP customer we use invoice entry in ERP using MIRO and FB60, as these are the most efficient for our business.
    1. Invoice entry using the MIRO tcode is faster, as it can be done with minimal use of the mouse. Invoice entry is slower in SRM, as browser response and interactions with the backend ERP system slow the system's response to user input. Repeated use of the mouse in the SRM browser is detrimental to fast input.
    2. Not all types of invoices can be handled in SRM, e.g. invoices without a PO.
    3. We process approx. 20,000 invoices per month with multiple input operators; SRM could not handle that sort of load.
    4. SRM is really a procurement application, and although invoice entry is included, it is probably more for users who wish to use SRM in a standalone scenario. We use extended classic, so all our financial transactions take place in the ERP backend.
    Your choice also depends upon the number and quality of the invoices you plan to process, and the need for the operators to be trained on one or two systems.
    Hope these personal observations assist with your decision
    Regards
    Allen

  • Idoc processing best practices - use of RBDAPP01 and RBDMANI2

    We are having performance problems in the processing of inbound idocs.  The message type is SHPCON, and transaction volume is very high.  I am a functional consultant, not an ABAP developer, but will try my best to explain our current setup.
    1) We have a number of message variants for the inbound SHPCON message, almost all of which are set to trigger immediately upon receipt under the Processing by Function Module setting.
    2) For messages that fail to process on the first try, we have a batch job running frequently using RBDMANI2.
    Almost every day, some instances of RBDMANI2 get stuck and run for a very long time. We frequently have multiple SHPCON idocs coming in that contain the same material number, and idocs frequently fail because the material in the idoc has become locked. Once the stuck batch job is cancelled and the job starts running again normally, the materials unlock and the failed idocs begin processing. The variant for the RBDMANI2 batch job is currently set with a packet size of 1 and without parallel processing enabled.
    I am trying to determine the best practice for processing inbound idocs such as these for maximum performance in a very high volume system. I know that RBDAPP01 processes idocs in status 64 and 66, and that RBDMANI2 is used to reprocess idocs in all statuses. I have been told that setting the messages to trigger immediately in WE20 can result in poor performance. So I am wondering if the best practice is to:
    1) Set messages in WE20 to Trigger by background program
    2) Have a batch job running RBDAPP01 to process inbound idocs waiting in status 64
    3) Have a periodic batch job running RBDMANI2 to try to clean up any failed messages that can be processed
    I would be grateful if somebody more knowledgeable than myself could confirm the best practice for this process and comment on the correct packet size in the program variant and whether or not parallel processing is desirable. Because of the material locking issue, I felt that parallel processing was not desirable and might actually increase the material locking problem. I would welcome any comments.
    This appeared to be the correct area for this discussion based upon other discussions.  If this is not the correct area for this discussion, then I would be grateful if the moderator could re-assign this discussion to the correct area (if possible) or let me know the best place to post it.  Thank you for your help.

    Hi Bob,
    Not sure if there is an official best practice, but note 1333417 - Performance problems when processing IDocs immediately does state that for high volume, immediate processing is not a good option.
    I'm hoping that for SHPCON there is no dependency in the IDoc processing (i.e. it's not important whether they're processed in the same sequence or not); otherwise it would add another complexity level.
    In the past, for high-volume IDoc processing we scheduled a background job with RBDAPP01 (with parallel processing) and RBDMANIN as a second step in the same job to re-process the IDocs with errors due to locking issues. RBDMANI2 has a parallel processing option, but it was not needed in our case (actually, we specifically would not want to parallel-process the errors, to avoid running into a lock issue again). In short, your steps 1-3 are correct, but 2 and 3 should rather be in the same job.
    Also I believe we had a designated server for the background jobs, which helped with the resource availability.
    As a side note, you might want to confirm that the performance issues are caused only by the high volume. An ABAPer or a Basis admin should be able to run a performance trace. There might be an inefficiency in the process that could be adding to the performance issue as well.
    Hope this helps.

  • Best practices to design OWB Process Flows

    Hi All,
    I am using OWB to develop a historical data warehouse. For this I have created a few mappings that load data from source tables to staging tables, and then from staging to target tables.
    Now I am designing process flows to execute these mappings. Can someone suggest best practices for designing process flows?
    I have a few questions in mind that I am looking for answers to:
    1) Should I keep the mappings that load staging and target tables in different process flows or in one?
    2) Do I need to include an email activity for each mapping failure, or in other words, how can I consolidate all mapping errors into one email?
    3) I am using two outgoing transitions from each mapping, one for success and another unconditional; is that the right method, or should I use three transitions: one for success, one for warning, and one for error?
    4) I have created some email activity templates that I am using in process flows. For the development environment I have used test from_address and to_address values. But when I actually deploy these process flows in production, these addresses will need to be changed. Will I need to change these addresses individually in each process flow, or in other words, what is the best way to deploy process flows?
    I am using a 10gR2 database and OWB.
    Any help will be appreciated.
    Thanks,
    VKumar


  • Goods return best practice

    Hi,
    Kindly help me out in the following scenario.
    I raise a purchase order for 100 units of goods at $2. I do a MIGO and receive the whole quantity into my inspection stock (QM is activated). 80 items are OK and 20 are not, so 80 of them are transfer posted to unrestricted stock and the rejected 20 are transfer posted to restricted stock.
    My question is: what is the best-practice SAP business process I need to follow in order to
    1) return the rejected 20 back to the vendor, and then
    2) replace them with another 20 items from the same vendor?
    3) what reports can I run to see the goods return information in SAP?

    To answer the questions:
    1) return the rejected 20 back to the vendor and then
    You can go to the MIGO transaction, choose "return delivery", and state the amount to be returned.
    2) replace them with another 20 items from the same vendor?
    Basically, you can post several goods receipts against one purchase order. So after your return delivery, do a goods receipt with reference to the same PO number.
    3) what reports can I run to see the goods return information in SAP?
    What kind of reports do you want? If you want to see the material for each material document number, the MB51 report would be the best option. Either way, you can create a custom report for more specific needs.
    Hope it helps

  • Best Practice for sugar refinery process

    Hello, my company needs to deploy a new business concerning raw sugar refining,
    so we need to analyze the business requirements and propose a process for refinery management.
    step 1: arrival of goods at the docks
    step 2: raw sugar needs to be charged into our stock (quantity and value) but is not our property
    step 3: goods need to be delivered to our plant (we pay the transport as a service for our business partner)
    step 4: goods need to be verified (for quality and quantity) and accepted by operators
    step 5: goods are processed in the refinery plant; we need to track timing, costs, quantities and the human resources employed (for cost remittance and transfer)
    step 6: sugar is delivered to other industrial plants and warehouses and finally sold (but is not our property); for us it is like a refining service
    step 7: we need to trace each production lot from the raw sugar's arrival at the docks up to step 6
    step 8: inventory and maintenance costs need to be traced, because our profit is a part of this refining service fee, reduced by the costs incurred
    Any suggestions to find the right best practice?
    I'm not a skilled BPS consultant; I was looking at oil refining, but it is not the same process, so what can I look for?
    Thanks

    Hi Kumar,
    In order to have consignment in SAP you need master data such as the material master, vendor master, and a purchase info record of consignment type. You have to enter item category K when you enter the PO. The goods receipt posting into vendor consignment stock will be non-valuated.
    1. The initial step starts with raising a purchase order for the consignment item.
    2. The vendor receives the purchase order.
    3. GR happens for the consignment material.
    4. Stocks are received and placed in consignment stock.
    5. Whenever we issue to production, or transfer post (using movement type 411) from consignment to own stock, a liability occurs.
    6. Finally comes the settlement, using MRKO. You settle the amount for the goods that were consumed during a specific period.
    regards
    Anand.C
