Post processing records in MF47 and COGI

Hi All,
I have a query....
In standard SAP, will the postprocessing records created by Repetitive Mfg. be visible in transaction COGI?
And will the postprocessing records created by Discrete Mfg. be visible in transaction MF47?
Regards,
Vinayak.

Hi ,
In general, for Discrete Mfg. the post processing records are checked and cleared in transaction COGI, whereas for REM it is MF47.
You will be able to view the REM postprocessing records in MF47; this is standard SAP behaviour, so I can say there is no bug in your system.
Hope this will help you.
Regards
radhak mk

Similar Messages

  • Post processing records deletion log in MF47 -reg

    Hi...
    How can we find out which users deleted post processing records in MF47?
    Some of the post processing records in MF47 are being deleted by users without being processed in time.
    We would like to know where this log is kept, so that we can see
    which user deleted which records on which date.
    regards,
    madhu kiran

    hi,
    I posted earlier on the deletion of MF70 records (backdated backlogs which could not be processed).
    Now I am asking about tracking the deletion of post processing records in MF47.
    If a record is deleted, is there really no way to track when and by whom it was deleted?
    regards,
    madhu kiran

  • Post Processing records in BAPI_PRODORDCONF_CREATE_TT

    Hi All,
    I am using BAPI_PRODORDCONF_CREATE_TT for production confirmation with the auto goods receipt (GR) feature.
    But in case any error in the goods movement occurs, the system creates a post processing record (visible in transaction COGI).
    We don't want entries in COGI. In case any error occurs, the system should terminate the session. The same control is maintained in SPRO, but the BAPI overrules it.
    Kindly let me know which settings to use in the BAPI to avoid COGI.
    In the "Post Wrong Entries" field of the BAPI, I have used ' ' (blank) as the value.
    Please suggest.

    Hi Stuti,
    What I suggest is to use 2 BAPIs instead of 1.
    First use BAPI_GOODSMVT_CREATE to carry out the goods movements.
    Only if this BAPI is successful, execute the BAPI you are using to carry out the confirmations.
    This is the best way to control your confirmation for failed goods movements.
    Regards,
    Yogesh
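    The two-BAPI pattern above can be sketched as follows. This is a minimal illustration in Python, assuming a generic `call_bapi` callable as a stand-in for whatever RFC layer you use (e.g. pyrfc's `Connection.call`); the helper function and its parameter dictionaries are hypothetical, not part of the real BAPI interface:

```python
def confirm_with_goods_check(call_bapi, gm_params, conf_params):
    """Run the goods movement first; confirm only if it succeeded.

    `call_bapi(name, **params)` is an assumed stand-in for your RFC
    layer (e.g. pyrfc's Connection.call). Returns (posted, messages).
    """
    gm_result = call_bapi('BAPI_GOODSMVT_CREATE', **gm_params)
    # BAPIs report problems in the RETURN table; TYPE 'E'/'A' are errors.
    errors = [m for m in gm_result.get('RETURN', [])
              if m.get('TYPE') in ('E', 'A')]
    if errors:
        # Goods movement failed: roll back and do NOT confirm,
        # so no COGI post-processing record is created.
        call_bapi('BAPI_TRANSACTION_ROLLBACK')
        return False, errors
    # Goods movement OK: post the confirmation and commit both.
    conf_result = call_bapi('BAPI_PRODORDCONF_CREATE_TT', **conf_params)
    call_bapi('BAPI_TRANSACTION_COMMIT', WAIT='X')
    return True, conf_result.get('RETURN', [])
```

    The key point is that a rollback is issued when the goods movement fails, so the confirmation BAPI is never called and nothing reaches COGI.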

  • TECO status for Prod order for which Post processing record exists

    Dear PP Gurus,
    We use BAPI_PRODORD_COMPLETE_TECH to TECO production orders. The program does not TECO orders if there are post processing records existing for the order, giving the message "Postprocessing records for order 1000068 prevent technical closing". I think this is a standard BAPI message.
    When the same BAPI is run in foreground mode, it gives a confirmation prompt, "Order 1000068: There are still reprocessing records. Set order to Technically Complete?" with Yes/No/Cancel options. You can save the TECO by selecting Yes.
    Is there a way to achieve this in background mode?
    Thank you much in advance for help,
    Regards,
    Jatin

    Hello Jatin,
    Call function DIALOG_SET_NO_DIALOG before the BAPI; the system then handles the BAPI as in non-dialog mode and the pop-up does not appear.
    Refer KBA 1986661 - PP-SFC: BAPI_PRODORD_COMPLETE_TECH Popup
    Best Regards,
    R.Brahmankar

  • Lightroom 5 slowdown, my post processing times have tripled and the develop tasks take multiple seconds for one adjustment, Please Help. . . .

    My post processing times have tripled and the develop tasks take multiple seconds for one adjustment. Please help.

  • CIF Post Processing

    Dear All,
    In the CIF post processing, either via CIF Cockpit or via other post processing transactions, in APO or ECC, we need to go into each post processing record to find the reason for the post processing record being created and to check the status for Error / Warning.
    Is there a way to view all the post processing records at APO or ECC system level?
    Is there any method to analyze the reason for the post processing records, other than going into each one of them to find out?
    I saw on the web that there is a program where, by adding some simple coding, a report can be created to view the reason for the post processing record being created.
    Regards,
    Sridhar R

    Hi Senthil,
    Thanks for the reply.
    The alerts system is already in place.
    We are looking at a report where on execution, it would give the reason for the post processing record / error creation.
    Based on the report corrective action could be taken, by appropriately notifying the respective teams / users in corresponding plants / regions.
    Regards,
    Sridhar R

  • Post Processing List

    Hello all,
    I have a question related to the Post Processing List (transaction MF47), where the backlogs resulting from backflush appear.
    If, by mistake, I deleted some postprocessing records (instead of processing them), how could I see these deleted records later? And how could I process them later on, if they need to be processed? In other words, is there an "undo" for the action of deleting postprocessing records?
    Thank you very much,
    Edith

    Dear Catalin Ignat,
    Once a post processing record is deleted in MF47,
    there is no way to fetch the data back.
    It will not show in AFFW, AUFM, or COGI.
    SAP only provides a temporary safeguard via the "Reprocessing allowed" setting in the REM profile.
    So once deleted unknowingly, it is lost.
    Regards
    Mangal

  • Post processing errors in REM

    Hi Gurus,
    What is significance of post processing in REM?
    We have 4 options.
    1) MF45 – Post Process Individual
    2) MF46 – Post Process Collective
    3) MF47 – Post Processing List
    4) COGI – Post Process Individual Components
    I am able to use COGI and clear errors that occurred in backflushing,
    but I cannot understand the first 3 transactions above.
    The system takes me to all materials where correction is needed, e.g. in MF47. When I select a material and change a post processing record, it takes me to the screen of the post processing list of components.
    Here I observe that the bell, batch determination and stock determination are disabled, whereas these are enabled in COGI. What can be the reason behind this? Are any settings missing in REM?
    Pl. help me.
    Srini

    Hi Srini,
    To avoid this you can confirm the following things.
    1. Ensure that sufficient stock is available in the backflushing locations of each material.
    2. To prevent the generation of the post-processing list, you can block the BOM from getting backflushed if sufficient stock is not available in the specified location for the BOM components. To do this, change the REM profile to "002" in the MRP4 view for the BOM material (if the profile is not available, you can create it through SPRO).
    3. To clear the existing backlog, use transaction MF47 and re-process (ensure that the required stock is available for each BOM component).
    Hope it will solve your problem.
    Regards,
    R.Brahmankar

  • Post Processing Backflushing Backlogs

    Hi All,
    When I want to reprocess backflushing backlogs with MF45, I cannot make any changes, such as changing the storage location or batch number, because the post processing list that appears is read-only. How can I change this screen from display mode to change mode?
    Regards,
    Fateme Goudarzi

    Dear Fateme Goudarzi,
    1) Just check your REM profile in OSP2.
    2) Under "Error creation for backflushing" there are two options:
    Option A) Create cumulative post processing records
    Option B) Also create individual post processing records
    3) If option A is ticked, go to MF47; here you can change the fields or clear the records.
    4) If option B is ticked, go to COGI; here you can change the fields or clear the records.
    Regards
    Madhu Kumar

  • Issue with Bulk Load Post Process

    Hi,
    I ran the bulk load command line utility to create users in OIM. I had 5 records in my CSV file, out of which 2 users were successfully created in OIM; for the rest I got exceptions because the users already existed. After that, if I run the bulk load post process for LDAP sync, password generation and notification, it does not work even for the successfully created users. Ideally it should sync the successfully created users. However, if there is no exception during the bulk load command line utility, the LDAP sync works fine through the bulk load post process. Any idea how to resolve this issue and sync the users in OID which were successfully created? Urgent help would be appreciated.

    The scheduled task carries out post-processing activities on the users imported through the bulk load utility.

  • Post processing problem

    Hi all,
    I am facing an issue: I have done a booking through MFBF and some parts got stored for post processing. The problem is that when I do a document-specific reversal, the parts which got stored for postprocessing are not getting reversed, while the other parts get cleared. Kindly help.
    Prashant.Pillai
    SAP PP Consultant

    Hi,
    There are 2 ways to solve the problem.
    1. Clear the backflush error using MF47 and reverse both the documents.
    2. Do the material document reversal and delete the post processing error in MF47, because the error in MF47 is actually not posted and is under reserve for that header material. This option is only good for testing, not for an actual production scenario.
    Regards,
    Gaurav Mehra

  • Audio post processing and CS5

    I have a few questions regarding audio post processing done outside of PrPro. The content is music that was recorded at 48 kHz/24-bit. After post processing, it is in a 32-bit file, which is then combined with a video clip via Premiere.
    1)  Does Premiere change the audio bit rate when it compresses video clips during the DVD burn process?
    2) Does Premiere add dithering to the audio signal, prior to the compression process?
    3) Is it desirable to add dithering to audio clips that are added to video clips?
    Thanks,
    Steve

    Hi Hunt,
    Thanks.  A couple of questions - I am not considering DVD-audio, but audio that has been recorded off-camera, edited, and then combined with the corresponding video clips.
    1) Does 24-bit audio get converted to 16-bit audio by Premiere? If so, this suggests that adding dither to audio files used within video clips could be beneficial.
    2) I am guessing BD refers to Blu-ray. Yes? If so, is it advisable to up-sample audio clips or leave them at the sample rate they were recorded at?
    Thanks,
    Steve

  • PGI delay posting records in LIKP and LIPS

    Hi
    We are facing a problem with PGI. When we process PGI entries with a user ID that has the SAP_ALL profile, the system processes the records in seconds.
    When we process PGI entries with a user ID that has only the required authorizations for PGI, the system takes much longer posting the records in LIPS and LIKP, and it also locks the material.
    We have not been able to find out why the system behaves differently.
    Thanks,
    Yogesh

    Yes, I should elaborate a little more, and also add some info.
    There are two entries in CO09 as "total records": one for customer orders and the other for a delivery. If you double-click on both, they do not "unfold" into the detailed view, as the correct "total records" do.
    We are using APO, but not for the availability check, as far as I understand. I do not know how to check this in customizing, but when debugging, PERFORM AVAILABILITY_CHECK_R3 is used inside the LATPCU05 source code,
    and inside SAPLATPC / LATPCF0A the CALL FUNCTION 'AVAILABILITY_CHECK_SERVER' function is called with a destination that is the R/3 name...
    The information displayed in CO09 corresponds to a delivery and an order that are not pending. I know this only because the quantities fit, not because there is any information that tells me so. The delivery was deleted and the order was served with another delivery.
    No other order is pending in the system at the moment.
    Some trace of my debugging:
    From what I see, in function module CALL FUNCTION 'STOCK_RECEIPT_ISSUE_READ' in SAPLATP0 / LATP0U04, table S_ATPKXI is filled with the wrong data.
    Inside that function module:
      read sales requirements: individual records
              PERFORM vbbe_read.
    and after that this little piece of code:
    SB_READ VBBE.
    (a DEFINE I am not so familiar with...)
    and we get to
    IMPORT P_ATPMX FROM SHARED BUFFER ATPSB(AC) ID P_ATPBI.      in SAPLATP2 / LATP2FMA
    That ID contains data that corresponds to the material. I am not very familiar with this, but I guess it is a sort of buffer, normally there to avoid extra DB accesses. If I could just..."delete" that ID... well... just a wild idea.
    I hope that helps you to help me.

  • Direct link from Premiere Pro to Speedgrade won't open and stops at the post processing bar.

    I've just installed the new version of Speedgrade, and every time I try to open a direct link from Premiere Pro to Speedgrade, it opens and starts to load, but when the post processing bar appears it stops there and doesn't open. I've gone back to the original versions of both programmes and the same thing happens. HELP!

    Hi Vinay,
    1. I am using third-party plug-ins, and I understand that might be causing an issue, but I've also tried opening a new project with nothing in it at all, and that won't open either.
    2. Premiere Pro quits after the project loading bar or after I create a new project.
    3. I used a lot of files! mov, mts, psd, ai, png, jpeg, and AE compositions through dynamic link.
    4. The project has about five sequences. They range from 1 minute to 4 minutes.
    Thanks!

  • Best way to stream lots of data to file and post process it

    Hello,
    I am trying to do something that seems like it should be quite simple, but I am having some difficulty figuring out how to do it. I am running a test that has over 100 channels of mixed sensor data. The test will run for several days or longer at a time, and I need to log/stream data at about 4 Hz while the test is running. The data I need to log is a mixture of different data types that includes a time stamp, several integer values (both 32- and 64-bit), and a lot of floating point values. I would like to write the data to file in a very compressed format because the test is scheduled to run for over a year (stopping every few days) and the data files can get quite large. I currently have a solution that simply bundles all the data into a cluster, then writes/streams the cluster to a binary file as the test runs. This approach works fine but involves some post processing to convert the data into a format, typically a text file, that can be worked with in programs like Excel or DIAdem. After the files are converted into a text file they are, no surprise, a lot larger (about 3 times) than the original binary file size.
    I am considering several options to improve my current process. The first option is writing the data directly to a TDMS file, which would allow me to quickly import the data into DIAdem (or Excel with a plugin) for processing/visualization. The challenge I am having (note, this is my first experience working with TDMS files and I have a lot to learn) is that I cannot find a simple way to write/stream all the different data types into one TDMS file and keep each scan of data (containing different data types) tied to one time stamp. Each time I write data to file, I would like the write to contain a time stamp in column 1, integer values in columns 2 through 5, and floating point values in the remaining columns (about 90 of them). Yes, I know there are no columns in binary files, but this is how I would like the data to appear when I import it into DIAdem or Excel.
    The other option I am considering is just writing a custom data plugin for DIAdem that would allow me to import the binary files that I am currently creating directly into DIAdem. If someone could provide me with some suggestions as to which option would be best, I would appreciate it. Or, if there is a better option that I have not mentioned, feel free to recommend it. Thanks in advance for your help.

    Hello,
    Here is a simple example; of course, here I only create one value per iteration of the while loop, for simplicity. You can also set properties of the file, which can be useful, and set up different channels.
    Besides, you can use multiple groups to have more flexibility in data storage. You can think of channels as columns and of groups as sheets in Excel, so that is how you will see your data when you import the TDMS file into Excel.
    I hope it helps; of course there are many more advanced features with TDMS files, read the help docs!
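    For the custom-binary route described in the question (fixed records of one time stamp, a few integers, many floats), the layout can also be sketched outside LabVIEW. A minimal Python illustration using the standard `struct` module; the field counts here are assumptions for the example, not the poster's actual channel list:

```python
import struct
import io

# One scan: 1 double timestamp, 2 int32, 2 int64, 3 float64 values.
# (Counts are illustrative; the real test has ~90 float channels.)
RECORD = struct.Struct('<d 2i 2q 3d')  # little-endian, fixed size

def write_scan(stream, t, i32s, i64s, floats):
    """Append one fixed-size binary record to the stream."""
    stream.write(RECORD.pack(t, *i32s, *i64s, *floats))

def read_scans(stream):
    """Yield (timestamp, i32s, i64s, floats) tuples back out."""
    while chunk := stream.read(RECORD.size):
        vals = RECORD.unpack(chunk)
        yield vals[0], vals[1:3], vals[3:5], vals[5:]

# Round-trip demonstration with an in-memory stream.
buf = io.BytesIO()
write_scan(buf, 12.5, (1, 2), (3, 4), (0.1, 0.2, 0.3))
buf.seek(0)
scans = list(read_scans(buf))
```

    Because every record has the same size, a DIAdem data plugin (or any reader) can seek directly to scan N at byte offset N * record_size, which keeps post processing cheap compared with converting everything to text.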
