ABAP or FM for post-processing of infopackage

Gurus,
Does anyone know the name of the ABAP program or function module that handles the post-processing of an InfoPackage? For instance, we have an InfoPackage that needs to delete prior requests whose selection criteria match. The InfoPackage is set up to perform the delete, and it works if we execute the InfoPackage manually. However, we would like to run it in the middle of a batch job that executes several ABAP programs. We already have an ABAP program that runs the BAPI to execute the InfoPackage, but it does not delete the prior request from our cube.
Note:  We are running an old APO system (BW 2.1C), so we cannot use process chains. 
Thanx in advance for your help.

Roberto, thanks for your prompt reply (as always). 
Actually, it seems that the deletion of the prior InfoPackage request is performed before executing the InfoPackage again (not after); I apologize. I will try your suggestion, though. If this is actually the correct FM, then I will award you full points!
I must admit, though, that understanding the parameters of the FM is difficult. Can you or anyone else provide more details?

Similar Messages

  • BADI for post processing of IDOC BOMORD

    Hi experts,
    I want a BADI for post-processing of IDoc BOMORD. After IDoc processing, the BOM gets updated in the system, and I want to schedule the explosion of the BOM in the process order after that. Please update me with the solution.
    Thanks
    Nitin kapoor


  • Row-Based Only option for post processing operator

    Hi,
    Have a question on the role of the "Row-Based Only" option for post-processing operators. I am trying to understand how a post-processing operation can be done in row-based mode. A coded PL/SQL procedure in a post-processing operator is expected to run only once, so I am not clear on how this option might affect the execution of the transformation. Any clarifications would be appreciated.
    Thanks,
    Mazen

    Hi Carsten,
    My question is more on what is the use of the "Row-based only" checkbox that shows up in the properties window for a post-processing operator. How does this checkbox affect the execution of the post-processing transformation?
    Regards,
    Mazen

  • Workflow for Post Processing Magic Lantern RAW Video?

    Dear Adobe Community,
    I am interested in learning what steps/workflow fellow DSLR, Magic Lantern users take to Edit, Color Grade and add FX to their RAW video?  Here is my untested thought process so far:
    Adjust the Exposure, White Balance, Lens Distortion, Etc of DNGs in Adobe Camera Raw.
    Import DNG image sequence into Premiere Pro for Editing
    Use Dynamic Link to Color Grade in Speed Grade
    Use Dynamic Link to add effects in After Effects
    Render out in Premiere Pro
    One of the concerns I have is that I've read on the Magic Lantern forums that Premiere Pro will reduce 14-bit DNG to 8-bit. Is this something you've encountered? Is there a workaround?
    I am fairly new to all of this and would greatly appreciate any feedback to point me in the right direction for Post Processing Magic Lantern Raw Video files.  I have a subscription to Lynda.com as well if you have any suggestions for courses to look at.
    Thank-you for your time.

    Nothing Adobe makes can read MLV or RAW video files, so you have to use something else to convert them into CinemaDNG (a folder of images) or a high bit depth movie file (ProRes etc.) - there are many tools listed in the ML forum which can do that.
    Premiere Pro can import cDNG footage but it struggles to play back in real time, and will not allow you to grade it using Camera Raw. To do that you have to use After Effects. The workflow is supposed to be "ingest in Prelude, edit in Premiere Pro, color in SpeedGrade", but quite frankly the learning curve for SG is massive. Adobe's attitude to CinemaDNG is strange; although the standard is open, support is only coded for very specific models of camera and the CC suite assumes a Hollywood workflow where professional colorists (who spend years learning how to use the software) operate offline from the rest of the edit, usually after all the cuts are made. It's not set up for a typical lone DSLR filmmaker, which is quite frankly why a lot of ML users prefer another well-known way to resolve the problem.
    The 'fast and dirty' approach to getting a cinematic grade in Premiere would be to apply LUT files to the footage using the Lumetri effect (basically running SpeedGrade presets), or you can hand-grade it using the inbuilt Color Corrector tools - but that doesn't cover the other important stuff that ACR can do, such as lens correction, alignment, noise reduction, camera calibration profiles, etc. - for that, you should import the cDNG footage into After Effects (which does support ACR), correct and calibrate the frames, then export back to something high quality that Premiere can work with (such as DNxHD or ProRes 444). You won't notice any visual loss in quality, and Premiere then will play the timeline without struggling, but it takes a loooong time to chew through every clip.
    As to whether Adobe applications will ever support MLV files natively: since Adobe relies on close partnerships with camera manufacturers, including Canon, you can guess what would happen if Adobe ever endorsed it. The only scenario I can imagine is if MLV were adopted in-camera by a major DC manufacturer (Alexa, etc.) so vendors could support it without slapping Canon in the face. Flying pigs come to mind.

  • Workflow for post processing offline images

    I'm a professional real estate photographer who normally edits his own images after a photoshoot. I've recently become so busy that I'd like to outsource the post processing of my images to another individual who also uses Photoshop.
    Is there a way to send this individual lower resolution images so he can work on them and then have him send me the metadata from his edits, so I can apply them to the images I have on my system? The reason for this question is because I'd like to minimize the size of the images I'm sending to the editor, but still be able to produce high resolution stills.
    I understand there is a way to work with RAW images and save the metadata, but that doesn't fix the file size issue, and you can only save certain aspects like white balance, levels, etc... I really need a way to capture his history and save that to a file, that way if he were to use a certain filter, or brush on the image, the same results would be applied to my images.
    Thanks in advance for your help!

    You can't apply the history log; it is just information on what has been done to the file. You would have to read this file and simply repeat the same steps on the higher-resolution version of the file.
    As for the replacement workflow I mention: you would open the low-resolution file that your worker created. Then you would use the Place command (under the File menu) to add your original high-resolution version of the image as a new layer. This will make your image a Smart Object layer. Move this layer to the bottom of the layer stack, just above Background, if that is an option. Make sure your Smart Object is at 100% scale (when you place the file, it is shrunk to fit the existing document boundaries); you can check this by using Free Transform (under the Edit menu) and verifying the values at the top. Then use the Reveal All command (under the Image menu) to resize the canvas to fit the full-resolution version of the image. However, this will only work if the editing involved was pretty much nothing but adjustment layers. Any pixel edits will have to be resized as well and may lose detail or not line up correctly once enlarged.

  • OVD mappings: support for post-process filtering

    Hi all,
    does anyone know if post-process filtering is possible via Python's mapping scripts, using a method similar to Java's com.octetstring.vde.util.FilterUtils.evalFilter(Entry e, Filter f)?
    This functionality is required when you have to work with LDAP filters in inbound traffic that contain references to virtual attributes, i.e. attributes that do not really exist on the backend data store and have no one-to-one mapping with a real attribute/column. Such filter elements have to be omitted from the inbound traffic and applied to the outbound traffic after the virtual attribute has been constructed.
    I know how to process the filter elements and how to pass the "infected" element of the filter from the inbound to the outbound via the request method, but I don't know if there is an easy way to apply the filter on the outbound results.
    Think of an employeeType attribute (this is the virtual attribute) that is constructed based on the prefix of the value of the employee_code (this is a real column in DB). Now think of a user filter like (employeeType=something)

    Have you attempted something similar before? Have you done it in python? Regarding this custom logic I see two issues
    a) Do I have to loop over the result set to process each entry in the results? Is there a ready method to apply the LDAP filter directly on the result set?
    b) In order to cope with substrings within LDAP filters, do I have to build custom code to simulate LDAP regexps and escape codes? I would welcome a method to apply a filter directly, at least on entries, if the first option is not possible.
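    The post-process step described above (construct the virtual attribute, then apply the held-back filter element on the outbound results) can be sketched in plain Python. This is a standalone illustration, not OVD's API: the entry layout, the prefix-to-type mapping, and the helper names are all assumptions, and a real mapping script would plug the equivalent logic into OVD's outbound hook.

```python
# Sketch of post-process filtering: derive the virtual attribute
# employeeType from the prefix of employee_code, then evaluate a
# simple (attr=value) equality filter over the outbound result set.
# The mapping and entry layout below are illustrative assumptions.

PREFIX_TO_TYPE = {"C": "contractor", "E": "employee"}  # assumed mapping

def add_virtual_attrs(entry):
    """Construct employeeType from the employee_code prefix."""
    code = entry.get("employee_code", "")
    entry["employeeType"] = PREFIX_TO_TYPE.get(code[:1], "unknown")
    return entry

def matches_equality(entry, attr, value):
    """Evaluate a simple (attr=value) LDAP-style equality filter."""
    return entry.get(attr) == value

def post_process(results, attr, value):
    """Loop over the outbound result set and keep matching entries."""
    return [e for e in (add_virtual_attrs(dict(r)) for r in results)
            if matches_equality(e, attr, value)]

results = [{"cn": "alice", "employee_code": "E100"},
           {"cn": "bob", "employee_code": "C200"}]
print(post_process(results, "employeeType", "contractor"))
```

    Substring filters and escape handling, as noted in (b), would still need extra custom code on top of this equality case.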

  • ABAP Trial NW04s installation post-processing ?

    Dear All,
    I have installed the ABAP NW2004s trial version and it runs fine (no errors, I can log on to the system). However, I want to use it to experiment with Basis activities, and e.g. DB02 does not run. Can anyone offer me some tips to get it working?
    What about Support Packages? I learned that SAP only delivers SPs through Solution Manager's Maintenance Optimizer. Surely they don't expect each trial user to also run a SolMan on, e.g., a laptop? Is there a way around this? Has anyone loaded SPs on a trial edition before?
    TIA,
    Johan,
    Belgium

    check this weblog
    <a href="/people/thomas.jung/blog/2006/07/03/apply-support-packages-to-the-sdn-abap-as-sneak-preview">Apply Support Packages to the SDN ABAP-AS Sneak Preview</a>

  • Problem caling ConcRequest with Post Processing from code using fnd_request

    Gurus,
    Background:
    I am working on an XML Publisher report using Reports6i. I have defined a report in Applications. This report generates XML data, and as part of post-processing I specify the template to pick under the "Upon Completion" section. The report works fine when I submit it from the SRS window.
    (This is possible because, with XML Publisher installed, we can specify a template for any RDF report, and it is processed as part of post-processing.)
    Now I have to invoke the same report programmatically. I am using
    l_request_id := fnd_request.submit_request (  -- submit_request is a function; capture the returned request id
        application => 'PO',
        program     => 'XXTPOREP',
        description => '',
        start_time  => '',
        sub_request => FALSE,
        argument1   => 'R',
        argument2   => CHR (0),
        argument3   => CHR (0),
        argument4   => CHR (0),
        argument5   => CHR (0));
    My query is: how do I specify the parameters for post-processing the request, to ensure that the output is generated with the template?
    Thanks in Advance

    before submitting the request add this line
    v_request_status := Fnd_Request.add_layout('PO','XXTPOREP','en','US','PDF');
    This is with the assumption that you need a pdf document as the output and the template language and territory is English and United States

  • Formatting / Post Processing of Exported Excel Sheet

    Hi ,
    Issue:
    The columns in the Excel sheet are not fully visible when we export a report from the CR viewer to Excel. So we had planned on post-processing the Excel sheet using VBA or .NET, etc., to format it. This would happen on the server side, meaning the end user would never see the formatting process, only the final report. Is it possible to achieve this?
    NOTE:
    1) VBA or .NET cannot be run on the Linux server, but we could take advantage of the Windows server in the presentation layer.
    Architecture (presentation layer):
    Linux Server {Crystal RAS Server} <--> Windows .NET Server {GUI CR Viewer Application} <--> End User
    Our Proposed Solution:
    1) Create a GUI application that has the CR viewer in it.
    2) Create a separate Excel export button on the webpage.
    3) When the user wants to export the report, he clicks the button.
    4) The GUI application saves the exported Excel sheet from the RAS server to the Windows server.
    5) Then the .NET or VBA code is applied to process/format the Excel sheet.
    6) When complete, the end user is prompted to save the report to a local disk.
    My questions:
    1) Can this be achieved ?
    2) what would be the best way to handle this ?
    3) what should be the process flow ?
    4) what are the things to be considered while planning for such a design ?
    Regards,
    Ramkumar Govindasamy

    So you're looking at using RAS .NET SDK on the Application Layer to export a Crystal Report to Excel, save to temp file on the Application Layer machine, process that temp file, and stream that back out to the client web browser.
    Considerations:
    1. You need to create your own custom UI button to trigger the process, since the .NET Web Forms Crystal Reports viewer won't have the hooks to customize the Excel export.
    2. Running Excel VBA from your web app may be problematic. You'd have to be particularly careful if the system is under load, since Excel VBA / COM-Interop isn't designed for high throughput. Under high load, you may get file locking, or the COM-Interop layer may just refuse to process. It's pretty common to try to catch the exception and retry if you encounter this.
    You'd likely not find anyone here familiar with 2 above, but 1 is fairly common.
    Sincerely,
    Ted Ueda
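    The catch-and-retry approach from consideration 2 can be sketched generically. This is a language-neutral illustration in Python, not .NET/COM-Interop code: `retry` and `flaky_format` are hypothetical names, `flaky_format` stands in for the Excel formatting step, and the attempt count and delays are assumptions.

```python
import time

def retry(operation, attempts=3, delay_s=0.1):
    """Run operation(); on failure, wait and retry up to `attempts` times.
    Mirrors the catch-and-retry approach for flaky post-processing calls."""
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except OSError:  # e.g. the exported file is locked by another process
            if attempt == attempts:
                raise  # out of attempts: let the caller handle it
            time.sleep(delay_s * attempt)  # back off a little more each time

# Hypothetical stand-in for the Excel formatting step: fails twice
# (simulating a locked file), then succeeds.
calls = {"n": 0}
def flaky_format():
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("file locked")
    return "formatted"

print(retry(flaky_format))  # succeeds on the third attempt
```

    In the real web app, the retried operation would be the COM-Interop call, and the caught exception type would match whatever the interop layer actually throws under load.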

  • Post Processing Steps after Support Package Stack Application

    I'm curious if anyone has any guidelines for post-processing steps (or pre-processing) when applying Support Package Stacks to their Development Infrastructure (Developer Workplace and the central NWDI server). We have just upgraded a couple of developers' local engines and Developer Studio from SP stack 14 to SP stack 19 and are experiencing some problems. We also applied the J2EE stack and appropriate SCA files to the NWDI server.
    After the support packs it looks like our DTR files are gone (when reimporting configuration via Developer Studio the SC's are there but there are no DC's inside of them).  Additionally, it looks like we have to manually reimport the newest versions of SAP_BUILDT, SAP_JTECHS, and SAP-JEE.  Another thing is that old Local Web Dynpro DC's are now complaining about class path problems and different versions of files.  We followed the documentation for applying SP19 exactly as the documentation says for a Java Development Usage type.  Is there a checklist or something that anyone has for steps to perform after the application of the support packs?

    I think I'm missing something. Now I see the code and DCs inside the DTR. However, when I try to import into NWDS, no DCs come in (the software components exist, but there are no DCs in them). Additionally, the CBS Web UI reports that my software components do not contain any DCs, even though I see them in the DTR. What can I look at to determine what I'm missing here?
    Thought I'd add some more info... After applying the support packs, we imported the new SAP_BUILDT, SAP_JTECHS, and SAP-JEE SCAs into our track, as we required some functionality from the newer build SCA. We also reimported our old archives back into the system by manually checking them in, assuming this would fix the problem of not seeing the source in NWDS or the DTR. After the import, the CBS no longer sees our custom DCs, but the DTR does (in both the active and inactive workspaces). When importing the dev configuration into NWDS, our custom DCs no longer appear, but SAP's standard SCAs do.

  • Post processing a report

    The report attributes page has a section for External Processing, with online help that reads:
    "Specify the URL to a server for post processing of a report. See documentation for instructions."
    I couldn't find anything in the documentation about this.
    Can someone from Oracle please explain what this is with an example?
    Thanks

    Oh man, that's really sad... I've been working on SOA for the last 9 months and am just getting back into APEX. Carl helped me heaps and answered a lot of my questions. I'm an Oracle instructor and have been teaching/using the product since it started. Carl's name was synonymous with APEX...
    I've just checked out quite a few blogs and it seems he helped many other people as well.
    thanks Tony
    Paul P : (
    By the way, I eventually found out about FOP; it's all explained at http://www.oracle.com/technology/pub/notes/technote_htmldb_fop.html and in numerous postings.

  • Post processing problem

    Hi all,
    I am facing an issue where I have done a booking through MFBF and some parts got stored for post-processing. Now the problem is that when I am doing a document-specific reversal, the parts that got stored for post-processing are not getting reversed, while the other parts got cleared. Kindly help.
    Prashant.Pillai
    SAP PP Consultant

    Hi,
    There are 2 ways to solve the problem.
    1. Clear the backflush error using MF47 and reverse both the docs.
    2. Do the material document reversal and delete the post-processing error in MF47, because the error in MF47 is actually not posted and is held in reserve for that header material. This option is only good for testing, not for an actual production scenario.
    Regards,
    Gaurav Mehra

  • CIF Post Processing

    Dear All,
    In CIF post-processing, either via the CIF Cockpit or via other post-processing transactions in APO or ECC, we need to go into each post-processing record to find the reason it was created and to check its status for Error / Warning.
    Is there a way to view all the post processing records at APO or ECC system level?
    Is there any method to analyze the reason for the post processing records, other than going into each one of them to find out?
    I saw on the web the existence of a program where, by adding some simple coding, a report can be created to view the 'reason for post-processing record' being created.
    Regards,
    Sridhar R

    Hi Senthil,
    Thanks for the reply.
    The alerts system is already in place.
    We are looking at a report where on execution, it would give the reason for the post processing record / error creation.
    Based on the report corrective action could be taken, by appropriately notifying the respective teams / users in corresponding plants / regions.
    Regards,
    Sridhar R

  • CE 71 Productive Edition SR5 post processing

    What is the procedure for post-processing CE 7.1 Productive Edition? Do I just run the config wizard only, or do I need to run the prod or dev template?
    Thanks
    Mikie

    Hi,
    See Post Installation section:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/80653293-da1c-2b10-3687-918d6323b728
    Normally, you run the DEV template, then run the wizard; but the above document gives the correct steps.
    Best regards

  • Changing user for PPF - Post Processing Framework

    Hello,
    Is there a way to change the user for a Post Processing action once the trigger is activated?
    There are times when we want a user action to kick off a PPF task, but we do not want the user to have the security to complete this extra task themselves. Also, the PPF task may create an activity document where we want a specific user ID to be shown.
    I believe the answer is no, but was wondering if anyone else out there was able to do this or had other ideas?
    Thanks.
    Mike

    Thank you, PK!
    That helped us look at our system and we do see a logical RFC connection we set up 6 years ago for workflow for when we went live.  So that did help us some.
    If we set up this logical connection, do you know how we can tie the ABAP RFC that we set up in BRF to use that logical connection? 
    Thanks.
    Mike
