Correcting GL date on a reversed receipt

Hi!
We reversed a receipt in A/R with the wrong GL date. Is there a way to easily correct this? One of our users believes that we used to be able to reverse the reversal, but we don't see this option in our current version, 11.5.10.2.
Thanks

They are actually two terms for the same thing: the date on which the transaction will be accounted for in GL, which determines the period it is posted to.
Where is it you see these terms?

Similar Messages

  • Stock transfer not showing correct expiry date for shelf-life subcontracting

    Hi Gurus,
    We have shelf life active for PPDS. We have added 5 characteristics to hold the min and max shelf life in APO so that the data is considered during pegging.
    Shelf life works perfectly fine for the production plant, where the min shelf life is determined based on the manufacture date.
    LOBM_APO_SL_MIN
    LOBM_APO_SL_MAX
    LOBM_APO_SL_UTC
    LOBM_VERAB
    LOBM_HSDAT
    The question is about the subcontracting process: the stock/inventory date comes through back-dated, so when we run the planning run the stock is treated as expired.
    Can anyone help in getting the correct expiry date for the stock instead of a back-dated value like 1982?
    Thanks in advance
    Thanks & Regards,
    Rajesh Kumar

    When you plan subcontracting in SNP, the following is generated.
    Object                   Source              Destination
    Sub Con PR (Header)      Subcontractor       Plant
    PR/STR (Child)           Plant               Subcontractor
    PR (Child)               Child Mat Vendor    Plant (or direct to Subcontractor)
    Usually the first and third items will get converted to purchase orders and will vanish as and when goods receipts are posted against them.
    The second order may not get converted in SAP. This will remain in the system, but in the next planning run it will be deleted if the quantities have already been fulfilled.
    Regards
    Nitin Thatte

  • How can you correct the data in your file which contains 1 lakh records?

    Hi friends,
    I attended an interview and they asked these questions. Do you know the answers? Please tell me.
    In a file-to-file scenario, how can we reprocess the records that failed?
    How can you correct the data in your file when it contains 1 lakh (100,000) records?
    Thanks in advance
    thahir

    Hi,
    Refer to these links; they might help you:
    Generic Approach for Validating Incoming Flat File in SAP XI - Part 1
    validating against schema file for the output XML file
    Informing the sender about bad records
    Regards,
    Nithiyanandam
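
    The usual idea behind both questions is to validate the flat file before (or after) the integration engine picks it up, split the failed records into their own file, correct only those, and reprocess just that file instead of all 100,000 records. Below is a minimal sketch of that split, in Python, with a purely hypothetical three-field record layout (the real checks would come from the target structure):

    import csv

    # Hypothetical record layout: id;name;amount - substitute the real file structure here.
    def is_valid(record):
        """Very simple per-record checks: field count, non-empty key, numeric amount."""
        if len(record) != 3 or not record[0]:
            return False
        try:
            float(record[2])
        except ValueError:
            return False
        return True

    def split_file(infile, goodfile, badfile, delimiter=";"):
        """Copy valid records to goodfile and failed records to badfile for correction."""
        with open(infile, newline="") as src, \
             open(goodfile, "w", newline="") as good, \
             open(badfile, "w", newline="") as bad:
            reader = csv.reader(src, delimiter=delimiter)
            good_writer = csv.writer(good, delimiter=delimiter)
            bad_writer = csv.writer(bad, delimiter=delimiter)
            for record in reader:
                (good_writer if is_valid(record) else bad_writer).writerow(record)

    split_file("input.txt", "input_valid.txt", "input_rejected.txt")

    The rejected file can then be corrected and dropped back into the sender directory, so only the failed records go through the file-to-file scenario again.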

  • Reversed Receipt Handling in CE Open Interface

    Oracle Support has stated that in 11.0.3, it is not possible to interface reversed receipts (NSF, etc.) from an external system through the reconciliation open interface.
    Does anyone know if there are plans to enable this functionality, as cash management does not provide full support to interface with an external AR sub-system?
    Thanks.
    Francois Gendron
    La Societe d'Informatique Gendron Inc.
    Tel: (514) 212-3994

    Please visit the following links.
    Oracle Apps: 11.5.10 / R12 ROI How to Receive Intransit Shipment (Inter-org transfer) for Lot / Serial Controlled Items …
    How to process an INTRANSIT SHIPMENT (INTER-ORG TRANSFER) via 11.5.10 ROI

  • Ensure field sequence is correct for data for multiple source structures

    Hi,
    I'm using LSMW with IDoc message type 'FIDCC2', basic type 'FIDCCP02'.
    I'm getting an error that packed fields are not permitted.
    I'm also getting 'Ensure field sequence is correct for data for multiple source structures'.
    Source Structures
           HEADER_STRUCT            G/L  Account Document Header
               LINE_STRUCT              G/L Account Document Line
    Source Fields
           HEADER_STRUCT             G/L  Account Document Header
               BKTXT                          C(025)    Document  Header Text
               BLART                          C(002)    Document Type
               BLDAT                          DYMD(008) Document Date
               BUDAT                          DYMD(008) Posting Date
               KURSF                          C(009)    Exchange rate
               WAERS                          C(005)    Currency
               WWERT                          DYMD(008) Translation Date
               XBLNR                          C(016)    Reference
               LINE_STRUCT               G/L Account Document Line
                   AUFNR                          C(012)    Order
                   HKONT                          C(010)    G/L Account
                   KOSTL                          C(010)    Cost Center
                   MEINS                          C(003)    Base Unit of Measure
                   MENGE                          C(013)    Quantity
                   PRCTR                          C(010)    Profit Center
                   SGTXT                          C(050)    Text
                   SHKZG                          C(001)    Debit/Credit Ind.
                   WRBTR                          AMT3(013) Amount
    I have changed the PAC3 fields to character fields of the same length to avoid the error message that no packed fields are allowed.
    Structure Relations
           E1FIKPF FI Document Header (BKPF)         <<<< HEADER_STRUCT G/L  Account Document Header
                   Select Target Structure E1FIKPF .
               E1FISEG FI Document Item (BSEG)          <<<< LINE_STRUCT   G/L Account Document Line
                   E1FISE2 FI Document Item, Second Part of E1FISEG   (BSEG)
                   E1FINBU FI Subsidiary Ledger (FI-AP-AR) (BSEG)
               E1FISEC CPD Customer/Vendor  (BSEC)
               E1FISET FI Tax Data (BSET)
               E1FIXWT Extended Withholding Tax (WITH_ITEM)
    Files
           Legacy Data          On the PC (Frontend)
               File to read GL Account info   c:\GL_Account.txt
                                              Data for Multiple Source Structures (Sequential Files)
                                              Separator Tabulator
                                              Field Names at Start of File
                                              Field Order Matches Source Structure Definition
                                              With Record End Indicator (Text File)
                                              Code Page ASCII
           Legacy Data          On the R/3 server (application server)
           Imported Data        File for Imported Data (Application Server)
               Imported Data                  c:\SYNERGO_CREATE_LCNA_FI_GLDOC_CREATE.lsmw.read
           Converted Data       File for Converted Data (Application Server)
               Converted Data                 c:\SYNERGO_LCNA_FI_GLDOC_CREATE.lsmw.conv
           Wildcard Value       Value for Wildcard '*' in File Name
    Source Structures and Files
           HEADER_STRUCT G/L  Account Document Header
                         File to read GL Account info c:\GL_Account.txt
               LINE_STRUCT G/L Account Document Line
                           File to read GL Account info c:\GL_Account.txt
    File content:
    Document  Header Text     Document Type     Document Date     Posting Date     Exchange rate     Currency     Translation Date     Reference     
    G/L Account document     SA     20080401     20080409     1.05     CAD     20080409     Reference     
    Order     G/L Account     Cost Center     Base Unit of Measure     Quantity     Profit Center     Text     Debit/Credit Ind.     Amount
         44000022                    1040     Line item text 1     H     250
         60105M01     13431     TO     10          Line item text 2     S     150
    800000     60105M01                         Line item text 3     S     100
         60110P01     6617     H     40          Line item text 4     S     600
         44000022                    ACIBRAM     Line item text 5     H     600
    The file structure is as follow
    Header titles
    Header info
    Line titles
    Line1 info
    Line2 info
    Line3 info
    Line4 info
    Line5 info
    Could someone point me in the right direction?
    Thank you in advance!
    Curtis

    Hi,
    Thank you so much for your reply.
    For example
    I have VBAK (header structure)
              VBAP (item structure)
    My file should be like this, I think:
    Identification content         Fieldnames
         H                               VBELN      ERDAT     ERNAM        
                                          Fieldvalues for header
          H                              1000          20080703   swapna
    Identification content         Fieldnames
        I                                   VBELP     AUART 
                                          Fieldvalues for item
        I                                  001             OR
                                           002             OR
    Is this format correct?
    Let me know whether I am correct or not.
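
    For what it's worth, here is a minimal sketch of generating a test file in the identification-column layout sketched above (Python; the field names VBELN/ERDAT/ERNAM and VBELP/AUART come from this thread, the values are made up, and whether this exact layout is accepted still depends on your source structure and file settings in LSMW):

    # Each record starts with an identification column ("H" = header/VBAK, "I" = item/VBAP),
    # followed by that structure's fields in the declared order, tab-delimited.
    HEADER_FIELDS = ["VBELN", "ERDAT", "ERNAM"]
    ITEM_FIELDS = ["VBELP", "AUART"]

    rows = [
        ("H", {"VBELN": "1000", "ERDAT": "20080703", "ERNAM": "swapna"}),
        ("I", {"VBELP": "001", "AUART": "OR"}),
        ("I", {"VBELP": "002", "AUART": "OR"}),
    ]

    def to_record(ident, values, fields):
        """One tab-delimited record: identifier first, then the fields in declared order."""
        return "\t".join([ident] + [values.get(f, "") for f in fields])

    with open("sales_orders.txt", "w") as out:
        for ident, values in rows:
            fields = HEADER_FIELDS if ident == "H" else ITEM_FIELDS
            out.write(to_record(ident, values, fields) + "\n")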

  • Thinkpad T61 reports incorrect EDID data to Xorg

    Thinkpad T61 with Quadro NVS 140M reports incorrect EDID data. The primary issue is that I want to switch to "1024x768" resolution, which is well supported by LCD projectors, but currently with the default settings the nvidia driver offers only the following resolutions. Is there any BIOS update for this?
    root#xrandr
    Screen 0: minimum 640 x 480, current 1680 x 1050, maximum 1680 x 1050
    default connected 1680x1050+0+0 0mm x 0mm
       1680x1050      50.0*    51.0     52.0
       1600x1024      53.0
       1280x1024      54.0
       1280x960       55.0
       800x512        56.0
       640x512        57.0
       640x480        58.0
    Here is the snip of Xorg.0.log
    (II) Setting vga for screen 0.
    (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
    (==) NVIDIA(0): RGB weight 888
    (==) NVIDIA(0): Default visual is TrueColor
    (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
    (**) NVIDIA(0): Option "RenderAccel" "True"
    (**) NVIDIA(0): Option "AddARGBGLXVisuals" "True"
    (**) NVIDIA(0): Enabling RENDER acceleration
    (II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
    (II) NVIDIA(0):     enabled.
    (II) NVIDIA(0): NVIDIA GPU Quadro NVS 140M (G86GL) at PCI:1:0:0 (GPU-0)
    (--) NVIDIA(0): Memory: 524288 kBytes
    (--) NVIDIA(0): VideoBIOS: 60.86.3e.00.00
    (II) NVIDIA(0): Detected PCI Express Link width: 16X
    (--) NVIDIA(0): Interlaced video modes are supported on this GPU
    (--) NVIDIA(0): Connected display device(s) on Quadro NVS 140M at PCI:1:0:0:
    (--) NVIDIA(0):     IBM (DFP-0)
    (--) NVIDIA(0): IBM (DFP-0): 330.0 MHz maximum pixel clock
    (--) NVIDIA(0): IBM (DFP-0): Internal Dual Link LVDS
    (WW) NVIDIA(0): The EDID for IBM (DFP-0) contradicts itself: mode "1680x1050"
    (WW) NVIDIA(0):     is specified in the EDID; however, the EDID's valid
    (WW) NVIDIA(0):     HorizSync range (53.398-64.075 kHz) would exclude this
    (WW) NVIDIA(0):     mode's HorizSync (42.7 kHz); ignoring HorizSync check for
    (WW) NVIDIA(0):     mode "1680x1050".
    (WW) NVIDIA(0): The EDID for IBM (DFP-0) contradicts itself: mode "1680x1050"
    (WW) NVIDIA(0):     is specified in the EDID; however, the EDID's valid
    (WW) NVIDIA(0):     VertRefresh range (50.000-60.000 Hz) would exclude this
    (WW) NVIDIA(0):     mode's VertRefresh (40.1 Hz); ignoring VertRefresh check
    (WW) NVIDIA(0):     for mode "1680x1050".
    (WW) NVIDIA(0): The EDID for IBM (DFP-0) contradicts itself: mode "1680x1050"
    (WW) NVIDIA(0):     is specified in the EDID; however, the EDID's valid
    (WW) NVIDIA(0):     HorizSync range (53.398-64.075 kHz) would exclude this
    (WW) NVIDIA(0):     mode's HorizSync (42.7 kHz); ignoring HorizSync check for
    (WW) NVIDIA(0):     mode "1680x1050".
    (WW) NVIDIA(0): The EDID for IBM (DFP-0) contradicts itself: mode "1680x1050"
    (WW) NVIDIA(0):     is specified in the EDID; however, the EDID's valid
    (WW) NVIDIA(0):     VertRefresh range (50.000-60.000 Hz) would exclude this
    (WW) NVIDIA(0):     mode's VertRefresh (40.1 Hz); ignoring VertRefresh check
    (WW) NVIDIA(0):     for mode "1680x1050".
    (II) NVIDIA(0): Assigned Display Device: DFP-0
    (WW) NVIDIA(0): No valid modes for "default"; removing.
    (WW) NVIDIA(0):
    (WW) NVIDIA(0): Unable to validate any modes; falling back to the default mode
    (WW) NVIDIA(0):     "nvidia-auto-select".
    (WW) NVIDIA(0):
    (II) NVIDIA(0): Validated modes:
    (II) NVIDIA(0):     "nvidia-auto-select"
    (II) NVIDIA(0): Virtual screen size determined to be 1680 x 1050
    (--) NVIDIA(0): DPI set to (129, 127); computed from "UseEdidDpi" X config
    (--) NVIDIA(0):     option
    (**) NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
    (--) Depth 24 pixmap format is 32 bpp
    (II) do I need RAC?  No, I don't.
    (II) resource ranges after preInit:
        [0] -1    0    0x00100000 - 0x3fffffff (0x3ff00000) MX[B]E(B)
        [1] -1    0    0x000f0000 - 0x000fffff (0x10000) MX[B]
        [2] -1    0    0x000c0000 - 0x000effff (0x30000) MX[B]
        [3] -1    0    0x00000000 - 0x0009ffff (0xa0000) MX[B]
        [4] -1    0    0xf8102400 - 0xf81024ff (0x100) MX[B]
        [5] -1    0    0xf8102000 - 0xf81020ff (0x100) MX[B]
        [6] -1    0    0xf8101c00 - 0xf8101cff (0x100) MX[B]
        [7] -1    0    0xf8101800 - 0xf81018ff (0x100) MX[B]
        [8] -1    0    0xf8101000 - 0xf81017ff (0x800) MX[B]
        [9] -1    0    0xdf2fe000 - 0xdf2fffff (0x2000) MX[B]
        [10] -1    0    0xfe227400 - 0xfe2274ff (0x100) MX[B]
        [11] -1    0    0xfe226000 - 0xfe2267ff (0x800) MX[B]
        [12] -1    0    0xfe227000 - 0xfe2273ff (0x400) MX[B]
        [13] -1    0    0xfe220000 - 0xfe223fff (0x4000) MX[B]
        [14] -1    0    0xfe226c00 - 0xfe226fff (0x400) MX[B]
        [15] -1    0    0xfe225000 - 0xfe225fff (0x1000) MX[B]
        [16] -1    0    0xfe200000 - 0xfe21ffff (0x20000) MX[B]
        [17] -1    0    0xd4000000 - 0xd5ffffff (0x2000000) MX[B](B)
        [18] -1    0    0xe0000000 - 0xefffffff (0x10000000) MX[B](B)
        [19] -1    0    0xd6000000 - 0xd6ffffff (0x1000000) MX[B](B)
        [20] 0    0    0x000a0000 - 0x000affff (0x10000) MS[B]
        [21] 0    0    0x000b0000 - 0x000b7fff (0x8000) MS[B]
        [22] 0    0    0x000b8000 - 0x000bffff (0x8000) MS[B]
        [23] -1    0    0x0000ffff - 0x0000ffff (0x1) IX[B]
        [24] -1    0    0x00000000 - 0x000000ff (0x100) IX[B]
        [25] -1    0    0x00001c60 - 0x00001c7f (0x20) IX[B]
        [26] -1    0    0x00001c20 - 0x00001c3f (0x20) IX[B]
        [27] -1    0    0x00001c18 - 0x00001c1b (0x4) IX[B]
        [28] -1    0    0x00001c40 - 0x00001c47 (0x8) IX[B]
        [29] -1    0    0x00001c1c - 0x00001c1f (0x4) IX[B]
        [30] -1    0    0x00001c48 - 0x00001c4f (0x8) IX[B]
        [31] -1    0    0x00001830 - 0x0000183f (0x10) IX[B]
        [32] -1    0    0x000001f0 - 0x000001f0 (0x1) IX[B]
        [33] -1    0    0x000001f0 - 0x000001f7 (0x8) IX[B]
        [34] -1    0    0x000001f0 - 0x000001f0 (0x1) IX[B]
        [35] -1    0    0x000001f0 - 0x000001f7 (0x8) IX[B]
        [36] -1    0    0x000018e0 - 0x000018ff (0x20) IX[B]
        [37] -1    0    0x000018c0 - 0x000018df (0x20) IX[B]
        [38] -1    0    0x000018a0 - 0x000018bf (0x20) IX[B]
        [39] -1    0    0x00001880 - 0x0000189f (0x20) IX[B]
        [40] -1    0    0x00001860 - 0x0000187f (0x20) IX[B]
        [41] -1    0    0x00001840 - 0x0000185f (0x20) IX[B]
        [42] -1    0    0x00002000 - 0x0000207f (0x80) IX[B](B)
        [43] 0    0    0x000003b0 - 0x000003bb (0xc) IS[B]
        [44] 0    0    0x000003c0 - 0x000003df (0x20) IS[B]
    (II) NVIDIA(0): Initialized GPU GART.
    (II) NVIDIA(0): Setting mode "nvidia-auto-select"
    (II) Loading extension NV-GLX
    (II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
    (II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
    (==) NVIDIA(0): Backing store disabled
    (==) NVIDIA(0): Silken mouse enabled
    (**) Option "dpms"
    (**) NVIDIA(0): DPMS enabled
    (II) Loading extension NV-CONTROL
    (WW) NVIDIA(0): Option "CalcAlgorithm" is not used
    (==) RandR enabled

    Hi,
    Please check whether you are able to open the PDF through  SOST transaction.
    http://www.sapdevelopment.co.uk/reporting/rep_spooltopdf.htm
    Best regards,
    Prashant

  • Correct end date to a present or future date - Section80 and 80C deduction

    Hi SDN,
    We are getting following exception for Section80 and Section80C deductions when click on Review button.
    "Data valid only for past is not allowed, correct end date to a present or future date. Begin date should start from or higher." We have reset the Valid From date to the current financial year starting date (4/1/2011) using BADI HRXSS_PER_BEGDA_IN, but for the Valid To date it is taking 3/31/2011 instead of 3/31/2012. Where can we change this Valid To date? Please help me.
    regards,
    Sree.

    Dear Punna rao
    Kindly check the BADI HRXSS_PER_BEGDA_IN and activate it.
    Below is the IMG path for this BADI:
    Employee Self Service --> Service Specific Settings --> Own Data --> Change Default Start Date
    Ask your technical team to activate this BADI, as it requires an access key to activate. Kindly do this and let me know the outcome, as it should solve your issue.
    Hope this Info will be useful
    Cheers
    Pradyp

  • Batch classification "Date of last goods receipt" is "00.00.0000"

    Dear all,
    I have configured all steps for batch determination for FIFO.
    It worked fine for IM (stock transfer, movement type 301) last Friday; however, I get this issue today:
    - After I do a GR (movement type 101), the batch is auto-created and the batch classification is created.
    When I use MSC2N to view the batch classification,
    the value of "Date of last goods receipt" is "00.00.0000".
    What is wrong with my configuration?
    Could you please help me fix this?
    Thank you very much for your help!
    Edited by: Ngo Quoc Hung on Dec 13, 2010 10:59 AM

    The system updates batch data in the batch master record in the following ways:
    1)  If the batch does not yet exist for the plant, it is automatically created.
    2)  If the batch already exists for the plant, the new quantity is simply posted as a receipt in the specified storage location.
    When posting a goods receipt with reference to a purchase order, the system updates the posting date as the goods receipt date in the general data of the batch.

  • The value in flexfield context reference web bean does not match with the value in the context of the Descriptive flexfield web bean BranchDescFlex. If this in not intended, please go back to correct the data or contact your Systems Administrator for assi

    Hi ,
    We have enabled a context-sensitive DFF on the Bank Branch page for the HZ_PARTIES DFF. We have created a flex map so that only the bank branch context fields are displayed on the bank branch page, and since the party information DFF is shared by the Supplier and Customer pages, we did not want to see any bank branch fields or context information on those pages.
    We have achieved the requirement, but when we open existing branches, the bank branch update throws the error message below:
    "The value in flexfield context reference web bean does not match with the value in the context of the Descriptive flexfield web bean BranchDescFlex. If this in not intended, please go back to correct the data or contact your Systems Administrator for assistance."
    This error is thrown only when we open existing branches; if we save an existing branch and then open it, no error message is thrown.
    Please let us know the reason behind this error message.
    Thanks,
    Mruduala

    You are kidding?  It took me about 3 minutes to scroll down on my tab to get to the triplex button!
    Have you read the error message?
    Quote:
    java.sql.SQLSyntaxErrorException: ORA-04098: trigger 'PMS.PROJECT_SEQ' is invalid and failed re-validation
    Check the trigger and it should work again.
    Timo

  • How to correct the data in the PSA table?

    Q1. There are a lot of invalid characters in an InfoPackage of, say, 1 million records. It takes a lot of time to check each and every record in the data package (PSA) and correct it. I think there is a more efficient way to resolve this, namely going into the PSA table to correct all the records. Is that right? If yes, how do I do it?
    Q2. Say there are 30 data packages in the request and only data package 25 has the bad records. If I correct the data in the PSA and push it to the data target, it is going to process all the data packages one by one, which takes a lot of time and delays our process chain job that depends on this load. Can I just manually process that one data package? If yes, how do I do it?
    Q3. When I successfully correct all the bad records in the data package and push it from the PSA, the request does not turn green, and I have to manually turn it green in the data target after I verify that all the data packages have no bad records; it is a delta update. Is my process right? As it is a delta, what pitfalls do I have to watch for? The next step after this is to compress the request, which is very risky because this basic cube has a lot of history and would take a long time, probably weeks, to reload. What precautions should I take before I turn the request green in the data target?
    Thanks in advance! And I know how to thank SDN experts by assigning points.

    Hi,
    Q1. Update the invalid characters in the filter table using transaction RSKC, and also write an ABAP routine to filter out the invalid characters.
    Q2. For the incorrect data packet, you can right-click on the data packet in the monitor's Details tab and choose to update it manually. That way you don't need to reload the entire request again.
    Q3. When you reload the request or update an individual data packet again, the request should automatically turn green; you don't have to turn it green manually. The pitfall is that if you turn a delta request green, you risk losing data and corrupting the delta. Best practice is to never turn a request green manually. Even if you compress the requests, you can use selective deletion to delete the data and then use an InfoPackage with the same selections you used for the deletion to load the same data back.
    Cheers,
    Kedar
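
    The ABAP routine mentioned in the answer to Q1 usually just replaces every character outside the permitted set (the standard characters plus whatever has been maintained in RSKC) with a blank. As a rough, language-neutral illustration of that logic, in Python, where the permitted set is an assumption rather than the actual RSKC contents:

    # Sketch of the "filter out invalid characters" logic mentioned for Q1.
    # PERMITTED is an assumption; in BW the reference is the standard character set
    # plus the extra characters maintained in transaction RSKC.
    PERMITTED = set(" !\"%&'()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ")

    def clean_value(value):
        """Replace every character whose uppercase form is not permitted with a blank,
        the way a transfer rule / transformation routine would before loading the target."""
        return "".join(ch if ch.upper() in PERMITTED else " " for ch in value)

    print(clean_value("100\u2013200 pieces"))  # the en dash is replaced with a blank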

  • Manual data corrections in the data warehouse/OLAP

    Hi to all,
    Is it possible anywhere in Essbase (Hyperion) to make manual data corrections in the data warehouse/OLAP?
    Thank you
    G.

    Hi NareshV,
    Thanks for your reply. In fact, I also have some difficulty understanding this question (I am not the author) ;-), but OK - let's say you have some records of data retrieved from Essbase (or Hyperion), for instance in an open Excel sheet. In Excel it is possible to delete or override any value in any column; is it possible to do this in the Essbase OLAP server?
    Thanks, G.

  • Some photos are not sorted correctly by date

    I just imported some photos from my friend's camera; he was on the same trip. However, I noticed that a couple (not all) of his photos are not sorted correctly by date in the Event. A photo from day 3 somehow comes before the photos from day 2. I checked the extended information and the dates on the photos are correct. Has anyone had this problem and know of a solution? Thanks.

    It might be of some help to delete the photo cache. Please see this link - http://support.apple.com/kb/TS1314

  • I've just imported photos that are misdated and appear out of order in my events. How can I correct the dates on these events so they appear properly?

    I've just imported photos that are misdated and appear out of order in my events. How can I correct the dates on these events so they appear properly?

    The one in the Photos ➙ Adjust Date and Time menu option, with the checkbox below it.

  • HT1347 I have an mp3 DVD with episodes of Gunsmoke.  iTunes imports them, but does not sort them correctly by date or episode.  One file name is GS 52-04-26 001 Billy the Kid.mp3

    I have an mp3 DVD with episodes of Gunsmoke.  iTunes imports them, but does not sort them correctly by date or episode.  One file name is GS 52-04-26 001 Billy the Kid.mp3.  iTunes sorts them by month/day/year.  How can I fix that?
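
    Here is a sketch of the kind of helper that could turn the date embedded in such file names into a properly ordered value (Python; the file-name pattern and the 19xx-century assumption are guesses based on the single example given, and the resulting value would still have to be written into the tags, e.g. as Sort Name, with a separate tagging tool):

    import re
    from datetime import datetime

    # Parse "GS 52-04-26 001 Billy the Kid.mp3" into (aired date, episode number, title).
    PATTERN = re.compile(r"GS (\d{2})-(\d{2})-(\d{2}) (\d{3}) (.+)\.mp3$")

    def sort_key(filename):
        m = PATTERN.match(filename)
        if not m:
            return None
        yy, mm, dd, episode, title = m.groups()
        aired = datetime(1900 + int(yy), int(mm), int(dd))  # assume 19xx for these episodes
        return aired.strftime("%Y-%m-%d"), int(episode), title

    print(sort_key("GS 52-04-26 001 Billy the Kid.mp3"))
    # ('1952-04-26', 1, 'Billy the Kid')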

    The weird thing is, it worked before. I encoded all the videos previously, had them all listed, and didn't encounter this problem, but I noticed I'd forgotten to decomb/detelecine the videos, so I deleted everything and re-encoded it all. After filling out the tags and setting poster frames again, I thought I'd make life easier by pasting the same artwork onto every episode, so I did, but I didn't like the results, so I deleted it again (I selected every video, used Get Info and pasted the artwork that way).
    I think it was after this that the problem started to occur. In any case I've been through every episode and made sure the name, series, episode and episode id fields are all filled in with incremental values and they are. I even made sure the Sort Name field was filled in on every video. But this didn't work either.
    So, thinking another restart was needed I deleted every video again. Re-encoded them all, again. Set the tags for every video and poster frame again. Finished it all off nicely, checked cover flow and was highly annoyed to find it was STILL showing the same two pieces of art work for every video as shown in that image I posted.
    Ultimately, I wouldn't care so much that Cover Flow is screwing up like this, as I always intended to sort by program anyway, so it would only ever display one piece of artwork for an entire series of videos. However, whereas when it was working it picked the artwork for the first episode of a series to display, it's instead picking one of the latter episodes. I can't seem to find any way of choosing which artwork I want displayed.

  • Unable to get correct ship Date-MRP ATP API Call

    Hi,
    We are facing an issue getting the correct ship date when calling MSC_ATP_GLOBAL.Extend_ATP from OM for an ATO model. However, the correct ship date is returned by the availability check from OM.
    Currently, we are checking only the capacity constraint by enabling CTP for the critical resource for the top model (resource only). The API is not checking the capacity constraint and returns the ship date as the system date.
    Can someone provide expert advice on why ATP ignores the capacity constraint (CTP) when called from the API?
    PS: We are in a distributed environment, with the source on 11.5.10 and the destination on R12.
    Regards,
    ML

    Hi Abhishek,
    Yes, I am able to see it in the ATP details pegging tree (resource constraint - the date is getting pushed). In the API, I have set it as N.
    I don't see an option to upload the dump in the forum; can you please let me know which specific details should be checked in both files? I will check and let you know.
    Regards,
    ML
