Information needed on ATP functionality based on collected data.

Hi,
we need to use ATP functionality in some of our ASCP scenarios.
I have created FG items and components under them, and created on-hand for the components.
Now, if I do an ATP inquiry for an FG item, I expect it to return an ATP quantity based on its component availability.
But it shows 0 for whatever quantities or dates I query.
I have done setups such as the item ATP attributes, BOM Check ATP, the profile option INV: Capable to Promise (based on collected data), and routing Check CTP.
Can anyone confirm whether this is the way it works, or whether I need to do any specific setup?
Thanks
Vasudha.

Hi
Do you want to use priority based on demand class? The prioritization in ASCP is different from demand class prioritization. You will need a customization to assign a demand-class-based priority and update the demand lines so that ASCP respects it.
Please try to use Allocated ATP. The "Use Allocated ATP" profile option needs to be set to Yes, and a proper allocation rule also needs to be set up based on demand class. But you cannot guarantee that ASCP and ATP will be in sync.
Thnx
partha

Similar Messages

  • ATP/CTP Based on Planning Data- Clarifications

    Hi gurus,
    All the questions below concern ATP/CTP based on planning data:
    1) What are the implications of enabling ATP (the Enable ATP check box) while running the plan, versus not enabling it?
    2) I am running a plan with ATP enabled. What benefits do I get if I run with 24x7 ATP versus without it? Basically, I would like to understand the functionality of 24x7 ATP.
    3) I have set up a plan without ATP. After the plan run, if I create a new sales order, will it be shown in the Planning Workbench? If it is not showing up, what could be the reason? When is this new sales order expected to be shown in magenta in the Planning Workbench?
    4) After the plan run, what setups do I have to do (maybe a simple typical setup) so that ATP for my new sales orders gets accurate supply/demand data from the planning output?
    Put another way: what are all the necessary steps and concurrent programs to run in order to get the current supply/demand picture?
    Please do the needful.
    Thanks.


  • Need to generate XML based on Data template

    Hi,
    I'm unable to submit (the second submit) my request using the demo links (XDO 5.6.3 or XDO 5.7) provided at http://bipublisher.us.oracle.com/?tab=samples&header=dataTemplate to generate XML based on a data template.
    I heard that these links are available for Oracle EBS internal use only. I work with EBS-CRM. I need to generate XML based on a data template and use that XML in the template. The data template has two queries for one template.
    Please help.
    Thanks,
    Impha

    Impha
    Please use the internal mailing list for your questions; the forum is meant for external customers. If you do not know the ML name, drop me a mail.
    Tim

  • Is there a way to export to separate folders based on collection organization?

    I am wondering if there is a simple way to export photos from Lightroom into separate folders based on collections.
    For example, in LR5 I have the collection set "2014". Within "2014" there are sub-collections Jan, Feb, March, and so on. How can I export all the photos at once from the "2014" collection set so they end up in a folder structure that mirrors the collection set and its sub-collections? My collection is more complex and contains many more sub-collections than this example, so I would really like to avoid exporting each sub-collection separately.
    While searching for this I found mention of a plug-in that was apparently capable of this, but I could not get it to work: the GUI was not intuitive and I didn't have time to learn it. Unfortunately, I do not remember the name of the plug-in since I deleted it. Is there any way to do this without having to learn code or install plug-ins?

    asbissonnette221 wrote:
    is there any way to do this without having to learn code or install plug-ins?
    The ordinary Hard Drive publish service will export into disk folders in a tree structure that mirrors the hierarchy of your publish collections.
    I had always assumed (wrongly) that this was NOT the case; otherwise, what would be the purpose of Jeffrey Friedl's Collection Publisher?
    Jim Wilde pointed out my erroneous assumption and said there are advantages to Jeffrey's plugin, but I don't know what they are (probably the ability to have different settings per collection, and I don't know what else). Or maybe jf's Collection Publisher will export in the structure of non-publish collections, so you don't have to re-define a matching set of published collections; I really don't know.
    Bottom line: if your needs are simple enough, the Hard Drive publish service may be all you need.
    Rob

  • ATP check based on production finish date

    Hi all,
    The customer ordered goods for 02.06.2009.
    Transportation takes 10 days, so, by itinerary, the material availability date will be 15.05.2009.
    So far, so good.
    The MRP planner, based on the customer calendar, inserts demand for the customer date 02.06.2009.
    We have entered 10 days on the material master as goods receipt processing time, in order to bring the end of production forward by 10 days.
    Now we need the ATP check on the production order date to exclude the goods receipt processing time.
    Is this possible?
    Regards
    Sil
    When I do an ATP check via the sales order, the ATP check sees 02.06.2009 as the availability date, and not the desired 15.05.2009.
    Any idea how to realize this ATP check?

    Hi,
    This requirement cannot be fulfilled by standard SAP, but you can address it via a Z development.
    You would maintain a Z table with customer and lead time, write a BDC, and run the BDC once the sales order is created, so that the system changes the requirement date as per the Z table.
    I hope this resolves your issue.
    Regards,
    Shailendra

  • How to create monitor based on the data collected by rule?

    I have an application that comes with a built-in command, GetBackLogNm.exe. Running this command returns the current backlog number in the system.
    Now I need to create a monitor that triggers an alert if the backlog keeps increasing for 2 hours. I am thinking of creating a rule that runs GetBackLogNm.exe to get the number and saves it into the DW, and then creating a monitor based on the collected data. Is this the correct way? If so, can anyone give me an outline of how to do so? An example would be much appreciated.
    Thanks!

    Hi Jonathan, thanks for the quick reply. But System.ConsolidatorCondition seems to be available only in SCOM 2012; I didn't find it in the 2007 libraries.
    However, the idea you and Vladimir showed in this post really helped. I more or less solved the issue, not using exactly those components (since they are available in 2012 only) but following the same logic.
    In general I did this:
    1. Create a script rule that executes the command to get the backlog number, then saves the number in the DW and in the registry.
    2. Much like System.Performance.DeltaValueCondition, the second time the rule is triggered by the schedule, it compares the current backlog number with the last known number stored in the registry. If the value exceeds the given threshold AND is bigger than the last known backlog number, the rule increments a counter, which is also stored in the registry.
    3. Create a unit monitor that watches the counter stored in the registry and triggers an alert if the counter exceeds a given value.
    This solved the issue, but I feel a little uncomfortable storing data in the registry.
    So one quick question, which might be outside this topic: what is a good way to store the data generated by one workflow and then share it with another workflow, or reuse it some time later? In my case I used a registry key. Cook down is one option for sharing data, but it is very strict and can't solve the issue I have in this case.
    Thanks again!
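    The three steps above boil down to a small piece of delta-and-counter logic. Here is a minimal sketch of that logic in Python; the `state` dict stands in for the registry and the hard-coded `readings` list stands in for successive runs of GetBackLogNm.exe (both are illustrative stand-ins, not SCOM APIs, and the threshold values are assumed):

    ```python
    # Illustrative sketch of the delta/counter logic described above.
    # `state` stands in for the registry keys; the readings list stands
    # in for successive GetBackLogNm.exe outputs.

    THRESHOLD = 100        # alert-worthy backlog size (assumed value)
    ALERT_AFTER = 4        # consecutive increases before alerting (assumed)

    def check_backlog(state, current):
        """One scheduled run of the rule: update counter, return True to alert."""
        last = state.get("last_backlog")
        if last is not None and current > THRESHOLD and current > last:
            # Backlog is over threshold AND still growing: bump the counter.
            state["counter"] = state.get("counter", 0) + 1
        else:
            # Backlog flat or shrinking: the trend is broken, so reset.
            state["counter"] = 0
        state["last_backlog"] = current
        return state["counter"] >= ALERT_AFTER

    state = {}
    readings = [90, 120, 150, 180, 200, 250]   # simulated backlog numbers
    alerts = [check_backlog(state, r) for r in readings]
    print(alerts)   # [False, False, False, False, True, True]
    ```

    The unit monitor in step 3 then only has to read `state["counter"]` (the registry value) and compare it against the limit.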

  • Hi, I need to download the CS6 Master Collection - ME  !       Where can i find the link for the Middle East Version?

    Hi,
    I'm replacing my computer, and I need to download CS6 Master Collection - ME.
    I have the software license and serial number.
    Where can I find the download link for the Middle East version (CS6 Master Collection)?
    Thanks,
          ToMeR

    ToMeR, did you purchase the Middle Eastern version from Adobe directly? If so, then please see Find a download link on Adobe.com for information on how to download your purchased software. If you purchased the software from a reseller, then I would recommend contacting them for installation files.

  • Can I bulk collect based on a date variable?

    I want to use some sort of bulk collect to speed up this query inside a PL/SQL block (we're on 10g). I have seen examples using a FORALL statement, but I don't know how to accommodate the variable v_cal_date. My idea is to be able to run the script to update a portion of the table tbl_status_jjs based on a date range that I provide. tbl_status_jjs contains a list of dates, by minute of the day, for an entire year, plus a blank column to be filled in.
    I thought of using something like FORALL v_cal_date IN '01-apr-2009 00:00:00'..'01-jun-2009 00:00:00' (somehow incrementing by minute!?), but that doesn't seem right, and I can't find any examples of a bulk collect based on a date. How do I bulk collect on a date variable? Can I use the date/time from a subset of records from the final table in some sort of cursor?
    Thanks
    Jason
    -- loop through one day by minute of the day and update counts into table  
    v_cal_date Date       :=TO_DATE('01-apr-2005 00:00:00','dd-mm-yyyy hh24:mi:ss');
    intX := 1;
    WHILE intX <= 1440 LOOP
        UPDATE tbl_status_jjs
               SET  (cal_date, my_count) =
                    (SELECT      v_cal_date,
                                 NVL(SUM(CASE WHEN v_cal_date >= E.START_DT AND v_cal_date < E.END_DT THEN 1 END),0) AS my_count
                     FROM        tbl_data_jjs E)  -- closing parenthesis was missing
               WHERE cal_date = v_cal_date;
        v_cal_date := v_cal_date + (1/1440);
        intX := intX + 1;
        COMMIT;
    END LOOP;

    Here are the two tables. The goal is to find an efficient way to count, for each cal_date in tbl_status, how many records in tbl_data have a start_dt on or before it and an end_dt after it. For example:
    01-jan-2006 00:05:00 ==> 3
    01-jan-2006 00:25:00 ==> 1
    DROP TABLE tbl_status;
    CREATE TABLE tbl_status
    (   CAL_DATE    DATE,
        MY_COUNT    NUMBER);
    DECLARE
        start_date Date       :=TO_DATE('01-jan-2006 00:00:00','dd-mm-yyyy hh24:mi:ss');
        end_date   Date       :=TO_DATE('01-jan-2006 01:00:00','dd-mm-yyyy hh24:mi:ss');
    BEGIN
        INSERT INTO tbl_status (CAL_DATE)
        SELECT  start_date + ( (LEVEL - 1) / (24 * 60))
        FROM    dual
        CONNECT BY    LEVEL <= 1 + ( (end_date - start_date) * (24 * 60) );
    END;
    DROP TABLE tbl_data;
    CREATE TABLE tbl_data
    (   START_DT    DATE,
        END_DT      DATE);
    INSERT INTO tbl_data VALUES (TO_DATE('2006-01-01 00:05:00', 'yyyy-mm-dd hh24:mi:ss'), TO_DATE('2006-01-01 00:15:00', 'yyyy-mm-dd hh24:mi:ss'));
    INSERT INTO tbl_data VALUES (TO_DATE('2006-01-01 00:05:00', 'yyyy-mm-dd hh24:mi:ss'), TO_DATE('2006-01-01 00:20:00', 'yyyy-mm-dd hh24:mi:ss'));
    INSERT INTO tbl_data VALUES (TO_DATE('2006-01-01 00:05:00', 'yyyy-mm-dd hh24:mi:ss'), TO_DATE('2006-01-01 00:30:00', 'yyyy-mm-dd hh24:mi:ss'));
    INSERT INTO tbl_data VALUES (TO_DATE('2006-01-01 00:35:00', 'yyyy-mm-dd hh24:mi:ss'), TO_DATE('2006-01-01 00:40:00', 'yyyy-mm-dd hh24:mi:ss'));
    DECLARE
        v_cal_date Date       :=TO_DATE('01-jan-2006 00:00:00','dd-mm-yyyy hh24:mi:ss');
        intX Integer          :=1;
    BEGIN
        WHILE intX <= 60 LOOP
            UPDATE tbl_status
                   SET  (cal_date, my_count) =
                        (SELECT      v_cal_date,
                                     NVL(SUM(CASE WHEN v_cal_date >= E.START_DT AND v_cal_date < E.END_DT THEN 1 END),0) AS my_count
                         FROM        tbl_data E)  -- closing parenthesis was missing
                   WHERE cal_date = v_cal_date;
            v_cal_date := v_cal_date + (1/1440);
            intX := intX + 1;
            COMMIT;
        END LOOP;
    END;
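    For what it's worth, the counting rule itself (how many [start_dt, end_dt) intervals contain each cal_date) can be sanity-checked outside the database. This is a small Python sketch of the same predicate as the SQL above, using the four sample rows from tbl_data; it is a check of the logic, not a PL/SQL answer:

    ```python
    from datetime import datetime, timedelta

    # The four sample rows from tbl_data above, as (start_dt, end_dt) pairs.
    fmt = "%Y-%m-%d %H:%M:%S"
    rows = [
        ("2006-01-01 00:05:00", "2006-01-01 00:15:00"),
        ("2006-01-01 00:05:00", "2006-01-01 00:20:00"),
        ("2006-01-01 00:05:00", "2006-01-01 00:30:00"),
        ("2006-01-01 00:35:00", "2006-01-01 00:40:00"),
    ]
    intervals = [(datetime.strptime(s, fmt), datetime.strptime(e, fmt)) for s, e in rows]

    def my_count(cal_date):
        """Same predicate as the SQL: cal_date >= start_dt AND cal_date < end_dt."""
        return sum(1 for s, e in intervals if s <= cal_date < e)

    # One row per minute for the first hour, like the CONNECT BY generator.
    start = datetime(2006, 1, 1, 0, 0, 0)
    status = {start + timedelta(minutes=i): None for i in range(61)}
    for cal_date in status:
        status[cal_date] = my_count(cal_date)

    print(status[datetime(2006, 1, 1, 0, 5)])   # 3
    print(status[datetime(2006, 1, 1, 0, 25)])  # 1
    ```

    Since the predicate only depends on each row independently, the whole minute-by-minute loop can also be folded into a single set-based UPDATE with a correlated subquery, which is usually far faster than committing once per minute.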

  • SCCM 2012 R2 IP Subnet based Device Collection

    HI,
    I have to create an IP subnet based device collection. Do we have to install the CCM clients on all machines first in order to create the IP subnet based device collection?
    Can you please share step-by-step docs on creating an IP subnet based device collection in SCCM 2012 R2?
    Shailendra Dev

    Hello. As has already been mentioned, this requires the SCCM client to be installed on the workstations in order to pick up the IP address, and it may require a hardware inventory cycle to have run (I'm not 100% certain of the HW Inv requirement; give it a try without and see!).
    Create the collection as you would normally.
    Right-click the collection and select Properties, then click into the Membership Rules tab.
    Select Add Rule > Query Rule.
    Type a name such as "IP Subnets" and then click "Edit Query Statement".
    Click "Show Query Language" at the bottom and enter the query below, changing the IP address as needed.
        select *  from  SMS_R_System where SMS_R_System.IPSubnets like "172.16.5%"
    Update the collection membership. Job done.
    I am not sure whether this information will remain bang up to date; e.g., if a machine moves subnets, the IP subnet data in the database may not update instantaneously.
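    One caveat worth noting (my addition, not from the original reply): LIKE "172.16.5%" is a plain string-prefix match, so it would also match subnets such as 172.16.50.0 or 172.16.55.0. A quick Python illustration of the difference between prefix matching and a real subnet test:

    ```python
    import ipaddress

    subnets = ["172.16.5.0", "172.16.50.0", "172.16.55.0", "172.16.6.0"]

    # What the WQL LIKE "172.16.5%" pattern effectively does: string prefix.
    prefix_matches = [s for s in subnets if s.startswith("172.16.5")]

    # What you probably mean: membership in the 172.16.5.0/24 network.
    net = ipaddress.ip_network("172.16.5.0/24")
    subnet_matches = [s for s in subnets if ipaddress.ip_address(s) in net]

    print(prefix_matches)   # ['172.16.5.0', '172.16.50.0', '172.16.55.0']
    print(subnet_matches)   # ['172.16.5.0']
    ```

    If the over-match matters in your environment, including the trailing dot in the pattern (e.g. like "172.16.5.%") should avoid it in the WQL itself.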

  • I needed to make a account receivable collection report

    Hi
    I need to create an accounts receivable collection report. Which tables should I select from for that?
    Is BSID enough, or do I also need other tables, such as BSAD?
    Regards
    Sebastian

    Hi Sebastian,
    Use BKPF: Accounting Document Header and BSEG: Accounting Document Segment.
    Best regards,
    Shahid Malayil

  • Does anyone have a suggestion for a device that I can upload data to from an iPod when there is no wi-fi available? I'm using iForm to collect data and I need to get it uploaded to a database.

    Can anyone recommend a device to upload data from an iPod when no wi-fi is available? I am using iForm to collect data in the field and I want to be able to upload it periodically.

    Hi there. I'm myself using the application data-field in order to collect mobile forms OFFLINE.
    I can collect as many mobile forms as I want but, of course, need to be online to sync the data from the mobile device to their cloud.
    Hope that helps.
    Otherwise, you would probably need to plug in your iDevice and look for the application database with a tool such as iConnector.

  • Need Report based on "CUSTOM DATA field @ Shop Order Maintenance"

    I need a report based on a CUSTOM DATA field.
    Requirement: I have a custom data field at Shop Order Maintenance called XXX. I need a report in SAP ME which can display the SFCs related to a particular shop order based on the XXX number.

    Hi!
    As far as I remember, there is no such base report. So, you may try to create it using the SDK, or request it as a custom enhancement from SAP.
    If you want to create it using the SDK, you can find the required data for the report in the CUSTOM_FIELDS table.
    Regards,
    Alex.

  • Please share your IPv6 information needs.

    Are you implementing IPv6 in your network? Please share any of your IPv6 information needs here. Also share any comments you have on the IPv6 Knowledge Base Portal.

    Hi John
    If it's like previous years, you can register for as many sessions as you want, provided there are no clashes; you can't register for sessions that conflict or overlap with previously booked sessions. It's always worth booking the sessions that interest you, as some often fill up, and if you're not registered you're forced to wait in a standby queue and might miss out entirely.
    I only know as much as you do regarding when the schedule builder will finally be available: it's due sometime in July (which doesn't have too many days remaining).
    Now, my sessions will obviously fill quickly, so make sure you book early or you might miss out ;)
    Cheers
    Richard Foote
    http://richardfoote.wordpress.com/

  • Migrating a Path Based Site Collection to a Host Named Site Collection where the content database is greater than 135 GB

    Hi
    I have a 136 GB content database which will not back up via Backup-SPSite (Microsoft say you cannot back up more than 100 GB).
    So, without Backup/Restore, how can I convert a path based site collection to a host named site collection with my 136 GB content database, which incidentally uses Remote Blob Storage?
    Thanks
    Nigel 
    Nigel Price NJPEnterprises

    I see two options:
    1. Make Backup-SPSite work despite being over 100 GB (that's going to be a supported limit rather than a hard boundary).
    2. Externalise some of the content and then re-insert it after the move.

  • I recently put money on my iTunes account, but when I went to purchase music the credit card information needed to be deleted, and I forgot the answers to my security questions. What do I do?

    I recently put money on my iTunes account, but when I went to purchase music the credit card information needed to be deleted, and I forgot the answers to my security questions. What do I do? I was told by a friend of mine that all I had to do was call Apple Support and I would be able to reset the security answers.

    You need to ask Apple to reset your security questions; ways of contacting them include clicking here and picking a method for your country, and filling out and submitting this form.
