Collective processing for internal orders - KOK4

Hi,
Does anyone know if it is possible to process PM orders (order category 30) using the CO transactions KOK3/KOK4?
Amit

Hello,
I am not sure about the mentioned transaction codes.
However, collective processing is possible, provided the range is visible in the selection screen of the order processing.
Technical name: SAP_PM_WOC_ORDER_PROCESS
Tasks
This role contains all the functions that you require to process orders.
Activities in Plant Maintenance (PM)
The following activities for orders are available with this role:
Creating/changing/printing an order
Creating a sub-order
Changing operations
Changing the component list for an order
Where-used lists for production resources/tools
Regards
Assign points if useful.

Similar Messages

  • Collective processing for status CLSD for production order

    Hi all,
    Is there any provision or transaction code to set the status CLSD for all production orders?
    I have checked the COHV transaction, but apart from TECO there is no provision for CLSD.
    Yogesh

    Hi Yogesh,
    There are two things involved in this:
    1. Setting the status CLSD or DLFL
    2. Collective settlement of the orders
    Isn't it?
    2. Collective settlement of the orders / collective processing is done with CO88. Before this there will be other steps as well, such as overhead calculation, WIP calculation, variance calculation, etc., so check with the Controlling team too.
    After that, you can devise a process for this and instruct the users to follow it.
    1. Unless you perform the settlement, it is not possible to set the status CLSD/DLFL.
    For collective update of the status DLFL, you need to use the program PPARCHP1.
    If you execute the program with SA38, on the next screen you can select multiple orders (or use the selection criteria) and tick the checkbox for 'Deletion flag only'.
    Regards,
    Siva

  • Collective Processing VL10 is not grouping deliveries for same sales parameters

    Hi.
    I created multiple sales orders with the same sales parameters (ship-to, sold-to, sales organization, etc.).
    When I tried to process the delivery creation collectively via VL10, the system generated a group for each delivery selected instead of creating one group for multiple deliveries.
    When I compared my client's system configuration with the standard configuration client, it is exactly the same in terms of all parameters, scenario profile, delivery creation profile, etc. The standard system works as expected, grouping the deliveries and creating one group for multiple deliveries.
    The group type is L, and the delivery categories we are trying to group are J & T.
    Thanks,

    Hi Sridhar,
    Collective processing of deliveries means delivering several orders together, that is, one delivery document for all the orders.
    In VL10H you can access only orders that are due for delivery.
    First of all, mark the order line items for collective processing of deliveries.

  • CO88 - Collective Settlement Processing for Product cost collectors

    Hello Gurus,
    I have a couple of settlement-related problems. Please advise if you have any solution for them.
    1. We are using CO88 - Collective Settlement Processing for product cost collectors, but some of the cost collectors are not settling fully, whereas the same cost collectors settle individually without any problem.
    2. Some cost collectors are settled, but very small amounts remain unsettled.
    Thanks in advance.
    Vishwanath.

    Hi,
    Welcome to SDN! We hope your first experience here is a good one.
    Issue 1: It depends on the error that you got at the time of settlement. Many times, when a PCC is being posted to from production while you are trying to settle it, you can get an error; later on it can be settled without any issues.
    Issue 2: Ideally, no amount should remain after settlement. Maybe the amount you are seeing pertains to the next month. You can switch to the periodic view using Ctrl+Shift+F11 and see whether any balance really exists.
    BR, Ajay M

  • Tcode for Sales Order Output (collective processing)

    Hi,
    What is the transaction code or program to print sales orders collectively, similar to VL71 for deliveries and VF31 for invoices?
    Regards
    Uma

    Hi Uma,
    There is no such transaction code for sales order printing.
    You can use SE38 to run the program RSNAST00 for sales order output (collective processing).
    Try it and revert.

  • Collective deliveries for open sales orders

    Is there any process or transaction code for creating collective picking for open sales orders?
    In my company we are not using Warehouse Management, so we pick the goods manually.
    If there is any way, please post the answer.

    Check if VG01 helps you if you enter "K" as Group Type
    Cheers

  • Creating process for multiple Date fields for update or insert in APEX

    Hello there,
    Could someone please help me?
    I have a form in APEX based on a view that is based on three tables; it updates and inserts fine using an INSTEAD OF trigger.
    I have a problem now: my form has around 75 fields (items), including 30 or more date fields which could be populated, left blank, or updated later.
    For each date field I have two boxes: one for the date, entered as dd/mm/yyyy (text field), and a second for the time, entered as 23:45. All dates are inserted or updated manually by the user, so, as I mentioned, not all date fields may be populated at one stage.
    I have created some processes and validations and all of them work fine, but I found that if a date is left blank, the (:) gives me a problem, so I added the following additional process for each date field. In the real table all the date fields have data type DATE.
    declare
      v_my_var date; -- for the first date field
      str_dy   varchar2(10);
      dt_indx  date;
      str_tm   varchar2(20);
      tm_indx  date;
    begin
      str_dy  := :p4_first_date;
      str_tm  := str_dy || ' ' || substr(:p8_first_date_hh, 1, 2) || ':' || substr(:p8_first_date_hh, 4, 2);
      dt_indx := to_date(str_tm, 'DD/MM/YYYY HH24:MI');
      if str_dy is not null then
        v_my_var := dt_indx;
      else
        v_my_var := null;
      end if;
      update table1 set my_date = v_my_var where d_id = :p4_d_id;
    end;
    The above code works fine, but only for one date field; therefore I have to repeat the same code for every date field, with changes, and initialise the variables again and again for each field.
    So I would like to ask: is there an easier, more professional way? I was thinking about a procedure using a collection or something similar, but honestly I don't have much experience with that, so could someone please help me?
    I will be very thankful.
    KRgds

    Hi,
    You can do this by re-using the code if you name the items P8_DATE1, P8_DATEhh1, P8_DATE2, P8_DATEhh2, etc., so that the item names differ only by a sequence number.
    Now write a function that returns the desired date value, taking the items above as input. Pass the item names to this function and get the session state using the APEX_UTIL.GET_SESSION_STATE('item_name') API.
    Now modify your code to something like:
    declare
      type t_date_tab is table of date index by pls_integer;
      v_date_array t_date_tab;
    begin
      for i in 1..30 loop
        v_date_array(i) := f_get_date('P8_DATE' || i, 'P8_DATEhh' || i);
      end loop;
      -- now you have all the date values in the array; just write one update:
      update table1
      set date1 = v_date_array(1),
          date2 = v_date_array(2)  -- ...and so on for the remaining date columns
      where ...;
    end;
    Hope it helps :)
    Cheers,
    Hari
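    For reference, a minimal sketch of what such a helper could look like (the function name f_get_date and the P8_DATEn / P8_DATEhhn item naming come from the reply above; the date/time formats and the midnight default are assumptions):
    create or replace function f_get_date (
      p_date_item in varchar2,   -- e.g. 'P8_DATE1'
      p_time_item in varchar2    -- e.g. 'P8_DATEhh1'
    ) return date
    is
      l_day  varchar2(10);
      l_time varchar2(20);
    begin
      -- read the current session state of both page items
      l_day  := apex_util.get_session_state(p_date_item);   -- expected as DD/MM/YYYY
      l_time := apex_util.get_session_state(p_time_item);   -- expected as HH24:MI
      if l_day is null then
        return null;   -- a blank date field stays NULL
      end if;
      return to_date(l_day || ' ' || nvl(l_time, '00:00'), 'DD/MM/YYYY HH24:MI');
    end f_get_date;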

  • Using BULK COLLECT and FORALL to solve a problem

    Hi All,
    I have the following problem.
    Please forgive me if it's a stupid question :-) I'm learning.
    1: Data is in a staging table, xx_staging_table.
    2: There are two target tables, t1 and t2, into which some columns from xx_staging_table are inserted.
    Some of the columns from the staging table data are checked for valid entries, and then some columns from that row are loaded into the two target tables.
    The two target tables use different sets of columns from the staging table.
    When I had a thousand records there was no problem with a direct insert, but it seems we will now have half a million records.
    This has slowed down the process considerably.
    My question is:
    Can I use the BULK COLLECT and FORALL functionality to get specific columns from the staging table, validate the row using those columns, and then use a bulk insert to load the data into a specific table?
    So the code would be like this.
    The get_staging_data cursor will have all the columns I need from the staging table:
    cursor get_staging_data
    is select * from xx_staging_table; -- about 500000 records
    Use BULK COLLECT to load about 10000 or so records into a PL/SQL table and then do a bulk insert like this:
    CREATE TABLE t1 AS SELECT * FROM all_objects WHERE 1 = 2;
    CREATE OR REPLACE PROCEDURE test_proc (p_array_size IN PLS_INTEGER DEFAULT 100)
    IS
      TYPE ARRAY IS TABLE OF all_objects%ROWTYPE;
      l_data ARRAY;
      CURSOR c IS SELECT * FROM all_objects;
    BEGIN
      OPEN c;
      LOOP
        FETCH c BULK COLLECT INTO l_data LIMIT p_array_size;
        FORALL i IN 1..l_data.COUNT
          INSERT INTO t1 VALUES l_data(i);
        EXIT WHEN c%NOTFOUND;
      END LOOP;
      CLOSE c;
    END test_proc;
    In the above example t1 and the cursor have the same number of columns.
    In my case the columns in the cursor loop are a small subset of the columns of table t1,
    so can I use a FORALL to load that subset into the table t1? How does that work?
    Thanks
    J
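    A minimal sketch of one common way to do that, assuming (purely for illustration) that xx_staging_table has columns id and descr and that t1 has matching columns; only the table names come from the question:
    DECLARE
      TYPE t_id_tab    IS TABLE OF xx_staging_table.id%TYPE;
      TYPE t_descr_tab IS TABLE OF xx_staging_table.descr%TYPE;
      l_ids    t_id_tab;
      l_descrs t_descr_tab;
      -- fetch only the columns that are needed for t1
      CURSOR c IS SELECT id, descr FROM xx_staging_table;
    BEGIN
      OPEN c;
      LOOP
        FETCH c BULK COLLECT INTO l_ids, l_descrs LIMIT 10000;
        -- insert into just the matching subset of t1's columns
        FORALL i IN 1 .. l_ids.COUNT
          INSERT INTO t1 (id, descr) VALUES (l_ids(i), l_descrs(i));
        EXIT WHEN c%NOTFOUND;
      END LOOP;
      CLOSE c;
      COMMIT;
    END;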

    user7348303 wrote:
    checking if the value is valid and there's also some conditional processing rules (such as if the value is a certain value, no inserts are needed) which are a little more complex than I can put in a simple
    Well, if the processing is too complex (and conditional) to be done in SQL, then doing it in PL/SQL is justified... but it will be slower, as you are now introducing an additional layer. Data now needs to travel between the SQL layer and the PL/SQL layer. This is slower.
    PL/SQL is inherently serialised, and this also affects performance and scalability. PL/SQL cannot be parallelised by Oracle in an automated fashion. SQL processes can be.
    To put it in simple terms: you create a PL/SQL procedure Foo that processes a SQL cursor, and you execute that procedure. Oracle cannot run multiple parallel copies of Foo. It can perhaps parallelise the SQL cursor that Foo uses, but not Foo itself.
    However, if Foo is called by the SQL engine, it can run in parallel, as the SQL process calling Foo is running in parallel. So if you make Foo a pipelined table function (written in PL/SQL), and you design and code it as a thread-safe, parallel-enabled function, it can be called, used, and executed in parallel by the SQL engine.
    So moving your PL/SQL code into a parallel-enabled pipelined table function written in PL/SQL, and using that function via parallel SQL, can increase performance over running the same basic PL/SQL processing as a serialised process.
    This is of course assuming that the processing that needs to be done using PL/SQL code can be designed and coded for parallel processing in this fashion.
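    As a rough illustration of that pattern (the object type, function name, and column names below are hypothetical; only xx_staging_table and t1 come from the thread):
    -- SQL types for the rows the function pipes out
    CREATE OR REPLACE TYPE xx_stage_row AS OBJECT (id NUMBER, descr VARCHAR2(100));
    /
    CREATE OR REPLACE TYPE xx_stage_tab AS TABLE OF xx_stage_row;
    /
    -- parallel-enabled pipelined table function: validates rows and pipes out the good ones
    CREATE OR REPLACE FUNCTION xx_transform_fn (p_src IN SYS_REFCURSOR)
      RETURN xx_stage_tab
      PIPELINED
      PARALLEL_ENABLE (PARTITION p_src BY ANY)
    IS
      l_id    NUMBER;
      l_descr VARCHAR2(100);
    BEGIN
      LOOP
        FETCH p_src INTO l_id, l_descr;
        EXIT WHEN p_src%NOTFOUND;
        -- the conditional/validation rules from the question would go here
        IF l_id IS NOT NULL THEN
          PIPE ROW (xx_stage_row(l_id, l_descr));
        END IF;
      END LOOP;
      RETURN;
    END xx_transform_fn;
    /
    -- called from the SQL engine, which can then run the function in parallel
    INSERT /*+ APPEND */ INTO t1 (id, descr)
    SELECT id, descr
    FROM   TABLE(xx_transform_fn(CURSOR(
             SELECT /*+ PARALLEL(s, 4) */ id, descr FROM xx_staging_table s)));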

  • Concurrency Visualizer Extension (VS2013): "An unknown error occurred while trying to start the collection process."

    Not sure if this is the right place to ask about this, but since my question concerns an extension, I'll give it a shot.
    I'm trying to use the Visual Studio 2013 Concurrency Visualizer extension to debug a simple test console application. I've disabled executive paging in the registry as directed and hit "add SDK to project" in the Concurrency Visualizer menu options. And of course, I restarted my computer to apply the settings. When I try to "start [the concurrency visualizer] with current project" I get an error popup:
    "An unknown error occurred while trying to start the collection process."
    I really need the concurrency profiling to start debugging contention in my multithreaded application. Prompt assistance would be much appreciated.

    Hi,
    This forum is to discuss and ask questions about extending and integrating with Visual Studio (using the Extension Manager, building VSX containers for deployment, Visual Studio SDK, and more), Visual Studio Online (REST APIs, service hooks, and OAuth), and Team Foundation Server.
    Since your problem is with using the Concurrency Visualizer extension, please ask on its page under "Q AND A". And since the Concurrency Visualizer is a diagnostic tool, you can also post this issue in the Visual Studio Diagnostics (Debugger, Profiler, IntelliTrace) forum.
    Best regards,

  • Help, why does brconnect not collect statistics for the mseg table?

    I found "MSEG" table`s statistics is too old.
    so i check logs in db13,and the schedule job do not collect statistics for "MSEG".
    Then i execute manually: brconnect -c -u system/system -f stats -t mseg  -p 4
    this command still do not collect for mseg.
    KS1DSDB1:oraprd 2> brconnect -c -u system/system -f stats -t mseg u2013f collect -p 4
    BR0801I BRCONNECT 7.00 (46)
    BR0154E Unexpected option value 'u2013f' found at position 8
    BR0154E Unexpected option value 'collect' found at position 9
    BR0806I End of BRCONNECT processing: ceenwjre.log 2010-11-12 08.41.38
    BR0280I BRCONNECT time stamp: 2010-11-12 08.41.38
    BR0804I BRCONNECT terminated with errors
    KS1DSDB1:oraprd 3> brconnect -c -u system/system -f stats -t mseg -p 4
    BR0801I BRCONNECT 7.00 (46)
    BR0805I Start of BRCONNECT processing: ceenwjse.sta 2010-11-12 08.42.04
    BR0484I BRCONNECT log file: /oracle/PRD/sapcheck/ceenwjse.sta
    BR0280I BRCONNECT time stamp: 2010-11-12 08.42.11
    BR0813I Schema owners found in database PRD: SAPPRD*, SAPPRDSHD+
    BR0280I BRCONNECT time stamp: 2010-11-12 08.42.12
    BR0807I Name of database instance: PRD
    BR0808I BRCONNECT action ID: ceenwjse
    BR0809I BRCONNECT function ID: sta
    BR0810I BRCONNECT function: stats
    BR0812I Database objects for processing: MSEG
    BR0851I Number of tables with missing statistics: 0
    BR0852I Number of tables to delete statistics: 0
    BR0854I Number of tables to collect statistics without checking: 0
    BR0855I Number of indexes with missing statistics: 0
    BR0856I Number of indexes to delete statistics: 0
    BR0857I Number of indexes to collect statistics: 0
    BR0853I Number of tables to check (and collect if needed) statistics: 1
    Owner SAPPRD: 1
    MSEG     
    BR0846I Number of threads that will be started in parallel to the main thread: 4
    BR0126I Unattended mode active - no operator confirmation required
    BR0280I BRCONNECT time stamp: 2010-11-12 08.42.16
    BR0817I Number of monitored/modified tables in schema of owner SAPPRD: 1/1
    BR0280I BRCONNECT time stamp: 2010-11-12 08.42.16
    BR0877I Checking and collecting table and index statistics...
    BR0280I BRCONNECT time stamp: 2010-11-12 08.42.16
    BR0879I Statistics checked for 1 table
    BR0878I Number of tables selected to collect statistics after check: 0
    BR0880I Statistics collected for 0/0 tables/indexes
    BR0806I End of BRCONNECT processing: ceenwjse.sta 2010-11-12 08.42.16
    BR0280I BRCONNECT time stamp: 2010-11-12 08.42.17
    BR0802I BRCONNECT completed successfully
    The log says:
    Number of tables selected to collect statistics after check: 0
    Could you give me some advice? Thanks a lot.

    Hello,
    If you would like to force the creation of the statistics for table MSEG, you need to use the -f (force) switch.
    If you leave out the -f switch, the parameter stats_change_threshold is taken into account, as you said correctly:
    http://help.sap.com/saphelp_nw70ehp1/helpdata/EN/02/0ae0c6395911d5992200508b6b8b11/content.htm
    http://help.sap.com/saphelp_nw70ehp1/helpdata/EN/cb/f1e33a5bd8e934e10000000a114084/content.htm
    You tried to do this in your second example:
    ==> brconnect -c -u system/system -f stats -t mseg u2013f collect -p 4
    Therefore you received:
    BR0154E Unexpected option value 'u2013f' found at position 8
    BR0154E Unexpected option value 'collect' found at position 9
    This is the correct statement; however, the character in front of the 'f' switch is not a proper hyphen (it came through as 'u2013', an en dash).
    Try again with the following statement (-f instead of u2013f) and you will see that it works:
    ==> brconnect -c -u system/system -f stats -t mseg -f collect -p 4
    I hope this can help you.
    Regards.
    Wim

  • Processing for each item SDRQCR21 flag

    Hello all,
    What does the flag 'Processing for each item' mean for the ECC report SDRQCR21?
    Thanks in advance.
    Januario Faria

    Please read SAP Note 998102.
    Here the following is written:
    "Important information:
        o  With the 'Processing per item' option, the collective
           (summarized) requirements are no longer processed. Materials
           which belong to an atp group which summarizes the requirements
           are simply not checked for consistency."

  • What is the process for a payment run

    Hi Guru,
    Can anyone tell me what the process for payment by F110 is?
    Points will be rewarded.
    Thanks & Regards
    Durgesh

    Hi
    COpy the Script F110_PRENUM_CHEK to some ZFORM and attach it in the F110 tcode for the related company code and payment method..
    You can do the required modifications to the script as per your requirement.
    Generally we use the Preprinted stationary for printing the cheque
    in which on the top vendor address and the fi doc line items will be printed.
    in the last cheque is printed.
    We have to print just few fields on that pre printed cheque like Vendor name, Amount, Date and amount in words etc.
    You have take some rought stationary xerox copies and to check by printing the fields whether they are correctly matching to the fields on the cheque exactly or not..have to adjust the positions and map.
    see the doc
    Run Payment Program (F110)
    Purpose
    Use this procedure to run the automatic payment program. The payment program is used to create cheques, BACs payments, electronic transfers, etc. for vendors. It is also used to create a direct debit file for customer payments. The payment program runs in three specific steps, which must be run in order.
    Create Payment Proposal. This is a listing of proposed payments. The proposal should be reviewed for accuracy and edited or deleted if incorrect. No postings or payments are created at this step.
    Create Payments. This step creates posting documents in the system, clearing the customer/vendor subledger accounts and posting the offsetting item to a cash or cash clearing account.
    Create Payment Media. This step creates the actual payments, sending cheque forms to the printer or creating files of electronic payments to be sent to the bank.
    Trigger
    Perform this procedure when you are ready to create vendor payments or to create a direct debit file for customers.
    Prerequisites
    • You must have the following master data prepared:
    • Banks and bank accounts with associated general ledger accounts;
    • Appropriate payment methods assigned to your company code;
    • Customer and vendor master records with the appropriate details completed.
    • If you wish to create a cheque payment, the vendor/customer master record must contain full address details.
    • If you wish to create an electronic payment, the vendor/customer master record must contain full bank details.
    • If you wish to create a direct debit transaction for a customer, the 'Col' (collection authorization) checkbox must be selected. This field is found on the General Data - Payment Transaction Data tab.
    Navigation Path
    Use the following navigation path to begin this transaction:
    • Select Processes → Purchase Requisition through to Payment → Payments → Automatic Payments → Run Payment Program to go to the Automatic Payment Transactions: Status screen.
    For more information, please check out the link below; it might help you:
    http://www.hostlogic.hu/caghelp/Transactions/Finance/content/f110%20-%20run%20payment%20program/cc/html/index.htm
    Regards
    ANJI

  • Process Chain Collect Process

    Hi, Experts,
    I need to add a decision step to my process chain so that if one of the predecessors succeeds, it continues the chain.
    That is to say, the process chain should continue only when all the predecessors have finished and at least one of them has finished green (with success).
    I tested the "OR" collect process type, but it seems that it doesn't wait for all the predecessors to finish. If one finishes with success, the chain continues.
    Can someone confirm that and give me a solution?

    Hi, Srinivas,
    To my knowledge, the OR will run every time one of the predecessors finishes with success. Example: you have Step A1, Step A2 and Step A3 in parallel at the first level, and these predecessors are linked via OR to Step B.
    At instant T1, Step A1 is still running, Step A2 has finished with success, and Step A3 has finished but is red.
    > Step B will run because Step A2 is green.
    At instant T2 (T2 > T1), Step A1 has also finished green, Step A2 had already finished with success, and Step A3 is red.
    > Step B will run again because Step A1 is green.
    This is how I understand the collection type "OR".
    What I need is to wait until all the predecessors finish, and at least one of them finishes with success.
    Continuing the above example, the process chain should wait until Steps A1, A2 and A3 are all finished, and then look at the status. If there is at least one green, it should run Step B. But Step B should run only ONCE.

  • COLLECTIVE PROCESSING

    Hi gurus,
    I want to know exactly how collective processing works, because when I am creating sales orders, my orders are not getting combined for a particular customer. In a process, how can I select all the groups? Is it possible for me to see the combined orders in VL03N? Please, could anyone help me out with this?

    Hi Manoj,
    To do collective delivery for the orders, you have to tick the order combination indicator in the sales area data of the customer. At the time of delivery you should have the same shipping point, route, Incoterms, shipping date and the same ship-to party. For collective billing of different deliveries you should have the same payment terms, invoice date, destination country, currency and the same payer. You can see the orders to be combined for delivery in VL04 and VL10A as well.
    Reward if helpful
    Thanks & Regards
    Narayana

  • The worker process for application pool 'MSExchangeAutodiscoverAppPool' encountered an error

    Hello Guys,
    I am getting event ID 2297 with the description given below.
    The worker process for application pool 'MSExchangeAutodiscoverAppPool' encountered an error 'Cannot add duplicate collection entry of type 'add' with unique key attribute 'name' set to 'WSMan'
    ' trying to read global module configuration data from file '\\?\C:\inetpub\temp\apppools\MSExchangeAutodiscoverAppPool\MSExchangeAutodiscoverAppPool.config', line number '275'.  Worker process startup aborted.
    Could someone please help me resolve this?
    Thanks in advance. :)
    Regards, Rishi Aggarwal

    Hi,
    Based on the error message, it seems the issue is related to a duplicate entry in the file mentioned.
    I recommend you follow Amit's suggestions and check the result.
    Best regards,
    Belinda Ma
    TechNet Community Support
