No confirmation through Collective Confirmation CO12
Hi
We have operations 0010 and 0020 set up; at operation 0010 a PRT is attached with usage-value calculation.
While confirming through transaction CO12 and entering all operations together, no confirmation is executed. We get the message "No confirmation executed", message number RU514.
We are working on ECC 6.0.
Can anyone help?
Please check whether the order is released or not. Also, if there is a costing error in the confirmation, it will not post the confirmation. Please send me the details of the error message.
Thanks
Similar Messages
-
Hi Guys,
Please shed some light on the two scenarios below:
1) I would like to make the wage group field on the CO12 screen mandatory. In transaction CO86 I selected field AFRUD-LOGRP as a required field on both the initial and the detail screen, but it is still not working.
2) On the collective entry confirmation (CO12) screen the system displays activities 1, 2, and 3. I would like to change these displayed activity texts to "production time", "setup", etc. Please suggest the settings to do this.
Thanks in advance,
Mohan M

Hi Sree Gowda,
I have three options in the wage group. When I confirm production orders collectively via CO12 I select the appropriate wage group, hence I want to make this field mandatory.
Thanks in advance,
Mohan M -
How to use collective confirmation CO12
Can anybody explain how to use collective confirmation CO12, with a screen dump?
Please help.

Hi Viswa,
Prerequisites:
You can predefine values to identify confirmations and to determine default values in Conf. parameters for collective confirmation/fast entry (Customizing for Shop Floor Control by choosing Operations → Confirmation → Confirmation Parameters Collective Entry/Fast Entry):
Identifying the confirmation
You can identify an individual confirmation either by the confirmation number or the order number/operation number.
Suggest actual data
You can define whether and when (after entry, during saving) quantities (see Determining a Default for Yield) or activities are suggested. The default setting is that no quantities are suggested and activities are determined during saving. Personnel data and dates are always suggested.
Procedure:
Choose Logistics → Production → Shop floor control → Confirmation → Enter → For operation → Collective entry.
The screen for collective entry of confirmations appears.
In this screen you can enter confirmations in a table (table control). You enter a new confirmation in each line. You can control the appearance of the table yourself.
You can change the type of confirmation identification during input. To do so choose Other view.
Enter all the data needed for the confirmations. Each line corresponds to a new confirmation. In this line you have all the fields that are available in single entry of a time ticket confirmation.
On the top line of the screen you can enter default values (for example unit of measure, personnel number) that are copied to all confirmations. Enter the values and choose the corresponding icon. You can save these default values for a specific user, so that they are preset the next time the user logs on. You can also delete these defaults.
You can also let the system suggest actual data (quantities, activities, dates, personnel data). To do so, select the confirmations, choose Propose actual data, set the indicator for the relevant actual data and confirm. This data overwrites data that you may have already entered.
Save your confirmations.
Note
To reach the Actual data screen for the selected confirmations, choose the corresponding icon. Here you have the complete functionality of time ticket confirmation. From this screen you can go to the detail screens and to the goods movement overview. If several confirmations have been selected, you can switch between the individual confirmations.
Note
You can switch to the goods movement overview from the collective entry screen (choose Goods movements). In this screen the goods movements for the selected confirmations are displayed.
You can check and, if necessary, change the goods movements. When you go back to the collective entry screen, the GM indicator is set for the selected confirmations (goods movements for confirmations already determined). As soon as you save a confirmation, the goods movements are also posted.
For more information, see the goods movement overview.
Regards,
Madhu.G
Edited by: madhu333mac on Jan 11, 2012 11:34 PM -
After 2 hours of being transferred back and forth between billing, OneBill, Home and Wireless, no one could prove that I owed money. Wireless said I owed nothing, and Home said I had a credit for cancelling a couple of days before the billing cycle but owed for wireless. After a three-way call between Home and Wireless, with Wireless insisting I did not owe anything, the Home representative disconnected herself, and the Wireless rep promised me a call back the next day to get the other department online and get everything resolved. No one called me back the next day or the day after. Knowing this could go in circles, and not wanting anything bad on my credit, I paid the settlement amount to the collection agencies. Now I see that my credit score got dinged for an amount that Verizon couldn't prove I owed. I should not have paid it at all. If someone from Verizon reads this message, please get back to me and get this resolved. After more than 15 years with this company, this is making me so frustrated. I liked Verizon services, but at this point I am going to start looking for someone with better billing.
I called customer service and spoke to a Verizon Wireless representative, but they could not pull anything on my record. So after two transfers I asked to speak to financial services. Spoke to Ronnie, Ext 5100. The representative was very rude, not listening, and kept insisting that since I paid the collection agency I cannot dispute, and said I should not have paid the collection agency. The rep was talking over me, not listening to me explain my frustration of being on the phone for hours in November between Verizon OneBill, Home and Wireless and getting nothing resolved, which is why I paid. Said there's nothing she can do except put a note to the credit bureau that the account is paid in full. I asked to talk to a supervisor or a manager and she said no one can do anything else. I insisted and she asked me to leave a phone number for a call back. After all the calls and speaking to many reps, this was the worst. Now I am back to square one waiting for someone to call me back. And my credit score is still one hundred points lower, thanks to Verizon!!!
-
Hello, I need to loop through the collection to process 500 records at a time from the result set.
How do I do this?
I have created another variable of the same type as the result-set output variable, but I am not sure how to assign nodes 1 to 1000 to the new variable.

Hi,
You can loop over your collection with a while loop. The following is from the bpel samples directory (112.Arrays) adapted to your needs. Define the variables:
<variable name="iterator" type="xsd:integer"/>
<variable name="count" type="xsd:integer"/>
<variable name="xpath" type="xsd:string"/>
Set the initial values:
<assign name="SetInitialValues">
<copy>
<from expression="1"/>
<to variable="iterator"/>
</copy>
<copy>
<from expression="ora:countNodes('input', 'payload','/tns:collection/tns:item')"/>
<to variable="count"/>
</copy>
</assign>
Make a while loop:
<while condition=" bpws:getVariableData('count') >= bpws:getVariableData('iterator')">
<sequence>
<assign name="setAttribute">
<copy>
<from expression="concat('/tns:collection/tns:item[',bpws:getVariableData('iterator'),']/tns:value')"/>
<to variable="xpath"/>
</copy>
<copy>
<from expression="bpws:getVariableData('input','payload',bpws:getVariableData('xpath'))"/>
<to variable="output" part="payload" query="/tns:recordVariable/tns:value"/>
</copy>
<!--
Place here your own actions to be executed for every row
-->
<copy>
<from expression="bpws:getVariableData('iterator') + 1"/>
<to variable="iterator"/>
</copy>
</assign>
</sequence>
</while>
So the trick is to dynamically build an xpath expression based on the while loop counter.
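Outside of BPEL, the same batching idea (walking a large result set 500 records at a time, as the original question asks) can be sketched in plain Java. This is only an illustration; the `ChunkDemo` class, the `chunk` helper and the row counts are invented for the example, not part of the BPEL sample:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkDemo {
    // Split a large list into consecutive batches of at most `size` elements.
    static <T> List<List<T>> chunk(List<T> rows, int size) {
        List<List<T>> batches = new ArrayList<>();
        for (int from = 0; from < rows.size(); from += size) {
            int to = Math.min(from + size, rows.size());
            batches.add(new ArrayList<>(rows.subList(from, to)));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 1200; i++) rows.add(i);
        List<List<Integer>> batches = chunk(rows, 500);
        System.out.println(batches.size());        // 3 batches
        System.out.println(batches.get(2).size()); // the last batch holds the 200 leftover rows
    }
}
```

The BPEL while loop plays the role of the `for` loop here: the iterator variable is the batch offset, and the dynamic XPath selects the current slice.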
Kind Regards,
Andre -
Looping through Collection and getting ConcurrentModificationException
I'm trying to loop through a collection to delete some objects referenced by that collection but keep coming up against a ConcurrentModificationException.
BwView view = svci.getViewsHandler().find("Lists");
Collection<BwSubscription> subarr = svci.getSubscriptionsHandler().getAll();
if (subarr == null) {
    logger.info("Its not working");
} else {
    for (BwSubscription sub : subarr) {
        logger.info("Removing subs " + sub);
        svci.beginTransaction();
        svci.getSubscriptionsHandler().delete(sub);
        logger.info("Deleting calendars: " + sub);
        svci.endTransaction();
    }
}
The loop allows me to delete the first entry but then fails with the exception.
I have tried with a generic loop but this doesn't initialise the sub entity so the loop fails and is ignored leading to problems later in the code.
BwView view = svci.getViewsHandler().find("Lists");
BwSubscription[] subarr = view.getSubscriptions().toArray(new BwSubscription[0]);
if (subarr == null) {
    logger.info("Its not working");
} else {
    for (int i = subarr.length - 1; i >= 0; i--) {
        BwSubscription sub = subarr[i];
        logger.info("Removing subs " + sub);
        svci.beginTransaction();
        svci.getSubscriptionsHandler().delete(sub);
        logger.info("Deleting calendars: " + sub);
        svci.endTransaction();
    }
}
sub is either not initialised or gets initialised as 0, causing an ArrayIndexOutOfBoundsException. I'd be grateful for some advice on getting the code to loop correctly.

While iterating over a collection (using its iterator), a ConcurrentModificationException will be thrown if the collection is modified, except if it is modified using the iterator itself. The enhanced for-loop you're using iterates over the collection by implicitly using an Iterator. To do what you want, make the Iterator explicit (change the enhanced for-loop to a while loop) and then use iterator.remove().
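To make the suggested fix concrete, here is a minimal, self-contained sketch of removing elements through the Iterator itself; the list contents and class name are made up for illustration:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class IteratorRemoveDemo {
    public static void main(String[] args) {
        List<String> subs = new ArrayList<>(List.of("news", "lists", "sports"));

        // for (String s : subs) { subs.remove(s); }  // would throw ConcurrentModificationException

        Iterator<String> it = subs.iterator();
        while (it.hasNext()) {
            String sub = it.next();
            if (sub.equals("lists")) {
                it.remove(); // safe: the modification goes through the iterator
            }
        }
        System.out.println(subs); // [news, sports]
    }
}
```

The same pattern applies to the subscription loop above: obtain the iterator explicitly and call its `remove()` instead of deleting from the underlying collection mid-iteration.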
-
Looping through collection of symbols and populating a drop down on a form EXTENDSCRIPT
Hello everyone-
I have a question. Would it be possible to run a script that loops through a collection of symbols in a defined library and populates a drop-down with the available symbols? My goal is to be able to open Illustrator, and when I run the script the drop-down is populated with all available symbols (text is fine) in a defined library, which I can then select and place on the artboard. Any help would be appreciated.
Thanks in advance!

So...
Dim appref As New Illustrator.Application
Dim docRef As Illustrator.Document 'The work document
Dim docToAdd As Illustrator.Document 'The document which contains the symbol library
Dim pathArt As Object
Dim symbolref As Illustrator.Symbol 'The symbol you are looking for
Dim itemref As Illustrator.SymbolItem 'The placed symbol item
Dim selectedObjects As Variant 'The current selection
Dim artObject As Variant
appref.Open ("C:\..... workDoc.ai")
Set docRef = appref.ActiveDocument 'Assign docRef to your work document
appref.Open ("C:\......symbolDoc.ai")
Set docToAdd = appref.ActiveDocument 'Assign docToAdd to your symbol document
Set symbolref = appref.ActiveDocument.Symbols(Symbol_Name) 'Assign symbolref the symbol you're looking for
Set itemref = appref.ActiveDocument.SymbolItems.Add(symbolref) 'Copy it onto the current active page
itemref.Selected = True 'Select it
docToAdd.Copy 'Add it to your clipboard
docToAdd.Close (aiDoNotSaveChanges) 'Close the library without changing anything
appref.ActiveDocument.Paste 'Paste the clipboard on your work document
'As soon as you paste the symbol, it will be added to the document's symbol library
ClipBoard_Clear 'See below why...
selectedObjects = appref.ActiveDocument.Selection 'The symbol you pasted
' The loop is not necessary if you have only one thing selected
For Each artObject In selectedObjects
artObject.Delete
Next
' Cleanup your pointers
Set artObject = Nothing
Set pathArt = Nothing
Set docToAdd = Nothing
[....] Continue your script on your working document
Set docRef = Nothing 'after closing your working doc...
Remark on ClipBoard_Clear: if you don't do it, you risk accumulating plenty of other things you don't want...
Public Function ClipBoard_Clear()
Call OpenClipboard(0&)
Call EmptyClipboard
Call CloseClipboard
End Function
With
Public Declare PtrSafe Function OpenClipboard Lib "user32" (ByVal hWnd As Long) As Long
Public Declare PtrSafe Function CloseClipboard Lib "user32" () As Long
Public Declare PtrSafe Function EmptyClipboard Lib "user32" () As Long
Be careful: I use the PtrSafe keyword because I'm working on x64 architecture.
On x86 it will be:
Public Declare Function OpenClipboard Lib "user32" (ByVal hWnd As Long) As Long
Public Declare Function CloseClipboard Lib "user32" () As Long
Public Declare Function EmptyClipboard Lib "user32" () As Long
Regards to all
Michel -
PO through Collective or RFQ no.
Dear All
I have made a comparison through the ME57 t-code in SAP and I want to create the PO from that RFQ number, but I did not get the PR number in the requisition tab, so it does not give any error message if I make several POs against the same RFQ with full quantity.
regds
devesh

Dear,
But it should be limited to the pending PR quantity, right?
See: when I create the PO through the RFQ number, it should also bring the PR number into the PO, but it didn't. OK, no problem, I saved the PO without the PR number.
Now I made another PO with the same RFQ number. It should give an error message during the save, but it didn't. That means I can create as many POs as I want against the same RFQ, or there is some error in configuration. Please check and revert.
regds
devesh -
Collective Confirmations/Fast Entry(CO12/CO1V)
Hi Frnds
I have a question on the collective entry/fast entry of production orders.
My requirement is:
Whenever I enter the set of production orders, the system should pick operation number 0010 as default.
The system should get the activity names as maintained in the work center/routing, rather than the usual activity 1, 2, 3.
The user amends the activity details for the quantity partially confirmed.
The confirmation screen needs to be set up so that the activity details populate next to the quantity column.
The above settings for CO1V/CO12 need to be common for all users; whoever opens the transaction should have the same settings.
Any document is highly appreciated. You can send the doc to (me4sap at gmal.cm).
thanks
Rajan
Edited by: Rajan Kotagal S on Jun 3, 2009 2:03 AM

Hi Sree Gowda,
I have three options in the wage group. When I confirm production orders collectively via CO12 I select the appropriate wage group, hence I want to make this field mandatory.
thanks in advance,
Mohan M -
GR qty showing less than production order confirmation qty
Hello Experts,
Greetings.
My issue is that while doing production order confirmation (CO12), for some materials the GR quantity shown is less than the confirmation quantity. I am wondering how this can be possible.
What could be the problem?
Please advise.
Thanks
Nitin

The confirmed qty of each operation and the delivered qty at the header are two different fields. The confirmed qty will be the same as the yield entered during confirmation, whereas the GR qty is the total quantity received from the order.
Check whether the confirmation creates an automatic GR or not. If yes, check the COGI t-code; there might be an error log for the goods receipt in there. If not, compare the GR qty in the goods movement document with the confirmed qty from the confirmation.
Hope it helps. -
SharePoint 2010 designer workflow assoicated tasklist not showing columns added by collect data
Hi,
We had a SharePoint Designer workflow in production. This is a reusable workflow, and it uses the collect data action to get some data from users. Through the collect data action, we added 7 columns (default task list), which have been used in logging, email templates and other places in the rest of the workflow.
Suddenly we find that all the references to task columns (added through collect data) are not showing up in the workflow. If we try to add those collect data columns, they are not showing in the "Field from Source" field as shown below.
Here the Association Tasklist fields are not showing.
If we again try to select columns, we find that all the columns that we added through collect data are no longer visible in the associated task list.
Please help.

Hi Fender,
Please can you confirm which version of Outlook your users have (and if they are all on the same version)? All my non-2007 users lack the Edit Item button and have to use the links in the body of the email, which I believe is simply because 2003 et al. lack the same SharePoint integration options as 2007.
Cheers
Stew -
What is difference between Iterator and Collection Wrapper?
Hi all,
I don't understand the actual difference between an Iterator and a Collection wrapper. I observed both are used for the same purpose. Could anyone please let me know when to use a Collection wrapper and when to use an Iterator?
Thanks,
Chinnu.

L_Kiryl is right.
Collections support global iteration (through collection->get_next( )) and local iteration (through iterator->get_next( )).
Each collection has a focus object. Initially, the first object has the focus.
Any global iteration moves the focus, which is published by the event FOCUS_CHANGED of the collection.
If you want to iterate over the collection without moving the focus (and without triggering time-consuming follow-up processes) you can use local iteration. To do so, request an iterator object from the collection and use it to iterate.
One more advantage of using an iterator: it takes care of deleted entities. With global iteration you get an exception when you reach a deleted entity, but an iterator raises no exception in the same situation.
How to create a dynamic-size array/collection
Hi,
Can someone point me to a tutorial on how to create a dynamic-size array? I have multiple cursors on different tables and want to loop through each cursor, get some values from each, and put them in an array. But I don't know how to create or initialize an array when I don't know the size. Is there any other way?
Here is what I am doing: I have 6 cursors on different tables. I loop through each cursor and get some specific data that I need to collect in one place after looping through all the cursors, which then finally needs to be inserted into one table. Before the insert I need to validate the data, so I want all of it in an array or some other temporary storage, since it's easier to validate and insert there rather than while looping through the cursors, as there may be duplicates which I am trying to remove.
As this procedure will be called multiple times, I wanted to save the cursor data in a temporary array before inserting into the final table. Looking for a faster and more efficient way.
Any help is appreciated.
Thanks

guest0012 wrote:
All the 6 cursors are independent with no relation, i.e. we cannot have a join and one SQL, as there is no relationship between the tables.
If there is no relation, then what is your code doing combining unrelated rows into the same GTT/array?
Now using a GTT, when I do an insert, how do I make sure the same data does not already exist? (The GTT will only have one column.)
Then create a unique index or primary key for the GTT and use that to enforce uniqueness.
So every time I iterate over a cursor I have to insert into the GTT, and then finally have to iterate over the GTT again and do an insert into the final table, which may be a performance issue.
Which is why using SQL will be faster and more scalable; and if PL/SQL code/logic can be used to glue these "no relationship" tables together, why can that not be done in SQL?
That's why I was wondering if I can use any kind of array or collection, as it will be a collection of numbers.
And that will reside in expensive PGA memory. Which means a limit on the size of the collection/array you can safely create without impacting server performance, or even causing a server crash by exhausting all virtual memory and causing swap space to thrash.
And finally I will just iterate over the array and use FORALL for the insert, but I don't know what the size of the array will be, as I only know the total number of records to be stored after looping through the cursors. So I am wondering how to do it through a collection/array: is there a way to initialize the array and keep populating it with data without defining the size beforehand?
You cannot append the bulk collect of one cursor into the collection used for bulk collecting another cursor.
Collections are automatically sized to the number of rows fetched. If you want to manually size a collection, the Extend() method needs to be used, where the method's argument specifies the number of cells/locations to add to the collection.
From what you describe about your issue, collections are not the correct choice. If you are going to put the different tables' data into the same collection, then you can also combine those tables' data using a single SQL projection (via a UNION for example).
And doing the data crunching in SQL is always superior in scalability and performance to doing it in PL/SQL.
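As a tiny illustration of the UNION suggestion, here is a runnable sketch. It uses sqlite3 purely so the example can execute anywhere; the thread itself is about Oracle, and the table names and values are invented:

```shell
sqlite3 :memory: <<'SQL'
CREATE TABLE t1(val INTEGER);
CREATE TABLE t2(val INTEGER);
INSERT INTO t1 VALUES (1),(2);
INSERT INTO t2 VALUES (2),(3);
-- UNION combines both tables' rows and removes duplicates in a single SQL statement,
-- instead of looping over two cursors in procedural code
SELECT val FROM t1 UNION SELECT val FROM t2 ORDER BY val;
SQL
```

The same shape applies in Oracle: one INSERT ... SELECT over the UNION replaces the cursor loops and the intermediate collection entirely.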
Updating a table from a collection
I have some data in a collection that is meant to update a table
I have a string of colon delimited column names like
pk:age:height:weight
The table contains the PK and the data for column names as per the string above.
So,
c001=pk
c002=age
c003=height
c004=weight
How can I loop through the table and create/execute an UPDATE statement that updates those columns in a pre-specified table, of course using bind variables as much as possible?
Thanks

You might do it more easily by looping through the collection in a process such as:
for c in (select c001 pk, c002 age, c003 height, c004 weight
            from your_collection) loop
  update your_tbl set age = c.age, ...
   where pk = c.pk;
end loop;
Or you may "loop thru the table" and update in a process such as:
for c in (select * from tbl) loop
  for cc in (select c001 pk, c002 age, c003 height, c004 weight
               from your_collection
              where c001 = c.pk) loop
    update tbl set age = cc.age, ...
     where pk = cc.pk;
  end loop;
end loop;
There should not be a mutation problem, but I am not sure. Good luck.
DC
DC -
Reading from .CSV and storing it into a collection
Hi folks,
Is there a way to make a dynamic procedure that works with .CSV documents and stores them into a collection? For example, you have to make a procedure to read from a .CSV, but users upload 10 different versions that have different numbers of columns.
Normally I would define a record type to match those columns and store the rows into a collection. However, if I don't know the number of columns I would need to define 10 records in advance, which I am trying to avoid.
The problem is I can't define SQL elements on the fly. On production I don't have the rights to dynamically create a table to match my columns and then drop it after I no longer need it, so I need to store the data into a collection.
And the last option, where I would loop through the document and do the operations I need, is not good since the document is part of other procedures that write to and read from it. The idea is to pick up the data, store it into a collection, close the file, and then work with it.
This is what I got so far:
declare
-- Variables
l_file utl_file.file_type;
l_line varchar2(10000);
l_string varchar2(32000);
l_delimiter varchar2(10);
-- Types
type r_kolona is record(
column_1 varchar2(500)
,column_2 varchar2(500)
,column_3 varchar2(500)
,column_4 varchar2(500)
,column_5 varchar2(500));
type t_column_table is table of r_kolona;
t_column t_column_table := t_column_table();
begin
/*Define the delimiter*/
l_delimiter := ';';
/*Open file*/
l_file := utl_file.fopen( 'some dir', 'some.csv', 'R');
/*Takes first row of document as header*/
utl_file.get_line( l_file, l_line);
loop
begin
utl_file.get_line( l_file, l_line);
/*Strip the trailing carriage return and append a delimiter*/
l_string := rtrim( l_line, chr(13)) || l_delimiter;
/*Extend array and insert parsed values */
t_column.extend;
t_column(t_column.last).column_1 := substr( l_string, 1, instr( l_string, l_delimiter, 1, 1) - 1);
t_column(t_column.last).column_2 := substr( l_string, instr( l_string, l_delimiter, 1, 1) + 1, instr( l_string, l_delimiter, 1, 2) - instr( l_string, l_delimiter, 1, 1) - 1);
t_column(t_column.last).column_3 := substr( l_string, instr( l_string, l_delimiter, 1, 2) + 1, instr( l_string, l_delimiter, 1, 3) - instr( l_string, l_delimiter, 1, 2) - 1);
t_column(t_column.last).column_4 := substr( l_string, instr( l_string, l_delimiter, 1, 3) + 1, instr( l_string, l_delimiter, 1, 4) - instr( l_string, l_delimiter, 1, 3) - 1);
t_column(t_column.last).column_5 := substr( l_string, instr( l_string, l_delimiter, 1, 4) + 1, instr( l_string, l_delimiter, 1, 5) - instr( l_string, l_delimiter, 1, 4) - 1);
exception
when no_data_found then
exit;
end;
end loop;
/*Close file*/
utl_file.fclose(l_file);
/*Loop through collection elements*/
for i in t_column.first .. t_column.last
loop
dbms_output.put_line(
t_column(i).column_1
|| ' '
|| t_column(i).column_2
|| ' '
|| t_column(i).column_3
|| ' '
|| t_column(i).column_4
|| ' '
|| t_column(i).column_5);
end loop;
exception
when others then
utl_file.fclose(l_file);
raise; -- re-raise so errors are not silently swallowed
end;

A stupid version would be to define a record with 50 elements and hope they don't nuke the Excel with more columns :)
Best regards,
Igor

Igor S. wrote:
Use some to query data and then fix wrong entries on prod (insert, update, delete). Manipulate with some and then make new reports. That's the first that comes to mind, but basically the idea is to write a procedure that can be used for ANY .csv so I don't have to rewrite the code.
This is logically wrong and smacks of poor design.
You're wanting to take CSV files with various unknown formats of data, read that data into some generic structure, and then somehow magically be able to process the unknown data to "fix wrong entries". If everything is unknown... how will you know what needs fixing?
Good design of any system stipulates the structures that are acceptable. If that means you know there are just 20 possible CSV formats and you can implement a mechanism to determine which format a particular CSV is in (perhaps something in the filename?), then you will create 20 known targets (record structures/tables or whatever) to receive that data into, using 20 external tables, or procedures, or whatever is necessary.
Doing anything other than that is poor design, leaves the code open to breaking, is non-scalable, hard to debug, and just wrong on so many levels. This isn't how software is engineered.
For example, you have 20 developers that have to work with .CSV files. So when someone has to work with a .CSV he would call a procedure with directory and file name parameters, and as an out parameter would get a collection with the .CSV stored inside.
As others have mentioned, give the developers an Apex application for their data entry/manipulation, working directly on the database with known structures and validation so they can't create "wrong" data in the first place. They can then export that as .CSV data for other purposes if really required.
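Incidentally, the substr/instr parsing in the original snippet is essentially a delimiter split. A minimal Java sketch of the same idea (the sample line is invented; note the `-1` limit needed to keep trailing empty columns, which the PL/SQL version handles by appending a delimiter to each line):

```java
public class SplitDemo {
    public static void main(String[] args) {
        String line = "Igor;;Zagreb;42;";
        // With limit -1, trailing empty columns are preserved
        String[] cols = line.split(";", -1);
        System.out.println(cols.length); // 5 columns, two of them empty
        // The default split drops trailing empty strings
        System.out.println(line.split(";").length); // 4
    }
}
```

This is why the PL/SQL code appends `l_delimiter` before parsing: it guarantees every column, including an empty last one, is terminated by a delimiter.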