How to enforce data collection

Hi everybody,
how can I enforce data collection at a specific operation? I am using the integrated POD showing the data to be collected in one part of the window. The data collection itself works fine, but it is merely optional. I can complete the operation without a warning.
So, where is the hook or rule that I need to set up in order to check whether data collection has been done before completion of the operation?
Georg

Hello,
Please check the link below:
http://help.sap.com/saphelp_me52/helpdata/EN/d7/f7f0be3fec4a31bec083a035eb2423/content.htm
It explains activity CT500. This activity deals with the checking of components, e.g. making them mandatory for assembly. You can configure the CT500 and CT500_RICH activities in Activity Maintenance, and you have to use these activities on the POD.
Note: The system executes all code associated with a hook point in the same database transaction. For hook points within POD pushbutton activities, the transaction includes a single pushbutton activity, such as Start (PR500). If the hook point activity fails, the system rolls back, or cancels, the entire transaction. For example, in the figure in Site Level Hook Points, if you associate Check Configuration (CT520) with the POST_START hook point and the components have not been assembled, the system rolls back the Start as well. This is true for all hook points.
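As a rough illustration of those transaction semantics, here is a minimal Java sketch; executeStart and checkConfiguration are hypothetical stand-ins, not SAP ME API calls:

    import java.sql.Connection;

    public class HookPointSketch {
        // One database transaction wraps the pushbutton activity and its hooks.
        static void startWithHooks(Connection con) throws Exception {
            con.setAutoCommit(false);        // button + hook activities share one transaction
            try {
                executeStart(con);           // pushbutton activity, e.g. Start (PR500)
                checkConfiguration(con);     // POST_START hook activity, e.g. CT520
                con.commit();                // persists only if every hook succeeded
            } catch (Exception failedHook) {
                con.rollback();              // a failing hook cancels the Start as well
                throw failedHook;
            }
        }

        static void executeStart(Connection con) throws Exception { /* hypothetical */ }
        static void checkConfiguration(Connection con) throws Exception { /* hypothetical */ }
    }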
br,
Pushkar

Similar Messages

  • How to perform Data Collection on single SFC with QTY = 1 with material lot size 1?

    Dear experts,
    We are working with SFC qty > 1 on a relaxed routing. At a given operation we want to collect the data on a single quantity; i.e. the SFC qty at that operation, where the collection will happen, will be 1. The corresponding material lot size is, for example, 10. The operator must be able to collect data on the SFC with qty = 1 multiple times until the quantity is consumed. He must also be able to collect other values on the remaining quantities at the same operation with the same DC group or other DC groups. How many times the data must be collected depends on the shop order build quantity. The data may be collected several times, but not more than the build qty. In other words, some specific data will be collected on a qty of a product while other data will be collected against the remaining quantity. The data collection must also be done in a serialized manner.
    Here's what we have set up so far:
    1) 3 DC groups, each DC group has 3 data fields.
    2) Each data field has the following restrictions:  Required Data Entries = 0 and Optional Data Entries = 1
    3) All DC groups are attached to the same combination of operation/material/routing
    4) we are using relaxed routing
    Process description:
    The operator must be able to collect any data field on a single product. To do that, he will enter the operation where the DC groups are attached, enter the SFC with qty = 1, then run the data collection after selecting the appropriate DC group and entering the needed information. The operator will then complete the SFC with qty = 1.
    The operator will pick the next product, select the same SFC, enter qty 1, and collect another value against this product.
    Problem is:
    Once the first collection is done on a given SFC with entered qty = 1, the system does not allow the operator to perform further collections on the same SFC with qty = 1 or any other quantity. He cannot select any DC group from the DC group list. We also tried the table selection menu on the DC group list, but nothing can be selected.
    So we tried to play around with the DC group definitions as follows:
    A) We set Required Data Entries = 0 and Optional Data Entries = 10. Still, the operator was not able to select any DC group after collecting data the first time. We tried to reopen the POD and the list again, but we get the same blocking behavior.
    B) We set Required Data Entries = 10 and Optional Data Entries = 1. The operator was able to select the DC group after collecting data the first time, BUT the operator must then enter the data fields 10 times on one SFC quantity, which is not what we want. Besides, again he cannot collect other information on the remaining quantities at the same operation.
    C) There's an option to serialize the SFC before it reaches the operation where the collection happens, then merge after completion. Automation is needed here, hence customization. We would rather avoid customization, since we expect data collection to work on single quantities even when the main SFC has qty > 1.
    Questions:
    1) Are we missing any kind of further configuration/setup?
    2) Or does the current system design not allow collecting data on single quantities of an SFC whose main quantity is greater than 1?
    3) Looking at this link, Approaches to Collection of Data - SAP Manufacturing Execution (SAP ME) - SAP Library, nothing is mentioned about the same SFC number with multiple quantities.
    We are using SAP ME 15.0.3.0.
    Thanks in advance for your help
    Ali

    Ali,
    To collect data for the same SFC multiple times, the system rule "Allow Multiple Data Collection" needs to be set to true for the site.
    Stuart

  • [JAVA] How to insert collected data into a table

    Hello!
    I'm writing a program that monitors the signal of a sensor in a chart. Now I want to insert the data collected by the sensor into a table using a JTable. The problem is that when I run the program, the data does not appear in the table.
    Where did I go wrong?
    Here's part of the source code.
    import java.awt.event.ActionEvent;
    import java.util.ArrayList;
    import javax.swing.table.AbstractTableModel;

    ArrayList<Data> datiArray = new ArrayList<Data>();
    DataTableModel dtm = new DataTableModel(datiArray);

    public class DataTableModel extends AbstractTableModel {
        private ArrayList<Data> datiArray;    // <---- the table model owns the collection
        private String colName[] = {"Time", "G-value"};

        public DataTableModel(ArrayList<Data> datiArray) {
            this.datiArray = datiArray;
        }

        public void addData(Data d) {
            datiArray.add(d);
            int row = datiArray.size() - 1;
            fireTableRowsInserted(row, row);  // tell the table a row was appended
        }

        public String getColumnName(int col) {
            return colName[col];
        }

        public int getRowCount() {
            return datiArray.size();
        }

        public int getColumnCount() {
            return 2;
        }

        public boolean isCellEditable(int row, int col) {
            return false;
        }

        public Class getColumnClass(int c) {
            return (c == 0) ? Long.class : Double.class;
        }

        public Object getValueAt(int row, int column) {
            Data d = datiArray.get(row);      // was "dati", a variable that does not exist
            switch (column) {
                case 0: return d.getTime();
                case 1: return d.getGvalue();
            }
            return null;
        }
    }

    private class Data {
        public long time;
        public double gvalue;

        public Data(long time, double gvalue) {
            this.time = time;                 // was "this.tempo = tempo"
            this.gvalue = gvalue;
        }

        public long getTime() { return time; }
        public double getGvalue() { return gvalue; }
    }

    // RecordButtonAction
    private void recordButtonActionPerformed(ActionEvent evt) {
        int i = 0;
        int j = graphView.getSampleTime();
        int k = graphView.getIndexMax();
        System.out.println(j);
        System.out.println(k);
        while (i < k) {
            Data dr = new Data(graphView.getTime(i), graphView.getGvalue(i));
            dtm.addData(dr);
            // these println calls just verify that the values actually reached Data and DataTableModel
            System.out.print(graphView.getTime(i));
            System.out.println("\t" + graphView.getGvalue(i));            // was "/t"
            System.out.println(dr.getTime());
            System.out.println(dtm.getValueAt(dtm.getRowCount() - 1, 1)); // last inserted row, not sample index i
            i = i + j;
        }
        readyRecord = false;
    }
    Sorry for my bad English.
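    One thing worth double-checking, assuming the rest of the program compiles: the JTable must be constructed with (or switched to) this exact model instance, otherwise fireTableRowsInserted notifies a model that no table is listening to:

        JTable table = new JTable(dtm);   // or, for an existing table: table.setModel(dtm);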

    Please don't cross-post the same question in multiple forums. Especially since you've already been given directions as to how to ask the question smartly.

  • How to display data collected in the main VI on indicators in another VI

    Hi,
    I am collecting data using an NI 6251 USB DAQ in the main VI, but I have too many controls and indicators on the main VI front panel, so I want to display the indicators (graphs) in another VI. I also want the VI with the graphs to open automatically. Does anyone know how I can do this? I would really appreciate any help.
    Thanks,
     ygupta

    Hi,
    Search the forum for the subVI concept.
    For your reference I have attached an example.
    You can also find the procedure for this on the net.
    regards,
    Shrek
    Attachments:
    graph.zip (11 KB)

  • How to average data collected in a loop

    Hey everyone,
    I am using an interface card to read the voltage across a resistor to measure the current through a photodiode. The VI I made slowly increases the voltage applied across the sample. It takes a starting voltage, increases it by a specified increment, and then takes a series of measurements at that voltage (usually around 200 or 300). I had the program save all of this data to an external measurement file, where I would then average it in Excel. I had to change the program to measure three variables, and I want the VI to average the data and then save the average current at each voltage in a measurement file.
    Ex. 
    It used to export the data as...
    -1     .90
    -1     .80
    -1     .85
    I'm trying to get the program to average all of these values and then save them as one data point.
    -1     .85
    I would like the program to take the 200 or so data points, average them, and then save just the average in a file. 
    I usually have about 300 different applied voltages to measure, and with 200 current readings at each it becomes a huge amount of data.
    Right now the part of the VI that takes the measurements is in a while loop, and once the number of loop iterations reaches the specified number of measurements it stops running. The program then increases the voltage and runs the measurement loop again. I got everything else working; I just can't figure out a way to average all the data.
    Any help would be greatly appreciated

    Alright, I just started using LabVIEW last week and I knew that I would have to use shift registers, but when I tried to create one, the Add Shift Register option was grayed out. All I had to do was click on the right or left side of the loop instead of the bottom, which is what I had been trying before.
    Thanks for the fast response
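    For reference, here is the accumulate-and-divide logic that the shift register carries, sketched in Java since a LabVIEW diagram has no text form; readSample is a hypothetical stand-in for a single DAQ read:

        import java.util.function.DoubleSupplier;

        public class LoopAverage {
            // Accumulate n readings, then divide once; the running sum is the value
            // a shift register would pass between loop iterations.
            static double averageOf(int n, DoubleSupplier readSample) {
                double sum = 0.0;
                for (int i = 0; i < n; i++) {
                    sum += readSample.getAsDouble();   // one current reading
                }
                return sum / n;                        // one averaged point per voltage step
            }
        }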

  • Data Collection in BPC

    Hi,
    How does Data Collection (tcode UCMON) work in BPC? Do we have any standard Data Manager package to do the data collection? How can we achieve this?
    Would appreciate your time and response.
    Thanks.

    Hi,
    First, I am not sure what the tcode you mentioned does. There is no such tcode in BPC as far as I am aware.
    Regarding data collection into a BPC application (in other words, a cube):
    You could take a flat-file approach: upload the flat file into the BPC file service using Upload Data File, and then, using the standard Import data manager package, import the data into the BPC application. You need to define and use the BPC transformations and conversions (if any) in the import process. These are nothing but files.
    Alternatively, you could also load the data directly from a BW InfoCube or even from a MultiProvider. There is a separate data manager package to do this.
    You could also collect data into BPC through input schedules.
    You have the advantage of triggering the default logic while collecting the data, which would apply your business logic to the data before it is saved to the database.
    Hope the above gives some insight to you.
    Thanks

  • How to Check UPL data collection at Solution Manager

    Hello,
    I have activated UPL data collection as per the document "How-To Guide – SAP Custom Code Management Usage and Procedure Logging".
    I can check UPL data in my managed system with report "/SDF/SHOW_UPL",
    but I am not able to get the same data in Solution Manager with that same report (/SDF/SHOW_UPL).
    When I run this report in Solman, it shows me only Solution Manager data.
    How can I read managed-system data in Solman? And what are the checkpoints (for both Solman and the managed system) I need to verify to ensure my UPL data collection is working fine?

    Set up CCLM and the custom code setup, and you will be able to view UPL data that way.

  • How do I synchronize data collection on two DT3001 boards?

    I am very new to LabVIEW, and I am trying to find an easy way to synchronize my data collection on two DT3001 boards. I have tried modifying the AI Continuous Scan example. This works because I am using all 16 channels on each of my boards. Unfortunately, I cannot find a way to make both boards begin collecting data at the same time. Is there an easy way to synchronize both boards to collect data at the same time? Thank you for your help!

    Hello Kacie,
    I'm not sure how your DT3001 device and its driver's LabVIEW functions work, but for synchronizing acquisition on two National Instruments DAQ cards, you need to share a start trigger and sample clock between the two cards by connecting the Real-Time System Integration (RTSI) buses of the cards using an RTSI cable. This provides dedicated lines for sharing the same sample clock and start trigger, and provides highly accurate hardware-timed synchronization. Any synchronized starting you program in software in LabVIEW will be software-timed. You can use the sequence structure within LabVIEW to group together different portions of code so that all the configuration is performed for both cards first, and then they are started at the same time. I know this probably isn't too much help, but I would need to know more about how your DT3001 device and the AI Continuous Scan example work. You might be able to find better help on this at Data Translation's website.
    Travis G.
    Applications Engineering
    National Instruments
    www.ni.com/support

  • I need to set up 9 iPhones and 4 iPads to be used as data collection devices. How can I do that without setting up 13 separate iTunes accounts?

    I need to set up 9 iPhones and 4 iPads to be used as data collection devices. How can I do that without setting up 13 separate iTunes accounts?

    You can use the same account to set up as many devices as you like.
    You might also find there are tools to help with a roll out.
    http://www.apple.com/support/iphone/enterprise/
    tt2

  • Is there anyone who knows how to apply machine learning algorithms to the spectrum data collected using NI USRP for the ISM band, in order to use the spectrum efficiently through link adaptation?

    Hi,
    May I know if anyone is working in cognitive radio research and applying machine learning techniques using the spectrum data collected from NI USRP kits?

    Can't edit my message anymore, so for the tldr crowd (too long didn't read), here is a shorter version:
    The good:
    The E4200 worked fine for about 15 hours with great throughput and link quality (with one brief disconnect in the middle; online gaming software shows brief disconnects that otherwise go totally unnoticed).
    The bad:
    Wireless on the E4200 stopped broadcasting entirely (to a Windows 7 laptop with an 802.11n 2.4 GHz USB adapter).
    I had switched between two adapters that use the same chipset and same driver (and in fact show up as a single device in Device Manager) without rebooting the router, but it worked (fine) for an hour after I swapped the adapters.
    Right before I rebooted it, I checked and the E4200 was not hot to the touch and wired internet was still working after the wireless radio stopped working.
    A reboot of the router cured it.
    The open question as to the root cause:
    Now I want to know whether this wireless radio ceasing to broadcast (which requires a reboot of the E4200) will happen daily.
    I.e. I want to know whether it really is overheating (then again, why would only some have that problem, needing to reboot daily, while others have gone weeks with the E4200 without a single reboot being needed and without any problems occurring) or whether it had to do with changing the adapters back and forth without rebooting the router. Keep in mind that the router worked fine for an hour or so after I stopped swapping the adapters.
    Extra question:
    Will getting a Cisco AE1000 USB adapter perhaps help? (I know this is a loaded question and there is no easy or sure answer, but even a "maybe", or reasons why it might help, would at this point be better than nothing.)
    Any similar experiences (with the router needing to be rebooted to get the wireless radio back on) would be appreciated, as they may help me and others experiencing these types of issues.

  • When or How does the time stamp on Report Manager / Data Collection change

    Hi,
    I have been waiting since 10 am for data to be available. Unfortunately, Sun MC has not updated the time stamp, and I am unable to create graphs for the machines I set up this morning. Any ideas on how to post data faster than waiting forever for the Sun MC server?
    thanks

    Hi,
    Clearly a bug. You can try changing your Tools | Preferences | Database | NLS | Timestamp Format to match the Date Format as a workaround. Otherwise, why not upgrade to the latest production SQL Developer release (3.1.07.42)? The bug does not occur there.
    Regards,
    Gary
    SQL Developer Team

  • How to filter data after Data Collection for Oracle PS

    Hi Experts,
    I have a critical issue with the data volume after the data collection program.
    If I give this big chunk of data as input to PS, the integration program does not take this high volume of data into PS.
    Hence I now have the critical task of reducing the amount of data.
    On this front, I first need to delete some items which are not required for PS from the base table (MSC_SYSTEM_ITEMS) after the collection program, but I do not know which dependent entities also have to be taken care of on the MSC side after collection if I delete the items from MSC_SYSTEM_ITEMS.
    Hence I require help on this data-filtering issue.
    Appreciating anyone's help in advance...
    Thanks & Regards,
    Bharathram N

    Hello,
    Have you looked at the request Purge ODS Data of Collections? Using this you can delete all the collected data for a specific instance ID.
    Once that is done you can run a complete refresh for all entities for all enabled organizations. This will ensure that old data pertaining to disabled orgs is removed.
    Additionally, in 11.5.10 (collections RUP 32 and later) there is a feature which does not pull data from disabled organizations into the source snapshots. This in itself does not make a difference for the ODS data, but it does give better performance in the refresh snapshots.
    One thing to take into account if you purge the ODS data: if you still have ASCP plans that are based on this data, you may end up with no data being displayed in the workbench (because the LID tables have been purged). In that case you would have to rerun the plans.
    best regards
    Geert

  • How to pass data from one component to another component

    Hi,
    I have created a table view on the Item Details page under the Price Agreement assignment block. The Price Agreement assignment block has one table as standard; I have created one more view under the assignment block of Price Agreement (CRMCMP_CND). In that view the user has the option to enter data after clicking Edit. If the user switches the custom view to edit mode and enters data, that data has to be saved when the user clicks the Save button on the Quotation page (BT116QH_SRVQ). From the Quotation page the user clicks on an item to get the item details, which contain the Price Agreement assignment block where the custom view was created and assigned. Data entered in that view needs to be stored when the user clicks Save on the Quotation page after coming back. How do I get the custom view data from CRMCMP_CND to BT116QH_SRVQ? Please advise. Thanks in advance.

    Hi Satish,
    Global custom controllers are not available everywhere, so before using them please make sure that they are available for your case.
    http://sapdiary.com/index.php?option=com_content&view=article&id=2402:sap-network-blog-interaction-center-global-custom-controllers&catid=81:data-services&Itemid=81
    If you run into problems passing data using component controllers, you can also pass it through navigation links.
    If you look at your outbound plug method op_xxxxx, there is a call to the navigate method. This method has an option to pass your data collection. Pass your data collection along with the inbound plug, so that in the target component you can read and temporarily store the data as soon as you reach the target component.
    Hope that helps,
    BJ

  • How to get data into the mySQL database?

    First some background.
    I have a website that has outgrown its designed dimensions and is a huge burden to maintain. See PPBM5 Benchmark.
    There is a lot of maintenance work involved, so I'm investigating a PHP/MySQL approach to ease the burden and to add functionality to the site. With the current Excel-based structure and over 420 entries, it is cumbersome for me to maintain, but also for users to find what they need.
    A MySQL-based dynamic structure is a lot easier and offers vastly more selection capabilities, like selecting only records that meet specific criteria.
    Data submission is done with a form that contains most of the relevant data, but the drawback is that people submitting their data are often not technically inclined and give wrong answers due to a lack of understanding, or make typos. The test results are attached in one or two separate .txt files, but often the submitters have not read the instructions correctly or did something wrong, so these attached .txt files cannot be trusted automatically; they have to be checked before inclusion.
    These were my initial thoughts:
    1. Data collection:
    To avoid spending all our energy and time on correcting typos, chasing missing data and correcting errors, I am investigating the use of CPU-Z in Ghost mode to create a .txt or .html file that contains all the relevant hardware info we need and even more. It gives all the info we currently have, but adds data like the number of memory sticks, DDR timings, stock clock speed and BCLK setting, video card info and VRAM size, etc.
    To see what I mean, run CPU-Z, go to the About tab and press the Save Report button and look at the results.
    This can all be done without user intervention in an automatic way, but maybe I need to add an AutoIt file to the test to make it all run as desired.
    If this works and I'm able to extract the relevant data from the created file and insert it into the database, we may be in business for the next version, PPBM5.5 or PPBM6. It does require a modification to the instructions, but it makes them a lot easier, because there is less data to fill out.
    2. Data submission:
    The submission form can be simplified if the CPU-Z data can be used. We have to create an automatic way to attach the created .html file from CPU-Z to the submission form, and we have to streamline the Output.txt and Output-MPE.txt files to be more easily included in the 'form.lib.php' file. It currently is manual labor and very time consuming.
    3. Adding to Database:
    I have to find a way to create database records from the Gmail forms I receive. All incoming mail messages need to be checked for relevancy and, if relevant, need to be added automatically to the database and then offered for approval before final inclusion in the database. Data included in the database will then comprise submission date and time, email address and IP address used, plus links to the files submitted and available on the website.
    4. Publication of the database:
    After approval of new records from step 3, all updates will be automatically applied to the database and become accessible to users. I do not yet intend to introduce a user account requiring login before all functionality is accessible; too much trouble and administration.
    Queries should be possible on things like CPU (check box), e.g. include i7-920, i7-930 and i7-950 but exclude i7-980X and i7-990X; size of memory (check box); overclocked (boolean: yes/no); SSD as OS disk; and similar options.
    The biggest problem is keeping the color grading and statistical indicators (Top, D9, Q3, Med, Q1 and D1) intact on dynamically generated queries. Say you make a query which results in 20 observations; this should show the related colors and legends. The next query results in 48 observations, and of course the color grading and legends need to reflect that. Question in my mind: does the RPI remain constant, independent of the query, or does it need to be recalculated on the basis of the query?
    The next thing is to allow a user to select a specific observation and, by simply clicking on it, be shown all the CPU-Z-related information about the hardware in a separate window (detail page) or accordion.
    The graphs, Top-20 and MPE Gains, need to be dynamically adjusted based on the query used.
    5. Ideally, external links:
    In an ideal situation, one could link the CPU-Z data to external price databases, looking up current prices for CPU, memory, video card, disks, RAID controller, etc., to get instant BFTB charts based on the query made. But that is the next step.
    Situation now:
    I have a MySQL database that is easily updated with the new submissions: simply create a .CSV file from the submitted forms and import that into the database. The bulk of the initial work is done. Lots remain to be done, as you can see above, but that is for a later time.
    Question:
    I have this table that needs to be filled with data from the submitted and attached files. Mr. X submitted his data and can be uniquely identified by his "Ref_ID". He attached one or two files in .TXT format with the relevant test data. These files are stored on the server with a concatenated name:
    "Ref_ID","-","filename"
    Say his Ref_ID is 20110204-6cf5 and his submitted file is called Output(99).txt; then the file can be found on the server as
    20110204-6cf5-Output(99).txt
    I need to be able to open that comma-delimited file, whose contents may look like this: "439","1036","819","531", and insert these contents into the relevant record and fields.
    Graphically, [screenshot omitted] is what I want to achieve.
    This being my first exposure to PHP/MySQL, you can imagine I'm not clear on how to go from here.
    An added complication is that I actually have 5 numbers to insert per record, and two further fields, Total Score and RPI, should be calculated fields. I haven't yet figured out how to handle calculated fields; maybe only in the PHP/HTML code and not in the database.
    I hope someone can help me.
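    A minimal sketch of the import step described above, written in Java/JDBC for concreteness (a PHP/mysqli version follows the same shape); the table and column names are assumptions, and the file name matches the example:

        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        public class ResultImporter {
            public static void main(String[] args) throws Exception {
                String refId = "20110204-6cf5";                    // from the submission form
                // Files are stored as "Ref_ID" + "-" + "filename"
                String line = Files.readString(Path.of(refId + "-Output(99).txt")).trim();
                // Contents look like: "439","1036","819","531"
                String[] parts = line.split(",");
                try (Connection con = DriverManager.getConnection(
                         "jdbc:mysql://localhost/ppbm", "user", "pass");
                     PreparedStatement ps = con.prepareStatement(
                         // hypothetical table and column names
                         "UPDATE results SET score1=?, score2=?, score3=?, score4=? WHERE ref_id=?")) {
                    for (int i = 0; i < parts.length; i++) {
                        ps.setInt(i + 1, Integer.parseInt(parts[i].replace("\"", "")));
                    }
                    ps.setString(parts.length + 1, refId);
                    ps.executeUpdate();
                }
            }
        }

    Total Score and RPI can then stay out of the table and be computed in SQL or in the page code, as suggested above.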

    You do have a very complex-looking site and may need several tables in MySQL to handle all that data. If you are new to PHP/MySQL, I would suggest taking a look at this tutorial; it will help you get started in understanding how to $_GET info from a database and also how to $_POST data to a database. I am no expert, just learning myself, and I found it very helpful. This is the link: http://www.adobe.com/devnet/dreamweaver/articles/first_dynamic_site_pt1.html
    There are also many tutorials on YouTube to help build a CMS (content management system); I would suggest the following:
    http://www.youtube.com/user/phpacademy
    http://www.youtube.com/user/betterphp
    http://www.youtube.com/user/flashbuilding
    And many more on my channel here
    http://www.youtube.com/user/Whisperingonthewind
    A CMS makes it easier to maintain, add, edit and delete content.
    I have also recently bought a book by David Powers, Training from the Source; very helpful.
    Anyway hope you get it sorted.

  • How to extract data from a custom-made IDoc that is not sent

    Hi experts,
    Could you please advise whether there is a way to extract data from a custom-made IDoc (it collects a lot of data from different SAP tables)? Please note that this IDoc is not sent, as the target system is not fully maintained.
    For now, we would like to verify what data is extracted.
    Any help, would be appreciated!

    Hi,
    The fields that make up each segment have their lengths given in the EDSAPPL table. How you have to map them is explained in the example below.
    Suppose that for SEGMENT1, EDSAPPL has 3 fields; then the entries are:
    SEGMENT          FIELDNAME           LENGTH
    SEGMENT1         FIELD1                   4
    SEGMENT1         FIELD2                   2
    SEGMENT1         FIELD3                   2
    Data in EDID4 would be as follows
    IDOC           SEGMENT                          APPLICATION DATA
    12345         SEGMENT1                        XYZ R Y
    When you are extracting data from these tables into your internal table, the mapping has to be as follows:
    FIELD1 = APPLICATIONDATA+0(4).    (reads the first 4 characters of the field, because the first 4 characters belong to FIELD1)
    Similarly,
    FIELD2 = APPLICATIONDATA+4(2).
    FIELD3 = APPLICATIONDATA+6(2).
    FIELD1 would contain XYZ, FIELD2 = R, FIELD3 = Y.
    This holds in all cases. So all you need to do is identify which fields you want to extract and code as above to read the data from this table.
    Hope this was helpful in explaining how to derive the data.
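    The same offset arithmetic outside ABAP, as a small Java sketch; the field widths 4, 2, 2 and the sample application data come from the example above:

        public class SegmentSplit {
            public static void main(String[] args) {
                // Fixed-width split of the EDID4 application data, padded to its full width of 8.
                String appData = "XYZ R Y ";
                String field1 = appData.substring(0, 4).trim();   // "XYZ"
                String field2 = appData.substring(4, 6).trim();   // "R"
                String field3 = appData.substring(6, 8).trim();   // "Y"
                System.out.println(field1 + " / " + field2 + " / " + field3);
            }
        }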
