BI data collection question

Dear colleagues,
I have one big question regarding the BI data collection functionality (in the Solution Reporting tab of transaction DSWP).
According to the SAP Tutor located in the learning map, it looks like the SolMan system requires XI to be installed in order to use BI data collection. Is this correct? If so, is there a step-by-step configuration guide that covers everything from installing XI to enabling the BI data collection function?
Thank you guys in advance.
rgds,
Yoshi

Hi,
But we want to maintain Item Master in ITEM02 and Item Branch in ITEM03 as separate InfoObjects, because the descriptions keep changing; that is why I created different master data InfoObjects.
Let me give some sample data:
Record 1: IT001 LOT1 BRAN1 CODE1 CODE2 DS01 DS02 in 2006
Record 2: IT001 LOT1 BRAN1 CODE1 CODE2 DS05 DS04 in 2007
Looking at the data above, in 2006 the item descriptions were DS01 and DS02, and in 2007 they changed to DS05 and DS04 for the same item number.
We want the item descriptions in 2006 and 2007 to stay the same if we change them in 2007. How can that be done?
If I follow your approach, how can we achieve this?
Do we need delta loads?
Also, do I need to create an InfoCube? Can I use the ITEMNO InfoObject as a data target?
Please let me know your thoughts.
Regards,
ALEX

Similar Messages

  • Excel to PDF Data Collection Question

    Greeting all,
    My name is Mark and I work for the Navy in my organization's IT department, specifically the system administration division.  My dilemma is that I have over 600 users, and all of them have to fill out and sign an electronic form prior to accessing our network.  I have an Excel sheet with each person's information (last name, first name, last four, etc.) on file, so the hard part is done.  What I want to do is pull information from the Excel sheet and have it automatically populate the acknowledgement form, so I could e-mail it to each person and have them digitally sign it instead of printing it out and having to scan it back in to retain on file.  Currently, the requestor comes in, fills out each document, and signs the acknowledgement form in ink, and then I have to scan each individual signed form.
    Is there any way to pull data from the excel file and import that into the pre-made pdf acknowledgement form?
    Thanks in advance.
    Mark

    There are a number of ways you could do it, but the most seamless involve some type of programming. Here's a good article that discusses some of the ways: http://acrobatusers.com/articles/getting-external-data-acrobat-x-javascript
    In item #3 the article touches on the importTextData method. If you export the spreadsheet to a tab-delimited text file, you can use the doc.importTextData method to (manually) import a row of data from the file: http://livedocs.adobe.com/acrobat_sdk/9.1/Acrobat9_1_HTMLHelp/JS_API_AcroJS.88.502.html
    This is probably the simplest method that will get you what you want. The first row of the data file has to be the tab-delimited field names.
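To make the expected file shape concrete, here is a small sketch of building that tab-delimited text from spreadsheet rows. The field names used ("LastName", "FirstName", "LastFour") are illustrative assumptions; they must match the field names in your actual PDF form, and the `importTextData` call shown in the comment only exists inside Acrobat, not in a browser or Node.

```javascript
// Build the tab-delimited text that doc.importTextData expects:
// the first line is the field names, each following line is one record.
function toTabDelimited(fieldNames, rows) {
  const header = fieldNames.join("\t");
  const lines = rows.map(row => fieldNames.map(f => row[f] ?? "").join("\t"));
  return [header, ...lines].join("\r\n");
}

// Example: two people exported from the spreadsheet (hypothetical data).
const text = toTabDelimited(
  ["LastName", "FirstName", "LastFour"],
  [
    { LastName: "Smith", FirstName: "Ann", LastFour: "1234" },
    { LastName: "Jones", FirstName: "Bob", LastFour: "5678" },
  ]
);
// Inside Acrobat you would then call, e.g.:
//   this.importTextData("data.txt", 0)
// to fill the form fields from row 0 of that file.
```

Save the result as a plain text file next to the form; each row index you pass to `importTextData` fills the form with one person's record.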
    Another way that can be automated would be to create an FDF file using a macro in Excel and then loading the FDF into the form. If you haven't done this sort of thing before, it will involve a bit of a learning curve.

  • How to perform Data Collection on single SFC with QTY = 1 with material lot size 1?

    Dear experts,
    We are working with SFC qty > 1 on a relaxed routing. At a given operation we want to collect the data on a single quantity; i.e. the SFC qty on the operation where the collection happens will be 1. The corresponding material lot size is, for example, 10. The operator must be able to collect data on the SFC with qty = 1 multiple times until the quantities are consumed. He must also be able to collect other values on the remaining quantities at the same operation, with the same DC group or other DC groups. How many times the data must be collected depends on the shop order build quantity. The data may be collected several times, but not more than the build qty. In other words, some specific data will be collected on part of the product quantity while other data will be collected against the remaining quantity. The data collection must also be done in a serialized manner.
    Here's what we have set up so far:
    1) 3 DC groups, each DC group has 3 data fields.
    2) Each data field has the following restrictions:  Required Data Entries = 0 and Optional Data Entries = 1
    3) All DC groups are attached on the same combination of operation\material\routing
    4) we are using relaxed routing
    Process description:
    The operator must be able to collect any data field on single product. For that he will enter the operation where the data collect are attached, he will enter the SFC with qty=1 then he will run the data collection after selecting the appropriate DC Group and entering the needed information. The operator will complete the SFC with qty=1.
    The operator will pick the next product, select the same SFC and enter qty 1 and collect other value against this product.
    Problem is:
    Once the first collection is done on a given SFC with entered qty=1, the system is not allowing the operator to do further collects on the same SFC with qty=1 or any other quantity. He cannot select any DC group from the DC group list. We tried also with the table selection menu on the DC Group list but nothing can be selected.
    So we tried to play around with the DC group definitions as follows:
    A) we set Required Data Entries = 0 and Optional Data Entries = 10. Still the operator was not able to select any DC group after collecting data the first time. We tried to reopen the pod and list again. But we get the same blocking behavior.
    B) we set Required Data Entries = 10 and Optional Data Entries = 1. The operator was able to select the DC group after collecting data the first time. BUT operator must enter the data fields 10 times on one SFC quantity, which is not what we want. Besides again he cannot collect other information on remaining quantities on the same operation.
    C) There's an option to serialize the SFC before it reaches the operation where the collection happens, then merge after completion. Automation is needed here, hence customization. We are strongly avoiding customization now, since we expect data collection to work well on single quantities even when the main SFC has qty > 1.
    Questions:
    1) Are we missing any kind of further configuration\setup?
    2) Or does the current system design not allow collecting data on single quantities of an SFC whose main quantity is greater than 1?
    3) Looking at this link, Approaches to Collection of Data - SAP Manufacturing Execution (SAP ME) - SAP Library, there's nothing mentioned about the same SFC number with multiple quantities!
    We are using SAP ME 15.0.3.0.
    Thanks in advance for your help
    Ali

    Ali
    To collect data for the same SFC multiple times, your system rule "Allow Multiple Data Collection" needs to be set to true for the site.
    Stuart

  • BCS - Data collection issue

    Hi,
    I'm using BCS 4.0. We are now in final testing and I have a question regarding the data collection process using the load-from-data-stream method. I ran the task in the consolidation monitor for period 007.2007 and for all companies without any errors or warnings, but we have differences in the financial information for that period.
    I reviewed the content of my BCS cube (RSA1) and we don't have any data for those accounts; the only thing I found was that all documents were created on the same date.
    I deleted the request ID in RSA1 in my BCS cube and executed the task again in the consolidation monitor, but the result was the same.
    Looking at the log / source data, the data in the rows listed is not taken from the InfoProvider.
    Any idea what the problem could be?
    Thanks
    Nayeli

    Hi Nayeli,
    I had to do this kind of job (reconciliation of data in the source basis cube and the totals cube) during the final testing a lot of times, with an accountant.
    The only way to get a first clue about what is happening is to compare every particular amount in both cubes, comparing and trying to figure out any dependencies in the data.
    The difference might arise for ANY reason. Only you can analyze the data and decide what to do.
    AFAIU, you compared only reported data and did no currency translation or eliminations?
    Then, I'm afraid you've made a very big mistake deleting the request from the totals cube. You have broken the consistency between totals and document data. For a transactional cube, a request stays yellow until the number of records in it reaches 50,000; then it is closed and becomes green. As you may understand, those 50,000 records may contain data for several posting periods, for example reported data for the previous closed period. The documents, though, still contain eliminations for that period - an inconsistency.

  • Too many BPM data collection jobs on backend system

    Hi all,
    We find about 40,000 data collection jobs running on our ECC6 system, far too many.
    We run about 12 solutions, all linked to the same backend ECC6 system. Most probably this is part of the problem. We plan to scale down to 1 solution rather than the country-based approach.
    But here we are now, and I have these questions.
    1. How can I relate a BPM_DATA_COLLECTION job on ECC6 back to a particular solution? The job log gives me a monitor ID, but I can't relate that back to a solution.
    2. If I deactivate a solution in the solution overview, does that immediately cancel the data collection for that solution ?
    3. In the monitoring schedule on a business process step we sometimes have intervals defined as 5 minutes, sometimes 60. The strange thing is that the drop-down for that field does not always give us the same list of values. Even within one solution, in one step I have the choice of a long list of intervals, while in the next step of the same business process I can only choose between blank and 5 minutes.
    How is this defined ?
    Thanks in advance,
    Rad.

    Hi,
    How did you manage to get rid of this issue? I am facing the same.
    Thanks,
    Manan

  • Is there anyone who knows how to apply machine learning algorithms to spectrum data collected using NI USRP for the ISM band, to use spectrum efficiently via link adaptation?

    Hi,
    May I know if anyone is working in cognitive radio research and applying machine learning techniques to the spectrum data collected from NI USRP kits?

    Can't edit my message anymore, so for the tldr crowd (too long didn't read), here is a shorter version:
    The good:
    E4200 worked fine for about 15 hours with great throughput and link quality (with one brief disconnect in the middle - online gaming software shows brief disconnects that otherwise go totally unnoticed).
    The bad:
    Wireless on the E4200 stopped broadcasting entirely (to a Windows 7 laptop with a 802.11n 2.4 GHZ USB adapter).
    I had switched between two adapters that use the same chipset and the same driver (in fact they show up as a single device in Device Manager) without rebooting the router, but it worked fine for an hour after I swapped the adapters.
    Right before I rebooted it, I checked and the E4200 was not hot to the touch and wired internet was still working after the wireless radio stopped working.
    A reboot of the router cured it.
    The open question as to the root cause:
    Now I want to know whether this wireless radio ceasing to broadcast (requires a reboot of the E4200) will happen daily?
    I.e. I want to know whether it really is overheating (then again, why would only some have that problem, needing to reboot daily, while others have gone weeks with the E4200 without a single reboot being needed and without any problems occurring) or whether it had to do with changing the adapters back and forth without rebooting the router. Keep in mind that the router worked fine for an hour or so after I stopped swapping the adapters.
    Extra question:
    Will getting a Cisco AE1000 USB adapter perhaps help? (I know this is a loaded question and there is no easy / sure answer, but even a "maybe" or reasons why it might help would at this point be better than nothing.)
    Any similar experiences (with the router needing to be rebooted to get wireless radio back on) would be appreciated as it may help myself and others experiencing these types of issues.

  • CHaRM Reporting Data Collection Background Job

    Hi CHaRM Gurus,
    I saw a lot of threads about data collection for CHaRM Reporting using /tmwflow/Config_lock. The question is: do we need to run it every time before running the report, or is there a way to schedule it in the background to collect the data asynchronously? SAP Note 1269192 has been applied in both the Solution Manager and the managed system. If a job needs to be scheduled, which job is it, which ID should run it, and with what authorizations? Because this job needs to collect transport requests and objects, it definitely requires different authorizations.
    Thanks
    RS

    Hi,
    The job is SM:TMWFLOW_CMSSYSCOL, program /TMWFLOW/CMSSYSCOL2. The data collector program requires additional authorizations for the read RFC user you have defined for client '000' of the satellite systems.
    Authorization object: S_TRANSPRT
    ACTVT = 03 and TTYPE = <*> (full authorization)
    Refer to note 881553 for authorizations.
    Thanks & Regards,
    Kriti Bhalla

  • Create a network safe data collection screen

    I am trying to create a data collection screen that will be foolproof should the network fail.
    I have created an irpt page that allows inputs and does calculations on this data.  I will also have an irpt page that will replace our error page and appear when the network fails.
    The question is: can I have the data appended to an XML file or a text file locally on the hard drive?  Can I make it smart enough to ping the network and/or the database, and, if everything is up and running, upload the data, delete the XML or text file, and reload the original irpt page?

    You can also try using AJAX. Make an HTTP call to the backend every few seconds and, if the network is down, give a pop-up. Gmail does the same thing if it fails to send your mail, for example.
    But be forewarned that the moment you close the web page, your data will be lost and there will be no way of getting it back. Alternatively, you can save the data on the server (using AJAX, again) every few seconds and, when the network is down, warn the user with a popup. This way the last saved data will still be available to you.
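The periodic-save idea can be sketched as a small wrapper around the send call; the endpoint and shapes here are hypothetical, not part of any irpt or MII API. The point is simply that the last successfully saved payload survives a network failure, so the user can be warned without losing data.

```javascript
// Minimal sketch of periodic auto-save with a network-down signal.
// `send` is any async function that posts the payload to the server,
// e.g. () => fetch("/save", { method: "POST", body: ... })  (hypothetical URL).
function makeAutoSaver(send) {
  let lastSaved = null;   // last payload that reached the server
  let online = true;      // false once a send fails; caller can show a popup
  return async function save(payload) {
    try {
      await send(payload);
      lastSaved = payload;
      online = true;
    } catch (e) {
      online = false;     // network down: keep lastSaved, warn the user
    }
    return { online, lastSaved };
  };
}
```

In a page you would call the returned function from `setInterval` every few seconds and show the warning popup whenever `online` comes back false.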

  • Firefox users cannot presently exercise choice, to opt-in or out of "data collected for improving services." As a result, Firefox is constantly dialing home, sometimes four times a day or more up to 48 times a day. There is some over-stepping and redunda

    I understand the charter on this is to "check in" once every time the program is turned on, and then once every 6 hours after that, or presumably once every 24 hours. But the phone-home effect oversteps these basic parameters. If a user turns their browser off and then on, Firefox still "checks in" even if it just checked in 3 minutes prior. Even if the coding is not pre-designed to overstep or act redundantly, checking in once every six hours or once every 24 hours is still too much if a user has been doing this (without volunteering to participate) for 9 months or longer.
    == This happened ==
    Every time Firefox opened
    == This started when ==
    Firefox took up the initiative of "improving service to end users" or a similar idea, making the web safer for novice users, etcetera.

    Opening question was truncated. Should read: "Firefox users cannot presently exercise choice, to opt in or out of "data collected for improving services." As a result, Firefox is constantly dialing home, sometimes four times a day or more, up to 48 times a day. There is some over-stepping and redundancy here. It would seem the practice of "improving service" has been accomplished with as much information as Mozilla has gathered in the last 6-18 months about its users' habits. '''Isn't it about time to give users the option to opt out, now that most of the heavy lifting has been accomplished?'''"

  • ASP data collection

    This is a post from ASP data collection, from which I was referred to this forum.  Sorry I did not start here.  I have since found a substandard, overly complex, and resource-consuming solution, but have not had time to drill down and make it work yet.  If I could figure out how to get the method in this question to work, it should be much easier.  Also, I will be excited and very grateful!
    Hello, I am trying to collect a small list with Full Product, SHA1, and File Name from http://msdn.microsoft.com/en-us/subscriptions/downloads/default.aspx (Link) for several different servers.  I am going to make an AIO install disk with dism/wim, and need to download all the different ISOs.  I am planning to do so for Server 2003, 2008, and 2012, so there are a lot of different images I will need to download.  I wanted to do the download with a manager that will check the SHA1s automatically.
    Anyway, I thought this would be a quick and easy project, but I have been struggling with it.  I do not know ASP, but I have been trying to use Invoke-WebRequest and Invoke-RestMethod to retrieve this information.  I have not been able to make much sense of it at all.  I have been working on http://msdn.microsoft.com/en-us/subscriptions/downloads/default.aspx#searchTerm=&ProductFamilyId=137&Languages=en&PageSize=100&PageIndex=0&FileId=0 (Link) and have not been able to retrieve the information yet!  Usually I can get a little something, but I am completely stuck.  I tried to do $var.Forms[0].Fields.Add("ProductFamilyId","137") and then use Invoke-RestMethod -Body $var.Forms[0], but have not been able to retrieve the info.
    Is there a quick way to do this within PowerShell?  Would the HtmlAgilityPack work better?  I am completely flummoxed and tired, so any help will be greatly appreciated.  I am going to go to bed before I fall asleep on my keyboard.  Thanks for any help you can render!
    I added a little more information:
    If you were navigating the page in a web browser, you would need to press the "Details" link for each product to show the "SHA1" and "File Name" information.  I do not have a lot of free time for this, so I was hoping someone would be able to provide a little guidance to get past the posting/requesting hump.  I believe there are two posts/requests: one when the page first loads, which requests the specific search / ProductFamilyId, and two, needing to open all of the "Details" links.
    Thanks for any and all help!
    Sincerely,
    John

    Sorry for the delay, it's been a little busy recently.  The messages kept getting squeezed to the right side, so I quoted instead of replied.
    If I force a JSON call, this is what is returned:
    {
      "FileId": 47977,
      "DownloadProvider": 1,
      "NotAuthorizedReasonId": null,
      "FileName": "en_windows_server_2003_sp2_ia64_cd.iso",
      "Description": "Windows Server 2003 Service Pack 2 (ia64) - CD (English)\r\n",
      "Notes": null,
      "Sha1Hash": "D5829B080FF2401AD259A15C54A7704529FE392D",
      "ProductFamilyId": 138,
      "PostedDate": "\/Date(1318351988783)\/",
      "LanguageCodes": ["en"],
      "Languages": ["English"],
      "Size": "574 MB",
      "IsAuthorization": false,
      "BenefitLevels": ["BizSpark",
        "BizSpark Admin",
        "MSDN OS (Retail)",
        "MSDN OS (VL)",
        "MSDN Platforms",
        "VS Premium with MSDN (MPN)",
        "VS Premium with MSDN (Retail)",
        "VS Premium with MSDN (VL)",
        "VS Pro with MSDN (Retail)",
        "VS Pro with MSDN (VL)",
        "VS Pro with MSDN Premium (Empower)",
        "VS Pro with MSDN Premium (MPN)",
        "VS Test Pro with MSDN (Retail)",
        "VS Test Pro with MSDN (VL)",
        "VS Ultimate with MSDN (MPN)",
        "VS Ultimate with MSDN (NFR FTE)",
        "VS Ultimate with MSDN (Retail)",
        "VS Ultimate with MSDN (VL)"],
      "IsProductKeyRequired": false
    }
    We would need to parse all of the action links to return all of the details.
    ¯\_(ツ)_/¯
    I am uncertain what you mean by "force Json call".  I don't mind parsing the information.  So, you used Webclient->DownloadString($url)->$xml=[xml]$page -> some sort of Json call? 
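Once a response like the one above has been parsed, pulling out the three wanted columns is straightforward. A small sketch: the field names (`FileName`, `Sha1Hash`, `Description`) are taken from the response pasted above, but the assumption that the service returns an array of such objects is mine.

```javascript
// Extract Full Product / SHA1 / File Name from parsed download records.
function extractDownloads(files) {
  return files.map(f => ({
    product: (f.Description || "").trim(),  // trim the trailing \r\n seen above
    sha1: f.Sha1Hash,
    fileName: f.FileName,
  }));
}

// One record copied from the JSON reply above.
const sample = [{
  FileName: "en_windows_server_2003_sp2_ia64_cd.iso",
  Description: "Windows Server 2003 Service Pack 2 (ia64) - CD (English)\r\n",
  Sha1Hash: "D5829B080FF2401AD259A15C54A7704529FE392D",
}];
const rows = extractDownloads(sample);
```

The same mapping could be done in PowerShell with `ConvertFrom-Json` and `Select-Object`; the hard part remains getting the service to return the JSON for each "Details" link in the first place.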

  • ATP Based on Data Collection

    Hi,
    I am working on a fresh implementation project where I want to implement Scheduling Features of OM. (Based upon Collected Data)
    I have already applied Patch related to R12.1.2 and also ran ATP Partition Program. Now, all the table and Partition names are in Pair. I have dropped unwanted partitions as well. Setup Profile Options
    Now, my question is: at what point do I need to run Data Collection? (After loading all the items / loading all the open transactions?)
    How often do I need to run Data Collection?
    Do I need to select all the organizations while setting up the instance?
    Appreciate your timely help.
    There are a few notes available on Metalink related to Data Collection and the related setup, but it would be really great if someone could help me set up and understand this feature.

    Now, my question is: at what point do I need to run Data Collection? (After loading all the items / loading all the open transactions.)
    Run the DC after loading all items, on-hand quantities, open supplies (POs, work orders), etc.
    How often do I need to run Data Collection?
    You should run a full DC every night and schedule a program called "Continue collection" every 20 minutes or so.
    Do I need to select all the organizations while setting up the instance?
    No. Setup only those orgs which will impact ATP.
    Hope this answers your question
    Sandeep Gandhi
    Omkar Technologies Inc.
    Independent Techno-functional Consultant
    513-325-9026

  • [JAVA] How to input data collected in a table

    Hello!
    I'm writing a program that monitors the signal of a sensor in a chart. Now I want to insert the data collected by the sensor into a table using a JTable. The problem is that when I run the program, the data does not appear in the table.
    Where did I go wrong?
    Here's part of the source code.
    ArrayList<Data> datiArray = new ArrayList<Data>();
    DataTableModel dtm = new DataTableModel(datiArray);
    // note: the JTable must be created with this model, e.g. new JTable(dtm),
    // otherwise rows added through dtm will never show up

    public class DataTableModel extends AbstractTableModel {
        private ArrayList<Data> datiArray;    // the table model owns the collection
        private String colName[] = {"Time", "G-value"};

        public DataTableModel(ArrayList<Data> datiArray) {
            this.datiArray = datiArray;
        }

        public void addData(Data d) {
            datiArray.add(d);
            int row = datiArray.size() - 1;
            fireTableRowsInserted(row, row);  // notify the JTable of the new row
        }

        public String getColumnName(int col) {
            return colName[col];
        }

        public int getRowCount() {
            return datiArray.size();
        }

        public int getColumnCount() {
            return 2;
        }

        public boolean isCellEditable(int row, int col) {
            return false;
        }

        public Class getColumnClass(int c) {
            return (c == 0) ? Long.class : Double.class;
        }

        public Object getValueAt(int row, int column) {
            Data d = datiArray.get(row);
            switch (column) {
                case 0: return d.getTime();    // was dati.getTime() - compile error
                case 1: return d.getGvalue();
            }
            return null;
        }
    }

    private class Data {
        public long time;
        public double gvalue;

        public Data(long time, double gvalue) {
            this.time = time;                  // was this.tempo = tempo - field never set
            this.gvalue = gvalue;
        }

        public long getTime() { return time; }
        public double getGvalue() { return gvalue; }
    }

    RecordButtonAction
        private void recordButtonActionPerformed(java.awt.event.ActionEvent evt) {
            int i = 0;
            int j = graphView.getSampleTime();
            int k = graphView.getIndexMax();
            System.out.println(j);
            System.out.println(k);
            while (i < k) {
                Data dr = new Data(graphView.getTime(i), graphView.getGvalue(i));
                dtm.addData(dr);  // add through the model so the table is notified
                // these println calls check whether the values actually reached Data and DataTableModel
                System.out.print(graphView.getTime(i));
                System.out.println("\t" + graphView.getGvalue(i));  // was "/t"
                System.out.println(dr.getTime());
                System.out.println(dtm.getValueAt(i, 1));
                i = i + j;
            }
            readyRecord = false;
        }
    Sorry for my bad English.

    Please don't cross-post the same question in multiple forums. Especially since you've already been given directions as to how to ask the question smartly.

  • WJA - Failed Alerts - Data Collection Inaccuracies

    I have been running Web Jetadmin v10.2.7491B on a 2003 server for about 3 years. There are 120 managed printers consisting of 9 different models.
    Issue #1
    Initially I had alerts configured on all printers that would send an email when a consumable dropped below 3%. This worked very reliably for the first year. Over time, an increasing number of printers stopped sending alerts, until eventually only a handful did. I never found the cause of this problem.
    Issue #2
    Since alerts were broken, I setup a daily data collection for all printers and a morning csv report sent via email. I would then sort the data by remaining consumable percentage and replace those at 1% or below. Recently, 3 models out of 9 have stopped reporting accurate consumable percentages in the daily reports. This problem seems to correspond with a firmware update I pushed out to all printers. The WJA console DOES show accurate information for individual printers when they are selected.
    Steps taken to resolve #2
    Printers have been removed from all Report Data Collection groups and then added back; this did not resolve the issue. Printers were initially added individually to a Data Collection, but I have since implemented a Group Policy that automatically adds a printer to the Data Collection. Six printer models still show accurate percentages, but models 4345, 4730, and 5035 do not.
    Is there a database file I can delete that will force a Data Collection rebuild for all printers? Is it possible the new firmware for the 3 listed models has a bug that prevents the data from being updated?

    This forum is focused on consumer level products.  For this issue you may have better results posting in the Enterprise Print Servers, Network Storage and Web Jetadmin forum here.
    Bob Headrick,  HP Expert
    I am not an employee of HP, I am a volunteer posting here on my own time.

  • Synchronous data collection using pci-6143's

    I can set up synchronous data collection across the analog inputs of my three PCI-6143's using a separate task for each board and explicitly sharing the master timebase from board 1 (the one receiving the trigger to start data collection) to the other 2.  Then I need 3 read channel VI's etc. 
    The DAQ Assistant will configure all the AI channels to work inside one task across the three boards, which is very convenient, but I lose the synchronicity.  Specifically, the device triggering the data collection (board 1), leads the other two boards by a few microseconds.  How can I use a single task for all three boards for analog input (voltages) while retaining completely synchronous data collection?  Thanks!

    Hi Brian_g,
    You should be able to synchronize your S Series MIO cards by including them in the same task this way. You will have to type in the physical channel names, i.e. "Dev1/ai0:7, Dev2/ai0:7, Dev3/ai0:7", and still specify the start trigger off your master device. I would work from the "Cont Acq & Graph Int Clk.vi" example and add in the digital trigger.
    Please post back if this does not resolve your issue or I didn't answer your question.
    Cheers,
    Andrew S.
    National Instruments
    Getting Started with NI-DAQmx
    Measurement Fundamentals

  • Log NC based on data collection

    Is it possible to trigger the logging of an NC based on a data collection value being outside the acceptable range?
    i.e. the acceptable range for the data collection is a number less than 6; if the user enters 7, I would like to log an NC that says the data collection is out of range.

    To summarize:
    What I'm taking away from this is that it is the best practice to have only one parameter per DC group if you intend to trigger the automatic logging of an NC when that group "fails." The one parameter in the DC group MUST have a min/max value assigned and a fail is triggered when the operator enters a value outside of that range.  The NC is logged using the value assigned to the LOGNC_ID_ON_GROUP_FAILURE parameter in activity maintenance.
    If there are multiple parameters in the DC group, they all have to have a min/max value assigned and ALL of the responses have to be out of range in order to fail the SFC.
    I cannot have a DC group that contains parameters of multiple types and expect an NC to be logged based on an incorrect answer (to one question or to multiple).
    I cannot expect an NC to be logged based on an incorrect answer to one question if the rest of the questions in the DC group are answered "correctly."
    Sound correct?
    Edited by: Allison Davidson on Apr 18, 2011 10:06 AM  - typo
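The rule summarized above can be sketched as a tiny predicate; the parameter objects here are illustrative, not the SAP ME data model. It shows why a one-parameter group behaves as desired while a multi-parameter group only fails when every response is out of range.

```javascript
// Sketch of the summarized rule: a DC group "fails" (triggering the NC via
// LOGNC_ID_ON_GROUP_FAILURE) only when EVERY parameter with a min/max range
// receives an out-of-range value.
function groupFails(parameters, values) {
  return parameters.every((p, i) => values[i] < p.min || values[i] > p.max);
}

// One-parameter group, acceptable range 0..6: entering 7 fails the group,
// which is why one parameter per DC group gives the expected NC behavior.
```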
