Data to analyze

I have three text data files with information about users, AD groups, and servers in this format:
groupname,groupID,membercount
user;groupID,groupID,groupID...
server,groupID;groupID;groupID...
I want to be able to:
-click on a group and see all members
-click on a server and see all groups
-click on a user and see all group names
Is it possible to put this into Power Pivot given that the user and server rows are variable length? I'm thinking the data would have to be formatted as
user1;groupID1
user1;groupID2
user1;groupID3
user2;groupID1
etc
and
server1,groupID1
server1,groupID2
server1,groupID3
server2,groupID1
etc.
thoughts?  TIA

Power Pivot doesn't perform those kinds of transformations. Power Query does that before the data hits the Power Pivot model.
See the "Unpivot columns" feature in Power Query.
David http://blogs.msdn.com/b/dbrowne/
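To make the target shape concrete, here is a small sketch (in Python, not Power Query, which handles this natively with unpivot/split steps) of the same reshaping: each variable-length membership row becomes one (entity, groupID) pair per line. The sample rows and delimiters are taken from the formats described in the question; the function name is just for illustration.

```python
def explode_memberships(lines, entity_sep, group_sep):
    """Yield (entity, groupID) pairs from variable-length membership rows."""
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # Split "user;groupID,groupID,..." into the entity and its group list
        entity, _, groups = line.partition(entity_sep)
        for group_id in groups.split(group_sep):
            if group_id:
                yield entity, group_id

# One row per user in the source file...
users = ["user1;groupID1,groupID2,groupID3", "user2;groupID1"]
# ...becomes one row per (user, group) pair, the long format Power Pivot can relate.
print(list(explode_memberships(users, ";", ",")))
```

The server rows work the same way with `","` as the entity separator and `";"` as the group separator.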

Similar Messages

  • Any 802.11u data to analyze?

Hi,
I have a question; can anybody help me?
Does anyone have any 802.11u network data to analyze? I need it for my seminar.
I'm using Wireshark.

I posted an 802.11u capture I received from Omnipeek when I was down there last week. It's on my blog ...
    http://www.my80211.com/home/2012/9/20/wildpackets-gestalt-it-wfd3-wildpackets.html
    "Satisfaction does not come from knowing the solution, it comes from knowing why." - Rosalind Franklin

  • Problems with Edit Data in Analyzer 5.0.3

I'm trying to use the Edit Data function in Analyzer client and Analyzer web. When I type the data value and click Lock/Send, the message below appears in Analyzer client:
"Error 1014004 - Unable to Update Members Which Have Not Been Locked." Error during EndUpdate.
This message appears when I use Analyzer web:
(11138) Write-back of data to the database failed. "Error 1014004 - Unable to Update Members Which Have Not Been Locked."
Can somebody help me?
Adriane

I didn't have the Essbase setting "Use Grid API" turned on for the user I was testing with.
Jim

  • Data Base Analyzing

    Dear Gurus!!!
I'm analyzing the growth of the database, but I have some questions:
1. The field "Lin Delta": is this the record count of the table?
2. Is the Total Size (KB) of the DB = Table + Primary Key + Indexes?
3. Why is the total size of the DB different from the total sum of the tables in the space statistics?
    best regards

    Hi Luis,
Table indexes are indicated separately in DB sizing when you check them in DB02.
The following link also has info on DB space statistics:
http://help.sap.com/saphelp_nw04s/helpdata/en/6b/cbb93aa0c3c172e10000000a114084/frameset.htm
Hope it helps.
    Br,
    Sri
    Award points for helpful answers

  • Analyze data from LabWindows 4.0

I have a small hydro power plant which has been controlled by a program made in LabWindows 4.1 (from 1998-1999). This program has logged a lot of information on production, water levels and so on for 10 years, and I am now interested in analysing the data. Each month a .dat file was produced with the data, which can (only?) be read by the control program to view plots and tables. But I am more interested in extracting the data to analyze with Excel, MATLAB or LabVIEW, i.e. getting the data into nice tables in maybe a simple .txt file. Unfortunately the .dat file contains only gibberish which I can't make any sense of. Do any veterans in here have any ideas on how to proceed to acquire the data in a sensible ASCII format? I'm attaching a file if that helps. I wouldn't expect anyone to decode it, but maybe someone could point me in the right direction.
    Attachments:
    1999-01.txt ‏210 KB

    Hi Super-Chub,
From what you are saying, you don't have the source code for your application, right? Couldn't you find the original developer and either ask him for a description of the data file format or have him write a small application to translate the files?
From the file alone it is difficult to understand which data is saved in it and how; the .dat extension does not identify a specific file type. From a quick glance at the file you attached, I seemed to recognize a 36-byte pattern with 4 embedded spaces that repeats throughout the whole file, but no more than this. Based on your knowledge of the application you may try to guess which type of data is stored in each file and possibly how the data are organized in it.
One could try to decode the file on the assumption it was produced by the ArrayToFile function, but I can't remember whether CVI 4 already had that function and how its output was organized; moreover, the 4-space field seems to conflict with the use of such a function. It could also be the dump of an array of structures... Anyway, I am just throwing out some ideas that may help you in this task.
** Edit: I just saw in the online help that the ArrayToFile function is present in CVI from release 4.0.
    Proud to use LW/CVI from 3.1 on.
    My contributions to the Developer Zone Community
If I have helped you, why not give me a kudos?

  • How can I display data gathered in a subVI in a graph of the main VI?

I have written a largish application (~50 VIs) which acquires, analyzes, displays and saves data from an instrument with a built-in DAQPad. My problem is that my block diagram is rather messy by now. I'm using an event structure in my main VI which reacts to buttons being pressed on the front panel. During data acquisition (one frame of the event structure), I need to do a lot of data processing, and I'm displaying both raw data and analyzed data on the front panel. I'm using a lot of subVIs for this, but I always need to get data out of the subVIs again to display it on the front panel, cluttering my block diagram. It would be much nicer if the subVI could update the main VI's graphs and indicators. I just found two examples with control references which show how a subVI can modify e.g. a 3D graph of the main VI, but I'm unable to use this with normal graphs and charts; I can't find a way to update the actual data in the plots (I can scale the plot or color it blue etc., but I really want to change the data it's displaying, not color it blue). Is there anything I'm missing? Is there example code for this kind of problem?
    best regards
    Martin

I'm assuming that you want to update your graphs and indicators as you are performing your DAQ; otherwise, you can pass out your values when the DAQ completes.
    I have attached a very simple example of using a reference to update your front panel graph.
    Hope this helps.
    Attachments:
    Reference Example(LV7.1).zip ‏17 KB

  • SAP BPC MS 7.5 with Extended Analytic Analyzer and EPM connector

    Hi experts,
    I need your inputs regarding Extended Analytic Analyzer add ins.
I installed the SAP BusinessObjects Extended Analytic Analyzer, hoping to integrate Xcelsius with SAP BPC MS 7.5.
    I am following the HTG to integrate but got lost.
    In EPM connector steps, I cannot find the option from OPERATION TYPE: Retrieve data using Analyzer Report.
    The only options available under operation type are
    EPM Report
    Retrieve Environments
    Retrieve Models
    Retrieve Dimensions
    Retrieve Dimension Members
    Input Data
    Retrieve Business Process Flows
    Retrieve Context
    Retrieve Members Property Values
    RetrieveText From Library
    It doesn't include the option Retrieve data using Analyzer Report.
I'm wondering if there are different versions of the EPM connector. Does my EPM connector differ from the one in the HTG?
Also, in Excel under Extended Analytic Analyzer, the function =GETREPORTDEFINITION() is missing.
    Please help me on this guys..
    Thanks in advance.
    yajepe

This seems a very good opportunity to use FIM.
FIM was designed especially for exchanging data between different SAP products.
FIM will provide an easy way to do the conversion using wizards and will also assure you of data integrity and quality.
This would be the way forward, but more details have to be defined during the implementation.
    I hope this will help you.
    Kind Regards
    Sorin Radulescu

  • Data acquisition & state machine

    Hi all,
I am working on a project that performs data acquisition, analysis, and recording. I have to acquire temperature data from 8-32 channels and pressure data from 7 channels. After analysis, I have 48 outputs. I want to record raw (acquired) data and analysis data at the same time.
    1) Is it possible to record two text files (raw data and analysis data records) at the same time?
I also want to adjust the acquisition sample rate, but I have doubts about utilizing it in a state machine. For now, I have just used the Elapsed Time function to acquire data once every 60 seconds.
2) How could I set the sample rate of acquisition other than by using the Elapsed Time function?
I also have one more doubt about timing. If I needed to acquire a sample once every 60 seconds, but the analysis took more than 60 seconds, what would happen?
3) How can I prevent data loss in this case?
I have not prepared a user interface yet.
4) How can I utilize it? Do I put graphs and indicators in the same VI, or is it better to prepare another VI just for the user interface?
    Thanks in advance
    Egemen
    Attachments:
    Program.llb ‏433 KB

Hi Egemen,
If you need to acquire data and analyze it at the same time, we recommend the producer-consumer architecture to prevent data loss, and yes, you can record two text files at the same time. Please take a look at this architecture:
http://www.ni.com/white-paper/3023/en
Basically, you use the first loop to acquire data and the second one to analyze and record data (using queues):
https://decibel.ni.com/content/docs/DOC-2431
You can also find examples of this architecture in the Developer Zone.
    Regards
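The two-loop pattern recommended above can be sketched outside LabVIEW as well. Here is a minimal producer-consumer example in Python: the producer only enqueues samples, and the consumer analyzes and records them, so slow analysis cannot make the acquisition loop drop data. The doubling step and sample list are placeholders for real DAQ and analysis code.

```python
import queue
import threading

q = queue.Queue()

def producer(samples):
    # Stands in for the acquisition loop: only reads and enqueues.
    for s in samples:
        q.put(s)
    q.put(None)  # sentinel: no more data

def consumer(results):
    # Stands in for the analysis/logging loop: dequeues and processes.
    while True:
        s = q.get()
        if s is None:
            break
        results.append(s * 2)  # placeholder analysis

results = []
t = threading.Thread(target=consumer, args=(results,))
t.start()
producer([1, 2, 3])
t.join()
print(results)
```

Because the queue buffers samples, a consumer iteration that takes longer than the acquisition period delays processing but does not lose data.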

  • 0D/0A pair written into data file when it shouldn't be

    Hello,
I have a data file problem. My customer's data is written as a 398-byte row in a .csv file. It's OK that the row ends with a CR/LF, but the data is actually modified by the String To Array or Write To Spreadsheet function. In the example below you can see the data being written and the data as read back by HxD (a hex editor/viewer). They are no longer equal.
When a 0D is encountered in the data, a 0D/0A pair is written; when a 0A is encountered, a 0D/0A pair is also written.
    Any suggestions would be much appreciated.
    Thanks in advance.
    Barry
    Attachments:
    Write test.vi ‏11 KB

I have had to switch to Write To Binary File, with an Open/Create File and Set File Position (end) ahead of it. This does work and is readable in an HxD window. Hopefully it will be just as useful when the data is analyzed in MATLAB or some other program.
It does, however, add a header to the data, which is not needed but is acceptable to the customer. I find it a bit useful, as I can Ctrl-F through the data using the header info. In my case the header is constant.
    Thanks.
    barry
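The 0D/0A behavior above is the classic text-mode newline translation, and switching to a binary write avoids it in any language, not just LabVIEW. A small Python sketch of the same fix (the payload bytes are made up for illustration):

```python
import os
import tempfile

# Data that happens to contain a bare CR (0x0D) and a bare LF (0x0A).
payload = b"\x01\x0d\x02\x0a\x03"

path = os.path.join(tempfile.mkdtemp(), "row.bin")

# "wb" opens the file in binary mode: no newline translation is applied,
# which is the equivalent of using Write To Binary File in LabVIEW.
with open(path, "wb") as f:
    f.write(payload)

# Reading back in binary mode shows the bytes survive unchanged.
with open(path, "rb") as f:
    assert f.read() == payload
```

Had the bytes gone through a text-mode layer on Windows, each 0x0A would have come back as the 0x0D 0x0A pair the original poster observed.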

  • IP - Issues with Input Query: Works on Bex analyzer but not on Web

    I am doing the following:
Created an aggregation level for a MultiProvider which has only real-time InfoProviders associated with it
Created the query using BEx Query Designer, enabling all the options for planning
When I execute this query it runs in a web template but does not open up the key figure cells for input. I also tried using Web Application Designer with a Save button, but still the same issue.
I tried the same in BEx Analyzer and it works fine.
Please help to resolve the above issue.
    Thanks in advance.

    Hi Ram,
Cells should be input-enabled on the web when just running the query. There is no need to create a web template, although you do need a web template to be able to use the "save data" function.
Maybe your query is not input-ready at all, and you're misinterpreting the Analyzer layout.
To test this, please enter some plan data in Analyzer, right-click and choose "save".
Then check whether the data is written to the real-time InfoProvider.
    Hope this helps you.
    Regards,
    Miguel P.

  • Logistic Data source and loading sequence

    Hi Gurus ,
Can anyone explain the use of 2LIS_02_CGR and 2LIS_02_SCN?
I also need to know the data loading sequence for the following DataSources:
2LIS_02_HDR,
2LIS_02_ITM,
2LIS_02_SCL,
2LIS_02_SGR,
2LIS_02_CGR,
2LIS_02_SCN
    Thanks for your help

    Hi,
i. Purchasing Data (Header Level)
Technical name: 2LIS_02_HDR
Type of DataSource: Transaction Data (Movement Data)
Application Component: Materials Management (MM)
Available as of OLTP Release: SAP R/3 4.0B
Available as of Plug-In Release: PI 2000.1
RemoteCube Compatibility: No
Prerequisites: Activation of the DataSource.
Use: The DataSource is used to extract the basic data for analyses of purchasing documents consistently to a BW system.
ii. Purchasing Data (Item Level)
Technical name: 2LIS_02_ITM
Type of DataSource: Transaction Data (Movement Data)
Application Component: Materials Management (MM)
Available as of OLTP Release: SAP R/3 4.0B
Available as of Plug-In Release: PI 2000.1
RemoteCube Compatibility: No
Prerequisites: Activation of the DataSource.
Use: The DataSource is used to extract the basic data for analyses of purchasing documents consistently to a BW system.
Delta Update: A delta update is supported. Delta process: ABR (complete delta update with deletion indicator using the delta queue; cube-compatible).
iii. Purchasing Data (Schedule Line Level)
Technical name: 2LIS_02_SCL
Type of DataSource: Transaction data
Application Component: Materials Management (MM)
Available as of OLTP Release: SAP R/3 4.0B
Available as of Plug-In Release: PI 2000.1
RemoteCube Compatibility: No
Prerequisites: Activation of the DataSource.
Use: This DataSource extracts consistent basic data for analyzing purchasing documents to a BW system.
Delta Update: A delta update is supported. Delta process: ABR (complete delta with deletion indicators using the delta queue; cube-compatible).
iv. Allocation - Schedule Line with Goods Receipt
Technical name: 2LIS_02_SGR
Type of DataSource: Transaction Data (Movement Data)
Application Component: Materials Management (MM)
Available as of OLTP Release: SAP R/3 4.0B
Available as of Plug-In Release: PI 2002.1
RemoteCube Compatibility: No
Prerequisites: Activation of the DataSource.
Use: This DataSource is used to extract the schedule line quantities allocated with goods receipt quantities consistently to a BW system.
Delta Update: A delta update is supported. Delta process: ABR (complete delta update with deletion indicator using the delta queue; cube-compatible).
v. Allocation - Confirmation with Goods Receipt
Technical name: 2LIS_02_CGR
Type of DataSource: Transaction Data (Movement Data)
Application Component: Materials Management (MM)
Available as of OLTP Release: SAP R/3 4.5B
Available as of Plug-In Release: PI 2002.1
RemoteCube Compatibility: No
Prerequisites: Activation of the DataSource.
Use: This DataSource is used to extract the confirmation quantities allocated with goods receipt quantities consistently to a BW system.
Delta Update: A delta update is supported. Delta process: ABR (complete delta update with deletion indicator using the delta queue; cube-compatible).
vi. Allocation - Confirmation with Schedule Line
Technical name: 2LIS_02_SCN
Type of DataSource: Transaction Data (Movement Data)
Application Component: Materials Management (MM)
Available as of OLTP Release: SAP R/3 4.5B
Available as of Plug-In Release: PI 2002.1
RemoteCube Compatibility: No
Prerequisites: Activation of the DataSource.
Use: The DataSource is used to extract the schedule line quantities allocated with confirmation quantities consistently to a BW system.
Delta Update: A delta update is supported. Delta process: ABR (complete delta update with deletion indicator using the delta queue; cube-compatible).
    If it helps assign points.
    Thanks,
    Akshay a.

  • Scheduled date not matching std dates defined

    Dear all,
We are facing a scenario where, when we create a process order, the system automatically assigns dates for the operation to be completed (Earliest & Latest), while in the recipe we have defined a standard time for that particular phase. My query is: how does the system calculate those dates, and where can I change or check this so that the system gives dates consistent with the standard time provided for that operation?
Because of this, we are not able to plan our capacity correctly, as the system shows unexpected or incorrect timings for the operations.
Please refer to the following screenshots for reference.
    1) Operation screen for process order
Now for phase 0011, please check the phase details.
In the formula for the resource we have kept the OH time as the operation time.
So in my view, the system should show:
1) For phase 0011, latest start time 07:00:00 & latest finish time 15:00:00, i.e. 8 hrs, as defined in the recipe. As of now it shows 07:00:00 to 16:37:27, i.e. approx. 9 hrs 37 min.
    Please advise,
    Thanks in advance.
    Regards,
    Deep Dave.

    Dear KK,
    Please find below SS for your reference.
    1) Operation Screen
    2) operation timings
    3) Capacity header for resource used.
I guess this will be sufficient data to analyze. Please share your thoughts on the same.
    Regards,
    Deep D.

  • Jpcap and capturing data in text format

    Hey everybody!
I want to use jpcap to capture all data sent from the local computer which contains words like 'sex', 'porn', etc.
1. Is it possible to achieve this using jpcap?
2. If the answer is yes, then please give me a small hint. I can't see any method in the jpcap API to get String data from captured packets...
    Best Regards,
    Peter.

If your need is to capture outgoing (or maybe incoming) data, then consider using Wireshark. It is available on Windows and Linux (maybe Solaris...).
If you want to create something like a network data capture/analyzer using jpcap, then take a look at this:
http://netresearch.ics.uci.edu/kfujii/jpcap/doc/
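On the "get String data" part of the question: capture libraries like jpcap hand you raw payload bytes, not strings, so the usual approach is to decode the bytes with a lossy codec and search the result. A sketch of that matching step (in Python rather than jpcap/Java; the payload and keyword list are made-up examples):

```python
KEYWORDS = ("sex", "porn")

def matches_keywords(payload: bytes) -> bool:
    # Decode the raw packet payload, dropping any non-ASCII bytes,
    # then do a case-insensitive substring search for each keyword.
    text = payload.decode("ascii", errors="ignore").lower()
    return any(word in text for word in KEYWORDS)

print(matches_keywords(b"GET /porn/index.html HTTP/1.1"))
```

In jpcap the same idea applies: take the packet's data bytes, build a String from them, and search it.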

  • Gait data: step count and average waveform

    I am currently evaluating DIAdem to determine if it will be usable for my task. I have human gait force/moment data to analyze. In particular, I am analyzing force data similar to force plate data. Data is anticipated to be collected over periods of days so they will be relatively large files. First, I need to segment the data according to activity (inactive, standing, walking, localized movement). I have figured out how to do this manually using flags and visual inspection. If I go forward with the software, I will want to automate this process but my current questions are regarding the analysis of the segmented data.
(1)    For the walking and localized movement segments, I want to know the number (count) of gait steps. One complete step is illustrated in the pdf between the two black solid lines. The gait movement typically has a double-peak appearance, but that is not always the case; sometimes it resembles a single peak. The peak search does not necessarily give me what I want.
    (2)    For the walking segment, I want an average gait step waveform. Take all the individual steps and display an average waveform (one complete step).
(3)    For each activity segment, I want to know the total duration of the activity. The sampling rate is not necessarily constant and might fluctuate or change. Since the segmented time data references the relative time location, I haven't been able to figure out how to represent the total time, especially since the sample rate changes.
    I have attached a sample of the data which includes the complete force data and manually segmented activity data. I am not experienced with VBscript writing. Suggestions/assistance with any of these tasks will be helpful or indication if any of these tasks are not feasible with DIAdem.
    Thanks
    Attachments:
    Gait Waveform.pdf ‏39 KB
    Trial Data.zip ‏1189 KB

    Hi MJG3,
    What you are trying to do is definitely possible with DIAdem and while it is a lot easier than it would be in other languages, it isn't necessarily simple - especially if you don't have much experience with VBscripting.  I took a look at the data that you attached and you will definitely need to create an algorithm to filter out each different section of your data to create separate graphs.  In order to actually create these graphs, it is probably easier to record a macro in the Scripting tab that creates a subset of data and then adds it to your report (or wherever you need it) than to write it from scratch.  Then, from there, I would use the algorithm you develop to determine where to put each data point.
As far as the actual algorithm goes, I have a general idea of how you might want to do this. You could use 'filter' (if/else if) statements that look at the following parameters to determine whether the subject is walking, standing, localized, or inactive.
    If you are walking, it looks like you could use the frequency and amplitude of your signal to determine if you are walking (If Frequency > limit && Amp > limit2)
    Next, I would check to see if you are standing by determining if the change in amplitude over a certain period of time is less than a certain amount but the amplitude is greater than a set limit. I might suggest using an offset of your localized data followed by an integral to get the area under the curve to determine the change.  
    To determine if you are 'inactive', I would use the same parameters as standing but you would want to look below the set limit.
    Finally, any data that is not accounted for is 'localized' which appears to be more random than the other three data sets. 
    Of course, to do all of this you will need to take subsets of your data and scan each section x points at a time.  The number of points to look at is going to vary based on the rate that you are acquiring data, how fast you expect your subjects to walk, etc.  I would use something similar to this example in order to count the number of peaks from walking which would tell you the number of steps.
    I wish you the best of luck on your project!
    Regards, 
    Trey C.
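The step-counting idea above (count peaks, but don't count a double-peaked step twice) can be sketched independently of DIAdem/VBScript. Here is a hypothetical Python version: a sample is a step peak if it rises above an amplitude threshold and is a local maximum, and peaks closer together than a minimum spacing are merged into one step. The threshold, spacing, and sample signal are illustrative assumptions; real force-plate data would need tuning.

```python
def count_steps(signal, threshold, min_gap):
    """Count peaks above `threshold`, merging peaks within `min_gap` samples."""
    steps = 0
    last_peak = -min_gap  # allows a peak at the very start of the signal
    for i in range(1, len(signal) - 1):
        is_peak = (signal[i] > threshold
                   and signal[i] >= signal[i - 1]
                   and signal[i] > signal[i + 1])
        if is_peak and i - last_peak >= min_gap:
            steps += 1
            last_peak = i
    return steps

# Two steps, each with a double peak closer together than min_gap,
# so each double peak is counted as a single step.
force = [0, 5, 3, 5, 0, 0, 0, 5, 3, 5, 0]
print(count_steps(force, threshold=2, min_gap=4))
```

The averaging task (2) would then follow naturally: once each step's start and end indices are known, resample each step to a common length and average sample-wise.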

  • Essbase log analyzer

Back in the Hyperion days, there used to be a set of objects you could download that consisted of some rules, a calc script, and a database shell. It was used to import a database log; it would build some dimensions and load the data to analyze the logs. Does anybody have any idea where I could find that?

    You might want to take a look at this: http://www.network54.com/Forum/58296/thread/1196629882/EssbaseRightLog
    Regards,
    Cameron Lackpour

Maybe you are looking for

  • I can't sync my iPod to entire library

    My husband had the first iPod.  We shared an email address.  When he died, I gave his iPod away.  I had been able to sync mine early on, but now I can't add new content or sync the content of the main library.  I'm still using the old email account. 

  • My first PKGBUILD, sendmail.

    Alright, well I've been reading the wiki, and some of the other posts in the forum here, but I'm not entirely sure what I'm doing wrong. So I figured I'd ask here and see what mistakes I've been making. Here is the PKGBUILD file I made: #Contributor:

  • How to convert nikon files to jpg

    how to convert nikon files to jpg

  • SAP IDM 7.2 SP09 initial load jobs error

    Hello IDM experts We are getting below error while loaidng initial jobs in IDM 7.2 SP09 java.lang.Throwable: java.lang.ExceptionInInitializerError: JCO.classInitialize(): Could not load middleware layer 'com.sap.mw.jco.rfc.MiddlewareRFC'JCO.nativeIni

  • Storage Bin where the material is stored

    Hello, I just uses Inventory Management and not the Warehouse Management. In Inventory Management you define where the material is stored, in the material master data.    However, I need to know if is possible decided at the time of good receipt, whe