Suggestions on creating a data visualization solution
Hello All,
A website provides its visitors with data (statistics, actually) in the form of .xls documents. For example, one spreadsheet covering a social welfare scheme has one row per state/region of the country and columns such as total money allocated by the federal government, money sanctioned by a regional government, money allocated to a city authority, and money that actually reached the end user. In some cases the columns run to many, and there are several different kinds of such data.
After I came across this data I realised that the hard work of the people who gathered it gets ignored because of the unfriendly format. The end user is expected to crunch the numbers, and to do so while labouring through the spreadsheet cells. I am thinking of some way to make this data more presentable. A developer friend suggested I check whether Flex could be used, and that's how I arrived here.
I myself work on Linux-based systems and have also coded in Perl. I can't pay for the licensed version of Flex with the IDE, so I will go with the SDK.
Could you tell me whether Flex would be the right choice for this kind of problem? If so, I will dig further into Flex and ActionScript.
Suggestions and comments are more than welcome.
-Pranab
This is really an infographic design problem rather than purely a technology one. Flex could help you, or you could use Flash, but you need to look at good infographics with a drill-down capability. It's not straightforward; there is no magic-bullet capability in Flex that makes this data easily accessible.
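Whatever the presentation layer ends up being (Flex, Flash, or plain HTML), the first step is reducing the spreadsheet columns to the comparisons a reader actually cares about. A minimal sketch of that reduction, in Python purely for illustration (the region names and figures are made up):

```python
# For each region, compute what share of the federal allocation actually
# reached the end user -- the kind of derived figure a drill-down
# infographic would surface first. All data here is invented.

rows = [
    # (region, federal_allocation, reached_end_user)
    ("Region A", 1000.0, 640.0),
    ("Region B", 800.0, 720.0),
    ("Region C", 1200.0, 300.0),
]

def utilisation(rows):
    """Return {region: percent of allocated money that reached end users}."""
    return {region: round(100.0 * reached / allocated, 1)
            for region, allocated, reached in rows}

shares = utilisation(rows)
for region, pct in sorted(shares.items(), key=lambda kv: kv[1]):
    print(f"{region}: {pct}%")
```

Once the data is in this derived form, any charting front end can render it; the hard part is deciding which derived figures matter, which is the infographic design problem.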
Similar Messages
-
BIA Test: Suggestion for creating enormous test data for InfoCubes
Hi,
I need to create an enormous amount of data in an InfoCube in order to test BIA.
I don't think my Basis team will allow me to load data from production into the sandbox, and in any case production has fewer than 250 million records.
Any suggestions for generating 250 million records?
Hello,
Just an idea: if you can load a minimum of data into your cube, then it is easy.
You just generate a datasource from your cube and then create some update rules that loop over your data:
a. Generate the export datasource.
b. Map it to your InfoSource.
c. Define update rules, for example in the start routine, to ensure your data will be different.
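The looping idea in step (c) — reuse a small seed set and perturb each copy so the rows differ — can be sketched outside ABAP. A hypothetical Python illustration (the field names are invented; the real start routine would work on the cube's own characteristics):

```python
import itertools
import random

# Made-up seed rows standing in for the "minimum of data" loaded into the cube.
seed_rows = [
    {"region": "R1", "amount": 100},
    {"region": "R2", "amount": 250},
]

def synthesize(seed_rows, target_count, seed=42):
    """Yield target_count rows by cycling over the seed rows, giving each
    copy a unique doc_id and a perturbed amount so no two rows are equal
    (the role the start routine plays in the update rules)."""
    rng = random.Random(seed)
    for i, base in zip(range(target_count), itertools.cycle(seed_rows)):
        row = dict(base)
        row["doc_id"] = i
        row["amount"] = base["amount"] + rng.randint(0, 9)
        yield row
```

Because the rows are yielded lazily, the same approach scales toward the 250 million records in question without holding them all in memory.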
Hope it helps.
Patrice -
I'm seeking suggestions to create a data message playback system
I am working with a data message table and want to play back the data with timed, fast-forward, rewind, or pause functions.
Details:
Table contains different messages like rows in a spreadsheet.
Each message row is sent one at a time and is comprised of binary hex data that must be decoded and displayed on the screen.
I'd like to use event structures to step through the data and view the decoded messages.
Does anyone have example VIs that may solve my problem or provide direction?
Hi jdam,
Thank you for posting in our forums. LabVIEW is a very powerful and versatile tool that should be able to accomplish some of your tasks. First, I just want to clarify some things about your issue. With this data message playback, is there a specific file format that you were looking to use, or were you just looking to display the results back on the screen? If your data is stored in a spreadsheet file, then I'm assuming you have some sort of two-dimensional array of data. You can use some of the array functions, such as Index Array, to pull out each row on each iteration of a while loop and act on that data. This action can include performing a conversion from the binary hex data into whatever specific format you are trying to use. We have functions like the Type Cast function and other format conversion functions in the String/Conversion palette of VIs that can help accomplish this task.
You should be able to use a simple user interface that has a button for each playback action. Each of these buttons can be tied to a specific event in an event structure. Using the appropriate converted data format, you can have each of these events perform whatever actions or display with your data depending on the button that is pressed. Was this the type of functionality that you were looking for? If you wouldn't mind elaborating a little bit more on your steps, I would be able to give you more detail to push you along a more specific path. I hope that I'm understanding you correctly. Take care and have a great day!
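To make the suggestion above concrete — index one row per step, decode the binary hex, and let each button event move the playback position — here is a rough sketch of the state machine. It is Python rather than LabVIEW, purely to show the logic; the message contents are invented:

```python
class MessagePlayer:
    """Step through hex-encoded message rows the way FFw/Rev/Pause buttons would."""

    def __init__(self, rows):
        self.rows = rows      # one hex string per "spreadsheet row"
        self.pos = 0
        self.paused = False

    def decode(self):
        # bytes.fromhex does the binary-hex decode; a real system would plug
        # in a format-specific parser instead of this ASCII fallback.
        return bytes.fromhex(self.rows[self.pos]).decode("ascii", "replace")

    def step(self, direction=1):
        # direction +1 = fast-forward one message, -1 = rewind one message;
        # a paused player ignores stepping, mirroring a Pause button event.
        if not self.paused:
            self.pos = max(0, min(len(self.rows) - 1, self.pos + direction))
        return self.decode()

player = MessagePlayer(["48656c6c6f", "576f726c64"])  # invented sample rows
```

Each event case in the LabVIEW event structure would correspond to one call to `step` (or to toggling `paused`), with the decoded string wired to the display.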
Regards,
Matt G.
Applications Engineer
National Instruments -
Data Visualization Education Path
Hi,
I’m interested in learning how to create interactive data visualizations, such as maps and charts that will eventually be data driven. Here is an example: http://www.boston.com/news/local/massachusetts/graphics/08_22_10_property_tax/
My background is in UX design (mainly Photoshop/Illustrator to xHTML and CSS) and I have had little to no Flash animation or object-oriented programming experience. So that leaves me trying to figure out what software, languages, or other technologies I need to learn to achieve this, and my head is spinning.
I’ve started the process by learning the basics of Flash CS5.5 animation from Lynda.com. Next I plan on learning ActionScript 3.0. However, this is where my questions start to arise:
1. For data visualization, is working in Flash Builder better or more efficient than working in Flash Professional?
2. If I wanted to do a map with shaded areas like the one in the example, can this be done in Flash Builder alone?
3. Are there any data visualization tutorials using Adobe products out there?
I understand that this is a pretty big undertaking and my questions may seem pretty fundamental, but at this point I’m just trying to figure out in what direction I need to go. Any insight will be appreciated.
Thanks!
Hi,
It seems you have no suitable libraries in your ViewController project. See in this image which libraries you need.
Kuba -
Posting Document not created (export data missing)
We are working on ECC 6.0, and during creation of cancellation billing document type IVS for intercompany billing, the document did not create an accounting document, giving posting status G - Posting document not created (export data missing). However, we checked the foreign trade data and the system states that foreign trade is complete.
Can anyone suggest a solution for this?
Thanks.
Hello Mohammad,
About this issue: the incompleteness of this cancellation document might arise because the reference delivery has been archived or deleted.
This is likely due to your current copy control settings (VTFL).
If you have set ' ' in the export determination field, this means that there is a common 'exnum' key between the delivery and billing documents: LIKP-EXNUM and VBRK-EXNUM are the same.
The result of deleting the delivery is that you also delete the FT data for the delivery. Because the billing document and delivery share the same foreign trade table entries, you have in effect deleted the FT data of the billing document too. This would most likely go unnoticed, but if you then cancel the invoice, the system has no FT data to copy into the new cancellation billing document, which leads to this issue.
First, you should set the export determination flag in the copy control of the delivery and billing documents; this will avoid the issue in the future. Set it to 'A' or 'B' depending on your requirements.
For the incomplete cancellation billing document, you could use a user exit to set it as complete, via transaction CMOD:
a. Create a project (for example an FT project), choose Create, and enter a short text.
b. Choose Enhancement assignment and save (as a local object, for example).
c. Use the F4 help for Enhancement and look for V50EPROP.
d. Choose Components: here you will find EXIT_SAPLV50E_005 and EXIT_SAPLV50E_006.
e. In the user exits, double-click the includes ZXV50U05 and ZXV50U06.
f. The coding in the user exits should be: C_COMPLETE = 'X'.
g. Afterwards activate, generate, and save.
The development class is VEI.
Please also check note 118573 which explains the FT user exits.
Regards,
Alex -
Upcoming Lumira Webinar June 11th on the topic of "Big Data Visualization and Custom Extensions"
The next webinar in the Lumira series is coming up this week on Wednesday, June 11th, 10:00 AM - 11:00 AM Pacific Standard Time. The day is almost here and if you've already registered, thank you!
If not, please click here to register.
Speaker Profile:
The topic is presented by Jay Thoden van Velzen, Program Director, Global HANA Services/Big Data Services Center of Excellence at SAP!
Jay has been working in Analytics/Business Intelligence since it was called Decision Support Systems in the late 90s. Currently he is focused on Big Data solutions and how to make the various components of such a solution run smoothly integrated together using the SAP HANA Platform.
Abstract:
Big Data analysis poses unique and new challenges to data visualization, compared to more traditional analytics. Such analysis often includes frequency counts, analysis of relationships in a network, and elements of statistical and predictive modeling. In many cases, traditional visualization techniques of bar and column charts, pie charts, and line graphs are not the most appropriate. We have to avoid the “beautiful hairball” and make it easy for end users to absorb the information through clever use of filtering, transparency, and interactivity. We will likely also need to provide more context to go with the visualization than we have been used to in traditional analytics. Moreover, in the case of forecasts, you need to include confidence intervals in order not to mislead.
This means we need more chart types, and often the chart types you need may not exist, nor could the need for such chart types necessarily be foreseen. However, Lumira allows us to design and code our own D3.js visualizations and integrate them into Lumira, while providing all the data access methods – including SAP HANA – that it provides out of the box. This means we can develop our visualizations to share the outcomes of Big Data analysis exactly as we feel it should be presented. During the webinar we will show a number of examples, and specifically the integration of forecasts coming from R into Lumira through a Lumira custom extension.
We really hope to see you there!
Cheers!
Customer Experience Group -
Congrats to João and Alex!
Microsoft Azure Technical Guru - May 2014
João Sousa
Microsoft Azure - Remote Debugging How To?
GO: "Clever. Well Explained and written. Thanks! You absolutely deserve the GOLD medal."
Ed Price: "Fantastic topic and great use of images!"
Alex Mang
The Move to the New Azure SQL Database Tiers
Ed Price: "Great depth and descriptions! Very timely topic! Lots of collaboration on this article from community members!"
GO: "great article but images are missing"
Alex Mang
Separating Insights Data In Visual Studio Online
Application Insights For Production And Staging Cloud Services
Ed Price: "Good descriptions and clarity!"
GO: "great article but images are missing"
Ed Price, Power BI & SQL Server Customer Program Manager (Blog,
Small Basic,
Wiki Ninjas,
Wiki)
Answer an interesting question?
Create a wiki article about it! -
Design pattern / data loading solution
Hello all!
I have been working on a few projects which involve loading data, sometimes remotely, sometimes local, sometimes JSON, sometimes XML. The problem I am having is that due to the speed of development and the changing minds of various clients, I am finding my designs are too rigid, and I would like them to be more flexible. I have been trying to think of a reusable solution to data loading, and would like some advice, as I imagine many of you out there have had the same problem.
What I would like to do is create a generic LoadingOperation abstract class, which has member variables of type Parser and Loader, which have parse() and loadData() methods respectively. Parser and Loader are interfaces, and classes that implement them could be XMLParser and JSONParser, LocalLoader and RemoteLoader, etc. With something like this I would like to have a new class extending LoadingOperation for each thing to be loaded, whether that's a local XML file, remote JSON, or whatever.
The problem is that a specific Parser implementation cannot return custom data types without breaking the polymorphic behavior of the LoadingOperation class. I have been messing around with generics and declaring subclasses of LoadingOperation like
class SpecificLoader extends LoadingOperation<CustomDataType>
and doing similar things with Parser classes, but this seems a bit weird.
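For what it's worth, the generic shape being described can be sketched like this (Python's typing is used purely for illustration; names such as StringLoader and CsvNumbersParser are hypothetical stand-ins for the XML/JSON and local/remote variants):

```python
from typing import Generic, List, Protocol, TypeVar

T = TypeVar("T")

class Parser(Protocol[T]):
    def parse(self, raw: str) -> T: ...

class Loader(Protocol):
    def load_data(self) -> str: ...

class LoadingOperation(Generic[T]):
    """Ties a Loader to a Parser[T]; the operation's result type follows T."""
    def __init__(self, loader: Loader, parser: Parser[T]) -> None:
        self.loader = loader
        self.parser = parser

    def run(self) -> T:
        return self.parser.parse(self.loader.load_data())

class StringLoader:
    """Hypothetical stand-in for a LocalLoader/RemoteLoader."""
    def __init__(self, payload: str) -> None:
        self.payload = payload
    def load_data(self) -> str:
        return self.payload

class CsvNumbersParser:
    """Hypothetical stand-in for a parser returning a custom data type."""
    def parse(self, raw: str) -> List[int]:
        return [int(x) for x in raw.split(",")]

op = LoadingOperation(StringLoader("1,2,3"), CsvNumbersParser())
```

The point is that each concrete Parser fixes T, so LoadingOperation stays polymorphic without casts; the same shape carries over to Java's `LoadingOperation<CustomDataType>`.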
Does anyone have any suggestions on what I'm doing wrong / could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code, etc.
thanks for any help!
PS: I have also asked this question here: http://stackoverflow.com/questions/4329087/design-pattern-data-loading-solution
rackham wrote:
Hello all!
I have been working on a few projects which involve loading data, sometimes remotely, sometimes local, sometimes JSON, sometimes XML. The problem I am having is that due to the speed of development and the changing minds of various clients, I am finding my designs are too rigid, and I would like them to be more flexible. I have been trying to think of a reusable solution to data loading, and would like some advice, as I imagine many of you out there have had the same problem.
What I would like to do is create a generic LoadingOperation abstract class, which has member variables of type Parser and Loader, which have parse() and loadData() methods respectively. Parser and Loader are interfaces, and classes that implement them could be XMLParser and JSONParser, LocalLoader and RemoteLoader, etc. With something like this I would like to have a new class extending LoadingOperation for each thing to be loaded, whether that's a local XML file, remote JSON, or whatever.
The problem is that a specific Parser implementation cannot return custom data types without breaking the polymorphic behavior of the LoadingOperation class. I have been messing around with generics and declaring subclasses of LoadingOperation like
class SpecificLoader extends LoadingOperation<CustomDataType>
and doing similar things with Parser classes, but this seems a bit weird.
Does anyone have any suggestions on what I'm doing wrong / could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code, etc.
That depends on the specifics.
The fact that it seems like processes are similar doesn't mean that they are in fact the same. My code editor and Word both seem to be basically the same but I am rather sure that generalizing between the two would be a big mistake.
And I speak from experience (parsing customer data and attempting to generalize the process.)
The problem with attempting to generalize is that you may generalize functionality that is not in fact the same. Then you end up with conditional logic all over the place to deal with differences dependent on the users. Rather than saving time, that actually costs time, because the code becomes more fragile.
That doesn't mean it isn't possible, just that you should ensure it is in fact common behavior before implementing anything. -
Create dynamic data type in structure
Hi Experts,
I am new to ABAP.
In my scenario the data type of a field varies. For that I need to create a dynamic data type in a structure; I am using this structure for the internal table for OVS search input.
Please suggest the solution for this.
Advance thanks,
Regards,
BBC
Thanks for your quick reply,
I used your logic like this.
data:
ls_component type abap_componentdescr,
lt_component type abap_component_tab.
*... (1) define structure components :
clear ls_component.
ls_component-name = 'NVALUE'.
ls_component-type ?= cl_abap_typedescr=>describe_by_name( <fs_seg_v>-fieldname ).
insert ls_component into table lt_component.
*... (2) create structure
data lr_strucdescr type ref to cl_abap_structdescr.
data lr_data_struc type ref to data.
lr_strucdescr = cl_abap_structdescr=>create( lt_component ).
create data lr_data_struc type handle lr_strucdescr.
field-symbols <fs> TYPE any.
assign lr_data_struc->* to <fs>.
Your logic is working fine.
Here I am getting the field name (<fs_seg_v>-fieldname) from an internal table.
But I need to assign the same field-name structure to the query parameter.
FIELD-SYMBOLS: <ls_query_params> TYPE lty_stru_input.
Can you please suggest how I can refer to the field-name structure?
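As an aside, the RTTS pattern above — building a record type at runtime from a field name held in a variable — has a close analogue in other languages. A hypothetical Python sketch, for comparison only:

```python
from collections import namedtuple

def make_row_type(fieldname):
    """Build a one-field record type at runtime from a field name held in a
    variable, mirroring cl_abap_structdescr=>create( lt_component )."""
    return namedtuple("Row", [fieldname])

# The field name arrives at runtime, as <fs_seg_v>-fieldname does above.
Row = make_row_type("NVALUE")
row = Row(NVALUE=42)
```

The key idea in both languages is the same: the type is constructed from data, then instances are created "by handle" from that constructed type.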
Regards,
BBC -
Cisco Email Interaction Manager showing strange 'Created On' Dates
Hello All,
Could anyone tell me where EIM gets its "Created On" date and times from? I was looking into some of the Queues on EIM at about 10:50 am this morning and the emails that were in there were showing Creation times of well after 11:00 am on today's date, and I have no clue why.
The system time on the server running the EIM software has the correct system date and time. I also checked the system time on our mail server and that was correct as well. I have the ability to view emails that come into the aliases configured on the mail server before the Retriever service on EIM collects them, and they show the correct times there as well. So I'm not sure where EIM is getting these times from?
If anyone has any thoughts or suggestions it would be much appreciated.
Thanks in Advance,
Matthttps://supportforums.cisco.com/message/3360647
eGain now offers a multichannel solution that integrates with Cisco Unified Contact Center Express.
For more details, please contact:
Tibor Vari
eGain Communications
Email: [email protected]
Ph #: (201) 236 8353 -
Fields missing while creating the data source
Hi All
I am using a third-party system to pull data into BI. There was one table to which the other end recently added 2 fields. When I checked in development I was able to create the data source for the table along with the new fields, but when I check in production those newly added 2 fields are missing. Though I can see the old fields in the data source, the newly added fields are missing. So is there a problem with the refresh of the table, or should we restart the connection?
Can any one please let me know a solution for this...
Regards
Shilpa
Hi,
Do you have connectivity like this:
BI Dev
BI QA ---> all three connected to the same 3rd-party system
BI Prod
If yes: if it is R/3, replicating the datasource will help.
In the case of a third-party system you need to regenerate the datasource.
If it is DB Connect, go to the source system, connect to the database table from the BI side, and at the top you will have Generate Datasource; use that option and you will be able to solve it.
Otherwise:
BI Dev - 3rd party dev
BI Prod - 3rd party prod
Try to add the 2 new fields in the 3rd-party system and generate the datasource as described above.
Thanks
Arun
Edited by: Arunkumar Ramasamy on Sep 14, 2009 9:12 AM -
How to create a date range in Web Application Designer
I am using version 3.1 of Web Application Designer. I need to create a report with date ranges. I can get one date to work but not two (I need a start date and an end date). My query has 0CALMONTH with a variable for an interval. When I select it as a dropdown in Web Application Designer, I only get one date prompt.
Any suggestions??
Thanks
Kristen
Hi Kristen,
I'm sorry I'm not able to solve your problem; I have had the same trouble, and I want to create a date picker in Web Application Designer. So if you have solved the problem, please email me at [email protected] — thanks very much!
best regards
zegion chan -
How to create a data provider based on different fields in SAP BW?
Hi,
Experts,
There are 20 fields of Plant Maintenance, like:
SWERK
BEBER
STORT
TPLNR
EQUNR
INGRP
QMDAT --- period
STTXT
QMDAT - Date of Notification
QMNUM
QMTXT
QMART
AUSVN
AUZTV
AUSBS
AUZTB
AUSZT
ERNAM
QMDAB
AUFNR
I want to create a report based upon these fields.
For that I have checked the relevant fields against the existing standard datasources on the BW side and
checked the cubes created based upon these datasources.
I have found that some fields exist in different cubes and some are missing.
How can I create a data provider based on these different fields in SAP BW?
Please suggest!
Thanx,
Asit
Edited by: ASIT_SAP on Jul 15, 2011 6:25 AM
Edited by: ASIT_SAP on Jul 15, 2011 6:27 AM
Edited by: ASIT_SAP on Jul 15, 2011 12:37 PM
Hi Lee, please see below:
DECLARE @Machine2 TABLE
(
DispatchDate DATE
)
INSERT INTO @Machine2 VALUES ('2014/02/01'), ('2014/02/02'), ('2014/02/03')
DECLARE @DateFrom DATE
SELECT @DateFrom = DATEADD(D,1,MAX(DispatchDate)) FROM @Machine2
SELECT @DateFrom AS DateFrom
Please mark as answer, if this has helped you solve the issue.
Good Luck :) .. visit www.sqlsaga.com for more t-sql code snippets and BI related how to articles. -
While creating the data source for SQL Server, I am getting this error
Hi, I am new to Power BI sites and Office 365.
My first steps:
1. I have successfully created a gateway in Office 365.
2. Now I am creating the data source for that gateway using SQL Server R2.
3. While creating the data source with the corresponding gateway, at the Set Credentials phase I noticed a pop-up window showing Data Source Settings, and below in that window it shows things like:
Datasource Name.
Loading......
Credentials type:
Privacy Level:
Failed to verify parameter
This is the result I finally get, so how should I get past this problem? Please let me know if anyone knows.
Kindly give me a proper answer for this.
Regards
Any suggestions for Chinnarianu?
Thanks!
Ed Price, Azure & Power BI Customer Program Manager (Blog,
Small Basic,
Wiki Ninjas,
Wiki)
Answer an interesting question?
Create a wiki article about it! -
In "List View" in a Finder window, among the many column options are "Date created" and "Date modified." In "View Options" (command-J) for any folder, one can add these columns, along with the standard ones "Size," "Kind," "Label," etc.
A user can alter a file's name, and a file's "label" (i.e. its color). On the other hand, a user can NOT arbitrarily edit/alter a file's official "size" -- other than by physically altering the contents of the file itself, obviously.
But what about a file's "Date created" and "Date modified"? Can either of those be manually edited/changed, just as a file's name can be changed -- or is a file's creation-date an immutable attribute beyond the editorial reach of the user, just as a file's "size" is?
And yes, a person can "alter" a file's "Date modified" by simply modifying the file, which would change its "Date modified" to be the moment it was last altered (i.e. right now). But (and here's the key question) can a user somehow get inside a file's defining attributes and arbitrarily change a file's modification date to be at any desired point in the past that's AFTER its creation date and BEFORE the present moment? Or will so doing cause the operating system to blow a gasket?
If it is possible to arbitrarily manually alter a file's creation date or modification date, how would it be done (in 10.6)? And if it is NOT possible, then why not?
sanjempet --
Whew, that's a relief!
But as for your workaround solution: all it will achieve is setting the created and modified dates to RIGHT NOW. What I'm looking to do is alter the modification/creation dates to some point in the past.
I'm not doing this for any nefarious reason. I just like to organize my work files chronologically according to when each project was initiated, but sometimes I forget to gather the disparate documents into one folder right at the beginning. As a result, sometimes after I finish a job I will create a new folder to permanently house all the files of an old project, and when that folder is placed in a bigger "completed projects" folder and organized by "Date created" or "Date modified" in list view, it is out of order chronologically, because the creation and modification dates of that particular project folder reflect when the folder was created (i.e. today), not when the files inside the folder were created (i.e. weeks or months ago).
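Backdating the modification date, at least, is scriptable. A minimal sketch using Python's os.utime (note the assumptions: os.utime changes only the access/modification times; the HFS+ creation date is a separate attribute that os.utime does not touch — on 10.6 the developer tool SetFile -d could change it, as far as I know):

```python
import os
import time

def backdate(path, year, month, day):
    """Set a file's access and modification times to noon local time on the
    given date. This changes mtime/atime only, not the creation date."""
    stamp = time.mktime((year, month, day, 12, 0, 0, 0, 0, -1))
    os.utime(path, (stamp, stamp))  # (atime, mtime)
    return stamp
```

The shell equivalent is `touch -t` with a timestamp, which likewise touches only the access and modification times.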
The simplest solution would be to be able to back-date the folder's creation or modification date to match the date the project actually started! -
Problem creating new data sources
hi -- I've just created a file DSN data source (Microsoft ODBC for Oracle Driver), and in Crystal, have used the ODBC (RDO) data source type, pointing to the file DSN. (I'm accessing an Oracle 10g database on a Unix server.)
The connection to the database is successful, but when I use the Database Expert to pick the actual object (eg table) that I want the report to access, I get the following strange results:
Not all schemas are visible
For the schemas that are visible, only stored procedures are shown. No tables, views, or anything else.
When I create an Oracle Server data source, all schemas and their objects are visible. The report works fine.
Can someone explain to me why the File DSN is not showing me all schemas and objects?
Thanks,
Carol
All -- Thanks for your suggestions. I probably should have given you more information up front:
- I've already updated the database using Set Datasource Location.
- I've already verified that the database explorer is including views and tables
and is not restricting based on LIKE.
- I did as Sourashree suggested and created a system DSN -- it also only shows
me the stored procedures.
I have successfully created a non-ODBC data source just using the Crystal Reports
data source type of Oracle Server. It works just fine -- tables, views, everything I need
shows up in the explorer. So, my question about the File DSN is really just academic,
out of curiosity (I struggled with it so long that I'd like to know why it doesn't work!).
If you can help me out, that would be great. But I see no reason the Oracle Server
data source won't meet our needs.
Thanks,
Carol
Maybe you are looking for
-
BT can't seem to bill me?
For anyone who like long stories of BT problems please see my original thread in Infinity folder. https://community.bt.com/t5/BT-Infinity-Speed-Connection/Please-Help-Lost-everything-because-of-swit...k Despite being signed up for over a month and ha
-
How do I transfer my pictures from iPad to windows 7 operating computer?
How do I transfer my pictures from iPad to windows 7 operating computer?
-
Problem in parsing an XML using SAX parser
Hai All, I have got a problem in parsing an XML using SAX parser. I have an XML (sample below) which need to be parsed <line-items> <item num="1"> <part-number>PN1234</part-number> <quantity uom="ea">10</quantity> <lpn>LPN1060</lpn>
-
How to parse XML and store the data in tables using sql or plsql?
I want to parse the xml <?xml version="1.0" encoding="UTF-8" standalone="yes"?> <xmlListWrapper> <size>2</size> <AppTypeID>10</AppTypeID> </xmlListWrapper> and store in a table |pk|apptypeid| 1 10
-
Revision: 12685 Revision: 12685 Author: [email protected] Date: 2009-12-08 19:23:32 -0800 (Tue, 08 Dec 2009) Log Message: Fix for RTE in VideoPlayer when trying to capture bitmaps. Put try-catch block around bitmapData.draw() and use a Rectangl