
I'm looking to graph and interactively explore live/continuously measured data. There are quite a few options out there, with plot.ly being the most user-friendly. Plot.ly has a fantastic, easy-to-use UI (pan, zoom, scale, fit to screen), but it cannot handle the large data sets I'm collecting. Does anyone know of any alternatives?

I have MATLAB, but I don't have enough licenses to run it for this and do development at the same time. I know LabVIEW would be a great option, but it is currently cost-prohibitive.

Thanks in advance!

Aleksandr Blekh

3 Answers


For this answer, I have assumed that you prefer open source solutions for big data visualization. This assumption is based on the budgetary details in your question. However, there is one exception: below I include a reference to one commercial product, which I believe might be beneficial in your case (provided you can afford it). I also assume that browser-based solutions are acceptable (I would even prefer them, unless you have specific requirements to the contrary).

Naturally, the first candidate I would consider as a solution to your problem is the D3.js JavaScript library: http://d3js.org. However, despite its flexibility and other benefits, I think this solution is too low-level.

Therefore, I would recommend taking a look at the following open source projects for big data visualization, which are powerful and flexible enough, but operate at a higher level of abstraction (some of them are built on the D3.js foundation and are sometimes referred to as the D3.js visualization stack).

  • Bokeh - Python-based interactive visualization library, which supports big data and streaming data (see the streaming sketch after this list): http://bokeh.pydata.org
  • Flot - JavaScript-based interactive plotting library built on jQuery: http://www.flotcharts.org
  • NodeBox - unique rapid data visualization system (not browser-based, but multi-language and multi-platform), based on generative design and visual functional programming: https://www.nodebox.net
  • Processing - complete software development system with its own programming language, libraries, plug-ins, etc., oriented to visual content: https://www.processing.org (allows executing Processing programs in a browser via http://processingjs.org)
  • Crossfilter - JavaScript-based interactive visualization library for big data by Square (very fast visualization of large multivariate data sets): http://square.github.io/crossfilter
  • bigvis - an R package for exploratory analysis of big data (not a visualization library per se, but useful for pre-processing large data sets, e.g. aggregating and smoothing, prior to visualization with various R graphics options): https://github.com/hadley/bigvis
  • prefuse - Java-based interactive visualization library: http://prefuse.org
  • Lumify - big data integration, analysis and visualization platform (interesting feature: supports Semantic Web): http://lumify.io
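
Of these, Bokeh's streaming support maps most directly onto your live-measurement use case. Below is a minimal sketch of a Bokeh server app that appends new samples to a line plot as they arrive; the synthetic signal, 100 ms update period, and 10,000-point rollover window are my own illustrative choices, not requirements of the library:

```python
# Minimal sketch of streaming live measurements into a Bokeh plot.
# Assumes Bokeh is installed; save as stream_demo.py and run with:
#   bokeh serve --show stream_demo.py
from math import sin
from random import random

from bokeh.io import curdoc
from bokeh.models import ColumnDataSource
from bokeh.plotting import figure

source = ColumnDataSource(data=dict(x=[], y=[]))

fig = figure(title="Live data", x_axis_label="sample", y_axis_label="value")
fig.line(x="x", y="y", source=source)

t = 0

def update():
    """Append one new (synthetic) sample; keep only the newest 10,000 points."""
    global t
    t += 1
    source.stream({"x": [t], "y": [sin(t / 10.0) + 0.1 * random()]},
                  rollover=10_000)

curdoc().add_periodic_callback(update, 100)  # push a new point every 100 ms
curdoc().add_root(fig)
```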

Separately, I'd like to mention two open source big data analysis and visualization projects focused on graph/network data (with some support for streaming data of that type): Cytoscape and Gephi. If you are interested in other projects and products, whether more specialized (map support, etc.) or commercial (with basic free tiers), please see this compilation, which I thoroughly reviewed to come up with the main list above: http://blog.profitbricks.com/39-data-visualization-tools-for-big-data.

Finally, as promised at the beginning, here is the commercial product I thought you might want to take a look at: Zoomdata (http://www.zoomdata.com). The reason I excluded it from my open source compilation is its built-in support for big data platforms. In particular, Zoomdata provides data connectors for Cloudera Impala, Amazon Redshift, MongoDB, Spark and Hadoop, plus search engines, major database engines and streaming data sources.

Disclaimer: I have no affiliation with Zoomdata whatsoever; I was simply impressed by their range of connectivity options (which might cost you dearly, but that is another aspect of the analysis).

Aleksandr Blekh

Visualizing large datasets is a long-standing problem. One of the core issues is how to show more than a million points on a screen that has only about a million pixels, which usually means aggregating or sampling the data before rendering (see the aggregation sketch after the list below).

Having said that, here are a few tools that can handle big data:

  1. Tableau: you could use their free desktop tool.
  2. Tabplot: an R package built on top of ggplot2, designed to handle larger datasets.
  3. See this review for 5 other products that can help you do your job.
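
Whatever tool you pick, the pixel-budget problem above is usually addressed by aggregating before plotting. Below is a minimal Python sketch, using only NumPy and Matplotlib rather than any particular product from the lists here, that bins a few million points into a screen-sized grid and renders the bin counts instead of the raw points (the synthetic data and 800x600 grid are placeholders for your own measurements and plot size):

```python
# Minimal sketch of pre-aggregating a large scatter into a pixel-sized grid,
# so the renderer draws one cell per screen bin instead of millions of points.
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=2_000_000)               # stand-in for a large data set
y = 0.5 * x + rng.normal(size=x.size)

# Bin onto a grid roughly matching the plot's pixel resolution.
counts, xedges, yedges = np.histogram2d(x, y, bins=(800, 600))

plt.imshow(
    np.log1p(counts).T,                      # log(1 + count) keeps sparse bins visible
    origin="lower",
    extent=(xedges[0], xedges[-1], yedges[0], yedges[-1]),
    aspect="auto",
)
plt.colorbar(label="log(1 + points per bin)")
plt.show()
```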
Nitesh

If you are using Python, I would suggest mpld3, which combines D3.js JavaScript visualizations with Python's Matplotlib.

Installation and usage are really simple, and it has some nice plugins and interactive features.

http://mpld3.github.io/
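
For completeness, here is a minimal sketch of typical mpld3 usage, with fig_to_html for embedding and show for quick interactive viewing in the browser; the plotted curve is just a placeholder:

```python
# Minimal sketch of adding browser pan/zoom to a Matplotlib figure via mpld3.
# Assumes matplotlib and mpld3 are installed.
import matplotlib.pyplot as plt
import mpld3
import numpy as np

fig, ax = plt.subplots()
x = np.linspace(0, 10, 500)
ax.plot(x, np.sin(x))
ax.set_title("mpld3 demo")

# Embed the figure in a web page as a self-contained HTML/JS snippet...
html = mpld3.fig_to_html(fig)

# ...or serve it and open it in the default browser (blocks while serving).
mpld3.show(fig)
```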

Ram