
Hi Everyone,

  I'm an analyst/developer on a new project called Composable Analytics out of MIT. It's essentially a platform that lets users explore, create, and collaborate on analytical methods and data. Users develop analytics (we call them applications) using a flow-based programming language, and can leverage other languages like SQL, MATLAB, C#, Python, and R.

If I were to compare it to existing technology, the dataflow concept is similar to Spark, StreamBase, Alteryx, or Lavastorm. But I think we scale better, have a different type system, and offer a nicer interface with more collaborative features for sharing and executing analytics.

I'm soliciting feedback, and I'd love to hear your thoughts.

Cheers,

  Lars

Tags: analytics, business, composable, flow-based, intelligence, programming


Replies to This Discussion

So, is this product for sale (yet)? Any idea what pricing will look like?

It is. Our beta cloud services are currently offered free of charge. And if you're looking for your own instance, either in the cloud or on premises, licensing will be on an annual basis (this includes getting you up and running and any updates throughout the year). We can potentially waive fees since we're just starting out.

Bruce Higgins said:

So, is this product for sale (yet)? Any idea what pricing will look like?

Hi Lars, I'm interested, keep me posted.

One immediate thought that comes to mind: all the tools I've seen, and I could be wrong, seem to leave it up to the user to do his own exploratory data analysis (EDA). I'd like to see a tool that automatically performs a battery of appropriate EDA depending on the type of analysis you want to perform. For example, if you tell the tool you want to perform multiple regression, it should automatically produce a whole host of EDA (charts, graphs, statistics, etc.) and provide summary "messages" to the user about where some of the linear assumptions may not hold, and therefore where a variable transformation may be appropriate. If users have to do all this manually, many will be lazy and only check the tip of the iceberg of the EDA they should be performing. In other words, have the tool automatically assess the data and tell you where to address any issues (large outliers, etc.).
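To make the idea concrete, here's a minimal sketch of what such automated diagnostics could look like (my own illustration, not anything from Composable Analytics or any existing tool; the function and thresholds are made up): fit the regression, then emit summary messages about large residuals and residual skewness instead of leaving those checks to the user.

```python
# Hypothetical "automatic EDA" for multiple regression: fit OLS,
# then surface diagnostic messages about assumption violations.
# Names and thresholds are illustrative only.
import numpy as np

def auto_eda_regression(X, y, outlier_z=3.0):
    X1 = np.column_stack([np.ones(len(y)), X])       # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)    # OLS fit
    resid = y - X1 @ beta
    z = (resid - resid.mean()) / resid.std()         # standardized residuals
    messages = []
    n_out = int((np.abs(z) > outlier_z).sum())
    if n_out:
        messages.append(
            f"{n_out} large residual(s) with |z| > {outlier_z}; inspect for outliers")
    skew = float((z ** 3).mean())                    # residual skewness
    if abs(skew) > 1.0:
        messages.append(
            f"residual skewness {skew:.2f}; a variable transformation may help")
    return beta, messages
```

A real tool would go much further (leverage, multicollinearity, heteroscedasticity plots), but the point is the same: the tool volunteers the checks rather than waiting to be asked.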

Another feature I'd really like to see in a tool, and I am not seeing it, is an intuitive, efficient way of analyzing changes in analysis parameters and comparing their effects. For instance, say you made assumptions to exclude outliers early in your analysis, and weeks or months later you decide you want to revisit those exclusion assumptions. How does that change domino through all the analyses you've done since? Will you have to completely redo the charts, graphs, and conclusions you've perhaps already written up in your report? I'd like to see the effect of omitting the top "n" largest data points on any given statistic, say a beta value: a graph with "n" on the x-axis and the resulting beta value on the y-axis, showing where "n" makes a difference and where you can safely say it does not. The idea is change analysis.

All the tools I've seen require you to keep track of the effects of changes manually, or to build separate models and somehow compare them. That's a lot of memorization, note-taking, or model building just to understand the impact of changes.
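Here's a rough sketch of that beta-vs-n idea in Python (the function is purely illustrative, not from any existing tool): refit a simple regression after dropping the n largest-residual points, for each n up to some limit, so the resulting pairs can be plotted to see where the exclusion choice stops mattering.

```python
# Illustrative "change analysis" sketch: how does the slope (beta)
# of a simple regression respond to dropping the n largest-residual
# points? Returns (n, beta) pairs for n = 0..max_drop.
import numpy as np

def beta_vs_dropped(x, y, max_drop=10):
    # residuals from the full-data fit decide which points are "largest"
    slope, intercept = np.polyfit(x, y, 1)
    order = np.argsort(-np.abs(y - (slope * x + intercept)))  # biggest first
    results = []
    for n in range(max_drop + 1):
        keep = np.delete(np.arange(len(x)), order[:n])  # drop top-n residuals
        s, _ = np.polyfit(x[keep], y[keep], 1)
        results.append((n, float(s)))
    return results
```

Plotting the returned pairs (n on the x-axis, beta on the y-axis) gives exactly the sensitivity view described above; a flat curve past some n means the exclusion assumption can be revisited without redoing downstream work.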

Email me if this is not clear...

Good luck.

If you'd like to use Composable Analytics on your own network, feel free to download it: http://composableanalytics.com.

© 2019 Data Science Central ®
