Internet of Things? Maybe. Maybe Not.

Everything is connected: through the cloud, all machine-generated data are collected and widely shared over the Internet. That’s how we imagine IoT – the Internet of Things.

 

Correction: That’s how THEY imagine IoT. What WE envision here is not just the Internet of Things but also the Intelligence of Things. The idea is: when a device is equipped with connectivity and sensors, why not take another bold step and make the device intelligent? With an agile and affordable computing unit, every device has the power to analyze collected data and take fact-backed actions, making “in-place” intelligence part of the Internet of Things, anywhere and at any time. Intelligence, according to Jeff Hawkins*, is defined by predictions.

 

Computers, home appliances, vehicles – even apparel and kitchenware – can be turned into thinking units. They can help you act or react to the environment or your neighbours based on your behavioral routines and preferences. Your running shoes could adjust the friction of their soles according to your weight, the weather, and the kind of trail you choose. Your home theater system could fine-tune sound effects according to the movie genre and the time of day you are watching. Plenty of exciting applications come with the advent of intelligent things.

 

The question is, how does it work?

 

The data collected from sensors is uploaded to the cloud and fed into (machine) learning systems, while streaming data triggers an analytic engine to predict the best outcome and react accordingly. Big data accumulates the background knowledge, while small data evokes intelligence in-place.
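As a rough sketch of this split (all names, thresholds, and data here are hypothetical illustrations, not the author's system): the cloud side distills accumulated sensor history into a small model, and the device side applies that model to each new streaming reading.

```python
# Hypothetical sketch: cloud-side learning vs. device-side "intelligence in-place".

def train_in_cloud(history):
    """Big data: learn a comfort threshold from accumulated sensor readings."""
    return {"temp_threshold": sum(history) / len(history)}

def react_in_place(model, reading):
    """Small data: each new streaming reading triggers a local prediction."""
    return "cool" if reading > model["temp_threshold"] else "idle"

history = [20.0, 22.0, 24.0, 26.0]              # sensor data uploaded to the cloud
model = train_in_cloud(history)                  # background knowledge
actions = [react_in_place(model, r) for r in (19.0, 27.5)]
print(actions)  # ['idle', 'cool']
```

The point of the sketch is only the division of labor: the expensive aggregation happens once in the cloud, while the per-reading decision is cheap enough to run on the device itself.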

 

In-Place Computing, which fully utilizes the unbounded memory space of today's 64-bit architectures, opens a window onto this sci-fi-like scenario. Because it works in virtual memory space, it avoids hardware lock-in and offers cross-platform computing power. With Qualcomm's announcement of 64-bit CPUs for handheld devices, mobile devices can now serve complicated computing jobs at your fingertips. In-Place Computing can thus be the catalyst for a new era of the “Intelligence of Things.”
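A loose analogy in ordinary Python (a minimal sketch, not MacroData's implementation): memory-mapping a file places it directly into the process's 64-bit virtual address space, so computation reads the data where it resides instead of first copying it into application buffers.

```python
import mmap
import os
import struct
import tempfile

# Write a small dataset of 64-bit floats to disk (a stand-in for a large data object).
values = [1.5, 2.5, 3.0, 4.0]
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(struct.pack("<%dd" % len(values), *values))
tmp.close()

# Map the file into virtual memory and compute directly on the mapping --
# no explicit read() into an intermediate buffer.
with open(tmp.name, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        count = len(m) // 8
        total = sum(struct.unpack_from("<d", m, i * 8)[0] for i in range(count))

os.remove(tmp.name)
print(total)  # 11.0
```

On a 64-bit machine the mappable address space vastly exceeds physical RAM, which is the property the article leans on.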

 

 

*Check out this video in which Jeff Hawkins explains how brain science will change computing.


Comment by chang hsiung on June 22, 2014 at 8:34am

Thank you anyway.

I am still somewhat curious what kind of problems you are solving that current technology cannot solve.

If you don't mind, can you give us some real-world scenarios where your in-place computing can make a difference?

Are you providing tools or libraries that people can use to build their analytics models?

Last year I got a job offer from Intel Labs, and they claimed to do exactly that. Unfortunately I turned them down because my wife was against it; otherwise I would know better by now.

Comment by wenwey hseush on June 22, 2014 at 8:18am

I hope you can solve your problems. Best regards.

Comment by chang hsiung on June 22, 2014 at 8:10am

Unfortunately, as industrial people, we only care about whether your technology can solve our problems;

we don't care what you call it.

Comment by wenwey hseush on June 21, 2014 at 10:16pm

I simply tried to clarify what in-place computing is; I don't wish the term to be used for anything we don't promise to do. It is an abstract computing model best suited for building (in-memory) data analytic engines. You can certainly build complicated models on top of an in-place data engine. We don't build machine learning models ourselves. Best wishes for your work.

Comment by chang hsiung on June 21, 2014 at 5:13pm

Since I came from Taiwan, I will go back to Taiwan this fall or next spring.

I am still somewhat foggy about what your company can do for us, though.

Since doing predictions with loaded models takes maybe 0.01 sec, this is not an issue for us at all.

With your technology, do you do model building inside the cell phone?

How big or complicated a model (a machine learning model, e.g. SVM or random forest) can be built, and in how much time, with your technology on current cell phones? Or, in case a master model has been built and loaded onto the cell phone, do you have an efficient algorithm to update the model (based on locally collected new data) within a time our customers are willing to wait (typically only a few seconds)? As I mentioned before, building a new or updated model from scratch can take several minutes up to hours.

If your company can help us solve our problems, then we should definitely meet and talk; otherwise you can keep doing your MacroData or in-place computing, and I will keep working on our consumer spectrometer. Here in the Bay Area, my wife (who has a PhD in Computer Engineering) meets with professors and PhD students from Stanford every week; I don't think we need any more help from academia.

Comment by wenwey hseush on June 21, 2014 at 3:39pm

I am the author of "BigObject Store: In-Place Computing for Interactive Analytics", which will appear at the IEEE Big Data Congress in June 2014. What we at MacroData intend to address is the execution layer rather than the semantics layer. Similar to the way the Turing machine is defined as an abstract model for manipulating symbols, in-place computing is an abstract model that allows data objects to live and work in a flat and unbounded memory space. We believe in-place computing is the right model for big data, rather than the traditional compute-centric (stored-program) model, based on the von Neumann architecture, that people have been using for 75 years. An infinite, flat, memory-based computing model for data is needed so there won't be any data movement or data retrieval. That's why we call it in-place computing, which is closely related to in-memory computing. You can download the paper from the MacroData website. I will be happy to meet you if you come to Taiwan.

Comment by Sam Sur on June 21, 2014 at 11:07am

@Chang, great comments. I was in fact referring to both aspects. Most of the time in my use-case scenarios, in-place computation is not possible, mainly because we need to aggregate data from several agents before making any prediction.

Comment by Sam Sur on June 21, 2014 at 11:03am

Good discussion, @Yuanjen. Let us meet up and discuss F2F when I am in Taiwan or you are in the US next time. This is my favorite topic (after soccer, of course :-)

Comment by chang hsiung on June 20, 2014 at 3:57pm

From my perspective, there are two aspects of in-place computing:

1) model building

2) prediction only

1) can be very computationally demanding, while 2) is usually much less demanding.

For some complicated models and less efficient algorithms, 1) can take hours even on my engineering laptop with 32 GB of RAM, while 2) usually takes only milliseconds to finish.

I am not sure which aspect you guys are referring to.
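[The asymmetry Chang describes can be illustrated with a minimal, hypothetical sketch in pure Python: model building (aspect 1) loops over the data many times, while prediction (aspect 2) is a single arithmetic pass.]

```python
import math

def train(xs, ys, epochs=2000, lr=0.1):
    """Aspect 1: model building -- iterates over the whole dataset many times."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # logistic prediction
            w += lr * (y - p) * x                      # gradient step
            b += lr * (y - p)
    return w, b

def predict(w, b, x):
    """Aspect 2: prediction -- one dot product and a threshold."""
    return 1 if 1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5 else 0

xs = [-2.0, -1.0, 1.0, 2.0]   # toy 1-D features
ys = [0, 0, 1, 1]             # toy labels
w, b = train(xs, ys)          # thousands of passes over the data
print(predict(w, b, 1.5))     # 1  (a single cheap pass)
print(predict(w, b, -1.5))    # 0
```

Training cost scales with epochs × dataset size; prediction cost is constant per query, which is why the two aspects differ by orders of magnitude.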

Comment by Yuanjen Chen on June 20, 2014 at 2:38pm

Hi Sam,

Thanks for sharing. The In-Place Computing Model is an abstraction that defines "computations take place where data reside." It fully utilizes the unbounded virtual memory space of 64-bit architectures and thus bypasses the limitation of swap space. Since 64-bit CPUs for handheld devices will become available this year, we expect adoption to prevail in maybe 2 or 3 years, when devices equipped with 64-bit CPUs are empowered to analyze. That's when we expect the "Intelligence of Things" to arrive.
