In this exploratory blog series, we examine the relationship between recurrent neural networks (RNNs) and IoT data. The article is written by Ajit Jaokar, Dr Paul Katsande and Dr Vinay Mehendiratta as part of the Data Science for Internet of Things practitioners course. Please contact [email protected] for more details.
RNNs are already used for Time series analysis. Because IoT problems can often be modelled as Time series, RNNs could apply to IoT data. In this multi-part blog, we first discuss Time series applications, then discuss how RNNs could apply to them. Finally, we discuss applicability to IoT.
In this article (Part One), we present the overall thought process behind applying Recurrent neural networks to Time series applications - especially a type of RNN called Long Short Term Memory networks (LSTMs).
Prediction involves making claims about the future state of something based on its past values and its current state. Many IoT applications (such as temperature readings, smart meter readings etc) have a time dimension. Classical pattern recognition problems are concerned with finding dependencies between variables (Regression) or with classifying input vectors into categories (Classification). By including changes over time, we add a temporal dimension, giving us Spatio-temporal data. Even a continuous phenomenon can be converted into a Time series through sampling. Thus, phenomena such as speech, ECGs etc can be modelled as Time series when sampled.
Hence, we have a variable x changing in time, x_t (t = 1, 2, ...), and we would like to predict the value of x at time t+h. Time series forecasting is a problem of function approximation, and a forecast is evaluated by computing an error measure over the time series. Given a time series model, we can also solve many related problems. For example: forecast the value of the variable x at a future time t+h; classify the time series at a time in the future (will prices go up or down?); model one time series in terms of another (for example, Oil prices in terms of Interest rates) etc.
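To make the "function approximation plus error measure" framing concrete, here is a minimal sketch. The "model" is a deliberately naive baseline (predict the next value as the last observed value) chosen purely for illustration; the point is only to show how a forecast error measure is computed over a time series.

```python
import numpy as np

def naive_forecast(series):
    """Predict x(t+1) as x(t) for every step in the series."""
    return series[:-1]          # forecasts for positions 1..n-1

def mse(actual, predicted):
    """Mean squared error, a common error measure for forecasts."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean((actual - predicted) ** 2))

series = np.array([1.0, 2.0, 4.0, 7.0, 11.0])   # toy time series
forecast = naive_forecast(series)                # [1, 2, 4, 7]
error = mse(series[1:], forecast)                # compared against [2, 4, 7, 11]
print(error)                                     # 7.5
```

Any real forecasting model (including the RNNs discussed below) would be trained to drive this kind of error measure down on held-out data.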
However, while many scenarios (such as Weather prediction, foreign exchange fluctuations, energy consumption etc) can be expressed as Time series, formulating and solving the equations is hard in these cases for reasons such as:
a) There are too many factors influencing the outcome
b) There are hidden/unknown factors influencing the outcome.
In many such scenarios, the goal is not to find a precise solution to the problem but rather a possible steady state to which the system will converge. Neural networks often apply in such scenarios because they are able to learn from examples alone and can capture hidden and strongly non-linear dependencies.
Neural networks are trained on historical data with the objective that the network will discover hidden dependencies and be able to use them for predicting the future. Thus, neural networks are not represented by an explicitly given model and can spot nonlinear dependencies in spatiotemporal patterns. They also ease the problem of feature engineering, i.e. finding the best representation of the sample data from which to learn a solution to your problem.
When used for Time series forecasting, the first obvious problem is: how do we model Time in the neural network? For traditional applications of Neural networks (such as pattern recognition or classification), we do not need to model the Time dimension. Time is difficult to model in a neural network because it is constantly moving forward. However, by including a set of delays, we can retain successive values of the time series, treating each past value as an additional spatial dimension. This process of converting the time dimension into spatial dimensions is called embedding. Because for practical purposes we need a limited set of values, we consider a history of the previous d samples (the embedding dimension), as shown in the figure below.
Source: http://www.cs.cmu.edu/afs/cs/academic/class/15782-f06/slides/timese....
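The embedding idea above can be sketched in a few lines: a univariate time series is converted into rows of d consecutive past values (the embedding dimension), each paired with the value that follows as the prediction target. The helper name `embed` and the toy series are our own illustration, not from the source.

```python
import numpy as np

def embed(series, d):
    """Return (X, y): windows of d past samples and the value that follows each."""
    series = np.asarray(series, dtype=float)
    X = np.array([series[i:i + d] for i in range(len(series) - d)])
    y = series[d:]
    return X, y

series = [10, 20, 30, 40, 50, 60]
X, y = embed(series, d=3)
# X[0] = [10, 20, 30] -> y[0] = 40, and so on.
print(X.shape, y.shape)   # (3, 3) (3,)
```

Each row of X is now an ordinary fixed-size input vector, so any standard feedforward network (or regression model) can be trained on it.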
For the evolution of neural networks, see the previous post, Evolution of Deep learning models. Recurrent neural networks are often used for modelling Time series. An example is Using Recurrent Neural Networks To Forecasting of Forex (pdf).
A recurrent neural network (RNN) is a class of artificial neural network where connections between units form a directed cycle. This creates an internal state, which allows the network to exhibit dynamic temporal behaviour. Unlike feedforward neural networks, RNNs can use their internal memory to process arbitrary sequences of inputs. This makes RNNs applicable to tasks such as unsegmented connected handwriting recognition, where they have achieved the best known results (Wikipedia). The fundamental feature of an RNN is that the network contains at least one feedback connection, so activations can flow around in a loop. This enables the network to do temporal processing and learn sequences, e.g. perform sequence recognition/reproduction or temporal association/prediction. Using the same time-delay idea as above, a recurrent neural network can be converted into a traditional feedforward neural network by unfolding it over time, as shown below.
Source: http://www.cs.bham.ac.uk/~jxb/INC/l12.pdf
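The unfolding picture corresponds to a simple recurrence that can be written out directly. The sketch below runs an Elman-style recurrence h_t = tanh(W_x x_t + W_h h_{t-1} + b): the same weights are reused at every unfolded step, and the hidden state h is the feedback loop carrying information forward. The weights here are arbitrary toy values, not a trained model.

```python
import numpy as np

def rnn_forward(inputs, W_x, W_h, b):
    """Run the recurrence h_t = tanh(W_x x_t + W_h h_{t-1} + b) over a sequence."""
    h = np.zeros(W_h.shape[0])                 # initial hidden state
    states = []
    for x_t in inputs:                         # one "unfolded" step per input
        h = np.tanh(W_x @ x_t + W_h @ h + b)   # same weights at every step
        states.append(h)
    return np.array(states)

rng = np.random.default_rng(0)
W_x = rng.normal(size=(4, 2)) * 0.5            # input -> hidden weights
W_h = rng.normal(size=(4, 4)) * 0.5            # hidden -> hidden (the feedback connection)
b = np.zeros(4)
inputs = rng.normal(size=(5, 2))               # a sequence of 5 two-dimensional inputs
states = rnn_forward(inputs, W_x, W_h, b)
print(states.shape)                            # (5, 4): one hidden state per time step
```

Training would backpropagate through these unfolded steps (backpropagation through time); the forward pass above is exactly what the unfolded diagram depicts.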
RNNs are used to model Time series because the feedback mechanism creates a 'memory', i.e. an ability to process the Time dimension. Memory is important because many Time series problems (such as Traffic modelling) need long-term/historical modelling of Time values. Long Short Term Memory networks (LSTMs) are a special kind of RNN, capable of learning long-term dependencies because they can retain information over long time frames. The figure below summarises feedforward neural networks and Recurrent neural networks.
Source: http://deeplearning.cs.cmu.edu/notes/shaoweiwang.pdf
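To make the LSTM's "long-term memory" claim concrete, here is a sketch of a single LSTM cell step: the cell state c is updated additively, controlled by forget and input gates, which is what lets information survive across many time steps. The gate weights are toy random values and the stacked-parameter layout is one common convention, not the only one.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W, U, b hold the four gate parameter blocks stacked."""
    n = h.size
    z = W @ x + U @ h + b
    f = sigmoid(z[0:n])            # forget gate: how much of old c to keep
    i = sigmoid(z[n:2*n])          # input gate: how much new content to write
    o = sigmoid(z[2*n:3*n])        # output gate: how much of c to expose
    g = np.tanh(z[3*n:4*n])        # candidate cell content
    c_new = f * c + i * g          # additive cell-state update (the "memory")
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(1)
n_in, n_hid = 3, 4
W = rng.normal(size=(4 * n_hid, n_in)) * 0.5
U = rng.normal(size=(4 * n_hid, n_hid)) * 0.5
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.normal(size=(6, n_in)):   # run the cell over a short sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)                # (4,) (4,)
```

Because c_new is built by gated addition rather than repeated squashing, gradients can flow back through many steps, which is why LSTMs handle the long-range dependencies that plain RNNs struggle with.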
In subsequent articles, we will explore LSTMs in greater detail and their implications for IoT data.
This article covers topics we teach in the Data Science for Internet of Things practitioners course. Please contact [email protected] for more details