# A Different Breed of Mathematics: Topology

Topology is the mathematical study of the properties that are preserved through deformations, twistings, and stretchings of objects; tearing, however, is not allowed[1]. Topology can be used to abstract the inherent connectivity of objects while ignoring their detailed form. Put simply, topology is a mathematical discipline that studies shape and assumes that shape has meaning[2]. This post discusses topology's applications in finance and insurance. While topology is certainly not a new breed of mathematics, it was long confined to the realms of physics and pure mathematics, and only recently has it started finding applications in machine learning, finance, and insurance. Applying this under-rated but highly powerful branch of mathematics can significantly enhance results derived from big data and data science applications.

A sizable portion of financial and actuarial research is built upon classical applications of Linear Algebra (such as regression analysis) and Stochastic Calculus (such as valuation models). As a result, these methods focus on geometric locations rather than logical relations. Traditional actuarial models could be complemented with Topological and Graph-Theoretical tools that recognize the hierarchy and relationships between agents in the system. While Black-Scholes and quantitative finance look mathematically challenging, the underlying mathematics is quite old, and we should keep up with more recent applications of mathematics such as topology, especially because they are revolutionizing every aspect of science[3].

The problem with Linear Algebra and Stochastic Calculus is that they were not designed to study complex adaptive systems. Linear Algebra focuses on locations in Euclidean geometry, hence the term "linear". Stochastic Calculus shares that focus, concentrating on measuring changes in the geometric location associated with a random variable, the speed of that change, and so on. While Linear Algebra can answer questions such as how closely together a system moves, it cannot recognize a system's shortest path of propagation, which nodes can shut down a network, or which connections are most critical to ensure network flow[4].
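The contrast is easy to see in code. A minimal pure-Python sketch (the toy network, its node names, and its edges are invented for illustration) answers exactly the questions Linear Algebra cannot: the shortest propagation path between two institutions, and the articulation points whose failure disconnects the network.

```python
from collections import deque

# Toy financial network: nodes are institutions, edges are exposures.
graph = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def shortest_path(g, src, dst):
    """Breadth-first search: shortest propagation path from src to dst."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:  # walk back through predecessors to rebuild the path
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in g[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None

def articulation_points(g):
    """Nodes whose removal disconnects the network (can 'shut it down')."""
    cut = set()
    for node in g:
        # Remove the node, then check whether the rest is still connected.
        rest = {u: [v for v in nbrs if v != node]
                for u, nbrs in g.items() if u != node}
        start = next(iter(rest))
        seen, q = {start}, deque([start])
        while q:
            for v in rest[q.popleft()]:
                if v not in seen:
                    seen.add(v)
                    q.append(v)
        if len(seen) < len(rest):
            cut.add(node)
    return cut

print(shortest_path(graph, "A", "E"))  # ['A', 'B', 'D', 'E']
print(articulation_points(graph))      # {'D'}
```

Here node D is the single point of failure: every path to E runs through it, which no covariance matrix would reveal directly.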

A Topological study of the financial system provides a map of how markets are arranged and interconnected. A Graph Theory study of the financial system tells us about the roles that various nodes play. This is useful to market participants because it explains how information propagates through the system, allowing them to monitor capital flows and anticipate price trends[5].

Anticipating price trends and modeling demand is a key requirement of insurance ratemaking and can be used to simulate price trends and cycles for emerging liabilities. One appropriate topological tool for this purpose is the Stochastic Flow Diagram (SFD), a novel mathematical methodology that helps visualize the complex network of demand and supply flows scattered around punctuated equilibria. SFDs combine elements of Graph Theory and inferential statistics to visualize the structure of a complex system, allowing for an intuitive interpretation of its state and future course. The SFD method takes into account the dynamic properties of the system, determining the direction of the flows in terms of lead-lag and causality effects. SFD connectivity is determined by the statistical significance of the graph's arcs, which are weighted by the flow carried through them. Because an SFD maps a dynamic system, it incorporates a time dimension, where crossing each arc represents a unit of time elapsed[6].
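The core idea — direct an arc from one series to another when the first leads the second, and weight it by the strength of that relationship — can be sketched in a few lines. This is a simplified illustration, not Calkin and López de Prado's method: a fixed lagged-correlation threshold stands in for their statistical-significance test, and the `price`/`demand` series are synthetic.

```python
import math

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag] (x leading y)."""
    xs, ys = x[:len(x) - lag], y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / math.sqrt(vx * vy)

def sfd_arcs(series, lag=1, threshold=0.5):
    """Directed arc i -> j when series i leads series j strongly enough.
    (A real SFD keeps arcs by statistical significance; a fixed
    correlation threshold stands in for that test here.)"""
    arcs = {}
    for i in series:
        for j in series:
            if i != j:
                r = lagged_corr(series[i], series[j], lag)
                if abs(r) >= threshold:
                    arcs[(i, j)] = round(r, 3)
    return arcs

# Synthetic example: 'demand' is 'price' delayed by one step, so the
# diagram should contain the arc price -> demand but not the reverse.
rng, base = 1, []
for _ in range(101):                      # simple deterministic LCG
    rng = (1103515245 * rng + 12345) % 2 ** 31
    base.append(rng / 2 ** 31)
series = {"price": base[1:], "demand": base[:-1]}
print(sfd_arcs(series))
```

Each surviving arc is one unit of elapsed time in the diagram; chaining arcs traces how a shock to one line of business would flow through the rest.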

Aside from SFDs, there is also Topological Data Analysis (TDA), which refers to the adaptation of topology to the analysis of highly complex data. It draws on the philosophy that all data has an underlying shape and that shape has meaning[7].
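The "shape of data" idea can be made concrete with the simplest piece of TDA, 0-dimensional persistent homology: grow a connection radius around each data point and record the radius at which separate components merge. Short-lived components are noise; long-lived ones are real structure. A minimal union-find sketch, with an invented point cloud of two well-separated clusters:

```python
from itertools import combinations

def zero_dim_persistence(points):
    """Death radii of connected components as the connection radius grows
    (0-dimensional persistent homology, computed from the sorted edges)."""
    parent = list(range(len(points)))

    def find(i):                     # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    edges = sorted((dist(points[i], points[j]), i, j)
                   for i, j in combinations(range(len(points)), 2))
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                 # two components merge: one dies here
            parent[ri] = rj
            deaths.append(round(d, 3))
    return deaths  # the one surviving component persists forever

# Two tight clusters: four small death radii inside the clusters, then
# one large death radius when the clusters finally merge.
cloud = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]
print(zero_dim_persistence(cloud))
```

The gap between the small and large death radii is exactly the "shape" signal: the data has two components at every scale between roughly 0.15 and 7.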

The machine intelligence approach advocated by Ayasdi (a startup founded by Stanford professors) combines topology with machine learning to achieve data-driven insights instead of hypothesis-driven insights. Machine learning on its own has significant limitations. Clustering, for example, requires the analyst to specify an arbitrary number of clusters in advance. With dimensionality reduction techniques, the danger is missing subtle insights in the data that could prove very useful to the analysis. Combining topology with machine learning effectively overcomes these drawbacks[8].

The topological visualizations capture the subtle insights in the data while also representing its global behavior. From the nodes identified by the topological network diagram, clusters are identified, and each cluster is fitted with its own, better-suited model, so that instead of a one-size-fits-all model, different models are applied to different regions of the data for maximum predictive potency[9].
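A minimal sketch of this segment-then-fit idea, assuming the clusters have already been extracted from the topological network (here they are simply hard-coded), fits an ordinary-least-squares line to each region separately:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x on one cluster."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b            # (intercept, slope)

# Two regions with different local behaviour: one global line would fit
# neither well, but a per-cluster line fits each region exactly.
cluster_a = [(x, 2 * x + 1) for x in range(5)]         # local slope +2
cluster_b = [(x, -3 * x + 40) for x in range(10, 15)]  # local slope -3
models = {}
for name, cluster in [("a", cluster_a), ("b", cluster_b)]:
    xs, ys = zip(*cluster)
    models[name] = fit_line(xs, ys)
print(models)  # {'a': (1.0, 2.0), 'b': (40.0, -3.0)}
```

The global OLS fit over all ten points would split the difference between the two regimes; the per-region models recover each local relationship exactly.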

On another note, by following the lead of the author Ovidiu Racorean[10] and generalizing his topological approach from stock markets to ratemaking, a surprising image of premiums can potentially arise if the price time series of all lines of business are represented in one chart at once. By taking into account only the crossings of the price series and fixing a convention defining overcrossings and undercrossings, the chart can evolve into a braid representation of the general insurance portfolio. The braid of prices has a remarkable connection with the topological quantum computer. Using pairs of quasi-particles called non-abelian anyons, whose trajectories are braided in time, a topological quantum computer can effectively simulate the premium levels and behavior encoded in the braiding of the portfolio. In a typical topological quantum computation, the trajectories of non-abelian anyons are manipulated according to the braiding of the prices, and the outcome reflects the probability of a future state of the market. That probability depends only on the Jones polynomial of the knot formed by the plat closure of the quantum computation. To draw a parallel with the common financial literature, the Jones polynomial of the knotted market acts in a topological quantum computation as the counterpart of a classical technical indicator, or of premiums arrived at by, say, a Generalized Linear Model. The type of knot the market forms is also an indicator of its future tendencies.
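The first step of this construction — recording where two price series cross, under a fixed over/under convention — can be sketched in a few lines. The convention chosen here (the series that is higher after the crossing passes over) is an illustrative assumption, not necessarily Racorean's, and the price series are invented.

```python
def crossings(p, q, names=("p", "q")):
    """List the braid crossings of two price series as (time, over) pairs.
    Convention (an illustrative choice): the series that is higher AFTER
    the crossing passes over; the other passes under."""
    out = []
    for t in range(1, len(p)):
        before, after = p[t - 1] - q[t - 1], p[t] - q[t]
        if before * after < 0:       # sign change: the series crossed
            over = names[0] if after > 0 else names[1]
            out.append((t, over))
    return out

# Two toy price series that alternate leadership every step.
p = [1, 3, 2, 5, 4]
q = [2, 2, 3, 3, 5]
print(crossings(p, q))  # [(1, 'p'), (2, 'q'), (3, 'p'), (4, 'q')]
```

Applied pairwise to every line of business, this sequence of signed crossings is exactly the braid word that the quantum computation described above would take as input.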

Following this approach, premium pricing signals can become a process of writing a quantum code, with the topological quantum computer as the perfect device for reading it and decoding premium pricing market behavior. A typical topological quantum computation ends by fusing the pairs of non-abelian anyons together, a process that results in the plat closure of the anyons' braided trajectories. The outcome of the topological quantum calculation refers to the final state of the system and expresses the probability of the insurance market ending in a certain state, say a soft or hard underwriting cycle[11].

[1] "Topology", MathWorld, Wolfram.com.

[2] Ayasdi, "Topology & Topological Data Analysis".

[3] Calkin, N. J. and López de Prado, M., "The Topology of Macro-Financial Flows: An Application of Stochastic Flow Diagrams", Algorithmic Finance 3 (2014), pp. 43–85.

[4] Ibid

[5] Ibid

[6] Ibid

[7] Ayasdi, "Topology & Topological Data Analysis".

[8] Ibid

[9] Ibid

[10] Racorean, O., "Braided and Knotted Stocks in the Stock Market: Anticipating the Flash Crashes", Applied Mathematics in Finance Department, SAV Integrated Systems.

[11] Ibid


Comment by Syed Danish Ali on July 2, 2016 at 1:14pm

Based on the very helpful comments by Michel Baudin and Sione Palu, I have briefly revised parts of this post to make it more accurate.

Comment by Sione Palu on June 29, 2016 at 2:56pm

I second Michel Baudin's comment. The author of this article is either misleading readers or simply unaware of the history of topology. It is more than 100 years old, yet the author makes it out as if topology is new. Besides, machine learning has been adopting concepts from topology for the last 10 years or so, from text mining to image recognition and so forth.

One example of topology is the Riemannian manifold topological space (or simply the Riemannian tensor), which was popularized by Einstein in 1916 when he used the Riemannian tensor to develop his General Theory of Relativity.

Bernhard Riemann first published his work on what we now know as "Riemannian geometry" (or topology) in 1854.

Here is some research on manifolds (http://mathworld.wolfram.com/Manifold.html) in machine learning that is available in the literature.

1)  Non-negative Matrix Factorization on Manifold

http://www.cse.wustl.edu/~zhang/teaching/cs517/Spring12/CourseProje...

2) Non-Negative Tensor Factorization with Applications to Statistics and Computer Vision

http://www.cs.huji.ac.il/~shashua/papers/NTF-icml.pdf

3) Discriminant Analysis on Riemannian Manifold of Gaussian Distributions for Face Recognition with Image Sets

http://www.cv-foundation.org/openaccess/content_cvpr_2015/papers/Wa...

4) Large-Scale Manifold Learning

http://web.cs.ucla.edu/~ameet/largeManifold.pdf

As I mentioned above, there are tons of machine learning models that have been developed within manifold-topology frameworks, not limited to the four listed above.

Comment by Michel Baudin on June 29, 2016 at 1:14pm

While the idea of topological data analysis is new to me and I find it intriguing, I have to take exception to the description of topology itself as "a new breed of math."

Topology, which I would describe as the study of continuity in its most general form, is not new. I first studied it in Bourbaki's 1971 textbook "Topologie Générale," which contained many results from the early 20th century, and later studied algebraic topology under Henri Cartan, whose own work was from the mid-20th century.

It was fascinating stuff, but hardly new.