Big Data

September 4, 2017

In recent years, computing power has reached a level where machine learning and rudimentary artificial intelligence are accessible to a wider audience. Many machine learning algorithms and methodologies are derived from Bayesian theory. Bayes' theorem was first described in Thomas Bayes' memoir, published posthumously in 1763 (Howson and Urbach, 2006). The theorem formalizes the idea of revising a probability in light of additional information obtained later. Stated mathematically, the probability of event A given that event B has subsequently occurred is:

P(A|B) = P(B|A) × P(A) / P(B)

The use and development of Bayes' work has grown exponentially since the 1980s (Howson and Urbach, 2006).
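
To make the revision concrete, the following is a minimal Python sketch of the update the theorem describes. The scenario and its numbers are invented for illustration: a condition with 1% prevalence and a test with a 95% true positive rate and a 5% false positive rate.

```python
def posterior(prior_a, p_b_given_a, p_b_given_not_a):
    """Return P(A|B) via Bayes' theorem."""
    # Total probability of B: P(B) = P(B|A) P(A) + P(B|~A) P(~A)
    p_b = p_b_given_a * prior_a + p_b_given_not_a * (1 - prior_a)
    # Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
    return p_b_given_a * prior_a / p_b


# Revised probability of having the condition after a positive test result
print(posterior(0.01, 0.95, 0.05))  # roughly 0.16
```

Even after a positive result, the revised probability stays fairly low because the prior was low; this is exactly the kind of revision in light of new information that the theorem captures.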

Machine learning can be defined as "a method of data analysis that automates analytical model building. It is a branch of artificial intelligence based on the idea that machines should be able to learn and adapt through experience" (SAS). As the definition notes, machine learning and Bayes' theorem are related through the idea of adaptation via experience.
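
One way to illustrate that connection is sequential Bayesian updating: each new observation revises the current belief, which is one reading of "learn and adapt through experience." The sketch below uses a Beta-Bernoulli model for the bias of a coin; the model choice and the data stream are assumptions made purely for illustration.

```python
def update_beta(alpha, beta, observations):
    """Update a Beta(alpha, beta) belief about a coin's bias, one flip at a time."""
    for x in observations:
        if x == 1:      # heads: evidence for a higher bias
            alpha += 1
        else:           # tails: evidence for a lower bias
            beta += 1
    return alpha, beta


alpha, beta = 1, 1                # uniform prior: no initial opinion about the bias
flips = [1, 1, 0, 1, 1, 1, 0, 1]  # made-up observations
alpha, beta = update_beta(alpha, beta, flips)
print(alpha / (alpha + beta))     # posterior mean of the bias, here 0.7
```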

It can be argued that this growth corresponds with the rise of the personal computer. Additionally, the ability to collect and store vast amounts of data, or Big Data, has proved to be prime fuel for Bayesian and machine learning methodologies.

Howson, Colin, and Peter Urbach. 2006. Scientific Reasoning. Open Court Publishing.
