The gradient descent algorithm is one of the most popular optimization techniques in machine learning. It comes in three flavors: batch or “vanilla” gradient descent (GD), stochastic gradient descent (SGD), and mini-batch gradient descent, which combines aspects of both.
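The distinction between the three flavors is simply how many training samples feed each gradient estimate. A minimal sketch (hypothetical helper, fitting y = w·x by mean-squared error) where a single `batch_size` parameter selects the flavor:

```python
import random

def gd_step(w, xs, ys, lr, batch_size=None):
    """One gradient-descent step on MSE for the model y = w * x.

    batch_size=None      -> batch ("vanilla") GD: use all samples
    batch_size=1         -> stochastic GD: one random sample
    1 < batch_size < n   -> mini-batch GD
    """
    n = len(xs)
    if batch_size is None or batch_size >= n:
        idx = range(n)                      # full batch
    else:
        idx = random.sample(range(n), batch_size)
    # d/dw of (1/m) * sum (w*x - y)^2  is  (2/m) * sum (w*x - y) * x
    grad = sum(2 * (w * xs[i] - ys[i]) * xs[i] for i in idx) / len(idx)
    return w - lr * grad

# Toy data generated from y = 3x; batch GD recovers w ≈ 3
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]
w = 0.0
for _ in range(200):
    w = gd_step(w, xs, ys, lr=0.01)
print(round(w, 3))  # converges near 3.0
```

Passing `batch_size=1` or `batch_size=2` to the same loop gives the stochastic and mini-batch variants; the updates become noisier but each step is cheaper on large datasets.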
I am a theoretical physicist and data scientist. During my Ph.D. at New York University, my research was on quantum many-body systems out of equilibrium, one of today's hottest research fields in theoretical physics. In 2015 I earned a Ph.D. for my thesis "Prethermalization, universal scaling at macroscopic short times, and thermalization following a quantum quench." Subsequently, I spent two years as a postdoctoral researcher focusing on applications of computational physics methods to areas such as quantum chaos, non-equilibrium quantum dynamics, and thermalization. I've published several articles in the most prestigious peer-reviewed journals in the world.
I have 15 years of experience building mathematical and statistical models spanning a wide range of disciplines, including non-equilibrium quantum systems, marketing attribution for digital and offline channels (using machine learning, game theory, stochastic processes, and hierarchical Bayesian models), influencer marketing, marketing mix modeling, record linkage/data matching/entity recognition, conjoint analysis (using support vector machines), bidding optimization for digital media (using original functional optimization techniques), and many others.
I was a speaker at the American Physical Society (APS) March Meeting, the largest physics conference in the United States, in 2013, 2015, and 2016 (for details on these and other conferences, see my personal website www.marcotavora.me).
In 2017 I co-founded Prior Solutions, a data-driven consulting company with clients in more than ten countries around the world.