People complain that governments or hackers are reading our messages for nefarious purposes. Of course, this "reading" is done automatically, in large volume, by machines and NLP (natural language processing) …
Many unscrupulous bloggers re-post copyrighted material on their blogs without permission. The problem is compounded by the fact that Google can give credit to the illegal copy, and erroneously …
This is our new challenge of the week. Previous challenges …
Well-rounded, visionary data scientist with a broad spectrum of domain expertise, technical knowledge, and proven success in bringing measurable added value to companies ranging from startups to Fortune 100 firms, across multiple industries (finance, Internet, media, IT, security) and domains (data science, operations research, machine learning, computer science, business intelligence, statistics, applied mathematics, growth hacking, IoT).
Vincent developed and deployed new techniques such as hidden decision trees (for scoring and fraud detection); automated tagging, indexing, and clustering of large document repositories; a scalable, simple, noise-resistant regression known as Jackknife regression (fit for black-box, real-time, or automated data processing); model-free confidence intervals; bucketisation; combinatorial feature selection algorithms; detection of causation rather than mere correlation; and, generally speaking, a set of consistent, robust statistical and machine learning techniques that can be understood, implemented, interpreted, leveraged, and fine-tuned by the non-expert. Vincent also invented synthetic metrics (for instance, predictive power and L1 goodness-of-fit) that work better than traditional statistics, especially on badly behaved, sparse big data. Some of these techniques have been implemented in a Map-Reduce (Hadoop-like) environment; some are concerned with identifying true signal in an ocean of noisy data.
Vincent is a former post-doctoral researcher at Cambridge University and the National Institute of Statistical Sciences. He was a finalist in the Wharton School Business Plan Competition and in the Belgian Mathematical Olympiads. Vincent has published 40 papers in statistical journals and is an invited speaker at international conferences. He also created the first IoT platform to automate growth and content generation for digital publishers, using a system of APIs for machine-to-machine communication involving Hootsuite, Twitter, and Google Analytics.
Vincent's profile is accessible at http://bit.ly/1jWEfMP and includes top publications, presentations, and work experience with Visa, Microsoft, eBay, NBC, Wells Fargo, and other organisations.
This resource is part of a series on specific topics related to data science: regression, clustering, neural networks, deep learning, Hadoop, decision trees, ensembles, correlation, outliers, Python, R, TensorFlow, SVM, data reduction, feature selection, experimental design, time series, cross-validation, model fitting, and many more. To keep receiving these articles, …
When dealing with time series, the first step consists of isolating trends and periodicities. Once this is done, we are left with a normalized time series, and the next step, called model fitting, is to study its auto-correlation structure. The purpose is to check whether the underlying data follows a well-known stochastic process with a similar auto-correlation structure, such as an ARMA process, using tools such as …
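The two steps above (de-trending, then inspecting the auto-correlation structure) can be sketched as follows. This is a minimal illustration, assuming Python with NumPy; the simulated series, the trend slope, and the AR(1) parameter `phi` are made up for the example, not taken from any real data set. An AR(1) process has theoretical auto-correlation `phi**k` at lag `k`, so after removing the trend we can compare the sample auto-correlations of the residual against those values.

```python
import numpy as np

def autocorrelation(x, lag):
    """Sample auto-correlation of a series at a given positive lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

rng = np.random.default_rng(0)
n = 5000
t = np.arange(n)

# Simulate a series = linear trend + AR(1) component
# x_t = phi * x_{t-1} + noise  (auto-correlation phi**k at lag k).
phi = 0.6
ar = np.zeros(n)
for i in range(1, n):
    ar[i] = phi * ar[i - 1] + rng.normal()
series = 0.01 * t + ar

# Step 1: isolate and remove the trend (least-squares line).
slope, intercept = np.polyfit(t, series, 1)
residual = series - (slope * t + intercept)

# Step 2: study the auto-correlation structure of the residual and
# compare it with the theoretical AR(1) values phi**k.
for lag in (1, 2, 3):
    print(lag, autocorrelation(residual, lag), phi ** lag)
```

With 5,000 points the empirical auto-correlations land close to the theoretical ones, which is the kind of agreement one looks for when checking whether a candidate process fits the data.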