Reducing total cost of ownership (TCO) is a topic familiar to all enterprise executives and stakeholders. Here, I discuss optimization strategies in the context of AI ado...
Introduction Language models have existed for decades — long before today’s so-called “LLMs.” In the 1990s, IBM’s alignment models and smoothed n-gram systems t...
You have plenty of experience. Maybe you are over 40, or perhaps fresh out of college, and you realize that there are no tech jobs out there. More precisely, 1,000 applicants for an...
With explainable AI, intuitive parameters that are easy to fine-tune, versatile, robust, fast to train, and no library required other than NumPy. In short, you have full control over ...
This book opens up new research areas in theoretical and computational number theory, numerical approximation, dynamical systems, quantum dynamics, and the physics of num...
Standard LLMs rely on prompt engineering to fix problems (hallucinations, poor response, missing information) that come from issues in the backend architecture. If the ba...
In this article, I discuss the main problems of standard LLMs (OpenAI and the like), and how the new generation of LLMs addresses these issues. The focus is on Enterpris...
In my ground-breaking paper 51, available here, I paved the way to solving a famous centuries-old math conjecture. The question is whether or not the digits of numbers s...
Tools such as OpenAI's can on occasion give the impression that they are able to prove theorems, and even generalize them. Whether this is a sign of real (artificial) intell...