The Tokenization of Life
Martin Schwill, PhD; 2024–12–30

Chapter 1: Introduction

Tokenization is the process of dividing sequences into smaller units, known as tokens, that can be understood and processed by discrete neural network models. I draw a parallel to the concept of quantization in physics, where matter is reduced to its fundamental subunits, such as quarks, leptons, and bosons, which collectively form atoms. These particles adhere to the principles of the Schrödinger wave equation, exhibiting inherently probabilistic behavior in their interactions. Yet the fundamental subunits of atoms give rise to all known chemical substances, and their myriad combinations form the basis of life itself, all of which, in turn, adhere again to discrete rules, which […]
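To make the idea concrete, here is a minimal sketch of tokenization, not taken from any particular tokenizer: a sequence is split into tokens and each token is mapped to an integer ID, the discrete form a neural network can consume. The vocabulary and the simple whitespace-splitting rule are illustrative assumptions only.

```python
from typing import Dict, List

def tokenize(text: str) -> List[str]:
    """Split a sequence into tokens; here, simple whitespace splitting."""
    return text.lower().split()

def encode(tokens: List[str], vocab: Dict[str, int]) -> List[int]:
    """Map each token to its integer ID, using 0 for unknown tokens."""
    return [vocab.get(token, 0) for token in tokens]

# Tiny illustrative vocabulary (ID 0 is reserved for unknown tokens).
vocab = {"<unk>": 0, "tokenization": 1, "divides": 2,
         "sequences": 3, "into": 4, "tokens": 5}

tokens = tokenize("Tokenization divides sequences into tokens")
ids = encode(tokens, vocab)
print(tokens)  # ['tokenization', 'divides', 'sequences', 'into', 'tokens']
print(ids)     # [1, 2, 3, 4, 5]
```

Real tokenizers (byte-pair encoding, WordPiece, or character- and k-mer-level schemes) replace the whitespace rule with learned or domain-specific splitting, but the principle is the same: continuous, open-ended sequences are reduced to a finite set of discrete units.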