About me:
Hello, I am a mathematician (BSc, MSc) specializing in algebraic topology and differential geometry, with a particular focus on topological data analysis. My recent interests have expanded to include the statistical mechanics of deep learning, as well as topological and geometric deep learning. I am currently developing a topological mean-field theory (MFT) and an order-parameter-based theory for coupled networks, employing spectral sheaf theory. I am also using sheaf neural networks to model the diffusion of multimodal, heterogeneous coupled signals. In addition, I have explored the relationship between minimal Hamiltonian decompositions in renormalization-group (RG) flow and information theory, finding that Hamiltonian descriptions appear to correspond to Kolmogorov complexity measures; this line of inquiry complements my broader interest in symplectic geometry and Floer homology. I am interested in bridging equilibrium statistical mechanics (the Landau-Anderson picture) and non-equilibrium statistical mechanics (the Prigogine picture). Finally, I am developing a contemporary theory of memetics in a neurodynamical context.
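To make the sheaf-diffusion idea above concrete, here is a toy numerical sketch (not the actual research model; the single-edge graph, stalk dimensions, and restriction maps are arbitrary illustrative choices). It builds a sheaf Laplacian from a coboundary map and runs heat-type diffusion, which drives the node signals toward a global section.

```python
import numpy as np

# Toy cellular sheaf on a single edge e = (u, v): stalks R^2 over each node and
# over the edge, with linear restriction maps F_u, F_v (chosen arbitrarily here).
F_u = np.array([[1.0, 0.0], [0.0, 2.0]])   # restriction map: stalk(u) -> stalk(e)
F_v = np.array([[0.0, 1.0], [1.0, 0.0]])   # restriction map: stalk(v) -> stalk(e)

# Coboundary delta acting on the stacked node signal x = (x_u, x_v):
# (delta x)_e = F_u x_u - F_v x_v.
delta = np.hstack([F_u, -F_v])             # shape (2, 4)

# Sheaf Laplacian L = delta^T delta; its kernel is the space of global sections.
L = delta.T @ delta

# Heat-type diffusion x <- x - alpha * L x pushes the node signals toward
# agreement "up to the restriction maps", i.e. toward a global section.
x = np.random.default_rng(0).normal(size=4)
alpha = 0.1                                 # small enough for the largest eigenvalue of L
for _ in range(200):
    x = x - alpha * (L @ x)

print("residual disagreement:", np.linalg.norm(delta @ x))   # ~0 at convergence
```

Sheaf neural networks build on this diffusion operator, typically with learned restriction maps; the toy above only shows the fixed-sheaf case.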
About this project:
Since the 1950s, there has been active exploration of the deeper connections between statistical mechanics, ergodic theory, information theory, topology, computational complexity, and more, as evidenced by the work of luminaries such as John von Neumann and Andrey Kolmogorov. In a similar vein, the complexity sciences, also known as computational cybernetics or plectics (a term coined by Murray Gell-Mann), have emerged as an interdisciplinary field studying complex adaptive systems. This field amalgamates communication theory, signal processing, information technology, control theory, and theories of pattern formation, and it seeks to elucidate the connections between simplicity and complexity, as explored at the Santa Fe Institute. Notably, recent work by John Beggs has applied the principles of equilibrium statistical mechanics, such as the Ising model, to the study of neuronal avalanches within a neuroscientific context, echoing the origins of modern machine learning in energy-based models such as the Hopfield network. Similarly, statistical mechanics has been successfully employed in the study of social multi-agent networks, as demonstrated by physicists such as Matjaž Perc. This mathematical language of statistical mechanics has proven invaluable for describing and predicting complex neurobiological, cognitive, and social phenomena, such as cognitive effort. Our research group and I are currently working on a more comprehensive framework of plectics that employs a shared language, with the goal of connecting small- and large-scale network-based cognitive phenomena, from neuropsychology to culture, under the modern theory of memetics.
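The Ising/Hopfield link mentioned above can be made concrete with a minimal energy-based sketch (purely schematic, and not the neuronal-avalanche models of Beggs nor this group's framework): Hebbian storage of binary patterns and zero-temperature asynchronous updates that never increase the energy E(s) = -(1/2) s^T W s.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store two random +/-1 patterns with the Hebbian rule W = (1/N) * sum_mu p_mu p_mu^T.
N = 50
patterns = rng.choice([-1.0, 1.0], size=(2, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)                      # no self-coupling

def energy(s):
    # Ising-style energy of a spin configuration s in {-1, +1}^N.
    return -0.5 * s @ W @ s

# Start from a corrupted copy of pattern 0 and run asynchronous sign updates
# (zero-temperature dynamics); each update can only lower or keep the energy.
s = patterns[0].copy()
flipped = rng.choice(N, size=10, replace=False)
s[flipped] *= -1

for _ in range(10):
    for i in rng.permutation(N):
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0

print("overlap with the stored pattern:", s @ patterns[0] / N)   # ~1.0 on successful recall
print("final energy:", energy(s))
```

Up to external fields, this quadratic energy is the Ising energy; the point of the sketch is only that retrieval is relaxation to a local energy minimum.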
Contact me:
- Email: moc.liamg|8ngnsx#moc.liamg|8ngnsx
- Twitter: @1_800_MIRACLE
Currently authoring (smaller projects):
(The ✔️ emoji marks the articles I am more or less happy with.)
To write on kernels and the kernel trick in machine learning:
✔️ Hilbert Space • ✔️ Reproducing Kernel Hilbert Space • ✔️ Mercer's Theorem • Gram Matrix • Kernel function • Support Vector Machine • Kernel Vector Machine (a minimal sketch of the kernel trick follows below)
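As a quick orientation for this cluster (not part of the planned articles themselves), a minimal sketch of the kernel trick: an RBF kernel's Gram matrix gives inner products in a reproducing kernel Hilbert space without ever constructing the feature map, and by Mercer's theorem the matrix is symmetric positive semidefinite.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # k(x, y) = exp(-gamma * ||x - y||^2) equals <phi(x), phi(y)> in an
    # infinite-dimensional RKHS, computed without ever forming phi.
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq_dists)

X = np.random.default_rng(0).normal(size=(5, 3))
K = rbf_kernel(X, X)                  # Gram matrix K_ij = k(x_i, x_j)

# Mercer's theorem / positive semidefiniteness: all eigenvalues of the Gram
# matrix are >= 0 (up to floating-point error).
print(np.linalg.eigvalsh(K) >= -1e-10)
```

A support vector machine then operates entirely on K through its dual problem, never on explicit feature vectors, which is what makes the trick useful.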
To write on spectral theories, Hodge theory, and Riemannian geometry:
Hodge theorem • Hodge theory • Spectral theory • De Rham Cohomology (Differential Form • Smooth Manifold • Exterior Derivative) (the Hodge theorem statement is sketched below)
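For reference while drafting these entries (a standard statement, written in LaTeX, not anything specific to this project), the Hodge theorem they converge on:

```latex
% Hodge decomposition on a closed oriented Riemannian manifold (M, g):
% every k-form splits orthogonally into exact, coexact, and harmonic parts,
\Omega^k(M) \;=\; d\,\Omega^{k-1}(M) \;\oplus\; \delta\,\Omega^{k+1}(M) \;\oplus\; \mathcal{H}^k(M),
% where \delta is the codifferential (the formal adjoint of d),
% \Delta = d\delta + \delta d is the Hodge Laplacian, and
% \mathcal{H}^k(M) = \ker\Delta \cap \Omega^k(M) \cong H^k_{\mathrm{dR}}(M),
% so each de Rham cohomology class has a unique harmonic representative.
```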
To write on Riemannian information geometry:
Vector transport • Optimal vector transport • Metric tensor • Fundamental theorem of Riemannian Geometry • Affine connection • Levi-Civita connection (Christoffel formula sketched below) … (tbc)
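And for the Riemannian thread, the coordinate formula the fundamental-theorem entry builds toward (again a standard statement, included only as a drafting reference): the Levi-Civita connection is the unique torsion-free, metric-compatible affine connection, with Christoffel symbols

```latex
% Christoffel symbols of the Levi-Civita connection of (M, g) in local coordinates,
% with Einstein summation over the repeated index \ell:
\Gamma^{k}_{ij} \;=\; \tfrac{1}{2}\, g^{k\ell}
  \left( \partial_i g_{j\ell} + \partial_j g_{i\ell} - \partial_\ell g_{ij} \right).
```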
Memetics and adjacent topics:
Cogcode • Stroop task • Cognitive effort • Information Hazard • Cognitive warfare • Steganography • Argot • Engram • Salience • Salience network • Semiosis • Mimesis • Obfuscation
Need to also write: Spectral graph theory
Currently writing (larger project):
Personal sandbox
- Spectral Sheaf Theory:
  - Spectral Sheaf Theory - Sandbox
  - Spectral Sheaf Theory - Sandbox for a Navbox
  - Spectral Sheaf Theory - Notes
Sandbox:
Notes:
- Statistical mechanics of machine learning (Notes)
- Anderson and Prigogine Images (Notes)
- Dissipative Thermodynamics of Computation (Notes)
- Memetics (Notes)
Articles being written: