|
Kolmogorov-Anderson Program We draw on algorithmic information theory and equilibrium statistical mechanics to understand strong generalization, i.e. internal model selection in favor of minimal description length, in statistical models. In particular, we are interested in singular learning theory, equivariant feature learning, and the homological nature of entropy (in the sense of Bennequin).
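The connection between generalization and description length is made precise in singular learning theory by Watanabe's asymptotic expansion of the Bayes free energy,

```latex
F_n = n L_n(w_0) + \lambda \log n - (m - 1) \log\log n + O_p(1),
```

where $L_n(w_0)$ is the empirical loss at an optimal parameter, $\lambda$ is the real log canonical threshold (RLCT), and $m$ is its multiplicity. Since $\lambda \le d/2$, with equality only for regular models, the posterior favors singular, effectively lower-dimensional regions of parameter space: an internal Occam's razor of the kind described above.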
|
Conceptual Dynamics We seek to address the formidable challenge of modeling belief propagation across human networks, particularly in semasiological and heterogeneous settings. We integrate insights from the social sciences and neurobiology into modern deep learning in order to understand the complex dynamics of belief systems and the relationship between micro- and macroscale social behavior.
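As a minimal baseline for the micro-to-macro question, a DeGroot-style model already exhibits the phenomenon of local trust-weighted updates producing a global consensus. The sketch below is illustrative only; the network, trust weights, and initial beliefs are invented for the example.

```python
import numpy as np

# Toy DeGroot-style belief dynamics: each agent's belief moves toward a
# trust-weighted average of its neighbors' beliefs. All values here are
# illustrative, not empirical.

def degroot_step(W, beliefs):
    """One synchronous update: beliefs <- W @ beliefs."""
    return W @ beliefs

# Row-stochastic trust matrix for 3 agents (each row sums to 1).
W = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.7, 0.1],
    [0.1, 0.2, 0.7],
])
beliefs = np.array([1.0, 0.0, 0.5])  # initial opinions in [0, 1]

for _ in range(200):
    beliefs = degroot_step(W, beliefs)

# For a strongly connected, aperiodic W the beliefs converge to consensus.
print(np.round(beliefs, 3))
```

The macroscale consensus value is determined by the left Perron eigenvector of $W$, which is one concrete instance of microscale interaction rules pinning down macroscale behavior.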
|
Spectral Selective State Spaces Leveraging the recent successes of structured and selective state space models such as S4 and S6, we aim to integrate them into the geometric deep learning framework, with a particular focus on spectral sheaf theory and optimal transport on $O(d)$-bundles. We believe this approach will be advantageous for the analysis and control of complex dynamical and belief-propagation systems.
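For readers unfamiliar with the selective mechanism, the core recurrence can be sketched in a few lines: a diagonal linear state space whose discretization step depends on the input, so the model can modulate how much of each token it retains. This is a schematic sketch in the spirit of S4/S6, not a faithful Mamba implementation; the parameter values are illustrative.

```python
import numpy as np

# Minimal diagonal state space recurrence with an input-dependent
# ("selective") step size, schematically in the spirit of S4/S6.

def selective_ssm(x, A, B, C):
    """x: (T,) input sequence; A, B, C: (N,) diagonal SSM parameters."""
    h = np.zeros_like(A)
    ys = []
    for x_t in x:
        dt = np.log1p(np.exp(x_t))   # softplus: step size depends on the input
        Ad = np.exp(dt * A)          # zero-order-hold discretization (diagonal A)
        h = Ad * h + dt * B * x_t    # selective state update
        ys.append(float(C @ h))      # linear readout
    return np.array(ys)

T, N = 16, 4
rng = np.random.default_rng(0)
A = -np.abs(rng.standard_normal(N))  # negative diagonal -> stable dynamics
B = rng.standard_normal(N)
C = rng.standard_normal(N)
y = selective_ssm(rng.standard_normal(T), A, B, C)
print(y.shape)
```

In the geometric setting sketched above, the scalar state would be replaced by sections of a sheaf or an $O(d)$-bundle, with the recurrence acting along the underlying graph or base space.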
|
Unsupervised Invariant Distillation/Detection Can unsupervised learning models and their latent-space representations be used for the statistical detection and classification of invariants by probing rich moduli spaces? In which cases can human intuition be assisted by unsupervised learning models?
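One concrete form such probing can take is a linear probe: test whether a candidate invariant is linearly decodable from a model's latent representations. The sketch below is entirely synthetic; the "latents" are a random linear re-encoding of the data and the "invariant" is a fixed linear functional, chosen only to illustrate the probing workflow.

```python
import numpy as np

# Toy invariant probing: is a hidden quantity linearly decodable from
# latent representations? All data here is synthetic and illustrative.

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 6))                    # raw data points
invariant = X @ np.array([1., -1., 0., 2., 0., 0.])  # hidden quantity
labels = (invariant > 0).astype(float)               # sign of the invariant

Z = X @ rng.standard_normal((6, 16))                 # stand-in for learned latents

# Linear probe by least squares: fit w so that Z @ w approximates the labels.
w, *_ = np.linalg.lstsq(Z, labels - labels.mean(), rcond=None)
pred = (Z @ w > 0).astype(float)
accuracy = (pred == labels).mean()
print(f"probe accuracy: {accuracy:.2f}")
```

High probe accuracy is evidence that the representation encodes the invariant; for genuinely nonlinear invariants on a moduli space, the linear probe would be replaced by a small trained classifier.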