Evolving programme

Jennifer Balakrishnan (Boston University)

Quadratic Chabauty in higher genus

Abstract
Determining rational points on modular curves is an important problem in arithmetic geometry; the curves whose Jacobian has rank at least equal to the genus remain the frontier. While quadratic Chabauty can be an effective p-adic tool for computing rational points on certain modular curves where the rank of the Jacobian equals the genus, many of the underlying computations, such as computing a basis of de Rham cohomology and the local height computations, become computationally prohibitive for higher genus non-split Cartan modular curves. We will discuss joint work in progress with Steffen Mueller and Jan Vonk, aided by Google Gemini, to find promising plane models for the genus 8 non-split Cartan modular curve $X_{\mathrm{ns}}^+(19)$, and what remains to be done to complete the quadratic Chabauty computation.

Barinder Banwait

Realizing cubic isogeny primes: AI-assisted experiments
Abstract

Mazur's celebrated 1978 theorem determines the complete list of primes $p$ such that some elliptic curve over $\mathbb{Q}$ admits a rational $p$-isogeny. The natural strong uniformity analogue over degree-$d$ number fields (for each $d \geq 2$, determine the conjecturally finite set of primes arising as a $k$-rational isogeny degree for some elliptic curve over some number field $k$ of degree $d$) is an open problem, already out of reach for $d = 2$ and $d = 3$. Over odd-degree fields the usual "non-CM" caveat is automatic, making $d = 3$ a particularly clean target.
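For concreteness, Mazur's list is short enough to record in a few lines; the sketch below simply encodes the well-known statement (the helper name is illustrative, not from any existing package):

```python
# Mazur (1978): the complete list of primes p for which some elliptic
# curve over Q admits a rational p-isogeny.
MAZUR_ISOGENY_PRIMES = {2, 3, 5, 7, 11, 13, 17, 19, 37, 43, 67, 163}

def is_rational_isogeny_prime(p):
    """True iff p occurs as a rational isogeny degree over Q."""
    return p in MAZUR_ISOGENY_PRIMES

print(sorted(MAZUR_ISOGENY_PRIMES))
```

The degree-$d$ analogue asks for the corresponding (conjecturally finite) set with $\mathbb{Q}$ replaced by number fields of degree $d$.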

In joint work with Maarten Derickx, we have developed an open-source tool, isogeny-primes, which for any number field $K$ computes a provably correct superset of its isogeny primes. Sweeping it across the cubic fields in the LMFDB yields a concrete finite upper bound for strong uniformity in degree 3. The remaining task is realization: for each candidate prime $p$, either exhibit a cubic field $K$ and an elliptic curve $E/K$ witnessing $p$ as an isogeny degree, or rule it out; equivalently, decide which $p$ admit a non-cuspidal cubic point on the modular curve $X_0(p)$.

In this talk I'll describe an exploratory attempt to engage with this realization problem using contemporary AI tools: both as coding assistants for the classical computational attack on $X_0(p)$ (Chabauty–Coleman, Mordell–Weil sieve, explicit descent), and as lightweight predictors to help triage which candidate primes to pursue. I will present initial findings and some reflections on what this style of approach currently can and cannot contribute to problems of this flavour.

Gergely Berczi (Aarhus University)

A short tale of Stirling coefficients for symmetric powers

Abstract
We study the Stirling coefficients of symmetric power representations, with the goal of understanding their structure, finding closed formulas, and testing the conjectural binomial patterns underlying them. The story unfolds in a strikingly modern way: AlphaEvolve and deep-learning based computations suggest formulas, LLM frontier models turn these into proofs and structural results, and the resulting picture with human insight leads to new conjectures as well as new computational experiments with AlphaEvolve.
I will describe how this pipeline yields a proof of a strong form of the original conjecture, and in particular, I will present new rank-two binomial-positivity and log-concavity conjectures for the Schur coefficients, the partial results currently proved, and an AlphaEvolve setup aimed at the first genuinely open case.

Alyson Deines (CCR La Jolla)

Transfer Learning Rational L-functions
Abstract

This talk will focus on transfer learning of rational L-functions, specifically between classical modular forms, Bianchi modular forms, Hilbert modular forms, and genus 2 curves over the rationals. In these cases transfer learning (LDA, SVM, neural nets) performs quite well. However, when the LMFDB was updated last fall, adding two more Bianchi modular forms, performance tanked in one case: LDA trained on L-functions originating from Bianchi modular forms and tested on the other types. In this talk we explore this anomaly.

Jordan Ellenberg (University of Wisconsin)

1-Human-machine iteration I: introduction

Abstract

I will introduce the mechanisms of PatternBoost and FunSearch/AlphaEvolve and talk about my own experiences using these two protocols to produce material of mathematical interest. In all cases, the experience is one of cooperation between traditional and ML methods, often with repeated back and forth between the two. In this talk I'll also try to give a sense of which problems seem best suited to these methods as they exist in 2026. (Perhaps our work this week will broaden my answer to this question.)

2-Human-machine iteration II: generalization

Abstract

I'll talk about the question of generalization: to what extent can we use ML methods to help us generate not only examples for single instances of a problem (e.g. a large capset in dimension 8), but also families of examples that would change our knowledge of the general case of a problem (e.g. an algorithm that for each $n$ yields a large capset in dimension $n$)? I'll talk more specifically about a recent result using AlphaEvolve which ends in a theorem about large hypercubes in the Bruhat order on $S_n$, for arbitrary $n$.
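For readers unfamiliar with the capset example mentioned above: a capset in $\mathbb{F}_3^n$ is a set of vectors containing no three distinct points on an affine line, i.e. no three distinct points summing to zero coordinatewise mod 3. A minimal brute-force checker, purely for illustration:

```python
from itertools import combinations

def is_capset(points):
    """Check that no three distinct points of F_3^n lie on a line.

    In F_3^n, three distinct points a, b, c are collinear exactly
    when a + b + c == 0 coordinatewise mod 3.
    """
    pts = set(points)
    for a, b in combinations(pts, 2):
        # the unique third point on the line through a and b
        c = tuple((-x - y) % 3 for x, y in zip(a, b))
        if c not in (a, b) and c in pts:
            return False
    return True

# A maximum-size cap in dimension 2 (the maximum there is 4 points)
print(is_capset([(0, 0), (1, 0), (0, 1), (1, 1)]))  # True
```

An exhaustive check like this scales exponentially in $n$, which is precisely why searching for large capsets in higher dimensions is a natural target for ML-guided search.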

3-Summarizing our successes and failures

Kyu-Hwan Lee (University of Connecticut)

Reading a transformer's mind
Abstract

In this talk, I will discuss how one can interpret what a transformer has learned after training. The presentation will be based on two papers: arXiv:2502.10357 and arXiv:2511.12421.

Abbas Mehrabian (Google DeepMind)

AlphaEvolve user interface

Abstract
I will introduce the AlphaEvolve user interface. Please bring your laptops so we can walk through a simple problem together. For a general introduction to AlphaEvolve and the problems it can address, please see https://arxiv.org/abs/2511.02864.

Tom Oliver (University of Westminster)

Data Representation in Number Theory

Abstract
Before applying machine learning or AI to number theory (or to any subject) we must first decide how the underlying objects should be represented as input to an algorithm. But what is the most meaningful representation of an arithmetic object? Is an integer N best viewed simply as an integer, as a vector of residues modulo primes p, or through structures attached to N, such as the coefficients of an elliptic curve of conductor N? Is an elliptic curve a five-dimensional vector arising from a Weierstrass equation, an infinite-dimensional vector determined by its L-function, or a vector field derived from its L-function together with its twists? Different representations reveal different patterns, and the choice of representation may determine what AI systems are able to learn and discover about arithmetic phenomena.
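As a tiny illustration of the first choice above, one can pass from an integer to its vector of residues modulo small primes; the function name and the particular primes below are illustrative only, not drawn from any actual pipeline:

```python
def residue_vector(N, primes=(2, 3, 5, 7, 11, 13)):
    """Represent the integer N by its residues modulo a few small primes."""
    return [N % p for p in primes]

# The same arithmetic object in two representations:
print(2024)                  # "just an integer"
print(residue_vector(2024))  # [0, 2, 4, 1, 0, 9]
```

The residue vector exposes divisibility structure (here, that 2 and 11 divide 2024) that a model fed the raw integer would have to infer on its own, which is one concrete sense in which the choice of representation shapes what can be learned.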

Alexey Pozdnyakov (Princeton University)

Three lessons from the murmurations discovery
Abstract
We review the role that machine learning and data science played in the discovery of murmurations. We then highlight a few takeaways on AI for math from the project. Namely, we describe the crucial role of interpretability, the data-scientific perspective on mathematics, and the role played by theory or the lack thereof.

Andrew Sutherland (MIT)

Using generative AI to explore number-theoretic datasets
Abstract

I will demonstrate two new tools that leverage the capabilities of generative AI to allow researchers to explore questions in number theory (and other fields) with much greater ease and flexibility than was previously possible. The first is AlphaEvolve; the second is a new LMFDB-MCP connector. I will give live demonstrations of both tools that I hope will facilitate exploration and discovery by workshop participants, both during and after the workshop.