Hi, I’m back again, returning after a somewhat extended hiatus from physics. For one thing, I’ve been doing a lot of competitive programming practice (LeetCode, Codeforces, the Google Code Jam archives - dynamic programming is a favourite), which honestly deserves a whole post of its own. Back in the realm of physics, a lot of my current exploration is in computational phys/chem, something I hadn’t looked into much until now. Here’s what I’ve been working on lately:
- I was also revising solitons in low dimensions - kinks, monopoles, vortices - because I’m planning a numerical simulation of $\phi^4$ kink-antikink-meson scattering. Since it’s a non-linear boundary value problem, I’ll probably use the shooting method, which is a nice callback to the Griffiths Chapter 2 exercise. Plus I want to acquaint myself with Atiyah and Hitchin’s work on soliton moduli spaces, visualise the Kosterlitz-Thouless transition, and explicitly construct some skyrmion solutions/rational maps (incidentally, DAMTP is a huge hotbed for soliton research). Zee was pretty helpful for the physical interpretations here; in fact, Zee contains the only good exposition I’ve seen of the fractional quantum Hall effect from a field theory standpoint.
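To give a flavour of what that shooting method looks like in practice (this is my own illustrative sketch, not project code): the static $\phi^4$ kink solves $\phi'' = \phi^3 - \phi$ in dimensionless units with $\phi(0) = 0$ and $\phi \to 1$ as $x \to \infty$, and one bisects on the unknown initial slope $\phi'(0)$ until the trajectory neither falls back nor escapes past the vacuum. The exact kink is $\tanh(x/\sqrt{2})$, so the slope should converge to $1/\sqrt{2}$.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Static phi^4 kink in dimensionless units: phi'' = phi^3 - phi,
# with phi(0) = 0 and phi -> 1 as x -> infinity.
def rhs(x, y):
    phi, dphi = y
    return [dphi, phi**3 - phi]

# Stop integrating once the trajectory clearly escapes past the vacuum.
def escaped(x, y):
    return abs(y[0]) - 1.5
escaped.terminal = True

def overshoots(s, L=10.0):
    """Does initial slope s shoot past phi = 1 by 'time' L?"""
    sol = solve_ivp(rhs, (0, L), [0.0, s], events=escaped,
                    rtol=1e-10, atol=1e-12)
    return sol.y[0, -1] > 1.0

# Bisect on the initial slope phi'(0); the exact kink is tanh(x/sqrt(2)),
# so the slope should converge towards 1/sqrt(2) ~ 0.70711.
lo, hi = 0.1, 1.0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    lo, hi = (lo, mid) if overshoots(mid) else (mid, hi)
print(lo)
```

The nice part is the mechanical analogy: slopes below the true value "fail to climb the hill" and fall back, slopes above it run away, so a simple overshoot/undershoot classification is all the bisection needs.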
- Density Functional Theory. I’ve wanted to make a simple, functioning DFT program for at least a year, but didn’t have the time until now. Unlike my previous analytical, Gaussian-orbital-based Hartree-Fock code for helium, I opted for a lattice-based plane-wave basis for periodic systems, for reasons of simplicity and novelty. Among other things, pinning down the physics was my first practical use case for the Obsidian.md note-taking application. The code is still a work in progress, but I’ve managed to get the first significant module, a Poisson equation solver, up and running - with NumPy and matplotlib, of course. Although I was planning to write it in C++ or Julia, NumPy/Python has the huge benefit of letting me prototype a robust first iteration quickly. A major difference from other projects I’ve done (think the chess engine, or the physics engine) is that a proper DFT simulation is quite hard to debug - realistic computations involve matrices with over 10,000 rows and columns, so I essentially have to develop optimisations right from the beginning. Initially, finding useful resources was a huge struggle: the core theory behind DFT is actually very simple, but details on computational implementations were scarce. I did find DFTK.jl through the JuliaCon presentations (seriously, check those out), but the source code is no Stockfish; I found it genuinely hard to read. The ultimate resource was actually the Cornell mini-course lectures here, even though the arguably most important video is missing from the series. Anyway, the next steps will probably be speeding up my Poisson solver so it can afford more basis functions, then implementing a simple LDA functional and a gradient descent solver. The ultimate goal is to compute some thermodynamic properties of an ion lattice ab initio.
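For concreteness, the appeal of the plane-wave basis is that the periodic Poisson equation becomes a one-liner in reciprocal space: $\hat V(G) = 4\pi\hat\rho(G)/|G|^2$ (Gaussian units), with the $G = 0$ mode dropped, which assumes a charge-neutral cell. A minimal sketch of that standard trick (illustrative, not my actual module):

```python
import numpy as np

def poisson_periodic(rho, L):
    """Solve  lap(V) = -4*pi*rho  on a periodic N x N x N box of side L via FFT."""
    N = rho.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)        # reciprocal lattice vectors
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    rho_k = np.fft.fftn(rho)
    V_k = np.zeros_like(rho_k)
    nonzero = k2 > 0
    # G = 0 component dropped: assumes a charge-neutral cell
    V_k[nonzero] = 4 * np.pi * rho_k[nonzero] / k2[nonzero]
    return np.fft.ifftn(V_k).real
```

A single plane wave makes a handy unit test, since the FFT represents it exactly: $\rho = \cos(2\pi x/L)$ should return $V = 4\pi (L/2\pi)^2 \rho$ to machine precision.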
- BFSS matrix theory. This is the paragon of simplicity returning after a gruelling trek through the complex: it turns out that M-theory in a particular background and reference frame is conjectured to be equivalent to a (relatively) simple supersymmetric quantum mechanics system describing D0-branes in 10 dimensions, which I find incredible. I’m using https://arxiv.org/abs/hep-th/0101126 as a “big picture” review and https://arxiv.org/abs/1708.00734 as an in-depth source. Hopefully I’ll be able to delve into the meat of the conjecture without getting bogged down in the endless pushing around of numbers that seems to be prevalent in supergravity computations. For fun I’m also revisiting quantum 3D gravity (first exposure around a year ago).
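To show just how compact the "simple" side of the conjecture is (my paraphrase of the standard presentation; factors of the coupling and index conventions vary between reviews), the bosonic part of the BFSS Hamiltonian is just quantum mechanics of nine $N \times N$ Hermitian matrices $X_i$:

$$
H = \operatorname{Tr}\!\left(\frac{1}{2}\,P_i P_i \;-\; \frac{1}{4}\,[X_i, X_j][X_i, X_j]\right) + \text{fermionic terms}, \qquad i, j = 1, \dots, 9.
$$

The commutator-squared potential is positive (commutators of Hermitian matrices are anti-Hermitian), with flat directions along mutually commuting matrices - which is where the "positions of $N$ D0-branes" interpretation lives.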
- And joining the ranks of “learning the theory behind the numerical techniques without coding them” are Monte Carlo methods for QFT, and lattice gauge theory. The day I actually implement these is the day my personal supercomputer cluster arrives.
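In fairness, the Metropolis kernel at the heart of these methods does fit on a laptop; it’s the realistic lattices that need the cluster. A toy sketch for a free scalar on a 1D periodic lattice (my own illustrative code, nowhere near real lattice QCD):

```python
import numpy as np

# Metropolis sampling for a free scalar on a 1D periodic lattice:
#   S = sum_n [ (phi[n+1] - phi[n])^2 / 2 + m2 * phi[n]^2 / 2 ]
def metropolis_phi2(N=16, m2=1.0, sweeps=6000, burn=1000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    phi = np.zeros(N)
    accepted = 0
    samples = []

    def local_action(n, v):
        # Only the terms of S touching site n change under a single-site update.
        left, right = phi[(n - 1) % N], phi[(n + 1) % N]
        return 0.5 * ((v - left)**2 + (right - v)**2 + m2 * v**2)

    for sweep in range(sweeps):
        for n in range(N):
            trial = phi[n] + step * rng.uniform(-1, 1)
            dS = local_action(n, trial) - local_action(n, phi[n])
            if dS < 0 or rng.random() < np.exp(-dS):
                phi[n] = trial
                accepted += 1
        if sweep >= burn:
            samples.append(np.mean(phi**2))
    return np.mean(samples), accepted / (sweeps * N)
```

The free theory is exactly solvable, so the estimate of $\langle\phi_n^2\rangle$ can be checked against $(1/N)\sum_k \bigl(m^2 + 4\sin^2(\pi k/N)\bigr)^{-1}$ - a useful sanity check before graduating to interacting actions, where the same update rule applies unchanged.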