If you don’t know already, I’ve been playing the keyboard/piano for a long time now, over 10 years. This was a fun little project that I undertook - ‘Take Off’ is, in my opinion, the best-sounding of the Yamaha keyboard demo songs, so I decided to recreate it with a multitrack recording. Now I’m no robot, so a little editing was definitely in order - I went with BandLab, which was really easy to get the hang of. The beat creator was quite cool - I used it to engineer the drums for my recording. Here it is (best heard with headphones on):
I’ve been working on this app - it’s called Mentr, and it’s designed to connect mentees with mentors (teachers, parents, alumni, even seniors) in school, replacing what used to be a cumbersome (and virtually non-existent) process. I used Google’s cross-platform Flutter toolkit (Android Studio is a hulking mess) along with Firebase, which, of course, integrates really well considering they’re both Google products. Although it’s not fully functional yet (quite a few features are mock-ups), I think I’ve managed to carve out a nice style for myself (frankly, I’m quite surprised how it turned out, as I had only a week to belt it out). I even made a voiced-over demo video:
I haven’t added the PDFs in yet, they’ll be here soon.
Maths
I’ll try and understand the proof of why there are no finite dimensional unitary representations of the Lorentz group.
I looked at geodesics on an ellipsoid, the Cauchy problem makes sense and I suppose the closed subgroup theorem is good to be aware of.
Although I didn’t get past the first chapter, the Diagrammatic Calculus of Coxeter and Braid Groups by N. Gowravaram and Uma Roy (MIT PRIMES) looked really beautiful (Dynkin diagrams and Weyl groups are quite straightforward after all, perhaps my aversion to papers back then hindered my progress - I’d still like to look at a few examples, e.g. of $\mathfrak{sl}_n$, but I believe I’ll need the Cartan matrix). To add to this, I found a ‘Graphical Introduction to classical Lie algebras’, which uses category theory too!
I encountered Lipschitz continuity and the cascading inclusion chain (continuous vs. uniformly continuous: roughly, use the same delta for all x).
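In symbols, just so I don’t forget which implication goes which way (Lipschitz hands you the uniform $\delta$ explicitly):

$$|f(x)-f(y)| \le K|x-y| \;\Longrightarrow\; \Big(\forall\varepsilon>0\;\exists\,\delta=\tfrac{\varepsilon}{K}:\ |x-y|<\delta \Rightarrow |f(x)-f(y)|<\varepsilon\Big) \;\Longrightarrow\; \text{continuity at every } x,$$

where the middle statement is uniform continuity precisely because the same $\delta$ works for all $x$.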
Regularisation involved the expansion of the gamma function at the poles, which naturally led to the Weierstrass factorisation theorem and a short proof for the Euler-Mascheroni constant integral.
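The two formulas, written out once so I stop re-deriving them:

$$\frac{1}{\Gamma(z)} = z\,e^{\gamma z}\prod_{n=1}^{\infty}\left(1+\frac{z}{n}\right)e^{-z/n}, \qquad \gamma = -\Gamma'(1) = -\int_0^\infty e^{-t}\ln t\,dt,$$

and the integral follows from logarithmically differentiating the product at $z=1$.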
I looked at Banach’s fixed point theorem applied to cosine and polynomials in general.
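A quick numerical sketch (the names are my own): cosine maps $[-1,1]$ into itself and $|\sin x|\le\sin 1<1$ there, so Banach’s theorem guarantees the iteration contracts to the unique fixed point, the Dottie number.

```python
import math

def banach_iterate(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x -> f(x) until successive values agree to within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# cos maps [-1, 1] into itself and |d/dx cos x| = |sin x| <= sin(1) < 1 there,
# so any starting point converges to the unique fixed point (~0.739085).
print(banach_iterate(math.cos, 0.0))
```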
Now I have a good example of a bundle section - vector fields and the tangent bundle (it just looks like an inclusion map backwards).
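In symbols, the example that made it click:

$$X : M \to TM, \qquad \pi\circ X = \mathrm{id}_M,$$

i.e. a vector field is a section of the tangent bundle - it chooses one vector $X(p)\in T_pM$ over each point, a right inverse of the projection, which is exactly the ‘inclusion map backwards’ picture.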
I also found a great paper covering rings and modules (the obvious next step), and my quest to understand the Calabi-Yau manifold led me to a summary of differential geometry.
The homotopy theory on wiki is actually a pretty good summary of cofibrations and related concepts too.
There was a useful Warwick PDF on elementary p-adic results (I’m quite comfortable with them now), as well as Keith Conrad’s (of course!) lecture notes on p-adic infinite series (which I haven’t got round to reading yet).
There’s actually been a full-cycle algorithm for the 3x3 Rubik’s cube, speaking of which, I’d like to understand Thistlethwaite’s algorithm.
The Peter-Weyl theorem looks like a useful sledgehammer (indeed, I alluded to it in my first Maths SE answer regarding the completeness of the spherical harmonics - although I am yet to see the Stone-Weierstrass proof).
Retractions and deformation retractions in topology make sense now (and their use in Brouwer’s fixed point theorem).
I tried to find something interesting on $\text{O}(\infty)$ or $\text{D}_\infty$, but it’s not quite clear.
Box vs product topology looks a lot like direct sum vs product to me (I’m guessing they’re categorical limits/colimits resp. somehow).
How do different free groups (and different vector spaces) differ among themselves?
As usual, I forgot the difference between Hilbert spaces, metric spaces and Banach spaces. I also found something called ‘Finite element exterior calculus’ which looks wildly interesting, but seems quite difficult.
I liked Gauss’ heptadecagon constructibility proof with the Gaussian periods (now I see why Fermat primes work - because the roots ‘halve all the way down’) - how do I find the general $\cos(2\pi/n)$?
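As a sanity check on the first ‘halving’, here’s a small numerical sketch of my own for $p=17$: the two length-8 Gaussian periods (residues vs. non-residues) should be the roots of $x^2+x-4=0$.

```python
import cmath

p = 17
zeta = cmath.exp(2j * cmath.pi / p)

# Quadratic residues mod 17 and their complement: the first "halving"
residues = sorted({(k * k) % p for k in range(1, p)})
non_residues = [k for k in range(1, p) if k not in residues]

eta0 = sum(zeta ** r for r in residues)
eta1 = sum(zeta ** n for n in non_residues)

# The two length-8 Gaussian periods are roots of x^2 + x - 4 = 0,
# so their sum is -1 and their product is -4 (up to floating-point noise).
print(eta0 + eta1)   # ~ -1
print(eta0 * eta1)   # ~ -4
```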
I looked at distributions on nLab (+ bump functions, which emerge as the Fourier transform of continuous functions -> Schwartz space. That’s why fields go to zero at infinity.), as well as symmetric group representations - Young tableaux find use here! - and I grokked compact spaces once and for all.
The Nielsen-Schreier theorem is cool and all, but I feel like it has some elegant applications which I can’t seem to find.
I missed this - L’Hopital’s rule requires that g’(x) be nonzero everywhere on the open interval containing c (except possibly at c itself). On the whole, analysis seems forbidding because of its rigor and many definitions, but it’s extremely versatile, and can be used in a myriad of ways, even in the most unexpected of places.
The FEEC paper inspired me to read about Galerkin methods - the best resource was a UToronto lecture. In fact, this could pave the way for something new (and something that I’ve been getting at for ages) - functional analysis. It’s slightly more advanced than regular analysis, but I feel it’s more memorable and practical - I looked at Lp completeness, weak derivatives, and Hilbert spaces again. Robin vs. mixed boundary conditions is pretty simple.
It’s important to remember that diffeomorphism is weaker than Lie isomorphism (obviously, but sometimes it makes a difference).
The subset topology (and hence ‘continuity of inclusion’) makes sense now.
The $W_{1,2}$ spaces I was puzzled about earlier are actually just Sobolev spaces.
I found a great PDF as well as a website (with code) on the Galerkin method (which really mirrors my chemistry Hartree-Fock program in the sense of ‘basis-pursuit’).
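To check I’d actually absorbed the idea, I wrote a minimal 1D sketch of my own (not from either source): hat functions for $-u''=f$ on $(0,1)$ with zero boundary values, which reduces to solving a tridiagonal system.

```python
import numpy as np

# Minimal Galerkin / linear finite element sketch for -u'' = f on (0, 1),
# u(0) = u(1) = 0. With f = pi^2 sin(pi x) the exact solution is sin(pi x).
n = 50                      # interior nodes
h = 1.0 / (n + 1)           # mesh width
x = np.linspace(h, 1 - h, n)

f = lambda x: np.pi ** 2 * np.sin(np.pi * x)

# Stiffness matrix: integrals of phi_i' phi_j' give (1/h) * tridiag(-1, 2, -1)
A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h

# Load vector: integral of f * phi_i, approximated here as f(x_i) * h
b = f(x) * h

u = np.linalg.solve(A, b)
print(np.max(np.abs(u - np.sin(np.pi * x))))   # small discretisation error
```

The ‘project onto a finite basis, then solve a linear system’ pattern is exactly what I meant by the Hartree-Fock parallel.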
The FEEC paper is a little clearer than the presentation (which was a little too concise).
I checked out the fundamental theorem of abelian groups (and how it’s different from the generalised Chinese remainder theorem).
Also, chain complexes are like a generalisation of exact sequences. Even de Rham cohomology is starting to make sense (especially in the context of differential forms - Carroll was a tad hasty in his exposition in the GR book) - but first I had to revise differential forms, and then simplicial complexes from none other than Jeremy Kun.
The norm topology is obvious in hindsight, but again, it’s something that I often take for granted.
UChicago has a pretty good introduction to schemes, but I’m not that interested in algebraic geometry anymore (much like number theory, it’s too abstract).
Integration of k-forms is confusing - I understand the pullback, but how come the wedge product vanishes under the integral?
I’m beginning to view the Fourier transform as a Schauder basis now. Also, the Baire Category Theorem has changed my perspective on ‘nowhere dense’ sets.
Linear Algebra and Category Theory
‘Thirty-Three Miniatures in Linear Algebra’ by Matoušek is exactly the kind of thing I was looking for - a short set of useful tricks that could prove handy in problem solving.
The determinant can be calculated from the trace (I hadn’t seen the relation with Newton’s symmetric polynomials).
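A little sketch of my own via Newton’s identities (the traces of powers are the power sums of the eigenvalues, and $\det A = e_n$):

```python
import numpy as np

def det_from_traces(A):
    """Determinant from traces of powers via Newton's identities:
    p_k = tr(A^k), and k * e_k = sum_{i=1..k} (-1)^(i-1) * e_{k-i} * p_i,
    with det(A) = e_n."""
    n = A.shape[0]
    p, M = [], np.eye(n)
    for _ in range(n):
        M = M @ A
        p.append(np.trace(M))
    e = [1.0]
    for k in range(1, n + 1):
        e.append(sum((-1) ** (i - 1) * e[k - i] * p[i - 1]
                     for i in range(1, k + 1)) / k)
    return e[n]

A = np.random.rand(5, 5)
print(det_from_traces(A), np.linalg.det(A))   # should agree
```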
The one-by-one substitution proof that all bases of a vector space have the same cardinality was cool - it’s something that I usually just take for granted.
I’ll try to formalise my knowledge of linear algebra (‘the only thing mathematicians really understand’!) - so I looked at the critical distinction between Hamel vs Schauder bases (how can the Baire Category Theorem be used to prove that Hamel bases are finite or uncountable for Banach spaces? Also, how does the axiom of choice come into play here?), and the incompleteness of the space of polynomials.
I looked at cokernels as the measure of failure of surjectivity and the constraints on a linear equation -> zero morphisms, universal properties (again, it seems to be a catch-all term)
Physics
I looked at some analytical methods for finding the Klein-Gordon propagator on SE - I wonder if there’s an easier method for the light-cone integral?
I started regularisation and renormalisation (and importantly, their difference, aided by an SE answer. I thought that all it took was a ‘hard cutoff’, but that breaks almost all the symmetries) in Schwartz with Appendix B and Chapter 16 (in this regard it’s similar to, but more involved than, P&S chapters 6 and 7). I find it to be peppered with some cool motivating examples (Casimir effect, the Lamb shift especially, bringing me back to 2S1/2-2P1/2 splitting with first-order perturbation theory!) which puts the theory into perspective nicely.
I’m guessing that Schwartz’ dLIPS is just the invariant four-volume element?
Also I was correct, Schwinger and Feynman parameters are equivalent. Two annoying things that I just cleared up: Lorentz invariance vs. covariance and dimensions of the metric tensor (convention).
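Writing the two out, with the substitution that shows the equivalence:

$$\frac{1}{A} = \int_0^\infty e^{-sA}\,ds, \qquad \frac{1}{AB} = \int_0^\infty\!\!\int_0^\infty e^{-sA-tB}\,ds\,dt = \int_0^1 dx\int_0^\infty \lambda\,e^{-\lambda[xA+(1-x)B]}\,d\lambda = \int_0^1\frac{dx}{[xA+(1-x)B]^2},$$

where the middle step is the substitution $s=x\lambda$, $t=(1-x)\lambda$ (Jacobian $\lambda$).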
I continued with symplectic geometry for Hamiltonian mechanics (classical, mind you. I want to be able to look at Hamiltonian mechanics with the Poisson algebra + symplectic manifold formalism in a classical context, which also finds use in my newfound hobby, robotics!). So, I read a bit about symplectic manifolds and their relation to phase spaces (this part isn’t too strong), the interior product (when you expand the exterior algebra equivalence class, it’s a bit like applying the product to the first term only)
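For reference, the statement I keep coming back to (sign conventions differ between books):

$$\iota_{X_H}\omega = dH, \qquad \omega = \sum_i dq^i\wedge dp_i \;\Longrightarrow\; \dot q^i = \frac{\partial H}{\partial p_i},\quad \dot p_i = -\frac{\partial H}{\partial q^i},$$

so Hamilton’s equations are just ‘contract the symplectic form with the Hamiltonian vector field’.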
I couldn’t resist getting a taste of IR divergences (looks similar to the one in P&S), Yang-Mills theory and QCD Feynman diagrams - it seems to mirror QED, just with some color generators attached; consequently, the pace is rather fast. To complement Schwartz, I need to find a good overview on practical calculations in QED and QCD (Schwartz often leaves us ‘hanging’ as to what to do with the result).
John Baez actually has lecture notes on Hamiltonian mechanics - unfortunately the later ones haven’t been transcribed to LaTeX yet. MIT Open Courseware might be really handy for this.
Again, I’m continuing with formal QFT, so I read about n-point functions, the Wightman axioms, Wick rotations (which just ‘wrap’ to periodic Euclidean spacetime), operator-valued distributions, adiabatic switching (this was there in QFTGA). Fock spaces are just the tensor algebra of single-particle states, as I intuited correctly. Although I didn’t delve too deeply into Wick algebras (they are hardly mentioned outside nLab), the equivalence between matrix and wave mechanics is just due to them being different representations of the same underlying algebra.
Interstellar encouraged me to revisit General Relativity - wormholes and time dilation seem like a good place to start.
The Casimir negative energy sounded suspicious, and it is (negative relative to the zero-point energy).
I revised some degenerate perturbation theory (I was looking for something formulated in a more matrix-y way) -> quadratic and linear Stark effects, presented by Richard Fitzpatrick were great
Rigged Hilbert spaces just look like they add Dirac orthonormality
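Roughly the picture I have in mind (a sketch, not the precise nuclear-space statement): a Gelfand triple

$$\Phi \subset \mathcal{H} \subset \Phi^{*}, \qquad \langle x\,|\,x'\rangle = \delta(x-x'),$$

with $\Phi$ a dense, well-behaved subspace (think Schwartz space) and the ‘generalised eigenvectors’ $|x\rangle$ living in $\Phi^{*}$ rather than in $\mathcal{H}$.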
ER=EPR sounds interesting, speaking of which, Lenny Susskind’s lectures could complement David Tong’s lecture notes very nicely.
Now I know what Kähler manifolds are (it’s hard to keep track of all these definitions, but each one always seems to satisfy about 3 properties) and states on C*-algebras. Unfortunately, I’ve found out that while Algebraic QFT may be mathematically rewarding, it was formulated at a time when the physical underpinnings weren’t entirely fleshed out, and consequently, it fails to describe several phenomena that were found later. I credit it with introducing me to functional analysis though.
I read a bit about the weak interaction in Schwartz (I was under the impression that it would be taught before QCD, but it seems quite challenging).
I also finished the first lecture in ‘Mastering Quantum Mechanics’ by Prof. Barton Zwiebach, MIT. Nothing too difficult, it seems to mirror Griffiths Chapter 1.
Computer Science, Robotics, App Dev
I finished outlining the booking and payment screens for the app, creating it won’t be easy though.
gieseanw.wordpress.com is a fantastic blog - he covers expression templates in C++ vs. the new coroutines (now I’m excited for them); double dispatch, forward declarations - I hadn’t seen a C++ blogger before this.
Julia actually has SFML bindings, now I’m leaning towards it (it is garbage collected, but it shouldn’t matter too much).
I even browsed through some of Jeremy Kun’s other topics: simulating a fair coin with a biased one, hybrid images, a 3D Game of Life.
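For the coin one, the von Neumann trick is the version I remember (this sketch is mine, not necessarily the construction in the post): flip twice, keep HT/TH, discard the rest.

```python
import random

def biased_coin(p=0.3):
    """A coin that lands heads (True) with probability p."""
    return random.random() < p

def fair_from_biased(p=0.3):
    """Von Neumann's trick: HT and TH are equally likely (p(1-p) each),
    so keep only those outcomes and discard HH / TT."""
    while True:
        a, b = biased_coin(p), biased_coin(p)
        if a != b:
            return a

flips = [fair_from_biased() for _ in range(100_000)]
print(sum(flips) / len(flips))   # ~0.5 regardless of the bias p
```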
I managed to get the client-side prototype working in Flutter, just a bit of UI - I think I’ve developed a slick minimalistic style (perhaps not the most appropriate for milk delivery). I used animations and collapsible appbars for the first time - Flutter makes the workflow super smooth. I found a useful date picker by Syncfusion, but they need licensing so it might not make the final cut. In fact, I’d originally over-engineered it a little, so I improved things on the ease-of-use front. Overall it was a thorough review of Dart and Flutter.
Zero-knowledge proofs were quite interesting (I think I’ve seen them before, albeit in a different form - discrete log or blockchain).
I also know what mipmaps are now (they avoid the Moiré effect too!). For some reason, I decided to read a bit more about computer graphics -> back-face culling, alpha compositing, normal mapping, gamma correction, z-buffers, then onto shading models - Lambert and Phong (a review by Cornell).
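To pin the shading models down, here’s a toy Lambert + Phong evaluation (my own illustrative function, not from the Cornell review):

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def shade(normal, light_dir, view_dir, base_color,
          ambient=0.1, k_d=0.8, k_s=0.5, shininess=32):
    """Lambert (diffuse) + Phong (specular) shading at a single point.
    All direction vectors point away from the surface."""
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diffuse = max(np.dot(n, l), 0.0)                # Lambert's cosine law
    r = 2 * np.dot(n, l) * n - l                    # reflect light direction about n
    specular = max(np.dot(r, v), 0.0) ** shininess  # Phong highlight
    return np.clip(base_color * (ambient + k_d * diffuse) + k_s * specular, 0, 1)

print(shade(np.array([0, 0, 1.0]), np.array([1, 1, 1.0]),
            np.array([0, 0, 1.0]), np.array([0.8, 0.2, 0.2])))
```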
An important point on the discrete log problem - the brute-force search seems O(n) in the group order, but that’s exponential in the bit length of n, which is the size the actual computation (element exponentiation) works in.
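To make that concrete (illustrative code, my own naming) - the loop below is linear in the group order, which is still exponential in the number of bits:

```python
def discrete_log(g, h, p):
    """Brute-force discrete log: find x with g^x = h (mod p).
    O(p) group operations, i.e. exponential in the bit length of p."""
    x, cur = 0, 1
    while cur != h:
        cur = (cur * g) % p
        x += 1
        if x > p:
            raise ValueError("no solution")
    return x

print(discrete_log(5, 41, 73))   # 4, since pow(5, 4, 73) == 41
```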
The MIT presentation on forward and inverse kinematics (which, turns out, is just a frame choice - effector vs base) often referenced ‘grasping’, so I’ll take a look at that (I believe it’ll be more physics-oriented, using, e.g. friction). Anyway, a UMass PDF cleared some things up on linearization of control (it’s just like a physicist’s linearization; I don’t know what I was anticipating).
Much in the spirit of Andrew Gibiansky’s posts, Jeremy Kun also has a set on KNN for digit classification (his repertoire is astonishingly comprehensive)
I’m trying to learn the meaning behind the expectation-maximisation algorithm - I’m also going to be looking at interesting machine learning algorithms, e.g. KNN for local outlier detection, and hopefully something related to structured data (even for NLP - reminds me of Chomsky’s hierarchy, which amazingly implies that regex cannot be used to parse HTML)
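Here’s a toy EM sketch for a two-component 1D Gaussian mixture - the example that made the E-step/M-step alternation click for me (all names illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: two 1D Gaussian clusters
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])

# Initial guesses for means, standard deviations and mixing weights
mu, sigma, pi = np.array([-1.0, 1.0]), np.array([1.0, 1.0]), np.array([0.5, 0.5])

def gauss(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(100):
    # E-step: posterior responsibility of each component for each point
    resp = pi * gauss(data[:, None], mu, sigma)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters with responsibility-weighted averages
    n_k = resp.sum(axis=0)
    mu = (resp * data[:, None]).sum(axis=0) / n_k
    sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / n_k)
    pi = n_k / len(data)

print(mu, sigma, pi)   # roughly (-2, 3), (1, 0.5), (0.6, 0.4)
```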
Of course Keith Conrad has to make a cameo here - I enjoyed his proof of the simplicity of PSL_n(F) using Iwasawa’s criterion
Prof. Steven H. Simon’s (Oxford) ‘Lecture Notes on Solid State Physics’ is an excellent, humorous introduction to a subject which I never thought I’d enjoy - solid state physics (in fact I used to have a Gell-Mann-like attitude that the only type of physics worth doing is particle physics!). It builds on the bits that I did enjoy in Griffiths Ch.5 - and even the friendliness of QFTGA pales in comparison to this - it could do with a few more real-life examples though. On a related note, I’ve heard that QFT somehow arises out of graphene physics, that could be interesting.
There are actually so many good papers that I’ve got on my phone - unfortunately, the file organisation system isn’t very ergonomic at all. One of the greats is Claire Frechette’s introduction to dessins d’enfants - originally I was mystified as to why there was so much graph embedding and combinatorics, but the final build-up to algebraic geometry showed me just how diverse dessins are.
Jonatan Lindell’s ‘Profinite Groups and Infinite Galois Extensions’ is a very terse, compact account, building up right from the axioms of set theory. It should serve as my go-to reference for such matters, but it’s not the most pedagogical of tools - I for one don’t vacuum in lemmas and suddenly start speaking Galois.
An unnamed PDF (possibly a section from David Tong’s classical mechanics notes) on Green’s functions for PDEs was a great survey on Fourier transforms, fundamental solutions and Green’s identities
Columbia’s GR Lecture Notes look very good from a mathematical perspective, but I prefer the physics-oriented style, like David Tong’s.
Algebra and Analysis
I revised a bit of Riemannian geometry - fairly vanilla at this point, except the Lie bracket of vector fields which I had forgotten (or misunderstood, having not been thorough in vector vs vector field).
I looked at MIT’s notes on conformal mapping - useful for GR, which I’ll be restarting.
I rounded off my Galois theory research with some specific inverse Galois problems, general methods for finding Galois groups (an excellent compilation on SE), symmetry reduction criteria (with Keith Conrad’s S_n and A_n - still not sure how to factor polynomials modulo p) and explicit bases. I wasn’t aware of cycle types, but I see now that they’re useful in conjugacy classes (looks like a partition problem to me?). Also, a site called GroupNames is a great summary of the properties and relations of small groups.
I also revised some complex analysis - holomorphic functions, an excellent proof of their analyticity, Parseval’s theorem, harmonic functions, the Weierstrass M-test as the analogue of the direct comparison test.
sporadic.stanford.edu has a fairly good site on Fourier transforms of groups, and I found a great overview of p-adic analysis by Cambridge.
rreusser.github.io’s visualisation of the sphere eversion was beautiful!
I was introduced to zeta functions of linear operators (I believe in the mathematical context of causal structure?) so naturally I was led to Green’s function methods for differential operators (as I am returning to physics), and functional analysis is probably a good mix between them - I can start with the functional trace and determinant.
In my research on operator zeta functions, I came across the nLab page which, contrary to my previous experience (see 22 July), I found quite enlightening!
I suppose it’s because I’m more familiar with category theory and topology now, so I decided to explore related topics - [moved to the Physics section]
*-algebras are SO straightforward now (four months ago when I’d searched this, I hardly knew what a ring was, much less an algebra)
Emily Riehl’s article on configuration spaces and cohomology groups looks really interesting, I’d love to understand it properly.
Brilliant’s page on analytic continuation was illuminating: I can now easily visualise it, after reading on essential singularities, removable singularities, poles and zeroes. Descartes’ rule of signs was quite unexpected.
Representations and isomorphisms of the p-adic groups should be next on my list - it should help in clearing things up.
The Gaussian trick for the volume of the n-ball was a really neat find.
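The trick, written out:

$$\pi^{n/2} = \left(\int_{-\infty}^{\infty} e^{-x^2}dx\right)^{\!n} = \int_{\mathbb{R}^n} e^{-|x|^2}\,d^nx = S_{n-1}\int_0^\infty e^{-r^2}r^{n-1}\,dr = S_{n-1}\cdot\tfrac{1}{2}\Gamma\!\left(\tfrac{n}{2}\right),$$

so $S_{n-1} = 2\pi^{n/2}/\Gamma(n/2)$ and the unit $n$-ball has volume $V_n = S_{n-1}/n = \pi^{n/2}/\Gamma(n/2+1)$.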
sporadic.stanford.edu has a great paper on Hecke algebras (my motivation was the modularity theorem) - I only read the first few pages where they introduced Coxeter groups - my first thought was that they look like generalised braid groups, at least in the group presentation.
I read an introduction to infinite-dimensional Lie algebras (ties in with vertex algebras), but I think the Witt algebra is a prerequisite.
Continuing with complex analysis, I found the unbelievable Picard’s theorems, Cauchy’s integral formula and Liouville’s theorem too - using which jeremykun had a great proof of the Fundamental Theorem of Algebra (finally!)
Lifting maps to covering spaces was so obvious in hindsight, just compose with the covering map.
The uniformization theorem was good to know (no proof) - I realise I’d seen it before on math3ma.
To compose Mobius transforms, multiply the matrices (this is like a reverse representation).
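A quick check (my own toy code):

```python
import numpy as np

def mobius(M, z):
    """Apply the Mobius transform represented by the 2x2 matrix M to z."""
    (a, b), (c, d) = M
    return (a * z + b) / (c * z + d)

M1 = np.array([[1, 2], [3, 4]], dtype=complex)
M2 = np.array([[0, 1], [-1, 2]], dtype=complex)
z = 1.5 + 0.5j

# Composing the transforms = multiplying the matrices
print(mobius(M1, mobius(M2, z)))
print(mobius(M1 @ M2, z))   # same value
```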
The Fat Cantor set was a great pathological example (I had just arrived from ‘How to be Small’ on math3ma) - it came from the same source as the long line (‘A Few of my Favourite Spaces’).
Another great application of 2-adics: Monsky’s theorem.
I’d forgotten that nilpotent groups had a terminating subnormal series - but why are they useful? Also, what’s the $W_{1, 2}$ condition (some kind of generalised C_1).
I guessed right - there’s no formalisation of proper classes in the ZF axioms
Category Theory and Linear Algebra
I had to relook at math3ma’s adjunctions series - I’m more interested in them after hearing the maxim, ‘adjunctions are everywhere’, and the free-forgetful adjunctions. I think a good exercise would be to take a look at the ‘big’ categories - Grp, Ring, Top, etc., and functors, adjunctions, nats, etc. between them - I think the big picture will also be useful in progressing to 2-categories.
Tai-Danae Bradley’s book, “What is Applied Category Theory?” is quite good, it really picks up in the later chapters - it goes hand in hand with ‘Seven Sketches in Compositionality’ (which she recommends), which could serve as the QFTGA of category theory for me. I also looked at Grp and Functor categories.
The illustrious Emily Riehl has another great paper called ‘Infinity Categories from Scratch’ - I’ll read it when I return to category theory.
I hadn’t realised the connection between SVD and the Moore-Penrose pseudoinverse - John Cook had a great post on that, as well as on surprises in numerical linear algebra (I was mostly aware of these).
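The connection in a few lines of numpy (a sketch; assumes no singular value is effectively zero):

```python
import numpy as np

A = np.random.rand(5, 3)

# Moore-Penrose pseudoinverse from the SVD: invert the nonzero singular values
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

print(np.allclose(A_pinv, np.linalg.pinv(A)))   # True
```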
Apparently only normal matrices are unitarily diagonalisable (I’m not sure whether Griffiths mentions this at the end)
Physics
I was wondering what Noether’s theorem does to gauge fields (turns out you just make $\alpha(x)$ constant and treat it as global).
I was mulling over whether to look at the use of structure constants in non-Abelian gauge theories, but I’ll cross that bridge when I come to it.
WHY do Feynman parameters work? Also, why does the c-limit exist at all? - the usual rapidity argument is ridiculous, it just shifts the blame to the arctanh(v/c) equation.
On nlab, I read about the Feynman propagator (this one really had a wealth of ideas), Mellin transform for the Schwinger parameterisation (I believe that’s equivalent to Feynman parameters, but I can’t remember how), causal structure, hyperbolic differential operators, advanced and retarded Green’s functions and the rather difficult Hadamard distribution.
I wasn’t able to find an example for a null basis in Minkowski space - I’d seen it on the causal structure Wiki page, which served as a great refresher for the kinds of paths and cones (I recall skipping that in Carroll) - although I still didn’t understand the term ‘chronological future relative to T’.
I really want to learn the principal U(1) bundle formulation of electrodynamics (I think it’s level four in difficulty - with geometric algebra at 3).
Speaking of causal structure, I’d like to see some common conformal maps for solving problems in GR (I actually found one, by UFJF).
So the Wightman propagator is to normal ordering what the Feynman propagator is to time ordering, right?
I finally found a decent treatment of line bundle connections on phenomenologica.com - while it only hinted at the relation between the two, it was interesting nonetheless.
I’m drifting into PDE theory now, with the Green’s function of the d’Alembertian.
Nakahara’s book, ‘Geometry, Topology and Physics’ looks really solid - not even in the context of physics, but because physics tends to be pedagogically more straightforward, so I’ll use it for homology instead :)
David Tong’s lecture notes on kinetic theory (?) were a great tool for linear response functions (this errs on the side of engineering, I feel).
Because I am absolutely in love with quotient spaces, I really want to see a description of gauge invariance as a modulo.
The moving magnet and conductor problem was a good exercise in special relativity.
Does the Levi-Civita connection form some sort of pseudo-Lie algebra? [Nihar from the future: what]
I needed some reminding of Weyl spinors and Noether’s theorem worked out (nothing David Tong’s lecture notes couldn’t handle)
The difference between propagators, Green’s functions and kernels is fairly clear now.
I suppose Cartan’s theorem on closed subgroups will come in handy for the Lorentz group subgroups.
I was trying to link graph theory and Feynman diagrams, nothing conclusive though.
I took a look at the screened Poisson equation - the 1/(r-r’) trick always seems to work. Also I still don’t get why the Yukawa potential dies off (it looks infinite to me)
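Writing the screened Green’s function out next to the Coulomb one makes the screening explicit:

$$(-\nabla^2 + m^2)\,G(\mathbf{r}) = \delta^{3}(\mathbf{r}) \;\Longrightarrow\; G(\mathbf{r}) = \int\frac{d^3k}{(2\pi)^3}\,\frac{e^{i\mathbf{k}\cdot\mathbf{r}}}{k^2+m^2} = \frac{e^{-mr}}{4\pi r},$$

which reduces to $1/4\pi r$ as $m\to 0$; the potential only blows up at $r\to 0$, while at large $r$ the exponential beats every power of $r$ - that’s the sense in which the Yukawa potential dies off.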
The photon sphere was a great application of the Schwarzschild metric -> (in Kerr vs. Reissner-Nordstrom, I think I’ll tackle the former first, it seems more interesting despite its difficulty).
The Geodesics wiki page is rife with different interpretations, including very exotic ones regarding the ‘double tangent bundle’ and ‘geodesic spray’ (no idea what they do or anything).
I saw Dirac’s belt trick in action!
Today I returned to Schwartz QFT again. Best decision ever - it trumps P&S in almost all regards. That’s why I read almost 200 pages in a single day. I had previously avoided this book since chapters 5 and 6 looked forbidding and unmotivated (maybe they are), but having done it elsewhere, I picked up from chapter 7 and realised just what I was missing out on. He emphasises group theory and Lorentz invariance as a means of postulating Lagrangians and elegantly attacking equations. He works through the calculations as much as P&S, and the calculations in chapter 13 may not be as detailed, but they are still solid - he also motivates spinors far, far better than P&S. He brings in the Ward identity early, and uses gauge invariance to good effect too. I also liked how he used invertibility for linear operators (I’m going to continue functional analysis because of him) and showed several methods to prove the spin-statistics theorem (at least for spins $0, \frac12$ and $1$), along with a stress on building up from phi-fourth to phi-third to scalar QED to spinor QED for new ideas (this really helped me). The group theory could have been a bit more pronounced, but that’s just my John Baez-influenced opinion right there.
Computer Science and Robotics
Technically Scientific Computing
Since my physics engine is pretty much complete (feature-wise), I had a little fun setting up some scenes - ramps, motors, a paddle, which was my first use of combined rigidbodies (it worked the first time, surprisingly), even some user look-around. All I need to do is haul in the distance constraints - although this’ll require some reworking since in its current form, ‘constraints’ is a cobbled-together UUID-indexed dict. Once I’m done, though, I’ll have a physics engine supporting multi-body, springs, strings, pendulums, ramps, collisions and motors - awesome. I used cProfile to churn out some optimisations (AABB and local-to-global transforms were taking a long time - again, I’d love to port this to C++), and Pyglet has a cool FPS visualiser.
The Robotics SE is surprisingly good - looking at the top questions might help. I researched a bit about inverse kinematics.
I managed to find a great overview of robotic control and dynamics - it alluded to the SE(3) group often, which was quite interesting
I thought a good example of linearization control would be the unicycle dynamical system, but I couldn’t find a good treatment of it (hopefully it includes the Udwadia-Kalaba method too)
I looked at numerical methods to solve PDEs, but engineering is SO BORING (disclaimer: this opinion is subject to change) - I might as well find out the difference between finite difference, element and volume (none of the sources so far have been useful)
In my search for ‘warm starting’ in physics engines, I came across an eponymous technique in Bayesian optimisation!
Jeremy Kun had two outstanding posts on cryptanalysis and word segmentation on n-grams - I knew that a naive Bayes classifier was in order!