The Good Stuff (31/8/20 - 6/9/20)

Number Theory and Algebraic Geometry

  • I found a short article on lattices, by John Baez (importantly, it goes into the non-elliptic curve formulation of the j-invariant through the eta function).
  • St. Mary’s College, California has an enlightening paper on the Lindemann-Weierstrass Theorem
  • I didn’t see a method to calculate the q-series of the j-invariant without a deep detour into L-functions (apparently this is one of Noam Elkies’ areas of expertise). One method involved the Hardy-Littlewood circle method (which connects back with the asymptotic analysis of the partition function!), which I really want to learn about.
  • Liouville’s numbers as a ‘too close’ approximation by rationals is something I’d never even considered.
  • I took a look at the Feit-Thompson theorem and p-adic numbers (only got the p-adic integers; need to revisit)
  • I took a peek at the modularity theorem (not the proof, of course!) but it looks quite difficult.
  • Stanford’s cryptography website is a great resource for finite group and field theory.
  • I came across Grothendieck’s ‘dessins d’enfants’ - although they seem very complicated, I love graph representations of maths and physics (Feynman, Dynkin and now this!) $\rightarrow$ I came across what might be the most intriguing blog of all time: neverendingbooks - primarily focused on algebraic geometry and algebraic group theory (like the stacks project, by Columbia, again!), it has a whopping 129 pages of posts! I also found a promising, elementary approach to dessins, by Pierre Guillot.
  • Inspired by the circle method, I did a bit more number theory - the Euler function, eta functions, the (rather surprising) pentagonal number theorem, the q-shifted factorial and the modular discriminant - but it seemed way too abstract for my taste: aside from cryptography, I didn’t really find any concrete applications.
  • Ring $\rightarrow$ Integral domain $\rightarrow$ UFD $\rightarrow$ Field was explained by a Brilliant guest contribution (some are better than the wikis).
  • G-modules and R-modules seemed strikingly different to me.
  • I did find an accessible proof of Dirichlet’s theorem, but it was quite difficult.
  • The Eisenstein series are a nice, simple example of modular forms, and I read a little about the Rogers-Ramanujan identities, but I’ll leave number theory aside for now since it’s too abstract.
  • I HAVE to find a good source on Pontryagin duality - it seems to me like another one of those ‘beautiful mathematical objects’ that are really worth exploring to uncover new topics.
  • Although neverendingbooks looks really interesting, it’s too advanced for me currently, and I haven’t gone through algebraic geometry properly yet.
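The pentagonal number theorem from the bullets above is one of the few things here that’s trivial to check by computer - a minimal sketch in plain Python (no libraries), comparing power-series coefficients of $\prod_{n\ge1}(1-q^n)$ against $\sum_k (-1)^k q^{k(3k-1)/2}$ up to a truncation order:

```python
# Numerically verify the pentagonal number theorem:
#   prod_{n>=1} (1 - q^n) = sum_k (-1)^k q^{k(3k-1)/2}
# by comparing coefficients up to order N.

N = 50  # truncation order

# Left side: expand the product as a polynomial mod q^(N+1).
prod = [0] * (N + 1)
prod[0] = 1
for n in range(1, N + 1):
    new = prod[:]                   # multiply the running product by (1 - q^n)
    for i in range(N + 1 - n):
        new[i + n] -= prod[i]
    prod = new

# Right side: generalized pentagonal numbers k(3k-1)/2 for k = 0, 1, -1, 2, -2, ...
series = [0] * (N + 1)
k = 0
while True:
    done = True
    for kk in ((k, -k) if k else (0,)):
        e = kk * (3 * kk - 1) // 2
        if e <= N:
            series[e] += (-1) ** abs(kk)
            done = False
    if done:
        break
    k += 1

assert prod == series
print(prod[:10])  # [1, -1, -1, 0, 0, 1, 0, 1, 0, 0]
```

The sparse coefficient pattern (nonzero only at 0, 1, 2, 5, 7, 12, 15, …) is exactly what makes the theorem "rather surprising".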

Group Theory and Abstract Algebra

  • The Wiki page on projective linear groups is not the most intuitive, so I’ll need a better source.
  • I’d like to see some concrete examples of octonion automorphisms (crucial for G2) and class functions.
  • Wiki has a clear set of properties relating the centraliser and the normaliser.
  • Another TWFIMP on symmetries of Dynkin diagrams (personally, I find it very cool that a simple root diagram encodes stuff about representations and automorphisms) - [Nihar from the future: I wouldn’t take that too seriously].
  • Of course I’m going to be digging into SO(8) and Spin(8) (the universal cover pancake diagram is overkill, to be honest).
  • Incredibly, Michael Penn decided to give an introduction to vertex operator algebras, related to his PhD thesis, along with associative + Lie = Poisson algebras. Since he talked about quotients of the tensor algebras by ideals, I suppose that’s where I’ll start in learning about: universal enveloping algebras, symmetric algebras and Clifford algebras (and hence Spin) (it’s amazing how everything fits in together so perfectly!).
  • I’m also eager to see how Poisson algebras arise here and how they’re related to quantisation.
  • Nimbers form a field of char 2!
  • “The Unapologetic Mathematician” looks like a promising source on abstract algebra.
  • I also found an extremely compact paper (hardly 131 pages) on group theory for QM - similar to the Harvard paper, but that one was more focused on representation theory - I’ll be taking a break from semisimple Lie theory anyway.
  • Cayley’s theorem seems blatantly obvious.
  • I couldn’t resist rereading triality, $\text{SO} \rightarrow \text{Spin}$ (because John Baez’s paper on octonions makes some unmotivated jumps) $\rightarrow$ no mention of $\text{Spin}^c$ groups anywhere? Probably the best route to understanding this business is through the tensor algebra (I’ve been delaying learning abstract algebra properly for so long now).
  • Turns out the curved arrow is just an inclusion map - all of those 3x3 group morphism diagrams will be a bit more informative now.
  • The S6 outer automorphism seemed very mysterious to me - luckily I found an MIT PRIMES presentation on it.
  • Finally, I learnt a bit about ideals. Brian Sittinger’s answer introduced me to Abel’s theorem. Ideal theory, ring theory and abstract algebra in general seem an obvious next step up from group theory (many of the concepts should remain the same).
  • I thought of solvable groups as non-simple ‘all the way down’.
  • Terry Tao has a series of very in-depth posts on Lie algebra classification (peppered with the real-time developments) - not sure if I’m ready for that yet.
  • Another object of interest is PSL(p, q) (finite) - John Cook has two great blog posts on it - one introductory, and the other on equivalences (I’ll be able to look at A5 from a PSL perspective) $\rightarrow$ the isomorphism of PGL(2, 5) and S5 can be explained using the outer automorphism of S6! I have to come back to understand that in depth.
  • I saw a proof of how automorphisms on $\mathbb{R}$ must be the identity $\rightarrow$ Galois theory again! I probably hadn’t done it justice previously
  • I had forgotten the rigorous definition of a field extension degree, but it was just the dimension of the vector space over the original.
  • Eisenstein’s criterion was useful - I should check the proof out as well.
  • The Frobenius endomorphism looked run-of-the-mill until Senia Sheydvaser wrote an answer showing an example in Galois theory!
  • Michael Penn returned with a new video on Vertex Operator Algebras - fairly high level, but it piqued my interest. Christophe Nozaradan had a beautiful paper on VOAs and their connections - at quite an accessible level too.
  • The Abel-Ruffini proof is almost elementary using Galois theory - so I looked at various proofs that A5 is simple - the conjugacy class-Lagrange’s theorem method was quite nice $\rightarrow$ I was going to look at splitting of conjugacy classes (S5 $\rightarrow$ A5), but the groupprops page was mediocre.
  • Thomas Morrill’s paper linking the Modularity Theorem and Galois representations was also super cool.
  • I need to find some good tricks to whittle down $S_n$ cycles: amazingly, there’s a lot of applied group theory on AoPS.
  • I’ll just consult the wikipedia page on the exceptional automorphism of S6 (since John Baez’s icosahedron-based one is a little hard to visualise) - this relies on K6 graph factorisation $\rightarrow$ graph factors, factorisation, coloring, perfect matching.
  • Symmetric and exterior algebras are fairly clear, since quotienting really makes sense to me now.
  • I’ve stopped mixing up field vs algebra and vector space vs algebra now, but I’m still searching for a good source on p-adic numbers (appending searches with ‘blog’ is not the greatest; I’ll likely use Terry Tao’s long post).
  • I’m surprised there’s very little written on the braid groups; I thought they seemed interesting.
  • My knowledge of ring theory is shocking, so I’ll be attacking that next. I’ve become a huge fan of Keith Conrad’s after seeing that he has authored a vast collection of papers: I had read the Galois theory ones, but I found one on ideals too! He explains things really, really well, peppered with illuminating examples.
  • I was thinking of ideals as the ‘normal subgroups’ of rings.
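The "nimbers form a field of char 2" claim above is fun to verify computationally: nim-addition is just XOR, and nim-multiplication has Conway’s recursive mex definition. A minimal brute-force sketch over the nimbers below 16 (which are known to be closed under multiplication):

```python
from functools import lru_cache

def mex(s):
    """Minimum excludant: the smallest non-negative integer not in s."""
    n = 0
    while n in s:
        n += 1
    return n

# Conway's recursive definition of nim-multiplication.
@lru_cache(maxsize=None)
def nim_mul(a, b):
    return mex({nim_mul(ap, b) ^ nim_mul(a, bp) ^ nim_mul(ap, bp)
                for ap in range(a) for bp in range(b)})

F = range(16)  # nimbers below 2^(2^k) are closed under nim-multiplication

# Characteristic 2: every element is its own additive inverse (XOR).
assert all(x ^ x == 0 for x in F)
# Every nonzero element has a multiplicative inverse.
assert all(any(nim_mul(x, y) == 1 for y in F) for x in F if x != 0)
# Nim-multiplication distributes over nim-addition (XOR).
assert all(nim_mul(a, b ^ c) == nim_mul(a, b) ^ nim_mul(a, c)
           for a in F for b in F for c in F)
```

Small sanity values: nim_mul(2, 2) = 3 and nim_mul(2, 3) = 1, so 2 and 3 are mutual inverses.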

Miscellaneous Maths

  • I looked at gyrovectors, inspired by a hyperbolic world game designer, but they look extremely difficult.
  • Picard iteration reminded me of power-solving a matrix equation.
  • Diffeomorphisms aren’t $C^\infty$ smooth?
  • Sylvester’s law of inertia really ties in with SVD. I also took a look at Brouwer’s fixed point theorem.
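Picard iteration is concrete enough to demo: for $y' = y$, $y(0) = 1$, the iterates are exactly the Taylor partial sums of $e^x$. A quick sketch, representing each iterate as a list of polynomial coefficients (the representation is my own choice, just for illustration):

```python
import math

# Picard iteration for y' = y, y(0) = 1:
#   y_{k+1}(x) = 1 + integral_0^x y_k(t) dt
# Iterates are stored as coefficient lists [c0, c1, ...].

def picard_step(coeffs):
    # integrate term-by-term, then add the initial condition
    integ = [0.0] + [c / (i + 1) for i, c in enumerate(coeffs)]
    integ[0] += 1.0
    return integ

y = [1.0]                 # y_0(x) = 1
for _ in range(10):
    y = picard_step(y)

# The k-th iterate is the degree-k Taylor polynomial of e^x.
assert all(abs(c - 1 / math.factorial(i)) < 1e-12 for i, c in enumerate(y))
x = 0.5
approx = sum(c * x**i for i, c in enumerate(y))
print(approx, math.exp(x))  # both ~1.6487
```

The integral operator plays the role of the matrix in a Neumann-style series - each application adds one more "power" to the partial sum.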

Logic, Set Theory, Analysis

  • The formalism behind Cantor’s theorem is very reminiscent of Russell’s paradox (Brilliant was useful in this regard).
  • Ordinal arithmetic (+ limit ordinals) was fairly straightforward.
  • I can’t believe I forgot about Cauchy sequences - they are the key link between compactness and completeness.
  • I hadn’t heard of the Generalised Continuum Hypothesis, but it looks quite similar to the ordinary CH.
  • I think I’ll research more about topology and analysis in general; they look interesting.
  • Fundamental groups and centres seem to be very similar for simple Lie groups $\rightarrow$ Jeremy Kun’s Primer on the Fundamental Group in topology (he explained homotopies and loops very nicely) $\rightarrow$ using it to prove the fundamental theorem of algebra!
  • I finally see the relation between connected spaces and fundamental groups $\rightarrow$ the topological long line, fun use of ordinals and lexicographic ordering.
  • I read about the subspace topology, quotient spaces and tensor products on Jeremy Kun’s blog.
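Cantor’s diagonal argument can even be brute-forced on a finite set: for every function $f : S \rightarrow \mathcal{P}(S)$, the diagonal set $\{x : x \notin f(x)\}$ is missed by $f$, so no $f$ is surjective. A small sanity check over all 512 functions on a 3-element set:

```python
from itertools import combinations, product

# Cantor's theorem, brute-forced: for EVERY function f : S -> P(S),
# the diagonal set D = {x : x not in f(x)} lies outside the image of f.

S = (0, 1, 2)
powerset = [frozenset(c) for r in range(len(S) + 1)
            for c in combinations(S, r)]
assert len(powerset) == 2 ** len(S)

for images in product(powerset, repeat=len(S)):  # all 8^3 = 512 functions
    f = dict(zip(S, images))
    D = frozenset(x for x in S if x not in f[x])
    assert D not in images  # f misses its own diagonal set
```

The assert inside the loop is the whole theorem in miniature - the same `x not in f(x)` twist as Russell’s paradox.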

Category Theory

  • Wow, it’s been a long time since I did anything specifically category theory-related (nevertheless, it pops up from time to time - as a last section abstraction on wiki pages). I was just thinking about how homology is like a functor from Top to Grp.
  • I should have known - Jeremy Kun has several expository posts on category theory - I took a look at universal properties (although I couldn’t find anything on existence theorems)
  • I didn’t understand monomorphisms vs injections: the classic remedy of examples across various categories would help. I’ll probably be returning here more often, considering I’m diversifying my study beyond group theory and skipping the middleman, universal algebra. I took a look at categorical products and coproducts, particularly in the context of the direct product vs the direct sum (some of the observations I made may relate more to universal properties than to limits - as in Jeremy Kun’s blog).
  • Really confusingly, direct limits are colimits and inverse limits are limits (categorically)
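The product/coproduct distinction is easiest to ground in Set, where the product is the cartesian product (sizes multiply) and the coproduct is the tagged disjoint union (sizes add). A toy sketch - the names `p1`, `i1`, `pairing` are my own, purely for illustration:

```python
from itertools import product as cartesian

# Product and coproduct in the category Set, with finite sets.
A, B = {"a", "b"}, {0, 1, 2}

prod = set(cartesian(A, B))
p1 = lambda pair: pair[0]      # projections out of the product
p2 = lambda pair: pair[1]

coprod = {("L", a) for a in A} | {("R", b) for b in B}
i1 = lambda a: ("L", a)        # injections into the coproduct
i2 = lambda b: ("R", b)

assert len(prod) == len(A) * len(B)     # sizes multiply
assert len(coprod) == len(A) + len(B)   # sizes add

# Universal property of the product: any pair of maps f : X -> A, g : X -> B
# factors through a unique pairing <f, g> : X -> A x B.
X = {1, 2}
f = {1: "a", 2: "b"}
g = {1: 0, 2: 2}
pairing = {x: (f[x], g[x]) for x in X}
assert all(p1(pairing[x]) == f[x] and p2(pairing[x]) == g[x] for x in X)
```

The multiply-vs-add size behaviour is exactly the finite shadow of direct product vs direct sum.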

Linear Algebra

  • The Gauss-Seidel solver was incredibly intuitive, probably my favourite iterative algorithm (need to compare the pros and cons though) - even though I won’t be using it in my physics engine unless there’s a huge simultaneous constraint. Speaking of Gauss-Seidel solvers, a PPT from robotics at Oxford made a wonderful distinction between Jacobi and Gauss-Seidel (the latter takes the newest values every time). This also introduced me to successive over-relaxation (using a continuous parameter is always fun)
  • Diagonalisation of quadratic forms was obvious in hindsight - signatures and degeneracy too (reminds me of zero divisors in non-division algebras).
  • The matrix determinant lemma is quite useful for small perturbations, or reducing the determinant time. In fact, it’s very reminiscent of optimal tensor contraction, a topic that I was infatuated with in June due to its connection with graph theory.
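The Jacobi vs Gauss-Seidel distinction (and the SOR parameter) fits in a few lines of NumPy. This is a minimal sketch on a small diagonally dominant system, not a production solver - $\omega = 1$ gives plain Gauss-Seidel, $1 < \omega < 2$ is over-relaxation:

```python
import numpy as np

def gauss_seidel(A, b, iters=50, omega=1.0):
    # Sweeps in place, so each update immediately sees the newest values.
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            sigma = A[i] @ x - A[i, i] * x[i]          # off-diagonal part
            x[i] += omega * ((b[i] - sigma) / A[i, i] - x[i])
    return x

def jacobi(A, b, iters=50):
    # Uses only last sweep's values: one vectorised update per iteration.
    x = np.zeros(len(b))
    D = np.diag(A)
    for _ in range(iters):
        x = (b - (A @ x - D * x)) / D
    return x

A = np.array([[4.0, -1, 0], [-1, 4, -1], [0, -1, 4]])
b = np.array([15.0, 10, 10])
exact = np.linalg.solve(A, b)
assert np.allclose(gauss_seidel(A, b), exact)
assert np.allclose(jacobi(A, b), exact)
assert np.allclose(gauss_seidel(A, b, omega=1.2), exact)  # SOR
```

The only structural difference is that Jacobi’s update reads the old vector, while Gauss-Seidel mutates `x[i]` mid-sweep - exactly the distinction from the Oxford slides.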

Computer Science and Programming

This is a gripping tale, for sure.

  • Amazingly, I built a basic constraint solver (generic constraints + Baumgarte stabilisation for ~98% accuracy in distance constraints) in just over an hour! While geometric collision detection is a pain, the physics and linear algebra aspect is far more rewarding and intuitive, not to mention enjoyable - I made triple pendulums, edge hooks, compressible non-elastic strings, but what other 2D constraints are there beyond distance and angle? dyn4j was an excellent source on introducing the different Jacobians.
  • I was slightly led astray by the allure of global constraint solvers (which older papers seemed to be advocating), due to their more involved linear algebra, but that was before Erin Catto’s 2006 (?) GDC talk - I wasn’t aware that sequential impulse had to be applied several times. I’ll still have to manipulate matrix Jacobians for simultaneous constraints (prismatic) - and I had the idea that I can do the same for resting contacts: wouldn’t that minimise the instability from applying an impulse on one side only?
  • On a side note, I may have to port it to C++; non-NumPy Python may be too slow for this.
  • Sutherland-Hodgman polygon clipping is used in polygonal masks as well. I’m yet to find a proper explanation of sweep and prune (I’ll just do it myself in that case).
  • The definition of the contact manifold points varies widely, so it’s really annoying using contrasting sources. Once again, I’ve resolved to just do it myself (I still can’t see why a 4-point manifold is ever required).
  • I’m not sure if a generic constraint solver is the most appropriate method (I’ll have to do heavy code reuse anyway). On second thoughts, sweep-and-prune is probably not the way to go: I’ll use quadtrees or some kind of grid method.
  • Unfortunately, I still couldn’t work out the collision detection properly: maybe I’ll revert to SAT (since GJK-EPA seems needlessly time-consuming)
  • I realised my roadblock boiled down to a fundamental misunderstanding: how many contact points are there in a manifold? Consequently, I asked a question on gamedev SE (a very kind moderator always seems to answer them). And the problem lay in the fact that some sources have only one contact point - what I actually want is one on each shape (two in case of a resting contact). So I identified the goal - only problem was that I was stuck at the same place that I was a week ago: how to identify the ‘preferred’ contact (for which p1 + mpv = p2).
  • I’m considering porting the engine to C++, and I have a choice between Eigen and Armadillo (I’ll probably stick with Eigen: it’s already downloaded, header-only, and has a slightly better interface for my needs). Of course, Julia is also an option, but I’m just worried about the graphical side. It also wouldn’t hurt to try force-based and reduced-coordinate solvers; I think it’ll be a great rounding off.
  • I revisited my original SAT test - of course, I had to flip the mpv in half the cases! Collision detection is now fully complete (I added a ContactManifold class) so I can treat it as a black box and proceed to refine my constraint solver without any inhibitions. Unfortunately, it’s looking rather unstable - even with clamping, rotational errors accumulate up to 10e4! (or it could just be a bug). Much like my CNN project, I’ll make the interface a lot cleaner now (it’s like clearing up your messy workshop after weeks of toil).
  • I read about speculative contacts (courtesy of Erin Catto) in advance, since I don’t have proper continuous collision detection. I can’t wait to start root solving for speculative contacts: I had seen some of those methods in an introductory financial mathematics book on SafariBooks.
  • The pybullet forums are the equivalent of the talkchess forums for chess engines (you regularly see experts stroll the fora!). On a side note, realpython.com is one of the better python guide sites.
  • Insertion usually trumps Bubble on nearly-sorted arrays (pertinent to sort-and-sweep). As a final attempt to save my physics engine, I tried a separate solve for 1 and 2 contact points and added slop. It’s all right, but still very unstable, so I’ll pause that project - although I have a sneaking suspicion that it’s due to low frame rate. What a journey!
  • I can’t recommend Erin Catto’s GDC lectures enough - they cover a wide variety of topics ranging from numerical methods to collision detection.
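Sutherland-Hodgman, mentioned above, is short enough to sketch in full. This is my own minimal Python version (it assumes a convex, counter-clockwise clip polygon and ignores degenerate/tangent cases):

```python
# Sutherland-Hodgman: clip a subject polygon against each edge of a convex
# clip polygon in turn. Vertices are (x, y) tuples in counter-clockwise order.

def clip_polygon(subject, clip):
    def inside(p, a, b):
        # left of (or on) the directed edge a -> b, for CCW clip polygons
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0

    def intersect(p, q, a, b):
        # intersection of segment p-q with the infinite line through a-b
        dx, dy = q[0] - p[0], q[1] - p[1]
        ex, ey = b[0] - a[0], b[1] - a[1]
        t = (ex * (p[1] - a[1]) - ey * (p[0] - a[0])) / (ey * dx - ex * dy)
        return (p[0] + t * dx, p[1] + t * dy)

    output = list(subject)
    for a, b in zip(clip, clip[1:] + clip[:1]):            # each clip edge
        inputs, output = output, []
        for p, q in zip(inputs, inputs[1:] + inputs[:1]):  # each subject edge
            if inside(q, a, b):
                if not inside(p, a, b):
                    output.append(intersect(p, q, a, b))   # entering
                output.append(q)
            elif inside(p, a, b):
                output.append(intersect(p, q, a, b))       # leaving
        if not output:
            break
    return output

# Clip the unit square against a triangle whose hypotenuse is x + y = 1.5.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
triangle = [(-1, -1), (2.5, -1), (-1, 2.5)]
print(clip_polygon(square, triangle))  # a pentagon: the square minus one corner
```

The same per-edge clipping loop is what produces contact manifold points when the "clip polygon" is the incident face region of the other shape.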

Physics

  • Euler’s equations for rigid-body dynamics are sadly ignored during sequential impulse (+ to find the derivative of a quantity in a moving frame, cross it with $\omega$).
  • I didn’t find anything deep on the relation between commutators and Poisson brackets - perhaps physics SE is the wrong place to look. Anyway, I haven’t been doing a lot of physics AT ALL (beyond the physics engine, which I’m abandoning now), so I should restart that (probably with radiation + magnetic fields in matter in Griffiths E&M), but I’m not sure where to go after that (P&S ch9, maybe?) - definitely not mathematical particle physics, because I’m bored of that (although the tensor algebra might reignite that).
  • Since I’ve been unsuccessfully trying to relate Clifford algebras and spin, I’m going to return to Peter Woit’s excellent QM-group theory book - I can’t believe I thought I could find a better source than that.
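The "cross it with $\omega$" rule for moving-frame derivatives is easy to check numerically: for a vector fixed in a body rotating about the z-axis, the inertial-frame derivative should equal $\omega \times r$. A quick NumPy sketch comparing a finite difference against the cross product:

```python
import numpy as np

# A vector r0 fixed in the body frame, rotating about z at rate w:
# d/dt (R(t) r0) should equal omega x (R(t) r0).

w = 2.0                          # angular speed about z
omega = np.array([0.0, 0.0, w])
r0 = np.array([1.0, 0.5, 0.3])

def R(t):
    c, s = np.cos(w * t), np.sin(w * t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

t, h = 0.7, 1e-6
numeric = (R(t + h) @ r0 - R(t - h) @ r0) / (2 * h)   # central difference
analytic = np.cross(omega, R(t) @ r0)                  # the omega-cross rule
assert np.allclose(numeric, analytic, atol=1e-5)
print(numeric, analytic)
```

The same identity is what turns Euler’s equations from the body frame into the inertial frame.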

Stats and Machine Learning

  • Deep Octonion Networks sounded really cool, but the paper didn’t really pique my interest.
  • Finally, a Normalising Flow innovation that sinks GANs! I’ve loved NF for being very distribution-oriented; it feels more natural. The super-resolution itself is not explained very clearly though.
  • After doing ANNs and stats for an entire month, back in June(?), I feel like I’m suffering from ML burnout - the only thing that seems to interest me currently is the computational side.
  • I thought the Moore-Penrose inverse looked familiar - linear least squares (lstsq) has an identical solution (the MPI minimises the L2 norm of the residual, apparently).
  • Inspired by the ECCV paper, I headed over to Eric Jang’s Modern Normalising Flows page.
  • I came across a really comprehensive guide to Bayesian statistics, which I need to polish - I particularly enjoyed the gamma-Poisson, normal-normal and binomial-beta conjugate examples. I also found a Cambridge Statistical Processing handout, geared towards Expectation Maximisation - cool stuff.
  • I also came across Gaussian Belief Propagation, by Danny Bickson - haven’t read it yet, but it looks like a wholesome review of methods in stats
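The Moore-Penrose/least-squares connection is a one-liner to confirm in NumPy: `pinv`, `lstsq` and the normal equations should all agree on a tall, full-column-rank system (the random seed and sizes here are arbitrary):

```python
import numpy as np

# For overdetermined full-rank A, the least-squares solution of Ax = b is
# pinv(A) @ b, which also solves the normal equations (A^T A) x = A^T b.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))   # tall => full column rank (generically)
b = rng.standard_normal(20)

x_pinv = np.linalg.pinv(A) @ b
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_pinv, x_lstsq)

x_normal = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations
assert np.allclose(x_pinv, x_normal)
```

In the rank-deficient case the three diverge: `pinv` and `lstsq` pick the minimum-L2-norm solution, while the normal equations become singular.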
