The Good Stuff (7/9/20 - 13/9/20)

I’ll just call it ‘Maths’ from next time

Number Theory, Abstract Algebra, Analysis, Topology

  • The Wikipedia page on dessins d’enfants is sadly not very illuminating.
  • I looked at how Liouville numbers were ‘too rationally-approximable’ to be algebraic on proofwiki.
  • I found out what a regular graph is -> the Cayley graph for groups! The fact that group characters induce eigenvectors for the Cayley graph adjacency matrix blew my mind!
  • I didn’t know there was a bijection between N and algebraic numbers (in Cantor’s non-constructive proof of transcendental numbers: this also finds their density).
  • I found a site called mathphysicsbook which provides a shallow dip into mathematical applications to physics - that’s where I was introduced to fibre bundles.
  • Peter Cameron also has an excellent blog - I read about integrals of groups -> touched up on the commutator subgroup, abelianisation, subnormal series, perfect groups - now it’s all making sense! It’s wonderful realising that the commutator subgroup must actually be a subgroup - so you can identify it quickly!
  • I had forgotten that Spin(3) was isomorphic to SU(2) - that was confusing for a while.
  • ‘Torsion group’ is a really weird name, I thought it was related to rotations.
  • As part of my beginnings in analysis, I read about set closure, interior, nowhere dense sets, etc.
  • I’d forgotten the Perlin noise algorithm, it’s actually pretty cool - sighack.com crafted some beautiful visual art using it.
  • The free object seems reasonably simple.
  • Turns out the inverse Galois problem is hard.
  • I was searching for some concrete automorphisms of Steiner systems, but it didn’t yield much of use. I suppose I’ll be taking a break from group theory (maths in general, in fact, after a few days) and focus on applied stuff.
  • The stars and bars method on Brilliant had a great set of associated problems. I was trying, without success, to find a proof of the upper bound on Skewes’ number, or even its existence (Littlewood’s proof). Now I’m interested in topological fibrations (inspired by the Hopf fibration).
  • I’d like to see a scenario where it pays to see the power set as functions to {0, 1}.
  • St. Andrews had a cool page on factor rings and isomorphism theorems - nothing too radical.
  • I’d really like to understand Hilbert’s basis theorem - a prerequisite to that is Noetherian rings, so I turned to math3ma - she pointed back to Keith Conrad! His ‘blurbs’ on tensor products, finite fields and Noetherian rings are also very handy.
  • You can construct a 65537-gon but not a heptagon :) - I was looking at Galois-based proofs of the impossibility of trisection and of constructing n-gons whose odd prime factors aren’t distinct Fermat primes.
  • Peter Cameron had a paper relating Steiner systems to the S6 outer automorphism!
  • I went on a math3ma binge today, looking at the hierarchy of integral domains, tricks using Eisenstein’s criterion, functors (the Jacobian really cleared things up), automorphisms of the disk, the Yoneda embedding, an incredible proof of Brouwer’s fixed point theorem, the fundamental group of the circle (one-line proof) that really kicked off an interest in topology, quantum entanglement and SVD (one of my absolute favourites).
  • I randomly remembered ‘Iwasawa theory’ in Wiles’ proof and incredibly, it related the absolute Galois group, p-adic analysis AND ring theory - a good incentive to learn all three (it’s truly remarkable how often I serendipitously chance upon new ideas that relate concepts I’m currently interested in!). Sadly the AMS article that I found on it became very hard very quickly (some background reading is evidently in order, but I wish I could come across a Keith Conrad-style introduction to it).
  • Again, I remembered Heegner numbers, so I’ll need to read about class numbers before I can find out more.
  • The general fibration looks rather complex, so I’ll focus on Hopf fibrations, for which I found a friendly intro relating to quaternions. Because of the incredible one-line proof of the fundamental group of the circle, I was inspired to read more about pointed spaces, reduced suspension, topological join - even the smash product (quotients are ubiquitous, it seems, so my abstract algebra prep turned out to be useful in this regard); parallelizable manifolds - now fibre bundles look interesting too: I’ll begin with the vector bundle, since it seems pertinent to general relativity too.
  • What are Galois reps?
  • Finally, I looked at Marsaglia’s ziggurat algorithm.
  • All this got me pretty interested in geometric topology. Why is the smash product of $S^m$ and $S^n$ homeomorphic to $S^{m+n}$?
  • math3ma had lovely posts on operads, quite a bit clearer than the n-Category Cafe one.
  • I suppose vector bundles make sense, but I’d still like to see them in the context of GR, or a phase space.
  • Tai-Danae Bradley has a paper on Applied Category Theory - in chemical reaction networks and NLP - it’ll be a fantastic read!
  • I was looking at parallelizable manifolds, which reminds me - I should see the proof of the Hairy Ball theorem.
  • The intermediate and mean value theorem wiki pages were worth the read.
  • I need to read more on monoidal categories - I feel they underpin a trove of info.
  • Quadratic fields and integers make sense now, as does $\mathbb{RP}^2$ vs. $\mathbb{CP}^2$ (the trick is in the quotient).
  • I don’t know why a fancy ‘Dedekind cut’ is needed in the completeness of the reals.
  • On math3ma, I also took a look at the formalisation of the Yoneda lemma (very useful in category theory proofs), maximum vs maximal (again, posets vs. tosets).
  • Cornell has a decent book on vector bundles and K-theory: not sure how long my interest will last though. I’m still looking for an intuitive explanation of the Hopf fibration.
  • I’d love to be able to understand the exotic R4 manifolds (the great Michael Freedman worked on those)
  • Bott periodicity is a potential object of interest - it connects Clifford algebras and vector bundles.
  • I saw a fun generalisation of Fibonacci numbers to $\mathbb{C}$.
  • Brahmagupta’s solution generator for Pell’s equations was fairly simple, courtesy of Brilliant (no, they’re not sponsoring me).
  • As part of a school topic, I was reading about chaos theory (which seems to be rather vague to me) - but I was astounded to find its relation to topological symmetry breaking!
  • I finally found a simple explanation of Hopf fibration by David Lyons, quite visual - this special case is far, far easier than fibrations in general then.
  • I’m still trying to find the reason behind why the Jordan curve theorem is so hard - I’ll stick with the polygon curve proof for now.
  • math3ma’s post on automorphisms of Riemann surfaces is good practice for analysis.
  • I appear to have scrawled “covering space is fibre bundle w/ discrete fibres”, but a week later, I am struggling to decipher this.
  • Homotopy, homology and fundamental groups were rather confusing at first, but I think I’ve got it now.
  • Tom Leinster’s ‘Doing without diagrams’ is also a great guide on how to generalise and apply category theory in proofs.
  • I kind of got bored of ring theory, so I’ll leave that aside for now - geometric topology too, most likely - I don’t like to aimlessly dwell on a topic for too long, it narrows my exploration.
  • Differential Galois theory sounded amazing, but unfortunately, it’s not used very much since the algebraic world is hard to translate into a form suitable for Lie theory. Ah, only Galois extensions (normal and separable) have [L:K] = |Gal| (forgive the shorthand)
  • I hadn’t even looked at the difference between total and partial ordering facepalm. I looked at some tutorials on using Coq too - I’d like to understand its type-checking functionality better.
  • Is there a systematic way to construct larger cardinals that is reasonably intuitive?
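The bullet above about seeing the power set as functions to {0, 1} has at least one concrete payoff: enumerating subsets becomes enumerating bit-tuples. A minimal Python sketch (the function name is mine) of the subset ↔ indicator-function bijection:

```python
from itertools import product

def subsets_via_indicators(xs):
    """Enumerate the power set of xs by enumerating all functions xs -> {0, 1}.

    Each bit-tuple is an indicator function: xs[i] lies in the subset
    iff bits[i] == 1, so the 2^n tuples biject with the 2^n subsets.
    """
    xs = list(xs)
    for bits in product((0, 1), repeat=len(xs)):
        yield {x for x, b in zip(xs, bits) if b}

subsets = list(subsets_via_indicators({'a', 'b', 'c'}))
```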
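The Brahmagupta bullet can be made concrete too: his samasa (composition) identity combines two solutions of $x^2 - Ny^2 = 1$ into a third, so the fundamental solution generates an infinite ladder. A minimal sketch (`compose` is my own naming):

```python
def compose(sol1, sol2, N):
    """Brahmagupta's composition (samasa): if (x1, y1) and (x2, y2) solve
    x^2 - N*y^2 = 1, so does (x1*x2 + N*y1*y2, x1*y2 + x2*y1)."""
    x1, y1 = sol1
    x2, y2 = sol2
    return (x1 * x2 + N * y1 * y2, x1 * y2 + x2 * y1)

# Fundamental solution of x^2 - 2*y^2 = 1 is (3, 2); repeatedly composing
# with it climbs the ladder of solutions.
N = 2
sols = [(3, 2)]
for _ in range(3):
    sols.append(compose(sols[-1], (3, 2), N))
```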

I’ve decided to add a separate subsection for distinguished reads:

  • Svante Janson - Roots of Polynomials of Degrees 3 and 4 - cool use of Galois theory for cubics and quartics (plus some Fourier analysis of finite groups!)
  • Couple of PDFs on Steiner systems - but nothing overly deep: I was looking for more on their automorphisms
  • Nigel Boston, UW-Madison, The Proof of Fermat’s Last Theorem - I’d consider this the next stage of Simon Singh’s excellent book; it’s a fine mixture of history and the more technical aspects, including profinite groups and deformations of elliptic curves. I didn’t get an opportunity to read much of it, but I’ll definitely be coming back to it
  • Keith Conrad’s p-adic Expansion of Rational Numbers is very well explained, plenty of enlightening examples. His ‘Factoring in Quadratic Fields’ is also peppered with super-useful examples, but overall it was fairly easy.
  • ‘Olympiad Number Theory Through Challenging Problems’ looks really impressive, it’s just what I wanted (I doubt I’ll learn any new techniques, but who doesn’t love tons of solved problems)
  • ‘Iwasawa Theory: A Climb up the Tower’ got real hard, real fast (it is highly concentrated after all) - but looking back, I probably should have been more comfortable with PIDs and UFDs beforehand. Jim L. Brown has a much longer, but much more comprehensive introduction on the same. They could both be useful in conjunction with the Nigel Boston paper (*sigh* when will I ever get time to read all of these?)
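Keith Conrad’s p-adic expansion paper invites a quick sketch: the digits of a rational $a/b$ (with $b$ coprime to $p$, so $a/b$ is a p-adic integer) come from repeatedly reducing mod $p$ and peeling off a factor of $p$. A hypothetical helper, assuming Python 3.8+ for the modular inverse `pow(b, -1, p)`:

```python
def padic_digits(a, b, p, n):
    """First n digits d0, d1, ... of the p-adic expansion of a/b,
    assuming gcd(b, p) == 1."""
    digits = []
    for _ in range(n):
        d = (a * pow(b, -1, p)) % p   # digit = (a/b) mod p
        digits.append(d)
        a = (a - d * b) // p          # exact: a ≡ d*b (mod p)
    return digits
```

For instance, -1 comes out as the all-(p-1) expansion, the p-adic analogue of 0.999… = 1.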

Physics

Wow, I hardly did anything in Physics this week.

  • Non-linear optics and phonons (Fourier decoupling!) look really promising - I’ll have to go through Griffiths chapter 3 again though.
  • Allen Chou’s 3D Game Math series is also very useful - very cool method for quaternion renormalisation and implementations, a fun use of Taylor polynomials in sine and cosine approximations, and numeric springing for animations.
  • I was trying to find the significance of the Heisenberg group - but what better place than Peter Woit’s book? I had abandoned it a while back, but I’ll persist now - the tensor product of representations was great.
  • I just realised, the electric field equations in 2D look very similar to the Cauchy-Riemann equations.
  • I returned to Peskin and Schroeder, just for fun, picking out interesting bits from chapter 7.
  • I was looking at the Kepler problem since John Baez had a nice presentation on the SO(4) symmetry of hydrogen, and I came to the impressive Bertrand’s theorem. I’m looking for some kind of intro to string theory, inspired by the SO(32) embedding - maybe Joe Polchinski’s one? Even if only for the sake of learning about Calabi-Yau manifolds, M-theory looks like it’s worth a shot. I sorely miss being a kid, where time isn’t at a premium and I could just sit down with a pencil, paper and pure thought.
  • I reopened QFTGA, picked up right where I left off - statistical physics, generating functionals, the path integral, Wick rotations, symmetry breaking (I loved this chapter) and the introduction to renormalisation. This book is severely underrated: it covers such a vast array of topics, explains everything incredibly lucidly and has cute diagrams to boot - in fact, its explanation of QED processes (not computation) is superior to P&S’s. I’m also really excited to get a taster of Non-Abelian gauge theories AND QFT for condensed matter.
  • I’d also like to read the epic 5-part Division Algebras and Supersymmetry series by John Huerta, John Baez’ student
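The quaternion renormalisation trick from Allen Chou’s series, as I understand it, replaces the division by $\sqrt{|q|^2}$ with a first-order approximation valid when the quaternion has only drifted slightly from unit length ($1/\sqrt{x} \approx (3 - x)/2$ near $x = 1$). A sketch, not his exact code:

```python
import math

def fast_renormalise(q):
    """One cheap step approximating q / |q| for a nearly-unit quaternion:
    scale by (3 - |q|^2) / 2 instead of dividing by a square root.
    Only valid when |q| is already close to 1 (e.g. after drift from
    many composed rotations)."""
    w, x, y, z = q
    s = (3.0 - (w * w + x * x + y * y + z * z)) / 2.0
    return (w * s, x * s, y * s, z * s)

q = (0.71, 0.0, 0.71, 0.0)            # slightly off unit length
q = fast_renormalise(q)
norm = math.sqrt(sum(c * c for c in q))
```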

Computer Science and App Development

  • A YouTuber named Dani has an excellent tutorial on grappling physics - good use of quaternions, plus I put into practice line renderers and joints for the first time in Unity.
  • On a side note, I cranked out a Python script to add the history to the Chrome extension … until my epoch time was off by three orders of magnitude (a harmless mistake)
  • Sebastian Lague’s Marching Cubes and Ray Marching videos are rather impressive - but Hydraulic Erosion was my favourite, I feel like doing something involving shaders in Unity now.
  • I remembered the oddly-named ‘Wavefunction Collapse Algorithm’ from a couple of years ago - no new improvements, but I’m trying to explore the procedural generation universe.
  • I was just browsing methods to remove jitter and stack-sinking from my physics engine. While I did find some interesting propositions, like applying variable positional corrections, force-based dynamics on pybullet and Gamedev SE, I’ll only be implementing stuff like warm starting and sleeping in the future.
  • Miraculously, and I can’t stress how lucky I was, I sorted out the physics engine. I was mournfully doing some routine code cleanup (adding persistent contacts in preparation for warm starting) when I flipped two minus signs to make it more symmetric. All of a sudden it was super-stable, and I sped it up a fair bit by caching a Cholesky factorisation and then using cho_solve. Of course! I’d forgotten my training - floating-point subtraction kills off precision! It’s now working excellently with just 8 rounds of joint sequential impulse - all I need to do now is improve the speed, clock out more FPS. Later, I reverted to multiple sequential impulse, which wasn’t jittery for the collisions with the new update! All right, straight to GitHub!
  • In other news, I also decided to get to work on my Milk Delivery app, designed to connect local grocery outlets with my neighbourhood during the pandemic.
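The Cholesky caching from the physics engine bullet can be sketched with SciPy’s `cho_factor`/`cho_solve`: the effective-mass matrix $J M^{-1} J^T$ of a joint system is symmetric positive definite and fixed within a frame, so factor once and every solver round becomes a cheap pair of triangular solves. The matrices below are random stand-ins, not the engine’s actual ones:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
J = rng.standard_normal((4, 6))
A = J @ J.T + 1e-6 * np.eye(4)   # stand-in for J M^-1 J^T (SPD)

factor = cho_factor(A)           # cache this once per frame
b1 = rng.standard_normal(4)
lam1 = cho_solve(factor, b1)     # each impulse round: reuse the factor
```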

Robotics and Machine Learning

Robotics, eh? Who’d have thought it?

  • I randomly remembered Andrew Gibiansky’s quadcopter post - I hadn’t really gone through it properly previously. It was incredible! Robotics is a perfect mix of physics, programming and linear algebra - much like a physics engine, but more practical and more diverse. His paper was probably the best, but I decided to look at a few other quadcopter dynamics papers. It gave me a great overview of what to expect in a robotics problem, but it’s quite complex, so I’ll start with some simple robotics examples - hopefully I can find a course.
  • In a stroke of incredible luck, I found a course on Underactuated Robotics by Russ Tedrake, MIT - it is OUTSTANDING. Incredibly intuitive, with visual explanations of phase portraits, applications like Acrobots and cart-pole (just like in RL! In fact, robotics even uses Bellman’s equations in cost analysis!) - this definitely spurred a lifelong interest in robotics. One qualm - a lot of potentially useful information is relegated to the appendices and later chapters (I’m not sure whether the first few chapters were just an exposition, but I wasn’t aware of, say, the manipulator equations).
  • I also found a paper on Sparsity in Robot Dynamics - it’s always fun to apply abstract theoretical concepts.
  • OpenAI’s blog looks very cool - today I read ‘Deep Double Descent’ and ‘Activation Atlases’.
  • Naive Bayes classifiers are nice and simple - it’s unfortunate that my knowledge of classification methods began with ANNs (I’m sure many others’ too) because for simple applications, there are far sturdier tools for the job.
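To make the Naive Bayes bullet above concrete, here is how small the sturdier tool can be - a minimal one-feature Gaussian Naive Bayes sketch (class and variable names are mine):

```python
import math
from collections import defaultdict

class GaussianNB1D:
    """Minimal 1-feature Gaussian Naive Bayes: fit a mean/variance and a
    count-based prior per class, predict by maximum log-posterior."""

    def fit(self, xs, ys):
        groups = defaultdict(list)
        for x, y in zip(xs, ys):
            groups[y].append(x)
        self.stats = {}
        n = len(xs)
        for y, vals in groups.items():
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-9
            self.stats[y] = (mu, var, len(vals) / n)
        return self

    def predict(self, x):
        def log_post(y):
            mu, var, prior = self.stats[y]
            return (math.log(prior) - 0.5 * math.log(2 * math.pi * var)
                    - (x - mu) ** 2 / (2 * var))
        return max(self.stats, key=log_post)

model = GaussianNB1D().fit([1.0, 1.2, 0.9, 5.0, 5.1, 4.8],
                           ['a', 'a', 'a', 'b', 'b', 'b'])
```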

The Good Stuff (31/8/20 - 6/9/20)

Number Theory and Algebraic Geometry

  • I found a short article on lattices, by John Baez (importantly, it goes into the non-elliptic curve formulation of the j-invariant through the eta function).
  • St. Mary’s College, California has an enlightening paper on the Lindemann-Weierstrass Theorem
  • I didn’t see a method to calculate the q-series of the j-invariant without a deep detour into L-functions (apparently this is one of Noam Elkies’ areas of expertise). One method involved the Hardy-Littlewood circle method (which connects back with the asymptotic analysis of the partition function!), which I really want to learn about.
  • Liouville numbers as a ‘too close’ approximation by rationals is something I’d never even considered.
  • I took a look at the Feit-Thompson theorem and p-adic numbers (only got the p-adic integers; need to revisit)
  • I took a peek at the modularity theorem (not the proof, of course!) but it looks quite difficult.
  • Stanford’s cryptography website is a great resource for finite group and field theory.
  • I came across Grothendieck’s ‘dessins d’enfants’ - although they seem very complicated, I love graph representations of maths and physics (Feynman, Dynkin and now this!) $\rightarrow$ I came across what might be the most intriguing blog of all time: neverendingbooks - primarily focused on algebraic geometry and algebraic group theory (like the stacks project, by Columbia, again!), it has a whopping 129 pages of posts! I also found a promising, elementary approach to dessins, by Pierre Guillot.
  • Inspired by the circle method, I did a bit more number theory - the Euler function, eta functions, the (rather surprising) pentagonal number theorem, q-shifted factorial and the modular discriminant, but it seemed way too abstract for my taste: aside from cryptography, I didn’t really find any concrete applications.
  • Ring $\rightarrow$ Integral domain $\rightarrow$ UFD $\rightarrow$ Field was explained by a Brilliant guest contribution (some are better than the wikis).
  • G-modules and R-modules seemed strikingly different to me.
  • I did find a relatively accessible proof of Dirichlet’s theorem, though it was still quite difficult.
  • The Eisenstein series was a nice simple example of modular forms, and I read a little about the Rogers-Ramanujan identities, but I’ll leave aside number theory for now since it’s too abstract.
  • I HAVE to find a good source on Pontryagin duality - to me, it seems like another one of those ‘beautiful mathematical objects’ that are really worth exploring to uncover new topics.
  • Although neverendingbooks looks really interesting, it’s too advanced for me currently, and I’ve not yet gone through algebraic geometry properly.
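The pentagonal number theorem from the bullet above is easy to check by machine: expand $\prod_{n\ge1}(1-q^n)$ directly and compare with $\sum_{k\in\mathbb{Z}} (-1)^k q^{k(3k-1)/2}$. A throwaway verification sketch:

```python
def euler_product_coeffs(N):
    """Coefficients of prod_{n>=1} (1 - q^n) up to q^N, by repeated
    polynomial multiplication."""
    coeffs = [0] * (N + 1)
    coeffs[0] = 1
    for n in range(1, N + 1):
        for k in range(N, n - 1, -1):   # multiply in place by (1 - q^n)
            coeffs[k] -= coeffs[k - n]
    return coeffs

def pentagonal_coeffs(N):
    """The same series via the pentagonal number theorem:
    sum over k in Z of (-1)^k q^{k(3k-1)/2}."""
    coeffs = [0] * (N + 1)
    coeffs[0] = 1
    for k in range(1, N + 1):
        for kk in (k, -k):              # generalised pentagonal numbers
            e = kk * (3 * kk - 1) // 2
            if 0 < e <= N:
                coeffs[e] += (-1) ** k
    return coeffs
```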

Group Theory and Abstract Algebra

  • The Wiki page on projective linear groups is not the most intuitive, so I’ll need a better source.
  • I’d like to see some concrete examples of octonion automorphisms (crucial for G2) and class functions.
  • Wiki has a clear set of properties relating the centraliser and the normaliser.
  • Another TWFIMP on symmetries of Dynkin diagrams (personally, I find it very cool that a simple root diagram encodes stuff about representations and automorphisms) - [Nihar from the future: I wouldn’t take that too seriously].
  • Of course I’m going to be digging into SO(8) and Spin(8) (the universal cover pancake diagram is overkill, to be honest).
  • Incredibly, Michael Penn decided to give an introduction to vertex operator algebras, related to his PhD thesis, along with associative + Lie = Poisson algebras. Since he talked about quotients of the tensor algebras by ideals, I suppose that’s where I’ll start in learning about: universal enveloping algebras, symmetric algebras and Clifford algebras (and hence Spin) (it’s amazing how everything fits in together so perfectly!).
  • I’m also eager to see how Poisson algebras arise here and how it’s related to quantisation.
  • Nimbers form a field of char 2!
  • “The Unapologetic Mathematician” looks like a promising source on abstract algebra.
  • I also found an extremely compact paper (hardly 131 pages) on group theory for QM - similar to the Harvard paper, but that one was more focused on representation theory - I’ll be taking a break from semisimple Lie theory anyway.
  • Cayley’s theorem seems blatantly obvious.
  • I couldn’t resist rereading triality, $\text{SO} \rightarrow \text{Spin}$ (because John Baez’s paper on octonions makes some unmotivated jumps) $\rightarrow$ no mention of $\text{Spin}^c$ groups anywhere? Probably the best route to understanding this business is through the tensor algebra (I’ve been delaying learning abstract algebra properly for so long now).
  • Turns out the curved arrow is just an inclusion map - all of those 3x3 group morphism diagrams will be a bit more informative now.
  • The S6 outer automorphism seemed very mysterious to me - luckily I found a MIT PRIMES presentation on it
  • Finally, I learnt a bit about ideals. Brian Sittinger’s answer introduced me to Abel’s theorem. Ideal and ring theory and abstract algebra in general seems to be an obvious next step up from group theory (much of the concepts should remain the same).
  • I thought of solvable groups as non-simple ‘all the way down’.
  • Terry Tao has a series of very in-depth posts on Lie algebra classification (peppered with the real-time developments) - not sure if I’m ready for that yet.
  • Another object of interest is PSL(p, q) (finite) - John Cook has two great blog posts on it - one introductory, and the other on equivalences (I’ll be able to look at A5 from a PSL perspective) $\rightarrow$ the isomorphism of PGL(2, 5) and S5 can be explained using the outer automorphism of S6! I have to come back to understand that in depth.
  • I saw a proof of how automorphisms on $\mathbb{R}$ must be the identity $\rightarrow$ Galois theory again! I probably hadn’t done it justice previously
  • I had forgotten the rigorous definition of a field extension degree, but it was just the dimension of the vector space over the original.
  • Eisenstein’s criterion was useful - I should check the proof out as well.
  • The Frobenius endomorphism looked run-of-the-mill until Senia Sheydvasser wrote an answer showing an example in Galois theory!
  • Michael Penn returned with a new video on Vertex Operator Algebras - fairly high level, but it piqued my interest. Christophe Nozaradan had a beautiful paper on VOAs and their connections - at quite an accessible level too.
  • The Abel-Ruffini proof is almost elementary using Galois theory - so I looked at various proofs that A5 is simple - the conjugacy class-Lagrange’s theorem method was quite nice $\rightarrow$ I was going to look at splitting of conjugacy classes (S5 -> A5), but the groupprops page was mediocre.
  • Thomas Morrill’s paper linking the Modularity Theorem and Galois representations was also super cool
  • I need to find some good tricks to whittle down $S_n$ cycles: amazingly, there’s a lot of applied group theory on AoPS.
  • I’ll just consult the wikipedia page on the exceptional automorphism of S6 (since John Baez’s icosahedron-based one is a little hard to visualise) - this relies on K6 graph factorisation $\rightarrow$ graph factors, factorisation, coloring, perfect matching.
  • Symmetric and exterior algebras are fairly clear, since quotienting really makes sense to me now.
  • I’ve stopped mixing up field vs algebra and vector space vs algebra now, but I’m still searching for a good source on p-adic numbers (appending searches with ‘blog’ is not the greatest; I’ll likely use Terry Tao’s long post).
  • I’m surprised there’s very little written on the braid groups, I thought they seemed interesting.
  • My knowledge of ring theory is shocking, so I’ll be attacking that next. I’ve become a huge fan of Keith Conrad’s after seeing that he has authored a vast collection of papers: I had read the Galois theory ones, but I found one on ideals too! He explains things really, really well, peppered with illuminating examples.
  • I was thinking of ideals as the ‘normal subgroups’ of rings.
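The nimber bullet above (‘a field of char 2!’) can be verified concretely for the smallest nontrivial case: the nimbers below 4 form GF(4), with XOR as addition (so every element is its own additive inverse, hence characteristic 2) and the nim-product tabulated below - the standard table for $\{0, 1, 2, 3\}$:

```python
from itertools import product

add = lambda a, b: a ^ b          # nimber addition is bitwise XOR
MUL = [[0, 0, 0, 0],              # nim-product table for nimbers 0..3
       [0, 1, 2, 3],
       [0, 2, 3, 1],
       [0, 3, 1, 2]]
mul = lambda a, b: MUL[a][b]

elems = range(4)
char2 = all(add(x, x) == 0 for x in elems)
distributive = all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
                   for a, b, c in product(elems, repeat=3))
has_inverses = all(any(mul(a, b) == 1 for b in elems)
                   for a in elems if a != 0)
```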

Miscellaneous Maths

  • I looked at gyrovectors, inspired by a hyperbolic world game designer, but they look extremely difficult.
  • Picard iteration reminded me of power-solving a matrix equation.
  • Diffeomorphisms aren’t $C^\infty$ smooth?
  • Sylvester’s law of inertia really ties in with SVD. I also took a look at Brouwer’s fixed point theorem.
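The Picard iteration bullet has a pleasant numerical demonstration: iterating $y_{n+1}(x) = 1 + \int_0^x y_n(t)\,dt$ for $y' = y$, $y(0) = 1$ builds up the Taylor series of $e^x$ term by term - very much the ‘powering up’ feel. A sketch using trapezoidal integration on a grid:

```python
import numpy as np

# Picard iteration for y' = y, y(0) = 1 on [0, 1].
x = np.linspace(0.0, 1.0, 1001)
dx = x[1] - x[0]
y = np.ones_like(x)                      # y_0(x) = 1
for _ in range(15):
    # cumulative trapezoid rule: integral of y from 0 to each x
    integral = np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2) * dx))
    y = 1.0 + integral

err = np.max(np.abs(y - np.exp(x)))      # should be tiny after 15 steps
```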

Logic, Set Theory, Analysis

  • The formalism behind Cantor’s theorem is very reminiscent of Russell’s paradox (Brilliant was useful in this regard)
  • Ordinal arithmetic (+ limit ordinals) was fairly straightforward.
  • I can’t believe I forgot about Cauchy sequences - they are the key link between compact and complete spaces.
  • I hadn’t heard of the Generalised Continuum Hypothesis, but it looks quite similar.
  • I think I’ll research more about topology and analysis in general, it looks interesting
  • Fundamental groups and centres seem to be very similar for simple Lie groups $\rightarrow$ Jeremy Kun’s Primer on the Fundamental Group in topology (he explained homotopies and loops very nicely) $\rightarrow$ using it to prove the fundamental theorem of algebra!
  • I finally see the relation between connected spaces and fundamental groups -> the topological long line, fun use of ordinals and lexicographic ordering
  • I read about the subspace topology, quotient spaces, tensor products on Jeremy Kun’s blog.

Category Theory

  • Wow, it’s been a long time since I did anything specifically category theory-related (nevertheless, it pops up from time to time - as a last section abstraction on wiki pages). I was just thinking about how homology is like a functor from Top to Grp.
  • I should have known - Jeremy Kun has several expository posts on category theory - I took a look at universal properties (although I couldn’t find anything on existence theorems)
  • I didn’t understand monomorphisms vs injections: the classic remedy of examples of various categories would help. I’ll probably be returning here more often, considering I’m diversifying my study beyond group theory and I’ll be skipping the middleman: universal algebra - I took a look at categorical products and coproducts, particularly in the context of the direct product vs direct sum (some of the observations that I made may relate more to universal properties, rather than limits - as in Jeremy Kun’s blog).
  • Really confusingly, direct limits are colimits and inverse limits are limits (categorically)

Linear Algebra

  • The Gauss-Seidel solver was incredibly intuitive, probably my favourite iterative algorithm (need to compare the pros and cons though) - even though I won’t be using it in my physics engine unless there’s a huge simultaneous constraint. Speaking of Gauss-Seidel solvers, a PPT from robotics at Oxford made a wonderful distinction between Jacobi and Gauss-Seidel (the latter takes the newest values every time). This also introduced me to successive over-relaxation (using a continuous parameter is always fun)
  • Diagonalisation of quadratic forms was obvious in hindsight - signatures and degeneracy too (reminds me of zero divisors in non-division algebras).
  • The matrix determinant lemma is quite useful for small perturbations, or reducing the determinant time. In fact, it’s very reminiscent of optimal tensor contraction, a topic that I was infatuated with in June due to its connection with graph theory.
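The Jacobi vs. Gauss-Seidel distinction from that Oxford PPT, plus the over-relaxation parameter, fits in a few lines - a sketch (not the slides’ code), where `omega = 1` recovers plain Gauss-Seidel:

```python
import numpy as np

def sor_solve(A, b, omega=1.0, iters=100):
    """Gauss-Seidel sweeps with successive over-relaxation: unlike
    Jacobi, each component update immediately uses the newest values;
    omega blends the old value with the Gauss-Seidel update."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            gs = (b[i] - sigma) / A[i, i]
            x[i] = (1 - omega) * x[i] + omega * gs
    return x

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])   # diagonally dominant, so this converges
b = np.array([1.0, 2.0, 3.0])
x = sor_solve(A, b, omega=1.1)
```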
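The matrix determinant lemma itself is easy to sanity-check numerically: $\det(A + uv^T) = (1 + v^T A^{-1} u)\det(A)$, so a rank-1 perturbation never needs a fresh determinant from scratch. A quick check on random data:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5)) + 5 * np.eye(5)   # keep it well-conditioned
u = rng.standard_normal(5)
v = rng.standard_normal(5)

lhs = np.linalg.det(A + np.outer(u, v))           # determinant of rank-1 update
rhs = (1 + v @ np.linalg.solve(A, u)) * np.linalg.det(A)
```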

Computer Science and Programming

This is a gripping tale, for sure.

  • Amazingly, I built a basic constraint solver (generic constraints + Baumgarte stabilisation for ~98% accuracy in distance constraints) in just over an hour! While geometric collision detection is a pain, the physics and linear algebra aspect is far more rewarding and intuitive, not to mention enjoyable - I made triple pendulums, edge hooks, compressible non-elastic strings, but what other 2D constraints are there beyond distance and angle? dyn4j was an excellent source on introducing the different Jacobians.
  • I was slightly led astray by the allure of global constraint solvers (which older papers seemed to be advocating), due to their more involved linear algebra, but that was before Erin Catto’s 2006 (?) GDC talk - I wasn’t aware that sequential impulse had to be applied several times. I’ll still have to manipulate matrix Jacobians for simultaneous constraints (prismatic) - and I had the idea that I can do the same for resting contacts: wouldn’t that minimise the instability from applying an impulse on one side only?
  • On a side note, I may have to port it to C++, non-Numpy Python may be too slow for this.
  • Sutherland-Hodgman polygon clipping is used in polygonal masks as well. I’m yet to find a proper explanation of sweep and prune (I’ll just do it myself in that case).
  • The definition of the contact manifold points varies widely, so it’s really annoying using contrasting sources. Once again, I’ve resolved to just do it myself (I still can’t see why a 4-point manifold is ever required).
  • I’m not sure if a generic constraint solver is the most appropriate method (I’ll have to do heavy code reuse anyway). On second thoughts, sweep-and-prune is probably not the way to go: I’ll use quadtrees or some kind of grid method.
  • Unfortunately, I still couldn’t work out the collision detection properly: maybe I’ll revert to SAT (since GJK-EPA seems needlessly time-consuming)
  • I realised my roadblock boiled down to a fundamental misunderstanding: how many contact points are there in a manifold? Consequently, I asked a question on gamedev SE (a very kind moderator always seems to answer them). And the problem lay in the fact that some sources have only one contact point - what I actually want is one on each shape (two in case of a resting contact). So I identified the goal - only problem was that I was stuck at the same place that I was a week ago: how to identify the ‘preferred’ contact (for which p1 + mpv = p2).
  • I’m considering porting the engine to C++, and I have a choice between Eigen and Armadillo (I’ll probably stick with Eigen: it’s already downloaded, header-only and has a slightly better interface for my needs). Of course, Julia is also an option, but I’m just worried about the graphical side. It also wouldn’t hurt to try force-based and reduced-coordinate solvers, I think it’ll be a great rounding off.
  • I revisited my original SAT test - of course, I had to flip the mpv in half the cases! Collision detection is now fully complete (I added a ContactManifold class) so I can treat it as a black box and proceed to refine my constraint solver without any inhibitions. Unfortunately, it’s looking rather unstable - even with clamping, rotational errors accumulate up to 10e4! (or it could just be a bug). Much like my CNN project, I’ll make the interface a lot cleaner now (it’s like clearing up your messy workshop after weeks of toil).
  • I read about speculative contacts (courtesy of Erin Catto) in advance, since I don’t have proper continuous collision detection. I can’t wait to start root solving for speculative contacts: I had seen some of those methods in an introductory financial mathematics on SafariBooks.
  • The pybullet forums are the equivalent of the talkchess forums for chess engines (you regularly see experts stroll the fora!). On a side note, realpython.com is one of the better python guide sites.
  • Bubble (with an early-exit pass) usually trumps Selection on nearly-sorted arrays (pertinent to sort-and-sweep). As a final attempt to save my physics engine, I tried a separate solve for 1 and 2 contact points and adding slop. It’s all right, but still very unstable, so I’ll pause that project - although I have a sneaking suspicion that it’s due to low frame rate. What a journey!
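Sutherland-Hodgman, from the polygon clipping bullet above, is short enough to sketch in full: clip the subject polygon successively against each edge of a convex clip polygon (both given as CCW vertex lists). A from-scratch version, with my own function names:

```python
def clip_polygon(subject, clip):
    """Sutherland-Hodgman: clip `subject` against each edge of the convex
    CCW polygon `clip`."""
    def inside(p, a, b):
        # p is on or to the left of the directed edge a -> b
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0

    def intersect(p, q, a, b):
        # intersection of segment p-q with the (non-parallel) line a-b
        x1, y1, x2, y2 = p[0], p[1], q[0], q[1]
        x3, y3, x4, y4 = a[0], a[1], b[0], b[1]
        den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

    output = subject
    for i in range(len(clip)):
        a, b = clip[i], clip[(i + 1) % len(clip)]
        inp, output = output, []
        for j in range(len(inp)):
            p, q = inp[j], inp[(j + 1) % len(inp)]
            if inside(q, a, b):
                if not inside(p, a, b):
                    output.append(intersect(p, q, a, b))
                output.append(q)
            elif inside(p, a, b):
                output.append(intersect(p, q, a, b))
    return output

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
tri = [(1, 1), (3, 1), (3, 3)]
clipped = clip_polygon(tri, square)      # triangle cut down by the square
```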

Physics

  • Euler’s equations for rigidbody dynamics are sadly ignored during sequential impulse (+ to find the derivative of a quantity in a moving frame, cross it with $\omega$)
  • I didn’t find anything deep on the relation between commutators and Poisson brackets - perhaps physics SE is the wrong place to look. Anyway, I haven’t been doing a lot of physics AT ALL (beyond the physics engine, which I’m abandoning now), so I should restart that (probably with radiation + magnetic fields in matter in Griffiths E&M), but I’m not sure where to go after that (P&S ch9, maybe?) - definitely not mathematical particle physics, because I’m bored of that (although, the tensor algebra might reignite that).
  • Since I’ve been unsuccessfully trying to relate Clifford algebras and spin, I’m going to return to Peter Woit’s excellent QM-group theory book - I can’t believe I thought I could find a better source than that.
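The parenthetical in the Euler equations bullet, written out: for any vector $\mathbf{A}$, the derivative seen from the inertial frame picks up a cross-product correction from the body frame’s rotation, and applying this to $\mathbf{L} = I\boldsymbol{\omega}$ (with the inertia tensor $I$ constant in the body frame) gives Euler’s equations:

```latex
\left(\frac{d\mathbf{A}}{dt}\right)_{\text{inertial}}
  = \left(\frac{d\mathbf{A}}{dt}\right)_{\text{body}}
  + \boldsymbol{\omega} \times \mathbf{A},
\qquad
I\dot{\boldsymbol{\omega}}
  + \boldsymbol{\omega} \times (I\boldsymbol{\omega})
  = \boldsymbol{\tau}.
```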

Stats and Machine Learning

  • Deep Octonion Networks sounded really cool, but the paper didn’t really pique my interest.
  • Finally, a Normalising Flow innovation that sinks GANs! I’ve loved NFs for being very distribution-oriented; they feel more natural. The super-resolution itself is not explained very clearly though.
  • After doing ANNs and stats for an entire month, back in June(?), I feel like I’m suffering from ML burnout - the only thing that seems to interest me currently is the computational side.
  • I thought the Moore-Penrose inverse looked familiar - linear least squares has an identical solution (the MPI minimises the L2 norm difference apparently).
  • Inspired by the ECCV paper, I headed over to Eric Jang’s Modern Normalising Flows page.
  • I came across a really comprehensive guide to Bayesian statistics, which I need to polish - I particularly enjoyed the gamma-Poisson, normal-normal and binomial-beta conjugate examples. I also found a Cambridge Statistical Processing handout, geared towards Expectation Maximisation - cool stuff.
  • I also came across Gaussian Belief Propagation, by Danny Bickson - haven’t read it yet, but it looks like a wholesome review of methods in stats
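To make the Moore-Penrose / least-squares connection above concrete, here’s a quick NumPy check (variable names are mine) - the pseudoinverse solution matches what `np.linalg.lstsq` returns:

```python
import numpy as np

# Sketch: the Moore-Penrose pseudoinverse yields the same least-squares
# solution as a direct lstsq solve on an overdetermined system.
rng = np.random.default_rng(0)
A = rng.normal(size=(10, 3))   # 10 equations, 3 unknowns
b = rng.normal(size=10)

x_pinv = np.linalg.pinv(A) @ b
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

assert np.allclose(x_pinv, x_lstsq)
```

For rank-deficient systems, both pick out the minimum-L2-norm solution, which is exactly the property mentioned above.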

Logic, Set Theory, Analysis

  • The formalism behind Cantor’s theorem is very reminiscent of Russell’s paradox (Brilliant was useful in this regard)
  • Ordinal arithmetic (+ limit ordinals) was fairly straightforward.
  • I can’t believe I forgot about Cauchy sequences - they are the key link between compactness and completeness.
  • I hadn’t heard of the Generalised Continuum Hypothesis, but it looks quite similar.
  • I think I’ll research more about topology and analysis in general; it looks interesting
  • Fundamental groups and centres seem to be very similar for simple Lie groups $\rightarrow$ Jeremy Kun’s Primer on the Fundamental Group in topology (he explained homotopies and loops very nicely) $\rightarrow$ using it to prove the fundamental theorem of algebra!
  • I finally see the relation between connected spaces and fundamental groups -> the topological long line, fun use of ordinals and lexicographic ordering
  • I read about the subset topology, quotient spaces, tensor products on Jeremy Kun’s blog.

Category Theory

  • Wow, it’s been a long time since I did anything specifically category theory-related (nevertheless, it pops up from time to time - as a last section abstraction on wiki pages). I was just thinking about how homology is like a functor from Top to Grp.
  • I should have known - Jeremy Kun has several expository posts on category theory - I took a look at universal properties (although I couldn’t find anything on existence theorems)
  • I didn’t understand monomorphisms vs injections: the classic remedy - working through examples in various categories - would help. I’ll probably be returning here more often, considering I’m diversifying my study beyond group theory and skipping the middleman, universal algebra. I took a look at categorical products and coproducts, particularly in the context of the direct product vs the direct sum (some of my observations may relate more to universal properties than to limits - as in Jeremy Kun’s blog).
  • Really confusingly, direct limits are colimits and inverse limits are limits (categorically)

Linear Algebra

  • The Gauss-Seidel solver was incredibly intuitive, probably my favourite iterative algorithm (need to compare the pros and cons though) - even though I won’t be using it in my physics engine unless there’s a huge simultaneous constraint. Speaking of Gauss-Seidel solvers, a PPT from robotics at Oxford made a wonderful distinction between Jacobi and Gauss-Seidel (the latter takes the newest values every time). This also introduced me to successive over-relaxation (using a continuous parameter is always fun)
  • Diagonalisation of quadratic forms was obvious in hindsight - signatures and degeneracy too (reminds me of zero divisors in non-division algebras).
  • The matrix determinant lemma is quite useful for small perturbations, or reducing the determinant time. In fact, it’s very reminiscent of optimal tensor contraction, a topic that I was infatuated with in June due to its connection with graph theory.
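Since the Jacobi/Gauss-Seidel distinction came up above, here’s a minimal toy sketch (my own code, not from the Oxford PPT): Gauss-Seidel uses each updated component immediately, and the `omega` parameter gives successive over-relaxation (`omega=1` is plain Gauss-Seidel):

```python
import numpy as np

def gauss_seidel(A, b, iters=50, omega=1.0):
    """Gauss-Seidel / SOR sketch: each component update immediately uses
    the newest values of the earlier components (unlike Jacobi, which
    only uses the previous iteration's vector)."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            # off-diagonal row sum; x[:i] already holds this sweep's updates
            sigma = A[i] @ x - A[i, i] * x[i]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x

# A diagonally dominant system, for which the iteration converges.
A = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = gauss_seidel(A, b)
assert np.allclose(A @ x, b, atol=1e-8)
```

Taking `1 < omega < 2` over-relaxes each step, which is the "continuous parameter" fun mentioned above.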

Computer Science and Programming

This is a gripping tale, for sure.

  • Amazingly, I built a basic constraint solver (generic constraints + Baumgarte stabilisation for ~98% accuracy in distance constraints) in just over an hour! While geometric collision detection is a pain, the physics and linear algebra side is far more rewarding and intuitive, not to mention enjoyable - I made triple pendulums, edge hooks, compressible non-elastic strings - but what other 2D constraints are there beyond distance and angle? dyn4j was an excellent source on introducing the different Jacobians.
  • I was slightly led astray by the allure of global constraint solvers (which older papers seemed to be advocating), due to their more involved linear algebra, but that was before Erin Catto’s 2006 (?) GDC talk - I wasn’t aware that sequential impulse had to be applied several times. I’ll still have to manipulate matrix Jacobians for simultaneous constraints (prismatic) - and I had the idea that I can do the same for resting contacts: wouldn’t that minimise the instability from applying an impulse on one side only?
  • On a side note, I may have to port it to C++ - non-NumPy Python may be too slow for this.
  • Sutherland-Hodgman polygon clipping is used in polygonal masks as well. I’m yet to find a proper explanation of sweep and prune (I’ll just do it myself in that case).
  • The definition of the contact manifold points varies widely, so it’s really annoying using contrasting sources. Once again, I’ve resolved to just do it myself (I still can’t see why a 4-point manifold is ever required).
  • I’m not sure if a generic constraint solver is the most appropriate method (I’ll have to do heavy code reuse anyway). On second thoughts, sweep-and-prune is probably not the way to go: I’ll use quadtrees or some kind of grid method.
  • Unfortunately, I still couldn’t work out the collision detection properly: maybe I’ll revert to SAT (since GJK-EPA seems needlessly time-consuming)
  • I realised my roadblock boiled down to a fundamental misunderstanding: how many contact points are there in a manifold? Consequently, I asked a question on gamedev SE (a very kind moderator always seems to answer them). And the problem lay in the fact that some sources have only one contact point - what I actually want is one on each shape (two in case of a resting contact). So I identified the goal - only problem was that I was stuck at the same place that I was a week ago: how to identify the ‘preferred’ contact (for which p1 + mpv = p2).
  • I’m considering porting the engine to C++, and I have a choice between Eigen and Armadillo (I’ll probably stick with Eigen: it’s already downloaded, header-only, and has a slightly better interface for my needs). Of course, Julia is also an option, but I’m just worried about the graphical side. It also wouldn’t hurt to try force-based and reduced-coordinate solvers - I think it’ll be a great rounding off.
  • I revisited my original SAT test - of course, I had to flip the mpv in half the cases! Collision detection is now fully complete (I added a ContactManifold class), so I can treat it as a black box and proceed to refine my constraint solver without any inhibitions. Unfortunately, it’s looking rather unstable - even with clamping, rotational errors accumulate up to ~10^4! (or it could just be a bug). Much like my CNN project, I’ll make the interface a lot cleaner now (it’s like clearing up your messy workshop after weeks of toil).
  • I read about speculative contacts (courtesy of Erin Catto) in advance, since I don’t have proper continuous collision detection. I can’t wait to start root solving for speculative contacts: I had seen some of those methods in an introductory financial mathematics book on SafariBooks.
  • The pybullet forums are the equivalent of the talkchess forums for chess engines (you regularly see experts stroll the fora!). On a side note, realpython.com is one of the better Python guide sites.
  • Selection usually trumps Bubble on nearly-sorted arrays (pertinent to sort-and-sweep). As a final attempt to save my physics engine, I tried a separate solve for 1 and 2 contact points and adding slop. It’s all right, but still very unstable, so I’ll pause that project - although I have a sneaking suspicion that it’s due to low frame rate. What a journey!
  • I can’t recommend Erin Catto’s GDC lectures enough - they cover a wide variety of topics ranging from numerical methods to collision detection
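As a sketch of the sequential-impulse idea with Baumgarte stabilisation described above - this is my own toy simplification for a 2D distance constraint between two point masses (helper name, `beta` value and setup are all illustrative, not Catto’s actual formulation):

```python
import numpy as np

def solve_distance_constraint(p1, p2, v1, v2, m1, m2, rest_len, dt, beta=0.2):
    """Toy sequential-impulse step for a distance constraint C = |p2-p1| - L.
    The Baumgarte term (beta/dt)*C feeds a fraction of the position error
    back into the velocity solve, which is what stabilises drift."""
    d = p2 - p1
    dist = np.linalg.norm(d)
    n = d / dist                          # constraint normal
    C = dist - rest_len                   # position error
    Jv = n @ (v2 - v1)                    # relative velocity along the constraint
    eff_mass = 1.0 / (1.0 / m1 + 1.0 / m2)   # J M^-1 J^T for point masses
    lam = -eff_mass * (Jv + (beta / dt) * C)
    v1 = v1 - (lam / m1) * n
    v2 = v2 + (lam / m2) * n
    return v1, v2

# Two unit masses joined by a slightly over-stretched unit-length link:
# the corrective impulse should pull them towards each other.
p1, p2 = np.array([0.0, 0.0]), np.array([1.1, 0.0])
v1, v2 = np.zeros(2), np.zeros(2)
v1, v2 = solve_distance_constraint(p1, p2, v1, v2, 1.0, 1.0, 1.0, dt=1 / 60)
assert v1[0] > 0 and v2[0] < 0
```

In a real engine this solve would be iterated over all constraints several times per step, with the impulse accumulated and clamped.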

The Good Stuff (24/8/20 - 30/8/20)

I’ll probably be renaming (and/or splitting up) ‘Group Theory, Calculus and Fun Stuff’, probably into more manageable sections like Number Theory, Group Theory, Abstract Algebra - Category Theory, Linear Algebra and Logic+Set Theory already exist.

Group Theory, Calculus and Fun Stuff

  • The series differential equation for Bernoulli polynomials might come in handy. MathCounterexamples.net is very, very useful - particularly for topology and vector space concepts.
  • The supremum’s like a categorical limit on posets.
  • I saw the proof of the Five-colour theorem (this one was slightly lacking in rigour for the final step that leads to the contradiction, though)
  • I’m actively trying to avoid group theory and representation theory (along with QED and computational chemistry). Correction - although I’m avoiding group theory in the context of physics, I’ll still be exploring sporadic groups (inspired by monstrous moonshine), because I feel they connect a lot of topics - so I narrowed down on the Mathieu groups (and on the side, the j-invariant) $\rightarrow$ the binary Golay code - super interesting, I haven’t done much coding theory before.
  • I had been trying to understand the Conway group through the octonions and the Leech lattice, without much headway, although there was a really good PPT intro by Rob Curtis and the CIMPA Conference - it took the route through sphere packing, which I am familiar with, but I got stuck at Weyl groups (a topic that’s too mathematical for me, along with Coxeter groups).
  • The eightfold path to E8 looks very promising (through the Hurwitz quaternions), as does a paper about the absolute Galois group - wow, I’ll probably have a mountain of unread PDFs by the end of the year
  • I was digging around for some proofs of the Prime Number Theorem - it would probably be best to go through the zeta-function-continuation route, but it is rather heavy on the analysis. Alternatively, I could try the Erdős-Selberg ‘elementary’ formulation
  • I revisited John D. Cook’s excellent blog which encompasses a large number of topics - I really enjoyed posts on elliptic functions, Hamming codes, Bessel functions, egg equations, lgamma, constraints - he also has ‘diagrams’, as in, cheatsheets, providing a succinct summary of different topics (the clearest was the gamma function identities).
  • I found a paper on elliptic curves and the j-invariant - all right for an introduction, but it made a series of unmotivated algebraic manipulations which obscure the underlying symmetry
  • I just realised - Michael Penn’s ‘difference under the integral’ is entirely equivalent to Feynman’s differentiation-under-the-integral parameterisation trick.
  • I looked at a bit more number theory - prime number theorem, Legendre’s conjecture and Bertrand’s postulate (I believe number theory was what got me interested in maths many years ago!).
  • Determinants of tridiagonal matrices yield a cool recurrence relation.
  • Are the Ulam spiral’s bands related to Dirichlet’s theorem? (probably).
  • I looked at a powerful use of generating functions in counting problems (something beyond stats!)
  • Can barycentric coordinates be used beyond triangles?
  • Michael Penn’s PhD thesis was related to vertex operator algebras - amazingly, they have far-reaching applications in string theory - hopefully I can find a friendly explanation. I tried some PPTs, but again, much of it seems rather unmotivated
  • Speaking of this, I’ve reached a point where I can reasonably understand mathematical papers (I used to avoid them and stick to websites, but my library has been growing since July), and although I haven’t come to favour the dense, lemma-rich method of exposition, a balance between that and a friendly exposition is the ideal learning arrangement [me from the future: take for example Clifford Algebras as Filtered Algebras by Yaim Cooper, MIT]. However, I would rank powerpoint presentations by British universities as my favourite introductory source of all.
  • I’m reading about Airy functions now (I skipped the WKB approximation in Griffiths so I didn’t come across it there) - it might be useful to record all the Fourier transforms that I come across in case the reverse pops up somewhere.
  • One thing that I’m going to do more often is search ‘A and B’, to find relationships - ‘Conway group and the Leech Lattice’, ‘Octonions and E8’, ‘Mathieu Groups and the Golay Code’, ‘Icosians and the Janko Group’, ‘E8 and spinors’ - all these yield a trove of papers.
  • I randomly remembered the Zariski topology (as a not-so-pathological non-Hausdorff topology - because of its use in algebraic geometry) - so I decided to pursue that further.
  • Summation by parts came as a surprise to me
  • The Golay code also led me to some basic coding theory (dual and parity-check matrices).
  • I’m quite excited to learn about the absolute Galois group (my path is practically forged by a MathOverflow post on the most beautiful objects and relationships in mathematics!).
  • I found some old friends, the Lindemann-Weierstrass theorem and the Gelfond-Schneider Theorem (once I look at transcendentality measures, I’ll read some proofs).
  • cp4space doesn’t disappoint again! - a great post on the Leech lattice (I’m now reasonably confident of what it is, but I want to know how it relates to, say, octonions)
  • I thought it’d be fun to get back into number theory - where better to start than the Lindemann-Weierstrass theorem? (+ proof, this time!).
  • Transitive groups make total sense now - if you want an intro that summarises all the main features before diving into Wikipedia, Wolfram MathWorld is probably the best choice (if there are no blogs :p).
  • Took a look at Zorn’s lemma and its application to transfinite induction (which just looks like strong induction on ordinals).
  • The inverse scattering transform looks very interesting, but most resources are too hard (I’m not up to date on modern diff eq methods).
  • I needed reminding of the derivation of Stirling’s approximation using Laplace’s method (whose non-rigorous proof seemed very natural on third glance).
  • I read a bit about Bessel functions. I also came across some awesome John Baez articles on the Leech Lattice, as well as on the discrete harmonic oscillator spectrum due to the phase space constraints, so a ‘TWFIMP’ binge is likely, I think.
  • I reread projective linear groups (I didn’t know that PGL was isomorphic to PSL if the latter contained all the field roots of unity).
  • I finally looked at the Sylow theorems - very useful in the context of finite simple and solvable groups, which I love because of Galois theory
  • I should try using the Coq proof assistant (looks more fun than Agda at any rate)
  • Found a very cool example of a non-constructive proof (irrational^irrational)
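The tridiagonal determinant recurrence mentioned above can be sketched quickly (function name is mine): with main diagonal $a$, superdiagonal $b$ and subdiagonal $c$, the leading-principal-minor determinants satisfy $f_k = a_k f_{k-1} - b_{k-1} c_{k-1} f_{k-2}$.

```python
import numpy as np

def tridiag_det(a, b, c):
    """Continuant recurrence for the determinant of a tridiagonal matrix:
    f_k = a[k] * f_{k-1} - b[k-1] * c[k-1] * f_{k-2}, with f_0 = 1."""
    f_prev, f = 1.0, a[0]
    for k in range(1, len(a)):
        f_prev, f = f, a[k] * f - b[k - 1] * c[k - 1] * f_prev
    return f

# Cross-check against a dense determinant.
a, b, c = [2.0, 2.0, 2.0], [1.0, 1.0], [1.0, 1.0]
M = np.diag(a) + np.diag(b, 1) + np.diag(c, -1)
assert np.isclose(tridiag_det(a, b, c), np.linalg.det(M))
```

This is O(n), versus O(n^3) for a generic dense determinant.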

Linear Algebra

  • Levinson recursion for Toeplitz matrices (wonder where I’ll need those?).
  • Singular Value Decomposition is awesome - there’s geometric intuition on Wiki (incidentally, hyper-ellipsoids popped up in PCA too - can they be linked?).
  • I read the comprehensive page on the Rayleigh quotient - incredibly, it links the variational principle, quantum matrix elements, PCA and Sturm-Liouville theory! (I didn’t quite get the min-max theorem, I’ll redo that later).
  • After some digging around, I found out how to find a matrix representation of a linear operator (just use the basis vectors) - which makes finding the eigenvalues straightforward now.
  • How do you permute a sparse matrix to reduce Cholesky fill-in?
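The "just use the basis vectors" trick above for turning a linear operator into a matrix, sketched in NumPy (helper name is mine): the j-th column of the matrix is the operator applied to the j-th basis vector.

```python
import numpy as np

def operator_matrix(T, n):
    """Matrix of a linear operator T on R^n in the standard basis:
    column j is T(e_j)."""
    return np.column_stack([T(e) for e in np.eye(n)])

# Example: a 90-degree rotation of the plane, given only as a function.
rot = lambda v: np.array([-v[1], v[0]])
M = operator_matrix(rot, 2)
assert np.allclose(M, [[0, -1], [1, 0]])
assert np.allclose(M @ [1.0, 2.0], rot(np.array([1.0, 2.0])))
```

Once the matrix is in hand, eigenvalues of the abstract operator are just `np.linalg.eigvals(M)`.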

Physics

  • There HAS to be a deeper reason why differentiating and integrating Grassmann variables yields the same result (probably delving into exterior algebra) - on that note, I’d love to see a unification of differentiation and integration somehow (or formulate it myself!).
  • Quantum dots - most explanations don’t really involve anything quantum?
  • Penguin Feynman Diagrams led me back to Quantum Diaries, very expository, especially regarding the Higgs (being ‘eaten’ by the W’s) and its use in Feynman diagrams $\rightarrow$ he cited the ‘spinor bible’, an excellent amalgamation of spinors, Feynman diagrams and supersymmetry (three of my favourite concepts in quantum!).
  • Measuring the weak coupling constant by comparing decay rates was very cool.
  • I read an outstanding Physics SE answer on why it’s not necessary for Boltzmann’s constant to have dimensions (I’m not too good at thermodynamics, but I think I can glance at a cheat sheet to remedy that).
  • I noticed that spherical wave transformations (a type of conformal mapping) work a lot like the modular group action.
  • Interesting how solitons can be formed by the non-linear Kerr effect cancelling out dispersive effects - I still have to explore the maths behind it though
  • I hadn’t seen the use of de Broglie’s relation to justify stationary orbits as standing waves (that was a long-standing qualm of mine)

Computer Science

  • Touched up on how DFT is a polynomial evaluation at the roots of unity.
  • Interesting how Sudokus can be represented as a graph colouring problem.
  • I checked out SymPy’s trace tech, but it’s not too sophisticated.
  • The Phong reflection model is rather basic - how can I upgrade it? (I should see Sebastian Lague’s excellent use of shaders in Unity - he makes videos that are both entertaining and informative).
  • Minimising FLOPs in matrix product traces reminded me of the optimal tensor contraction order that I did several months back (graph-based methods are, of course, my favourite!).
  • I read about the simplex method for linear programming on Brilliant - again, the method using slack variables reminded me of a similar procedure for finding the furthest polytope vertex in my Rust physics engine (I’ll be making a new one in Python, now that I’ve found a good graphics lib!).
  • The nearest-neighbour algorithm is TSP related? (not ML!) $\rightarrow$ couple of TSP heuristics, including one that I thought of when I was 8! (reversing the order of a subset).
  • There is a nice definition for the inverse Ackermann function - now I see how it arises: from repeated function composition (like log*) -> on the other hand, cp4space introduced his sigma function which dwarfs TREE (I liked the combinatorial logic system with oracles). I liked the ‘strategy-stealing’ system for poset games
  • Langton’s ant was super-cool (I’ll try it with a hexagonal grid).
  • Out of VisPy, Kivy and Pyglet, the last suited my purpose very well. I got graphics and collision detection up and running (impulse-based resolution still in progress - Randy Gaul’s page, one of the very first I visited, is not the greatest for the mathematical derivation: I found a different set of slides).
  • Most of my old sources are solid: I’m going to do proper broad phase (sort and sweep - Nilson Souto nails it on Toptal - his is far more relevant for 2D). Allen Chou’s excellent series will also be very useful.
  • I opted for SAT instead of GJK this time (people don’t seem to mention the fact that SAT can be used for general convex polygons too) - but the major improvement has been switching over to NumPy (sometimes a 15:1 code-line saving over Rust): extreme ease of use and speed.
  • I’m still a NumPy veteran thanks to CNNs, so I was able to not just vectorise, but matrix-ise the SAT implementation.
  • After watching a video by pontypants, I feel suddenly inspired to use Unreal for a game
  • I fixed the flickering by employing batch drawing + double buffering (pyglet is so low-level that it leaves double buffering optional).
  • I took a look at Minkowski Portal Refinement, but contrary to popular belief, it doesn’t provide the contact manifold (oh well, I’ll have to deal with clipping now).
  • Discrete vs Continuous collision makes sense now - spatial hashing might be worth reading about just in the context of robust data structures
  • Pampy looks very interesting for pattern matching (Python is sorely lacking in that, but I’m not sure how well it would integrate with duck typing)
  • The GDC slides (by Erin Catto and Dirk Gregorius) are excellent resources, much better than lots of half-baked sites. I also found some slides for a uni course: they derived the Eberly formula clearly, but it doesn’t generalise well to universal constraints
  • I read Real-Time Collision Detection, but it feels a bit introductory. I liked ‘How to Stop My Constraints from Blowing Up’.
  • I found dyn4j’s blog as well, which gives a decent (but simplified) overview of the major algorithms, but more importantly, is the first to properly explain clipping to find contact points in SAT.
  • I’m working on understanding sequential impulse (mostly from Erin Catto). Using my poor SAT contact points, I managed to add rotations (angular momentum was, of course, the wrong way) to my (very unstable!) physics engine!
  • Today I had my first taste of constraint-based solving (introduced by Nilson Souto [Part III], dyn4j and the very helpful gamedev SE - the first one on which I asked a question!).
  • Corrected a slight problem: I was using the plain vertex average rather than the area centroid (so I had to gather the unwieldy “shoelace formula” stuff - my method of translating the vertices to put the centroid at the local origin is still a great idea).
  • I implemented GJK and EPA as well for a laugh - I just can’t seem to get the contact points, though (and most GitHub physics engine routines are practically obfuscated). Unfortunately I hit a wall in my EPA refinement - I’ll probably have to invoke barycentric coordinates somehow, but I still can’t see it.
  • A paper by Marijn Tamis was very helpful in constraint-based solving: I was introduced to lambda clamping, global solvers and distance constraints
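A minimal sketch of the shoelace-formula centroid mentioned above (my own helper, not from any of the cited sources) - unlike the plain vertex average, this gives the true area centroid of any simple polygon:

```python
import numpy as np

def polygon_centroid(verts):
    """Area centroid of a simple polygon via the shoelace formula."""
    v = np.asarray(verts, dtype=float)
    x, y = v[:, 0], v[:, 1]
    xn, yn = np.roll(x, -1), np.roll(y, -1)   # next vertex, wrapping around
    cross = x * yn - xn * y                   # per-edge shoelace terms
    area = cross.sum() / 2.0
    cx = ((x + xn) * cross).sum() / (6.0 * area)
    cy = ((y + yn) * cross).sum() / (6.0 * area)
    return np.array([cx, cy])

# For a triangle the centroid coincides with the vertex average,
# which makes a handy sanity check.
tri = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]
assert np.allclose(polygon_centroid(tri), [1.0, 1.0])
```

The two differ as soon as the vertices are unevenly spaced, which is exactly the bug described above.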

Statistics and Machine Learning

  • There’s a very clear example on conjugate priors on Wiki, really cleared it up.
  • I like to imagine Bayesian methods as a duality between parameters and data (I feel they’re interchangeable).
  • I’d like to find out more about recursive Bayesian estimation.
  • I somehow returned to jakevdp! His In-depth series is almost flawless - I learnt about Random Forests, Naive Bayes Classification, PCA and Linear Regression (properly) - I should read his book on SafariBooks (one problem is that he uses sklearn a lot, but doesn’t show the underlying algorithms - but it’s fine for an intro and some concrete examples).
  • A quick jbstatistics video cleared up some confusion regarding the negative binomial distribution (I knew it was a generalisation of the geometric distribution!) - speaking of which, confusion matrices are useful for precision and recall (and F1 - first real use of the harmonic mean)
  • The old stick-breaking probability question had a set of nice answers on MathOverflow - the best one was joining the two ends and seeing if the points lie in a semicircle.
  • I looked at some intuitive explanations of SVD and singular values, on gregorygunderson.com’s very concise blog, as well as jeremykun (!), who related the notions of data storage and linear transformations - what’s a ‘stick-breaking’ multinomial?
  • I need to read about Empirical Bayes and Randomized Singular Value Decomposition on Gregory Gundersen’s excellent site
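A quick Monte Carlo sanity check of the semicircle trick above - the classical result is that $n$ uniform points on a circle lie in a common semicircle with probability $n/2^{n-1}$, i.e. 3/4 for three points (all names here are my own toy code):

```python
import numpy as np

# Sample many triples of uniform angles; the points all fit in some
# semicircle iff the largest circular gap between them is at least pi.
rng = np.random.default_rng(42)
n_trials = 200_000
theta = np.sort(rng.uniform(0, 2 * np.pi, size=(n_trials, 3)), axis=1)
gaps = np.diff(theta, axis=1)                      # interior gaps
wrap = 2 * np.pi - (theta[:, -1] - theta[:, 0])    # wraparound gap
in_semi = np.maximum(gaps.max(axis=1), wrap) >= np.pi
print(in_semi.mean())   # close to 0.75
assert abs(in_semi.mean() - 0.75) < 0.01
```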

The Good Stuff (17/8/20 - 23/8/20)

If anybody comes across an error, please feel free to correct me!

Maths

  • Out of all the subjects, I think Brilliant excels most at Number Theory (although Stanford’s Cryptography website comes close) - I tried my hand at some problems using Euler’s totient theorem, the rather obvious rational root theorem, and the Extended Euclidean Algorithm (super important with Bézout’s identity for RMO) together with the Chinese Remainder Theorem (which, on the surface, looks extremely fancy, but is really just a uniqueness theorem)
  • Mobius strips are the clearest examples of line bundles - but what purpose do they serve?
  • The Mobius inversion formula looks promising, but I’m yet to find some good problems utilising it.
  • I read a lovely high-level introduction to monstrous moonshine and the sporadic groups.
  • I also found a great Maths SE answer on when to use Cholesky, LU(P), QR and SVD (in short, there’s a tradeoff between numerical stability and speed; also, QR is generally used for orthogonalisation)
  • Sylow’s theorems kinda remind me of the twin prime conjecture, I don’t know why.
  • Laplace transforms are just continuous complex series, if you think about it.
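The Extended Euclidean Algorithm and CRT mentioned above fit in a few lines (function names are mine):

```python
def ext_gcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)
    (the Bezout coefficients)."""
    if b == 0:
        return a, 1, 0
    g, x, y = ext_gcd(b, a % b)
    return g, y, x - (a // b) * y

def crt(r1, m1, r2, m2):
    """Chinese Remainder Theorem for coprime moduli: the unique x mod m1*m2
    with x = r1 (mod m1) and x = r2 (mod m2)."""
    g, p, q = ext_gcd(m1, m2)
    assert g == 1, "moduli must be coprime"
    return (r1 * m2 * q + r2 * m1 * p) % (m1 * m2)

x = crt(2, 3, 3, 5)   # x = 2 (mod 3) and x = 3 (mod 5)
assert x % 3 == 2 and x % 5 == 3
```

The "uniqueness theorem" reading is visible here: given Bezout coefficients for the moduli, the solution is pinned down mod `m1*m2`.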

Linear Algebra

  • Silly me! I thought the only way to diagonalise a matrix was by pre-post multiplying with the transformation matrix - instead, just put the eigenvalues on the diagonal! (in a computational context where I’d already computed the eigenvalues)
  • Now I see why Gaussian elimination is a killer - triangular matrices are equally simple for determinants and eigenvalues.
  • I’ll need a better source than Wikipedia on the Jacobi eigenvalue algorithm, since it’s incorrect.
  • The spectral theorem was a good induction exercise.
  • Obviously separable matrices have rank 1 (I hate that terminology)
  • OK, the Jacobi eigenvalue algorithm involves using Givens rotations to zero out entries - fairly straightforward
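A toy sketch of the Jacobi eigenvalue algorithm as described above - cyclically annihilate the off-diagonal entries of a symmetric matrix with Givens rotations (my own unoptimised version; a real implementation would accumulate the rotations and use a proper convergence test):

```python
import numpy as np

def jacobi_eig(A, sweeps=10):
    """Jacobi eigenvalue sketch for a symmetric matrix: each Givens
    rotation zeroes one off-diagonal entry; sweeps over all pairs
    drive the matrix towards diagonal form."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < 1e-12:
                    continue
                # rotation angle that annihilates A[p, q]:
                # tan(2*phi) = 2*A[p,q] / (A[q,q] - A[p,p])
                phi = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(phi), np.sin(phi)
                G = np.eye(n)
                G[p, p], G[q, q] = c, c
                G[p, q], G[q, p] = s, -s
                A = G.T @ A @ G
    return np.sort(np.diag(A))

A = np.array([[4.0, 1.0, 0.5], [1.0, 3.0, 0.2], [0.5, 0.2, 1.0]])
assert np.allclose(jacobi_eig(A), np.linalg.eigvalsh(A), atol=1e-8)
```

Each rotation can reintroduce small entries elsewhere, which is why repeated sweeps are needed; cyclic Jacobi nevertheless converges quadratically.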

Physics

  • I completely understood the Hellmann-Feynman theorem now (although the expectation values of 1/r^n are too boring).
  • I have to investigate ‘instantaneous multipoles’, in the context of dispersion forces.
  • minutephysics’ video on time dilation and length contraction is arguably the clearest there is
  • I solved for the equations of motion for a projectile with air resistance (fairly simple - but always use Newtonian formalism!).
  • Wave function collapse is also clear now (Griffiths’ book is quite poor in this regard) - what is decoherence?
  • Goldstone bosons arising due to spontaneous symmetry breaking is so cool (in phi-fourth theory and harmonic oscillators, they make the constant squared so that it’s positive!)
  • Now it’s fairly clear why bosons ‘transform’ under the adjoint representation - that’s how the gauge field changes!
  • I’d forgotten something basic about electric fields - so I decided to go through Griffiths Chapters 1-3 again (as Andrew Dotson said, it deserves to be a national treasure. It’s a shame that the same cannot be said for his QM book).
  • Scattering looks a bit more interesting now because of HEP in P&S - I’ll go through Rutherford scattering first in Griffiths.
  • Doppler broadening of spectral lines was interesting (and something which I’d never thought of before)
  • I’d love to see spherical harmonics paired with FFT in some computational problem; a link between geometric topology and Feynman diagram symmetry would be awesome too
  • Gamma matrices won’t form a basis for $\mathbb{C}^{4\times4}$ (just do some dimensional analysis)

Chemistry

  • It’s been confirmed - electrons do occupy a superposition of all orbitals (I knew this, but had a niggling doubt regarding the formalism). The shielding effect is interesting - I wonder why there isn’t more literature on it?
  • Maybe I should research a bit about ligand field theory (it doesn’t feel like a substantial improvement, though).
  • My hypothesis is that $nd^5$ and $nd^{10}$ are more stable because of Hund’s first and third rules respectively. I found a great explanation on why $\text{S=S}$ is stronger than $\text{O=O}$, and why antibonding orbitals are more antibonding than bonding orbitals are bonding.
  • I tried learning about DFT, but I decided that a solid grounding in practical Hartree-Fock SCF is in order first, so I discovered a good PPT by emich.edu and ESQC. IvoFilot.nl had an awesome practical post on SCF linear algebra methods!
  • I’m wondering how I can separate the overlap and 4-point matrices (in the spirit of a Gaussian filter, but obviously not the same method)
  • $d_{z^2}$- is actually an abbreviation for $d_{3z^2 - r^2}$ (sneaky chemists) - that’s why you only need 5 d-orbitals
  • I made a basic CGTO basis set program, with assistance from my second favourite book of the month, Szabo and Ostlund. Everything about Hartree-Fock is explained in detail - they don’t skim over anything, they have appendices on the computational aspect, they have EXAMPLES, even a friendly tone - and, of course, it was under my nose the whole time I was trying to find a decent source on Hartree Fock (I had downloaded it last year).
  • I repeated my least-squares basis-function fit for GTOs - turns out it’s an optimisation problem called ‘basis pursuit’. So, ab initio, I was able to compute the ground state energy for $\text{HeH}^+$ ($\text{H}_2^+$ can be deduced by symmetry arguments - it didn’t deserve a full-blown matrix solve)
  • I wasn’t aware of ‘diffuse functions’, with small $\zeta$: I’ll probably never get round to using them, unfortunately
  • I managed to understand the theory behind DFT - perhaps a good basis set is all you need, given the equations. The ABC of DFT is a decent introduction, but it is surprisingly difficult to find practical DFT code the same way you can for vanilla HF. I think compphys.go.ro’s GitHub page might supplement his already excellent articles with some concrete code.
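As a sketch of the least-squares GTO-fitting idea above (in the spirit of STO-nG, but on a uniform radial grid with illustrative exponents of my choosing, not optimised values from any published basis set):

```python
import numpy as np

# Fit the Slater radial function exp(-r) by a linear combination of
# fixed-exponent Gaussians exp(-zeta * r^2) via least squares.
r = np.linspace(0.0, 5.0, 200)
slater = np.exp(-r)
zetas = [0.1, 0.4, 1.5, 6.0]                    # assumed, illustrative exponents
G = np.stack([np.exp(-z * r**2) for z in zetas], axis=1)
coeffs, *_ = np.linalg.lstsq(G, slater, rcond=None)
fit = G @ coeffs
assert np.mean((fit - slater) ** 2) < 5e-3      # residual is small
```

The stubborn part is the cusp at r = 0, where every Gaussian has zero slope - that is exactly why real basis sets optimise the exponents (non-linearly) rather than only the coefficients.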

Computer Science

  • Butcher tableaux for RK methods are a neat representation - is there a way to find whether a differential equation is stiff other than trying explicit methods on it?
  • Elementary (one-dimensional) cellular automata are cool (they’re on the Cambridge train station!), so I’ll try making a parallel implementation of all of them.
  • What are the advantages of quad vs trapz vs simpson? I was thinking of numerical integration for my HFSCF program, but I’m leaning towards analytical solutions for GTOs (same problem with the kinetic operator). I might use this as an opportunity to use Julia for a practical project because of the way it handles special matrices, although I’m not crazy about the VSCode support.
  • All right, using discretisation, I managed to solve for the hydrogen energies to about 99.8% accuracy (it’s not as satisfying as HF, nor does it generalise well, but it’s good practice for finite element methods).
  • Do neural ODE solvers have an edge in any way here?
  • Also, I guess I’ll be using Householder rotations for diagonalisation, unless Julia also handles that.
  • I read a bit about Gaussian integral evaluation techniques - I should look into FFT as a general method for such cases.
  • I’d really like to find out more about grid-based methods for ODE-based problems - hopefully some FFTs to the reciprocal space as well.
  • I found a guide called ‘FORM for pedestrians’ - a comprehensive intro, even containing the Dirac-algebra trace formalism! (I should try to clone it as my first true command-line program). It’ll definitely be an excellent companion for any HEP course
  • I also need to find out more about graph embeddings
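On the Butcher-tableau bullet: what makes the representation neat is that one generic stepper covers every explicit RK method - swap the tableau, get a different integrator. A small sketch (classic RK4 tableau, scalar ODE for simplicity):

```python
import numpy as np

# Classic RK4 written as a Butcher tableau (A, b, c); any explicit
# tableau can be dropped in without touching the stepper below.
A = np.array([[0.0, 0.0, 0.0, 0.0],
              [0.5, 0.0, 0.0, 0.0],
              [0.0, 0.5, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
b = np.array([1/6, 1/3, 1/3, 1/6])
c = np.array([0.0, 0.5, 0.5, 1.0])

def rk_step(f, t, y, h):
    """One explicit Runge-Kutta step driven by the tableau (A, b, c)."""
    k = np.zeros(len(b))
    for i in range(len(b)):
        # Strictly lower-triangular A: each stage only sees earlier stages
        k[i] = f(t + c[i] * h, y + h * np.dot(A[i, :i], k[:i]))
    return y + h * np.dot(b, k)

# Sanity check: integrate y' = y from y(0) = 1 up to t = 1 (exact: e)
y, t, h = 1.0, 0.0, 0.01
for _ in range(100):
    y = rk_step(lambda t, y: y, t, y, h)
    t += h
```

As for the stiffness question: a common heuristic is to inspect the spread of the Jacobian’s eigenvalues (a large stiffness ratio) rather than waiting for an explicit method to blow up.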

    Machine Learning

  • I’m not sure how important faster sampling is - I’ll have to consult a better source. I read a bit about CMA-ES; hopefully that helps with the rest of the EAs
  • Lilian Weng has one of the clearest entries on VAEs of all shapes and sizes - I learnt about the VQ-VAE this time.
  • Back to Brian Keng - the Expectation-Maximization Algorithm, Regularization and Linear Regression Probabilistically, Marginal Likelihood in VAEs, Autoregressive Autoencoders (see: MADE) - even Hypothesis Testing, where he made a crucial distinction between ‘trapping’ with probability, and probabilistically trapping.
  • I saw why the denominator in the usual Bayesian formula is called the ‘evidence’ - I suppose it sort of makes sense, although it feels like abuse of terminology
  • There was a word2vec paper (the original?) that I never got round to reading :(
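On the ‘evidence’ bullet: the name makes more sense once you see that the denominator is the probability of the data itself, marginalised over all hypotheses. A toy discretised coin-bias example (the setup is mine, not taken from any of the posts above):

```python
import numpy as np

# Hypotheses: the coin's bias theta, discretised on a grid
thetas = np.linspace(0.01, 0.99, 99)
prior = np.full_like(thetas, 1.0 / len(thetas))  # uniform prior

# Data: 7 heads, 3 tails
heads, tails = 7, 3
likelihood = thetas**heads * (1.0 - thetas)**tails

# The 'evidence' P(D) is the normaliser: the likelihood of the data
# averaged over the prior
evidence = np.sum(likelihood * prior)

# Bayes' rule
posterior = likelihood * prior / evidence
```

Dividing by the evidence is exactly what makes the posterior sum to 1, and since it measures how probable the data is under the model as a whole, it’s also the quantity you’d compare between competing models.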