
Friday, 23 August 2019

Quaternions Visualised

See Neil Wildberger on Quaternions and Rational Trigonometry.


This idea is also interesting with regard to sound modelling of the process of computing solutions to differential equations, as discussed in Aristotle on The Continuum.


See


These last two videos will give you a motivation for understanding the idea of a conformal mapping of the complex plane:


See John H. Conway on Weird Programming Languages. And the connection with numerical solutions of differential equations shows up here:


It looks to me like there might be a kind of duality in the above transformation. It might lend itself to illustration by this view of Fourier transforms. The relevant part of this idea is that of using a parametric formulation of the 2D curve you are calculating the transform of: this gives you two dependent functions, x and y, say, of a third independent variable z. The reason for this is that it is dependencies between x and y which lead to singularities in the derivatives, where the "slope" is infinite. But precisely where these singularities occur on the curve is merely an accident of the chosen basis: the particular orthonormal pair [i,j], which is in some sense arbitrary. This doesn't happen when the derivatives are taken with respect to a separate independent variable, which is orthogonal to i and j. The reason that this works is that you can often quite easily find a relation between the derivatives w.r.t. the two dependent variables, and since their basis vectors are orthogonal, when one of them blows up towards a vertical slope, the other is close to zero, and so is well-behaved there.

Now, if you look at the way the Euclidean measure of angle works, the half-turn, it parameterises the rational points on the circumference of a circle by units of exactly one right angle, and it does so symmetrically about the 45° line, so the parameter values at points above and below the 45° line are reciprocals of each other. See the fourth lecture of Wildberger here: Neil Wildberger on Quaternions and Rational Trigonometry, and this video on the complex exponential function, from 18 minutes 33 seconds, and this video on the idea of the exponential function being the fixed-point of the derivative function.
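
To pin down why the parametric formulation tames these singularities, here is a minimal worked example of my own (the unit circle with its standard trigonometric parameterisation, not taken from the videos above): the slope dy/dx blows up where the tangent is vertical, but the parametric derivatives never do, and the reciprocal symmetry about the 45° line is just the identity for complementary angles.

```latex
\begin{align*}
x(t) &= \cos t, & y(t) &= \sin t,\\
\frac{dy}{dx} &= \frac{dy/dt}{dx/dt} = \frac{\cos t}{-\sin t}
  \;\longrightarrow\; \infty \quad\text{as } t \to 0,\\
\frac{dx}{dt} &= -\sin t, & \frac{dy}{dt} &= \cos t
  \quad\text{(finite for all $t$)},\\
\tan(90^{\circ} - \theta) &= \frac{1}{\tan\theta}
  \quad\text{(reflection in the $45^{\circ}$ line inverts the slope)}.
\end{align*}
```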


This is basically the idea behind implicit differentiation which Nancy Pi explains rather beautifully here:
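
As a minimal worked instance of the technique (my own choice of curve, not necessarily the one in the video): for the unit circle the dependency between x and y is only given implicitly, and differentiating the relation itself, rather than first solving for y, gives the slope directly.

```latex
x^2 + y^2 = 1
\;\Longrightarrow\;
2x + 2y\,\frac{dy}{dx} = 0
\;\Longrightarrow\;
\frac{dy}{dx} = -\frac{x}{y}.
```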


On the summing of infinite series of reciprocals of squares, see this video on how Euler calculated a closed form for the value of the Riemann zeta function at 2.
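
In outline, Euler's argument (which I assume is what the video walks through) treats sin(x)/x as an "infinite polynomial", factors it over its roots at ±π, ±2π, ..., and compares the coefficients of x²:

```latex
\frac{\sin x}{x}
  = \prod_{n=1}^{\infty}\left(1 - \frac{x^2}{n^2\pi^2}\right)
  = 1 - \frac{x^2}{3!} + \frac{x^4}{5!} - \cdots
\;\Longrightarrow\;
\sum_{n=1}^{\infty}\frac{1}{n^2} = \frac{\pi^2}{6} = \zeta(2).
```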


And here is a more "left-brained" treatment, but still visualisable; you just have to visualise processes of algebraic operations on equations, rather than processes of geometric operations on points, ...


At 8 minutes 35 seconds you see that by starting with the idea of some sort of continuum, and deriving operations from the repeated process of division, you can recover basic elements such as the integers 1, 2, 3, 4 and 5, but underlying the idea of each element is the idea of a continuum operated on by operators which have names, ... What are the names? Well, they're things like 1, 2, 3, 4 and e. This is what I meant by the phrase "sound modelling of the process of computing [numerical] solutions to differential equations". This is the idea underlying the use of the Fast Fourier Transform to efficiently compute products of large integers: multiplying two integers is a convolution of their digit sequences, which is a sort of smearing addition operation, in the same way that long multiplication works by adding shifted copies of the multiplicand together, and in Fourier space that convolution becomes simple pointwise multiplication. It is the linearity of the Fourier basis functions which makes this representation possible. See
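
Here is a minimal sketch of that idea in Python (assuming NumPy; the function name and the base-10 digit encoding are just my choices for illustration): the decimal digits of each integer are treated as polynomial coefficients, the product of the polynomials is the convolution of the digit sequences, and the FFT turns that convolution into cheap pointwise multiplication, after which the carries are propagated just as in long multiplication.

```python
import numpy as np

def fft_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers via FFT-based convolution of their digits."""
    # Least-significant-digit-first coefficient vectors.
    da = [int(d) for d in str(a)][::-1]
    db = [int(d) for d in str(b)][::-1]
    n = len(da) + len(db) - 1            # length of the linear convolution
    size = 1 << (n - 1).bit_length()     # pad to a power of two for the FFT
    # Convolution in digit space = pointwise multiplication in Fourier space.
    fa, fb = np.fft.fft(da, size), np.fft.fft(db, size)
    conv = np.rint(np.real(np.fft.ifft(fa * fb))).astype(int)[:n]
    # Propagate carries: turn the "smeared" digit sums back into base-10 digits.
    result, carry = 0, 0
    for i, c in enumerate(conv):
        total = int(c) + carry
        result += (total % 10) * 10 ** i
        carry = total // 10
    return result + carry * 10 ** n

assert fft_multiply(314159, 271828) == 314159 * 271828
```

The pay-off is asymptotic: the direct convolution costs O(n²) digit operations, while the FFT route costs O(n log n), which is why it only becomes worthwhile for very large integers.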


So this idea of modelling processes of computing numerical solutions to differential equations gives models of integers as functions, like 2^t = e^(t ln 2), and that allows us to treat functions and their inverses symmetrically, by using a single independent (i.e. orthogonal) variable, and the chain rule is then the connection between the derivative of the function and the derivative of its inverse. Here's a nice explanation of how abstract vector spaces can be used to represent both the functions as operators, and the spaces of things upon which those self-same operators act.
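
The chain-rule connection can be written out in a line or two; applying it to 2^t = e^(t ln 2) recovers the familiar derivative of the base-2 logarithm (a routine calculation, included here just to make the claim concrete):

```latex
f\!\left(f^{-1}(y)\right) = y
\;\Longrightarrow\;
f'\!\left(f^{-1}(y)\right)\,\bigl(f^{-1}\bigr)'(y) = 1
\;\Longrightarrow\;
\bigl(f^{-1}\bigr)'(y) = \frac{1}{f'\!\left(f^{-1}(y)\right)};
\qquad
f(t) = 2^{t} = e^{t\ln 2},\;
f'(t) = 2^{t}\ln 2
\;\Longrightarrow\;
\bigl(\log_2\bigr)'(y) = \frac{1}{y\ln 2}.
```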


And this gives us the idea of a duality between the different representations of functions as vectors and vectors as functions.


Under this duality, differentiation is an operation which rotates a representation of a function in such a way that the odd components of the function are transformed to even components, and vice versa. In other words, differentiation is a kind of phase-shift. The kinds of vector spaces which have this duality property are called inner product spaces. An Inner Product Space is a Vector Space equipped with an inner product: a scalar-valued pairing of vectors which obeys certain conjugate-symmetry conditions, restricting the way scalar multiplication acts on vector elements so as to preserve this duality (and inducing a norm, a notion of length). The scalars are often complex numbers, and the vectors themselves can even be two-vectors of complex values, as is the case with spin one-half systems in Quantum Mechanics, which are represented by spinors whose inner product has a Hermitian symmetry that preserves the conjugacy requirements of the inner product space axioms.
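
To make the phase-shift claim concrete (this is just the standard Fourier-series calculation, nothing specific to any source above): differentiating a Fourier component multiplies it by ik, and multiplication by i is a quarter-turn in the complex plane, so sine (odd) components are carried to cosine (even) components and vice versa.

```latex
\frac{d}{dx}\,e^{ikx} = ik\,e^{ikx}, \qquad i = e^{i\pi/2},
\qquad
\frac{d}{dx}\sin(kx) = k\cos(kx) = k\sin\!\left(kx + \tfrac{\pi}{2}\right),
\qquad
\frac{d}{dx}\cos(kx) = -k\sin(kx) = k\cos\!\left(kx + \tfrac{\pi}{2}\right).
```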

The idea I have is that we can model the "whole stack", starting with some species of complex analysis and associated operators for differentiation and integration, down to a successor functional on the integers, as a thing called a Cartesian Closed Category, starting with just the exponential functor. And this will allow us to build an arbitrarily deeply nested series of inner models for operations on any part of the continuum we can meaningfully carve out. This will allow us to investigate, for example, under what conditions the axiom of infinity plays nicely with the power-set axiom. Cartesian Closed Categories are part of Category Theory, a.k.a. Abstract Nonsense. Here's a nice introduction to the idea of a functor, which is one of the principal abstractions:
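
Before the video, here is a minimal sketch in Python of two of the ingredients named above (the names curry, uncurry and fmap are Haskell's conventions, chosen by me for illustration, not anything from the video): the currying correspondence is what makes the function space an "exponential object" in a Cartesian Closed Category, and fmap shows the exponential functor acting on an ordinary function by post-composition.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")
D = TypeVar("D")

def curry(f: Callable[[A, B], C]) -> Callable[[A], Callable[[B], C]]:
    """An arrow out of a product, f : A x B -> C, becomes an arrow into an
    exponential, A -> C^B.  This bijection is the defining property of a
    Cartesian Closed Category."""
    return lambda a: lambda b: f(a, b)

def uncurry(g: Callable[[A], Callable[[B], C]]) -> Callable[[A, B], C]:
    """The inverse direction of the same correspondence."""
    return lambda a, b: g(a)(b)

def fmap(g: Callable[[C], D], h: Callable[[B], C]) -> Callable[[B], D]:
    """The (covariant) exponential functor (-)^B in action: an ordinary
    function g : C -> D is lifted to act on function-space elements
    h : B -> C by post-composition, preserving identities and composition."""
    return lambda b: g(h(b))

# Usage: the two directions of currying are mutually inverse on this example,
# and fmap lifts "double" to act on the function "successor".
add = lambda a, b: a + b
assert curry(add)(2)(3) == 5
assert uncurry(curry(add))(2, 3) == add(2, 3)
assert fmap(lambda c: 2 * c, lambda b: b + 1)(10) == 22
```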


See Aristotle on The Continuum, which gives some of the very basic theory. Now, people of a more practical bent might think this is all a bit airy-fairy, so they should have a look at how it applies to industrial robotics. First, see my comments on this video, about how we can make a low-cost computer which computes these operations:


Those comments will also explain how I think we could quite practically and feasibly build a machine, at moderate cost, which formally verifies its own operation as it progresses.

Now see how we could build such control systems into industrial robotics and process control systems, including safety-critical areas such as flight control systems for passenger aircraft. See my comments on this video:


Another application of this would be "phased array LiDAR", using piezo-electro-optical switching. See Synthetic Aperture Radar and Feynman on Patents and the Value of an Idea.

I have thought a bit more about the idea of starting at Brachistochrones and Involutes and am now convinced of it, because when I looked up the definition of Evolute, it turns out to be just a generalisation of Euclid's definition of a circular arc to any curve. If you look at a copy of Euclid's Elements in Greek, such as Fitzpatrick's, you will see that in the Greek, a circle is described as a curve whose perpendiculars all meet in a single point lying at a constant distance from the curve. In other words, the circle is the degenerate involute of a singularity. This corresponds exactly to Gauss' idea of curvature. See All About e.
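
In symbols, and just to pin that remark down (the standard definitions, not a quote from Euclid): the evolute of a curve γ is the locus of its centres of curvature, where N is the unit normal pointing towards the centre of curvature; for a circle of radius r every centre of curvature is the same point, so the curvature is the constant 1/r and the evolute degenerates to that single point.

```latex
E(t) = \gamma(t) + \frac{1}{\kappa(t)}\,N(t);
\qquad
\gamma(t) = r(\cos t, \sin t),\;
N(t) = -(\cos t, \sin t)
\;\Longrightarrow\;
\kappa(t) = \frac{1}{r},\quad E(t) = (0,0)\ \text{for all } t.
```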
