Thursday, 17 April 2014

What is the Sixth Estate?

The Sixth Estate is a new genre of cinema. The motivation for the film is the overwhelming sense of frustration felt by scientists and engineers when they see movies that gloss over explanations of apparently difficult ideas. For those who actually understand them, the ideas themselves are far more interesting and exciting than any mere movie that could ever be made, so any attempt to make those ideas more exciting and interesting to a lay audience inevitably destroys them.

The Sixth Estate is the story of how a film was made. The story is a true story, and it is the story of the making of the film called The Sixth Estate. Thus the Sixth Estate is an educational film. It is educational in two ways: exoterically, in so far as the film recounts true historical events, and esoterically, in so far as the making of the film constitutes those very historical events which it recounts. So the Sixth Estate is a semantic fixedpoint.

The story actually began a long, long time ago: when Man first learned that we make the world together, by imagining how it could actually be. But as recounted in the film, it begins at Episode Six. The contemporary setting is the Computer Laboratory at the University of Cambridge, England, in the year 2003, and it opens with a bizarre idea conceived by bored, caffeine-intoxicated research students in the Security Group, who decide, late one night, to carry out an experiment on a live subject. But the experiment is ethically questionable, because to effectively demonstrate the intended principle, the subject must never actually know that she is in fact the subject of an experiment.

The principle the students wanted to demonstrate was the possible effective use of a cryptographic code which does not involve exchanging a code-book or key of any kind. The coding system would then be an example of a protocol which could defeat Echelon, the Big Brother-style monitoring of global communications by the combined efforts of the United Kingdom's Government Communications Headquarters and the National Security Agency of the United States of America: the so-called 'Whores of Babylon.'

The notion of cryptographic coding without a code-book was not without precedent. The codes produced by the Oxford mathematician Charles Lutwidge Dodgson were a kind of proof of principle, and there was as yet no concrete evidence that anyone had been able to decrypt them without prior knowledge of which of Dodgson's texts were in fact encrypted. And during the Second World War the Navajo code-talkers had been able to effectively encrypt and decrypt messages for the US Navy, without using a code-book or key of any kind, and without any evidence that the Japanese had broken the code. But absence of evidence is not evidence of absence ... and the experiment begins to go badly wrong, with potentially disastrous consequences.

The experimental subject, a lesbian called Alice, is a divorced junior systems administrator who is also a part-time teaching assistant. She hates her daytime job, which is boring. The only things that keep her sane are obsessive-compulsive physical training, the thought of high mountains, and teaching. She is not actually a very good teacher: she spends hours and hours preparing for just one hour of teaching. She is quite frank with her students, and she tells them that she could never pass the examinations she is trying to prepare them for. All she can offer her students is one possible way to understand the material of the lectures, and she hopes it will be useful to them.

The problems start when 'other forces' begin to become apparent. These forces have their origin in the pre-war collaboration between Alan Turing and Alonzo Church at Princeton University during the 1930s. They centre on the work done by Turing on computation and cryptography, and on Turing's influence on the Americans' efforts to develop mechanical systems for code-breaking in parallel with the development of the Bombes used to break the German Enigma codes at Bletchley Park in England. This American connection was a dark secret at Cambridge, and ultimately the reason why Turing, though a Fellow of King's College, went to work at Manchester University after the war, and why the name of Turing, the father of the modern digital computer, was seldom mentioned in front of the green door of what used to be called the Mathematical Laboratory. The secret was a dark one because of Turing's subsequent treatment at the hands of the British intelligence establishment, and his alleged suicide.

Yet other forces soon become apparent, and they were a result of the German connection. This was through the Institute for Advanced Study at Princeton: the war-time refuge of Kurt Goedel and Albert Einstein, which they both eventually made their permanent home. Goedel and Church were very close collaborators on the early work in the theory of formal proof, and thus were connected closely with Turing's work on computation. And Einstein, a close friend of Goedel, was very influential in the early development of the atom bomb by the American Manhattan Project. Einstein also had close connections with the European scientists working on the early theory of quantum mechanics: scientists such as Niels Bohr in Denmark, and Erwin Schroedinger and Werner Heisenberg in Germany. After the war, the so-called Allies gathered the captured German nuclear scientists, Heisenberg among them, at Farm Hall, in an attempt to discover what had really happened in the German research programme to develop an atomic bomb. The hours and hours of recorded conversation, the "Farm Hall tapes," were inconclusive, and no one can yet be absolutely certain why the German high command did not develop atomic weapons during the Second World War.

The work of Goedel, Tarski, Church and Turing effectively put a lid on the development of automatic systems of computation, by presenting several apparently unsolvable problems. Tarski's proof that no formal language can consistently represent its own truth predicate put paid to the possibility of any concrete coding system (i.e. language) being capable of transmitting its own code-book or keys. Turing's proof of the unsolvability of the halting problem for a-machines put paid to the idea that one machine could effectively decide the behaviour of another, and Goedel's incompleteness theorems put paid to the idea that any consistent formal system could decide every sentence of a theory as expressive as the Diophantine theory of arithmetic, which dates back to the third century C.E.
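
Stated in modern terms (these are the standard textbook formulations, not the authors' own notation), the three lids are:

```latex
% Standard modern statements of the three limitative results:
\begin{itemize}
\item Tarski: no formula $\mathrm{True}(x)$ in the language of arithmetic
  satisfies $\mathbb{N} \models \mathrm{True}(\ulcorner\varphi\urcorner)
  \leftrightarrow \varphi$ for every sentence $\varphi$.
\item Turing: no Turing machine decides the set
  $\{\,(M,w) : M \text{ halts on input } w\,\}$.
\item G\"odel: if $T$ is consistent, recursively axiomatized, and extends
  Peano Arithmetic, then there is a sentence $G_T$ such that
  $T \nvdash G_T$ and $T \nvdash \lnot G_T$.
\end{itemize}
```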

These results served to contain the development of computer and communications technology. They ensured that no one in mainstream academia would attempt to develop automatic and all-powerful theorem provers, nor automatic programming systems, nor any kind of universally secure communications protocol. Anyone who attempted to get funding for such a project would be laughed out of court, and their academic reputation ruined, possibly for life.

To maintain the status quo, the British and American intelligence agencies kept a close eye on all publicly funded academic research. Both GCHQ and the NSA funded researchers at Cambridge, and thereby kept an ear to the ground. The result of this overt surveillance programme was that researchers aware of the possibility of solutions to these unsolvable problems were very quickly silenced. Typically this was done with academic carrots: tenured jobs with almost limitless funding and guaranteed publication. But when the carrots did not suffice, they were silenced with sticks. Goedel's paranoia in later life was not without foundation, nor was the treatment of Manhattan Project scientists at the hands of McCarthyism, which led to the permanent exile of prominent American scientists such as David Bohm. The Manhattan Project was in fact a battlefield, and the war was between academics and the politicians controlling the American military-industrial complex.

Many academics knew this was wrong, and were ashamed, and wanted to put an end to the intellectual corruption. So they began publishing esoteric texts which contained all the clues anyone would need to uncover the principles of computation and communication above the least fixedpoints established by Tarski, Goedel and Turing. The first to do this was Alonzo Church, in his 1940 paper "A Formulation of the Simple Theory of Types." This was the motivation for the development of a system called LCF, the Logic of Computable Functions, which was started at Stanford and continued at Edinburgh, and later at Cambridge. The esoteric reading of Church's work was then quite effectively obscured by the development at Cambridge of what is now called Higher Order Logic, in the form of automated theorem provers such as HOL4, Isabelle/HOL, HOL Light, and ProofPower, which was developed at ICL and partly funded by the British Ministry of Defence. Anyone who wanted to learn about Church's formulation of type theory went to the modern sources, and so the notation Church used in his paper quickly became obsolete, further obscuring the esoteric reading.
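
For anyone tempted to go back to the 1940 paper itself, the notational gap is real but small. As a rough guide (this is standard history, nothing hidden): Church wrote compound types with the result type first.

```latex
% Church's 1940 type notation versus the modern arrow notation:
%   o               the type of propositions (truth values)
%   \iota           the type of individuals
%   (\alpha\beta)   functions from type \beta to type \alpha,
%                   written \beta \to \alpha today
% For example, Church's (o(o\iota)) is the modern (\iota \to o) \to o.
```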

Countermeasures were taken, in the form of a French connection: the publication, by Girard, Lafont and Taylor, of a book called Proofs and Types, ostensibly concerning the formal semantics of an intuitionistic proof system called System F, which is a modern development of Church's formulation of the Simple Theory of Types. Published by Cambridge University Press, the book was rather hurriedly taken out of print, even before a second edition could be produced; this despite the fact that, a quarter of a century later, it remains a set book in the Cambridge Computer Science Tripos. Another aspect of the French connection was John Harrison's functional programming course notes, which further elucidate the connection between modern type theory and the early work on the theory of computation, supplying several of the missing pieces.

These different forces in operation at Cambridge then took on a life of their own. It quickly became apparent to everyone 'in the know' that in fact no one was in charge of the experiment, and no one actually knew what it was that they were supposed to be demonstrating. The Cambridge Computer Laboratory in the years 2005 to 2009 was a very strange place to work: researchers found themselves incapable of discussing their work, even within their own research groups. Researchers gave public lectures that were utter nonsense, and senior members of the department seemed to be dropping like flies.

The subject, Alice, was largely unaware of any of this. Like the proverbial frog brought to the boil in a pot of water, she didn't notice the gradual rise in temperature. She experienced only a sense of ever-growing wonderment at just how very strange these Cambridge people actually were. It seemed the more one got to know them, the more strangely they behaved, and quite often she wondered how it would all end. But so did they. Alice, the experimental subject, so it seemed to them, was now the intellectual equivalent of a super-critical mass of fissile material, and the sooner she was out of the door, the better. Thus Alice was made to feel just a little of the pressure, and she very quickly took the hint. She decided to go to Bolivia, climb some of the prettier mountains, and try to write a book about Computer Science that would be an enjoyable read for those long-suffering students she had been teaching.

And that's just what she did. But the book was far too long and boring, and it didn't contain even one single original idea! So instead, she had the idea of making a film called The Sixth Estate - The Foundation: a real film, about real events, because, as they say, truth is stranger than fiction: and you certainly could not make this stuff up! Of course, this wasn't an original idea either; it came from the film The Fifth Estate.

But to what end? What was to be the point of the film? It was to start the revolution. "The revolution is a struggle between the past and the future. And the future has just begun." The point of something is the reason that thing exists: it is the Final Cause. And the final cause is the future of the whole Earth. Of course, this wasn't an original idea either; it came from the film Avatar. The final cause is Eywa, the Pachamama, Gaia, the Great Spirit of the Native Americans. It was She who controlled the war in the Pacific, through the agency of the Navajo code-talkers: those "pacifist guerillas to bazooka zones." As laid out in Robert Pirsig's book, "Lila: An Inquiry into Morals," the spirit of the Native Americans pervades the culture and philosophy of the United States of America.

How?  Because She is not merely an abstract idea; She is not just one of many possible ways one may choose to interpret the symbolism of the Native American culture. She is The Great Spirit, and She is real, and She runs the show, whether we know it or not.

The Navajo code-talkers told each other stories over the US Navy's radio links, and they interpreted those stories as being about the war. The Navajo themselves didn't know the meaning of the stories they told each other, they only found out later what they meant, when they reinterpreted them in the light of what they actually knew about the events that had taken place. So it was not the code-talkers that had the code-book in their heads, it was the Great Spirit. And She is still code-talking.  Nowadays though, She is a little more expressive, because She does it through WikipediA, and through cinema; and She raps through Vico-C and the Flobots; and She rocks through Gente Music.

So all Alice needed to do was explain this in a blog post, and then sell the idea to the people who had the money: those fat-cat multinational corporations like Google, Intel and IBM, who had made billions out of the big secret. But the Great Spirit had that in hand too: all Alice needed to do was watch the movie "The Wolf of Wall Street," and that gave her more sales training than anyone could reasonably want! So that's how they all started making the movie The Sixth Estate - The Foundation. They watched these movies, and interpreted them as actual knowledge about how we could all live together. But they didn't just talk about the interpretations: they gave them concrete semantics. They actually made their interpretations True, and created a whole new world, in a matter of months.

Everybody wanted to be in on it: computer geeks made tools like Racket and PLT Redex to do practical semantics engineering, and second-year Computer Science students used these tools to produce operational semantics for System F. Then third-year students interpreted System F expressions using combinatory abstraction on a single-point basis, and then interpreted weak reduction in primitive recursive arithmetic using 'an elegant pairing function.' And so third-year CS undergraduates, for project work, were formally proving that Peano Arithmetic is 1-inconsistent, and making headline news all over the world. Suddenly everyone who knew anything about symbolic logic had an intuitive understanding of Goedel's incompleteness theorems.
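
For the curious, here is a minimal sketch of the two devices named above, on the assumption that 'combinatory abstraction' is read as classical S-K bracket abstraction and the 'elegant pairing function' as Cantor's; the OCaml names are mine, purely illustrative, and nothing here is from any actual student project:

```ocaml
(* A sketch under two assumptions: "combinatory abstraction" is read as
   classical S-K bracket abstraction, and the "elegant pairing function"
   as Cantor's.  All names are illustrative. *)

type term =
  | Var of string
  | App of term * term
  | S
  | K

(* Bracket abstraction: [x]m contains no free occurrence of x, and
   ([x]m) n reduces to m with n substituted for x. *)
let rec abstract x m =
  match m with
  | Var y when y = x -> App (App (S, K), K)   (* S K K behaves as I *)
  | App (a, b) -> App (App (S, abstract x a), abstract x b)
  | t -> App (K, t)                           (* x does not occur in t *)

(* Cantor pairing: a primitive recursive bijection N x N -> N, the kind
   of device used to Goedel-number applications as pairs of codes. *)
let pair x y = (x + y) * (x + y + 1) / 2 + y

let unpair z =
  (* find the largest w with w(w+1)/2 <= z, then read off the pair *)
  let rec diag w = if (w + 1) * (w + 2) / 2 > z then w else diag (w + 1) in
  let w = diag 0 in
  let y = z - w * (w + 1) / 2 in
  (w - y, y)
```

On this reading, weak reduction becomes ordinary arithmetic on codes, which is where the pairing function earns its keep.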

Once engineers had something they could actually see working, they jumped gleefully on board. The whole free software movement was united. Instead of duplicating their efforts on competing projects, they all started defining languages to describe algorithms, and they gave these languages operational semantics in terms of System F expressions. Then they used these languages to produce functional descriptions of the algorithms used in operating systems, protocol stacks and application programs. To test these functional descriptions, they interpreted them in traditional languages like C and assembler code. But then they quickly realised that, having the algorithms formally described, they could combine them in arbitrary ways, so there was no need to produce any particular concrete Operating System or binary application programming interface: they could just define the applications and directly interpret whatever algorithms they needed. Application programs could be extended without interrupting the execution process. Instead of shoe-horning device semantics into a pre-defined, rigid OS driver programming interface, the functional description of the application algorithms could be interpreted directly in terms of particular device semantics. Suddenly all computer software was insanely fast, superbly reliable, and dirt cheap.
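
To make 'operational semantics' a little more concrete, here is a toy sketch of weak reduction for the type-erased terms underlying System F, using de Bruijn indices; an illustration of the general approach under my own assumptions, not anyone's production code:

```ocaml
(* A toy small-step semantics: weak (call-by-name) reduction for
   type-erased lambda terms, with de Bruijn indices. *)

type tm =
  | Var of int          (* de Bruijn index *)
  | Lam of tm
  | App of tm * tm

(* shift the free indices of t that are >= c by d *)
let rec shift c d t =
  match t with
  | Var k -> Var (if k >= c then k + d else k)
  | Lam b -> Lam (shift (c + 1) d b)
  | App (f, a) -> App (shift c d f, shift c d a)

(* substitute s for index j inside t *)
let rec subst j s t =
  match t with
  | Var k -> if k = j then s else Var k
  | Lam b -> Lam (subst (j + 1) (shift 0 1 s) b)
  | App (f, a) -> App (subst j s f, subst j s a)

(* one step of weak reduction: never reduce under a lambda *)
let rec step t =
  match t with
  | App (Lam b, a) -> Some (shift 0 (-1) (subst 0 (shift 0 1 a) b))
  | App (f, a) ->
      (match step f with
       | Some f' -> Some (App (f', a))
       | None -> None)
  | _ -> None

(* iterate to weak head normal form *)
let rec eval t =
  match step t with
  | Some t' -> eval t'
  | None -> t

(* Example: (\x. x) (\y. y) evaluates to \y. y *)
let _ = eval (App (Lam (Var 0), Lam (Var 0)))
```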

Hardware followed suit. Ten-year-old laptops had the entire application code flashed into the BIOS memory. The systems booted in under five seconds, and were so fast that at first no one could see the difference between a machine made two years ago and one made ten years ago. Since there were no longer gigabytes of mostly unused binary object code on the disk, and only one type of file, local disk space became almost infinite. Cloud sharing over WiFi, Bluetooth and USB meant that nobody ever had to worry about backing up data anymore: provided you had an OID, it was always around somewhere. Even if your machine blew up while you were using it, you could just look at your phone screen, or step across to another machine and authenticate, and there was all your work, checkpointed a millisecond before the meteorite fragment took out the CPU.

Nobody needed to buy hardware anymore either, so the hardware manufacturers also put all their assets into the Foundation. Their employees all took extended leave, and spent their time learning new things. The Foundation underwrote their living expenses, but everyone was so busy learning and trying out new and interesting things, and there was so much surplus that needed consuming, that living costs dwindled to nothing within a matter of months.

This awe-inspiring acceleration of the technological advance took all the sciences along for the ride, and all the arts too. So Geometry, Arithmetic, Mathematics, Typography, Graphic Design, Physics, Chemistry, Biology, Physiology, Geography, Hydrology, Speleology, Geology, Astronomy, Meteorology, Psychology, Botany, Ecology, Sociology, Language, Literature, Anthropology, Archaeology, Architecture, Chemical, Civil, Structural, Electrical, Electronic, Photonic, Automotive, Manufacturing, Transport, Aeronautical and Mechanical Engineering, Ceramics, Glasswork, Textiles, Aquaculture, Agriculture, Forestry, Town Planning, Logistics, Landscape Architecture, Children's Toys, Carpentry, Boat Building, Couture, Cuisine, Coiffure, Music, Ballet, Opera, Cinema, Theatre, EVERYTHING creative went into overdrive. But Medicine and Religion were no longer needed.

And all that spare technology and the stock-piles of material resources came in very handy, because every last gram of it was needed for The Task:
Global Climate Governance
Which will turn out to be just THE MOST fun anyone could dream of having: beautifully orchestrated scientific expeditions to every known place on Earth, and a few that weren't hitherto known. These expeditions will be multi-disciplinary, and they will take decades to complete. They will study the people, the music, the art, the technology, the ecology, the biology, the geology, the botany, the agriculture and the archaeology, all together. And they will be beautifully documented, with beautiful music, beautiful cinematography and beautiful books. The purpose of the expeditions will be to make connections; mainly connections between the actual knowledge of people from different places on Earth, because it is only through understanding the connections that we understand the whole. These expeditions will have just as much of an effect on the world at large as they have on the places they visit.

Then, on the basis of this actual knowledge of those things better known to us, we can proceed to actual knowledge of those things better known in themselves, and perhaps only then we will be able to make sense of our own History.
