Monday, April 14, 2014

'L' is for the λ-calculus

So, nearly halfway through the alphabet. How are we all doing, are we going to be able to go from 'A' to 'Z' in one month? Tune in tomorrow to see the next compelling and exciting installment of the #atozchallenge!

So, this is the big one, a way of looking at mathematics that has been seething under the surface, but that, with the λ-calculus (pronounced 'lambda calculus'), has now bubbled up to the surface.

So, we've seen that the formulae, or the functions, are the things that concern mathematics, and not so much the objects operated on, and that turned world-thought on its head. Well, the λ-calculus came to be to address a very particular concern that mathematicians worry about (mathematicians are such worriers. Except me, I'm carefree as a hummingbird, flitting among rose bushes and lemon trees, gathering nectar to keep my heart rate at three thousand beats per minute.) (no, ... wait.), and that concern is this: in the function

f x y z = x + 2y - 3z

How do we know x is x, y is y and z is z?

That is, what if I juxtapose y and z, mixing the two up, as it were? Is there something, anything, that enforces the ordering of the arguments to a function?

In set theory, you see, the set {1, 2, 3} is equivalent to the set {3, 1, 2}, or to any other permutation. Two sets are the same if they have the same members, so there's no ordering imposed.

Now, you could enforce an ordering, by adding another layer of sets:

{{1, 1}, {2, 2}, {3, 3}}

And this ... 'means' (I say 'means' quite loosely, you see) that "the first number is 1, the second number is 2, and the third number is 3."

But it doesn't at all, because, as sets are order-independent, how can we guarantee the ordering argument is the first one in the set? We can't. So we add another layering of sets:

{{{1}, 27}, {42, {3}}, {{2}, 21}}

Which says: "The first argument is 27, the second argument is 21, and the third argument is 42."

You see what I did there? I mixed up the ordering of the sets and even of their ordering markers, but since each ordering marker is itself a subset of its subset, you're still able to recover that information and order the elements as you read through them.
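That trick can be sketched in Python, whose frozensets are unordered just like mathematical sets (the `decode` helper and its names are my own illustration, nothing standard):

```python
# The set encoding above, in Python frozensets: each element pairs an
# index -- wrapped in its own singleton set, so we can tell it apart
# from the value -- with the value at that position.

triple = frozenset({
    frozenset({frozenset({1}), 27}),
    frozenset({42, frozenset({3})}),   # pair order doesn't matter in a set
    frozenset({frozenset({2}), 21}),
})

def decode(encoded):
    """Recover the ordered tuple from the unordered encoding."""
    slots = {}
    for pair in encoded:
        index = next(x for x in pair if isinstance(x, frozenset))
        value = next(x for x in pair if not isinstance(x, frozenset))
        slots[min(index)] = value   # the singleton set holds the position
    return tuple(slots[i] for i in sorted(slots))

print(decode(triple))   # -> (27, 21, 42)
```

However Python chooses to iterate the sets, the positions come back out intact.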

This gave mathematicians headaches, because mathematicians are wont to get headaches about things, because the above set is not the set {27, 21, 42} (or any of its permutations), but something quite different, and the function

f x y z

is not the function

f {{z, {3}}, {{1}, x}, {y, {2}}}

(or any of its permutations)

because the first one, f x y z, has three arguments, but f {...whatever...} has one argument, which is a set of ordered arguments.

Ugh. Try working with that!

Some people have enough problems working with f x y z, already!

So, how to solve this?

Two of the most popular ways that mathematicians solved this problem are:

1. get rid of variables altogether. And numbers, too. And all symbols, except some very primitive functions which we'll predefine for you, so instead of the above function

f x y z = x + 2y - 3z

we have something looking like this:


Okay, you can reason about that; it comes with its own Gödel number, so you can have fun with it that way, but ...

But that's hard for a lot of people to swallow or to know what you're talking about, or to have a conversation about the veracity of the statement you've constructed.

The other approach is this: partial functions.

The function f x y z can actually be represented as three functions

Some function f that takes a variable x, given the result of the partial functions over y and z, or, in 'math words':

f x = something + x

where 'something' is

g y = 2y + something else

and 'something else' is

h z = -3z.


So when we string these functions together to get our total answer (instead of three partial answers) we have

f x . g y . h z = x + 2y - 3z

Which gets us our original equation back, and guarantees that our f x is getting x, because it's not getting anything else; our g y and h z are getting their y and z, respectively, too, for the same reason. Once you reduce a function down to one of only one argument, you know the argument you're getting is the right one for that function.
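Here's that chain of one-argument functions (currying, as it's now called) as a quick Python sketch:

```python
# f x y z = x + 2y - 3z, broken into a chain of one-argument functions,
# so each function can only ever receive "its" argument.

def f(x):
    return lambda y: (lambda z: x + 2*y - 3*z)

# Each application hands over exactly one argument at a time:
print(f(1)(2)(3))   # 1 + 2*2 - 3*3 = -4
```

There's no way to hand z to the outer function: it only accepts x, and that's the whole point.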

Problem solved.




(And all the students stand up and applaud at the end of the lecture.)

That solves that problem, neatly, too.

But then it magically solved a bunch of other problems, too.

Because now that you're going through your math-space in your spaceship of imagination, just like Carl Sagan through his Cosmos, then it becomes totally natural to start composing these very simple functions together to get more complex ones, and you build what you're really after from these very simple building blocks.

And it was Church, the inventor of the λ-calculus, or maybe his English buddy, Turing, who invented the U-combinator, aka the 'universal combinator,' also called the Turing-combinator in his honor, who showed that all you needed was the lambda (λ), the variables, and 'application' (the gluing of them together), and out of that you got ... everything.

And by everything, I mean, everything: numbers, counting, addition, and all math operations after that, and then, from number (again), knowledge-representation.

λ succ . λ z -> z  (zero)
λ succ . λ z -> succ z (one) (and counting, too)

You just named the things, like 'z' and 'succ,' but you could name them anything, and they would ... 'mean' the same thing.

This naming to make numbers came to earn its own name: these are called 'Church numerals,' after their inventor.

And the Church numerals actually mean the numbers they name, because

λ succ . λ z -> succ succ succ succ z

is the Church numeral 'four,' but if you put in a chicken for 'succ' you have:

λ succ[succ/chicken] . λ z -> succ succ succ succ z or

chicken chicken chicken chicken z

And there you go: your four chickens. Throw those puppies on the grill and you've got yourself a nice tailgate party for you and your friends to go with 'z,' the keg.

(What else would 'z' stand for, I ask you.)

(And the keg would definitely be 'zeroed' out, I tell you what.)
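Church numerals translate directly into Python lambdas, chickens and all (a sketch; the helper functions passed in for 'succ' and 'z' are my own choices):

```python
# A Church numeral n takes a function succ and a starting value z,
# and applies succ to z exactly n times.

zero = lambda succ: lambda z: z
one  = lambda succ: lambda z: succ(z)
four = lambda succ: lambda z: succ(succ(succ(succ(z))))

# When succ is "+1" and z is 0, the numeral *is* its number:
print(four(lambda n: n + 1)(0))                 # 4

# Put in a chicken for succ, and a keg for z:
print(four(lambda s: 'chicken ' + s)('keg'))    # chicken chicken chicken chicken keg
```

The numeral doesn't care whether it's counting integers or chickens; it just applies whatever you hand it, four times.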

The really neat thing?

There's a one-to-one correspondence between combinatory logic and the λ-calculus.

For if you have the λ-term K such that

λ K . λ x . λ y -> x (dropping the 'y' term)

And the λ-term S such that

λ S . λ x . λ y . λ z -> xz(yz) (distributing application over the forms)

Then combining the λ-terms K and S in the following way:

S K K
gets you λ x -> x

Which is the identity function, or the I-combinator, just like combining the combinators S and K in the exact same way gets you the same result.

Surprising. Shocking. Really neat!
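You can watch S K K collapse into the identity with two Python lambdas (a sketch of the standard combinators, nothing more):

```python
# The K and S combinators as Python lambdas.
K = lambda x: lambda y: x                       # drops its second argument
S = lambda x: lambda y: lambda z: x(z)(y(z))    # distributes application

# S K K x  =  K x (K x)  =  x, so S K K behaves as the identity, I:
I = S(K)(K)

print(I(42))        # 42
print(I('keg'))     # keg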

These two things, the λ-calculus and combinatory logic, were developed around the same time: the λ-calculus to provide proper ordering of variables, and then to demonstrate that all you needed were the variables, and nothing else, not numbers, nor addition, nor anything else; just λ-abstraction and function application got you to numbers and everything else.

Well, combinatory logic argued that all you needed were the functions. It had no variables whatsoever.

Two opposite mathematical languages, but they ended up meaning the exact same thing!

Fancy that!

So, if you proved one thing in one of the languages, because it was easier to express the forms in that way, then the exact same proof held in the corresponding (and opposite) mathematical language.

The other really neat thing is that once you've captured formulae in these building blocks, you can even express logic in these formulae, so not only do you have:

λ x . λ y . λ z -> x + 2y - 3z

in the λ-calculus, but you also begin to build logic statements of truth, like: 

For every x in the above formula, x is a number.

It was later found out that these statements of truth are simple propositions that type the λ-terms.

Ooh! Typed-λ calculus, and on top of that, using the λ-calculus, itself, to type ... well, itself!


And then came this dude, Per Martin-Löf (Swedish professors call each other dude, but collegially. That's a well-known fact.), and he said, 'Yo! Homies!' and then lifted the type-system to predicate calculus, where a type depended (or was predicated on) other types or the instances of the type itself.

So, he would write about the chickens (that came after the eggs, as you recall), that:

For every succ in the below formula, given that every succ is a chicken, then for every z below, the z must be a keg.

λ succ . λ z -> succ succ succ succ z

The very type of z depends upon the instances ('inhabitants') of the type of succ, knowing one gives you the other, dependent on the foreknowledge.


So 'cool' was this, in fact, that you could actually program with just the types and not have any instances or spelled-out formulae at all, and, guess what? People have actually done that: created programs that only had type descriptors, and the program somehow magically figured out what it was supposed to do from those type descriptions.

How did it figure this out? The types themselves allowed only one way for the program to flow from its start to its end.

Magic! A program with no code actually doing something!

(Cue spooky Twilight Zone thematic music.)
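You can get a faint whiff of this even in Python's type-hint notation (a toy illustration of the idea, not the real dependently-typed machinery those languages use):

```python
from typing import Callable, TypeVar

A = TypeVar('A')
B = TypeVar('B')

# With fully generic types, there is essentially only one honest thing
# this function *can* do: apply f to x. The type dictates the program.
def apply(f: Callable[[A], B], x: A) -> B:
    return f(x)

print(apply(len, 'chicken'))   # 7
```

Knowing nothing about A and B, the body can't invent a B out of thin air; the only route from start to end is through f.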

And that brings us right back to 'meaning' ... (and my quotes around 'meaning').

(Question: when you're indicating that you're quoting a word, such as the word 'meaning' ... is that quoted word quoted (again) because you're quoting that you're quoting that word?)

(Just wondering.)

What do 'succ' and 'z' and 'x' and 'y' and the other 'z' and ... everything else ... mean?

I could save that for my entry 'M' is for Meaning, but I'm not going to write that entry, because 'M,' of course, is for 'Burrito' (a.k.a. 'Monad,' but we'll get to that later).

So I'll tell you now:


None of those symbols means anything at all in mathematics. 'succ' is an abbreviation for 'successor function,' which means 'add one to a number,' but, for example:

λ succ . λ z -> succ succ succ succ z

could mean, just as well, 'apply the function named "succ" repeatedly to some value named "z".'

Because that's what is actually happening there, in the λ-calculus. The λ-calculus doesn't 'care' what 'succ' or 'z' 'mean.' They are just functional variables to be applied to each other; the above function could just as easily have been written with the symbols 'x' and 'y':

λ x . λ y -> x x x x y

Same functional application, same effect. No meaning.
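Both spellings run identically in Python, for instance (a quick sanity check of that renaming, known as α-equivalence):

```python
# Same shape, different names: both apply their first argument
# four times to their second.
four_a = lambda succ: lambda z: succ(succ(succ(succ(z))))
four_b = lambda x: lambda y: x(x(x(x(y))))

print(four_a(lambda n: n + 1)(0) == four_b(lambda n: n + 1)(0))   # True
```

The machine, like the mathematics, sees only the structure of the application, never the names.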

Just like 'succ' doesn't 'mean' successor function, nor does it mean 'chicken.'

In mathematics, it is vitally important that the symbols mean nothing, and by nothing, I mean absolutely nothing. All that matters is the mechanics: there are symbols and how you move them around in your system, and that's it.

Why is that so vitally important?

Because if the symbols mean nothing, then they can represent anything, in which case math truly becomes the universal language: you encode your knowledge base as symbols ('O' is for ontology), you manipulate those symbols following the rules of manipulation you established a priori, and you come up with a result. Many times, it's a surprising result, so surprising that you get people telling you, to your face, 'that can't be done.'

Then you translate those symbols back to what they were used to represent, or you apply your knowledge to those symbols a posteriori to the symbolic manipulations, and you do those things that you were told were impossible for you, or anyone else, to do.

But you just did it, despite the fact that 'it can't be done.'

The non-mathematician then has the opportunity to do something they've never done in their lives, not at home, not at school, not in college, and especially not in their jobs:

They have the opportunity to learn something.

About the world, yes, but the world is as how they see it, so they have the opportunity to do something even better. They have the opportunity to learn something about themselves.

Math is hard.


Because, to 'get' math, you have to 'get' yourself, or, as is most often the case: you have to 'get' over yourself and your preconceived notions of what the world is and how the world works. Math will fail you as hard as you're failing, and, when you transcend yourself, it will succeed in places you never could before and let you see things you've never seen, ... if you are brave enough to look.

The λ-calculus, by the way, is the basis for every electronic device that exists today, and for every programming language in use today. A programming language is judged as 'Turing-complete,' and it must be that to solve anything more than trivial problems (which, unfortunately, most programming tasks are not even, anymore: they're just simply data storage and retrieval and 'ooh! look at the pretty colors on this web-page!' 'Can I get it in corn-flower blue?' 'Why, yes! Yes, you can!' 'Good, good!'). Turing based his 'machine,' which was a conceptual model of a universal computational machine, on an equivalent of Church's λ-calculus (Turing and Church were two dudes who were buds.)

Church developed this very simple calculus to address a very small concern. It's simple math.

And it changed the world into the one we now live in.

'L' is for this world we now live in. Thanks to a little bit of mathematics.

Oh, and p.s. ... HA! I finally wrote a complete post. Yay!
