'F' is for function.
Okay, you just knew we were going there today, didn't you?
Today, and ... well, every day in the last 2,600 years (I'm not joking) has had this dynamic tension between objects and functions. Which is greater?
Read this tell-all post to find out!
Sorry, I channeled Carl Sagan in Cosmos for a second.
Not really.
But I digress.
WAY back when, back in the olden times, there were just things, and then when you had a full belly and your one thing, your neighbor, who happened to be named 'Jones,' had two things.
And then keeping up with the Joneses came to be a real problem, because you had one thing, and they had two. So you got one more.
And counting was invented.
For things.
In fact, in some cultures, counting doesn't exist as an abstract thing. You don't count in general; you have different counting systems for different things. Counting pens, for example, is a very different thing from counting eggs.
Ha-ha, so quaint! Who would ever do that?
Well, you do. What time is it? You have a different way for counting time than you do for counting distance or for counting groceries that you need to buy, don't you?
But, at base, they are all just things: time, distance, and gallons of milk.
Sorry: litres of milk. How very British Imperial of me!
Question: why are Americans just about the only ones using the British Imperial system any more? That is, the real question is: why are the British not using the British Imperial system? They onto something we're not?
Anybody want a satellite coded wrongly because some engineers intermingled British Imperial units with metric ones?
Okay, so, units, things, metrics, whatevs.
And counting.
Counting is doing something to something. Whereas, at first, it's important you have that chicken in your stomach (cooked is nice, but optional), suddenly it's become important that you have not just one chicken, a hen ... and a (very rarely) occasional rooster with all your hens would be nice, too.
(Roosters are big jerks, by the way.)
Counting was all we had, after the chickens and their yummy eggs, that is, and that for a long while.
Question: which came first, the chicken or the egg?
Answer: that's obvious, the egg.
Think about it. You have eggs for breakfast, then you have a chicken-salad sandwich.
Like I said: obvious! First the egg, then the chicken.
Unless you're one of those weirdos that eat chicken sausage with your breakfast, then it's a toss-up, but in that case I have no help for you.
Okay, chickens, eggs (egg came first) (You read that first, right here on this blog) (and you thought there was no point to math!), and then counting, for a long, long time.
Then, ... counting became too much, because you ran out of fingers, then toes, to count your mutton on. It happens. That's a good thing, because then people started to have to think (that's a good thing, too), and they thought, 'How do I count all my mutton when I run out of fingers and toes? I need to know that I have more sheep than the Joneses, and I've paired each of my sheep to theirs, and I have more than twenty-two more than they do. How many more do I have?'
Then people started getting smart. They did more than one-to-one, or pairwise, groupings, they started grouping numbers, themselves.
All this is very nice background, geophf, but to what end?
First counting, then grouping: those are operations on, or using, numbers.
Then addition and subtraction, which are abstractions, or generalizations on counting, then multiplication, and let's not even go into division.
But.
Counting, adding, multiplying. These things worked for your chickens (and eggs) (first), but also on your mutton, and when you settled down, it worked on how many feet, yards, hectares of land you had, too. The same principles applied: numbers worked on things, no matter what the things were, and the operations on numbers worked ... well, no matter what the numbers counted.
Now. Today.
Or, sometime back in the 40s anyway, way back before the Birkenstock-wearers were mellowing out with lattes, anyway. A couple of dudes (Samuel Eilenberg and Saunders Mac Lane, specifically) said, 'Hey, wait!' they exclaimed, 'it doesn't matter what the thing is, at all!' they said, 'as long as the things are all the same thing, we don't care at all about the thing itself!'
And they invented a little something that eventually became known as Category Theory, which is the basis of ... well, a lot of things now: quantum thermodynamics, for one, and computer science, for another.
And they nailed it. They created a one-to-one correspondence between the things in categories and the things in Set theory.
Why is that important? Set theory is this:
{}
Got me?
A set is a thing that contains a thing. The fundamental set is the set that contains nothing (the 'empty set'), which is this:
{}
And other sets are sets that contain sets:
{{}, {{}, {{}, {}}}}
With sets you can model numbers if you'd like:
0: {}
1: {{}} (the set that contains '0')
2: {{}, {{}}} (the set that contains '0' and '1')
3: {{}, {{}}, {{}, {{}}}} (the set that contains '0', '1', and '2')
etc...
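If you want to play with this encoding, here's a minimal sketch in Haskell (my names, and my list-based stand-in for sets, are not canon from anywhere):

```haskell
-- A 'set' here is just a collection of other sets.
data S = S [S] deriving (Eq, Show)

zero :: S
zero = S []               -- {} , the empty set

-- n + 1 is n with itself thrown in: {0, 1, ..., n}
next :: S -> S
next s@(S elems) = S (s : elems)

-- A numeral built this way contains exactly n elements,
-- so counting its elements recovers the number it models.
toInt :: S -> Int
toInt (S elems) = length elems
```

So `toInt (next (next (next zero)))` counts back out to 3.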
And as Gödel (eventually) showed, once you model numbers, you can model anything.
(We'll get to that ... for 'G' is for Gödel-numbering, I'm thinking)
How? Well, once you have numbers, you can count with them (as just demonstrated), you can add one to them (the 'successor' function), you can take away one from them (the 'predecessor' function), you can add two of them together (successively add one to the first number until the second number is zero. Try it!), subtract, by doing the opposite (or by just removing duplicates from both until one of them is gone ... 'pairwise subtraction'!), multiply two together (successive addition), divide numbers ... (um, yeah) ... and, well, do anything with them.
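Here's that whole paragraph as a sketch in Haskell, building numbers from nothing but 'zero' and 'successor' (Peano-style; the names are mine):

```haskell
-- A number is either zero or the successor of a number.
data Nat = Z | Succ Nat deriving (Eq, Show)

-- take away one (nothing below zero here)
pre :: Nat -> Nat
pre Z        = Z
pre (Succ n) = n

-- add two together: successively move one over from
-- the second number until it is zero
add :: Nat -> Nat -> Nat
add a Z        = a
add a (Succ b) = add (Succ a) b

-- multiply: successive addition
mul :: Nat -> Nat -> Nat
mul _ Z        = Z
mul a (Succ b) = add a (mul a b)

-- count back out to an ordinary Int
toInt :: Nat -> Int
toInt Z        = 0
toInt (Succ n) = 1 + toInt n
```

Try it: `toInt (mul (Succ (Succ Z)) (Succ (Succ (Succ Z))))` is 2 × 3, which is 6.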
Once you have Number, you have counting, and from that, you have everything else.
So, our dudes mapped category theory things (morphisms) down to Set mapping functions, and they had it.
Because why?
Well, it's rather cumbersome to carry a whole bunch of sets for your Pert formulation, especially when you want to see the money you (don't) save by paying off your mortgage's principal early.
Banker: oh, geophf, pay off your mortgage principal early, you'll save a ton in the long run.
me: uh, that was true when mortgage interest was 10-15-21%...
(you remember those Jimmy Carter days?)
me, continuing: ... but now that my interest rate is 2.5% I save zip after 30 years for early payment.
Try it. Run the numbers. Bankers haven't. They're just repeating what was good advice, ... twenty-five years ago.
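Here's one way to actually run those numbers, using the standard amortization formula (the loan figures below are made up for illustration):

```haskell
-- Standard amortization: the fixed monthly payment for
-- principal p, annual interest rate r, over n monthly payments.
monthlyPayment :: Double -> Double -> Double -> Double
monthlyPayment p r n = p * m / (1 - (1 + m) ** (-n))
  where m = r / 12    -- monthly rate

-- Total interest paid over the life of the loan.
totalInterest :: Double -> Double -> Double -> Double
totalInterest p r n = monthlyPayment p r n * n - p
```

On a hypothetical $300,000, 30-year loan, `totalInterest 300000 0.025 360` comes out to roughly $127,000, while the same loan at 13% costs roughly $895,000 in interest. At the old rates, shaving principal early was a big deal; at 2.5%, much less so.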
But when you have objects, whose representations do not matter, be they 0, 1, a googol, or {{}, {{}}, {{}, {{}}}}, then you can concentrate on what's really important.
The functions.
Whoopsie! I slipped and gave the answer, didn't I?
Me, and Alan Kay, both.
Alan Kay, inventor of Smalltalk, and inspirer of the movie Tron: The objects aren't important, it's the messages that go between them that is.
So there.
Well, okay, functions win. It used to be the chicken (or the egg), but now we've ... grown a bit, worrying less about what fills our stomach, and where we're going to get our next meal (and not be the next meal), you know: survival.
We've transcended our stomachs.
Maybe.
Well, okay, functions rule, so back off.
The dudes didn't stop there, because there's been some really neat things going on with functions, since discovering functions are things in and of themselves, because if functions are things, in and of themselves, then, well, ...
Well, you can number them (tomorrow) (promise), you can count them, you can add them, you can multiply them, you can exponentiate them, you can ...
You can, in fact, operate on them, applying functions to functions.
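And if that sounds too abstract, note that you can build the numbers themselves out of nothing but functions: the Church encoding, to give the standard trick its name (again, a sketch, not from this post):

```haskell
-- A Church numeral n is the function that applies f, n times.
zero f x = x               -- apply f no times at all
suc  n f x = f (n f x)     -- one more application
add  m n f x = m f (n f x) -- m applications after n of them
mul  m n f = m (n f)       -- n applications, m times over

-- Count back out by applying (+ 1) to 0.
toInt n = n (+ 1) (0 :: Int)
```

So `toInt (mul (suc (suc zero)) (suc (suc (suc zero))))` is 2 × 3 = 6, and the 'numbers' involved were functions the whole way down.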
So, our dudes did just that. They said their functions, their morphisms, have two rules to obey to be functions.
You can ask what it is, it's identity:
id f = f
And you can string them together, composing them:
g . f = ... well, g . f
So, if you have
> f x = succ x
and
> g x = x * 2
then (g . f) gives you some function; we can call it h, here, if you'd like, and h is the composition of applying f to your number first, then g, or

> h x = 2 * (x + 1)
Try it!
Pull up your functional programming language listener (I use Haskell for now) (until I find or invent something better)
and enter the above.
Now, it's an interesting proposition to show that
(g . f) == h
But, for now, in Haskell, we cannot do that.
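What we can do in Haskell is spot-check the two functions at a bunch of sample points: evidence, not proof (a property-testing library like QuickCheck automates this kind of check):

```haskell
f :: Int -> Int
f x = succ x

g :: Int -> Int
g x = x * 2

h :: Int -> Int
h x = 2 * (x + 1)

-- (g . f) and h agree on every sample we throw at them ...
-- which still proves nothing about the infinitely many Ints
-- we didn't try.
agree :: Bool
agree = all (\x -> (g . f) x == h x) [-100 .. 100]
```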
Now, in some other programming languages (I'm thinking of my experience with Twelf), you can, but first you have to provide numbers, on your own, like: 0, 1, 2, ...
... and prove they are what they are, you know: they follow in order, and stuff like that ...
Then you provide the functions for identity and composition, and then you can prove the above statement, because now you have theorems and a theorem prover.
Yay!
No, ... no 'yay'!
I mean, 'yay,' if that's your thing, proving theorems (I happen to have fun doing that at times), but 'ick,' otherwise, because it's a lot of work to get to things like just plain old addition.
See, theorem-prover-folk are these hard-core fighters, right on the edge of discovering new mathematics and science. Not plain-old-little me, I just use the stuff (to build new theories from preexisting, proved, ones), so I'm a lover of the stuff.
To quote an eminent mathematical philosopher: I'm a lover, not a fighter.
And, no, Billie-Jean is not my lover. She's just a girl that claims that I am the one.
Anyway!
So, but anyway, using the above two functions, identity and composition, you're now able to reason about functions generally, making new functions by stringing together (composing) other, preexisting functions.
This is so important that Eilenberg and Mac Lane gave this process its own name: 'natural transformation.'
But okay, Categories have units and you don't particularly care what those units are; we saw one set of units, numbers, themselves, as an example,
But units can be anything.
Even functions, so you have a category of functions (several categories, in fact, as there are different kinds of functions, just so you know) ... You can even have categories of categories of things (things being some arbitrary unitary types, like numbers, or functions, or categories, again ... it can get as wild and woolly as you want it to get).
And types. Categories can be used to model types, so you can have a category that represents the natural numbers ... as types:
Nat : Category
zero : 1 -> Nat
succ : Nat -> Nat
so zero = zero ()
one = succ (zero ())
two = succ one
three = succ two
etc ...
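In Haskell you can sketch that 'numbers as types' idea directly, with a couple of language extensions (my names; one standard way among several):

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- The same old Nat, but DataKinds promotes it, so
-- 'Z, 'S 'Z, 'S ('S 'Z), ... are now *types*.
data Nat = Z | S Nat

-- A runtime witness tied to each type-level number.
data SNat (n :: Nat) where
  SZ :: SNat 'Z
  SS :: SNat n -> SNat ('S n)

-- Count the witness back out to an ordinary Int.
toInt :: SNat n -> Int
toInt SZ     = 0
toInt (SS n) = 1 + toInt n
```

Here `SS (SS SZ)` doesn't just equal two; its type, `SNat ('S ('S 'Z))`, says two.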
And those formulae, those functions, work, regardless of the underlying type of things like 'number' or even 'function.'
Category theory was originally called 'Abstract nonsense,' by the way, because Eilenberg and Mac Lane saw it as so abstract that they were actually talking about nothing.
Sort of like how Set Theory does, starting from nothing, the empty set, and building everything from that object, but even more abstractly than that, because category theory doesn't even have the 'nothing'-object to start from, it just starts identifying and composing functions against objects about which it doesn't care what they are.
Now, there was, at one point, a working definition of abstraction, and that was: the more useful a theorem is, the less abstract it is.
The contrapositive implied: the more abstract a formula is, the less useful.
But I find I play in these domains very happily, and in playing, I'm able to invent stuff, useful stuff that's able to do things, in software, that other software engineers struggle with and even give up, stating: "This can't be done."
I love it when I'm told I "can't" do something.
You know what I hear when I hear the word "can't"?
I hear opportunity.
Everyone else may have given up on this particular task, because it 'can't' be done. But me? I go right in with my little abstract nonsense, reinventing the language that was unable to frame the problem before with a language that can, and I find, hey, I not only can do it, but I just did.
Language frames our thoughts, telling us what we can and cannot do. English is a very complex language, taking in a lot of the world and describing it just so.
The language of mathematics is very, very simple, and describes a very tiny world, a world so tiny, in fact, it doesn't exist in reality. It's a quantum-sized world, mathematics, and it has the possibility to be even smaller than that.
But so what? I get to invent a world that doesn't exist, each time I apply a theorem from mathematics that others are unfamiliar with.
And when I do so, I get to create a world where the impossible is possible.
It's very liberating, being able to do things others 'can't.' And when I pull it off, not only do I win, but my customers win, and maybe even those developers who 'can't' win, too, if they're willing to open their eyes to see possibilities they didn't see before.
'F' is for functions, a totally different way of seeing the world, not as a world of objects, but as a world of possibilities to explore.