About ultra-sort - and God
So, as I was saying, I recently wrote the LARGEST SINGLE PIECE OF CODE I have ever written in my life. I estimate that it is larger than all the software I have written in over 30 years of serious industrial experience, including all the comparatively minor things I did during my academic life. Just to give you a way of comparing things, it takes up the same amount of space as the ENTIRE collection of Chesterton's works - roughly fifty megabytes.
But I neglected to mention that this thing is just one single subroutine - or, if you prefer, "procedure" or "function" or "subprogram". It is merely an extremely efficient sorting routine, built to sort 9 items using the minimal number of comparisons, which of course is 19. Note: there are NO OTHER comparisons - no loop indices or other tricky bits of bookkeeping... but if you don't know how this is done, or can't quite figure out what is going on, I would advise you to take a course in algorithm design, or perhaps read the third volume of Donald Knuth's great The Art of Computer Programming.
The other thing I neglected to mention is that I did not write the actual code myself. I am not averse to spending a lot of time in order to make a computer do work for me, so I wrote a program which produces the specific SOURCE CODE to accomplish this little task. It was fun, and mildly interesting, and it got more interesting as I continued deeper into the study. It is worth telling you about, not only because of the insights into software and algorithms, but because of the insights into some much deeper matters.
A few weeks back, another topic induced me to ask myself: what really is the "best" sorting algorithm? Granted, the correct answer depends on the use to which one is going to put the algorithm - but you can find books that treat that topic. For now, I will merely state that I was led to wonder what the minimum number of COMPARES would be for a given number of items-to-be-sorted. It seemed easy enough: four items would require five compares; five items would require seven compares...
An aside: to get the number of compares needed to sort n items, find the smallest integer k such that 2^k is at least n! - that is, k is the ceiling of lg(n!). (Strictly speaking this is the information-theoretic lower bound; it happens to be exactly achievable for the small n I care about here, including n=9.) Behold:
n n! k 2^k
2 2 1 2
3 6 3 8
4 24 5 32
5 120 7 128
6 720 10 1024
7 5040 13 8192
8 40320 16 65536
9 362880 19 524288
10 3628800 22 4194304
11 39916800 26 67108864
12 479001600 29 536870912
13 6227020800 33 8589934592
14 87178291200 37 137438953472
15 1307674368000 41 2199023255552
16 20922789888000 45 35184372088832
17 355687428096000 49 562949953421312
18 6402373705728000 53 9007199254740992
19 121645100408832000 57 144115188075855872
20 2432902008176640000 62 4611686018427387904
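(If you care to check that little table for yourself, here is a quick sketch that reproduces it - plain C++ with 64-bit integers, which is exact up through n=20. It is merely an illustration jotted for this post, not any part of the ultra-sort program itself.)

#include <cstdio>

int main()
{
    unsigned long long fact = 1;                 // n!
    for(int n = 2; n <= 20; ++n)
    {
        fact *= n;
        unsigned long long pow2 = 1;             // 2^k
        int k = 0;
        while(pow2 < fact) { pow2 <<= 1; ++k; }  // smallest k with 2^k >= n!
        std::printf("%2d %22llu %2d %22llu\n", n, fact, k, pow2);
    }
    return 0;
}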
I worked out the code for three and four by hand. Here, for your consideration and amusement, is the sort for n=3:
void Sort(int * q,int * r)
{
    if(q[0] < q[1]) // depth 0 makes 6 tests yield 3 else 3
    {
        if(q[0] < q[2]) // depth 1 makes 3 tests yield 2 else 1
        {
            if(q[1] < q[2]) // depth 2 makes 2 tests yield 1 else 1
            {
                // 3 ifs to get here
                r[0]=q[0];
                r[1]=q[1];
                r[2]=q[2];
            }
            else
            {
                // 3 ifs to get here
                r[0]=q[0];
                r[1]=q[2];
                r[2]=q[1];
            }
        }
        else
        {
            // 2 ifs to get here
            r[0]=q[2];
            r[1]=q[0];
            r[2]=q[1];
        }
    }
    else
    {
        if(q[0] < q[2]) // depth 1 makes 3 tests yield 1 else 2
        {
            // 2 ifs to get here
            r[0]=q[1];
            r[1]=q[0];
            r[2]=q[2];
        }
        else
        {
            if(q[1] < q[2]) // depth 2 makes 2 tests yield 1 else 1
            {
                // 3 ifs to get here
                r[0]=q[1];
                r[1]=q[2];
                r[2]=q[0];
            }
            else
            {
                // 3 ifs to get here
                r[0]=q[2];
                r[1]=q[1];
                r[2]=q[0];
            }
        }
    }
}
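(And, should you wish to see it run, a trivial harness along these lines will do - assuming the Sort routine above is pasted into the same file. The harness is just a sketch for illustration, not part of the generated code.)

#include <cstdio>

// ... the Sort routine shown above goes here ...

int main()
{
    int q[3] = { 42, 7, 19 };   // any three values
    int r[3];
    Sort(q, r);                 // r should come out as 7 19 42
    std::printf("%d %d %d\n", r[0], r[1], r[2]);
    return 0;
}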
Then I began to do five, but decided it made more sense to let the machine do such busy work for me. And so, after some pleasant entertainment in software development, I had a program which would produce the SOURCE CODE which (given a particular small integer n) would accomplish the specific SORT of n items with the minimum number of compares. Incidentally, I ought to state that even for EIGHT items the compiler refused to handle the program - it was just too large for it to process. I was delighted; I have faced many goofy and difficult and tedious limitations of computers - some of which were not computational, but human - and yet I had never before encountered THAT limitation: that the program was too large to process. (Of course I easily transcended that limit - it's what engineers do - and went on to other matters... but that is not of interest here.)
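I won't reproduce my own generator here, but to give the flavour of the thing, here is a rough sketch of one way such a program might be written: keep the set of permutations still consistent with the compares made so far, pick a compare that splits that set as evenly as possible, and recurse, emitting nested IFs as you go. (This is only an illustration, not my actual program; the greedy choice of the most even split reproduces the structure of the three-compare routine above for n=3, but I make no promise that it attains the true minimum for every n.)

#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

using Perm = std::vector<int>;  // perm[rank] = which input index holds that rank

static void indent(int d)
{
    for(int i = 0; i < d; ++i) std::printf("    ");
}

// Emit nested IFs that narrow 'perms' down to a single permutation,
// then emit the assignments r[rank]=q[...] for that permutation.
static void emit(const std::vector<Perm> & perms, int n, int depth)
{
    if(perms.size() == 1)
    {
        for(int rank = 0; rank < n; ++rank)
        {
            indent(depth);
            std::printf("r[%d]=q[%d];\n", rank, perms[0][rank]);
        }
        return;
    }
    // Choose the compare q[a] < q[b] whose yes/no split of 'perms' is most even.
    int bestA = 0, bestB = 1;
    std::size_t bestGap = perms.size() + 1;
    for(int a = 0; a < n; ++a)
        for(int b = 0; b < n; ++b)
        {
            if(a == b) continue;
            std::size_t yes = 0;
            for(const Perm & p : perms)
                if(std::find(p.begin(), p.end(), a) < std::find(p.begin(), p.end(), b))
                    ++yes;      // q[a] < q[b] iff a appears earlier (ranks lower) in p
            std::size_t no = perms.size() - yes;
            std::size_t gap = (yes > no) ? yes - no : no - yes;
            if(yes != 0 && no != 0 && gap < bestGap)
            {
                bestGap = gap; bestA = a; bestB = b;
            }
        }
    std::vector<Perm> yesSet, noSet;
    for(const Perm & p : perms)
        (std::find(p.begin(), p.end(), bestA) < std::find(p.begin(), p.end(), bestB)
            ? yesSet : noSet).push_back(p);
    indent(depth); std::printf("if(q[%d] < q[%d])\n", bestA, bestB);
    indent(depth); std::printf("{\n");
    emit(yesSet, n, depth + 1);
    indent(depth); std::printf("}\n");
    indent(depth); std::printf("else\n");
    indent(depth); std::printf("{\n");
    emit(noSet, n, depth + 1);
    indent(depth); std::printf("}\n");
}

int main()
{
    const int n = 3;    // try 3 or 4; remember that the output grows roughly as n!
    Perm p(n);
    std::iota(p.begin(), p.end(), 0);
    std::vector<Perm> perms;
    do perms.push_back(p); while(std::next_permutation(p.begin(), p.end()));
    std::printf("void Sort(int * q,int * r)\n{\n");
    emit(perms, n, 1);
    std::printf("}\n");
    return 0;
}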
This ultra-sort thing is a fascinating topic, since the COMPLEXITY of the result is (according to Knuth) still n*lg(n), as the best sorts are. (There is more to say on this matter, but I won't say it here; there is something more important to get at.) That is nice, but one has to pay the price: the SIZE OF THE CODE grows as n!, which is a bit tedious. Hence the nearly astronomical chunk of code which I mentioned at the beginning. Although sorting any given arrangement of nine items requires no more than 19 compares, there is a huge number of actual IF statements needed to accomplish this sort! Yes: almost all of these IFs contain nested IFs in both the THEN-clause and the ELSE-clause... it is huge. Simple, boring to do by hand; impressive to contemplate - quite efficient, but gargantuan in its immensity of trivial code.
Yet there was something being hinted at here, and - now, as I think back - I probably spent MORE time thinking about its MEANING than I spent on the construction and investigation of the code and the algorithm. For this strange trade-off, converting (as it were) TIME into CODE - accepting a larger number of INSTRUCTIONS in order to acquire a greater SPEED - began to suggest something about algorithms, and about getting solutions.
Quite some time ago, one of my co-workers gave me a Rubik's Cube. We talked about it a lot - and at one point someone commented about "God's Algorithm", which provides the exact shortest sequence of twists required to return ANY given disarrangement back to the homogeneous state. Of course this is nothing remarkable: God already knows the solution to all possible math problems, as well as all possible physics problems - not to mention all possible HUMAN problems. If you would rather read someone else's comments about this, hunt up Charles Babbage's Ninth Bridgewater Treatise, where he argues that his own work in programming his calculating engines had revealed to him a little about the amazing foresight of the Creator, and the fathomless Wisdom which could arrange the ALL as He should ordain... just as a software developer does! (Note: this is explored in some detail in Jaki's Brain, Mind, and Computers.)
Indeed, it would appear that if one were permitted an unbounded amount of "memory" in which to store code, one could gain an arbitrarily fast algorithm - at least up to the innate nature of the question at hand. (I defer the philosophy for the present; this is just a sort of slovenly meditation with some engineering seasoning, and not a term paper, hee hee!) But such a view surely hints at the omnipotence of God - Who, not being physical and hence not bounded, has no limitations related to "code size". He knows every answer without needing to "process through" as if He were performing an algorithm to acquire it.
It is quite fascinating, and I have barely begun to ponder the matter, but I felt it was worth jotting something down and letting you know about it.
6 Comments:
So can your computer now sort your socks? :-) http://stackoverflow.com/questions/14415881/how-to-pair-socks-from-a-pile-efficiently
My own reaction was (because, for stupid reasons, I'd been reading it last week) to recall "Don't Eliminate Cut" (George Boolos, 1984, J. Phil. Logic 13).
I suppose if you don't need to remember/update a dynamic tree, the static code under an initial number of compares can behave exactly as if it were sorting into a balanced binary tree already, but I don't know how to check whether this was a minimax approach.
But never mind! I've often wondered if, in Heaven, if I were moved to ask, could God show me a model of ZFC.
Ok, so I'm coming into the question late, and this is probably a dumb question, but why write out the nineteen compares explicitly? Couldn't some sort of loop go through the list automatically, regardless of its size? Or, if it's too complex for that, a recursive function? Is it simply that the goal here is to create code that is manually optimized to involve not merely the minimum number of compares but the minimum number of actions overall?
Sorry, pet peeve, involving my own thoughts on programming and omniscience. ;^) Maybe I'll write them someday...
Sorry, I have been away from this fertile topic!
Chaka, I use a FIFO for socks. Hee hee. Also if all items are already known to be identical, sorting reduces to constant time.
Belfry: I don't have access to that journal. I rather think the effect of such static code is (and probably can be shown to be) equivalent to a balanced tree. Proof left for homework. Hee hee. I am busy with other fish, as you may have expected from this delay. And I do not recall ZFC, sorry; kindly refresh my memory.
Scott: Fascinating question! A LOOP contains a conditional, and any recursion (which is just a variant way of expressing the concept of a LOOP, or vice-versa) must contain a conditional too. My idea (which was simply curiosity) was to arrange for the MINIMUM required conditionals.
The larger question, the minimum "work" is perhaps a bit contrived, but then most theoretical work is contrived, as we must do things differently in the Real World, where the customer needs the thing done "Now" and not "When the Assignment is Due" or "When your Committee Decides You've Suffered Enough To Grant the Degree". (Hee hee)
It's a bit futile as far as complexity, but I am not trying for any points. It was a simple scientific exploration. (Some of this is discussed in Knuth's third volume, incidentally.)
I look forward to hearing more about your thoughts on programming and omniscience. For a curious take on this matter, I recommend exploring Babbage's Ninth Bridgewater Treatise!
ANYWAY, sorry I was away - I will see if I can add some more to this matter at some point - too many other things just now to SORT. Hee hee.
Oh, ZFC = Zermelo-Frænkel + axiom of Choice set theory.
That's too bad about the journal... he shows, roughly, that eliminating "cut" steps --- or transforming a natural deduction into a Gentzen tree --- can be arbitrarily explosive; natural deductions are allowed to rely on lemmas as often as you like, but a Gentzen tree has a separate copy of the proof for each use of each lemma. So trees have a simpler proof theory, but natural proofs are easier to write down. But we knew that already, perhaps!
Thanks Belfry Bat. I've been away from that part of the e-cosmos, alas - fascinating.
Gosh, now I hesitate to proceed on my other puzzle, wondering whether I will induce an explosion... oh, but then as GKC says,
Some sneer; some snigger; some simper;
In the youth where we laughed, and sang.
And they may end with a whimper
But we will end with a bang.
[GKC CW10:485]