The Future Steam Engine — or — Whither the Microprocessor?

Thu 12/28/2006. (Also be sure to see my imaginary letter to Dr. Dobbs on the topic!)

“Imagine if processors stop getting faster” — David Patterson, co-author, with John Hennessy, of the standard text Computer Architecture: A Quantitative Approach; in an interview, ACM Queue, 12/06.

Imagine, indeed! Where would we be then? ... Obviously, in a dark dreary unremunerative kind of place! ... The estimable Patterson and Hennessy advocate, of course, Giant Government Programs to develop parallel-core software techniques — comparable to the glorious endeavors which, as everyone knows, were responsible for the parallel instruction execution of today’s processors....

... Lo! I sense objections! In the vast audience, there are catcalls, derision, shouts! ... Steam partisans rise up, crying “steam engine progress has not stopped!” ... And they are quite right, I’m sure, even ’though I know absolutely nothing about it. ... But note that Big Steam was replaced in many cases by electricity — the great works of the industrial revolution abandoned their steam engine hulks for Mr. Kilowatt, so convenient and readily available — which was often supplied by vast big improved steam engines at some comfortably remote spot. [1]

... So Patterson, Hennessy, and, apparently, the ACM, are deeply concerned that the new multiple-core strategy adopted by Intel, AMD, and everyone else, won’t work — basically because there’s no evidence that anybody knows how to write software to take advantage of these things. ... Hence, of course, the need for Giant Government Programs....

Parallel Instructions ...

To obfuscate a bit, parallel instruction execution was the glory of the 90s and the Pentiums: instead of boringly executing each microprocessor instruction, finishing it, and doing the next, the gadget has a pipeline where numerous sequential instructions can get executed more-or-less simultaneously. ... Starting with simple pre-fetch enhancements, where the pipeline would notice that the next instruction needed some data, and get hold of the stuff before the current instruction was finished, the scheme blossomed into vast and amazing intricacies numberless to man. ... And the really great thing about all this was that the programmer knew nothing: we’d just write our wretched code, and the clever processor’d do all the heavy lifting, figuring out how to execute it so much faster!...
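
To make “more-or-less simultaneously” a little less hand-wavy: the hardware can overlap instructions that don’t depend on one another, and can’t overlap ones that do. Here’s a minimal sketch, in Go purely for concreteness (nothing here comes from any Intel manual): both functions compute the same sum, but the second keeps four independent running totals, which hands the parallel instruction machinery something it can actually overlap.

```go
package main

import "fmt"

// sumSerial keeps one running total, so every addition depends on the
// previous one: a single long chain the pipeline cannot overlap.
func sumSerial(xs []float64) float64 {
    total := 0.0
    for _, x := range xs {
        total += x
    }
    return total
}

// sumUnrolled keeps four independent totals; the four additions in each
// pass don't depend on each other, so the processor is free to execute
// them more-or-less simultaneously. The code is still written as plain
// sequential statements; the overlap happens inside the chip.
func sumUnrolled(xs []float64) float64 {
    var t0, t1, t2, t3 float64
    i := 0
    for ; i+4 <= len(xs); i += 4 {
        t0 += xs[i]
        t1 += xs[i+1]
        t2 += xs[i+2]
        t3 += xs[i+3]
    }
    for ; i < len(xs); i++ { // whatever is left over
        t0 += xs[i]
    }
    return t0 + t1 + t2 + t3
}

func main() {
    xs := make([]float64, 1000000)
    for i := range xs {
        xs[i] = float64(i%10) * 0.5
    }
    fmt.Println(sumSerial(xs), sumUnrolled(xs))
}
```

Whether the chip (or the compiler) actually takes advantage is its business, not the programmer’s, which was the whole charm of the arrangement.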

... But those days ended; the parallel instruction pipeline has, apparently, gotten as long as it can, and we’ve been depending on faster processors for ever-faster computers — and they, in turn, depend on something called Moore’s Law. ... Actually, it’s just the bright observation of Gordon Moore that processors got twice as many transistors every two years — for the same cost, which minor detail is often omitted. ... Anyway, the trouble is, it broke! [2]

... or Parallel Cores

So, Intel et al. have veered toward parallel cores: execute entire programs and/or parts of programs on multiple CPUs built on the same chip. ... The newest processors ship with this hardware, without anybody being really certain how software will make use of it — which difficulty Patterson and Hennessy heartily acknowledge, hence the need for Giant Government Programs....
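
For the record, here is roughly what “parts of programs in multiple CPUs” looks like from the programmer’s chair: the same kind of chore as the sum above, but this time nothing happens unless the programmer says so. A minimal sketch in Go (again my choice of language, and my made-up splitting scheme, not anything blessed by Intel or AMD), dividing one sum across however many cores the machine reports.

```go
package main

import (
    "fmt"
    "runtime"
    "sync"
)

// parallelSum splits the slice into one chunk per core and sums the
// chunks in separate goroutines, which the Go runtime schedules onto
// the available CPUs. Note the bookkeeping: unlike the pipeline tricks,
// none of this parallelism is free or automatic.
func parallelSum(xs []int) int {
    workers := runtime.NumCPU()
    partials := make([]int, workers) // one slot per worker: no sharing, no locks
    var wg sync.WaitGroup

    chunk := (len(xs) + workers - 1) / workers
    for w := 0; w < workers; w++ {
        lo := w * chunk
        hi := lo + chunk
        if lo > len(xs) {
            lo = len(xs)
        }
        if hi > len(xs) {
            hi = len(xs)
        }
        wg.Add(1)
        go func(w, lo, hi int) {
            defer wg.Done()
            for _, x := range xs[lo:hi] {
                partials[w] += x
            }
        }(w, lo, hi)
    }
    wg.Wait()

    total := 0
    for _, p := range partials {
        total += p
    }
    return total
}

func main() {
    xs := make([]int, 1000000)
    for i := range xs {
        xs[i] = i % 7
    }
    fmt.Println(parallelSum(xs), "summed on up to", runtime.NumCPU(), "cores")
}
```

The hardware supplies the cores; all the chunking and waiting in that sketch is the programmer’s problem, which is precisely the part nobody is sure scales to ordinary software.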

... In all this, I feel like some Rip Van Winkle because it’s as if nobody but me remembers that this parallel programming stuff has been a hot topic for years. ... I have a particular fondness for the transputer; my archives include an article from the 3/86 PCW (I’m not sure anymore what “PCW” stands for!) about Occam, the parallel programming language (like Occam’s Razor, you know) — stuck in amongst the latest CP/M revelations. ... And my genuine transputer brochure is dated 1985! ... Inmos was responsible for both the transputer and Occam, which were supposed to work together to introduce a glorious new age of parallel processing. ... Regrettably, that didn’t happen. ... It should be understood, parallel instructions or not, if someone could’ve got this parallel CPU stuff going it would’ve made $billions! ... So it seems unlikely, after hammering away all these years, that now we’re going to figure it out, Giant Government Program or not, just because some of us really really want to....
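
For flavor, and from memory rather than from that 1985 brochure: Occam’s model came from Hoare’s CSP, little processes that share nothing and talk over channels, and the idea outlived the transputer. Go’s channels descend from that same CSP lineage, so a rough modern rendering of the Occam style (a sketch of the idea, not a translation of any real Occam program) looks something like this:

```go
package main

import "fmt"

// In Occam you'd write "ch ! value" to send and "ch ? x" to receive, and
// the two processes would rendezvous on the channel. Go's unbuffered
// channels behave the same way: the send blocks until a receiver is ready.
func main() {
    ch := make(chan int) // unbuffered: sender and receiver meet in the middle
    done := make(chan struct{})

    // One "process": generate some squares.
    go func() {
        for i := 1; i <= 5; i++ {
            ch <- i * i // Occam would write: ch ! i*i
        }
        close(ch)
    }()

    // The other "process": print whatever arrives.
    go func() {
        for v := range ch { // Occam would write: ch ? v, in a loop
            fmt.Println("received", v)
        }
        close(done)
    }()

    <-done // wait for the consumer to drain the channel
}
```

The rendezvous is the whole trick: neither side proceeds until the other is ready, which is exactly the discipline Occam imposed.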

Money Matters: Owen’s Predictions

As I noted, one often-ignored feature of Moore’s Law is the cost. ... Similarly, Patterson and Hennessy etc. don’t seem to appreciate that, while processors may not be getting faster as fast as they used to, they’re definitely getting cheaper at least as fast, if not faster. [3] ... As an important software expert, therefore, I’ll go out on a limb and predict that our future will be awash with cheap processors; and, less obviously, that these wandering herds of cheap CPUs will be loosely coupled — which arrangement, as far as I can guess, is anathema to the parallel core bunch....

... “Loosely coupled” means the ubiquitous gadgets of the future might send each other relatively slow messages every now and then, instead of at nanosecond rates. ... Kind of like cell phones do now. ... Well, just like cell phones do now, actually! ... The brave new world of parallel computing is all around us, already: in these wretched cell phones, in people’s cameras, DVD players. ... Of course these unevolved critters don’t chat all that much — but they do chat! ... And they will only chat more! ... That is, it’ll be kind-of like the way electricity replaced steam, in what used to be called a paradigm shift — although now the term seems to mean a brilliant new progressive political idea — that is, the way the technology is used will change so much, it will seem like — and, really, be — a different thing....
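
To put a little flesh on “loosely coupled”: nobody shares memory, nobody waits in lockstep; a gadget posts a small message once in a while and otherwise minds its own business. A toy sketch in Go, with invented gadget names and invented timings, just to show the shape of it:

```go
package main

import (
    "fmt"
    "time"
)

// A status report: the kind of small, occasional message a phone or a
// camera might send, nothing like the nanosecond lockstep of cores
// sharing a cache. The names and intervals here are made up.
type report struct {
    device string
    note   string
}

func gadget(name string, every time.Duration, hub chan<- report) {
    for i := 0; i < 3; i++ {
        time.Sleep(every) // mostly the gadget minds its own business
        hub <- report{device: name, note: fmt.Sprintf("still alive (%d)", i)}
    }
}

func main() {
    hub := make(chan report, 16) // buffered: senders don't wait on the reader

    go gadget("phone", 300*time.Millisecond, hub)
    go gadget("camera", 500*time.Millisecond, hub)

    deadline := time.After(2 * time.Second)
    for {
        select {
        case r := <-hub:
            fmt.Printf("%-6s says: %s\n", r.device, r.note)
        case <-deadline:
            return
        }
    }
}
```

The hub never cares how fast the gadgets compute, only that a message turns up eventually, which is rather the point.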

... See, if you have enough cheap gadgets, it won’t make that much difference how long it takes them to compute. ... I’m talking millions here ... billions! ... Actually, there’s a familiar example: biological intelligence. ... Zillions of cells, neurons, whatever, apparently loosely coupled and slow; at least no one’s ever noticed anything going on in there that’s anywhere near the speed of the glorious Pentiums — but still, seemingly, capable of thought, reason, and manufacturing cell phones!

... I.e., eventually it’s going to be like Neuromancer (© 1984!), where the mysteriously-linked computers evolve into artificial intelligence — without the dystopia, of course. ... And without the big iron which author Gibson tacitly assumed would be in charge. ... Mind you, the cell phones could be just as nasty, and judging by their current inclinations I wouldn’t be surprised, but it’ll still be a cheap computer kind-of world. ... Love it or leave it, you know....

... Owen’s Predictions, Take 2: Big Iron Ad Astra! (Monday, January 8, 2007)

They say Jobs will announce Apple’s super phone tomorrow! ... But I was inspired by SMP and Embedded Real Time (Linux Journal 1/07 p 52) where Paul McKenney claims it’s a myth that “Parallel Programming is Mind Crushingly Difficult” because, after all, there’s MySQL, Apache, and the Linux kernel! ... There should be a Latin name for this, a variant of ad hominem: I will call it the macho argument, where the writer rhetorically asks if the reader is man-enough. ... Being a lifelong coward and wimp, I will cheerfully admit total unworthiness [4] — but of course, it’s a fallacy: mind-crushing or not, parallel programming costs more; the question is not “are you programmer-enough?”, the question is “do the cost-benefits of doing a project in a parallel fashion make sense?” — i.e., as opposed to writing some decent normal program. ... The answer is usually no, it doesn’t make sense, as many many industry thinkers — that is, other than me — have concluded over the years....
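
Here, for the record, is the shape of “costs more”, sketched in Go (the counting job is invented; the overhead is not). The sequential version is one obvious loop. The parallel version does the same job split across two goroutines and drags in a mutex and a WaitGroup it now cannot live without; drop the lock and it still compiles, still usually runs, and is quietly wrong.

```go
package main

import (
    "fmt"
    "sync"
)

const n = 100000

// The sequential version: the entire "program".
func countSequential() int {
    count := 0
    for i := 0; i < n; i++ {
        count++
    }
    return count
}

// The parallel version: same job, split across two goroutines, plus the
// mutex and WaitGroup it now needs. Delete the Lock/Unlock pair and it
// still compiles, usually still prints the right number, and is wrong;
// the race detector (`go run -race`) will say so. That gap between
// "compiles" and "correct" is where the extra cost lives.
func countParallel() int {
    var (
        mu    sync.Mutex
        wg    sync.WaitGroup
        count int
    )
    for w := 0; w < 2; w++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for i := 0; i < n/2; i++ {
                mu.Lock()
                count++
                mu.Unlock()
            }
        }()
    }
    wg.Wait()
    return count
}

func main() {
    fmt.Println(countSequential(), countParallel()) // both print 100000
}
```

None of that is mind-crushing; it is simply extra code, extra testing, and extra ways to be wrong, which is where the cost-benefit question comes from.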

Different Worlds

... But McKenney’s harmless citation of MySQL and Apache jogged the grey cells, and suddenly I could see the future! ... “Big Iron” will survive; maybe it won’t control the cellphone masses in a dystopic fascist regime, but there will be multi-multi-core servers, the performance of which so far scales very nicely with additional parallel resources; ... and there will also be personal computer microprocessors — whose performance doesn’t. [5] ... Personal computer users will not pay for wonderful advances in server technology, preferring features like longer battery life and fancy integrated graphics....

... The interesting corollary to this now-obvious future is that server development will no longer finance desktop development — which is such a norm of current arrangements that I, and most everybody else, hardly notice it! ... For years, we’ve just expected the super-fast servers to trickle down to our desktops in a few years — but no more....

... Obviously there’ll still be cross-fertilization, but just as obviously the paths will diverge, the feckless users’ processors wandering into mass-market frivolity hardly imaginable to us today, as they become gradually absorbed into the science-fiction super-phones and who-knows-what of the future, while the servers will stolidly plow along, becoming ever more powerful and making Amazon and Google ever-more money — for which privilege such esteemed organizations will pay top-dollar. ... It’ll be a strange new world — and sooner than I suspected!...

And now ... [6]

Late news Tuesday, May 4, 2010: now they’re threatening us! (Slashdot). ... I like the commentator faithfully promoting the ignorant silliness that Moore’s observation is somehow supposed to be an insight into the physical universe or something. ... “Well, it’s a Law, Mr. Smarty Pants!”


Notes

1. Well, there was some diesel and gasoline in there somewhere; but in the end, many steam-driven plants got electric motors, which were supplied, in turn, by steam-driven generators.

2. Ok Ok I’m lying ... Moore’s Observation didn’t break. ... It’s just that it was always taken to mean that every two years (1) you got twice as many transistors, or (2) the same number of transistors got a lot faster. ... What’s happened recently to this popularly accepted notion of the observation is that physics put the kibosh on #2: microprocessors have stopped getting faster, and people know it; they know their next PC isn’t much faster than last year’s — which is one of the many things that so worries Microsoft....

3. Cheaper computers is something else that’s worrying Microsoft....

4. I of course am merely being falsely modest, which I will try to correct here by mindless puffery. ... See my resumé for my promethean efforts in interrupt-driven assembler; ... while I’d probably never figure-out this week’s code repository, not to mention the scintillating open-source culture, the subject-matter is hardly alien....

5. McKenney’s inclusion of the kernel in his parallel group, while not strictly wrong, isn’t right. ... The big 2½ operating systems — Linux, Windows, and OS X — would indeed scale nicely with parallel execution — if only the vast hordes of simultaneous users, for which these systems were designed, would show up! ... That is, it is an anomaly of today’s system software that it is designed for multiple users, but used on computers that almost always have only one! ... It got like that for historical reasons, mostly the obvious circumstance that all the predecessors to today’s microcomputer software were indeed multi-user aka “time-sharing” systems — and, I suspect, the imprinting effect of the formative years of most of the movers and shakers in the computer biz, who couldn’t stand the idea of abandoning that wonderful parallel execution. ... These systems benefit today from the multi-tasking nature of their design and couldn’t leave home without it, but in the typical desktop/laptop PC, these tasks are relatively few, and do not benefit much from massive or even minor parallelism. ... You can actually see this in predictive puffery: the best the few remaining magazines can conjure up is running a great video game in one window while you surf the web — which will doubtless be much easier with a few spare cores but hardly what most people would spend more than a few bucks for....

6. Later news: pcworld.com breathlessly announces 3/29/17 that “many scientists agree that Moore’s Law is dying”! ... Wow it sure took those scientists quite a while, eh? ... Of course in the real world it’s been interred and rotting in the ground for ages, and the obsequies are a distant memory. But not on planet FakeNews™....