Friday, October 29, 2010

All modern operating systems are shite.

"Modern" operating systems are, to my eyes, fundamentally broken. Indeed, there's very little that's "modern" about them, when it comes down to it.

They are seemingly all mired in the concept of being nothing more than a layer over hardware, something that allows single-purpose "applications" to access that hardware in a consistent manner. The focus is wholeheartedly on the applications, and not on the data those applications use.

Admittedly, some of those systems are better than others. Apple's OSX is nice enough to use, mainly because it's decent enough to get out of the way when you need it to. But even so, it's conceptually no different to Microsoft's Win7, or even Vista, XP, Win2K (go back down the MS lineage as far as you want here) or any other "desktop" operating system - the only real distinguishing factor, apart from which applications are compatible, is how you get on with the UI. Personally, I find OSX tolerable, and Windows execrable, but that's personal preference, and not really much different to preferring a Ford over a Fiat.

Even the handheld market, one which was all-but invented by Apple's ground-breaking Newton, has taken massive steps backwards from the beauty and simplicity of Newton, or even the sparse functionalism of Palm; "handheld devices", say Jobs, Brin and Ballmer, "are for consumption of pre-rendered media". Forget using them as extensions of your computing environment, forget using them for anything useful (except, perhaps, as a poor replacement for a spirit level); they're there as status symbol, toy, and above all, conduit for advertising.

Yeah, I keep coming back to the Newton. Funny, that.

So, anyway, whilst fiddling with Android, and trying to make mobile devices actually useful again, and generally buggering about with my Wits A81, I've been doing a fair amount of thinking about this. And, indeed, I've been thinking about going further than that, and actually doing something about it.

I started by thinking that something could be done by using the linux kernel as a starting point. After all, it's already developed, and it runs everywhere - why bother reinventing the wheel? Just make a performant "shim" OS over the top of it, and a file system that does what you want, and you're laughing.

But that's suboptimal. Linux (the kernel) is tied to the Unix concept that "everything is a file". Which is a nice enough abstraction, but it doesn't work for me. The reason it doesn't work for me is that "a file" is far from being a rich enough concept. Part of this is to do with metadata (or, more particularly, the lack of it) - it's very difficult to implement interesting inter-application behaviour without metadata and (even more importantly) transparent, dynamic access methods for that data and metadata.
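To make the distinction concrete, here's a toy Python sketch (mine, purely illustrative - every name in it is hypothetical, nothing here is a real OS API) of the gap between a Unix-style "file" and the kind of metadata-rich data object I'm on about:

```python
# The Unix view: a file is an opaque byte stream. Any meaning it has lives
# in whichever application happens to know the format.
class ByteFile:
    def __init__(self, content: bytes):
        self.content = content


# The richer view: content plus typed, queryable metadata, so any program
# can ask questions about the data without hard-coding format knowledge.
class DataObject:
    def __init__(self, content: bytes, **metadata):
        self.content = content
        self.metadata = dict(metadata)  # e.g. mime type, author, provenance

    def get(self, key, default=None):
        # Dynamic access: the system, not the application, answers queries.
        return self.metadata.get(key, default)


photo = DataObject(b"...jpeg bytes...",
                   mime="image/jpeg", author="Rob", taken="2010-10-29")
print(photo.get("author"))  # prints "Rob" - no JPEG parser required
```

The point isn't the ten lines of Python; it's that the second shape lets inter-application behaviour hang off the metadata, while the first forces every application to reinvent it.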

As an example, let's consider Apple's OSX, and the interaction between the mail, calendar and address book applications. This is, pretty much, state of the (current) art - you send me an email suggesting a lunch date tomorrow at midday, I double-click on the "midday" text and get a "lunch" appointment added to my calendar with you as an "attendee". Pretty damn slick. But all that interaction knows about is those three apps - it's hard-coded to only work between those three. This is because the processing is contained in the applications, and not "owned" by the data. I can't make my own application do this without linking to a bunch of Apple-provided frameworks (and even then, it's hard, as a lot of the behaviour is not externally exposed), and even when I do, the mail application doesn't suddenly inherit the ability to do what mine does - the interaction only goes one way.
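What "behaviour owned by the data" might look like, in the crudest possible terms: actions registered against a data type, rather than compiled into three particular applications. A toy Python sketch (all names are mine and hypothetical; a real system would do this at the OS level, not in a dict):

```python
# A registry mapping data types to the actions available on them.
# Behaviour is defined once, against the type, not per-application.
ACTIONS = {}


def register_action(data_type, name, handler):
    ACTIONS.setdefault(data_type, {})[name] = handler


def actions_for(data_type):
    return ACTIONS.get(data_type, {})


# Define "add to calendar" once, for anything recognised as a date mention.
def add_appointment(text, sender):
    return f"appointment '{text}' with attendee {sender}"


register_action("date-mention", "add-to-calendar", add_appointment)

# Now a mail client, a chat client, a PDF viewer - anything that spots a
# date in its data - inherits the same behaviour, with no hard-coded links.
handler = actions_for("date-mention")["add-to-calendar"]
print(handler("lunch at midday", "you@example.com"))
```

In that world, my hypothetical application gets the slick email-to-calendar trick for free, and the mail application gets whatever my application registers, because the interaction belongs to the "date-mention" type, not to any one program.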

We shouldn't be living in a world where metadata and behaviour are application-specific. Developers should be developing behaviour, interfaces and transforms for specific data types, not reinventing the "application" over and over. Users should be able to edit an image directly in an email, and send it back to the sender, without having to save it, open it in another application, edit it there, save it again, then send it back. Data should be automatically versioned. It should follow you around. Your mobile device should be an extension of your desktop, able to take important data with you and merge changes back when you return, able to contact your machine over the 'net and fetch data you "forgot". And so on.
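"Automatically versioned" is less exotic than it sounds. A minimal Python sketch of the idea (my hypothetical design, not anything that exists): every write appends a new immutable version instead of overwriting, so "save" and "undo" stop being features each application reimplements and become properties of the system.

```python
import time


class Versioned:
    """A datum whose history is kept by the system, not the application."""

    def __init__(self, initial):
        self._versions = [(time.time(), initial)]

    @property
    def value(self):
        # Reading always gives the latest version.
        return self._versions[-1][1]

    @value.setter
    def value(self, new):
        # Writing appends; history is never destroyed.
        self._versions.append((time.time(), new))

    def history(self):
        return [v for _, v in self._versions]


doc = Versioned("draft one")
doc.value = "draft two"
print(doc.history())  # prints ['draft one', 'draft two']
```

The same append-only shape is what makes the merge-changes-back-from-your-mobile idea tractable: two divergent histories are far easier to reconcile than two overwritten blobs.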

Desktop computing hasn't advanced that much since 1984. There have been a few attempts to make things better, but they have, by and large, failed, at least if we measure success in a commercial sense. Newton was one of these. Computing has failed on several of its promises. We are slaves to the machine, not the other way around.

It is clear to me, at least, that we need a change. And the idea has been buzzing around my head for some time. I had considered that much of this could be accomplished by layering something over the Linux kernel, with a metadata-storing file system backing everything up. The problem with this approach is that the two worlds can never be allowed to collide - Linux provides a hardware abstraction layer and nothing else. Everything else the kernel does, including scheduling, would be surplus to requirements. So why not, I thought to myself, develop an OS directly "on the metal" in a dynamic language, do it all from the ground up? After all, it's been done before. The Jupiter Ace was a "home micro" which was coded purely in FORTH. The Symbolics Lisp Machines were coded, from the ground up, in Lisp.

And that hooked me. I've always loved Lisp. It has a beauty which is transcendental, a purity of purpose which is unrivalled in any (non-Lisp) language that's come since. A lot of people get scared by the syntax, but then people get scared by the fact Objective-C uses square brackets and colons, so fuck 'em. Result - I started looking at Lisp-based OSes, to see if it had all been done before. And I found this:

Now, this started to tickle my interest. Not only was I not insane (or, at least, not insane and alone), but Mikel is one of the original guys on the Newton (and pre-Newton) project at Apple. One of the guys who worked on SK8 at Apple. An insanely talented guy, who seemingly thinks much the same way as I do. Sure, it's not much further advanced than I am, but it at least shows I'm (possibly) not utterly wrong.

So, I'm not mad. And I *am* gonna do this. Stop thinking, start acting.