Friday, December 17, 2010


A few years back there was a big hoo-hah in the open source world, and it was at least partially responsible for the creation of the GPLv3. That hoo-hah was the TiVoization of the Linux kernel.

In short, what happened was this:

TiVo, inc. created their famous PVR device, which took the world (especially the geek world) by storm. At the heart of the device was a bunch of GPL software, including the Linux kernel. This latter fact was, IMO, a large part of why TiVo took the geek world by storm, but that's another argument.

TiVo did what any good company using GPLed software should do - they delivered the source of their modifications to the GPLed software, and kept their non-GPLed software scrupulously away from the GPLed stuff.

So far, so good, right? Wrong.

Whilst it was possible for anyone to recompile their own version of the TiVo firmware, it was not possible to flash that recompiled firmware onto a TiVo branded device. The firmware images used to flash the TiVo were cryptographically signed, and the means to do that was not public. There were, of course, good reasons for this, not least of which was that the media companies would have, legally speaking, shat upon TiVo, inc. from a great height if it were not so.

There was much oohing and aahing from the GPL advocates over this, and it was generally decided that this was somehow wrong, and it was going to kill the GPL, and other such crap. And thus was born the GPLv3, which "protected" not only the software, but also the hardware it was designed to run on. Linus and the other kernel developers, who saw little wrong with what TiVo had done, decided to stick with the old GPLv2 anyway, and to hell with V3.

Fast forward a few years. The Linux kernel has not died. The GPL has not died. But now there is a threat to the GPL, and it is attacking both v2 and v3. It's a real and present danger, and it's being ignored (at best) or cheered on (at worst). That threat has a name. Its name is Android.

It's not Android as such that is the threat, but the market in which it is being used - mobile devices. An open system is anathema to the hermetically closed world of mobile phones and telecoms carriers.  Android is being claimed as the saviour of the open source world from the big bad ogres at Apple and MS.  I believe this claim is wrong.

Even Google aren't being the open source heroes they're claimed to be.  The (non-GPL parts of the) source for Android 4.x (Ice Cream Sandwich) was released months after the first devices hit the street, and I'm not sure if the 3.x series has ever been made public.

Telephony providers are absolutely against people reflashing their devices (or even rooting them), and many mobiles are every bit as TiVoised as the original TiVo.  And that inability to reflash is being used as a marketing device by the providers / manufacturers.  "Benefit from Android x.x", they trumpet, "get yourself a new contract with handset / tablet X.1", quietly ignoring the fact that handset / tablet X.0 could quite happily run Android x.x should they bother to take a little time to provide a firmware upgrade.  And yet Apple are "evil", and "push you to consume", although they have a policy of supporting hardware for a decent amount of time.  No, Apple are evil because they are "closed source", and to avoid that we'll happily get fucked up the arse by someone waving a GPL banner.

But the worst, the absolute worst, of the lot, are the Chinese manufacturers of cheap Android enabled devices, motherboards, and chipsets.  It is absolutely impossible to get them to release the slightest piece of source code, despite the fact they are obliged to do so.

Android is nothing more than a system for pushing ads to your mobile device.  It's nothing to do with freedom, unless you're talking about Google's freedom to rape your private data.  The telephony providers are using it because it costs them jack shit.  Nothing to do with consumer benefit, nothing to do with freedom, simple bottom line accounting.  The Chinese manufacturers are using it for the same reason, and because MS have got harder on hooky copies of WinCE.

None of them give a flying fuck about "freedom", but between them, they may bring the GPL down.

How good is your hashing?

As promised in my last post, I'm working on an embedded lisp-based OS. Yeah, another one. Because what the world really needs is yet another geek's idea of what an OS should look like.

Well, fuck the world, I'm scratching an itch.

Now, if I were being sensible, I'd target something usable, like my Wits A81 tablet device. A little lispy tablet could be really neat (and, indeed, it may well end up being so). However, I decided to start a bit smaller.

A lot smaller, actually.

Yep, a lisp-based OS on a microcontroller. Not, I might add, the st8ms this blog originally started with (and is titled as), but its bigger brother, the stm32. It's an ARM Cortex-M3 processor, and the one I have in hand has 128K of flash and 8K of RAM, which is quite nifty.

Now, anything that's gonna fit into that little space is gonna have to be tight. Sure, I can push as much code as possible into flash and execute from there, but that 8K is a really tight limit for something like a Lisp, even a cut down one. And so, I started thinking about what takes up lots of space in lisp.

Firstly, there's objects. A naive implementation of boxed objects in Lisp means that even the humblest character takes up a significant amount of space, with 24 bytes not being unheard-of. Well, bollocks to that. I'm being as tight as I can.

Once you've squeezed stuff like characters and numbers and so on down, though, you start looking at what you can get shot of. Symbols, for example. They are used as unique identifiers, and are generally stored as a hash value and a literal string. Now, the hash value is what's used most of the time, and the string is only really used when "exploding" the symbol out into an array of characters. So if I'm willing not to do that, I can get away with *just* holding the hash. That's a significant saving.

So. What hash to use?

I need something that can be pushed into a single word of space (4 bytes), and which will allow me to detect collisions. By which I mean that if "a" and "b" resolve to the same hash value, I need to be able to tell. The first bit is easy enough, but the second is hard as nails.

A google for hashing algorithms will generally end up pointing you to Dan Bernstein's venerable djb2. Unfortunately, it's not really very good, and it certainly doesn't handle the second case. For that, what I need is an algorithm that pushes out *two* hash values, a primary and a secondary.
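To make the collision-detection idea concrete, here's a toy sketch of the interning scheme in Python. The two standard-library checksums (crc32 and adler32) stand in for the real pair-producing hash - they're nowhere near as good, this is purely to show the mechanics: while we still have the string in hand at intern time, a second, independent hash lets us spot the (vanishingly unlikely) case of two different names sharing a value.

```python
import zlib

def hashpair(name):
    # Stand-in for a real two-output hash: two independent 32-bit values.
    data = name.encode('ascii')
    return (zlib.crc32(data) & 0xffffffff, zlib.adler32(data) & 0xffffffff)

_seen = {}  # (h1, h2) -> name; only needed while interning, not at runtime

def intern_symbol(name):
    key = hashpair(name)
    prev = _seen.setdefault(key, name)
    if prev != name:
        # Two distinct names, same 64 bits of hash: bail out loudly.
        raise RuntimeError("collision: %r vs %r" % (name, prev))
    return key  # the runtime keeps only this pair, never the string
```

Once interning is done, the table of names can be thrown away (or kept on the host for debugging) - the target only ever compares hash values.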

Enter Bob Jenkins' 'lookup3.c', particularly 'hashlittle2()'. This is a rather nice hashing algorithm that throws out a pair of hash values, has very good characteristics, and runs fast.
But, oh noes! Compiling it for the STM32 results in over 1K of code. Surely we can do better than that? Yes, we can.

.global hash
.type hash, %function
.align 2
@ Hash Function
@ Thumb-2 implementation of 'hashlittle2' by Bob Jenkins.
@ In : R0 -> pointer to string
@ R1 -> length of string
@ R2 -> Initial value for 'hash c'
@ R3 -> Initial value for 'hash b'
@ Out : R0 -> 'hash c', the main hash value
@ R1 -> 'hash b', the secondary hash value
@ Clobbers R2, R3, flags
hash: push {r4-r7,lr}
@ Within the function, we use registers as follows:
@ r1 : length
@ r2-r4 : temporary for character loading
@ r5-r7 : a, b, c

@ initial setup
adr r7, hash_constant @ a = b = c = 0xdeadbeef + length + hash c
ldr r7, [r7]
adds r7, r1
adds r7, r2
mov r5, r7
mov r6, r7
adds r7, r3 @ c += hash b

hash_loop: cmp r1, #0x0c @ Is r1 <= 12?
it le
ble hash_tail @ If so, go do the tail part

ldmia r0!, {r2, r3, r4} @ Load values
adds r5, r2
adds r6, r3
adds r7, r4
subs r1, #0x0c @ Subtract 12 from length

@ Mix
subs r5, r7 @ a -= c
eor.W r5, r5, r7, ror #28 @ a ^= rot(c,4)
adds r7, r6 @ c += b

subs r6, r5 @ b -= a
eor.W r6, r6, r5, ror #26 @ b ^= rot(a,6)
adds r5, r7 @ a += c;

subs r7, r6 @ c -= b;
eor.W r7, r7, r6, ror #24 @ c ^= rot(b, 8);
adds r6, r5 @ b += a;

subs r5, r7 @ a -= c
eor.W r5, r5, r7, ror #16 @ a ^= rot(c,16)
adds r7, r6 @ c += b

subs r6, r5 @ b -= a
eor.W r6, r6, r5, ror #13 @ b ^= rot(a,19)
adds r5, r7 @ a += c;

subs r7, r6 @ c -= b;
eor.W r7, r7, r6, ror #28 @ c ^= rot(b, 4);
adds r6, r5 @ b += a;

b hash_loop
hash_tail: cbz r1, hash_done @ length 0 requires no extra work

ldmia r0!, {r2, r3, r4} @ Load values
adr r0, do_hash_mask
add r0, r0, r1, lsl #2

do_hash_mask: mov.W pc, r0 @ doubles for count 0 entry in masking table, *must* be 4 bytes hence .W
@ Here we branch into the table of bic instructions below: for a tail of
@ n bytes (1..12) we land n entries in, so the instructions we fall
@ through mask off exactly the bytes we didn't actually have
bic r2, #0x0000ff00 @ count is 1, mask off all but least significant byte
bic r2, #0x00ff0000 @ etc etc
bic r2, #0xff000000
bic r3, #0x000000ff
bic r3, #0x0000ff00
bic r3, #0x00ff0000
bic r3, #0xff000000
bic r4, #0x000000ff
bic r4, #0x0000ff00
bic r4, #0x00ff0000
bic r4, #0xff000000
bic r4, #0x00000000

adds r5, r2 @ Add masked values before final mix
adds r6, r3
adds r7, r4

@ Final mix
eors r7, r6 @ c ^= b
sub r7, r7, r6, ror #18 @ c -= rot(b,14)
eors r5, r7 @ a ^= c
sub r5, r5, r7, ror #21 @ a -= rot(c,11)
eors r6, r5 @ b ^= a
sub r6, r6, r5, ror #7 @ b -= rot(a,25)
eors r7, r6 @ c ^= b
sub r7, r7, r6, ror #16 @ c -= rot(b,16)
eors r5, r7 @ a ^= c
sub r5, r5, r7, ror #28 @ a -= rot(c,4)
eors r6, r5 @ b ^= a
sub r6, r6, r5, ror #18 @ b -= rot(a,14)
eors r7, r6 @ c ^= b
sub r7, r7, r6, ror #8 @ c -= rot(b,24)

hash_done: mov r0, r7
mov r1, r6
pop {r4-r7, pc}

hash_constant: .word 0xdeadbeef


There you go. 206 bytes of hash function. There's probably a few bytes to shave off here and there, but it works pretty well.

As it happens, I'm only using 30 bits of the main hash, with the lowest 2 bits being used to indicate type, in order that symbols fit completely into a single machine word.
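For illustration, here's roughly what that packing looks like - the tag value below is my own invention, not the real assignment:

```python
# One plausible layout for the word described above: the low 2 bits of
# the machine word carry a type tag, the top 30 bits carry the hash with
# its own low 2 bits discarded.
TAG_SYMBOL = 0b01   # hypothetical tag meaning "this word is a symbol"

def make_symbol_word(h32):
    # Drop the low 2 bits of the hash, splice in the tag.
    return (h32 & 0xFFFFFFFC) | TAG_SYMBOL

def word_tag(word):
    return word & 0b11

def word_hash(word):
    return word & 0xFFFFFFFC
```

Since hashlittle2 mixes well, throwing away 2 bits costs essentially nothing in collision terms, and every symbol really is just one machine word.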

Friday, October 29, 2010

All modern operating systems are shite.

"Modern" operating systems are, to my eyes, fundamentally broken. Indeed, there's very little that's "modern" about them, when it comes down to it.

They are seemingly all mired in the concept of being nothing more than a layer over hardware, something that allows single-purpose "applications" to access that hardware in a consistent manner. The focus is wholeheartedly on the applications, and not on the data those applications use.

Admittedly, some of those systems are better than others. Apple's OSX is nice enough to use, mainly because it's decent enough to get out of the way when you need it to. But even so, it's conceptually no different to Microsoft's Win7, or even Vista, XP, Win2K (go back down the MS lineage as far as you want here) or any other "desktop" operating system - the only real distinguishing factor apart from what applications are compatible is how you get on with the UI. Personally, I find OSX tolerable, and Windows execrable, but that's personal preference, and not really much different to preferring a Ford over a Fiat.

Even the handheld market, one which was all-but invented by Apple's ground-breaking Newton, has taken massive steps backwards from the beauty and simplicity of Newton, or even the sparse functionalism of Palm; "handheld devices", say Jobs, Brin and Ballmer, "are for consumption of pre-rendered media". Forget using them as extensions of your computing environment, forget using them for anything useful (except, perhaps, as a poor replacement for a spirit level), they're there as status symbol, toy, and above all, conduit for advertising.

Yeah, I keep coming back to the Newton. Funny, that.

So, anyway, whilst fiddling with Android, and trying to make mobile devices actually useful again, and generally buggering about with my Wits A81, I've been doing a fair amount of thinking about this. And, indeed, I've been thinking about going further than that, and actually doing something about it.

I started by thinking that something could be done by using the linux kernel as a starting point. After all, it's already developed, and it runs everywhere - why bother reinventing the wheel? Just make a performant "shim" OS over the top of it, and a file system that does what you want, and you're laughing.

But that's suboptimal. Linux (the kernel) is tied to the Unix concept that "everything is a file". Which is a nice enough abstraction, but it doesn't work for me. The reason it doesn't work for me is that "a file" is far from being a rich enough concept. Part of this is to do with metadata (or, more particularly, the lack of it) - it's very difficult to implement interesting inter-application behaviour without metadata and (even more importantly) transparent, dynamic, access methods for that data and metadata.

As an example, let's consider Apple's OSX, and the interaction between Mail, iCal and Address Book. This is, pretty much, state of the (current) art - you send me an email suggesting a lunch date tomorrow at midday, I double click on the "midday" text and get a "lunch" appointment added to my calendar with you as an "attendee". Pretty damn slick. But all that interaction knows about is those 3 apps - it's hard-coded to only work between those three. This is because the processing is contained in the applications, and not "owned" by the data. I can't make my own app do this without linking to a bunch of Apple-provided frameworks (and even then, it's hard, as a lot of the behaviour is not externally exposed), and even when I do, Mail doesn't suddenly inherit the ability to do what my app does - the interaction only goes one way.

We shouldn't be living in a world where metadata and behaviour is application specific. Developers should be developing behaviour, interfaces and transforms for specific data types, not reinventing the "application" over and over. Users should be able to edit an image directly in an email, and send it back to the sender, without having to save it, open it in another application, edit it there, save it again, then send it back. Data should be automatically versioned. It should follow you around. Your mobile device should be an extension of your desktop, able to take important data with you and merge changes back when you return, able to contact your machine over the 'net and fetch data you "forgot". And so on.

Desktop computing hasn't advanced that much since 1984. There have been a few attempts to make things better, but they have, by and large, failed, at least if we measure success in a commercial sense. Newton was one of these. Computing has failed on several of its promises. We are slaves to the machine, not the other way around.

It is clear to me, at least, that we need a change. And the idea has been buzzing around my head for some time. I had considered that much of this could be accomplished by layering something over the Linux kernel, with a metadata-storing file system backing everything up. The problem with this approach is that the two worlds can never be allowed to collide - Linux provides a hardware abstraction layer and nothing else. Everything else the kernel does, including scheduling, would be surplus to requirements. So why not, I thought to myself, develop an OS directly "on the metal" in a dynamic language, do it all from the ground up? After all, it's been done before. The Jupiter Ace was a "home micro" which was coded purely in FORTH. The Symbolics Lisp Machines were coded, from the ground up, in Lisp.

And that hooked me. I've always loved Lisp. It has a beauty which is transcendental, a purity of purpose which is unrivalled in any (non-Lisp) language that's come since. A lot of people get scared by the syntax, but then people get scared by the fact Objective-C uses square brackets and colons, so fuck 'em. Result - I started looking at Lisp-based OSes, to see if it had all been done before. And I found this:

Now, this started to tickle my interest. Not only was I not insane (or, at least, not insane and alone), but Mikel is one of the original guys on the Newton (and pre-Newton) project at Apple. One of the guys who worked on SK8 at Apple. An insanely talented guy, who seemingly thinks much the same way as I do. Sure, it's not much further advanced than I am, but it at least shows I'm (possibly) not utterly wrong.

So, I'm not mad. And I *am* gonna do this. Stop thinking, start acting.

Saturday, July 31, 2010

OpenEmbedded under OSX.

So, as a result of getting the little android tablet I mentioned before, I've been playing with OpenEmbedded. Unfortunately (for me), it's not supported (or at least not fully) under OSX - even getting the native tools up and running is painful. There's a *load* of gnu-isms in the source, which break strict POSIX compliance, and frankly make it a massive pain to get stuff working.

Much of this can be got around by using precompiled stuff (either pulled from one of the OSX package repositories, fink etc, or, in my case, compiled manually), and then explicitly removed from the OE build system using ASSUME_PROVIDED.
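For the record, the local.conf side of that looks something like this - the recipe names below are just examples, substitute whatever you've pre-built or pulled from fink:

```
# local.conf -- tell OE these native tools are already on the host,
# so don't try to build them (example entries; adjust to taste)
ASSUME_PROVIDED += "sed-native m4-native texinfo-native"
```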

This doesn't fix a few overall issues, though, much of which comes from OE assuming it's built on a system that uses ELF as its binary format.

A large part of this can be fixed with one patch, which I've attached here. $OEBASE/$CHECKOUT/classes/relocatable.bbclass tries to fix up the rpath in any binaries having it, but simply assumes all binaries are ELF format. Obviously, this crashes and burns horribly under OSX, where the runtime format for native binaries is Mach-O. Patch below, apply from $OEBASE/$CHECKOUT with 'patch -p1'.

diff --git a/classes/relocatable.bbclass b/classes/relocatable.bbclass
index 2af3a7a..3a4c119 100644
--- a/classes/relocatable.bbclass
+++ b/classes/relocatable.bbclass
@@ -3,6 +3,19 @@ SYSROOT_PREPROCESS_FUNCS += "relocatable_binaries_preprocess"
 CHRPATH_BIN ?= "chrpath"
 
+def is_elf_file (fullpath):
+    import subprocess as sub
+    p = sub.Popen(['file', '-b', fullpath], stdout=sub.PIPE, stderr=sub.PIPE)
+    out, err = p.communicate()
+    if p.returncode != 0:
+        return 0
+    if out.startswith('ELF'):
+        return 1
+    else:
+        return 0
+
 def process_dir (directory, d):
     import subprocess as sub
     import stat
@@ -24,7 +37,8 @@ def process_dir (directory, d):
 
         if os.path.isdir(fpath):
             process_dir(fpath, d)
-        else:
+        if is_elf_file(fpath) == 1:
+            # only try to relocate ELF files
             #bb.note("Testing %s for relocatability" % fpath)
 
             # We need read and write permissions for chrpath, if we don't have
@@ -85,7 +99,7 @@ def rpath_replace (path, d):
     bindirs = bb.data.expand("${bindir} ${sbindir} ${base_sbindir} ${base_bindir} ${libdir} ${base_libdir} ${PREPROCESS_RELOCATE_DIRS}", d).split()
 
     for bindir in bindirs:
-        #bb.note ("Processing directory " + bindir)
+        bb.note ("Processing directory " + bindir)
         directory = path + "/" + bindir
         process_dir (directory, d)

I'll post more in a little while, including full instructions on getting oe up and running under OSX, but I'm off on holiday for a week.

Sunday, July 11, 2010

Android on tablets, rights and wrongs

This is likely to be a relatively contentious post. I've spent a bit of time trying to get to like Android, and, in short, I can't do it. I don't like Android. I'm sure it's OK as a smartphone OS, but on a tablet, it simply doesn't work for me.

Now, I'm aware that I'm probably not the average user, but I'm (humbly enough) pretty well attuned to what people need to make computers work. That comes from 20+ years in "the biz", using pretty much every operating system there's ever been, including some real weirdos that most people haven't even heard of.

Now, to start hammering on what Android's got wrong, we really need to look at a comparable system that's got it *right*. And it needs to be something I have near me, so out comes the trusty Newton. Yep, that's right. The Newton. A 13-year-old, underpowered "pocketable" that was ridiculed in its earlier incarnations for its pisspoor handwriting recognition. Android's gotta be better than that, right? Read on for a load of words on why I think the way I do, but the executive summary is "No, wrong".


Before we get to the "nitty gritty" of little things, we need to look at the design rationales of the systems, or, for want of a better phrase, their "philosophy".

Newton came from a real intention to make a computer that was different, that was better. A radical shift from the 100lb hulking monoliths that lived on or under our desks at the time. A powerful computer that could be dropped into your pocket. A computer that anyone could use. A computer that didn't need a keyboard. It may have been a commercial failure (it cost Apple a lot of money at a time it couldn't afford it), but it opened a whole range of new business markets, and the whole concept of a pocketable "real computer" made a lot of money for other people.

Android comes from Google. In short, it can be summed up as a device for getting more eyes to Google's primary business, i.e. advertising. Google want to hit a new market (originally smartphones), so they leverage a bunch of open source software and try to make something that looks like what the current market leader (iPhone, in this case) looks like, and dump out a beta. Being "open" means that the hardware manufacturers can make it work on their chips, being free means that the carriers don't have to pay license fees, and being free means the consumer gets a cheaper device. Everyone wins, right? Well, in reality, not quite. Hardware manufacturers have mainly made *one* version work for *one* generation of chips, and have often not lived up to the GPL requirements so that their initial work could be carried on by the public. Carriers couldn't care less about updates, because they'd rather sell you a new phone with a new contract, and realistically, it's hardware costs that drive the pricepoint anyway. Net result is a messy market with a ~33/33/33 split of the 3 current major versions of Android, and most platforms not being upgradeable.

Then along came iPad, and with it a sudden rush of android-running tablets. Because a tablet is just like a scaled-up smartphone, right? Again, "No, wrong".

The problem here, I think, is that Android doesn't really fit in with the use case for a tablet device, and particularly not for the pocketable type of tablet. At least, not for *my* use case, but it's my post so I'll stick with my requirements, thank you very much. The iPad succeeds because it's a tightly controlled device with a tightly controlled market, aiming at consumption of media. It's not a general purpose computer, really - apps live in their own little walled gardens, and you can't run what you want on it. It's a fucking good little gadget, though.

Android thinks it's a phone. Except when it thinks it's an iPad. It doesn't have the same overall control that the iPad/iPhone has, and where Google have tried to enforce certain hardware requirements, they often don't make sense (for a phone *or* for a tablet) except when you look at it from an advertising pusher's point of view. I mean, really, why does a phone, or a tablet, need a GPS? Sure, it might be useful, but I fail to see the absolute necessity. Why restrict to a specific set of screen sizes? Why must it have a camera? After all, anyone who cares about the photos they take won't be using a phone camera anyway. But I digress slightly.

To sum it up: IMO, android is a "me too" iPhone/iPad clone, and not a very good one. It suffers from Google's "permanent beta" mania. Get it out fast and dirty, and fuck the early adopters.

User interface

The overall user interface of Newton is *tight*. It's intended to be used with a stylus, although most selections can be made with a finger. Here's what it looks like, more or less (in reality, it's a lot more "green"):

What we see at the bottom of these two screens is the stuff that's always available, viz: a "menu bar" for the current application, and a "Dock" (to use OSX terminology) for a few common apps and functions (undo, find, assist, more on these later). That uses up a fair amount of available screen space, but the rest belongs to your app. What's important here is that the interface is consistent. From the supplied calendar application to 3rd party web browser, you always know where to go.

Here's another one, with an "add-on" backdrop, and rotated into landscape mode (the green is a little overbearing, probably as it was pulled from an emulator)

The little star at the top of the screen is the "notification" icon for Newton - it's what Android mainly uses its waste-of-space "status bar" for.

Android is chaotic. It's mainly intended to be used with a finger, but that doesn't always work. One app might grab full screen and require some sort of gesture to pull up menus, another leaves menus and stuff on screen. Mostly apps leave the status bar at the top of the screen, but it's pretty much a waste of space - there's very little you can do with it. Admittedly, most apps respond to the "menu" button, but not all devices have a hardware menu button. Menu widgets don't seem to be standardised, and some are so small as to be unselectable without using a stylus. If nothing else, it's more food for the "open source can't do UIs" crowd.

Scrolling is standardised on both platforms - on Newton you use the up-down arrow keys, on Android you "swipe". I find that my swipes are taken as clicks or some other UI action about 50% of the time, and although the "inertial" scrolling thing looks cool initially, it's intensely frustrating as your swipe goes that bit too far and zooms waaaaaaaay past what you were looking for.

Closing apps is more or less standardised - the Newt has a little X at the bottom left corner, that closes your app. Android recognises the "back" and "home" buttons, but, again, not all devices have them.

Rotation hits some little glitches, as well. Android assumes you have an accelerometer, and auto-rotates the screen to fit. That's nice, as long as you have an accelerometer. I don't, and there doesn't seem to be any standard "rotate" widget as per the Newton. Which is not to say that the Newt is perfect in this respect, some apps behave very badly when rotated. I would argue, however, that Android should at least provide a widget as a sop to those who don't have an accelerometer.

Finding your data

Newton has a "find" button. It works. It always works. 'nuff said.

Android has a filesystem. A filesystem that you can't access without some sort of file browser. I have 3 on my device already - one works for one thing, one works better for another, and so on. Gah. I don't want to fuck about with filesystems, dammit!

Otherwise, you must find your data from within the application you want to use. Interfaces vary. It's messy. It's inconsistent.

This reveals another philosophical thing. Newton doesn't have a "filesystem" as such. No hierarchy, no structure. A big searchable "soup" of data and applications. Everything is data, everything has a bunch of attributes, you can search it. It's a database. While this is somewhat shocking to those used to the current way of doing things, it's hardly new : "Pick" did this in the 60s, BeOS did this to a certain extent, so did Apple to a very limited extent with the "classic" MacOS (and also with OSX).

The current "hierarchical" way of doing things is massively backwards, and it's holding us back. We use the name and location of a file to indicate its metadata. This makes no sense (as, for example, those who've been hit with Windows-based trojans and email viruses can testify). It works for certain system-level applications, but for user data manipulation, it's utter pants.

Currently, this need is covered to a certain extent inside individual applications (think, for example, iTunes), but that data can't be easily shared to other apps without intrinsic knowledge of how to "get at" the database, and upgrades to one app often break other stuff further down the chain.

It would be much nicer to simply have something where you could query along the lines of "show me all the emails I sent to fred over the last month, sorted by date". "Okay, now show me the emails in that list which had attachments". "Now show me the attachments which pertained to project 'foo'". Etc etc.

It's about time someone came up with a filesystem that is completely non-hierarchical (I'm currently working on a fusefs that does exactly this, actually).
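To give a flavour of what I mean, here's a toy sketch in Python (names and attributes invented for illustration) of querying a "soup" of attributed data instead of walking a hierarchy:

```python
# Every item is just a bag of attributes; "finding" is filtering on
# those attributes, not remembering where you put a file.
soup = [
    {"type": "email", "to": "fred", "sent": "2010-06-20",
     "attachments": ["foo-budget.xls"]},
    {"type": "email", "to": "fred", "sent": "2010-07-02",
     "attachments": []},
    {"type": "email", "to": "anne", "sent": "2010-07-03",
     "attachments": ["pic.jpg"]},
]

def query(items, **attrs):
    # Keep items whose attributes match every constraint given.
    return [i for i in items if all(i.get(k) == v for k, v in attrs.items())]

# "emails I sent to fred, sorted by date"...
to_fred = sorted(query(soup, type="email", to="fred"),
                 key=lambda i: i["sent"])
# ..."now just the ones with attachments"
with_attachments = [i for i in to_fred if i["attachments"]]
```

Each refinement works on the result of the last, which is exactly the "show me... okay, now narrow it down" conversation above - no paths, no folders, no file browsers.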

App launching

On the Newton, click on the "extras" button, go to the tab for stuff *you* have classified as applications, click your app. Or use "find". Or click on an associated piece of data. Or you might use "Assist".

On Android, go back to the home screen, go to the right tab, click your app.

To me, the whole "application centric" way of doing things smacks of "desktop computing metaphor crammed into a handheld device" with little thought. But maybe that's just me.

Data input

With Newton, you write on the screen. That's pretty much it. Really. Just write directly in the boxes. Newton usually turns it into a textual representation of what you wrote. Special characters and all. No special voodoo ways of writing a la "graffiti" on the Palm. Made a mistake? Scribble it out, it goes away. Or you can opt to have your text remain as a set of handwritten strokes, and have it recognised later (very handy for note taking in meetings, as the "usually" above implies - Newton's HWR doesn't always get it right, and might need some nudging in the right direction, which you can't do when scribbling at full speed).

Alternatively, there are on-screen keyboards, or, in extremis, the Newton serial keyboard, which is very handy for programming direct on the device.

With HWR, you get around 20-30wpm, about the same as a *fast* typist on a good keyboard. With OSK, 3-10wpm. The newton keyboard is a bit craptacular though, and you can count on about 15-20wpm using it, along with cramp in the fingers.

With Android on a tablet, you don't have much choice. It's on screen keyboards pretty much all the way unless your device supports hardware keyboards. For standard OSK with prediction, I got about the same as on the Newton (the prediction side of things only seemed to be able to predict one word - "android"). I've heard of 10-15wpm with predictive OSK and a fast OSK replacement you're trained to use (Swype, for example) - about as fast as a "slow" typist. With more training you might get a bit faster.

HWR is good technology. It's fast, and discreet (speech recognition has the potential of being fast, but makes you look like a git, and you can't easily use it in meetings or noisy environments). If you want *really* fast, shorthand recognition could get you into massive wpm speeds - at least as fast as speech and maybe faster. The hardware can do it (my cheapo chinese tablet is, at a conservative guess, 6-10 times as fast as the Newton without taking into account the vector unit and GPU). Fast, accurate HWR is no idle dream.

There's other failings with Android's input methods: they hide the entire screen when you're using them, so you have no context - try entering data into a form, and you'll be given a "next" button indicating that entering the current data will take you to the next form field, but there's no indication of what this, or the next, form field actually are. Barking bloody mad, that is. Maybe it's fixed in Froyo.

Inter-app communications

Nothing standard under Android. Some apps share, some don't.

This is more or less the system under Newton as well, but it also has "Assist", and that's available to all apps. Yep, that button that looks like a question mark. What does it do?

From some app, maybe a word processor, scribble "send to Mark", and select it. Hit "assist". Newton goes off and looks, finds that Mark might mean 2 people, and that sending might mean faxing or emailing. Asks who, and how. And then goes off and does it. "Call Fred", Assist, and the telephone dialer fires up, dials the number, and then puts a call logger up, allowing you to scribble notes whilst chatting.

"Assist" is mad cool, and it was available in the early '90s.

The built-in apps under Newton *do* share data, and the interface for sharing that data back and forth was available to other developers' apps. The same is more or less the case under Android, although it's hard / impossible for a developer to shoehorn additional stuff into an existing app (as could be done under Newton) without modifying and recompiling.

Synchronisation and so on

The Newton synchronises nicely with desktop apps via serial, ethernet, wifi, or bluetooth. As long as you have a sync app, of course. They're pretty hard to come by these days, and it's getting difficult to use a newt with a "modern" desktop OS. You now need 3rd party software to sync with a Mac, but I'll give the Newton a pass on that as the OS the Apple-supplied software ran on hasn't been current for nearly 10 years. Dunno if the windows sync stuff works on Vista/7, but it certainly ran on XP. 64 bit might be what fraks it totally.

Android mainly syncs, as far as I can tell, with "the cloud", which can largely be considered a euphemism for "Google's ads". Well, frankly, fuck the cloud. Desktop syncing is payware, so fuck that too. It really shouldn't be something one needs 3rd party software for - there *should* be some sort of extendable conduit for syncing - hell, OSX provides most of that anyway.

Other stuff

Let's be honest, Android does have a bunch of stuff it can do waaaaay better than the Newt. Video and audio playback is potentially loads better, mainly down to 12 years of additional hardware development. My cheapo tablet has more hardware potential than a top-of-the-range laptop from the Newton era, after all. That said, video playback under Android leaves a lot to be desired. The hardware should have the power to do NLE (non-linear editing) on video, and in most cases you can't even play stuff back unless it's in a specific format with a specific size. You can't assume codecs are there. A mess, in short.

The web experience on Android is loads better than on the Newt, too, even if it can't easily do embedded flash video (something I heave a sigh of relief over).

eBooks look nicer under Android. Higher DPI, colour screen. Fine. PDFs don't work at all under Newton, so even having an option to read badly-rendered PDFs slowly under Android is a blessing. That applies to 3rd party readers - I've not used the Adobe reader for 2 reasons:

- On the desktop it's a pile of bloatware crap
- My Android device doesn't have Google Market, so I can't get to it.

Photos, idem. Colour screen, higher DPI. Win for Android by default and the inexorable march of progress.

One area the Newton wins on is startup. From "press of button" to the "Happy Newton" chime and a usable device is measured in single digits of seconds. Android takes an age. In addition, the Newton comes back to *exactly* where you were when you turned it off or the batteries ran out, no matter how much time has passed between the two. No data loss, nothing. You're back. No finding files, you're still there. In 15 years of using Newtons, I've *never* lost data (not something I can say of PalmOS devices, for example).

Wakeup - I can't compare. Newton is instant, but my tablet doesn't seem to have any PM stuff enabled, so I don't know if it works or not. I have to hard power cycle it. Sucks, but probably not an Android flaw as such.

As far as gaming goes, Android is more modern. But 99% (or more) of the "games" I've found so far available under Android are made of suck. A lot of this is down to Google's utterly braindead decision to use Java, of all things, as a systems programming language. I mean, really. Gaming sucks on a platform that uses a garbage collected language with no JIT? Who'd ever have thought it?

And finally, performance.

I have in front of me 2 ARM devices.
One has a 162MHz StrongARM 110 processor, giving 1 DMips/MHz. It has 4MB of RAM and 4MB of flash, with 16MB of additional flash in one of the PCMCIA slots for a total of 20MB *total storage*. It's running a 12-year-old, interpreted, prototype-based language.

The other has a 600MHz Cortex-A8 processor, which gives 2 DMips/MHz, and has, in addition, a vector floating point unit, NEON FP extensions, and an OpenCL-enabled GPU. Even without using the addons, it's over *7 times* as fast as the Newton. It has a 13 stage superscalar pipeline and more cache than the Newton has main memory. It has 256MB of main memory and 2GB of flash, with another 8GB of flash in the SD card slot for a total of 10GB storage. It's running software that's mainly written in Java. Oh, did I mention that the processor itself is optimised for Java?

Guess which one feels "snappy"? Yep, you're right - the hardware specs don't make up for the software implementation.
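A back-of-envelope sketch of that raw integer throughput gap, using just the clock speeds and DMips/MHz figures quoted above (and ignoring the VFP, NEON and GPU entirely):

```python
# Rough Dhrystone throughput from the quoted clocks and DMIPS/MHz
# ratings (ignores NEON, VFP, GPU and cache differences entirely).
newton_dmips = 162 * 1.0   # 162MHz StrongARM SA-110, ~1 DMIPS/MHz
wits_dmips = 600 * 2.0     # 600MHz Cortex-A8, ~2 DMIPS/MHz

ratio = wits_dmips / newton_dmips
print(f"Newton: {newton_dmips:.0f} DMIPS, Wits: {wits_dmips:.0f} DMIPS, ~{ratio:.1f}x")
```

Call it seven and a bit times on paper - and yet it's the 162MHz machine that feels responsive.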

The "feel" of Android 2.1 makes me believe that the vaunted 450% speedup of Froyo's improved Dalvik are probably not overstated. IUt also makes me wonder why, if there was that much speedup to be had, why it wasn't "had" before initial launch of Android. And how much more there is to be had under the hood. Android's been out for over 2 years now; I'm hardly an early adopter.

No, really. Four hundred and fifty fucking percent faster. What the fuck?

Saturday, July 10, 2010

Teardown, and comparison to the Newton.

So, I tore the Wits almost all the way down, and at least far enough to get a look at both sides of the motherboard. I didn't get closeups of the chips, because the little button boards are glued in place and removing them looked like it was gonna be a bit too delicate for a slightly inebriated hardware tech...

Photos are in this set, and here's a taster:

Newton & Wits

The hardware is impressive. The case has brass inserts for the screws rather than being the usual self-tapping crap, and the motherboard is *not* the mess of patch wires and crappy soldering I was led to expect.

Comparison time. Let's look at the witstech A81 vs a Newton 2x00. Remember, I am biased, so I may appear harsh, but I will try to at least explain *why*.

Form Factor

Although the Newton is slightly longer for a smaller screen, it wins on ergonomics as opposed to aesthetics. The big "chin" gives something to hold onto in landscape mode, and the slightly "slimmer in the middle" case suggests (and indeed gives) a comfortable "portrait" mode grip. The wits, on the other hand, has nothing much to hold onto, and the case is slippery as opposed to the Newton's slightly "rubbery" feel. "How do I hold this" was one of my first thoughts when I picked it up.

The Newton also scores highly for its removable flip-over hard lid (screen protection), large stylus and pop-out stylus holder. Even the power supply scores well, with a bunch of adaptors supplied and easily interchangeable on the wall-wart.


Screen

No contest here. The Wits takes it hands-down, in terms of clarity, resolution, brightness (although the Newton's backlight *is* 12 years old, so...) and so on. It also beats the Newton in terms of touchscreen performance, mainly because this particular Newton has a bad case of "the jaggies".

Other hardware issues

The wits has a useful stand. I'm tempted to cancel this out with the Newton's ability to run on 4 AA pencells if you need it to. Newton's (single) speaker is pretty good, but the hardware behind it lets it down.

Battery life

No contest. Newton, hands down. *Weeks* on a set of pencells, and you can happily let it simply run out of power, leave it for months, shove in a new set of batteries and you're *exactly* where you were when you put it down. That said, the wits does get an honourable mention for having a removable, and relatively high-capacity, battery. It's far better than most; I got 4 hours 45 minutes running "Operation Sandstorm" continuously (load average 4, exercising the GPU and CPU pretty hard).
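That 4:45 figure implies a fairly hefty drain; a quick back-of-envelope using the pack's rated capacity (3000mAh at a nominal 3.8V, per the label):

```python
# Rough average power draw over the gaming rundown test,
# assuming the battery delivers its full rated capacity.
capacity_wh = 3.0 * 3.8        # Ah * V = 11.4 Wh
runtime_h = 4 + 45 / 60        # 4h45m = 4.75h

avg_draw_w = capacity_wh / runtime_h
print(f"~{avg_draw_w:.1f}W average draw")  # ~2.4W under heavy load
```
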


Connectivity

Both Wits and Newton can do bluetooth and wifi (although it's getting hard to find 5V PCMCIA wifi cards these days, and you're not gonna get anything other than 802.11b). The newt can happily do dialup, fax, or direct serial connection (this one is usually used as a serial terminal on my Sun Netra, actually). Oh, and IrDA. And LocalTalk. That said, you can probably do most of that with the wits, given a usb device, some drivers, and a following wind, but would you actually *want* to?

I'll give this one to the wits, as Wifi and bluetooth on the Newt are "kinda neat", but painful to set up. I'm assuming there will eventually be some android support for the wits' bluetooth chipset here...


Expandability

Grudgingly, the wits takes this. The newt has 2 PCMCIA slots, but in real world usage, one of those is gonna be taken up by a flash card. The little serial dongle on the newt is a pain in the ass, too. USB and a microSD slot show that some things *have* moved on in 12 years.

Cool Factor

Are you joking? Newton by a mile. It's still got a multicoloured Apple on it, for ${DEITY}'s sake.

If I'm honest, the Wits is a nicer machine overall, and undoubtedly massively more powerful, but the Newton shows up areas where the Wits could have been vastly better in a purely hardware sense. A shame, because Wits have obviously poured some love into this - it's not crap.

I will return to this later, with Newton vs Android in a software sense.

Wednesday, July 7, 2010

A New Toy

Well, I finally cracked. I'd been staying away from the Apple store, lest I inadvertently find myself the owner of an iPad, when I came across Archos' 7 Home Tablet - a nifty looking little 7" screened "mobile internet device", running Android.

So I ordered one.

After a while (quite a while, actually), and 3 calls from Archos telling me my order had been put back again, I got a bit steamed with Archos and told them where they could stuff their 7" rigid object. If you catch my drift.

However, I'd sort of got hooked on the idea of a 7" replacement for my ageing Newton MP2100, so I started casting around and eventually found witstech, who produce the A81, a nifty looking 7" device with GPS, a Cortex-A8 processor, and all for less moolah than the Archos. Score!

So I contacted them, and purchased a brand spanking new, hot off the presses A81(E) (The E meaning that it has extra buttons for Android usage, but more on that later).

So, sure as eggs is eggs, it got a bit delayed. But it did eventually get here.

Review time.

Out of the box, I find: One 7" tablet, half of the back removable, and a replaceable battery that slips nicely behind the back. Okay. One mini USB B to female USB A cable, one generic 5V, 1.5A charger (and they were kind enough to send one that had the right sort of plug for France, score one for Wits tech people), and (as I have a developer model) a little mini-USB B to 1.8V serial adaptor. The latter item looks like a mini-USB plug with 3 hand-labelled wires hanging out of it. Oh, and a nice enough little semi-hard case. An unboxing video and overall photos can be found here. Note that they got a few extra goodies that I didn't, like a car charger (already got one) and holder (don't need one).

In terms of finish, the outside of the hardware is not so bad, considering the price. It's lightweight, and has a relatively clean look overall. Although lightweight, it feels tough enough. A bit "chubby" in terms of depth if you're used to looking at Apple gear, but that's tolerable.

Battery is marked 3000mAh, 3.8V. That's a couple of massive plus points - not only is it a load more beefy than the majority of the competitors' offerings, but it's also removable / replaceable. More on battery life later.

The stylus (which is tiny, about the same size as a DS stylus) has a tendency to jam in the holder if you don't push it in "just so" (and it has nothing to help you with alignment - minus 1 point for industrial design there).

Nothing much to say about the power and USB ports, which seem solid enough. The audio out port looks "odd", or rather "cheap". We'll come to audio later.

The MicroSD card slot hides behind a little rubber grommet, which is normal, although it doesn't look overly waterproof. It's a push/push slot, but it seems to be mounted a little too far out; it's hard to get the rubber grommet in place with a card in there. Half a mm further in wouldn't have hurt.

The stand is a nice addition, pulls out from the back and holds the unit at a usable angle.

Button-wise, on the "top" of the unit we have a power button, which (being clear, and retro-lit with LEDs) also doubles as a charging / level indicator as far as I can tell. Green for fully charged, orange for charging, I think. Next to that, there's a pair of left-right buttons, one unit as a "rocker". Looks very much like a volume up/down pair, and, oddly enough, that appears to be what it's intended as. On the front, at the bottom left, there are 3 more buttons, with icons corresponding to "home", "back", and "menu". The thin plastic sheet over the fascia around these buttons is already coming away, and from photos it appears that this is endemic - I haven't had the guts to completely remove the sheet and see if it's just "packing protection" though.

Touchscreen is shiny, and has the slightly greasy feel that is common to most resistive devices. The LCD panel itself is bright (20% brightness is more than enough in most circumstances), clear, and (at least on mine) has no dead pixels. The 800x480 resolution gives roughly 133dpi, which makes for a nice enough display. Viewing angle range is "OK", at about 90° horizontally and 60° vertically before it's difficult to see.
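That dpi figure is just the diagonal pixel count over the diagonal size in inches; the sum, for the curious:

```python
import math

# Pixel density = diagonal resolution / diagonal screen size.
w_px, h_px, diag_in = 800, 480, 7.0
dpi = math.hypot(w_px, h_px) / diag_in
print(f"~{dpi:.0f} dpi")  # ~133 dpi for a 7" 800x480 panel
```
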

On the back we have 2 speakers, oddly positioned vertically (assuming a "widescreen" mode of usage), one over the other. Left and right would have been tricky with the battery placement, I guess, and I'm not expecting hifi quality sound from built in speakers anyway.

On firing the device up, we find, horror of horrors, WinCE. Never has a user experience been so well summed up by its name. Wikipedia tells me that WinCE has been out for 13 years, but it doesn't appear to have moved on at all - it still feels like Win 3.0 badly shoehorned into a portable device. It gets everything wrong. *Everything* runs full screen. Microscopic close buttons right next to other microscopic buttons - even with the stylus it's hard to hit the right one (and the touchscreen calibration is pretty much pixel perfect). This is obviously the build for the previous A81 with no buttons, because the buttons themselves don't do anything other than make a "click" noise. Even though I've told WinCE not to click. At all. Bastarding pile of shite.

Wifi performance is fine, it picked up my access point straight away and gave me decent speed. The antennae don't seem overly sensitive (I'd like an external antenna port, myself), but hey - it works. Bluetooth appears functional, although the godawful WinCE interface (particularly the disappearing keyboard) meant that I couldn't actually get the device to pair with my mac more than once.

WinCE, surprisingly enough, managed to use a USB keyboard and mouse through the supplied USB cable. Nice. Couldn't make my Mac see it as a USB storage device through a standard USB cable though. This may be an issue with the Mac, as I had the same problems with Android.

Before we do away with WinCE, though, let's look at the quality of the speakers (and the reasons for this will become obvious later). Playing through the speakers, sound quality is at least "OK", at least up to about 80% volume (about the volume of a relatively "loud" radio), at which point things start to distort badly. Cheap speakers, don't expect to use them for much.

So, we hate WinCE. Still alpha quality after 13 years. But we knew that.

Let's reflash with Android and see where we get. After all, Wits have just released an English-language Android 2.1.

Reflashing is a piece of piss, once you have the SD card formatted properly (a bit tricky on the Mac, as it craps all over the disk as soon as it's formatted and the booter needs to be the very first item in the FAT, but a bit of VirtualBox voodoo gets us there). Power on, whilst holding down the "left" button, and leave it alone for a few minutes.

Now, some of my gripes about Android are related to the fact the Android I have here is a beta. Some are to do with Android itself, which I'm not really a fan of. Bear this in mind as you read on.

So, we power on, and we get a 4-colour quadrant display as the kernel loads, followed by a bunch of android robots, and finally an animated "android" logo. Boot is pretty quick, but it should be, it's solid-state after all.

Out of the box, we don't have much. For some reason, Wits have left "phone" and "camera" functionality in place despite the tablet having neither. A bit daft, that.

No Google apps (particularly "Market").

Bluetooth doesn't want to play, at all. This is, I suspect, a Wits issue, or quite likely something to do with kernel drivers for the WL1271 wifi/bluetooth/fm chip. The end result is the same - no bluetooth.

Battery status monitoring is also not included, it seems - the unit reports itself as being on charge, 100% charged, all the time. That sucks, but at least the power button indicates the charging state.

Sleep doesn't appear to function - I suspect that there's no power management going on at all. Surprisingly, this still allows the unit to function for 7-8 hours with wireless turned on and decoding music (but not playing it through the speakers - my wife would have killed me for leaving it turned on all night making noise).

Sound drivers give almost zero volume on the built-in speakers. That's gonna be a Wits bad as well.

For the rest, everything seems to function properly. Calibration is good, again (although the fucking idiot who decided recalibration of the touch panel should require a reboot wants taking out back, shooting, and burying next to the bloke that decided changing network settings in WinNT required a reboot).

I find Android *very* difficult to get along with. Swipes are taken as taps, taps as swipes, and the keyboard / input handling is painful. I've found this to be the case on Android phones, too, but it sometimes makes me want to throw the damned thing at a wall.

The onboard browser is pretty good (no flash, thankfully), renders pretty fast, and appears to work OK. I might try Opera Mini to get some adblocking, though.

Although the onboard speakers are not properly driven by Android, the external audio is plenty loud enough, and without the usual underlying low-level hiss of cheap audio gear (tested with FLAC rips of Om's "At Giza" and Miles Davis' "Jeru"), and doesn't overload even with some really nasty noise pushed through it (Mainliner's "Black Sky"). As an audio player it's pretty good.

Gaming-wise, Quake3 runs nicely at 30-40fps, and Modern Combat: Sandstorm is equally playable (although multi-touch would be handy for this). Radiant Lite is a nice little "old school" shooter, too.

I haven't tested video playback, but given the gaming performance, it shouldn't be an issue.

Other apps were less endearing, and indicative of what the average android experience might be, regardless of platform. Many apps don't bother to check screen size, and run in a little window on the screen. That's just lazy coding. Others crash with no explanation, or simply hang. Again, bad coding - if they are crashing because of missing features, there should be some sort of feedback.

The device does miss a few features that Google, in their wisdom, have deemed "compulsory" for Android devices, namely: accelerometer, compass (and, to my chagrin, in my case, GPS). Not that I really need another GPS enabled device, but hey, I was kinda looking forward to playing "Zombies! Run!".

Overall, Android still feels part-finished, even allowing for the missing functionality of a beta release. That's part of the overall Google "permanent beta" thing, I guess. It's not awful, and it's certainly better than WinCE, but that's not saying much.

More later. Hopefully I'll be able to find my spudgers and do a teardown, and I'll do a side-by-side comparison versus my venerable Newton.