Being a study of trade-offs in the design of mobile devices, with a view to avoiding dystopias and promoting creativity.

The current tempest-in-a-teapot about background apps (Androids do, Apples don’t) is instructive. Robert Love’s Why the iPad and iPhone don’t Support Multitasking is useful in explaining why this is actually hard: memory starvation. (Having said that, I’m quite sure that Apple will come up with a solution that’s competitive with Android’s; that’s probably what they’re pre-announcing later this week.)
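
To make the memory-starvation point concrete: on Android, any process that has gone to the background is a candidate for being killed the moment memory runs low, and applications are expected to checkpoint their state so they can rebuild themselves when the user returns. A minimal sketch, with made-up class and field names:

    import android.app.Activity;
    import android.os.Bundle;

    // Minimal sketch; the class and field names are invented for illustration.
    public class SketchpadActivity extends Activity {

        private String draft = "";   // some bit of user state worth preserving

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            if (savedInstanceState != null) {
                // Our process was reclaimed while backgrounded; restore what we stashed.
                String saved = savedInstanceState.getString("draft");
                if (saved != null) {
                    draft = saved;
                }
            }
        }

        @Override
        protected void onSaveInstanceState(Bundle outState) {
            super.onSaveInstanceState(outState);
            // Called before the activity becomes a candidate for reclamation,
            // so this is the last chance to checkpoint cheap-to-rebuild state.
            outState.putString("draft", draft);
        }
    }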

It turns out that this issue makes a lot of other things hard, too. For example, I’d love a touch interface on the two most complex apps in which I spend any time, namely my photo editor (currently Adobe Lightroom) and my IDE (currently Eclipse/Android). These apps’ screens are infested with controls, and I’m pretty sure that I’d be more productive if I could get more intimate with my photos and classes and methods.

But these would also suffer grievously if starved of memory. My intuition tells me that something like Lightroom could be made to run acceptably on the kind of 1GHz-or-so processor the iPad has, but never, I’m pretty sure, with only 256M of RAM and no swap.
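
To put rough numbers on that intuition (the figures are my own assumptions, not anything Adobe has published): a 12-megapixel image decoded to 16 bits per channel is around 70MB, and an editor wants several full-resolution buffers live at once.

    // Back-of-envelope arithmetic only; the pixel and buffer counts are my own
    // guesses for a Lightroom-class editor, not anything Adobe has published.
    public class PhotoMemoryEstimate {
        public static void main(String[] args) {
            long pixels        = 12L * 1000 * 1000;  // one 12-megapixel raw frame
            long bytesPerPixel = 3 * 2;              // three channels at 16 bits each
            long buffers       = 3;                  // decoded original, working copy, undo snapshot

            long perImage   = pixels * bytesPerPixel;  // 72,000,000 bytes, roughly 68 MiB
            long workingSet = perImage * buffers;      // roughly 205 MiB

            System.out.println("One full-resolution buffer: " + (perImage >> 20) + " MiB");
            System.out.println("Editing working set:        " + (workingSet >> 20) + " MiB");
            // On a 256M device with no swap, that working set leaves almost nothing
            // for the OS, the UI toolkit, thumbnails, or anything else running.
        }
    }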

This could lead to a very nasty future scenario. At the moment, more or less any personal computer, given enough memory, can be used for “creative” applications like photo editors and IDEs (and, for pedal-to-the-metal money people, big spreadsheets). If memory-starved tablets become ubiquitous, we’re looking at a future in which there are “normal” computers, and then “special” computers for creative people.

Should this happen, the “special” computers would lose the economies of scale that have made it possible for me to type this into a circa-$1,500 device that would have been regarded as a supercomputer only a few years ago, and which happily runs my whole authoring system: a combination of Perl and Java and Ruby components with a relational database, an image-manipulation suite, and a Web server.

I dislike this future not just for personal but for ideological reasons; I’m deeply bought-into the notion of a Web populated by devices that almost anyone can afford and on which anyone can be creative, if they want.

So, let’s not do that. What does something like an iPad need to be seriously useful as a creative tool? Well, a keyboard, but that’s easy. And of course an application ecosystem that doesn’t exclude controversy, sex, and freedom. But I think that’s inevitable; even if what Apple is trying to do were a good idea, it simply won’t scale, and I’m pretty sure they’re going to have to unclench in the fullness of time.

But what it mostly needs is a butt-load of good old-fashioned high-performance random-access memory.

At this point I have to admit that my understanding of computer-system design issues is very basic. I have no idea what factors led to Apple’s decision to equip their device with a mere 256M of RAM; to my eye, this number seems seriously out-of-balance with a 1GHz CPU, a large high-resolution display, and an advanced application framework. Is it simply the case that they decided long battery life was more important than a big address space? Or is there some other constraint that comes with the form factor that I’m not clueful enough to know about?

In any case, I’m optimistic. Moore’s Law is on our side, and every class of popular-computing device in my lifetime has exhibited a monotonic increase in the amount of onboard memory.

I’m all for tablets, given the addition of keyboards, RAM, and freedom.



Contributions


From: Mike Taylor (Apr 05 2010, at 21:48)

My hunch is that 256 megabytes is the amount of memory that they could fit onto the processor chip itself. Increasing the on-chip memory or adding separate memory chips would probably increase the design complexity in a way that costs them on heat and battery life.

[link]

From: John Cowan (Apr 05 2010, at 21:51)

Y'know, this problem was solved as long ago as 1967 by OS/MVT, a variant of OS/360 that was able to run without pre-established memory partitions. In how much memory? 256 *kilobytes*.

256M too little memory? Give me a break.

[link]

From: Ronald Pottol (Apr 05 2010, at 21:54)

Argh, exactly: it seems like so many companies spec just enough memory to run the demo, but not enough to run for the user. I have a G1 Android phone, and the #1 reason I want to upgrade is more memory, but the only big step up is the Nexus One (I have 192, there are a few phones with 256, the N1 has 512).

I've always felt this way, but for portable devices it's just nuts.

[link]

From: Gavin Carothers (Apr 05 2010, at 22:05)

Good news: there's a class of applications that is very mainstream and uses even more resources than creative apps or IDEs. Games. Until people give up playing computer games, games are likely to continue to drive down the cost of (massively parallel) computing (graphics cards) and of general-purpose computing.

[link]

From: Michael Kozakewich (Apr 05 2010, at 22:06)

I feel confident that most dystopian futures are just pessimistic worrying or worst-case scenarios.

When you get right down to it, gamers have really driven a lot of the computer-performance markets, as have the big companies that need extremely powerful computers, like Pixar.

I think we'll see a split, in the very near future, between 'low-power' and 'high-power' systems, and you see the beginnings of that today. I don't think there'll be a complete flip-around to 'normal' and 'needlessly powerful', though, because people generally want the best of the best and they also want to be able to do a great variety of things, if the opportunity comes.

I think a tremendous majority will keep pushing for 'the best'.

[link]

From: Be Seeing You (Apr 05 2010, at 22:54)

So if you are viewing things as a web-based future, why does 256MB matter? Use the iPad as a thin-client device. I agree with the keyboard issue, having just played with an iPad, but the keyboard dock will be out in short order and supposedly the Bluetooth keyboard will work with the iPad now. I'd rather have tons of memory on a big box located elsewhere than lug it around, but maybe I'm something of an anachronist. Client/server computing is not the evil, scary monster that some make it out to be; after all, in essence that's all the web is.

[link]

From: fauigerzigerk (Apr 05 2010, at 23:01)

I agree with you on the memory issue (and on freedom). There is something your current and former employers could do about that, which is to add structured value types to Java. The lack of structured value types makes Java software use approximately twice as much memory as an equivalent C/C++, C# or Go program.
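
A rough sketch of the kind of overhead being described here; the per-object sizes are typical 32-bit-JVM assumptions rather than measurements:

    // Rough illustration only; the per-object costs are typical 32-bit-JVM
    // assumptions (8-byte header, 4-byte reference), not measured values.
    public class ValueTypeOverhead {

        // As a class, every Pixel pays for an object header plus padding, and an
        // array of them holds references to little objects scattered over the heap.
        static final class Pixel {
            short r, g, b;   // 6 bytes of payload, padded up by the VM
        }

        public static void main(String[] args) {
            int n = 1000 * 1000;                // one megapixel, to keep the demo small

            Pixel[] boxed = new Pixel[n];       // n references plus n small objects
            for (int i = 0; i < n; i++) {
                boxed[i] = new Pixel();
            }

            // With structured value types (or by hand, as here) the same data packs flat:
            short[] r = new short[n];
            short[] g = new short[n];
            short[] b = new short[n];

            // boxed layout: ~(4 reference + 8 header + 8 padded fields) = ~20 bytes/pixel
            // flat layout:  6 bytes/pixel
            // At 12 megapixels, that's roughly 240MB versus 72MB.
            System.out.println("allocated " + boxed.length + " boxed pixels and "
                    + (r.length + g.length + b.length) + " flat shorts");
        }
    }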

[link]

From: mpe (Apr 05 2010, at 23:02)

Isn't a tablet with a keyboard and more RAM called a "laptop"? Or did I miss the memo from Cupertino?

[link]

From: GIgi (Apr 05 2010, at 23:28)

There is plenty of empty space inside the iPad. Space was not the reason.

[link]

From: JulesLt (Apr 06 2010, at 00:40)

Tim - I think we already have this kind of divide - the bulk of laptop sales are at the $699 end, and most of those are compromised in some respect (lower-grade CPU with a smaller cache, slower RAM, an inferior GPU - anything, so long as they can hit the two numbers people think count: CPU speed and memory size).

While my team wouldn't want a cheap low-end machine from work, most of them are happy to buy low-end junk at home, because if its only use is as an internet terminal, it doesn't need to be anything better (currently).

(And for what it's worth, it seems to me that most of the gamers at work have migrated over to XBox gaming these days - I don't think gaming is the driver for PCs anymore - hence the rise of the bottom-end spec)

A client/server solution won't work for something like Lightroom until there's a massive increase in upstream bandwidth (I get 10 Mbps downstream, 0.3 upstream on fixed broadband. Mobile broadband is significantly worse).

[link]

From: Parveen Kaler (Apr 06 2010, at 01:02)

The Apple A4 is a system on a chip. I'm guessing that doubling RAM would be expensive and kill yields. RAM sits in a block on the chip. Maybe there wasn't enough space on the chip. Maybe it consumed too much electricity. Maybe fabs weren't able to ramp up in time. Maybe it cascaded into an issue where a different voltage regulator or timer was required. I'm not sure. I don't know. (I've done a bunch of development on home game consoles but I'm not an Electrical Engineer.)

What I do know is that hardware is hard.

What does this mean? Patience. It'll happen whenever manufacturing catches up. Programmers and computer scientists are impatient and assume that anything that they can write today should work today.

It looks like Apple acquired a chipmaker named Intrinsity. Apple is not in the habit of making acquisitions. But hardware is hard and an acquisition in this space actually makes sense.

http://www.engadget.com/2010/04/03/is-intrinsity-apples-latest-chipmaker-acquisition/

[link]

From: gvb (Apr 06 2010, at 04:25)

One of the few useful bits of information in the A4 "teardown" http://www.ifixit.com/Teardown/Apple-A4-Teardown/2204 is that the A4 processor is a PoP http://en.wikipedia.org/wiki/Package_on_package. Here is a link to the full cross section: http://www.ifixit.com/Misc/iphone_processor_crossection.jpg. Note that the two RAM dies are the same size as the processor die, but slightly offset for wirebonding access.

I did not expect this since PoP is a technology created to cram a processor and memory subsystem into the extremely limited real estate of a cell phone. The iPad form factor is pretty expansive compared to a cell phone. On reflection, I should have expected that, since I'm sure the A4 (or A5 ;-) will be in the next iPhone.

I'm sure one of the big disadvantages of PoP is that the dies have size limitations, which would lead directly to memory size limitations. Yield may also play a part, but I suspect it would be a small part compared to the physical size constraints required by the PoP package.

[link]

From: John Doerr once cried at TED about global warming (Apr 06 2010, at 07:01)

Tim,

John Doerr and his VC friends think the iPad is "The Next Big Thing" (see his 5 April 2010 TechCrunch guest-author post), and he thinks the iPad and its kin will become fat clients that eventually carry around terabyte-sized storage. If this ends up being the case, then why can't some of that TB of storage be used for VM, swap, etc.? A few excerpts:

> Bill Joy says the key to more performance is lower power. Over the next decade he sees 3 times better batteries, and 10 times lower power chips. So we should be able to run for the same price, 30 times as much application.
>
> And as for storage, there’s no reason that can’t be 30x also. Or, about a terabyte of local, faster, solid state storage. (A terabyte is several hundred movies)

Doerr said that Kleiner is putting another $200M into their iFund. Is Google going to compete with Kleiner in the VC funding of Android web services? Has Kleiner drunk the Apple Kool-Aid?

Observe that John Doerr once shed tears at TED because, he said, he was worried sick about global warming and its future effects on his family. Hmmm ...

[link]

From: David - he of the Ecards (Apr 07 2010, at 04:07)

Just what I have been thinking.

I use a Macbook Pro at home.

Too heavy to lug about when I went to India for two months - so I took a Macbook Air.

An iPad weighs in at a pound and a half, which sounds great compared to the three pounds that a Macbook Air weighs.

However, I shot photographs, uploaded the RAW images, converted them in Adobe Camera RAW, and uploaded them to the web server to accompany the articles I was writing for our blog.

How can I do that on an iPad?

[link]

From: Mike (Apr 07 2010, at 17:05)

So "preannouncing" is how the Google spinmeisters told you to characterize Apple's event? "Announcing" is intrinsically "pre": Perhaps you're confusing the word "announcing" with "releasing"?

[link]
