TL;DR: we use Apple’s regular product launches as a sense-check to cope with the myriad of technologies in which we could invest our time and attention.
Yesterday was another Apple product launch. Since the passing of Steve Jobs they feel less and less like the Wizard of Oz showing us behind the curtain, and more like another tech company wheeling out incremental updates while their competition catches up. This time, both Microsoft and Adobe shared the stage with Tim Cook and co, for goodness’ sake.
There’s been a lot of ink spilled and pixels pushed about Apple’s ‘culture of innovation’ and its ‘design-led principles’. People argue that you can get better value for money with other devices. Others (including me) worry about vendor lock-in. And so many people in my Twitter timeline yesterday were tweeting during the event that the features and products Apple were launching have been available on other systems for years.
But I think this is to miss the point. If you’ve got five minutes to spare, Steve Jobs explains why this is irrelevant in his answer to a question at WWDC 1997:
The point is that market leaders make opinionated choices. They put the user first and make decisions based around what’s useful for the user.
Conservation of attention
I’d argue that Apple’s product launches are now cultural artefacts. They’re included in regular news items along with world disasters and briefings about national politics. Rather than considering this as ‘entertainment news’ I think it’s perhaps more instructive to see Apple’s product launches as attention conservation devices.
Let me explain.
In the not-so-distant past, it was entirely possible for people to choose not to pay attention at all to consumer technology. It could just ‘not be for them’. They wouldn’t even feature on the technology adoption curve. People like this used to live out their lives without giving a second thought to things that others (including me) would happily choose to consider during every waking moment.
Nowadays, without a smartphone and a social network account, you’re quite likely to feel like a social pariah. As a result, you’re forced to pay some attention to consumer technology. But there’s so much of it! Thankfully, there’s an organisation that you can pay a lot of money to in order to provide a small, continually-updated, fully-supported product line that will ensure you have all of the technology you need in your life.
My favourite manufacturer, as I mentioned on the TIDE podcast this week, is actually Sony. The difference between Apple and Sony is that the latter doesn’t tell people what to pay attention to. They provide a multitude of options to fill almost any niche. I can imagine Apple’s designers having far fewer user personas than other organisations — if they use them at all.
If I were an academic I think I’d do some more research into this area. For instance, Apple’s never put a Blu-Ray drive into one of their machines, choosing instead to phase out physical media. As a result, they’ve done extremely well and have tied this in with developments around app stores and new/easy ways to pay for digital goods. However, the mojo only lasts as long as their products are fashionable and people agree with the opinionated judgements they’re making.
Attention is a zero-sum game: we’ve only got so much of it and once it’s gone, it’s gone. By providing regular, timely, opinionated updates about the state of the field in which they’re leading, Apple not only get to make massive profits, but are the world’s de facto ‘innovation department’ — even if they didn’t invent the technologies they’re showcasing.
Ever wondered why Mozilla’s Firefox web browser exists? It’s because about 10 years ago Microsoft had sewn up about 90% of the market and was creating vendor lock-in through anti-competitive practices. You can read about this in the History of the Mozilla Project. Happily, Mozilla were successful and now there are at least two high-quality alternatives to Microsoft Internet Explorer – which itself has become more aligned with web standards. It’s a win for everyone who uses the web.
The next battleground is mobile. Although Google’s Android mobile Operating System (OS) is billed as ‘open’, for example, it’s not really developed in the usual Open Source way: the source code tends to be released long after each iteration of the OS. Apple, meanwhile, maintains a notoriously closed ecosystem with a stringent procedure for inclusion in their App Store. They also control how you can get things on and off iOS devices in order to make money from the iTunes store.
Amazon, meanwhile, is fairly new to the mobile device game. They’ve taken Android and significantly modified it – including defaulting to their own app store. They’ve slashed the price of the Kindle Fire 2 (with, cleverly, ‘special offers and sponsored screensavers’) for Black Friday, making it a loss-leader. They’re betting on making the money back through Kindle book purchases, Amazon Prime subscriptions, and Lovefilm streaming.
So even though we may have multiple vendors, it’s essentially a similar problem to the Internet Explorer issue ten years ago. You may get shiny new ways to consume things that the vendor is selling you, but it’s not a great situation, overall.
You want a tablet? For Christmas 2012 that means you’re going to need to choose your vendor lock-in.
Thankfully, all this is set to change in 2013. Why? Two reasons. First, Mozilla are working on Firefox OS built entirely of standards-based web technologies. Secondly, Ubuntu Linux is being developed for mobile devices like the Nexus 7 and (even more excitingly) you’ll soon be able to run an entire desktop OS from your docked smartphone.
My conclusion? Buy a tablet if you have to, but be aware that real choice is around the corner…
Chris Betcher on his problem with upgrading to the latest release of the Mac operating system:
It’s not that I don’t like their products. I do. I have several Macs, iPads, iPhones, and Apple TVs. Walled garden or not, they build beautiful products that – for the most part – do exactly what they claim… they just work. While I don’t always approve of their proprietary attitude to the way they build their products, I understand the design goals that such a hardware and software symbiosis achieves, and I would still rather use a Mac than any other machine.
The overall message from Apple is loud and clear: thou shalt save thy documents in the iCloud, and thou shalt interact with those documents primarily through the applications that created them. (Thou mayest still employ the old ways by clicking “On My Mac” ere opening or saving documents. But seriously, consider using iCloud instead.)
At the same time, the Mac App Store is starting to look a lot more like the iOS store – i.e. telling developers that unless Apple can take a cut of their profits, they’re unwelcome on users’ systems.
One of the attractions of using Apple devices is that you can get stuff done on them without having to worry, for example, about random blue screens of death. However, it seems that those who do know their way around a computer are increasingly frustrated by Apple’s approach. Back to Chris Betcher:
I’ve been using personal computers for a long time. I’ll happily admit to being a “power user” and I rather object to Apple’s insistent belief that they need to dumb down my computer because they think I can’t cope with a file system, or that I should suddenly start scrolling in the opposite direction because it’s more “iPad like”, or that I should have fewer choices available because I need to have the software decide what’s best for me.
I’m all for making it easier to get things done with computers and other digital devices. What I’m not in favour of is simultaneously creating a walled garden for vendor lock-in. There are other ways – open standards anyone?
By the way, one of the best ways of reading that Ars Technica review is to add it to Pocket which then formats it really nicely without having to click for the next page (or be online, for that matter). 🙂
We have consistently voted for hardware that’s thinner rather than upgradeable. But we have to draw a line in the sand somewhere. Our purchasing decisions are telling Apple that we’re happy to buy computers and watch them die on schedule. When we choose a short-lived laptop over a more robust model that’s a quarter of an inch thicker, what does that say about our values?
Every time we buy a locked down product containing a non-replaceable battery with a finite cycle count, we’re voicing our opinion on how long our things should last. But is it an informed decision? When you buy something, how often do you really step back and ask how long it should last? If we want long-lasting products that retain their value, we have to support products that do so.
Today, we choose. If we choose the Retina display over the existing MacBook Pro, the next generation of Mac laptops will likely be less repairable still. When that happens, we won’t be able to blame Apple. We’ll have to blame ourselves.
This is less about Apple and hardware and more about a consumerist, short-term attitude that over-privileges form over function. And, of course, this applies to the Open Web too.
At the Mozilla Festival last year, Mozilla Chairperson Mitchell Baker stood up and gave a short talk. Something she said really resonated with me. In fact, it resonated so much that I baked it right in as a central message of my TEDx Warwick talk.
We need to move beyond mere ‘elegant consumption’.
There’s nothing inherently wrong with elegant consumption in and of itself. Reading, watching and experiencing other people’s creations put together in a thoughtful and delightful way is joyful. But if that’s all we’re doing, then we have a problem.
I’ve championed Apple’s hardware and software since buying my first MacBook in 2006. I love the way that their offerings are so easy to use. At some point over the past six years I think I’ve owned or used pretty much their whole product line.
So why this week did I install Pinguy OS (a Linux distribution) on my iMac and trade my iPhone for the open-source Nokia N9?
Until last year, it was possible to swap out almost any hardware and software and still have a functioning ecosystem. An individual or organization could first decide what they wanted that ecosystem to look like and then invest in the constituent parts of that ecosystem. I feel like that’s changed. Now it’s a case of choose your vendor lock-in. And worryingly, that choice seems to be increasingly an aesthetic choice.
Yes, it’s nice that Apple, through iCloud, auto-syncs all of my stuff everywhere. And it’s wonderful that Google can present me with a (mostly) seamless experience on their combination of hardware and software. But I don’t want to have to buy into their whole ecosystem to get the functionality I require.
I’ll tell you what I want. I want interoperability. I want standards. I want a world where I can plug one thing into another and it (mostly) works. And if that world is slightly less shiny than it might otherwise have been? Well, that’s fine with me. At least I’ll have learned to start worrying and love my data.
The face-to-face nature of conferences is, I believe, of even more importance in an extremely digitally connected world. Whilst it’s often the case that you can get to know people very well online, there’s something about embodied interaction that makes your knowledge of that person three-dimensional. I don’t think one method of interacting is necessarily ‘better’ than the other; a blended approach is best. This, I suppose, is why social media is so popular.
In addition, my opinion on Apple’s new iBooks Author was quoted on the JISC site this week. However, they mistakenly listed me as a ‘practising teacher’.
That little slip made me realise just how much I miss it…
There’s a website I check every morning after a quick scan of my emails and Twitter @’s and DMs. Yep, before I even find out if the world’s still there (via BBC News or, more likely, newsmap.jp on our touchscreen kitchen PC) I head over to Techmeme. If you haven’t seen it before, go and have a look now. We’ll wait for you. 😉
This morning I woke up to find an interesting juxtaposition of stories relating to the Apple iPad. Notwithstanding rumours of a 7-inch iPad in the works (hastily dismissed by John Gruber) the following couple of stories would make it seem like now is the time to get yourself an iPad:
What does this mean in practice? The ability to play almost any kind of media on the iPad, along with the long-awaited (potentially ‘killer app’) fully-fledged Google Docs.
But wait! What about the iPad naysayers? Those who say that it’s not neutral and that it’s only good for two things? My reply: no technology is neutral, nor is the language we use to describe things. There is no purely objective view/standpoint from which to judge anything. And as for the iPad only being good for two things? See above. 😉
A more salient point might be that this is v1 of the iPad. Although it marks almost a paradigm shift in computing, think of the original iPhone in comparison with what came after. Getting an iPad now, only for v2 with a ‘retina display’ and a front-facing camera to be launched after Christmas would be frustrating to say the least…
You’ll notice that I haven’t written a blog post about the new Apple iPad. There are two reasons for that. First of all, I haven’t got one (yet); secondly, what would I have to say that hasn’t already been said? The iPad has been included in almost every presentation I’ve seen over the last few months as an example of outstanding design. The tech community have marvelled at the fact that people – such as the very young and the very old – are able to use the device intuitively. People haven’t had to have training to do things they and others find useful.
There are many definitions of digital literacy, the subject of my Ed.D. thesis. As I have discussed before, almost all of them are ambiguous in one of seven ways. Some of them are ambiguous due to semantics, some due to scope, and some because of scale. And some, quite frankly, as a result of a combination of two or three of the above. Many definitions of digital literacy conflate skills with knowledge, wrapping it all up in a Prensky-esque assertion that it is almost the preserve of ‘digital natives’.
This, of course, is nonsense. There is no reason why the mere use of a digital tool should require a separate literacy or, indeed, anything over-and-above the basic skills that primary schools should (and do) teach. It’s my belief that poor usability and bad interface design can be mitigated by the learning of procedural skills early in life. In the eyes of older people who can remember life before a given technology, using it is assumed to be some kind of meta-cognition, a higher-level skill than it actually is.
My favourite example of this is the ‘digital camera’. You don’t hear people of school age using this term. It’s an anachronism. Who uses film cameras nowadays other than enthusiasts? The concept of taking a picture and it immediately appearing on a screen isn’t a difficult one to grasp: my son was happily snapping away as a 2 year-old and learning to frame shots as a 3 year-old.
It’s all about dominant paradigms. If you grew up taking photographs in the send-your-film-away-to-get-prints era, it takes a conceptual shift to move to digital photography. All the while you’re looking for the ‘equivalent’ of something in the digital system from the film system. It doesn’t quite work like that. It’s functionally similar but qualitatively different.
So, to my mind, much – but by no means all – of what we refer to as digital literacy consists of procedural skills. And the learning of such skills can be aided a great deal through effective interface design. For the second time this week I’m going to recommend you look at Chris Messina’s work – this time his rather useful Flickr collection of web usability stuff.
Digital literacy is a concept past its sell-by date. As I argue in an upcoming journal article, it’s lost pretty much any sense of creative ambiguity it may have once had. It also makes little sense from a procedural skills point of view.
We just need to design better user interfaces and nudge people into making more informed decisions. Enough of this talk of ‘digital literacy’! :-p