my @AmazonKindle Touch: well worth the wait

I’ve wanted a Kindle since version 2.0, and it’s hard to imagine that these devices once cost several hundred dollars. At long last, I’ve joined the club, with this little beauty:

my Kindle Touch WiFi

With a retail price of $99 it’s almost a no-brainer now. Especially since buying a hardware Kindle gets you access to the Kindle Lending Library (assuming you’re an Amazon Prime customer), which lets you read one book a month for free. I’m working my way through The Hunger Games now.

In addition, public libraries have ebook lending programs that work just like regular borrowing (though as with physical books, you have to put a hold on the popular ones and wait a while). And of course there is Project Gutenberg and the vast public domain. I’m not averse to buying books, but the same rules in my mind apply to buying an ebook as apply to buying a physical one: unless it’s a must-read, I can wait to borrow it from my library. The fact that the library lending model extends to ebooks’ domain is just pure unadulterated awesome. And if there’s something I really want to read, I can wait a month and get it via Amazon’s program, so that’s an advantage over the physical realm.

Of course, with the recent news that the Department of Justice is going after Apple and the big-name publishers for price-fixing collusion on ebooks, the price of ebooks will likely drop significantly at Amazon. The first book of The Hunger Games trilogy is already marked down to $5 (though the sequels are higher). The omnibus collection of the Game of Thrones books still says “price set by the publisher” at $29.99, which works out to less than $8/book, and I wouldn’t be surprised if that drops to $25 in the next week as well.

It’s a good time to own a Kindle. I have the same feeling of loyalty towards the Amazon ecosystem as most Apple stalwarts do towards theirs.

in defense of Microsoft Word

It’s the post-PC era, where we use apps and mobile phones and tablets and ultra-books, e-books, iBooks, and Nooks. We Kindle and we Hulu and we tweet and tumblr and like. Everything is in a cloud somewhere. This is quite a change from the halcyon days when computing meant sitting down at your computer and launching a program to do something; now all it seems we do (if you live in the digerati echo chamber, that is) is consume and critique.

That’s the context I perceive for this piece by Tom Scocca (@tomscocca) in Slate mocking Microsoft Word, which quickly went viral. Of the many Top Tweets about it, I found these two rather illustrative:

Most of the other tweets just repeat the author’s assertion that Word is “cumbersome, inefficient, and a relic of obsolete assumptions about technology.” The tweets above are useful in that they are explicit in their counter-assumptions about technology; namely, that the only real writing happens on the Web. It’s certainly true that using Word for simple text like email or blog posts is overkill, in much the same way that using a jet engine to drive your lawnmower is overkill. What’s peculiar is that rather than using simpler tools for their simpler tasks, these people have declared that the more complex and capable tool is “obsolete” and “must die”. This attitude betrays a type of phobia towards technology that I suspect has grown more prevalent as our technology interfaces have become increasingly “dumbed down”.

In actuality, most of the writing in the real world is the complex variety that requires more than a few buttons for bold, italics and blockquote. Ask any lawyer writing a brief, a scientist writing a grant, or a student writing a dissertation how useful Word is and you’ll get a very different perspective than that of people writing tweets about how Word is too complicated for their blogging. Scocca himself acknowledges that he used Word when he wrote his book, which is a pretty telling reveal that completely undercuts his argument that Word has outlived its utility.

If I were to match Scocca’s hyperbole, I’d have to contend that Word is possibly the finest piece of software ever written, in terms of its general utility to mankind. That statement is arguably more true than claiming Word must “die” – especially since as of fiscal year 2011, Office 2010 had sold over 100 million licenses and driven record revenue growth. And note that the division inside Microsoft that releases Office for the Mac is actually the largest OS X software developer outside of Apple, Inc. itself.

The reason that Word has outlived all its competitors, including the dearly departed WordPerfect and WordPro, is that it has evolved over time into an indispensable tool for a writer to save time and stay organized. Here’s a great list of 10 features in Word that any serious writer should be intimately familiar with. And even for casual use, some basic knowledge of Word’s features can let you do amazing things with simple text.

However, let’s suppose that you really don’t want to do anything fancy at all. You just want to write a plain text document, which is the basis of Scocca’s argument. Is Microsoft Word really as bad as he makes it out to be? Here’s a quick summary of Scocca’s complaints, with my comments:

* Too many features that are left “on”. As examples, he uses the infamous Clippy (which hasn’t been in Word since 2003) and the auto-correct function (which is also enabled by default in Gmail, as well as TextEdit and OS X Lion). If you really hate the autocorrect, though, it’s almost trivially easy to turn it off – a small blue bar always appears under the autocorrected word when the cursor is next to it. You can use that to access a contextual dropdown that lets you immediately undo the autocorrect or turn it off entirely, for example:

autocorrect options in Microsoft Word

* Scocca finds certain features irritating, specifically “th” and “st” superscripts on ordinal numbers (1st, 2nd, 3rd, etc.) and auto-indenting numbered lists. This is largely a matter of personal taste. Style manuals tend to recommend against superscripts out of concern over line spacing, but modern word processors like Word can easily handle a superscript without breaking the paragraph’s layout.

* He thinks that Word incorrectly uses apostrophes and quotes. He’s mistaken; see the image below where I demonstrate single and double quotes. Note that if you insist on using “dumb” quotes, you can immediately revert by using CTRL-Z (which every Word user should be familiar with, hardly “hidden under layers of toolbars”).

smart quotes in Microsoft Word

* For some reason, the logo for the Baltimore Orioles uses a backwards apostrophe. And for some reason, Scocca believes this is Word’s fault. I have absolutely no idea why he blames Word for this. Try typing O-apostrophe-s (O’s) into Word and you’ll see that the apostrophe is indeed facing the right way. I’m frankly unclear on why the backwards apostrophe on the Orioles’ logo is a threat to civilization, but even if it were, it’s not Word’s fault.

* Word uses a lot of metadata to keep track of its detailed and complex formatting. This has the effect of increasing file sizes by a trivial and negligible amount (the files taking up space on your hard drive aren’t Word documents, they are MP3 files, video, and photos). Bizarrely, Scocca tries to cut and paste the metadata back into Word as proof of excess, but this is a completely meaningless exercise which proves nothing. It’s true that if you try to open a native Word file in a plaintext editor, you’ll see a lot of gobbledygook, but why would you do that? If you open a JPG file in a text editor you’ll see the same stuff. Every file has metadata, and that is a good thing when you use the file in the software it is intended for. Of course, Word lets you export your data to any number of file formats, including web-friendly XML and plain text, so Scocca’s ire here is particularly misplaced and mystifying.

* Scocca sneers that Word still uses the paradigm of a “file” on a single “computer”. He says it’s impossible to use Word to collaborate or share. Perhaps he’s unaware of the fact that as of last month, email-based file attachments have been around for 20 years? Microsoft is also launching a cloud-based version of Office, called Office 365, and with the advent of tools like Dropbox and Live Mesh the old one-file-one-PC paradigm is no longer a constraint. It’s actually better that Word focus on words and not include network-based sharing or whatnot; there are tools for that, and isn’t feature bloat one of Scocca’s chief complaints anyway?

* and finally, he calls the Revision Marking feature of Word “psychopathic” and “passive-aggressive”. I wonder if he’s ever actually collaborated on a document? The revision feature has literally transformed how I collaborate with my colleagues and is probably the single most useful feature in Word. It’s trivially easy to accept a single specific change or to do a global “Accept All” between revisions and users. The interface, with color-coded balloons for different users in the margin rather than in-line is elegant and readable. Scocca gripes that “No change is too small to pass without the writer’s explicit approval” – would he rather the software decide which revisions are worthy of highlighting and which aren’t? This complaint is utterly baffling to anyone who has ever actually used the feature.
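Incidentally, the metadata complaint above is easy to demystify: a modern .docx file is literally just a ZIP archive of XML parts, which is why it looks like gobbledygook in a plaintext editor. Here’s a minimal sketch in Python – using a stripped-down, illustrative document.xml, not everything real Word actually writes – showing both the “metadata” and how easily the actual text comes back out:

```python
import re
import zipfile
from io import BytesIO

# A .docx file is a ZIP archive of XML parts. Build a minimal
# (simplified, not Word-complete) document in memory to illustrate.
document_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<w:document xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main">'
    '<w:body><w:p><w:r><w:t>Hello from Word</w:t></w:r></w:p></w:body>'
    '</w:document>'
)

buf = BytesIO()
with zipfile.ZipFile(buf, "w") as docx:
    docx.writestr("word/document.xml", document_xml)

# Read it back: the "gobbledygook" is structured XML, and the actual
# text lives in the w:t (text run) elements, easy to pull out.
with zipfile.ZipFile(buf) as docx:
    xml = docx.read("word/document.xml").decode("utf-8")

text = "".join(re.findall(r"<w:t>(.*?)</w:t>", xml))
print(text)  # -> Hello from Word
```

In other words, the stuff Scocca pasted back into Word is just the scaffolding that makes features like styles and revision tracking possible – and it’s structured enough that any programmer can get the plain text out in a few lines.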

Frankly, as a regular Word user for years myself, I find it pretty hard to sympathize with Scocca’s rant. None of his feature complaints are really valid, apart from some stylistic preferences (he’d rather bullet his own lists, etc) which are easily modified in Word’s settings. If the menus are really so intimidating, it’s trivially easy to google things like disable autocorrect, and if your google-fu isn’t up to that task then you can always leave a post at Microsoft’s super-friendly user forums where ordinary users themselves will be glad to walk you through it.

If Microsoft Word were to truly die, then we’d lose one of the most productive tools for complex and professional writing in existence. If that’s the future of the written word, where anything above the level of complexity of a tweet, email or blog post is considered too hard to deal with (and software gets dumber to match), then it’s a grim future indeed.

Long live Microsoft Word!

Facts about Foxconn – your move, Apple

As a follow-up to the ongoing and groundbreaking series at the New York Times about Apple’s dark side (labor exploitation of Chinese workers at Foxconn), Nightline just aired its own investigation, and courtesy of The Verge, here’s the takeaway message in handy factoid form:

Bloody Apple Valentine

  • It takes 141 steps to make an iPhone, and the devices are essentially all handmade
  • It takes five days and 325 hands to make a single iPad
  • Foxconn produces 300k iPad camera modules per day
  • Foxconn workers pay for their own food — about $0.70 per meal — and work 12-hour shifts
  • Workers who live in the dorms sleep six to eight a room, and pay $17.50 a month to do so
  • Workers make $1.78 an hour
  • (Foxconn CEO) Louis Woo, when asked if he would accept Apple demanding double pay for employees replied “Why not?”

your move, Apple, indeed. The question is, will consumers also be willing to pay $20 more for their iPads? (If not, will Apple be willing to eat that cost?)
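As a back-of-envelope sketch (my own rough assumptions here, not figures from the Nightline report): if a full doubling of that $1.78/hr wage were passed straight through to the sticker price, a $20 bump would correspond to about 11 hours of paid assembly labor per device:

```python
# Back-of-envelope estimate. Assumption (mine, not Nightline's): the
# entire cost of doubled wages is passed through to the retail price.
hourly_wage = 1.78        # Foxconn assembly wage, $/hour (from the factoids above)
price_increase = 20.00    # hypothetical retail bump, $

# Doubling pay adds one extra hourly_wage per labor-hour of assembly,
# so the implied labor content per unit is:
implied_labor_hours = price_increase / hourly_wage
print(f"{implied_labor_hours:.1f} labor-hours per iPad")  # -> 11.2 labor-hours per iPad
```

Given the five-day, 325-hand assembly process described above, that seems like a plausible order of magnitude – which is exactly why the $20 question is the right one to ask.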

Bizarro-Moore’s Law for SSDs?

It looks like SSDs are going to get worse over time, unlike hard drives:

SSDs are seemingly doomed. Why? Because as circuitry of NAND flash-based SSDs shrinks, densities increase. But that also means issues relating to read and write latency and data errors will increase as well.
The group discovered that write speed for pages in a flash block suffered “dramatic and predictable variations” in latency. Even more, the tests showed that as the NAND flash wore out, error rates varied widely between devices. Single-level cell NAND produced the best test results whereas multi-level cell and triple-level cell NAND produced less than spectacular results.

This suggests to me that SSDs are never going to break out of the boot-disk niche for hardware builds. I get equivalent read speeds on my striped hard drives as on my boot SSD, for that matter.
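For the curious, that read-speed comparison is the quick-and-dirty kind. Here’s a minimal Python sketch of a sequential-read timing you could run against each drive – with the big caveat that OS page caching will inflate the numbers unless the test file dwarfs your RAM (or you drop the cache first), so treat it as a sanity check rather than a real benchmark:

```python
import os
import tempfile
import time

def sequential_read_mb_per_s(path, chunk_size=1024 * 1024):
    """Time a sequential read of `path` in 1 MB chunks, return MB/s."""
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return (total / (1024 * 1024)) / elapsed

# Demo against a scratch file; to compare drives, point `path` at a
# large file on the disk you actually want to measure.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(16 * 1024 * 1024))  # 16 MB of test data
    path = tmp.name

try:
    print(f"{sequential_read_mb_per_s(path):.0f} MB/s")
finally:
    os.unlink(path)
```

Run the same timing against a file on the striped array and one on the SSD and you can see for yourself whether the SSD premium buys you anything for sequential workloads (random-access latency is a different story, and it’s where SSDs still win).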

Feeling Blu, more on SOPA, DRM, blah blah blah

It looks like VLC media player will soon support encrypted Blu-ray playback. This seems relevant to the discussion started by Pete and continued by J (hardware) and Steven (software). I’d just like to add that AnyDVD HD should be legal to own in the US as far as I know, since it allows you to back up discs you already own. I should get around to doing exactly that, in fact, because all our Disney DVDs are getting scratched to heck.

Actually, it’s probably illegal to download a torrent of a DVD you already own but that’s too scratched to view, and to use the torrent to burn a new DVD copy. But it shouldn’t be, which is why I return to my rant about DRM and the huge wasted opportunity that was SOPA activism.

Speaking of SOPA – great article at Big Think going against the grain, titled “Hooray for SOPA!”. I think it’s a great point to make, especially about how small content producers get screwed by piracy – just look at the state of plagiarism on Amazon’s Kindle store. And also an Ars article about the recent takedown of filesharing site Megaupload, asking “if we can take down Megaupload under existing US law, why do we need SOPA?” (Ars is more diplomatic. My answer: because SOPA was never about domestic infringing sites, and thus was never a threat). Mark also had kind words for my earlier screed, and I wholeheartedly endorse his archived post about DRM and Intellectual Property – a must-read.

What Would Steve Do?

I think the best way to honor Steve Jobs’ legacy as a visionary is to refuse to be content.

Why was the Mac such a success? Because of user discontent with computers at that time – and the existence of Macs is why Windows 95 was such a revelation to me, and why I love Windows 7. The same applies to music players, to handheld PDAs, to phones, to tablets. I love my Blackberry Bold Touch and I lust after a Kindle Fire, but they wouldn’t be worth lusting after if not for the iPhone and the iPad.

Discontent drives innovation, and stagnation creates opportunity.

I think that Steve Jobs understood this more deeply than many of the users of his devices. He created user experiences from start to finish – but he was always pushing the envelope. The intensity of Apple users’ fandom is testament to the value of those experiences, but it also in a sense created the same stagnation that afflicted the technologies obsoleted by Apple. It was Jobs’ genius that he refused to be content, even though his products created contentment.

Jobs created the iPad out of nothing, but suppose he hadn’t? Apple aficionados would still be happily using iPods, iPhones, and MacBooks. If some other manufacturer had created the tablet, Apple’s users would have dismissed it, pointing out the many advantages of their elegant Apple products over such ungainly eyesores. Apple users have always been content with what they were given. Steve Jobs, alone, pushed the envelope, innovating not out of the discontent of Apple users but out of the discontent of everyone outside Apple’s fold.

That was a mighty burden and a challenge for Jobs, surely – one that I suspect he needed to maintain his creative output. After all, he could have easily milked the supply of loyal Apple fans endlessly without any real innovation at all. But it was his eye on the rest of us, still using Windows and BlackBerries and Android, that pushed him onwards. It was our discontent with Windows 95, with our StarTacs, our Handsprings, even our mice and our monitors, that was the inspiration for his innovation, and it was the stagnation of Microsoft, Logitech, Motorola, Samsung, etc. that gave him the opportunity.

I think that without Steve Jobs, Apple risks that same stagnation. It’s already happened, in a sense, to the MacBook line and the iPod. And that’s ok, because Apple raised the bar, and could easily continue on autopilot on its existing product line and customer loyalty. It would then fall to another company to exploit that stagnation, and keep the engine of innovation moving forward. That’s Steve Jobs’ true legacy.

Are you content with your iPad? I’m not, even though I love it. And I think there are already signs that the next round of innovation is upon us. What better way to honor Steve’s legacy than to stay hungry for the next better thing, rather than the next same thing?