Urban Legends and the Web

The Monday Note covers the intersection of media and technology and the shift in business models. It is jointly edited by Frédéric Filloux, a Paris-based journalist, and Jean-Louis Gassée, a Silicon Valley veteran and currently a general partner at the Palo Alto venture capital firm Allegis Capital. Their column appears on CBSNews.com each Monday.

Legends die hard. In the pre-Web days, they got printed and reprinted, told and retold, and so became official, like spinach being good for you because it contained the iron your red blood cells needed. After decades of the disgusting veggie being inflicted upon young kids, a scientist went back to the bench and found there was no digestible iron whatsoever in spinach. You don't get calcium by ingesting chalk; you need a calcium compound that will get through the sophisticated filters of the digestive system. Eating spinach gives you as much digestible iron as sucking on nails.

The spread of legends gets worse with the Web. Stories - I'm avoiding the word "information" - travel fast. Yarns bounce around a world-wide echo chamber: if I hear it from five sources, it must be true. Never mind that the so-called sources heard it from one another in sequence. Worse still, the Web never forgets; everything gets cached and archived, waiting to be unearthed by search engines.

This creates a need, and entrepreneurs pop out of the quantum vacuum ready to fill it: a Google search reveals at least three companies, reputationrestore.org, reputationrestorer.net and restore-reputation.com, that promise to clean up your besmirched Web image. Actually, the three look like the same company and, at the risk of unfairly tarnishing their own rep, like one of those all-too-frequent scams purporting to protect you from scams. Ah well…

So it goes for a tenacious legend: Apple "lost" the market because it failed to license the Mac operating system to "everyone"; had it done so, the story goes, Apple would have owned the market instead of losing it to the "obviously inferior" Microsoft product.

A few days ago, no less than über-blogger Henry Blodget, the Internet Bubble pentito now heading the Business Insider blog hub, fell for it. This industry observer, who admitted he had never set foot in an Apple Store - not a sin if your territory is the quick oil-change industry - chided Apple for "making the same mistake again." Just as in the '80s, he says, Apple insists "on selling fully integrated hardware and software devices, instead of focusing on low-cost, widely distributed software." His conclusion: Apple will lose to the open-source Android, just as it lost to Microsoft.

I know we shouldn't let facts get in the way of a good story, but let's take a closer look at the data. Today, Android is free. This, in effect, sets the market price for smartphone OS licensing deals. Ask Microsoft: how do you tell Motorola or HTC they ought to fork over $25, or even $15, for a Windows Mobile license when Android is free (and arguably better)?

In this context, how could Apple charge for the iPhone OS? How would they replace the $400 or so they make per iPhone? The joke would have it that they'd make it back in volume. Maybe they'd get it in App Store revenue, an estimated net $500M in 18 months? Great, but still no match for tens of billions of dollars in hardware sales: multiply 50 million iPhones and iPod Touches by $400 and you get $20 billion.

Apple could indeed end up "losing" the smartphone market to Android, just as it "loses" the PC market today while making more money than Dell and Hewlett-Packard combined; those two companies account for a 33% market share versus less than 10% for Apple. (More details in the November 1st, 2009 Monday Note.)

Ask GM how they feel about a "tiny" Bavarian automaker. Of course, Apple can make an inferior product and lose. That isn't too far from what actually happened with the original Macintosh. I know because I was there.

Rewind to 1981: IBM introduces the PC. At the time, it was pretty much a clone of the Apple II, slots, a cassette tape interface, game controls and all. The big difference was a 16-bit Intel processor, the 8088, a low-cost variant of the 8086, the chip whose four digits ended Microsoft's original corporate phone number (I'm not kidding). The then-reigning Apple II had the 8-bit 6502, a dead-end architecture, as its supplier, MOS Technology, could not provide a credible transition to a 16- or 32-bit world.

The PC evolved and got faster with newer Intel CPUs, the crucial addition of a hard disk and, even more epoch-making, the advent of the first killer app: Lotus 1-2-3. Written in assembly language and lightning-fast, Lotus 1-2-3 was called an "integrated application": it combined a spreadsheet, business graphics and a database, and it became all the rage. I can attest to that from first-hand experience; to some of my Apple colleagues' chagrin back then, I maintained a PC in a small cubicle next to my office.

When the Mac came out in 1984, this is what it faced. The original Mac clearly showed great promise: its user interface was superior, and it built on the lessons of the Lisa's failure. (The Lisa, released in 1983, was Apple's first bit-mapped-screen, mouse-driven machine.) But the first Mac, for all its promise and sexiness, was slow and buggy, with a small screen, no hard disk, no color and no application software that could compete with Lotus 1-2-3.

When Steve Jobs returned to Apple in 1997, he brought in a team of experienced engineers from NeXT and promptly killed the half-hearted licensing program that was siphoning off the company's hardware margins; you can't be in the hardware and the licensing businesses at the same time. Over the years, a steadily improved product and tight control over every layer of the user experience, including the Apple Stores, produced the revenue and profits we know.

But legends live on. How about that almost-forgotten one? IBM licensed key parts of the original PC design and, for its reward, lost the PC market in spite of its efforts to regain control with a new bus architecture, Micro Channel, and a new software platform, OS/2, billed as a "better DOS than DOS" and a "better Windows than Windows."

By Jean-Louis Gassée and Frédéric Filloux
Special to CBSNews.com