IN 2003, when your correspondent arrived in Silicon Valley, a common response to “How is the Valley?” was “In a nuclear winter.” The dotcom bust had incinerated an entire generation of start-ups. A much-debated essay argued that “IT [information technology] doesn’t matter.” The Valley itself seemed to matter less.
Its geeks were desperately looking for their “next big thing” and minting neologisms (“utility computing”, “the digital home”) in the hope that one might stick. But ordinary people outside the Valley were no longer paying attention. Valley geeks were already hopping onto Wi-Fi hotspots and playing with “smart” phones, but most people were still dialling up to connect to the internet and using mobile phones only for talking. There was some excitement about a fairly new gadget, Apple’s iPod, but nobody suspected that its progeny, in the form of a phone, might one day make the internet “mobile”. Nor did a popular search engine, Google, show signs that it might be a lucrative business, much less a new technology superpower. It was still a world of personal computers, dominated by Microsoft through its Windows operating system.

But towards the end of 2003 two conference organisers, Dale Dougherty and Tim O’Reilly, were brainstorming when Mr Dougherty used the words “Web 2.0”. They immediately realised that the phrase—with its software connotation of a newly released, better and more stable version—had enormous appeal as a rallying cry for the Valley. The Web 2.0 Conference was born, and the first one, in San Francisco in October 2004, created a stir.
It took place shortly after the first big initial public offering (IPO) of a technology firm since the dotcom boom. Google’s IPO did not just announce the Valley’s return to Wall Street. It also unveiled a new business model. When Google at last revealed how much money it was making by placing small, targeted text advertisements next to search results, jaws dropped. Overnight, every entrepreneur had a new one-word pitch to venture capitalists: advertising.
Google became the Valley’s new champion. Its share price soared, it entered new areas almost weekly—from e-mail to maps, from radio to newspaper advertising—and it started buying start-ups, thus replacing the stockmarket as the preferred “exit strategy” for entrepreneurs. Yahoo!, Microsoft and other rivals could not keep up.
Having popularised the term “Web 2.0”, meanwhile, Mr O’Reilly started fretting that it had become a cliché, applied to so many things that it was in danger of becoming meaningless. He tried hard to give it a definition. And so Web 2.0 came to encapsulate several trends that had been going on all along. One was the shift from individual computers as the “platform” for applications to the web as a whole. Residing on the web, these new applications and services inherently lent themselves to collaboration, sharing and participation.
A new boom began, with telltale signs of frivolous start-ups but also the long-hoped-for succession of next-big-things. Each year saw at least one: MySpace, an online social network soon bought by News Corp, a media giant; Flickr, a photo-sharing site snapped up by Yahoo!; YouTube, a site for sharing amateur videos, quickly bought by Google; Facebook, the most innovative social network yet and so far fiercely independent; and now Twitter, a social-messaging service.
People began adopting new habits very fast. Wi-Fi became widespread in homes, offices and universities, and hotspots popped up in cafés, hotels and airports, allowing nomadic workers to go online many times a day, in many different places. Apple launched the iPhone, which upstaged even the BlackBerry in bringing the web, and Web 2.0 applications, to mobile phones, accessible all the time and everywhere. As social animals, people began expecting permanent “connectivity”.
Gadflies began pointing to excesses. Jaron Lanier, a Valley pioneer, saw behind the Web 2.0 totem of “collective intelligence” an insidious “digital Maoism” that suppressed individuality. Linda Stone, a former Apple and Microsoft executive, observed an unhealthy trend towards “continuous partial attention”, as people spent less time focusing on a single thing or person because they were constantly scanning so many other things—from Facebook to e-mail and their phones—for fear of missing out on some social opportunity.
Perhaps most dangerously, Web 2.0 still had only one business model, advertising, and the Valley was refusing to admit that only one company (Google) with only one of its products (search advertising) had proved that the model really worked. The older internet firms, Yahoo! and AOL, were doing their best to grab a piece of the action. But the “next big things” were selling negligible advertising, often on one another’s sites. Not one of them has become an advertising success in its own right.
And so, as this correspondent prepares to leave, the Valley again finds itself in a curious position. It has been a boon to the world, helping people keep abreast of acquaintances on their social networks, wherever they go, and record and share much more of their own lives. But the Valley stands on ground that is as unstable, seismically and metaphorically, as it was in the earlier bust. Another bubble—this time, not of the Valley’s making—has burst. The world economy is in crisis, advertising is collapsing and start-ups are once again vanishing into thin air. Silicon Valley may be entering another nuclear winter.