In studying and/or promoting web technology, the phrase Web 2.0 can refer to a perceived second generation of web-based communities and hosted services, such as social-networking sites, wikis, and folksonomies, which aim to facilitate creativity, collaboration, and sharing between users. The term gained currency following the first O'Reilly Media Web 2.0 conference in 2004. Although the term suggests a new version of the World Wide Web, it does not refer to an update to any technical specifications, but to changes in the ways software developers and end-users use the Web. According to Tim O'Reilly,

"Web 2.0 is the business revolution in the computer industry caused by the move to the Internet as platform, and an attempt to understand the rules for success on that new platform."[4]

Some technology experts, notably Tim Berners-Lee, have questioned whether one can use the term in a meaningful way, since many of the technology components of "Web 2.0" have existed since the early days of the Web.[5][6]

Defining "Web 2.0"

In alluding to the version-numbers that commonly designate software upgrades, the phrase "Web 2.0" hints at an improved form of the World Wide Web. Technologies such as weblogs (blogs), social bookmarking, wikis, podcasts, RSS feeds (and other forms of many-to-many publishing), social software, web application programming interfaces (APIs), and online web services such as eBay and Gmail provide enhancements over read-only websites. Stephen Fry (actor, author, and broadcaster) describes Web 2.0 as

"an idea in people's heads rather than a reality. It's actually an idea that the reciprocity between the user and the provider is what's emphasized. In other words, genuine interactivity, if you like, simply because people can upload as well as download".[7]

The idea of "Web 2.0" can also relate to a transition of some websites from isolated information silos to interlinked computing platforms that function like locally available software in the perception of the user. Web 2.0 also includes a social element where users generate and distribute content, often with freedom to share and re-use. This can allegedly result in a rise in the economic value of the web as users can do more online.[citation needed]
Tim O'Reilly regards Web 2.0 as business embracing the web as a platform and using its strengths (global audiences, for example).[citation needed] O'Reilly considers that Eric Schmidt's abridged slogan, "don't fight the Internet", encompasses the essence of Web 2.0: building applications and services around the unique features of the Internet, as opposed to building applications and expecting the Internet to suit them as a platform (effectively "fighting the Internet").
In the opening talk of the first Web 2.0 conference, O'Reilly and John Battelle summarized what they saw as the themes of Web 2.0. They argued that the web had become a platform, with software above the level of a single device, leveraging the power of the "Long Tail", and with data as a driving force. According to O'Reilly and Battelle, an architecture of participation where users can contribute website content creates network effects. Web 2.0 technologies tend to foster innovation in the assembly of systems and sites composed by pulling together features from distributed, independent developers (a kind of "open source" development), and an end to the traditional software-adoption cycle (the so-called "perpetual beta"). Web 2.0 technology allegedly encourages lightweight business models enabled by syndication of content and of service and by ease of pickup by early adopters.[8]
Tim O'Reilly provided examples of companies or products that embody these principles in his description of his four levels in the hierarchy of Web 2.0-ness. Level-3 applications, the most "Web 2.0"-oriented, only exist on the Internet, deriving their effectiveness from the inter-human connections and from the network effects that Web 2.0 makes possible, and growing in effectiveness as more people use them. O'Reilly gave as examples eBay, Craigslist, Wikipedia, del.icio.us, Skype, dodgeball and AdSense. Level-2 applications can operate offline but gain advantages from going online. O'Reilly cited Flickr, which benefits from its shared photo-database and from its community-generated tag database. Level-1 applications operate offline but gain features online. O'Reilly pointed to Writely (now Google Docs & Spreadsheets) and iTunes (because of its music-store portion). Level-0 applications work as well offline as online. O'Reilly gave the examples of MapQuest, Yahoo! Local and Google Maps (mapping applications using contributions from users to advantage can rank as "level 2"). Non-web applications like email, instant-messaging clients and the telephone fall outside the above hierarchy.[9]

Characteristics of "Web 2.0"

Web 2.0 websites allow users to do more than just retrieve information. They can build on the interactive facilities of "Web 1.0" to provide "network as platform" computing, allowing users to run software applications entirely through a browser.[10] Users can own the data on a Web 2.0 site and exercise control over that data.[11][10] These sites may have an "architecture of participation" that encourages users to add value to the application as they use it.[10][2] This stands in contrast to older traditional websites, which limited visitors to passive viewing and whose content only the site's owner could modify. Web 2.0 sites often feature a rich, user-friendly interface based on Ajax,[10][2] Flex or similar rich media. The sites may also have social-networking aspects.[11][10]
The concept of Web-as-participation-platform captures many of these characteristics. Bart Decrem, a founder and former CEO of Flock, calls Web 2.0 the "participatory Web"[12] and regards the Web-as-information-source as Web 1.0.
The impossibility of excluding group-members who don't contribute to the provision of goods from sharing profits gives rise to the possibility that rational members will prefer to withhold their contribution of effort and free-ride on the contribution of others.[13]

Technology overview

The sometimes complex and continually evolving technology infrastructure of Web 2.0 includes server-software, content-syndication, messaging-protocols, standards-oriented browsers with plugins and extensions, and various client-applications. The differing, yet complementary approaches of such elements provide Web 2.0 sites with information-storage, creation, and dissemination challenges and capabilities that go beyond what the public formerly expected in the environment of the so-called "Web 1.0".
Web 2.0 websites typically include some of the features and techniques described in the sections below.

Innovations sometimes associated with "Web 2.0"


Web-based applications and desktops

The richer user experience afforded by Ajax has prompted the development of websites that mimic personal-computer applications, such as word processing, spreadsheets, and slide-show presentations. WYSIWYG wiki sites replicate many features of PC authoring applications. Still other sites perform collaboration and project-management functions. In 2006 Google, Inc. acquired one of the best-known sites of this broad class, Writely.[14]
Several browser-based "operating systems" have been developed, including EyeOS[15] and YouOS.[16] Although they essentially function as application platforms, not as operating systems per se, these services mimic the user experience of desktop operating systems, offering features and applications similar to a PC environment. Their distinguishing characteristic is the ability to run within any modern browser.
Numerous web-based application services appeared during the dot-com bubble of 1997–2001 and then vanished, having failed to gain a critical mass of customers. In 2005, WebEx acquired one of the better-known of these, Intranets.com, for US$45 million.[17]

Rich Internet applications

Recently, rich-Internet-application techniques such as Ajax, Adobe Flash, Flex, Nexaweb, OpenLaszlo and Silverlight have evolved; these have the potential to improve the user experience of browser-based applications. The technologies allow a web page to request an update for some part of its content, and to alter that part in the browser, without refreshing the whole page.
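The partial-refresh idea can be modelled in a few lines. The sketch below is illustrative only (a real Ajax exchange happens in browser JavaScript against a live server): a page is represented as named regions, and a JSON fragment from a hypothetical server updates exactly one region while the rest stay untouched.

```python
import json

def apply_partial_update(page_state, server_response):
    """Merge a JSON fragment into one region of the page model,
    leaving every other region untouched -- the essential idea
    behind an Ajax partial refresh."""
    fragment = json.loads(server_response)
    page_state[fragment["region"]] = fragment["content"]
    return page_state

# A page modelled as named regions; only "headlines" gets refreshed.
page = {"header": "My Portal", "headlines": ["old story"], "footer": "(c) 2007"}
response = json.dumps({"region": "headlines",
                       "content": ["new story", "older story"]})
apply_partial_update(page, response)
print(page["headlines"])  # the refreshed region
print(page["header"])     # an untouched region
```

The key point is that only the targeted region changes; a full-page reload would rebuild every region from scratch.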

Server-side software

Functionally, Web 2.0 applications build on the existing Web server architecture, but rely much more heavily on back-end software. Syndication differs only nominally from the methods of publishing using dynamic content management, but web services typically require much more robust database and workflow support, and become very similar to the traditional intranet functionality of an application server. Vendor approaches to date fall either under a universal server approach (which bundles most of the necessary functionality in a single server platform) or under a web-server plugin approach (which uses standard publishing tools enhanced with API interfaces and other tools).

Client-side software

The extra functionality provided by Web 2.0 depends on the ability of users to work with the data stored on servers. This can come about through forms in an HTML page, through a scripting language such as JavaScript, or through Flash, Silverlight or Java. These methods all make use of the client computer to reduce server workloads and to increase the responsiveness of the application.


XML and RSS

Advocates of "Web 2.0" may regard syndication of site content as a Web 2.0 feature, since it involves standardized protocols that permit end-users to make use of a site's data in another context (such as another website, a browser plugin, or a separate desktop application). Protocols which permit syndication include RSS (Really Simple Syndication, also known as "web syndication"), RDF (as in RSS 1.1), and Atom, all of them XML-based formats. Observers have started to refer to these technologies as "web feeds" as the usability of Web 2.0 evolves and the more user-friendly Feeds icon supplants the RSS icon.
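Because these formats are plain XML, consuming a feed in another context takes very little code. The sketch below parses a minimal, hand-written RSS 2.0 document (standing in for a real feed fetched over HTTP) and extracts the syndicated entries, the same operation an aggregator or browser plugin performs.

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written RSS 2.0 document standing in for a real feed.
RSS_SAMPLE = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(RSS_SAMPLE)
channel = root.find("channel")
# Each <item> is one syndicated entry; an aggregator polls for new ones.
items = [(i.findtext("title"), i.findtext("link"))
         for i in channel.findall("item")]
print(items)
```

An aggregator would re-fetch the feed periodically and show only items it has not seen before; the XML structure is all the contract the two sides need.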

Specialized protocols

Specialized protocols such as FOAF and XFN (both for social networking) extend the functionality of sites or permit end-users to interact without centralized websites.

Web APIs

Machine-based interaction, a common feature of Web 2.0 sites, uses two main approaches to Web APIs, which allow web-based access to data and functions: REST and SOAP.

  1. REST (Representational State Transfer) Web APIs use HTTP alone to interact, with XML or JSON payloads;
  2. SOAP involves POSTing more elaborate XML messages and requests to a server that may contain quite complex, but pre-defined, instructions for the server to follow.

Often servers use proprietary APIs, but standard APIs (for example, for posting to a blog or notifying a blog update) have also come into wide use. Most communications through APIs involve XML (eXtensible Markup Language) or JSON payloads.

See also Web Services Description Language (WSDL), the standard way of publishing a SOAP API, and the list of Web Service specifications.
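The contrast between the two styles can be made concrete with a hypothetical weather-lookup operation (the endpoint and element names below are invented for illustration). REST encodes the operation in the URL and query string and leans on plain HTTP verbs; SOAP wraps the same information in a pre-defined XML envelope POSTed to a single endpoint.

```python
import urllib.parse
import xml.etree.ElementTree as ET

# Hypothetical operation: look up a city's weather from a made-up service.

# REST style: the resource lives in the URL; parameters are query strings,
# and plain HTTP GET retrieves it.
params = urllib.parse.urlencode({"city": "London", "units": "metric"})
rest_url = "http://api.example.com/weather?" + params

# SOAP style: the operation is a pre-defined XML message POSTed to one
# endpoint; the envelope structure is fixed by the SOAP specification.
soap_envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetWeather xmlns="http://example.com/weather">
      <City>London</City>
      <Units>metric</Units>
    </GetWeather>
  </soap:Body>
</soap:Envelope>"""

# Both requests carry the same information; REST leans on HTTP itself,
# SOAP on a structured XML message.
body = ET.fromstring(soap_envelope)
print(rest_url)
```

Both carry identical parameters; the practical difference is that the REST request is readable and cacheable as an ordinary URL, while the SOAP message can express more elaborate, pre-defined instructions at the cost of XML overhead.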


Economics and "Web 2.0"

The analysis of the economic implications of "Web 2.0" applications and loosely-associated technologies such as wikis, blogs, social-networking, open-source, open-content, file-sharing, peer-production, etc. has also gained scientific attention. This area of research investigates the implications Web 2.0 has for an economy and the principles underlying the economy of Web 2.0.

Don Tapscott and Anthony D. Williams argue in their book Wikinomics: How Mass Collaboration Changes Everything (2006) that the economy of "the new web" depends on mass collaboration. Tapscott and Williams regard it as important for new media companies to find ways to make a profit with the help of Web 2.0. The prospective Internet-based economy that they term "Wikinomics" would depend on the principles of openness, peering, sharing, and acting globally. They identify seven Web 2.0 business models (peer pioneers, ideagoras, prosumers, new Alexandrians, platforms for participation, global plant floor, wiki workplace).[citation needed]

Organizations could make use of these principles and models in order to prosper with the help of Web 2.0-like applications: companies can design and assemble products with their customers, and in some cases customers can do the majority of the value creation.[18] In each instance the traditionally passive buyers of editorial and advertising take active, participatory roles in value creation.[19] Tapscott and Williams suggest business strategies as models where masses of consumers, employees, suppliers, business partners, and even competitors co-create value in the absence of direct managerial control.

Tapscott and Williams see the outcome as an economic democracy.

Some other views in the scientific debate agree with Tapscott and Williams that value creation increasingly depends on harnessing open source/content, networking, sharing, and peering, but disagree that this will result in an economic democracy, predicting instead a subtle form and deepening of exploitation, in which Internet-based global outsourcing reduces labour costs. In such a view, the economic implications of a new web might include, on the one hand, the emergence of new business models based on global outsourcing, while on the other hand non-commercial online platforms could undermine profit-making and anticipate a co-operative economy. For example, Tiziana Terranova speaks of "free labor" (performed without payment) in the case where prosumers produce surplus value in the circulation sphere of the cultural industries.[21]


Criticism

Given the lack of set standards as to what "Web 2.0" actually means, implies, or requires, the term can mean radically different things to different people.

Some argue that "Web 2.0" does not represent a new version of the World Wide Web at all, but merely continues to use so-called "Web 1.0" technologies and concepts. Note that techniques such as Ajax do not replace underlying protocols like HTTP, but add an additional layer of abstraction on top of them. Many of the ideas of Web 2.0 had already featured in implementations on networked systems well before the term "Web 2.0" emerged. Amazon.com, for instance, has allowed users to write reviews and consumer guides since its launch in 1995, in a form of self-publishing. Amazon also opened its API to outside developers in 2002.[22] Previous developments also came from research in computer-supported collaborative learning and computer-supported cooperative work and from established products like Lotus Notes and Lotus Domino.

In a podcast interview Tim Berners-Lee described the term "Web 2.0" as a "piece of jargon": "nobody really knows what it means"; and went on to say "if Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along."[5]

Conversely, when someone proclaims a website "Web 2.0" on the strength of some trivial feature (such as blogs or gradient boxes), observers may consider it more an attempt at promotion than an actual endorsement of the ideas behind Web 2.0. In such circumstances "Web 2.0" has sometimes sunk to the status of a marketing buzzword that can mean whatever a salesperson wants it to mean, with little connection to the worthy but (currently) unrelated ideas originally brought together under the "Web 2.0" banner.

Other criticism has included the term "a second bubble" (referring to the dot-com bubble of circa 1995–2001), suggesting that too many Web 2.0 companies attempt to develop the same product while lacking business models. The Economist has written of "Bubble 2.0".[23]

Venture capitalist Josh Kopelman noted that Web 2.0 excited only 53,651 people (the number of subscribers to TechCrunch, a Weblog covering Web 2.0 matters), too few users to make them an economically-viable target for consumer applications.[24]
