Tuesday, December 13, 2005

Dion Hinchcliffe's Best Web 2.0 Software of 2005


Category: Social Bookmarking

Best Offering: del.icio.us


Description: Just acquired by Yahoo!, which already has a social bookmarking service of its own in My Web 2.0, this seminal bookmarking site now faces a somewhat uncertain future. But del.icio.us remains the best, largest, fastest, and most elegant social bookmarking service on the Web; in fact, it's the benchmark all the others measure themselves against. And because del.icio.us takes the Web 2.0 ideas seriously, it provides a nice API for others to build new services on top of. As a consequence of this, and because social bookmarking makes everyone's data public, witness the amazing array of add-on services (or, if you have 15 minutes to spare, look here) that mash up or otherwise reuse del.icio.us functionality and content. If you want access to your bookmarks anywhere you go, along with engaging and satisfying functionality, this is your first stop. I personally can't live without my tag cloud of del.icio.us bookmarks.
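
To get a feel for how approachable that API is, here is a minimal Python sketch of pulling your recent bookmarks. The v1 endpoint, HTTP Basic authentication, and XML attribute names reflect my reading of the del.icio.us API documentation, so treat them as assumptions and verify them before building on this.

    # A minimal sketch of pulling recent bookmarks from the del.icio.us API.
    # Assumptions: the v1 endpoint https://api.del.icio.us/v1/posts/recent, HTTP
    # Basic authentication, and <post href="..." description="..." tag="..."/>
    # elements in the response; check the official API docs before relying on this.
    import base64
    import urllib.request
    from xml.etree import ElementTree

    def recent_bookmarks(username, password, count=15):
        request = urllib.request.Request(
            "https://api.del.icio.us/v1/posts/recent?count=%d" % count)
        token = base64.b64encode(("%s:%s" % (username, password)).encode()).decode()
        request.add_header("Authorization", "Basic " + token)
        with urllib.request.urlopen(request) as response:
            tree = ElementTree.fromstring(response.read())
        # Each <post> element carries the bookmark URL, description, and tags.
        return [(p.get("href"), p.get("description"), p.get("tag"))
                for p in tree.findall("post")]

    for href, description, tags in recent_bookmarks("myuser", "mypassword"):
        print(description, "->", href, "[", tags, "]")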

Runners-Up:


Category: Web 2.0 Start Pages

Best Offering: Netvibes

Description: There is a rapidly growing number of Ajax start pages that allow your favorite content to be displayed, rearranged, and viewed dynamically whenever you want. But if the traffic to this blog is any indication (though possibly it isn't), Netvibes is far and away the most popular one. Available in multiple languages, sporting new integration with Writely, and offering an extremely slick and well-designed interface with some of the best DHTML-powered drag-and-drop organization around, Netvibes has no major vendor backing, yet it has captured mindshare out of pure excellence. While many of the major Web companies like Microsoft and Google are offering competing products, none of them are yet very good.

Runners-Up:



Category: Online To Do Lists

Best Offering: Voo2do



Description:
Ever more of the software we use on a daily basis is moving online, from e-mail to feed readers, and to-do list managers are no exception. I've used a variety of them, and so far the one that's resonated with me most is Voo2do. A one-person operation run by Shimon Rura, Voo2do uses Ajax sparingly but very effectively to let you create and manage multiple to-do lists. With an API that lets you access or export your data from your own programs, and support for Joel Spolsky's Painless Software Scheduling method, Voo2do is the embodiment of simple, satisfying software.

Runners-Up:


Category: Peer Production News

Best Offering: digg



Description:
While not packed with Ajax, digg certainly doesn't lack for it either. And of course, Ajax is only one of many optional ingredients on the Web 2.0 checklist. The important Web 2.0 capability digg provides is that it successfully harnesses collective intelligence. All news items listed on digg are supplied by its users, who then exert editorial control by clicking the digg button on each story they like. The home page lists the most popular current stories, all selected by its registered users, and digg's RSS feed has to be one of the most popular on the Web. Digg has been so successful that Wired magazine has even speculated it could bury Slashdot, which also allows users to submit stories but doesn't let them see what was submitted or vote on it.
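
To make that mechanism concrete, here is a deliberately toy Python sketch of the peer-production pattern digg embodies: users submit stories, other users vote for the ones they like, and the front page is simply the most-voted items. It illustrates the idea only and is not digg's actual ranking code.

    # Toy sketch of digg-style peer production: users submit stories, other users
    # "digg" the ones they like, and the front page is just the most-dugg items.
    # This illustrates the pattern; it is not digg's actual algorithm.
    from collections import defaultdict

    class StoryPool:
        def __init__(self):
            self.stories = {}              # story_id -> (title, url, submitter)
            self.diggs = defaultdict(set)  # story_id -> users who dugg it

        def submit(self, story_id, title, url, submitter):
            self.stories[story_id] = (title, url, submitter)

        def digg(self, story_id, user):
            self.diggs[story_id].add(user)  # one vote per user per story

        def front_page(self, size=10):
            ranked = sorted(self.stories, key=lambda s: len(self.diggs[s]), reverse=True)
            return [(self.stories[s][0], len(self.diggs[s])) for s in ranked[:size]]

    pool = StoryPool()
    pool.submit(1, "SSE draft spec released", "http://example.com/sse", "alice")
    pool.submit(2, "New Ajax start page launches", "http://example.com/ajax", "bob")
    for user in ("carol", "dave", "erin"):
        pool.digg(1, user)
    pool.digg(2, "carol")
    print(pool.front_page())  # [('SSE draft spec released', 3), ('New Ajax start page launches', 1)]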

Runners-Up:


Category: Image Storage and Sharing

Best Offering: Flickr



Description:
Also acquired by Yahoo! earlier this year, Flickr is the canonical photo and image sharing site. Sprinkled with just enough Ajax to reduce page loads and make tasks easy, Flickr provides an open API, prepackaged licensing models for your photos, tagging, a variety of community involvement mechanisms, and a vast collection of add-ons and mashups. There are other sites, but none of them compare yet. Flickr is one of the Web 2.0 poster children, and for good reason.
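
For a sense of what that open API makes possible, here is a small Python sketch that searches public photos by tag through Flickr's REST interface. The method name, endpoint, and photo-page URL pattern are my assumptions from the API documentation, and you need your own API key, so double-check the details against Flickr's docs before using this.

    # Small sketch against Flickr's REST API: search public photos by tag.
    # Assumptions: the flickr.photos.search method, the /services/rest/ endpoint,
    # and the photo-page URL pattern below; requires your own (placeholder) API key.
    import urllib.parse
    import urllib.request
    from xml.etree import ElementTree

    API_KEY = "your-flickr-api-key"  # placeholder

    def search_photos(tag, per_page=5):
        params = urllib.parse.urlencode({
            "method": "flickr.photos.search",
            "api_key": API_KEY,
            "tags": tag,
            "per_page": per_page,
        })
        url = "https://api.flickr.com/services/rest/?" + params
        with urllib.request.urlopen(url) as response:
            rsp = ElementTree.fromstring(response.read())
        # Each <photo> element names its owner and id, enough to link to its page.
        return ["https://www.flickr.com/photos/%s/%s" % (p.get("owner"), p.get("id"))
                for p in rsp.findall("photos/photo")]

    for photo_url in search_photos("web2.0"):
        print(photo_url)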

Runners-Up:


Category: 3rd Party Online File Storage

Best Offering: Openomy



Description:
As more and more software moves to the Web, having a secure place for your Web-based software to store files such as documents, media, and other data will become essential. There is a burgeoning group of online file storage services, and Openomy is one that I've been watching for a while. With 1 GB of free file storage and an open API for programmatic access to your tag-based Openomy file system, you have the raw ingredients for secure online storage of your documents wherever you go. There is even a Ruby binding for the API. Expect lots of growth in this space going forward, especially as other Web 2.0 applications allow you to plug into your online storage service of choice and the desire grows to offload personal data backup to professionals.
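
To make the idea of a tag-based file system concrete, here is a toy in-memory Python model of what such a store boils down to: files carry tags instead of living in folders, and retrieval is by tag query. The class and method names are invented for illustration; this is not Openomy's actual API.

    # Toy in-memory model of a tag-based file store, to illustrate the idea behind
    # services like Openomy. Names here are invented for illustration only; this
    # is not Openomy's real API.
    class TagFileStore:
        def __init__(self):
            self.files = {}  # name -> bytes
            self.tags = {}   # name -> set of tags

        def put(self, name, data, tags=()):
            """Store a file and label it with zero or more tags."""
            self.files[name] = data
            self.tags[name] = set(tags)

        def find(self, *wanted):
            """Return the names of files carrying every requested tag."""
            wanted = set(wanted)
            return [name for name, tags in self.tags.items() if wanted <= tags]

    store = TagFileStore()
    store.put("budget-2006.csv", b"...", tags=("finance", "2006"))
    store.put("logo.png", b"...", tags=("design",))
    print(store.find("finance"))  # ['budget-2006.csv']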

Runners-Up:


Category: Blog Filters

Best Offering: Memeorandum.com



Description:
Gabe Rivera's Memeorandum service is a relevance engine that unblinkingly monitors activity in the blogosphere and appears to point out the most important posts of the day with remarkable deftness. The growing attention scarcity caused by the rivers of information we're subjected to in the modern world demands tools that help us cope with it effectively, and blog filters are just one key example of what the future holds for us. Memeorandum covers both the political and technology blogospheres, and hopefully others in the future. There are other blog and news filters out there, but none compare in terms of simplicity, elegance, and satisfying results.

Runners-Up:


Category: Grassroots Use of Web 2.0

Best Offering: Katrina List Network



Description:
I covered Katrinalist.net in a detailed blog post a while back but it remains one of the best examples of grassroots Web 2.0. Katrinalist was an emergent phenomenon that triggered the peer production of vital information in the aftermath of this year's hurricane disaster in New Orleans. In just a handful of days participants created XML data formats, engineered data aggregation from RSS feeds, and harnessed volunteer efforts on-the-fly to compile survivor data from all over the Web. This led to tens of thousands of survivor reports being aggregated into a single database so that people could easily identify and locate survivors from the Katrinalist Web site. All this despite the fact that the information was distributed in unstructured formats from all over the Web with no prior intent of reuse. A hearty thanks again to David Geilhufe for help making Katrinalist happen.

Runners-Up:



Category: Web-Based Word Processing

Best Offering: Writely



Description:
Easy to set up, fast, free (in beta), and familiar to anyone with even a passing acquaintance with MS Word, Writely.com is an effective and easy-to-use online word processor. With its WYSIWYG editor, users can change font and font size, spell-check, and insert images (up to 2 MB). It also offers tagging and version control, both excellent features for any word processor. It's a very useful word processing tool, especially for those who can't afford to buy MS Office. In addition to being a word processor, Writely.com also serves as a collaboration tool: users invite others to collaborate on a document via email. It can also serve as a tool to help a user blog and publish. Built with an Ajax user interface, it makes the most of many of the new capabilities Web 2.0 offers. It ends, once and for all, any uncertainty about whether productivity tools can and should live online. Writely is the best out there, but just by a nose; the others are very close runners-up.

Runners-Up:


Category: Online Calendars

Best Offering: CalendarHub



Description:
Online calendaring is a rapidly growing product category in the Web 2.0 software arena. The fact is that a lack of good, shareable electronic calendars is still a real problem these days. I'm fond of saying that the software world has vast collections of synchronization utilities and integration capabilities, yet it's incredible that we still can't routinely do simple things like keeping our personal, family, and work calendars synchronized. CalendarHub is the best online calendar I've seen so far, with Kiko a close second.

Runners-Up:



Category: Project Management & Team Collaboration

Best Offering: Basecamp



Description: Web 2.0 has terrific social collaboration models for two-way information exchange, like blogs and wikis; open enrichment mechanisms, like tagging, ranking, and popularity; and organizing techniques, like folksonomies. All of these provide a great backdrop for team collaboration and project management. Surprisingly, there aren't many terrific Web 2.0 project management tools. Part of this is because project management tends to be very specific to the type of project. Fortunately for Web 2.0 companies, this means there isn't a lot of competition from traditional software companies like Microsoft and Primavera, which churn out somewhat mediocre products in the shrinkwrapped software space. This is why 37signals' Basecamp is such a pleasant surprise. It's an excellent team-based project management tool that continues to delight me the more I use it.

Runners-Up:

The Story Continues However, As It Must!

No one person could accurately list the best Web 2.0 software of 2005. This is the wisdom of crowds bit of Web 2.0. In order to complete this list, I'll need your help. Please contribute your selections below. Keep in mind that I haven't worked with many of the terrific Web 2.0 software applications out there but many of you have. There are whole product categories I'm not covering here and I'm glad to keep extending this post if we get lots of feedback. Tell me about social spreadsheets, Web 2.0 project management tools, video versions of Flickr, additional grassroots Web 2.0 events, and whatever else you know of.

Web 2.0 is an exciting, vibrant community. Let's show the world what Web 2.0 is made of...

Update: I added an online calendar section and put a few new runners-up. Also added project management and team collaboration.



Wednesday, 7 December 2005
Five Reasons Why Web 2.0 Matters
I've been spending a lot of time lately with folks around the mid-Atlantic region, talking to them about Web 2.0. I get the expected full spectrum of responses, ranging from genuine interest and active enthusiasm to some outright hostility. Part of it is where the Web 2.0 space still is: an elite niche of technologists, with wider awareness only just beginning to grow.

Most of us know that the technology industry and the Web are often far out ahead of the mainstream. The fact is that the general public is still struggling with blogs and wikis, much less full-blown architectures of participation and software as a service (to name just two aspects of Web 2.0). Not sure about this? Try sampling a few people at random and asking them what a blog is. You will probably be surprised by the answers. Nevertheless, I'm extremely sanguine about Web 2.0 and where it's headed (notwithstanding Bubble 2.0-type events like the RSS Fund assembling a massive $100 million warchest and using it with questionable judgement).

While generally exciting and engaging by most accounts, one thing my public presentations on Web 2.0 don't seem to address is the value proposition to the average person or organization. Why should they spend their valuable time to leverage Web 2.0 ideas, participate in Web 2.0 software, or even create new Web 2.0 functionality?
How exactly does taking the effort to do this become worthwhile? That question doesn't seem to be asked often enough, or clearly articulated when it is. Web 2.0 is exciting enough in its own right to sustain lots of interest and buzz, but how does it translate into delivering tangible value to the world at large?

To address this, I've thought fairly long and hard, and come up with a starting point at least. I've tried to create the most distilled, direct explanation of the benefits that Web 2.0 best practices can provide in using and building engaging, useful software on the Web.


Five Reasons Why Web 2.0 Matters


  1. The Focus of Technology Moves To People With Web 2.0. One of the lessons the software industry relearns every generation is that it's always a people problem. It's not that people are the actual problem, of course; it's that when software developers naively use technology to try to solve our problems instead of addressing the underlying issues people are actually facing, the wrong things inevitably happen. We've all seen technology for its own sake, or views of the world that pay far too little attention to where people fit into the picture. Put another way, people and their needs have to be at the center of any vision of software, because technology is only here to make our lives and businesses better, easier, faster, or whatever else we require. Web 2.0 ideas have been successful (at least partly) because they effectively put people back into the technological equation. This even goes as far as turning the equation on its head entirely and making the technology about people. Web 2.0 fundamentally revolves around us and seeks to ensure that we engage ourselves, participate and collaborate together, and mutually trust and enrich each other, even when we're separated geographically by half the world. And Web 2.0 gives us very specific techniques to do this and attempts to address the "people problem" directly.

  2. Web 2.0 Represents Best Practices. The ideas in the Web 2.0 toolbox were not pulled from thin air. In fact, they were systematically identified from what actually worked during the first generation of the Web. Web 2.0 contains proven techniques for building valuable Web-based software and experiences. The original Design Patterns book was one of the most popular books of its time because it at long last represented distilled knowledge of how to design software, with ideas couched in a form that was reusable and accessible. So too are the Web 2.0 best practices. If you want to make software that delivers the very best content and functionality to its users, Web 2.0 is an ideal place to start.

  3. Web 2.0 Has Excellent Feng Shui. Yes, I'll get in trouble for stating it this way, but I think it fits, so here goes. I'm a technologist by background and I don't buy into the new-agey vision of Web 2.0 that has sometimes been promulgated. And I certainly don't believe that Web 2.0 has a "morality," as the famous Tim O'Reilly/Nicholas Carr debate highlighted. However, as someone who has designed and built lots of software over two decades, I have plenty of regard for the way the pieces of Web 2.0 fit together snugly and mutually reinforce each other. Why does this matter? It has to do with critical mass and synergy, two vital value-creation forces. Taken individually, Web 2.0 techniques like harnessing collective intelligence, radical decentralization, and The Long Tail are quite powerful, but together they have a potency much greater than their simple sum because they strongly reinforce each other. In fact, I'll go as far as to say that "doing" only parts of Web 2.0 can get you into some real trouble. You need a core set of Web 2.0 techniques in order to be successful, and then the value curve goes geometric. This is why the ROI of software built this way is so much greater. Here's an earlier post that provides more detailed examples of why this is.

  4. Quality Is Maximized, Waste Is Minimized. The software world is going through one of its cyclical crises as development jobs go overseas and older, more bloated ways of building software finish imploding, while the latest software techniques become more agile and lightweight (sometimes called lean). The folks over at 37signals say it best: using Web 2.0 you can build better software with fewer people, less money, fewer abstractions, and less effort, and with this increase in constraints you get cleaner, more satisfying software as a result. And simpler software is invariably higher quality.

  5. Web 2.0 Has A Ballistic Trajectory. Never count out the momentum of a rapidly emerging idea. For example, I'm a huge fan of Eric Evans' Domain Driven Design but it's so obscure that it will probably never get off the ground in a big way. There's no buzz, excitement, or even a general marketplace for it. This is Web 2.0's time in the sun, deserved or not. You can use the leviathan forces of attention and enthusiasm that are swirling around Web 2.0 these days as a powerful enabler to make something important and exciting happen in your organization. Use this opportunity to seize the initiative, ride the wave, and build great software that matters.


Certainly there are other reasons why Web 2.0 is important and you're welcome to list them here, but I think this captures the central vision in a way that most anyone who is Web literate can grasp and access.

BTW, I will also use this moment to state that Web 2.0 is a terrible name for this new vision of Web-based, people-centric software. Except, that is, when compared to every other name we have at the moment (for example, "the next generation of the Web"). So I will continue to use Web 2.0 until something better comes along.

OK, don't agree? Please straighten me out. Why does Web 2.0 matter (or not) to you?

Technorati: web2.0


Sunday, 4 December 2005
Struggling to Monetize Web 2.0
Web 2.0 provides potent models for making the web applications that apply them successful or, at least, ever more popular with their users. These techniques typically have to do with connecting supply with demand cheaply and effectively (The Long Tail) or with providing a unique source of information that is difficult to recreate elsewhere. Unfortunately, the creators of many of these web applications sometimes confuse popularity with financial success, or more often, they optimistically believe the former will turn into the latter. The truth is, monetization of Web 2.0 services is a genuine issue for anyone who isn't using Web 2.0 ideas for purely strategic purposes, yet many Web 2.0 services seem intent on tactical financial capitalization of the attention and user base that Web 2.0 applications can build almost overnight.



ZDNet's Phil Wainewright thinks this issue, namely the lack of revenue, is a big piece that's missing from the Web 2.0 business model. He posits that the next iteration of Web 2.0 will solve this and other problems, which he dubs Web 3.0. Phil's analysis is pretty sharp, and he has identified at least three revenue models that will form the basis of commercially successful Web 2.0 services:


  • Advertising: Phil doesn't think much of this model, no matter how well Google is doing with it and despite the fact that Microsoft is increasingly interested in the entire online ad space.

  • Subscriptions: Divided into fixed-rate, variable-rate, and fixed-plus-variable models, subscriptions are very popular with leading Web 2.0 companies like 37signals, and I'm with Phil that they will continue to be popular for large-footprint services, but not for mash-ups and aggregation services that provide bite-size functionality.

  • Transaction Commissions: Best exemplified by companies like eBay that charge for a given successful transaction, Phil believes this will ultimately become the biggest player.


My issue with this trinity of revenue models is that it doesn't explicitly leave room for a fourth or possibly fifth needed model. I truly believe there is an active need for one or two as-yet-uninvented revenue models to fund Web 2.0 services that face the general public.

To illustrate the problem, take a look at terrific Web 2.0 services like del.icio.us or Voo2do. Both of these sites are absolutely central to my daily work, yet I pay nothing to use them, nor can I. And ugly, intrusive advertising would probably drive me away to find something else. I love their look, I love their feel, and they are intrinsically useful to me, but both sites are wise enough to wait for the revenue solution to arrive. They know they can't charge, or users will simply move on to the next free service. They hope to be acquired or to find that new revenue source, and they know the three models above are non-starters.

Unless new revenue models are discovered or derived for these kinds of medium-grained (del.icio.us/voo2do) or fine-grained Web 2.0 applications (many mash-ups), it may very well turn out that Web 2.0 becomes 1) primarily an adjunct of traditional commercial Web-based companies and 2) a pure software development technique representing design patterns (but not business models) for building lightweight, recombinant, interactive user experiences for applications.

In upcoming posts, I plan on exploring some interesting new revenue models like Attention Trust and others. Monetizing Web 2.0 is just not a well-solved problem yet, and those that solve it will have vast success in the marketplace.

Do you know any new ways to make Web 2.0 services generate revenue?

Technorati: web2.0


Monday, 28 November 2005
Web 2.0 and the World-Wide SOA
One of the profound strengths of Web 2.0 is that it encourages open protocols and APIs to share information with everyone over services, instead of just web pages. This lets the raw information that users contribute to Web 2.0 applications be shared across the Web for secondary uses, remixing, filtering, and syndication. This has led to the rapidly growing phenomenon of mash-ups and dynamic supply chains of information that span the entire world. It's also fostering something I call the Global SOA, and this is something RSS and, to a lesser extent, things like REST are making happen with surprising speed. Important tweaks to the protocol standards, like Microsoft's SSE, will help create this information ecosystem in full strength in the not very distant future. Read this irreverent but very good article by Kurt Cagle for a somewhat different yet similar view of SOA and Web 2.0.


Web 2.0: The Global SOA


I recently had an opportunity to give a luncheon address on the subject of Web 2.0: The Global SOA to SAIC here in the Washington, DC area at their Enterprise Content Exploitation Day on November 16th. John Furrier, of the Web 2.0 Workgroup, was nice enough to syndicate it this morning on PodTech.net. In the speech, which I gave in my capacity as Chief Technology Officer of Sphere of Influence, I provide a non-technical overview of Web 2.0 and discuss some of the exciting things happening in the Web 2.0 community at both a grassroots and an industry level. I also discuss how to use Web 2.0 for content exploitation, which was the theme of the day and something at which Web 2.0 has some terrific strengths. It's not the clearest recording, but the ideas resonated very strongly with the audience, and I was surprised at the strong positive feedback I received on the subject of Web 2.0 as the Global SOA in particular. We now have a bunch of opportunities to talk about this exciting topic in the near future, so expect to hear more in this space as we continue to develop these ideas.

It has been amazing to see the convergence of these two extensively overlapping organizing principles in software development. I think the Web 2.0 toolset has a lot to offer state-of-the-art SOA development, particularly in the radical decentralization, Data as the Next Intel Inside, and harnessing collective intelligence pieces of Web 2.0. For Web 2.0's part, SOA principles offer discipline, practical architectural patterns, orchestration (BPEL), and more. We're seeing tremendous interest in this area from SOA practitioners in the DC area, and I'll write more about it here as I'm able to.

Semi-Sidenote: Ryan Carson is hosting what looks like a bang-up new Web 2.0 conference, The Future of Web Apps, in London of all places, with the creators of Flickr, del.icio.us, and 37signals among others. He told me he wants to create excitement about Web 2.0 in Europe and to get the folks across the Atlantic (and indeed across the world) to start innovating and building exciting Web 2.0 applications. I mention it here in the Global SOA context since Ryan is giving one of the sessions on how to build enterprise apps quickly with next generation web techniques. And that's something that Web 2.0 provides the SOA world: some really great lightweight techniques for rapidly building high-value, truly scalable solutions, inside and outside the firewall.

Technorati: web 2.0, soa


Saturday, 26 November 2005
How Simple Sharing Extensions Will Change the Web
I've been studying Microsoft's proposed new RSS extension, Simple Sharing Extensions (SSE), for a few days now. Authored by Groove's Jack Ozzie and George Moromisato (pictured in this article towards the bottom), Simple Sharing Extensions has two big things going for it.

- One, it's being personally pitched to the world by Microsoft's CTO Ray Ozzie with an openness and participatory approach that almost seems startling coming from Microsoft.
- Two, the draft spec is really good.

SSE is elegant, is itself simple, and provides an essential solution to an important problem with the increasingly two-way Web: though RSS is a resilient, powerful, and ubiquitous protocol for Web content distribution, it has a one-way, generic view of content flow. Without a standard way to process mutually published and edited content, there can be no shared perception of the unique pieces of information that pass between users and applications. These chunks of information are what RSS and SSE call items, and they can represent calendar entries, blog posts, eBay auctions, podcasts, or whatever. With SSE added, RSS is still one-way, but we now have an easy way to collaborate separately on the same content (a wiki article, an inventory list, or a stock quote).

Having no way to atomically perceive whether content is new or changed makes robust behavior, or truly interesting capability, impossible without resorting to hacks or non-standard techniques. SSE understands that more than one entity can be the "publisher" of an item and makes this possible to handle in RSS, which without SSE assumes a single publisher and many content subscribers.
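
To make the "item plus sync metadata" idea concrete, here is a rough sketch, based on my reading of the draft, of an RSS item carrying SSE sync information; the element and attribute names and the namespace URI are approximations and should be verified against the published spec before you write code against them.

    # Rough sketch of an RSS <item> carrying SSE sync metadata, based on my reading
    # of the draft spec; the namespace URI and element/attribute names are
    # approximations and should be checked against the published specification.
    from xml.etree import ElementTree

    SSE_NS = "http://www.microsoft.com/schemas/rss/sse"  # assumed namespace URI

    SAMPLE_ITEM = """
    <item xmlns:sx="http://www.microsoft.com/schemas/rss/sse">
      <title>Lunch with the team</title>
      <description>Calendar entry shared between two feeds</description>
      <sx:sync id="cal-0042" version="3" deleted="false" conflict="false">
        <sx:history when="Sat, 26 Nov 2005 09:43:33 GMT" by="ray"/>
      </sx:sync>
    </item>
    """

    item = ElementTree.fromstring(SAMPLE_ITEM)
    sync = item.find("{%s}sync" % SSE_NS)
    # The (id, version) pair is what lets two cross-subscribed endpoints agree on
    # which copy of a shared item is the newer one.
    print(sync.get("id"), sync.get("version"))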

By itself, RSS provides fall-down-the-stairs-easy content sharing, scales well, and encourages loose coupling. That radical simplicity is what enabled a truly service-oriented web and gave everyone workable planetary-scale content syndication, which virtually every blog, wiki, and web application provides today; but it also forms a barrier to entry for sophisticated, higher-order content distribution scenarios like mash-ups, supply chains/rings, and web-scale SOAs. With SSE added, RSS can now form the fundamental basis for the rapidly growing Web 2.0 information ecosystem.





The masterstroke is that SSE rides inside RSS. The massive collection of feed readers, RSS aggregators, tools, scripts, utilities, and infrastructure doesn't have to change a whit to deal with SSE markup that appears inside an RSS feed. This gives SSE feeds backwards compatibility and a piggyback ride on top of the almost omnipresent world-wide RSS infrastructure. Straightforward upgrades of existing RSS handlers can provide a smooth and simple adoption path for most, since they can test whether a feed contains SSE and do the additional processing that makes two-way conversations smarter, instead of the unintelligent content feedback loops that occurred with plain RSS. Or they can ignore SSE markup in an RSS feed if they don't care about it and just use the content they find the old way.
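
Here is a minimal sketch of that adoption path: an RSS consumer that checks each item for SSE markup and simply falls back to plain RSS handling when none is present. The namespace URI is the same assumption as in the earlier sketch.

    # Sketch of the graceful-degradation path: read an RSS feed, use SSE sync
    # metadata when an item carries it, and fall back to plain RSS when it doesn't.
    # The SSE namespace URI is an assumption based on the draft spec.
    from xml.etree import ElementTree

    SSE_SYNC = "{http://www.microsoft.com/schemas/rss/sse}sync"

    def read_feed(rss_text):
        channel = ElementTree.fromstring(rss_text).find("channel")
        for item in channel.findall("item"):
            title = item.findtext("title", default="(untitled)")
            sync = item.find(SSE_SYNC)
            if sync is not None:
                # SSE-aware path: we know which shared item this is and its version.
                yield "sse", sync.get("id"), int(sync.get("version", "1")), title
            else:
                # Plain-RSS path: treat it as ordinary one-way content.
                yield "rss", None, None, title

    # for kind, item_id, version, title in read_feed(open("feed.xml").read()):
    #     print(kind, item_id, version, title)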

SSE is still so new (and still forming; it's not even at 1.0 yet) that virtually no software exists for it beyond the prototyping Ray Ozzie and company did to validate their ideas. But expect to see SSE become very popular, and quickly, since it solves such an important problem. I anticipate that RSS+SSE will become the most popular way to syndicate Web 2.0 content and to glue applications together. I've been taken to task occasionally for my prediction that RSS will be the fundamental Web 2.0 protocol, and SSE only reinforces that prediction. It's so good, so simple, so right that I think you'll see the Web services of yore become almost completely eclipsed by it. RSS has reached the tipping point in terms of hundreds of millions of available feeds, tool support, mindshare, and the all-important success factor: critical mass. SSE completes RSS in so many ways. I'll be talking about this in more detail in the near future, but I wanted to encourage you to study SSE, promote it, and implement it. Now is the time; it's early yet...

Other excellent coverage of SSE:


- Charles Cook does an impressive job analyzing how SSE does replication and conflict resolution between items.
- Lance Knobel covers the exemplar of calendar sharing that Ray Ozzie used to introduce SSE.
- The Inside Microsoft Blog discusses the SSE announcement and how Dave Winer has asked Google to use its position to make SSE big, quickly.
- Read how Niall Kennedy is already building SSE exporting into NetNewsWire.
- Don Dodge did a good number on SSE and discusses Microsoft's encouraging use of a Creative Commons license for the specification.
- Update: Infoworld's Jon Udell recently did a bang-up job of analyzing SSE.

Do you think RSS+SSE will become the dominant Web service standard?

Technorati: web2.0, rss, sse


Thursday, 24 November 2005
Intellectual Hydroplaning With Web 2.0 And Other Matters
A number of interesting events occurred in the Web 2.0 space this last week. And while seemingly unconnected, they all point to an underlying trend that I'll talk about in a moment.

Among these notable events is the growth of the highly informative Web 2.0 Workgroup, of which the blog you're reading right now is part. The workgroup has just reached twenty in number, and I am proud to be a continuing member. In case you're not following this terrific collection of blogs that analyze and discuss the latest trends on the Web, I strongly encourage you to do so. Here's a summary of the membership:


Blogs by Category

Analysis & Trends: Read/WriteWeb, Dion Hinchcliffe, Susan Mernit's Blog, Web 2.0 Explorer
Companies & Products: TechCrunch, SolutionWatch, eHub
Design & Usability: WeBreakStuff, Bokardo, ParticleTree, Emily Chang
VC & Business: Jeff Clavier, Nivi
Podcasting: PodTech, Web 2.0 Show
Tech & Development: Programmable Web, CrunchNotes, Librarystuff
Commentary: Scripting News, HorsePigCow

The next big item in the last week, a landmark event in fact, was undoubtedly the release of the Simple Sharing Extensions (SSE) draft specification by Microsoft. Microsoft's CTO, Ray Ozzie, captured the reasoning behind this brilliant move in an article on his new blog, and the maneuver will almost certainly be judged a masterstroke in the final analysis. The next generation of the Web will be connected by RSS, which, as I wrote about recently, will be the de facto Web service in this new world of mass two-way participation via blogs, wikis, and content sharing. But RSS is problematic because it's a one-way conversation; though it promotes loose coupling by adopting an open-ended, pull-based syndication model, RSS doesn't support two-way content sharing by itself.

And Web 2.0 best practices specifically encourage making things higher quality by making them simpler, so another new protocol or standard that didn't leverage RSS's vast prevalence and its strength in simplicity would have been the wrong move. Microsoft was very wise to instead make SSE an extension of RSS and OPML. As the SSE specification FAQ says, it "defines the minimum extensions necessary to enable loosely cooperating applications to use RSS as the basis for item sharing—that is, the bidirectional, asynchronous replication of new and changed items among two or more cross-subscribed feeds." The upshot is that SSE allows RSS to be much more powerful as a tool for gluing applications together, in a still loosely-coupled fashion. This enables two-way conversations to be much richer and more immediate, all while leveraging the near ubiquity of RSS itself. I'll be talking much more about SSE in the near future as we see it used to construct new Web 2.0 experiences.
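
As a cartoon of what "bidirectional, asynchronous replication among cross-subscribed feeds" means in practice, here is a tiny Python sketch in which each side merges the other's items and keeps whichever copy of a given item id carries the higher version. This is only an illustration of the shape of the idea, not the history and conflict-resolution rules defined in the actual spec.

    # Cartoon of SSE-style replication between two cross-subscribed feeds: merge by
    # item id and keep the copy with the higher version. The real spec defines
    # richer update history and conflict handling; this only shows the basic shape.
    def merge(mine, theirs):
        """mine/theirs map item id -> (version, payload); return the merged view."""
        merged = dict(mine)
        for item_id, (version, payload) in theirs.items():
            if item_id not in merged or version > merged[item_id][0]:
                merged[item_id] = (version, payload)
        return merged

    alice = {"cal-0042": (3, "Lunch at noon"), "cal-0043": (1, "Ship the draft")}
    bob = {"cal-0042": (2, "Lunch at 1pm"), "cal-0044": (1, "Review SSE spec")}

    # After each side pulls the other's feed, both converge on the same set of items.
    print(merge(alice, bob) == merge(bob, alice))  # True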

The last noteworthy item was Jeremy Geelan's incisive commentary today on whether the Web 2.0 architecture-of-participation ideas are really taking us anywhere. While Jeremy apparently buys into the Wisdom of Crowds concept, saying "none of us is as smart as all of us," he paints the blogosphere as an uncritical medium in the extreme and notes that it may be an extremely poor forum upon which to present ideas, because it doesn't actually have a real stage. This leads, he says, to a situation where "anonymity is compounded in six cases out of ten by the kind of vehemence more often associated with the bar-room than the Forum. Bloggers, it very often seems, are all legends in their own minds; they commit arson every day in their imagination, burning down the previous day's lies and distortions. Worse still, so many bloggers suffer from what Albert Camus called 'the sign of a vulgar mind,' namely the need to be right."

In the end, Jeremy finds things like Web 2.0 to be an amazing medium, but still just a medium, no more intrinsically insightful than any other place. I do believe in the need to use automated mechanisms like Memeorandum to sort through the mountains of information we're being buried in, and I believe his concerns might be unwarranted. But Jeremy goes right to the heart of the challenges posed by the kind of participation Web 2.0 entails, and the risk of ending up buried in an echo chamber of unusable discourse.

All three of these things point to a core trend, though: regardless of the endless stream of issues this new vision of the Web creates, actual interest, innovation, and real progress have become dramatically obvious, and I expect that 2006 will be a banner year as big industry players like Microsoft fully deliver on their new Web 2.0 visions, while start-ups and the general public join the fray in increasing numbers. Stay tuned for more; the excitement has barely started...

Technorati: web2.0


Monday, 21 November 2005
Making Web 2.0 Commercially Successful
Following up on this weekend's mainstream media discussion of Web 2.0 being a better, healthier boom than the original dot-com speculation frenzy, I've had some time to reflect on the ways that Web 2.0 software can be commercially successful. Certainly, the most popular model for Web 2.0 startups themselves is mergers and acquisitions. In this model, you typically build a successful service with a unique identity and some hip attitude, attract a large user base, and, as long as you've done it cost-effectively, the attention base of your users alone probably pays for the acquisition in the eyes of the buyer.



But once acquired, how can you stay commercially successful using Web 2.0 ideas? Sure, Google, eBay, and iTunes have shown it can be done, but knowing all the workable routes is important, especially if you significantly depart from the proven advertising model of Google or the pay-per-use models of eBay and iTunes. I think the answers to this question will be fascinating. And having the various pathways to commercial success explored is one of the remaining big stories to be written about Web 2.0.

ZDNet's Phil Wainewright recently laid out the most common approaches to generating revenue from on-demand software services. What he lists are the major revenue models generally open to the kind of software services that commercial Web 2.0 companies will offer.

This just identifies the ways you can gain financial remuneration for your services, though. Being successful in the Web 2.0 era is about more than just generating net revenue; it's about keeping your market share. To stay successful you have to maintain critical mass so that your continually contributed and enriched data stays better than the next person's. It's about having the best content and functionality on the Web, and ensuring it stays that way. And that's where the next important ingredient to commercial success comes in. As I see it, there are (at least) four ways to have content and/or functionality that no one else has: patented techniques, hard-to-recreate data sources, copyrighted content, and secret formulae.



If you look at the big players in the Web 2.0 world, they have one or more of these mechanisms in place to keep you using their service. This might be iTunes with its music and video library (copyrighted content), Google with its closely held best-of-breed search index (secret formula and patented techniques), or del.icio.us with the best bookmarks on the Internet (hard-to-recreate data sources). And for now, with open-content sources like Wikipedia pounding the daylights out of copyrighted sources, you can bet that patents will become an increasing factor in the Web 2.0 world. Just as each of the big computer revolutions in the 1980s and 1990s ended with massive legal battles over who really created the best ideas, you can virtually count on Web 2.0 shaking out over the next five to ten years with battles over the legal protections established around the content and functionality of the dominant Web 2.0 players.

In fact, patents are now understood to be an integral part of a company's intrinsic worth, and the issues surrounding software patents in particular are growing as the number of software patents granted has skyrocketed in recent years. Expect that many players will establish and keep their dominance through the use of patented capabilities, and the resulting disputes will likely be a feature of the Web 2.0 revolution end-game.

What do you think? Will Web 2.0 companies have to resort to patent protections to achieve lasting success?

Technorati: web2.0, patents


Saturday, 19 November 2005
Tolerance and Experience Continuums

Some terrific new explorations and explanations of Web 2.0 have come out this week. I really enjoyed Dan Saffer's
The Web 2.0 Experience Continuum in the latest issue of Adaptive Path's online newsletter (incidentally, these are the thought-leading folks who coined the term Ajax and are behind the buzz-generating Measure Map, among other things). Along the way, Dan uses one of my favorite quotes from William Gibson: "The future is here. It's just unevenly distributed." But specifically, he explores something I've been discussing with a few folks, including SOA expert Jeff Schneider: how to lower the impedance between parts of the Web, particularly between services. Jeff has made much recently of something he calls the tolerance continuum, which he described to me as:

"The durable storage or transaction layer (bottom) is low, while the presentation or user layer (top) is high. Realization that the tolerance continuum exists enables the reduction in structure (rules, constraints, heavy protocols). Reduction in structure enables ‘tolerance on the edge’, ‘compose-ability’ and ultimately, the use of information in unintended manners (increased consumption scenarios)."

In a very similar vein, Dan specifically talks about an experience continuum where unstructured content enables a web that is less about places and nouns and more about verbs and activities (a great citation of Ross Mayfield):

"The tools we’ll use to find, read, filter, use, mix, remix, and connect us to the Internet will have to be smarter and do a lot more work than the ones we have now. Part of that work is in formatting. Who and what determines how something looks and works? On the unstructured side of the continuum, perhaps only a veneer of form will remain. “Looks” will be an uneasy mix of the data and the tools we use to view it. Visual design is moving away from its decentralized locations on websites. Indeed, design is becoming centralized in the tools and methods we use to view and interact with content. Firefox users can already use extensions like Adblock, and especially Greasemonkey, to change the look of the Web pages they visit. RSS readers let users customize how they want to view feeds from a variety of sources. Soon, expect to see this type of customization happening with bits of functionality as well as content."

My personal prediction is that low-impedance mechanisms will flourish dramatically in the coming years, the closer you get to the point of use. Back-end infrastructure will get radically decentralized but remain essentially as formal and structured as it is today. Expect to see interesting things happen at the points where you cross over from one to the other. The tragedy, though, is that the future is arriving at such breakneck speed that equilibrium across our culture doesn't have time to take hold. Dan notes that many people he talks to about the Internet still don't know what a blog is.



The second item that caught my eye was John Battelle's article this week in the New York Times, titled Building a Better Boom. Organizer of the Web 2.0 Conference, among other things, and a respected member of the Internet community, John dismisses the existence of a second Internet bubble outright, citing the lack of IPOs, and notes that most Web 2.0 companies are taking the acquisition route instead. More significantly, this article is one of the first in the mainstream media to introduce Web 2.0 as an explicit term, and John provides good examples and a coherent explanation accessible to the general public. John does a good job explaining a big topic with a very light hand, without hype or buzzwords. A few dozen more articles like this and you'll finally be able to mention Web 2.0 in some circles without taking 10 minutes to explain it first.

Is unstructured content and locationless functionality the future of Web 2.0?

Technorati: web2.0


Friday, 18 November 2005
The (Weak) Pulse of Web 2.0?
Trying to find the actual importance of various aspects of Web 2.0 to the world at large has been a goal of mine for a few weeks now. What is it about Web 2.0 that people specifically find so interesting? Certainly Web 2.0 is a big topic, and with the flood of new Web 2.0 services coming out daily, slews of industry and mainstream articles appearing constantly, and Google seemingly overtaking everything, it's sometimes hard to keep things in perspective. Consequently, I've tried to figure out a way to take the pulse of the community.

So along these lines and in the spirit of using attention as a meter, I conducted an experiment this morning and took a (very) informal Web 2.0 survey using Google's search database.

The Goal: To find out what people are specifically thinking about in terms of Web 2.0 by looking at the content being published.

While the search terms I picked came out of my hat essentially at
random, some interesting things turned up. Here's what I found...

Occurrence of Web 2.0 Terms in Google's Search Index

37Signals 548,000
Ajax 7,500,000
del.icio.us 18,500,000
digg 6,610,000
Flickr 34,800,000
Folksonomy 1,520,000
Google Base 3,680,000
Harnessing Collective Intelligence 795
HousingMaps 52,300
JotSpot 612,000
Mechanical Turk 625,000
NetVibes 264,000
Office Live 1,740,000
Tag Cloud 2,260,000
Tech.Memeorandum 157,000*
TechCrunch 1,940,000
The Long Tail 1,680,000
Web 2.0 16,900,000
Web 2.0 Conference 367,000
Web 2.0 Workgroup 81,400
Wisdom of Crowds 821,000
Software As A Service 591,000
Viral Marketing 2,330,000
Customer Self-Service 550,000


Google's search index handily reports how many times an item appears in its database, and that is the number you see above. If you put your search term in quotes, you get only exact matches, which is what I did. Armed with this approach, I selected obvious (to me) terms like "Web 2.0", "Ajax" (deconflicted from other uses by adding "XML" and "JavaScript" in two searches and splitting the difference), "Office Live", "Folksonomy", and the others you see above.
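
For anyone who wants to repeat the experiment, the mechanics amount to nothing more than wrapping each term in quotes and reading off the result count Google reports. Here is a tiny Python sketch of building those queries; I did the actual counting by hand in the browser.

    # Tiny sketch of the survey mechanics: wrap each term in quotes so Google
    # returns only exact-phrase matches, then read the reported result count by
    # hand from the results page. Only the URL construction is automated here.
    import urllib.parse

    TERMS = ["Web 2.0", "Ajax", "Harnessing Collective Intelligence", "Tag Cloud"]

    for term in TERMS:
        query = urllib.parse.quote('"%s"' % term)  # quotes force exact matches
        print("https://www.google.com/search?q=" + query)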

The result paints an overall picture of the current general relevance of various Web 2.0 ideas. It shows that Web 2.0 as an explicit concept is extremely popular (16.9 million unique occurrences), but tellingly, many of its actual ingredients are discussed hardly at all. For example, witness the amazingly unused term harnessing collective intelligence. Google claims that there are fewer than 1,000 occurrences of it on the entire Web. No one is talking about it (and yes, I tried many variations; this was the most frequent), yet this is one of the more important pieces of Web 2.0. What's going on here, folks? Is Web 2.0 discussion really this shallow? It does make you think.

For sure, this is not the least bit scientific, but it's a good rough barometer and tells us that more substance needs to be brought to the table. As a counterpoint, though, you can see that Web 2.0 poster children del.icio.us and Flickr are extremely popular terms, and you might make a case that this represents people just doing Web 2.0 things rather than talking about them. I am surprised, though, to see even relatively well-known concepts like Software As A Service and Customer Self-Service get relatively little play compared with Flickr's whopping 34 million occurrences. Our beloved Tech.Memeorandum isn't exactly setting things on fire yet either, with a paltry 157,000 instances compared to digg's 6+ million.*

I suppose my point with all this is that while Web 2.0 things are actually happening all over the place, meaningful discussion that goes to the substance of the movement is still lacking. While the wonderful folks at the Web 2.0 Workgroup are collectively doing a terrific job (I particularly like Richard MacManus' detailed discussion and analysis lately of Yellowikis, for instance), I encourage all of us to remember that little of this is in the public mind yet. And it will take years of evangelism, promotion, discussion, and vigorous debate before the actual concepts and best practices that Web 2.0 represents become truly mainstream. Getting the word out is up to us; it's all about reaching critical mass in the end.

Update(*): Gabe Rivera of Memeorandum gently pointed out (below) that memeorandum is a much better search term for his site, and it's true. Google's numbers show 1,920,000 references to it, which is certainly more plausible. My mistake for looking exclusively at the tech side of it.

Is the Web 2.0 community self-important? What do you want to see us talk about?

Technorati: web2.0


Tuesday, 15 November 2005
Finding the Real Web 2.0
As I prepare to give a keynote address about Web 2.0 to a DC area technology industry day, I have been looking hard for classic examples of Web 2.0 phenomena happening at a grassroots level. It's one thing for Web 2.0 applications to be hyperscrutinized and lauded by the technorati, but it's quite another for everyday Web users to actually be doing this stuff of their own accord. For example, I look at Tech.Memeorandum's coverage of Amazon's new Web 2.0-friendly tags, and I'm not so sure it's setting the world on fire yet.

I get more encouraged when I see simple, straightforward Web 2.0 solutions like SuprGlu, which allows anyone to dynamically integrate Web services that matter to them from all over the Web in just a few minutes. I use this example to show how anyone can do meaningful and actually useful Web 2.0 style service remixing and syndication with a few points and clicks and get something out of it that can
in turn be reused. And it shows that Web 2.0 as the Global SOA is real and here today as well.

But the success and popularity of SuprGlu isn't evident yet either. I want examples that show us that harnessing collective intelligence is a normal and common event, that network effects and large-scale, Web-enabled customer self-service really are changing the world from the bottom up. I don't find top-down efforts by Web 2.0 startups or widespread blogging about Web 2.0 to be compelling evidence yet, as interesting as those things are to me personally.



Enter an absolutely fascinating article in the December 2005 issue of Discover magazine. In another piece of written virtuosity, technology writer Steven Johnson finds that Web 2.0 is happening right in front of us. If you recall, Steven Johnson did a bang-up job on one of the first mainstream Web 2.0 articles in print, in Discover magazine earlier this year. In this new piece, which is not available online right now (so I'll use fair-use quotes here), Johnson explores how people used Web 2.0 techniques, entirely unconsciously, to effectively deal with the aftermath of the Katrina hurricane disaster on a large scale.

It's a fascinating read and Johnson's premise is this: "Ordinary citizens can't do much about a 150-mph wind or a 30-foot wave, other than get out of the way. But the Internet revolution teaches us that ordinary citizens can play a crucial role in creating nimble new channels of information that are more resilient than official channels." The article chronicles the search for survivors in the chaos after the disaster and how people attempted to find each other using the Web.

At first, once the hurricane passed, there were a few isolated Web posts announcing a person's safety or location that were spread across blogs and a few national sites like craigslist. What's interesting is how the initially sporadic, individual efforts quickly coalesced into a self-organized effort of thousands of individuals that unconsciously leveraged the Web 2.0 memes of radical decentralization and harnessing collective intelligence.

Says Johnson, "on Saturday, September 3, as the catastrophe worsened, a handful of tech-savvy volunteers led by David Geilhufe started gathering data from these [isolated survivor posts] by 'screenscraping' the information for each person and depositing it in a single database." Geilhufe even concocted a standardized information structure called the PeopleFinder Interchange Format to encode the information. The next day there were thousands of missing-person notices online that weren't in this format, because they were free-form and not machine-readable. So well-known online figures Jon Lebkowsky and Ethan Zuckerman recruited volunteers and, in a self-organizing effort, assembled thousands of people in a single day to create over 50,000 entries.
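
The heart of that volunteer effort was turning free-form "I'm safe" posts into structured records that a shared database could aggregate. Here is an illustrative Python sketch of that normalization step; the field names are my own invention for illustration and are not the actual PeopleFinder Interchange Format schema.

    # Illustrative sketch of the volunteers' core task: turn a free-form survivor
    # post into a structured record a shared database can aggregate. The field
    # names are invented for illustration; this is not the real PFIF schema.
    import re

    def normalize(post_text, source_url):
        # The volunteers mostly did this by hand; a regex only catches easy cases.
        match = re.search(r"(?P<name>[A-Z][a-z]+ [A-Z][a-z]+).*?"
                          r"(is safe|is ok|located) in (?P<city>[A-Z][\w ]+)",
                          post_text, re.DOTALL)
        if not match:
            return None  # leave the hard cases to a human volunteer
        return {
            "full_name": match.group("name"),
            "last_known_city": match.group("city").strip(),
            "source_url": source_url,
        }

    post = "Jane Dupre is safe in Baton Rouge, staying with relatives."
    print(normalize(post, "http://example.org/forum/post/123"))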

The result was katrinalist.net, which allows anyone to search for friends or relatives despite the fact that the information was originally deposited in free form text on some other site. Katrinalist's data source was a kind of emergent, self-organizing Mechanical Turk effort that harnessed the collective input of thousands of people to form a new high-value data source that was of enormous benefit to the world at large, all in a handful of days. Johnson says, "PeopleFinder was the kind of data management effort that could have taken a year to execute at great expense if a corporation or a government agency had been in charge of it."

Katrinalist.net is exactly the sort of thing I would expect to see happening if Web 2.0 principles not only work, but are the inevitable effect of hundreds of millions of well-connected users with two way access to the Web. Imagine what they can do for your organization or your personal dreams, if leveraged properly.

Technorati: web2.0


Friday, 11 November 2005
RSS is the Web 2.0 "Pipe"
Problematically, RSS is still not quite a household word, and even the software industry is just beginning to realize the importance of this workhorse syndication format, though at this point it's already clear that RSS will be the key enabler of Web 2.0 and Software as a Service. It will do this both as a notification system and as the actual glue that will eventually hold many Web 2.0 services and mash-ups together.

Dave Winer famously created the current incarnation of RSS, but its implications are still rippling through the industry. The folks who can fully appreciate RSS will reap corresponding rewards. Microsoft CTO Ray Ozzie is a good example of the folks who "get" the significance of RSS. I love the quote from his much-discussed leaked memo this week and I haven't been able to stop using it for the last day: "[RSS] is filling a role as ‘the UNIX pipe of the internet’ as people use it to connect data and systems in unanticipated ways."

And we can't forget that RSS feeds, storage, and synchronization will be a central new feature of the next version of Windows.

So expect to see RSS on every blog, every Web 2.0 service, web site, and piece of desktop software going forward. If you can't find the feed, you can be sure whatever it is won't last long.

And for the fully buzzword-compliant, and for the record, I do fundamentally believe REST/RSS is the new EAI, and the glue of first choice for lightweight SOA as well. I'm actively starting to see some folks drop their traditional Web services and go RSS wholesale.


Of course, the real problem right now is that most people on the Web still don't have any idea what RSS is. At best, the average Web user might understand that RSS forms some kind of information "feed". More sophisticated users notice that if they can find an RSS link somewhere (a blog or news site, for example), they can use it like a URL to get updates of information within services like My Yahoo, Bloglines, or something called an RSS reader.

Murkiness and partial examples are the enemy here. Raising awareness and finding clear examples that fully express the potential and power of RSS should be the goal.

Here are some clear, canonical examples that I think convey the full scope of what RSS does for us in a Web 2.0 world:

RSS Use Cases

Notification: Need to inform a lot of people about changes to information? Don't want central control? Want to enable self-service? Use RSS.

Syndication: Publishing new information regularly? Put it into an RSS feed. This flows your blog entries, news articles, podcasts, videos, job posts, weather reports, financial updates, bug reports, and so on out to the world. The software you use should be able to take your information and make it into an RSS feed; if your current software can't, find new software. It's that important.
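
If your software can't produce a feed for you, rolling a minimal RSS 2.0 feed by hand is genuinely this small. Here is a bare-bones Python sketch; a real feed should also carry dates and GUIDs, ideally via a proper feed library.

    # Minimal RSS 2.0 feed generator: enough to syndicate a list of items from any
    # application. Real feeds should add pubDate and guid elements (ideally via a
    # feed library), but the shape of the format is just this.
    from xml.sax.saxutils import escape

    def rss_feed(title, link, items):
        """items is a list of (title, link, description) tuples."""
        out = ['<?xml version="1.0"?>', '<rss version="2.0"><channel>',
               "<title>%s</title>" % escape(title),
               "<link>%s</link>" % escape(link),
               "<description>%s</description>" % escape(title)]
        for item_title, item_link, description in items:
            out.append("<item><title>%s</title><link>%s</link>"
                       "<description>%s</description></item>"
                       % (escape(item_title), escape(item_link), escape(description)))
        out.append("</channel></rss>")
        return "\n".join(out)

    print(rss_feed("Bug reports", "http://example.com/bugs",
                   [("Bug #42 fixed", "http://example.com/bugs/42", "Resolved today")]))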

Glue: Need to connect one service to another on the Web (or anywhere else)? Trying to mash together data? Building supply chains? There is generally no need to ever ask anyone to stand up a new web service. Pull everything you need via its RSS feed. Some software developers will disagree and say there are better methods, but to this I point out that 1) RSS is robust enough that it's all you'll ever need nine times out of ten, and 2) it's what you're going to be offered automatically anyway, so take it and get something done.
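
On the consuming side, pulling someone else's feed into your own application is just as light. Here is a sketch using only Python's standard library; a dedicated feed parser is a better choice once you care about the messier feeds in the wild.

    # Sketch of RSS as glue: pull another service's feed and reuse its items in
    # your own application, no custom web service required. The standard library
    # is enough to show the idea; a dedicated feed parser handles more edge cases.
    import urllib.request
    from xml.etree import ElementTree

    def pull_items(feed_url):
        with urllib.request.urlopen(feed_url) as response:
            channel = ElementTree.fromstring(response.read()).find("channel")
        for item in channel.findall("item"):
            yield item.findtext("title"), item.findtext("link")

    # Mash two sources together into one list: the simplest possible supply chain.
    # combined = list(pull_items("http://example.com/bugs.rss")) + \
    #            list(pull_items("http://example.com/news.rss"))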

RSS creates the Web 2.0 information ecosystem by enabling interconnectedness, network effects, emergent behavior, and much more as well. And RSS doesn't demand control of the other end of the conversation. This is a big enabler all by itself and is a classic Web 2.0 force. By letting consumers of RSS use any tool or service they want on their side, barriers are eliminated and connectivity is encouraged.

That's not to say RSS doesn't have its weaknesses, and certainly there are other ways to create feeds, but RSS has the mindshare, the support, and the goods right now. So though it's not perfect, it's more than up to the job. Let's spread the word...

What did I miss? What other canonical examples are there?

Technorati: web2.0, rss