Update, 04-03-08: Version 2.0 of this talk is available here. This original version was delivered to a meeting of publishers; V 2.0, the kinder, gentler version, was presented at a meeting of developmental biologists.

Last week I gave a talk at the Association of American Publishers’ Professional and Scholarly Publishing (AAP/PSP) meeting in Washington, DC. I was part of a panel discussion on “Innovative and Evolving Websites in STM Publishing,” along with representatives from the New England Journal of Medicine, the Royal Society of Chemistry and the American Chemical Society. While the other talks were a bit more evangelical, or mostly presented a look at new technologies that had been incorporated into the societies’ own journals, I tried to be a bit more practical, taking a harder look at what’s currently being tried, whether it’s succeeding, and the reasons behind that success or failure. I’m posting my talk below in hopes of receiving further feedback. The talk was delivered to a room full of publishers, so it’s directed with that audience in mind. In a few months, I’m giving a similar talk to a meeting of scientists, the users of these sites rather than the creators. So I’d love to hear from users as to your thoughts on how Web 2.0 is serving your needs.


I began the talk with some background about CSHL Press (you can get that here) and about me. I’ll skip most of it, other than to note that for several years I served as CSHL Press’ Online Content Manager, overseeing much of our web strategy, and that I now run CSH Protocols, our attempt to move our successful laboratory manual publishing business online. I noted that I spent six years doing research in a lab where new technology development was the raison d’être, that I consider myself a technophile, and that I’m always on the lookout for new and exciting tools. I should also note that CSHL Press is a biology-focused publisher, and my comments address that market specifically. The lessons learned are definitely applicable elsewhere, but their applicability may depend on the culture of a given field.

Web 2.0 in Life Sciences

Odds are, if you follow the scientific news and literature, you’ll be familiar with the large amount of hype that Web 2.0 technologies, and their potential use in science publishing, have been receiving. There have been lots of articles, and lots of publishers (as well as individuals) have jumped into the ring with efforts to capitalize on this growing trend. This slide presents a rough list and is far from comprehensive. I put it together spending about half an hour with Google, but you can see how many entrants there are in the market (a new one just came to my attention as I was writing this). The list includes “social networking sites,” which host blogs, connect collaborators, offer discussion boards, etc.; “referral sites,” which let you tag and promote interesting papers or store all of your references online; and a few other endeavors, some incorporating Wikipedia-type information, others looking to take advantage of the growth in online video. So the question is: how are these sites doing? Are they being used by mainstream scientists?

I’ve been conducting an informal poll for the last four months, asking every scientist I know which science blogs they read or which social networking science websites they use. I’ve spoken with undergraduates, graduate students, postdocs, PIs and department chairs. So far, 100% of those polled have answered, “None” (I’ve cleverly illustrated this concept using the parlance of the internet, where the main purpose appears to be posting cute pictures of cats). This response was really disappointing. If you spend much time in the science blogosphere, everyone seems to be talking about these great tools and the changes they’re making in research science. But when you step away from the enthusiasts and speak with the majority of scientists, you find out that they don’t have much interest in using many of these new technologies. The whole situation reminds me quite a bit of what one saw online regarding the Linux operating system 5 to 10 years ago: great enthusiasm, and predictions that Linux was soon to take over the computing world. The rest of the world shrugged and went back to their Windows computers to get their work done.

I’ll try to discuss some of the reasons things aren’t working and make a few suggestions that might help. I certainly don’t have all the answers, I’m sorry to say, but hopefully I’m pointing in the right direction. Most of it boils down to the tools not being well designed for the desired audience. The hype is right in some ways–the potential is there, and it is something we should be excited about, but we’re failing to channel that potential into compelling tools that will catch on with the community. A successful tool will address a need of the user, and will do so in the context of the culture of the user. You’re unlikely to get a well-established culture to change just to suit your tool, no matter how much promise it shows. While most of the tools available are clever ideas, or seem useful on the surface, their lack of traction should be telling you that something’s not quite right. Although these comments are meant to address Web 2.0 sites, they’re applicable to most other websites as well (and I’ll use a few as examples).

Reasons for lack of adoption: Time

Time is the key component here, something I’ll mention many times in this talk. Web 2.0 is based on the idea of user-created content: you put up a site, and the users create the content. That requires a hefty time investment from the users. The problem is, you’re dealing with a base of users who are overscheduled and overworked. Furthermore, there’s no professional incentive to participate in the content-building at your site. The quote used here, from a postdoc, is important:

“I can barely keep up with the literature in my field and with what my labmates are doing. Who has time to spend reading some grad student’s blog?”

You also have the paradox that those whose input would matter most to your site are the least likely to contribute. I’m more interested in reading comments from a prominent researcher, or seeing her tags on papers that she finds interesting, rather than the less-informative opinions of a beginning graduate student. And yet those early graduate students are much more likely to be active on your site. Later in their careers, as they become more successful, their schedules get more crowded, and participation will wane just as it is becoming most valuable.

Is your site easier/faster than the already existing alternatives?

The goal should be to create tools that save time and effort, not new ways of investing huge amounts of time and effort. A couple of examples:

Molecular Cloning is our best-known laboratory manual (known informally as “Maniatis” or “The Bible”). When the most recent edition was released, we created a website to accompany it, MolecularCloning.com (it’s now defunct). The site was essentially an electronic, online version of most of the book: you bought the book, and you got a unique user number that let you access the material. Only a relatively small percentage of book buyers ever logged on to the site using their number. An even smaller number became regular users.

If you think in terms of time, in terms of efficiency, it’s fairly obvious why this site failed to catch on. The site only offered the same material the user had already purchased in the form of the book. When it was time to do an experiment, the user had two courses of action:
1) Go to the computer, go to the Molecular Cloning website, log in, search for the protocol, print the protocol, do the experiment.
2) Grab the book off the shelf, do the experiment.

So what we had done, essentially, was turn a two-step process into a six-step process. The advantages we were offering, better search capabilities and discussion forums, didn’t outweigh the downside of losing efficiency. We’ve tried to address these issues with CSH Protocols by taking advantage of scale. Instead of one book you’ve already purchased, CSH Protocols offers material from more than 25 books, along with material from CSHL Courses and newly published protocols not available anywhere else. Although you do have to go through the steps above, it’s still quicker than looking through an entire library of books for your protocol, especially since your lab is unlikely to own all of the books CSH Protocols contains. Because CSH Protocols is sold by institutional subscription, the cost to an individual lab is also vastly lower than purchasing the content in book form (and we’ve eliminated the logging-in step above). So far, given our high level of traffic and usage, the advantages of scale and the efficiency they bring are providing something more useful to readers than our previous efforts.

As another example (and I’ve blogged about this previously), in a discussion with blogger and technology enthusiast William Gunn, he talked about using Connotea for journal clubs. Again, let’s review the choices for the user:
1) Go to the computer, go to the Connotea site, sign up for a Connotea account, log in, add the link to the paper that’s going to be discussed, add tags to the paper, e-mail those links out to members of the journal club. Members receive the e-mail, go to Connotea, create an account, log in, go to the link they were sent, follow that link to the actual journal’s website, download the pdf, read, discuss.
2) Go to the computer, download the pdf for the paper, e-mail it to the members of the group, read, discuss.

You tell me which is a more efficient use of precious time.

Why aren’t my readers leaving comments?

Like many other journals, CSH Protocols has discussion forums where one can comment on a paper or ask questions about a technique. This seems like a no-brainer on a methods site, right? If you’re having trouble with an experiment, why not ask the crowd for suggestions? Creating these commenting forums in a journal seems inevitably to lead, a few months later, to a second phenomenon: an editorial wondering why no one is using the forums to leave comments. In the 1.5 years of CSH Protocols’ existence, we’ve seen a total of around four or five comments left on more than 1,000 published articles. There are many reasons behind the failure of these commenting systems (I’ll discuss cultural reasons later), but the main one is that there’s no incentive for leaving comments. Again, you’re dealing with a limited amount of time, so why spend it on something for which you receive no credit? Where’s the upside in leaving a comment on someone’s paper?

In our case, as a methods journal, our readers are usually doing experiments that involve expensive reagents and that take up valuable time. If they’re having technical issues, does it make sense to ask strangers for help, or would a more directed search for a trusted source who has done the method before serve them better? Would you trust the advice of a stranger when each experiment costs hundreds of dollars in reagents? And are those theoretical strangers really wandering through our articles looking for poor souls in need of advice? Why would they waste their time playing the Good Samaritan when they’ve got work to do as well?

Quality = Relevance

This quote comes from a talk given by Ian Rogers of Yahoo! Music at a Music Industry meeting, but it’s very relevant for our business as science publishers. We’re going from a world where there were fairly limited numbers of information sources, and hence lots of attention available for those sources, to a world of nearly unlimited information and attention scarcity. There is greater need now than ever for editorial oversight, for the separation of signal from noise, or as he puts it, “quality = relevance”. If you can make it easy for your readers to get the information they’re seeking (relevant to them), then you’re doing a high quality job for them, something they’re willing to pay for, even when there are free sources of less-well-organized information available.

Even Advertising can be a helpful timesaver

Can you come up with ways to save your readers time, to get them the information they want faster than they could gather it on their own? Here’s one example of something we’re trying on CSH Protocols, to turn advertisements from an annoyance into a useful feature. When our readers want to use one of the techniques we publish, they have to track down and order all of the reagents they need for the experiment. This can take a lot of work, digging through a stack of catalogs or going to 20 different companies’ websites to find everything. At CSH Protocols, we’re partnering with the suppliers of these reagents, Sigma-Aldrich in this case, and creating advertisements tailored for individual protocols. The user can click on one ad and be taken to the supplier’s site, where a customized page has been prepared and they can order everything they need for that experiment. Everyone wins: we get advertising dollars, we make our readers happy by saving them time, and Sigma is happy because they get more business.

Reasons for lack of adoption: Inertia

Another big reason for the lack of adoption of new technologies is inertia. By “inertia”, I mean a couple of related things.

First, most Web 2.0 sites aren’t useful until they’ve got a high level of participation. If the users are creating content, no users = no content. If there’s no content, no users are going to bother participating, rinse, lather, repeat, the circle goes around and around.

This problem is compounded by the proliferation of “me too” sites. Here you see nine different sites that all serve similar purposes. If I have limited time and each site requires a substantial time investment, how am I going to choose which one to use when they all offer essentially the same thing? What happens instead is that most people choose not to choose, and sit things out until a clear winner emerges. For those who do pick a site, the site they’ve chosen is only one of many, so it sees less traffic than if there were fewer available, which means less content, which means it’s less useful.

Lower the barriers to entry

The second thing I mean by “inertia” is the idea that if I’ve already got a way to do something, it’s going to take a lot to make me change to a new way. Doing something new takes effort and (again) time. If you don’t believe me, ask someone you work with to change computer operating systems, or ask your production department to switch from Quark to InDesign. If you want me to switch, you have to not just be better, you have to be way better for me to make that effort.

I’ll use the example again of the Molecular Cloning website, where putting the book online didn’t offer anything significantly better than the print version, so readers just stuck with what they already were using. As I mentioned earlier, we’re trying to address this problem by offering a product that gives the reader much more than they currently have available.

If you want your new site to be used, you need to lower the barriers to entry. Usability is often a huge barrier preventing new users from jumping in. Your tool has to be obvious: not only why you would use it, but how you would use it. The iPod is a good metaphor here. There were mp3 players around for years before the iPod, yet none saw much traction in the market. The why was obvious, as these players all offered significant advantages over carrying around a portable CD player. But the players were clunky and difficult to manage. One of the main reasons the iPod caught on like wildfire was that it didn’t require a manual to use. You pick up an iPod and the “how” is obvious.

Another great way to lower the barriers to entry is to use standard file formats. Work with what people already have, let them re-use the efforts they’ve already made, and then do as much of the work as possible for them. As an example here, look at the various referral sites available for keeping your reference lists online. Most of these sites allow you to import your reference list of papers from commonly used programs like EndNote. And that’s great, a really smart way to work, a big time saver. But the big time sink on these sites is going through each paper and adding subject tags. If you’ve been a scientist for long, you have thousands of papers in your reference list. Why not automate this? Most papers published in journals feature keywords. Why aren’t any of the referral sites building in automated retrieval of keywords for the papers listed, and setting those up as tags? Then you’d immediately have a usable database of tagged papers to start with, rather than facing hours’, if not days’, worth of work. It wouldn’t be perfect, but at least some of the work would be done for you.
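To make the idea concrete, here’s a minimal sketch of the sort of automation I’m describing, assuming the site already has PubMed IDs for the imported references. It pulls each record’s author keywords and MeSH terms from NCBI’s public E-utilities service and treats them as starting tags. The function names and the tag-seeding workflow are illustrative assumptions on my part, not any existing site’s API.

```python
# Sketch: seed an imported reference library with tags pulled from PubMed.
# Assumes the references carry PubMed IDs; the function names and the
# pmid -> tags mapping are illustrative, not any real site's data model.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

EFETCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"

def fetch_keywords(pmid):
    """Return the author keywords and MeSH descriptors PubMed lists for one article."""
    params = urllib.parse.urlencode({"db": "pubmed", "id": pmid, "retmode": "xml"})
    with urllib.request.urlopen(EFETCH + "?" + params) as response:
        root = ET.parse(response).getroot()
    keywords = [kw.text.strip() for kw in root.iter("Keyword") if kw.text]
    mesh = [d.text.strip() for d in root.iter("DescriptorName") if d.text]
    return sorted(set(keywords + mesh))

def seed_tags(pmids):
    """Build an initial pmid -> tags mapping the user can then refine by hand."""
    return {pmid: fetch_keywords(pmid) for pmid in pmids}

# Example usage, given a list of PubMed IDs extracted from the imported EndNote library:
#   initial_tags = seed_tags(pmids_from_endnote_export)
```

The point isn’t this particular code, it’s that the site, not the user, should be doing this grunt work at import time.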

And don’t just think about ways for the user to put their information into your site; also consider ways for them to export their information from your site. If you can provide functionality so the effort they put in at your site can be re-used elsewhere, your site suddenly becomes much more attractive. This also helps overcome the “me too” issues mentioned above. If I’m going to spend my time putting my information into your site, I want to be assured that if I’ve picked the wrong site, I can get my work out again for use on the eventual winner. If you offer that ability, the user is more likely to take the gamble.
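Again, just as a hedged illustration rather than a prescription: export doesn’t have to be elaborate. Writing each saved reference, tags included, out in a standard bibliographic format like RIS (which EndNote and most reference managers can read) gives users an exit ramp. The record fields below are assumed names for how a site might store a reference, not any particular product’s schema.

```python
def to_ris(record):
    """Serialize one stored reference (a plain dict) as an RIS record.

    The dict keys (authors, title, journal, year, tags) are assumed field
    names for illustration, not a real site's data model.
    """
    lines = ["TY  - JOUR"]                      # record type: journal article
    for author in record.get("authors", []):
        lines.append("AU  - " + author)
    lines.append("TI  - " + record.get("title", ""))
    lines.append("JO  - " + record.get("journal", ""))
    lines.append("PY  - " + str(record.get("year", "")))
    for tag in record.get("tags", []):          # the user's tags travel along as keywords
        lines.append("KW  - " + tag)
    lines.append("ER  - ")                      # end of record
    return "\n".join(lines)

def export_library(records, path):
    """Write the user's whole library to a single .ris file they can import elsewhere."""
    with open(path, "w", encoding="utf-8") as handle:
        handle.write("\n".join(to_ris(r) for r in records))
```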

Reasons for lack of adoption: Culture

Beyond the practical matters of time, there are also cultural reasons why the current tools are failing to catch on. I keep seeing the phrase “Myspace for scientists” used to describe new online efforts, and it drives me crazy (examples here, here, here, here, here, here, …). Myspace is targeted at a particular culture, and while it works well for that culture, the idea of shoehorning other groups into its functionality is flawed. Scientists interact in very different ways than teenagers and their peers, or rock bands and their fans. Scientists don’t find collaborators by chatting online with strangers.

Tools need to be appropriate to the culture of your readers

As an example here, let’s go back to forums that allow you to comment on published papers. If you go to a biology meeting and study the behavior of the participants, you’ll see some consistent patterns. The big shots, and the networkers currently on the job market, sit in the front rows and ask the majority of the questions of the speaker. The graduate students sit quietly at the back, and if they have a question, they approach the speaker after the talk in private. I’m not saying this is a good thing, but there does seem to be a certain etiquette involved, or at least a wariness about exposing oneself in public. Now, given that the younger students are those most likely to have the time to participate on your site, isn’t it likely that they’ll continue to follow the social patterns ingrained in their field? Unlike teenagers chatting, professionals making public statements have responsibilities, and there are consequences to their words. Are your readers going to be willing to leave a critical comment on a paper publicly, where everyone can see it for all time? Are they going to criticize work by someone who may one day be deciding on their grants, or offering them a job?

Appropriate Tools

Don’t get me wrong, there are actually some sites making great use of these new tools, and doing so in a manner appropriate to the culture.

As I said earlier, scientists don’t need social networks for chatting with online friends or posting pictures of last weekend’s drunken bash. For scientists, social networks are all about jobs: finding openings, or finding candidates for openings. SciLink is a site that understands this; they’ve taken the LinkedIn model and tried to build a similar site for scientists (the problem here is that most scientists seem to be cutting out the middleman and just going directly to LinkedIn). Unlike Myspace, it’s not a place you would visit every day to check in on things. You’d set up a page, and really only spend much time there during the rare periods of your career when you’re actively seeking a job or actively trying to hire someone.

Another great new Web 2.0 site comes from Scientists and Engineers for America (SEA). Given the paucity of funding available for science, and the lack of evidence-based reasoning in the policy decisions seen in the US government over the last 7 or so years, more and more scientists are interested in becoming involved in the political process. But for many of us, it’s unclear how to do this. Where does one start? What does one do? SEA is creating tools that provide information and direction for scientists who want to have more of a voice in Washington. One very impressive tool is their new Science, Health and Related Policies (SHARP) Network. This is a wiki-based project that gives you quick access to the science policies of elected officials and candidates. Here you’re likely to see more success for Web 2.0 tools because 1) there’s no deeply ingrained cultural precedent for how scientists express their political activism, so there’s no existing habit to displace, and 2) it’s an area where you’re likely to find enthusiasts, people passionate enough to commit the time needed to participate.

There are also many great community sites, like FlyBase and WormBase, that are thriving. I’m not sure these would really be considered “Web 2.0” sites, as most of their content is more like a standard database or an online book. If you take a look at the more interaction-driven areas of these sites, like the discussion forums, you see very little traffic. Clearly there are tools that work for these communities on these sites, and tools that haven’t caught on.

Are blogs useful tools?

Are blogs useful tools? The answer is yes, they do serve a purpose, but you must understand what that purpose is. Like the other tools I’ve mentioned today, blogging (at least blogging well) takes a lot of time and effort. If you’re going to commit that time and effort, you should be clear on what you’re getting out of it, and who your audience really is.

As I said at the beginning of this talk, I’ve had a hard time finding any scientists who regularly read science blogs. A science blog can be a good tool for a student, a way to improve their writing and communication skills, but it’s unlikely they’re going to come to great scientific prominence from blogging. There also seem to be some partially justified fears that blogging is more harmful to a career than helpful. A key quote from Sean Carroll (the blogging physicist, not the evolutionary biologist):

“A blog raises your profile, but it raises your profile for something other than research,” Carroll said. And even if you are extremely productive as a scholar, he said, some professors may view a blog as a sign that you could be spending more time in the laboratory or library, engaged in traditional research.

So who reads science blogs?
1) Other science bloggers
Go read any science blog and take a look at the comments people leave: nearly every single one contains a link to the commenter’s own blog. There’s a relatively small, circular group here of scientists who are interested in blogging, who write blogs, read blogs, and leave comments on other people’s blogs. These are the enthusiasts, not the mainstream. While they may be harbingers of future developments, right now they’re not getting the attention they think they are.

2) Non-scientists / Non-specialists
Probably the majority of people who read science blogs fall into this category. The quote used on the slide is again from Sean Carroll. As he notes, scientists already have very efficient methods of communicating their work with one another (publishing papers, giving talks, e-mail). Blogs, for him, are a great way to reach people outside of your field, and, as he puts it, “We don’t have a lot of goals other than us having fun.”

The next two sets of readers are the important ones for publishers to consider.
3) Journalists
Science journalists are clearly reading blogs and using them as fodder for story ideas. Blogs serve as a great place for a scientist to translate an important finding into clear language so journalists can pick up on the significance of the discovery. Blogs can be more effective than press releases, as they can cover a subject in much more detail and over a longer period of time. An editorial blog highlighting the papers your journal is publishing is a great way to get recognition for your authors.

4) Search Engines
You want people to find the material you’re publishing. Blogs help you out here, as they create more opportunities for your published content to come up in a web search. By blogging about the published material, you create new search strings, new ways for searchers to find your articles. By adding links to the content, you help increase the position of your material in search engine listings. That’s one of the main reasons we started the Bench Marks blog, really thinking of it more as a marketing endeavor than anything else. It helps raise awareness about the material we’re publishing, drives traffic to our journal site and helps our search rankings.

Don’t feel like you have to have a site, and if you do, have a business model

Following along with that point, it’s always important to have a clear picture of why you’re creating a website, and this is even more important for Web 2.0. You shouldn’t feel pressured to jump into this world just because you keep reading articles about it, or because it seems like everyone else is doing it. It’s very easy to spend a lot of money and a lot of time and never see any return. There’s not much apparent money being made on Web 2.0 for science at this point. It’s important to know why you’re building a tool from the get-go. Have a real business model; don’t just build the site and assume you’ll find a way to monetize it later. A few quick examples are shown here.

We originally envisioned CSH Protocols as a fairly different entity than it has become. It’s less the free-wheeling database and community center we expected, and is morphing more and more into a monthly journal. That’s good in that readers are familiar with the paradigm, and it has a clear and proven business model behind it. It’s bad in that we’ve had to spend time and effort re-working and reverse engineering the site as our model changed.

Other models you see in use include getting users to create content and then monetizing that content by selling advertisements, basically the modus operandi of Google and the Nature Network. SciLink’s business model is more about collecting data and doing analysis for companies: they can provide information to headhunters, and to HR departments, regarding trends in hiring both internal and external to companies. And lastly there are sites like this blog, which are not revenue-generating products but instead serve a marketing purpose.

Summary

In summary, here are the key points, ideas to keep in mind when trying to design new online tools for scientists. As I said at the beginning, the potential is great, but it has yet to be realized. I don’t expect the sites that finally break through to mainstream usage in the biology community to be mere copies of things that have worked elsewhere; I expect they’ll be specifically tailored to fit the needs of the community. Hopefully, by following some of the directions pointed to here, you’ll come up with the exciting new tools I’m seeking. And I’ll just end with an illustration from the always-entertaining Adam Koford, an uplifting “unicorn chaser” to wash away all my negative talk of things that aren’t working, and hopefully to inspire us all towards a shinier future.