So, what's this "PLoS One" thing I was talking about, anyway? PLoS is, as many will know, the acronym for the Public Library of Science, one of the beacons of the open-access science publishing movement. They had about half a dozen journals (almost entirely in the medical/biological fields) until recently, when they added PLoS One.
No, it's not the one with fewer calories. I'm not sure why the name was chosen, except perhaps as an attention-getting device. (Is there going to be a PLoS Two?) It is a fairly radical publishing move, establishing something that's part preprint server, part refereed journal, and part user-ranked content site. Papers can be submitted in just about any area of science, and will be checked to make sure that they're methodologically sound - that is, that their conclusions can reasonably be drawn from the evidence that they present. And that's it. Once past that point, everything gets in.
So how do you find out what's worth reading? Here's where the user-generated part comes in. You can leave comments on any aspect of any paper, and as long as they're presented reasonably, they're in to stay. The more comments and recommendations a paper gets, the more attention it will continue to draw. And (although they're not enabling this yet) there will be a ranking system, where readers can assign scores to each paper they've read, with visible aggregated ratings: Science meets Slashdot (or Digg, or Reddit - and yes, I know that readers of each of these sites spend a fair amount of time making fun of each other). The PLoS people are bypassing most of the debate about how to referee papers, setting up what's essentially a garbage filter and letting the readers sort things out after that.
Papers can also be annotated, with comments attached to specific points in the manuscript. These features (annotation and the rating system) are the parts I'm most interested in seeing in action. As far as I can tell, they're going to allow anonymity, although you'll have to register (confidentially) to use these features. The guidelines for commenting and annotating seem reasonable:
1. Language that is insulting, inflammatory, or obscene will not be tolerated.
2. Unsupported assertions or statements should be avoided. Comments must be evidence-based, not authority-based.
3. When previously published studies are cited, they must be accurately referenced and, where possible, a DOI and link to a publicly accessible version supplied.
4. Unpublished data should be provided with sufficient methodological detail for those data to be assessed. Alternatively, a permanent Web link to such information should be provided.
5. Arguments based on belief are to be avoided. For example, the assertion "I don't believe the results in Figure 2" must be supported.
6. Discussions should be confined to the demonstrable content of papers and should avoid speculation about the motivations or prejudices of authors.
I can see that they've devoted some thought to what might happen. I think that this will be a critical-mass phenomenon - if enough papers get annotated and ranked, it'll become the norm. And if not, these features might wither on the vine, which would be a shame. I've registered with the site as of this morning. Let the experiment begin!
(More useful commentary here at Evolgen, at The Unbearable Lightness. . ., Sciencebase, Bugs n' Gas Gal, ContentBlogger, Digging Digitally, Evangelutionist, and Notes From the Biomass.)