Very interesting piece in the New York Times about putting academic journal articles online for peer review. Normally, a manuscript gets several anonymous reviews, at which point the editor (e.g. me at The Latin Americanist) decides whether to accept (very rare), revise and resubmit, or reject.
From the article: "Mixing traditional and new methods, the journal posted online four essays not yet accepted for publication, and a core group of experts — what Ms. Rowe called 'our crowd sourcing' — were invited to post their signed comments on the Web site MediaCommons, a scholarly digital network. In the end 41 people made more than 350 comments, many of which elicited responses from the authors. The revised versions were then reviewed by the quarterly's editors, who made the final decision to include them in the printed journal, due out Sept. 17."
My first reaction is that there is a "too many cooks" element here. As an editor at even a small journal, I can tell you it is very time-consuming to read through both the articles and the reviews. More than 350 "mini-reviews" spread across four manuscripts means trying to sift through too much. And it is not only the bulk; it is also the difficulty of judging the qualifications of the people making those suggestions. When I solicit reviews, I spend quite a lot of time verifying that the reviewer is qualified.
See more at The Monkey Cage, which floats the idea of letting people register and comment on articles after they are published. That sounds intriguing, and with good moderation it could spark interesting debates.