
Collaborative Community Review on PubPub

Published on Sep 19, 2019

In preparation for Peer Review Week, I wanted to take a closer look at some of the collaborative community review experiments that have happened recently on PubPub. Finding new ways to harness engagement in scholarly communications is a goal of the Knowledge Futures Group, and inline annotation is a technology that I rely upon every day to organize my thoughts and track my online reading. I reached out to the authors of three forthcoming MIT Press books that have undergone this type of review during the last year. I was excited to learn about their experiences and to share some of their observations here.

The projects I looked at were:

Data Feminism by Catherine D’Ignazio, Assistant Professor, Emerson College, and Lauren Klein, Associate Professor, Georgia Institute of Technology.

The Good Drone: How Social Movements Democratize Surveillance by Austin Choi-Fitzpatrick, Associate Professor of Political Sociology, University of San Diego, and Rights Lab Associate Professor of Social Movements and Human Rights, University of Nottingham.

Annotation by Remi Kalir, Assistant Professor, University of Colorado Denver, and Antero Garcia, Assistant Professor, Stanford University.

I asked these five authors how they learned about Community Review on PubPub, their process for dealing with annotations, their general experience going through the experiment, and whether they would open their next projects to such collaborative review. The authors’ intentions in collecting such feedback, the timing of this review within their overall publication process, and their way of responding to the reviews all contributed to their individual experiences.

Data Feminism

Lauren and Catherine had heard from MIT Press that PubPub was under development and that it would be available to pilot an open peer review process for Data Feminism. As authors of a book on intersectional feminism, they had a clear goal for engaging in a collaborative process, as they wished to include multiple perspectives to help do justice to many rich voices. “Criticism is a marker of respect and an acknowledgement that others see in us the ability to learn,” they noted. “Taking the time to teach us was an incredible act of generosity.”

They took considerable care in inviting individuals who might reflect upon the entirety of the work, as well as those with a particular interest in a specific chapter; by their estimate, more than two dozen people participated. In addition to inline discussion, they also received direct feedback through email. One person printed the entire work and annotated it by hand! In addition to the invited reviewers, the project was promoted on Twitter via their personal networks and by the MIT Press.

The project was open for collaborative review at the same time that it was undergoing traditional peer review. The authors found the feedback obtained via their publicly available draft as valuable as the suggestions in the traditional reviewer reports. Input from readers was incorporated into the online draft in preparation for the final publication, which will be substantially different from the original.

At PubPub we’ve found that author engagement contributes greatly to the number and thoughtful nature of annotations. Catherine and Lauren tried as much as possible to keep up with the annotations as they were submitted. “Generous feedback is harder than negative feedback,” they remarked. “You never know beforehand who will give the comment that will make the bell go off, crystallizing a new way to frame an argument.” They agreed that they would do it all over again.

The Good Drone

Austin went into this project with a desire to make it as open as possible—Open Access drives eyeballs after all! He wanted to be read and to also be understood. Academic publications support tenure and promotion, but he also cares about the world we are living in. On past projects, he had invited folks to read and provide feedback and had thoughtfully organized books so that more academically-oriented chapters could be skipped without impinging on the argument of the work. He was aware of the open source annotation tool Hypothesis (full disclosure: I used to work for Hypothesis and still use it every day), and he’d even considered hiring a vendor to create an annotatable site for this book. Fortunately, “in a super lucky meeting of the minds” PubPub was there and available to host The Good Drone.

Unlike Data Feminism, which was very much a work in progress, Austin’s book was nearly ready for publication. His decision to post it for an 8-week collaborative review would have an impact on the publication schedule. He too reached out directly and via Twitter to invite participation. In addition, he encouraged students to participate in the experience. Also in contrast to Lauren and Catherine, he wanted to wait to respond until the weeks-long review process was complete. “I wanted to make sure that those creating the annotations would have ample space to have their debates without interjecting my opinions,” he noted.

Overall, there wasn’t as much engagement as he had hoped, so not many changes were made as a result of the open review. However, Austin sees publication of the work as one of several steps in the open manuscript process. He hopes to make The Good Drone the most open book possible: publicly collected data stored in an open archive, going through an open review process, and publishing the book both in print and as Open Access. He is in talks with PubPub about the possibility of launching a living version of the book, which would evolve in response to new evidence on the ground, new findings in the literature, and new arguments from readers. He thinks of the project as a dashboard for the reception of a scholarly work as well as a model for how books can morph over time.

He notes that authors of such projects should consider the return on investment. It takes time to go through community feedback, so one needs to determine whether the payoff will be worthwhile. Nevertheless, if his next work is suitable for community review, he’d like to do it again.


Annotation

When writing about annotation, open review was the obvious path to take. “You need that meta-experience,” Remi and Antero stress. “Annotation has been around for millennia. We needed a practical way to showcase it.” For this project, the annotations created—and the process of their collection—will be incorporated into the final version of the work.

They were aware of PubPub and had even read through the feedback generated on the Data Feminism draft. They also took a close look at Frankenbook, published in 2018 to commemorate the 200th anniversary of the publication of Frankenstein and heavily annotated to highlight key themes.

They invited about half of those who ultimately participated in the review process. They too promoted the manuscript’s availability via Twitter as well as in conference presentations that occurred during the open review timeframe. The project was mentioned in conjunction with the posting of materials from the annual I Annotate conference, a must-go for annotation enthusiasts, which raised its visibility further.

Like Austin, Remi and Antero emphasize that such a collaborative review process seems like a lot more work for authors. “For transparency and collaboration to thrive, it often takes effort,” they say. The whole thing can add a sense of anxiety, particularly if traditional peer review is occurring in parallel. It changes the logistics and shifts the deadlines, so authors should be prepared. Like Austin, they largely waited until the review period had closed before addressing the majority of annotations. They wanted to let the feedback settle and marinate. (They also had their hands full during this time period with new additions to each of their families!)

In preparation for the open review, they created annotation guidelines, based in part on the Code of Conduct posted for Data Feminism as well as the Hypothesis Community Guidelines, and to date they haven’t noticed any problematic posts. At the time of our conversation, they were still working their way through the feedback and had not yet received their traditional peer reviewer reports. Even so, they mentioned the need to balance their reaction to the perceived authority of the traditional anonymous reviewers against their responses to remarks from the named annotators, many of whom they know and respect greatly. Although their experience is still in progress, they appear willing to try a similar undertaking in the future.

Experimenting with Collaborative Community Review

I enjoyed speaking with these authors. Their advice seems clear. Think about the goals you want to achieve with collaborative review. Be prepared for how the activity will affect publication timelines and, when necessary, how it will fit with traditional peer review. Make sure you have ample time to invite participants and to synthesize their feedback, whether you engage day-to-day or decide to wait. Be prepared to listen, and open yourself to the new perspectives that such an activity can offer.

One of the reasons I find PubPub so exciting as a platform is its experimental nature. Regardless of whether a project will eventually find its home on the platform, anyone can post a manuscript and invite different audiences, including the general public, to provide feedback through the annotation feature. (We’ll also be introducing functionality later this year that will streamline traditional peer review, so stay tuned!) We’re happy to hear from anyone who wants to give it a try. Thanks to everyone who has contributed feedback through collaborative review. And thanks to the authors of Data Feminism, The Good Drone, and Annotation who have been so generous in sharing their experiences so far. We’re thrilled to be a part of your authoring journey!

Header image: Photo by Kari Shea on Unsplash

