“In the long history of humankind, those who learned to collaborate and improvise most effectively have prevailed.” — attributed to Charles Darwin
Open, collaborative research accelerates scientific discovery. In few fields does speed matter more than biomedicine, where individual labs and clinics hold critical clues to life-saving therapies. Yet there are serious roadblocks to sharing data and insights. First, team science requires time and attention — both in short supply for clinical researchers. Second, the current incentive system of ‘publish or perish’ positions collaborators as competitors. Our solutions for open sharing include tools, editorial facilitation, and a new reward metric.
Through a carefully designed architecture of attention, tools can “enable us to scale up creative conversation, so connections that would ordinarily require fortuitous serendipity instead happen as a matter of course… amplifying collective intelligence.”1
Tools on the Collaboration Platform allow multi-institutional, multidisciplinary teams to:
- Post research results and clinical observations, discuss with peers, and update with new evidence and insights
- Join and form groups based on topics, tasks, and group identities — sharing with narrower or wider circles of access
- Engage in “organic peer review” to promote iteration, replication, and more reliable findings in open access repositories
Rapid discovery requires open communication, “organic” peer review among collaborators, and iteration of results. Yet collaboration places a burden on scientists’ time and attention. We are refining processes that reduce that burden by deploying our editorial subject experts to seed, mine, and traffic discussions and content — reversing the role of traditional journal editors. Instead of serving as gatekeepers, our facilitators work as peers with research teams, helping to communicate findings and insights to an ever wider audience.
Rapid Science Open, a work in progress, extends the Collaboration Platform: early findings, insights, and reviews are submitted to a preprint repository, with optional peer-reviewed publication.
- Continually updated Evidence Reviews are state-of-the-art, nuanced dispatches written and maintained by our editorial staff. Project participants are co-authors, contextualizing their latest findings and insights with the current published evidence.
- Micro-results from the team – e.g., negative results, N-of-1 studies, posters, and incomplete datasets (often lost in closed notebooks) – can be cited and given provenance in the Evidence Review to challenge or support the published evidence.
- Cases Central is a database for submitting and discussing anonymized patient data, such as outlier responses to trial drugs and combinations. Data are submitted via a template, as free text, or as an uploaded document, and are standardized by Rapid Science staff.
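The submission-by-template path could look like the following minimal sketch. The field names, the `new_case` helper, and the validation step are illustrative assumptions, not the actual Cases Central schema.

```python
# Hypothetical case-report template; every field name here is an
# illustrative assumption, not the real Cases Central schema.
import json

CASE_TEMPLATE = {
    "case_id": None,       # anonymized identifier, no patient-identifying data
    "diagnosis": None,
    "intervention": None,  # e.g., a trial drug or combination
    "response": None,      # e.g., "outlier responder"
    "notes": None,         # free text
}

def new_case(**fields):
    """Fill the template, rejecting unknown fields so that
    submissions stay standardized."""
    unknown = set(fields) - set(CASE_TEMPLATE)
    if unknown:
        raise ValueError(f"unexpected fields: {sorted(unknown)}")
    return {**CASE_TEMPLATE, **fields}

case = new_case(case_id="RS-0001", diagnosis="melanoma",
                intervention="drug A + drug B", response="outlier responder")
print(json.dumps(case, indent=2))
```

Rejecting unknown fields at submission time is one simple way to keep template-based entries uniform before editorial standardization.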
“All that’s needed for open science to succeed is for the sharing of scientific knowledge in new media to carry the same kind of cachet that papers do today.”1
To promote and reward collaboration we’re developing a “C-score” that measures the quality and quantity of collaborators’ contributions to group efforts. The score will take into account the full variety of tasks occurring on the collaboration platform – sharing, commenting, rating, reviewing, and authoring. It will be formulated by reputation experts drawing from the insights of mathematicians, computer scientists, economists, and social scientists, and vetted by funding officials, academic administrators, and researchers themselves (see executive summary).
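Since the C-score is still being formulated, the following is only a hedged sketch of the general shape such a metric might take: a weighted sum over activity counts, scaled by a peer-assigned quality factor. The activity names, weights, and `c_score` function are all assumptions for illustration, not Rapid Science's formula.

```python
# Hypothetical C-score sketch; the weights and activity categories
# below are illustrative assumptions, not the metric under development.
from collections import Counter

# Assumed per-activity weights, loosely ordered by effort involved.
WEIGHTS = {"share": 1.0, "comment": 2.0, "rating": 0.5,
           "review": 5.0, "author": 8.0}

def c_score(activities, quality):
    """Combine quantity (a list of activity labels) with a
    peer-assigned quality factor in [0, 1] per activity type."""
    counts = Counter(activities)
    return sum(WEIGHTS[a] * n * quality.get(a, 1.0)
               for a, n in counts.items())

score = c_score(
    ["share", "comment", "comment", "review"],
    {"comment": 0.8, "review": 0.9},
)
print(round(score, 2))
```

In practice the real formulation would need to resist gaming (e.g., high volumes of low-effort actions), which is presumably why the document calls for reputation experts and community vetting.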
1 Michael Nielsen, Reinventing Discovery: The New Era of Networked Science (Princeton University Press, 2011).