Center for Open Science, Gary Charness, and other Researchers to Present at The Unjournal’s Co-hosted Event

On March 25-26, The Unjournal will host its inaugural online event, Innovations in Research Evaluation, Replicability, and Impact, in collaboration with the Center for Open Science and Effective Thesis. The event is scheduled in two sessions to accommodate both Asia-Pacific and Atlantic time zones, and registration is still open. A range of researchers and open-science advocates will present their work and initiatives, and engage in discussion and feedback.

From left to right: Amanda Metskas, Clare Harris, Gary Charness, and Macie Daley

Macie Daley (Project Coordinator at the Center for Open Science) will present a new initiative called “registered revisions”, as well as other ways that COS broadly supports replicability, research integrity, meta-science, and innovation. (Atlantic session)

Gary Charness will discuss “improving peer review in economics”. Charness worked jointly with Anna Dreber, Daniel Evans, Adam Gill, and Séverine Toussaert to survey over 1,400 economists to “(i) document the current state of peer review and (ii) investigate concrete steps towards improving it.” Charness is a Professor of Economics and the Director of the Experimental and Behavioral Economics Laboratory in the Department of Economics at UC Santa Barbara. (Asia-Pacific session)

Clare Harris and Amanda Metskas will host sessions on Incentivizing Best Practices Through Transparent Replications (by Clearer Thinking). They describe the project as follows. (Both Asia-Pacific and Atlantic sessions)

“We are testing a new approach to promoting open science and replicable research practices in experimental psychology. We replicate studies from randomly selected, newly-published papers from a predefined set of prestigious psychology journals, plus all newly-published psychology papers in Nature and Science. We rate studies on their transparency, replicability, and clarity in communicating findings, and publish the results on our website.” The authors emphasize the importance of unambiguously communicating whether and how researchers’ conclusions necessarily follow from the research results.

Takahiro Kubo and Robert Kubinec, Impactful Research Prize & Impactful Evaluator Prize winners

The event will recognize the winners of The Unjournal’s Impactful Research Prize and Impactful Evaluator Prizes. Participants in the Asia-Pacific session are invited to presentations by Takahiro Kubo (NIES & University of Oxford; Impactful Research Prize winner, “Banning wildlife trade can boost demand for unregulated threatened species”) and Robert Kubinec (NYU Abu Dhabi), one of the winners of the Impactful Evaluator Prize.

The event will also feature group discussions, feedback sessions, and lightning talk presentations by Jay Patel, Jonny Coates (ASAPbio), and Safieh Shah (Academia for Advocacy).

David Reinstein will explain The Unjournal’s approach to journal-independent public evaluation. He will host two discussion and feedback sessions:

  • Is journal-independent evaluation & rating (e.g., The Unjournal) better than the traditional model? How & why will this succeed or fail?

  • Structured evaluation metrics: (How) should we rate research?
