Application of OSS Pull Requests to Academic Citation to Incentivise Public-Participation Peer Review
By defining "push" in academic citation, drawing on open-source software (OSS) development, we demonstrate that citation metrics can incentivise reviewers as repository owners under open participatory peer review. Potential drawbacks and countermeasures are also discussed. venue: 'ScisciConference' paperurl: 'https://sciscijp.github.io/scisciconfJP2024/program/' type: 'Poster Presentation'
Hybrid Discipline: Knowledge Flows in Political Science over the Past Century
[Contributed Talk] Incentivised Open Participatory Peer Review without Rewards
By defining "push" in academic citation, drawing on open-source software (OSS) development, we demonstrate that citation metrics can incentivise reviewers as repository owners under open participatory peer review. Potential drawbacks and countermeasures are also discussed.
DNA barcoding of Research Diversity
Every university has its own research strengths and norms — a unique mixture of ideas, vocabularies, and disciplinary codes. Epistemic diversity is often celebrated as a hallmark of vibrant academic institutions, fostering innovation and the cross-pollination of ideas. University administrators are increasingly interested in fostering interdisciplinary research and in identifying measures that can cultivate, sustain, or restore the strength of research ecosystems. Many researchers have tried to quantify this diversity, but it is a tricky problem. Most publishers use subject classification systems to tag research articles: WoS categories and ASJC, to name a few. However, these systems are coarse-grained and of inconsistent granularity across disciplines. Many disciplines have their own coding systems — physics has PACS, astrophysics has AAS, biology and the medical sciences have MeSH terms, economics has JEL — but many others lack such systems. Recent advances in interdisciplinary research make it even harder to classify research into discrete categories. ...
Bento-kai: Making Better Science and Academic Ecosystem
You can see details here. Build a better ecosystem of science and academia, together. We invite you to "Bento-kai," a lunchtime study group. As we aim to innovate further and enhance research capabilities, we face many daily challenges in the mechanisms of science and academia. To address these, communities are actively forming across industry, government, and academia to design policies and systems and to develop services. However, given the complexity of these challenges, you may experience difficulties and frustrations in your own activities. We define endeavours that tackle challenges in science and academia as "Metascience," and, with the aim of sharing practical knowledge around Metascience and building an organic space for connections among participants, we have launched "Bento-kai," a brown-bag-style study group. ...
[Contributed Talk] Scientists Have an Inherent Prioritized Queue in Selecting Collaborations
title: "Scientists Have an Inherent Prioritized Queue in Selecting Collaborations" excerpt: The paper analyzes the temporal dynamics of co-authorship in scientific publications, finding a fat-tailed interval distribution between recurring collaborations, which can be explained by a proposed priority selection model simulating multi-agent team dynamics based on co-authorship willingness. The findings suggest non-Poisson activity patterns in scientific collaborations. venue: 'ICSSI' url: https://icssi2024.org/ (the website is no longer available) paperurl: 'http://academicpages.github.io/files/paper3.pdf' type: 'Contributed Talk' ...
From Audit to Dialogue: Research Evaluation's Radical Turn
What do research crowdfunding, citizen science, and blockchain technologies have in common? They represent a fundamental shift away from centralized authority toward distributed decision-making. This same shift is now reaching the heart of academic governance: how we evaluate research. I attended a symposium at Kyoto University on their new research evaluation framework, COMON. As an outsider — a researcher studying peer review systems — I witnessed something that goes far beyond the "responsible metrics" discourse of DORA or the collaborative spirit of CoARA. This is not another call for better measurement. It is a restructuring of institutional relationships, built on the recognition that universities must help research communities create entirely new resources rather than compete for a fixed pie. ...
Science of Science Tutorial
I contributed to ScisciConf. as the chief coder and supervisor of the scisci-handson tutorial. The code consists of four parts: 1. the data structure of OpenAlex, 2. clustering and visualization of disciplines, 3. research evaluation with the Wu disruptiveness index, and 4. researcher evaluation with the h-index. venue: 'ScisciConference' paperurl: 'https://github.com/ScisciJP/scisciJP2024_tutorial' type: 'Lecture'
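As an illustration of part 4 of the tutorial, here is a minimal sketch of the h-index computation — the largest h such that a researcher has at least h papers each cited at least h times. This is not code from the repository; the function name and example citation counts are illustrative.

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least
    h papers have at least h citations each."""
    # Sort citation counts in descending order, then walk down
    # the ranked list while the count still meets its rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Illustrative example: five papers with these citation counts.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

The same ranked-list idea underlies most h-index implementations; for real data one would feed in per-author citation counts retrieved from a source such as OpenAlex, as covered in part 1 of the tutorial.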
[Contributed Talk] The Age of Anticipation and the Better Way to Share Science
venue: 'DeSciTokyo Conference' url: 'https://desci-tokyo.jp/' type: 'Contributed Talk'