Scholarly Publishing - Digital Science
https://www.digital-science.com/blog/tags/scholarly-publishing/
Advancing the Research Ecosystem

Applications now open for the 2026 APE Award for Innovation in Scholarly Communication
https://www.digital-science.com/blog/2025/10/applications-open-2026-ape-award-innovation-scholarly-communication/
Tue, 28 Oct 2025 09:30:32 +0000

Inviting applications globally for the 2026 APE Award for Innovation in Scholarly Communication.

Academic Publishing in Europe (APE) award celebrates pioneers in their field

London, UK & Berlin, Germany—Tuesday 28 October 2025

Digital Science and the Berlin Institute for Scholarly Publishing (BISP) invite applications for the 2026 APE Award for Innovation in Scholarly Communication.

Now in its fourth year, the award will be presented at the 21st Academic Publishing in Europe (APE) Conference in Berlin (13–14 January 2026).

The award is given to an individual who has brought innovation in scholarly communication to the world of research and the academic publishing community. The winner will receive a €1,000 prize, along with travel support and free attendance to the conference.

The closing date for applications is Friday 28 November 2025 – see full application details here.

Since its launch, the APE award has recognized a diverse range of innovators working to improve scholarly communication.

Past recipients include:

  • Vsevolod Solovyov (2023) for his work on an online platform that recommends grant reviewers to the European Research Council
  • Laura Feetham-Walker (2024) for advancing academic peer review, through training and certification
  • Dr Raym Crow (2025) for pioneering mission-driven, sustainable open publishing models

Dr Daniel Hook, CEO of Digital Science, said: “We are honored to once again partner with BISP to celebrate individuals who – through their vision and passion – are redefining how research is shared with the world.

“Innovation in scholarly communication isn’t just about technology or products, it’s about new ways of thinking, new business models, and collaborations. We look forward to seeing creative nominations from across the global research community.”

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media Contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Digital Science investigation shows millions of taxpayers’ money has been awarded to researchers associated with fictitious network
https://www.digital-science.com/blog/2025/09/taxpayers-money-awarded-to-researchers-associated-with-fictitious-network/
Thu, 04 Sep 2025 13:00:44 +0000

Digital Science investigations show researchers associated with a fictitious research network and funding source have netted millions of taxpayers' dollars in funding.

Thursday 4 September 2025 – London, UK and Chicago, USA

Researchers associated with a fictitious research network and funding source have collectively netted millions of dollars of taxpayers’ money for current studies from the United States, Japan, Ireland, and other nations. That’s according to investigations led by Digital Science’s VP of Research Integrity, Dr Leslie McIntosh.

The results of her investigations raise serious concerns about the lack of accountability for those involved in questionable research publications.

“This example illustrates how weaknesses in research and publishing systems can be systematically exploited, so that researchers can game the system for their own benefit,” Dr McIntosh says.

Dr McIntosh – one of the co-founders of the Forensic Scientometrics (FoSci) movement – has presented her analysis at this week’s 10th International Congress on Peer Review and Scientific Publication in Chicago, in a talk entitled: Manufactured Impact: How a Non-existent Research Network Manipulated Scholarly Publishing.

While not naming the individual researchers involved, Dr McIntosh’s presentation centered on a group known as the Pharmakon Neuroscience Network, a non-existent body listed on more than 120 research publications between 2019 and 2022, until it was exposed as fictitious. These publications involved 331 unique authors and were associated with 232 organizations and institutions across 40 countries.

Research network raised multiple red flags

The Pharmakon Neuroscience Network functioned as a loosely organized collaboration of predominantly early-career researchers, such as postdoctoral researchers and PhD students, whose publications included:

  • Funding acknowledgments citing unverifiable organizations
  • Questionable or unverifiable institutional affiliations
  • Suspiciously large citation counts accrued in a short timeframe
  • Global connectedness despite a young publication age

“Despite clear concerns about the legitimacy of their work, only three papers have been formally retracted to date,” Dr McIntosh says.

Using Digital Science’s research solutions Dimensions and Altmetric, Dr McIntosh and colleagues have tracked the progress of the authors connected with this network.

“Once the Pharmakon Neuroscience Network was exposed as being fake in 2022, it no longer appeared on publications, but many of the researchers associated with it have continued to publish and attract significant funding for their work,” she says.

Millions in funding for current research

Of the initial 331 researchers associated with the Pharmakon Neuroscience Network’s publications, Dr McIntosh has established that more than 20 currently hold funding, either as a Principal Investigator or a Co-Principal Investigator, on grants that commenced in 2022 or later. During this time, those researchers have collectively been awarded the equivalent of at least US$6.5 million from six countries – the US, Japan, Ireland, France, Portugal, and Croatia – plus an undisclosed sum from Russia, making seven countries in all.

One researcher with more than US$50 million in funding has authorship on one of the Pharmakon papers. It is not clear whether he knowingly participated in the network or whether the paper arose from the activity of a former student.

“Many of the researchers had grants before and after Pharmakon. This is legitimate taxpayer money that, in most instances, is funding very unethical practices,” Dr McIntosh says.

“One aspect we need more time to vet is the possibility that a few of these researchers do not know they were authors on papers within this network. We are still completing this work.”

Of the funded researchers, five had never previously received funding for their research, but following their involvement with the Pharmakon Neuroscience Network they have since been awarded grants from the following sources (US$ equivalent):

  • Science Foundation Ireland – $649,891
  • Ministry of Science, Technology and Higher Education (Portugal) – $538,904 total
  • Croatian Science Foundation – $206,681
  • Russian Science Foundation – undisclosed sum

“Here we have evidence that some authors have secured legitimate funding, including large sums of taxpayers’ money, following their participation in questionable research and publication activity,” Dr McIntosh says.

“We can presume that their publication portfolio, no matter how it was obtained, helped in securing this funding from legitimate sources.”

Dr McIntosh says this case has implications across the research system and emphasizes the need for stronger verification, monitoring, and cooperation.

“Although most of these publications remain in circulation and have been cited widely, corrective actions have been limited. This highlights the challenge of addressing such networks once their work is embedded in the scholarly record,” she says.

Recommendations

Dr McIntosh recommends the following:

  • Reinforce oversight by requiring the use of verified institutional identifiers, such as GRID or ROR, in all publications, ensuring affiliations are legitimate and traceable.
  • Mandate transparency through clearer author contribution statements and verified funding acknowledgments, creating a more reliable and accountable record of how research is conducted and supported.
  • Improve monitoring mechanisms by supporting the adoption of forensic scientometrics, which can detect unusual collaboration patterns or questionable authorship practices before they become systemic.

“By addressing these gaps, governments, publishers and research institutions alike can help protect the integrity of the research system and ensure that trust in science is maintained,” Dr McIntosh says.

See further detail about this investigation in Dr McIntosh’s blog post: From Nefarious Networks to Legitimate Funding.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Emerald Publishing to safeguard research integrity with Dimensions Author Check
https://www.digital-science.com/blog/2025/08/emerald-publishing-to-safeguard-research-integrity-with-dimensions-author-check/
Wed, 06 Aug 2025 09:23:12 +0000

Emerald Publishing has adopted Dimensions Author Check from Digital Science as part of Emerald’s ongoing commitment to research integrity.

Wednesday 6 August 2025

Digital Science is pleased to announce that Emerald Publishing has adopted Dimensions Author Check as part of Emerald’s ongoing commitment to research integrity.

Dimensions Author Check offers publishers a fast and reliable way to incorporate research integrity checks into their work, helping to support responsible and ethical publishing.

Built on Digital Science’s Dimensions – the world’s largest interconnected global research database – Dimensions Author Check offers unmatched transparency into authors’, editors’ and reviewers’ publishing and collaboration histories, accessible through an intuitive and visual dashboard.

Using the dashboard, publishers can thoroughly review the publishing history of a researcher and the people they’ve collaborated with, to spot any unusual activities, such as retractions, expressions of concern, or atypical collaboration patterns.

Sally Wilson, VP Publishing at Emerald, said: “The primary use case for Author Check is to support our due diligence processes when developing and reviewing new special issue proposals. It allows us to efficiently verify the academic credentials, publication history, and editorial experience of proposed guest editors and contributors, helping ensure they meet our editorial standards and ethical expectations.

“Additional use cases are to support our editor succession planning and commissioning activities, offering valuable insights into potential candidates’ research impact and professional networks.

“We hope by integrating Author Check into these workflows, we not only enhance the integrity and transparency of our editorial decision-making but also save time by streamlining what would otherwise be manual and time-consuming processes.”

Dr Leslie McIntosh, VP of Research Integrity at Digital Science, said: “We’re excited that Emerald has become the latest publisher to adopt Dimensions Author Check, further boosting Emerald’s commitment to supporting the integrity of the scholarly record.

“Dimensions Author Check empowers publishers to uphold trust and transparency in research by ensuring they have the best possible information at their fingertips – within seconds.”


About Emerald

Founded by management scholars in 1967, and now part of the Cambridge Information Group, Emerald Publishing provides a range of publishing services to help researchers tell their stories in a meaningful and timely way, providing innovative tools and services to build confidence and capability in impactful research. As a proud signatory of DORA, Emerald is committed to establishing new pathways to impact, making research more accessible, and helping communities make decisions that change their world for the better.

For over 55 years Emerald’s core purpose has been to champion fresh thinkers and help them make a difference so that little by little those in academia or in practice can unite to bring positive change in the real world. Emerald Publishing is proud to be a Times Top 50 Employer for Gender Equality 2025 – for the second year in a row – and one of the Top 50 Inspiring Workplaces in the UK and Ireland for 2025.

About Dimensions

Part of Digital Science, Dimensions hosts the largest collection of interconnected global research data, re-imagining research discovery with access to grants, publications, clinical trials, patents and policy documents all in one place. Follow Dimensions on Bluesky, X and LinkedIn.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry, and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media contacts

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Tom Shiels, Communications Manager, Emerald Publishing: tshiels@emerald.com

Bloomsbury partners with Digital Science to monitor online impact
https://www.digital-science.com/blog/2025/07/bloomsbury-partners-with-digital-science-to-monitor-online-impact/
Wed, 30 Jul 2025 09:57:16 +0000

Bloomsbury Academic & Professional is partnering with Digital Science to introduce Altmetric Explorer across the whole of its ebook platform, Bloomsbury Collections.

Wednesday 30 July 2025

Bloomsbury Academic & Professional today announced that it is partnering with Digital Science, a technology company serving stakeholders across the research ecosystem, to introduce Altmetric Explorer across the whole of its ebook platform, Bloomsbury Collections.

The introduction, which will go live on July 31st, will bring better data and intelligence on the impact of Bloomsbury Academic’s work published globally.

Altmetric badges and data will be displayed on all content on the platform, providing users with quick access to online attention information. The suite of metrics, across all Bloomsbury Collections content, will enable Bloomsbury, its authors and the academic community to understand the reach, impact and value of its publishing, signposting trends and providing insight for future editorial strategy.  It will also inform future platform development.

Pooja Aggarwal, Director of Academic and Professional Publishing, commented, “Intelligent analytics data is a vital component in creating a great experience for all our customers and authors. It will improve our publishing strategy, including the way we structure our content in order to improve reader engagement. We also know that this introduction will greatly benefit our ground-breaking open access programme, measuring the widest audience and supporting the personal profile and career of the funding author.”

Altmetric Explorer tracks social attention by detecting links and citations to research in non-traditional literature, such as patent documents, policy papers, social media, news or blog sites. Additionally, it tracks sharing and discussion on social media; all global attention is then aggregated in a large searchable database which can be used to track the attention, re-use and impact of published research or a research topic.
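As an illustration of the kind of aggregation this sort of attention data enables, here is a minimal Python sketch. The record layout, field names, and numbers below are assumptions for illustration only – they are not a documented Altmetric schema or real attention figures.

```python
# Sketch: aggregating online-attention counts for a single publication.
# The field names (cited_by_*_count) and the record itself are
# illustrative assumptions, not a documented Altmetric response shape.

def total_attention(record: dict) -> int:
    """Sum the per-source mention counts present in an attention record."""
    sources = ("news", "blogs", "policy", "patents", "x", "bluesky")
    return sum(record.get(f"cited_by_{s}_count", 0) for s in sources)

# Illustrative record for one DOI (fabricated numbers).
record = {
    "doi": "10.1000/example",
    "cited_by_news_count": 12,
    "cited_by_blogs_count": 3,
    "cited_by_policy_count": 1,
    "cited_by_patents_count": 0,
}
print(total_attention(record))  # → 16
```

Missing sources simply contribute zero, so the same function works across records tracking different subsets of attention channels.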

Bloomsbury Collections contributes forward-thinking scholarship to the global academic community, offering perpetual access to more than 30,000 individual eBooks across 45+ subject areas in the arts, humanities, and social sciences – available on a title-by-title basis or in curated Collections. Options are flexible to help librarians complete their collections by imprint, subject, collection year, or selected series.

About Bloomsbury

Bloomsbury is a leading independent publishing house, established in 1986, with authors who have won the Nobel, Pulitzer and Booker Prizes, and is the originating publisher and custodian of the Harry Potter series. Bloomsbury has offices in London, New York, Maryland, New Delhi, Oxford and Sydney.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media enquiries

Tash Payne, Senior Corporate Communications Manager, Bloomsbury: tash.payne@bloomsbury.com

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

This press release has been supplied by Bloomsbury and was originally published here.

Digital Science relaunches Scientometric Researcher Access to Data (SRAD) program
https://www.digital-science.com/blog/2025/07/digital-science-relaunches-scientometric-researcher-access-to-data-srad-program/
Tue, 22 Jul 2025 08:49:52 +0000

Digital Science has reaffirmed its commitment to supporting the global scientometric research community by relaunching the Scientometric Researcher Access to Data (SRAD) program.


Access to Altmetric and Dimensions data is now boosted with Dimensions on BigQuery for researchers in the scientometrics field

Tuesday 22 July 2025

Digital Science today reaffirms its commitment to supporting the global scientometric research community and the study of scholarly literature, by relaunching its Scientometric Researcher Access to Data (SRAD) program.

This revitalized initiative will offer scientometric researchers streamlined, no-cost access to Digital Science’s Altmetric and Dimensions data, and is now further expanded by offering access to Dimensions on BigQuery.

The SRAD program is available to scientometrics researchers involved in non-commercial scientometric studies, empowering them to more easily answer system-wide research questions about scholarly literature and its impact.

To lead this important effort and build a thriving global community of expert users, Digital Science has appointed Kathryn Weber-Boer to the position of Director Scientometrics – Scientometric Researcher Engagement. Ms Weber-Boer brings deep expertise in scientometrics, academic engagement, and advanced analytics. 

Ms Weber-Boer said: “This program plays an important role in Digital Science’s commitment to open research and improving research. I am honoured to be in the position of driving strategic outreach, program design, and community leadership, to help researchers maximize the impact of Digital Science tools.

“By expanding access to Dimensions on GBQ, we’re excited to enable scientometrics researchers to answer complex questions with big data, exploring and linking more datapoints, and connecting our world-leading Dimensions data to other open datasets.

“The SRAD program is built around key principles of accessibility, responsible data use, and community empowerment. Through tailored training and dynamic community engagement, it’s our hope that we can contribute to driving innovation in the fields of Scientometrics, Research Policy, and Innovation Studies,” she said.

About Dimensions

Part of Digital Science, Dimensions hosts the largest collection of interconnected global research data, re-imagining research discovery with access to grants, publications, clinical trials, patents and policy documents all in one place. Follow Dimensions on Bluesky, X and LinkedIn.

About Altmetric

Altmetric is a leading provider of alternative research metrics, helping everyone involved in research gauge the impact of their work. We serve diverse markets including universities, institutions, government, publishers, corporations, and those who fund research. Our powerful technology searches thousands of online sources, revealing where research is being shared and discussed. Teams can use our powerful Altmetric Explorer application to interrogate the data themselves, embed our dynamic ‘badges’ into their webpages, or get expert insights from Altmetric’s consultants. Altmetric is part of the Digital Science group, dedicated to making the research experience simpler and more productive by applying pioneering technology solutions. Find out more at altmetric.com and follow @altmetric on X and @altmetric.com on Bluesky.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.


Media contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Digital Science to strengthen research integrity in publishing with new Dimensions Author Check API
https://www.digital-science.com/blog/2025/07/strengthen-research-integrity-in-publishing-with-dimensions-author-check-api/
Wed, 16 Jul 2025 08:54:05 +0000

Scholarly publishers can now fully integrate research integrity checks into their editorial and submission workflows with Dimensions Author Check API.

Wednesday 16 July 2025

Scholarly publishers can now fully integrate research integrity checks into their editorial and submission workflows, thanks to Digital Science’s new Dimensions Author Check API, which launches today.

Built on Dimensions – the world’s largest interconnected global research database – Dimensions Author Check evaluates researchers’ publication and collaboration histories within seconds, delivering reliable, concise, structured insights.

For the first time, the new Dimensions Author Check API enables publishers to embed this functionality directly into their own workflows, without the need to switch to an outside platform.

Dr Leslie McIntosh, Vice President of Research Integrity at Digital Science, said Dimensions Author Check API is designed to support consistent and confident editorial decision-making.

“By highlighting key indicators of research integrity – such as retractions, tortured phrases, or unusual co-authorship patterns – the Dimensions Author Check API helps to rapidly identify potential issues for concern. These include continuously improving indicators that will identify paper mills and increase trust in science,” Dr McIntosh said.

“Importantly, the Author Check API can do this at scale, giving publishers the ability to screen multiple researchers per request. This makes it ideal for high-volume manuscript processing and broader editorial oversight.”

Key benefits of the new Dimensions Author Check API include:

  • Seamless integration: A standards-based RESTful API designed for easy deployment within publishers’ internal systems or third-party platforms.
  • Actionable insights: Clear summaries highlighting key aspects of researchers’ publication and collaboration histories.
  • Operational efficiency: Reducing editorial workload while enhancing the quality and consistency of integrity assessments.
  • Support for transparency and trust: Surfacing critical integrity information at key decision points, strengthening publishers’ ability to adhere to ethical standards.
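To illustrate how batch screening of this kind might sit in an editorial workflow, here is a minimal Python sketch. The response shape, field names, and flagging threshold are assumptions for illustration only – they are not the documented Dimensions Author Check API schema, and the records are fabricated.

```python
# Sketch: flagging submitted authors for editorial review based on
# integrity signals. The field names (retractions, unusual_coauthorship)
# and the batch below are illustrative assumptions, not the documented
# Dimensions Author Check API response format.

def flag_authors(results: list[dict], max_retractions: int = 0) -> list[str]:
    """Return researcher IDs whose records warrant a closer editorial look."""
    flagged = []
    for r in results:
        if (r.get("retractions", 0) > max_retractions
                or r.get("unusual_coauthorship", False)):
            flagged.append(r["researcher_id"])
    return flagged

# Illustrative screening results for three submitted authors.
batch = [
    {"researcher_id": "ur.0001", "retractions": 0, "unusual_coauthorship": False},
    {"researcher_id": "ur.0002", "retractions": 2, "unusual_coauthorship": False},
    {"researcher_id": "ur.0003", "retractions": 0, "unusual_coauthorship": True},
]
print(flag_authors(batch))  # → ['ur.0002', 'ur.0003']
```

Because the check runs over a whole batch per request, the same pattern scales to high-volume manuscript pipelines, as the announcement describes.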

Note to editors: The Dimensions Author Check dashboard was originally announced in December last year. This announcement is specific to the Dimensions Author Check API, which launches today.

About Dimensions

Part of Digital Science, Dimensions hosts the largest collection of interconnected global research data, re-imagining research discovery with access to grants, publications, clinical trials, patents and policy documents all in one place. Follow Dimensions on Bluesky, X and LinkedIn.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

Access vs Engagement – is OA enough?
https://www.digital-science.com/blog/2025/07/access-vs-engagement-is-oa-enough/
Tue, 01 Jul 2025 13:35:55 +0000

How do we know if Open Access research is having its intended impact?

Making research Open Access (OA) is one major step in the process, but how do we know if OA research is having its intended impact? Ann Campbell and Katie Davison share the results of their investigations and some lessons for the future of OA.

Reaching OA’s potential

One of the principal aims of Open Access (OA) has always been to democratize knowledge by making research free to read; however, that should be the starting point, not the ultimate goal. Perhaps it’s time to step back and ask ourselves, “Are we in danger of becoming preoccupied with the ‘access’ aspect of open – neglecting the other components that make research successful?”

In our rush to remove paywalls and ‘financial barriers’, could it be that we are simply equating ‘freely available’ to ‘truly accessible’? How valuable is making research content accessible without it being discoverable? And how beneficial is it for an end user to find content if they don’t see its relevance, or if they can’t act on it?

Access alone isn’t enough. If research isn’t discoverable, understandable, or actionable for the people who need it (policymakers, practitioners, researchers across regions and community organizations), then OA has fallen short of its full potential. 

The ability to get research into the hands of those who can fully capitalize on it is a crucial factor in research success, but in practice, significant gaps and disconnects are evident – particularly from a data and systems perspective. We have made huge progress in terms of the volume of research that is technically ‘open’; however, we now need to find out who is actually benefiting.

The current narrative suggests that OA articles are more likely to be cited – but our data suggest this isn’t universally true, or at least that there is more to the story. In addition, citations alone don’t tell us who’s engaging with the content or whether it’s reaching communities outside of academia.

If equity in research means the ability to publish and participate in research fairly (regardless of location, career stage or discipline), should we accept that the measure of success is whether an article has been published OA? Or should we be measuring success based on whether the research achieves its intended aims, reaches its intended audience, and enables meaningful participation across global research communities?

This blog will look at what ‘access’, taken in isolation, is and what it isn’t. Using data from Dimensions, extracted from the Dimensions on Google BigQuery (GBQ) environment alongside World Bank data on GBQ, we challenge the notion that an emphasis on publishing OA is enough to ensure equitable participation. We explore what happens when we focus on access without discoverability. We assess whether research participation is happening in a balanced way or whether there are barriers to journal publication – including but not limited to Article Processing Charges (APCs) – and engagement.

To help us with this, we have conducted a benchmarking and data interpretation exercise to understand the wider problem of participation in research. 

SDGs case study

Let’s begin with a common assumption: that publishing is the ultimate goal for a researcher, and that lower-middle and low-income countries struggle to publish OA at the same rate as upper-middle and high-income countries due to the financial challenges associated with APCs.

The visual on the left (in Chart 1) shows the number of gold OA articles published in 2023. This view alone might suggest that lower-income countries are being prevented from publishing OA compared to upper-income countries. However, benchmarking against the overall amount of research from these regions shows the reverse: low-income countries (LICs) and lower-middle-income countries (LMICs) are producing proportionately more OA content.

Chart 1. Open Access articles as a portion of overall research BY income level versus as a portion of overall research AT income level. Dimensions data filtered by 2023 pub year, research article document type and SDG 4. Accessed 28/02/2025.
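The benchmarking logic behind Chart 1 can be sketched as follows. The counts below are invented stand-ins for the real Dimensions figures; the point is that ranking groups by absolute gold OA volume gives one ordering, while ranking by OA share of each group’s own output reverses it.

```python
# Illustrative sketch of the Chart 1 benchmark: absolute gold OA counts
# vs gold OA as a share of each income group's total output.
# All numbers are made up for illustration, not real Dimensions data.

def oa_share(gold_oa: int, total: int) -> float:
    """Gold OA articles as a proportion of a group's total output."""
    return gold_oa / total

# Hypothetical 2023 counts per World Bank income group.
groups = {
    "high":         {"gold_oa": 800_000, "total": 2_400_000},
    "upper_middle": {"gold_oa": 500_000, "total": 1_400_000},
    "lower_middle": {"gold_oa": 200_000, "total": 450_000},
    "low":          {"gold_oa": 15_000,  "total": 30_000},
}

shares = {g: oa_share(v["gold_oa"], v["total"]) for g, v in groups.items()}

# Ranked by absolute volume, high-income dominates; ranked by share of
# each group's own output, the ordering reverses.
by_volume = sorted(groups, key=lambda g: groups[g]["gold_oa"], reverse=True)
by_share = sorted(shares, key=shares.get, reverse=True)
```

With these illustrative counts, `by_volume` starts with the high-income group while `by_share` starts with the low-income group – the same reversal the chart shows.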

With this data in mind, we set aside the notion that a general analysis of open participation will drive further insight, and shift to participation at journal level. For this analysis, it is useful to frame participation in these terms: where there is intent to contribute to a research topic, is that intent being met or prevented through journal selection and traditional impact measures?

To see this in action, we decided to focus this case study on Indonesian researchers’ contribution to SDG 4, Quality Education.

  • We focused on Indonesia because in 2023 Indonesia was the second-highest producer of research articles among LMIC countries with a high amount of OA content. (NB: We will not delve into the reasons behind Indonesia’s high output in this piece.)
  • We focused on SDG 4 because Indonesian researchers produced a substantial, and outsized, amount of Quality Education research – more than any other country, and roughly 10% of overall research aligned to SDG 4 (as seen in Chart 2).
Chart 2. The total publications of research aligned to SDG 4, in 2023, by country. Dimensions data filtered by 2023 pub year, research article document type and SDG 4. Accessed 28/02/2025.

In a world where participation in global research was truly balanced and contributions to knowledge were reflected proportionally, if Indonesia contributes 10% of overall quality education research, we would hope to see that 10% Indonesian representation at journal level as well.

To view this, we analyzed journals publishing the most research articles aligned with SDG 4 and benchmarked them against common markers for citation impact and attention. We then assessed the representation of Indonesian research within these journals. Specifically, we calculated the proportion of SDG 4-aligned research with at least one Indonesian-affiliated researcher, measured against the expected 10% representation rate. The results are shown in the visual below (Chart 3).

Chart 3. Balanced representation for Indonesia? This chart shows the journals that produce some of the highest amounts of journal article content aligned to SDG 4, by citation and Altmetric averages. The size of each bubble relates to the proportion of research articles in that journal with at least one author affiliated with an organization in Indonesia.
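The journal-level representation check can be sketched like this. The record layout, journal names in the sample, and counts are hypothetical stand-ins for the Dimensions query; only the 10% benchmark comes from the analysis above.

```python
from collections import defaultdict

EXPECTED_RATE = 0.10  # Indonesia's approximate share of global SDG 4 output

def representation_rates(articles):
    """Per-journal share of articles with >=1 Indonesia-affiliated author.

    `articles` is an iterable of (journal, author_countries) pairs; this
    shape is a hypothetical simplification, not the Dimensions schema.
    """
    totals = defaultdict(int)
    with_indonesia = defaultdict(int)
    for journal, countries in articles:
        totals[journal] += 1
        if "ID" in countries:  # ISO 3166 code for Indonesia
            with_indonesia[journal] += 1
    return {j: with_indonesia[j] / totals[j] for j in totals}

# Toy data: 1 of 50 articles in Journal A, 6 of 40 in Journal B.
sample = (
    [("Journal A", ["US", "GB"])] * 49 + [("Journal A", ["ID"])]
    + [("Journal B", ["ID", "MY"])] * 6 + [("Journal B", ["CN"])] * 34
)

rates = representation_rates(sample)
underrepresented = {j for j, r in rates.items() if r < EXPECTED_RATE}
```

In this toy sample, Journal A sits at 2% (well below the benchmark, like the hybrid title in Chart 3) while Journal B reaches 15%.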

Our journal-level analysis revealed that the desired 10% participation rate was not met: there was an imbalance in the level of Indonesian research present within these journals. Notably, this imbalance occurred across varying access types and associated publication fees. At the top, Education and Information Technologies – our highest-cited journal, a hybrid title – showed ~2% Indonesian representation. Education Sciences, a gold title with a mid-range citation average, had less than 1%. The largest portion of Indonesian research appeared at the bottom left, in two diamond-access, regional titles with lower average scores in both citation and attention.

One barrier may be APCs, which are usually higher for market-leading, established journals. (We’d highlight that Cogent Education is the closest to meeting the 10% participation rate; it charges an APC but offers waivers for LICs and LMICs.) However, this is just one of many potential barriers to equitable participation, and one addressed by programs like Research4Life and publisher-led global discounting practices. Our focus here was viewing the research holistically, taking into account how open practices have supported or hindered participation through both journal selection and research impact.

This view (Chart 3) highlights the challenge publishers face in balancing publication preferences (what motivates or prevents a researcher from selecting a given journal) with readership habits, which encompass both accessibility and discoverability – the kind of discoverability established journals typically offer. The low metrics for the diamond OA journals (bottom left, Chart 3) illustrate the challenge journals face in ensuring research reaches readers.

Publisher mediation

To look more closely at the two sides publishers must mediate between to ensure research meets its potential, we first focus on publication preferences. Many publishers aim to remove participation barriers so that quality research can be shared in a balanced, fully representational way. How can publishers work to ensure this proportional representation?

One approach is reducing APC costs; another is raising awareness. Emerald Publishing uses Dimensions data to benchmark the locale of research relative to our journal-level subjects and to balance Editorial Advisory Board (EAB) selection proportionally. This practice aims to show publishers and editors where the research is coming from, without compromising EAB selection quality, addressing the issue at journal level regardless of access type or other unintended barriers.

The other aspect of this publisher mediation – and the one crucial to ensuring research is seen by its intended audience – is understanding reader habits. It is important to understand the difference between making research merely openly accessible and making it accessible, findable, and usable. Access in isolation, without the discoverability to ensure the work reaches the end user, is not enough.

Below we can see the average citations for the top 100 most productive countries, by access type (Table 1). From this brief view we conclude that hybrid titles generate more citation activity because they tend to be established journals with an established readership base.

Citation Calculation | Closed | Hybrid | Gold (APC charge) | Gold (no APC charge)
Average              | 1.9    | 3.0    | 1.8               | 1.1
Median               | 1.8    | 2.9    | 1.8               | 1.0
Table 1. Average and Median citations for articles published in 2024 by access type. Dimensions data filtered by 2024 pub year, research article document type and access type including identifying non-APC journals. Accessed 27/03/2025.
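A minimal sketch of the Table 1 calculation, using Python’s statistics module. The citation lists are invented and merely mimic the pattern in the table (hybrid highest, no-APC gold lowest); the real values come from Dimensions.

```python
import statistics

# Illustrative per-article citation counts grouped by access type.
# These lists are toy data, not the underlying Dimensions records.
citations_by_access = {
    "closed":      [0, 1, 2, 3, 4],
    "hybrid":      [1, 2, 3, 4, 5],
    "gold_apc":    [0, 1, 2, 2, 4],
    "gold_no_apc": [0, 0, 1, 1, 3],
}

# Average and median citations per access type, as in Table 1.
summary = {
    access: {"average": statistics.mean(vals),
             "median": statistics.median(vals)}
    for access, vals in citations_by_access.items()
}
```

On this toy data, the hybrid group comes out highest on both measures, matching the pattern reported in Table 1.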

It is probable that the imbalance in Indonesian representation is shaped by the age and prestige of journals themselves. For the most part, Open Access journals are younger than their subscription-based closed counterparts, and because Journal Impact Factors (JIFs) are based on a two-year citation window, newer journals (both open and closed access) are naturally disadvantaged.

As a result, newer journals that cover emerging or interdisciplinary areas, such as research aligned with the Sustainable Development Goals (SDGs), may find it difficult to achieve similar visibility and ‘reputation’. This creates a compounding effect: newer OA journals may be more inclusive and open to geographically diverse contributions, yet they lack the discoverability and citation momentum of older, established titles.

In turn, researchers from countries like Indonesia are more likely to publish in regional, Diamond OA journals – which remain under-recognized in global research metrics despite playing a crucial role in local knowledge and research ecosystems. 

This echoes the concerns raised in the Budapest Open Access Initiative 20th anniversary recommendations (BOAI20), which call for a more equitable and inclusive approach to Open Access – one that recognizes the value of diverse publication venues, fosters participation from underrepresented communities, and moves beyond outdated prestige indicators.

This points to a deeper issue: when discoverability and prestige are unequally distributed across journals, people may judge research quality based on where it’s published, rather than on the actual quality of the research.

A pattern emerges

This brings us to consider further the practice of prioritizing access above all else: how it may perpetuate bias in a system that assesses research quality by its potential reach, and how that reach can be hindered by the journal itself.

We examined the quality of Indonesian research in high-output titles and found that, when venue and discoverability practices align, Indonesian research citations are above average. This dispels any assumption about overall ‘quality’ that may arise from most Indonesian researchers prioritizing access when selecting a journal (Chart 4).

Chart 4. Quality of Indonesian research seen through balanced discoverability. Dimensions data filtered by 2023 pub year, research article document type and SDG 4. Accessed 28/02/2025.

This prompted a further question: Even when quality is demonstrable, is it being recognized globally? A parallel analysis examining citation practices across all low-income countries allowed us to test whether the patterns we observed with Indonesian research reflect broader systemic issues. We found a consistent pattern: research from low-income countries is often overlooked in citation practices, even when it is highly relevant and well-aligned with global priorities and even when it aligns closely with the focus of the citing publication.

The parallel analysis examined global research output from 2013 to 2023, focusing on contributions to the Sustainable Development Goals (SDGs) and excluding SDG 3 (Good Health and Well-Being) given its high proportion of research. Using author affiliations from the Dimensions database, we categorized publications by author country and matched them to World Bank income group classifications. This allowed us to compare research priorities between high-income and low-income countries over this period.
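The affiliation-to-income-group step might be sketched like this. The country-to-group lookup here is a tiny illustrative excerpt, not the full World Bank classification, and the ISO country codes are our own simplification of the affiliation data.

```python
# Illustrative excerpt of the World Bank income classification; the real
# table covers roughly 200 economies and changes year to year.
INCOME_GROUP = {
    "US": "high", "DE": "high", "BR": "upper_middle",
    "ID": "lower_middle", "ET": "low", "NE": "low",
}

def income_groups(author_countries):
    """Distinct income groups represented among a publication's authors."""
    return {INCOME_GROUP[c] for c in author_countries if c in INCOME_GROUP}

def is_low_income_only(author_countries):
    """True when every mapped affiliation is in a low-income country."""
    return income_groups(author_countries) == {"low"}
```

This is the distinction the citation analysis below relies on: publications authored solely in low-income countries versus those with any low-income contribution.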

As shown in the chart below, there are clear differences in thematic focus. Researchers in low-income countries disproportionately prioritize areas like SDG 2: Zero Hunger and SDG 6: Clean Water and Sanitation – topics that directly reflect the urgent, lived realities in these regions. In contrast, high-income countries show a stronger focus on SDGs such as Affordable and Clean Energy and Partnerships for the Goals. These differing priorities demonstrate the local expertise and indigenous knowledge embedded in lower-income regions – expertise that, as shown in our citation analysis, is not being adequately acknowledged or cited in global research outputs.

Chart 5. SDG priorities ranked by publication count for Low Income and High Income countries. Extracted using Dimensions data joined to World Bank data on Google BigQuery.

In critical areas such as Zero Hunger and Clean Water and Sanitation – topics where low-income countries often hold deep, practical expertise – our citation analysis reveals minimal inclusion of their work by researchers in high-income countries. Specifically, just 0.2% of references in high-income country publications on these SDGs cite publications where authors are based solely in low-income countries. In contrast, over 70% of the references come from publications with authors affiliated exclusively with high-income institutions (74% for Zero Hunger and 71% for Clean Water and Sanitation).

Even when we broaden the scope to include any contribution from a low-income country, the numbers remain stark: 1.41% for Zero Hunger and 1.22% for Clean Water and Sanitation. This is despite the fact that these regions face the most urgent realities tied to these challenges and are actively publishing in these areas.
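The reference-share calculation behind these percentages can be sketched as below. The toy reference counts are chosen only to roughly mirror the reported proportions (0.2% solely low-income, on the order of 1.5% with any low-income contribution); they are not the underlying data.

```python
# Sketch of the reference-share calculation: of all publications cited by
# high-income-country articles on an SDG, what fraction were authored
# solely (or at all) in low-income countries? Counts below are invented.

def reference_share(cited_pubs, predicate):
    """Fraction of cited publications for which `predicate` holds."""
    pubs = list(cited_pubs)
    return sum(1 for p in pubs if predicate(p)) / len(pubs)

# Each cited publication is reduced to the set of income groups of its
# authors' affiliations (1,000 toy references in total).
cited = ([{"high"}] * 740
         + [{"high", "upper_middle"}] * 200
         + [{"lower_middle"}] * 45
         + [{"low", "high"}] * 13
         + [{"low"}] * 2)

solely_low = reference_share(cited, lambda groups: groups == {"low"})
any_low = reference_share(cited, lambda groups: "low" in groups)
```

With these toy counts, `solely_low` works out to 0.2% and `any_low` to 1.5%, illustrating how small both shares are even under the broader definition.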

These findings point to a clear disconnect between where expertise exists and where it is recognized. In both Zero Hunger and Clean Water and Sanitation, areas where low-income countries have direct, practical experience, we see that their research is vastly under-cited by high-income country publications. This underrepresentation suggests a missed opportunity to draw on locally grounded knowledge that could meaningfully shape global solutions.

Conclusion

This isn’t about a lack of relevant research. It’s about discoverability, visibility, and deeply embedded citation habits. Open Access isn’t just about making research available; it’s about making sure that research is seen, used, and respected within the global knowledge ecosystem.

Emerald has recently launched the Open Lab, which looks at the research ecosystem and how open practices impact it. Its goal is to find real solutions to some of the problems not yet addressed by open practices and some of the problems created by them.

We hope this analysis encourages thoughtful discussion on where the focus should shift, thus allowing us to effectively evaluate the success of Open Access and help ensure that all research can meet its full potential.


Authors:

Ann Campbell, Technical Solutions Manager, Digital Science
Katie Davison, Insights Analyst, Emerald Publishing

The post Access vs Engagement – is OA enough? appeared first on Digital Science.
