Meeting: 18th EASE General Assembly and Conference
Date: May 14–16, 2025
Venue: Oslo, Norway
Organizer: European Association of Science Editors (EASE)
Theme: Editing in the age of misinformation
EASE and Oslo
- We had a wonderful opportunity to gather with colleagues from around the world, most of whom were from Europe. The 18th European Association of Science Editors (EASE) General Assembly and Conference took place at the Legenes Hus Conference Centre in Oslo, Norway, on May 14–16, 2025. The event featured a scientific program comprising 12 sessions and more than 30 speakers from across the globe. Prior to the main event, three optional workshops were held on May 13 and 14 (discussed in detail later). The meeting fostered comprehensive engagement among participants, allowing for deep interaction with both the subject matter and fellow attendees through oral and poster presentations. The venue’s proximity to Oslo Central Station made it conveniently accessible on foot. The conference dates were intentionally scheduled just before Norway’s National Day on May 17, offering us a small but delightful glimpse of the celebrations before our departure.
Three optional workshops held on May 13 and 14
- The first workshop began on the morning of Tuesday, May 13, in a cozy room in central Oslo. Entitled “How to be a successful journal editor,” it primarily attracted editorial board members and editors-in-chief from a range of specialized journals. Joan Marsh, the trainer and editor-in-chief of The Lancet Psychiatry, led the workshop. Topics included the fundamentals of quality journals, editorial team and board development, peer review, indexing and bibliometrics, strategic journal planning, and approaches to journal development. The workshop offered standards and practical guidance for maintaining quality in journal publishing. It emphasized the need to establish policies proactively, as well as the growing importance of transparency, clarity, and informativeness. Editors and publishers must stay current with developments in research communication and technology. Both authors and readers increasingly expect better-integrated presentations and improved discoverability. The editorial skills discussed during the workshop included the following:
- (1) Demonstrating experience and knowledge of the field
- (2) Synthesizing information and views to make informed decisions
- (3) Practicing lifelong learning
- (4) Communicating clearly and effectively
- (5) Acting with leadership and integrity
- (6) Showing and implementing knowledge of research and publishing integrity
- (7) Upholding principles of ethical human and animal research
- (8) Articulating editorial rights and responsibilities
- (9) Identifying and using trustworthy resources
- (10) Selecting journal content to reflect the journal scope
- (11) Analyzing journal policies and practices
- (12) Evaluating articles for scientific rigor and integrity
- (13) Evaluating articles for best practice in research
- (14) Managing and ensuring the integrity of peer review
- A serious discussion focused on recognition for reviewers’ dedication. As editors and editorial board members, it is essential to offer training courses and resources, and to consult EASE for available support. One of the most valuable aspects of the workshop was the opportunity to engage in discussion and share concerns with other participants. While there may not be a single best answer and practices can vary among journals, listening to others’ experiences and challenges proved highly beneficial.
- The second workshop was arranged by the Committee on Publication Ethics (COPE), a nonprofit organization whose stated mission is to define best practice in the ethics of scholarly publishing. As suggested by the title, “Introduction to Publication Ethics,” two speakers provided a concise overview of their experiences. Small groups of four to five participants discussed two real-world cases: the first involved a journal receiving a reviewer’s report that raised concerns about potential artificial intelligence use by a manuscript’s author; the second addressed suspicious requests for changes to the authorship list after a manuscript had been accepted. Participants shared their small group discussions and received valuable feedback and practical advice from the COPE members.
- The International Society of Managing and Technical Editors workshop, “Achieving an Effective Editorial Office in the Age of Misinformation,” took place on the morning of May 14. Sessions focused on editorial office workflows, screening processes, and effective peer review. Particularly intriguing was the session on identifying suitable reviewers, where the instructor described both manual methods and various tools, such as Reviewer Finder [1], Reviewer Locator (journals using the ScholarOne Manuscripts or Editorial Manager submission systems have the Web of Science Reviewer Locator enabled) [2], and Referee Finder [3]. The workshop also thoroughly addressed the importance of reviewer reward and recognition, as well as different methods for implementing these practices.
The 23rd Annual General Meeting
- The EASE General Assembly and Conference began with venue registration, poster viewings, and networking sessions, where participants enjoyed water, tea, or coffee and greeted one another. This year, EASE held the Annual General Meeting in a hybrid format, both online and in person. During the meeting, reports were presented on major activities, including webinars, workshops and training events (the EASE School for Manuscript Editors and Academic Authors and the EASE School for Journal Editors), the journal European Science Editing, and the quarterly e-newsletter. Updates also covered the 13 regional chapters (including Korea), strategies, budget, and financial projections. Notably, EASE now operates social media channels on Facebook, LinkedIn, Bluesky, Mastodon, YouTube, and, newly, Instagram; the move from X (formerly Twitter) to Bluesky was also noted. The segmentation of EASE membership—which includes authors, manuscript editors, journal editors, peer reviewers, publishers, institutions, universities, and the broader scholarly and scientific community—was highlighted to help improve targeted recruitment, communications, and events. EASE Secretary Mary Hodgson then reported on membership statistics from 2018 to 2025, noting that numbers have ranged from 600 to 700 over the past 5 years, with over 350 group members. The UK continues to top the membership leaderboard, followed by Croatia, Türkiye, and Ukraine, with 24 other countries each having five or more members.
- Next, the President thanked the 2024 council members and announced the new members for 2025. During the workshops, symposium, editorial school, and the 18th conference, 372 members participated at least once, while 92 members engaged in more than five activities. The board recommended that EASE develop new marketing materials, such as infographics and posters (e.g., “boost your career prospects with EASE”). Plans for new initiatives, including the Learning Hub and New Academy, were also introduced. We noted ongoing efforts to engage institutions and universities as organizational members. EASE is currently midway through a 6-year investment plan (2021–2027), and concerns have been raised about the need for a more robust professional organization.
- A highlight of the meeting was the launch of “Writing Your Research Paper: Tips from EASE Editors,” a major partnership project with American Journal Experts (AJE) and Research Square. EASE, AJE, and Research Square introduced a new EASE guideline document to support authors in writing high-quality articles, choosing the most appropriate journals for submission, and improving their success in research publishing journeys [4]. The board expressed appreciation for the contributions of all committee members, after which the meeting was adjourned.
Plenary lecture: Disinformation in publishing
- Nicolien van der Linden (Vice President of Research Analytics at RELX) delivered a lecture on understanding and identifying disinformation. In today’s environment, where information can be weaponized, understanding and addressing disinformation is more critical than ever, especially in research. Misinformation refers to unintentional false or inaccurate information spread without an intent to deceive. Disinformation denotes information that is deliberately false or misleading, created and spread with the intention to deceive. By contrast, malinformation is information shared with the intent to cause harm, often by taking facts out of context.
- Van der Linden explored how false narratives infiltrate academic and scientific communities, carefully distinguishing between misinformation, disinformation, and malinformation using compelling examples. One notable example was the finding that “the five most viewed vaccine-related articles on Facebook are misleading.” The article “Quantifying the Impact of Misinformation and Vaccine-Skeptical Content on Facebook,” published in Science, reported that untagged content raising doubts about vaccine safety or efficacy was 46 times more influential in inducing vaccine hesitancy than content flagged as misinformation [5].
- Van der Linden evaluated existing tools and strategies for combating these threats, emphasizing the essential role of researchers, editors, and media in safeguarding the integrity of knowledge. She stressed—and we fully agree—that confidence in science is low worldwide. In 2024, nearly half of the global population reported distrust in scientists. Misinformation and disinformation in health existed long before the modern era, and incorrect information can lower immunization rates. Restricting access to certain content can reduce its impact on vaccination decisions, yet vaccine-skeptical material that bypasses fact-checkers is especially potent. A significant share of misleading articles even originated from reputable news outlets [5]. This study demonstrated that unflagged, factual, yet misleading Facebook posts had 46 times the impact on willingness to receive the COVID-19 vaccine compared with false posts that had been identified as misinformation by fact-checkers. The authors stressed the importance of assessing both the reach and impact of content, not just its factual accuracy. While reducing disinformation is critical to improving public health, so-called “gray area” content—accurate but misleading—must also be addressed.
- In publishing, the rise of preprints and artificial intelligence (AI) is expected to generate new forms of misinformation, disinformation, and malinformation. Retraction Watch reported that 10,000 research papers were retracted globally in 2023, with Saudi Arabia, Pakistan, Russia, and China showing the highest retraction rates among major research nations over the past two decades, as noted in Nature [6]. Van der Linden also compared the well-known cases of Woo-suk Hwang’s cloning scandal and the McCullough paper, among others.
Reception and tour at the City Hall
- The Mayor of Oslo, Anne Lindboe, welcomed EASE conference participants to Oslo City Hall. The Nobel Peace Prize award ceremony is held here each year on December 10, the anniversary of Alfred Nobel’s death. Notably, Korean President Dae-jung Kim received the Peace Prize at this venue. Attendees enjoyed delicious food and drinks, and were given a guided tour of City Hall. In preparation for Norway’s National Day on May 17, tables were being set and colorful balloons prepared for the celebration. Participants admired various artworks, including the massive murals on the first floor depicting the history of Norway.
Keynote presentation: Science in the age of misinformation
- Camilla Stoltenberg, who served as director general of the Norwegian Institute of Public Health from 2012 to 2023, brought to the role both a strong background in epidemiology and considerable scientific insight. She began the talk by stating, “Misinformation has been around; of course, not all of it is incorrect. The question is how do we know, and, more importantly, openness is critical.” While we cannot predict what will happen in the next 10 years, she asserted, this is the very nature of science—everything scientists say and do must strive for truth.
- In times of crisis, the existence of an independent institute is of utmost importance. We have witnessed significant crises over the past few decades, including the 2008 financial crisis; the food crisis related to bovine spongiform encephalopathy—the so-called “mad cow disease”—in cattle and its link to variant Creutzfeldt-Jakob disease; and, most recently, the COVID-19 pandemic. During such crises, large volumes of misinformation often circulate unchecked and without adequate fact-checking.
- The Norwegian Institute of Public Health (https://www.fhi.no/en) encouraged its staff to participate in public debate, even without prior permission. Especially during the pandemic, fostering trust was essential because of the diversity of viewpoints. Therefore, developing guidelines and building consensus with other stakeholders, including the government, is critical. As we continue to face new crises, we must demonstrate how science can improve outcomes. Effective science communication can yield much better results, benefitting society on both national and global levels. Following the keynote, a brief Q&A session was held:
- Q. What actions can be taken in countries other than Norway that have limited expectations for openness?
- A. The core principle remains the same, yet different strategies can be utilized in various countries.
- Q. What actions should scientists take to improve the situation and address the crisis?
- A. Take opportunities and work together.
- Q. Given our diverse cultural backgrounds as editors, how can we effectively collaborate?
- A. Of course there are differences; it is essential for scientists to engage in communication and collaborate in pursuit of truth. Consideration should be given to developing a platform that could bridge the existing gaps between these differences.
Panel discussion
- Research integrity checks: the promise and perils of new tech and how to find a balance
- Caroline Sutton (CEO of STM) explained two major challenges: (1) large-scale manipulation and (2) identity manipulation. Moderator Brian Cody (CEO of Scholastica) asked, “How do we achieve solutions?” The speaker responded that close collaboration with publishers and the sharing of information is crucial; for example, when misconduct is discovered, sharing names and contact information is necessary. With regard to tools, it is vital to understand what problems actually need solving.
- Next, Matt Hodgkinson (Head of Editorial, Directory of Open Access Journals [DOAJ]) discussed the progression from individual to systemic to industrialized large-scale manipulation, and highlighted the importance of partnering with institutions to address these issues. When the moderator asked how this could be achieved, the speaker emphasized that our approach must be carefully evaluated—not just the implementation of tools, but also the rationale and understanding of how others are using them. Tools must be tested in real-life settings, as simply purchasing them is insufficient. Even a well-trained person can make a significant difference, though it is not an easy task; a decision tree may be helpful. Transparency is essential when using such tools—everyone should know what is being used and when. Transparency and the sharing of information can help prevent larger problems, particularly if articles are published simultaneously.
- Kim Eggleton (Head of Peer Review and Research Integrity, IOP Publishing) highlighted the importance of knowing when checks are most effective in the workflow (e.g., sequence, images, timing), and understanding exactly what plagiarism checks cover. Asked again how to achieve this, the speaker noted that open, collaborative tools have been effective and again emphasized transparency. It is important to consider both the best- and worst-case scenarios. Ultimately, all data and content should be verified by a human, as this is critical. Sometimes, the publisher’s technical experts have the answers, given their specialist status. When asked about data repositories and the necessity and risks of open data, the speaker emphasized that open data fosters trust and transparency, and we must think about the relationship between open data and research integrity. At the same time, we need to carefully determine what should be open and what should remain restricted, and to establish clear guidance.
- Jane Alfred (Director of Catalyst Editorial) raised several points: (1) the research community varies by subject area; (2) good practices with high technology must be established before committing to them; (3) the need for transparency; and (4) the pressure to publish and the various challenges faced by stakeholders. She emphasized the need for better integrity tools, as the abundance of current tools complicates collaboration among journals and publishers, especially for smaller publishers. Open AI tools must be further developed to address manipulation and plagiarism, which is crucial to building trust among researchers. Understanding how AI tools function is necessary, but a complete understanding remains challenging. This risk underlines the continuing need for human oversight. There are also challenges related to language: reliance on tools designed for English can disadvantage non-English-speaking users. A system must be developed to address these cross-language issues. Collaboration with experts may help expand the use of both images and text for integrity checks.
- The session concluded with a discussion on the inappropriate use of AI tools, particularly the problem of relying on AI without full understanding. The debate also addressed complaints from non-native English speakers regarding plagiarism detection systems, especially in terms of language options.
- Pressures and solutions for shoestring-budget journal publishers
- This session was particularly engaging, as the speakers—Iva Grabarić Andonovski (Editor of Food Technology and Biotechnology for over 22 years and EASE Vice President since 2023), Aira Huttunen (Editor-in-Chief of Informaatiotutkimus [Information Studies]), and Maarit Jaakkola (Co-director of Nordicom, University of Gothenburg)—discussed the specific challenges and limitations faced by small-scale publishers, followed by potential solutions. The difficulties encountered include the following:
- (1) Small, nonprofit society publishers
- (2) Open access (OA) could result in a severe loss of income
- (3) Uncertainty of project funding and subsidies
- (4) Science policy and small languages
- (5) Publish or perish
- (6) More administrative and academic housekeeping
- (7) More submissions, with varying quality
- (8) Quality of information—difficult to check
- A question was raised about how misinformation and limited resources impact the day-to-day operations of small publishers, and what types of support or interventions might help. Andonovski emphasized the importance of first understanding the unique issues faced by small publishers, such as the distinctions between institutional and national journals, and English-based versus non-English-based journals. Editor involvement is critical for both completing editorial work and clarifying the roles of editorial board members. Misinformation does not primarily arise from the tools themselves, but rather from the actions of key individuals, particularly editors. For small publishers, proactively preventing misinformation is often advantageous, even in uncertain cases; a double-blind review approach may help address conflicts of interest.
- The next issue addressed was the extent to which editors and reviewers rely on voluntary or undercompensated labor, and how much can reasonably be asked of them. The solution is complex. It is essential to recognize the contributions of editors and reviewers. Editorial boards should develop strategies to acknowledge faculty contributions to editorial work alongside their teaching and personal commitments. Specific measures should ensure appropriate returns, such as time flexibility and guaranteed free time. Fostering a motivating work environment while offering some flexibility is essential.
- Small organizations typically communicate by email, while larger, better-resourced journals often employ dedicated platforms for submission, peer review, and publication processes.
- Further discussion focused on several priorities: (1) developing investment strategies for young scholars; (2) promoting collaborative development with authors; (3) emphasizing interpersonal communication with authors for each article; and (4) supporting published authors to act as agents for promoting and disseminating the journal’s work.
- Rethinking peer review: enhancing reliability in scholarly publishing
- This panel featured four speakers: Mike Thelwall (University of Sheffield), Ludo Waltman (University of Leiden), Gunnar Sivertsen (Nordic Institute for Studies in Innovation, Research and Education), and Laura Sheard (University of Manchester). The discussion began with the question: why are we facing a “peer review crisis?” The main reasons identified were the growth and global integration of science, and the proliferation of low-quality journals that do not represent their scientific communities. While there is little that can be done about the former, the panel agreed that action is possible for the latter. For example, the North Atlantic Treaty Organization (NATO) countries’ share of internally authored articles dropped from 65% to 31%, suggesting a rising number of submissions and a growing need for peer reviewers.
- In 2010, only one of the 20 largest journals was a gold OA journal (PLOS ONE), but by 2022, all 20 were gold OA mega-journals (mainly from MDPI and Frontiers). The two central tasks of scholarly journals are to oversee the research process through to dissemination and to contribute to collective knowledge. Ideally, journals should represent and remain accountable to their research communities.
- Improving review quality clearly requires workshops, training, and mentoring for career development. There was also discussion about how to enhance the credibility of the peer review system for researchers. It may be that the only way to achieve this is through openness and full transparency.
- The panel also discussed technology’s role in both aiding and threatening peer review in the age of large language models (LLMs). Possible outcomes include LLM-assisted peer review comments, LLM-generated suggestions in specific fields, and LLM editorial decision recommendations, although these are only moderately reliable.
- However, the panel noted that some peer review reports generated by AI may appear plausible but are actually inaccurate, posing a risk if overworked reviewers use them to save time. Generative AI can also produce assessments of originality, significance, rigor, and other qualities, but these reports are often inaccurate despite sounding credible. The discussion then turned to several key ethical and systemic concerns:
- (1) How can editors guard against unsafe LLM-generated reviews (not language polishing)?
- (2) Transparency is the key.
- (3) Do we have enough reviewers to provide high-quality review reports? If not, we must be honest about that fact and consider what to do.
- (4) Reviewers must exercise caution when using LLMs, particularly concerning copyright and data security.
- Intercepting misconduct: practical tips for editors
- This panel included Paula Saikkonen (Editor-in-Chief of Yhteiskuntapolitiikka, Finland’s leading social policy journal), Daniel Stuckey (Senior Publishing Ethics Expert of Elsevier), and Ines Steffens (Editor-in-Chief of Eurosurveillance). The discussion began by noting the “challenging and changing integrity landscapes,” as follows:
- (1) Emerging issues on AI and shifting industry-wide responses
- • New integrity tools are appearing, but they have challenges
- • Can we validate the results of these new tools?
- (2) Unknown or unwritten protocols on how to act upon results from integrity tools
- • Resource disparities
- • How can we make it possible for many editors to learn about best practices?
- Many editors continue to rely on manual checking skills and professional networks, raising questions such as “Could AI be a tool to support integrity checking?” and “Will human oversight always be necessary?” In practice, there will likely always be a need for some degree of manual verification. Upon manuscript submission, editors must check for single authorship, ensure that affiliations match the article’s content, clarify individual contributions, and verify the inclusion of “local” authors who may have participated in the research. Additionally, it is wise to scrutinize the cover letter for signs such as a stilted tone, inconsistencies with the abstract, an incorrect journal address, or excessive flattery towards the editor, team, or journal. Conflicts of interest can be overlooked, and authors—particularly those from pharmaceutical companies—may intentionally omit such information. Therefore, even if no conflict is declared, editors should inquire again at the final stage. Editors must also confirm that authors have described their adherence to reporting guidelines, including mandatory use of CONSORT (Consolidated Standards of Reporting Trials), CHEERS (Consolidated Health Economic Evaluation Reporting Standards), and PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), while also encouraging the application of STROBE (Strengthening the Reporting of Observational Studies in Epidemiology), ORION (Outbreak Reports and Intervention Studies of Nosocomial Infection), SANRA (Scale for the Quality Assessment of Narrative Review Articles), and other relevant standards. Editors should also be alert to potential issues with images lacking copyright/permission when sourced from third parties, and verify that claims regarding the use of AI tools match the evidence.
Free tools and resources
- There was considerable discussion regarding “practical tips for editors: free tools and resources.”
Identity verification
- ORCID (Open Researcher and Contributor ID) provides a way to reliably, unambiguously, and permanently connect one’s name with one’s own work throughout a person’s research career, including publications, grants, education, employment, and other biographical information. Various free IP finder tools can help confirm whether an author’s location aligns with their records.
Assessing published output
- PubMed (https://pubmed.ncbi.nlm.nih.gov) is an openly accessible, free database, including primarily the MEDLINE collection of references and abstracts on life sciences and biomedical topics. Maintained by the National Library of Medicine at the US National Institutes of Health, PubMed is part of the Entrez information retrieval system and is useful for checking corrective notices, coauthors, and potential conflicts of interest.
- Google Scholar (https://scholar.google.com) is a freely accessible search engine indexing the full text or metadata of scholarly literature across many publishing formats and disciplines, and it is useful for citation counts.
- Dimensions (https://www.dimensions.ai) is a linked research information database developed by Digital Science. The Dimensions platform provides analytics on publications, coauthors, citations, datasets, grants, patents, and clinical trials.
Identifying research integrity issues
- Retraction Watch (https://retractionwatch.com) is a blog that reports on scientific paper retractions and related topics, launched in August 2010 and managed by science writers Ivan Oransky and Adam Marcus. The Retraction Watch Database (https://retractiondatabase.org) aggregates retractions from publisher websites and is updated daily; other types of updates, such as expressions of concern and corrections, are included but less comprehensively. PubPeer (https://pubpeer.org/) is a website established in 2012 that allows users to discuss and review scientific research after publication (post-publication peer review).
- Problematic Paper Screener (https://www.irit.fr/~Guillaume.Cabanac/problematic-paper-screener) is an automated tool that flags suspect publications for post-publication (re)assessment. It provides information on issues such as “feet of clay,” tortured phrases, citejacked journals, and problematic cell lines, among others.
- STM Integrity Hub offers an accessible and secure way to identify manuscripts that violate research integrity standards before publication, detecting paper mills and duplicate submissions efficiently and legally [7].
Free stats tools
- Free stats tools include chi-square, Benford’s law, GRIMMER (Granularity-Related Inconsistency of Means Mapped to Error Repeats) analysis, and P-values.
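As a minimal sketch of one such check (an illustrative example of our own, not one of the tools named above; the function names `first_digit` and `benford_chi_square` are ours), the following Python snippet compares the first-digit distribution of a set of reported values against Benford’s law using a chi-square statistic. A large statistic suggests the values deviate from the distribution expected of naturally occurring data, which can justify a closer look, though it is never proof of manipulation on its own.

```python
import math
from collections import Counter

# Expected first-digit proportions under Benford's law: P(d) = log10(1 + 1/d)
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    """Return the first significant digit of a nonzero number."""
    for ch in f"{abs(x):.15g}":
        if ch in "123456789":
            return int(ch)
    return None

def benford_chi_square(values):
    """Chi-square statistic of observed first digits vs. Benford's law."""
    digits = [first_digit(v) for v in values if v]
    counts = Counter(d for d in digits if d is not None)
    n = sum(counts.values())
    if n == 0:
        return 0.0
    return sum(
        (counts.get(d, 0) - n * p) ** 2 / (n * p) for d, p in BENFORD.items()
    )

# Data spanning several orders of magnitude tends to follow Benford's law,
# so a geometric series should yield a small chi-square statistic.
data = [1.2 ** k for k in range(1, 200)]
print(round(benford_chi_square(data), 2))
```

Against the usual chi-square critical value for 8 degrees of freedom (about 15.5 at the 5% level), fabricated or uniformly distributed figures typically score far higher than genuine measurement data.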
Conducting investigations
- Investigations should follow the guidelines and flowcharts from COPE.
Reporting and compliance checks
- The REAPPRAISED checklist can be used by anyone struggling to assess a submitted or published article, providing common-sense evaluations that go beyond the text itself, and should be used even if misconduct is not initially suspected. Readers should also consult the SAGER (Sex and Gender Equity in Research) guidelines [8], whose accompanying checklist is available in English and Korean.
- The Equator Network (https://www.equator-network.org) defines a reporting guideline as a checklist, flow diagram, or structured text to guide authors in reporting a specific type of research, developed using explicit methodology. It serves as an umbrella organization, bringing together researchers, journal editors, peer reviewers, developers of reporting guidelines, research funding bodies, and other stakeholders committed to improving research publication quality.
- There is also a legal framework in Finland: the Finnish Code of Conduct for Research Integrity, developed by the Finnish National Board on Research Integrity (https://tenk.fi/en). The Finnish National Board on Research Integrity provides the Finnish Research Integrity Barometer, an English translation of the Code of Conduct, and a video illustrating the importance of research integrity. Additionally, the Federation of Finnish Learned Societies (https://www.tsv.fi/en) supports publishing platforms for scientific journals.
Identifying hallmarks of suspected generative AI use
- The prevalent issues of generative AI use and image manipulation were discussed in detail as follows:
- (1) Read the manuscript.
- (2) Are there associated PubPeer posts (published articles/preprints)?
- (3) Is there evidence of generative AI prompts/output phrases?
- (4) Are there any fake or improbable references? An example would be hallucinated output.
- (5) Are there tortured phrases? (For instance, “linear regression” becomes “straight relapse”).
- (6) Does the article contain outdated/unreliable information (reflecting limitations in LLM training data)?
- (7) Consider the increase in prevalence of certain words (e.g., the use of “delve,” “intricate,” and “meticulously” has spiked since 2023).
- (8) Are there stylistic anomalies within the manuscript or reviewers’ comments (e.g., repetition of words/phrases; imprecise/superficial language)?
- (9) Are there formatting telltales (e.g., information organized into subheadings, intro, bullet-point lists, outro)?
- (10) Are there unusual features in images (e.g., blurry regions, missing items, garbled labels, spelling mistakes)?
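Point (7) above can be screened for mechanically. The short Python sketch below (our own illustration; the marker list is an assumption for demonstration, not a validated detector) computes the per-1,000-word rate of candidate marker words in a text. Unusually high rates may warrant closer reading, but they are never proof of AI use on their own.

```python
import re
from collections import Counter

# Illustrative marker words whose frequency reportedly spiked after 2023;
# this set is an assumption for demonstration purposes only.
MARKERS = {"delve", "intricate", "meticulously"}

def marker_rates(text, markers=MARKERS):
    """Rate of each marker word per 1,000 words of text."""
    words = re.findall(r"[a-z]+", text.lower())
    if not words:
        return {}
    counts = Counter(w for w in words if w in markers)
    return {m: 1000 * counts[m] / len(words) for m in sorted(markers)}

sample = "We delve into the intricate details and meticulously document each step."
print(marker_rates(sample))
```

Rates would normally be compared against a baseline corpus of pre-2023 text from the same field before drawing any conclusion.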
Identifying suspected image manipulations
Check for image anomalies
- • Check for unusual contrast, artifacts, impossible features, obscured sections, and hand-drawn areas.
- • Do blots contain splices, duplicated regions, or odd features? Are they too “clean”?
Have raw data been shared and verified?
- • Check for completeness, faithfulness, and variability.
- • Have replicates been shared?
Check metadata
- • Check who created the file.
- • Check when and where the images were acquired.
- • Is the file type consistent with what is described in the Methods?
Verifying image anomalies
- • Check with Adobe Photoshop (Adobe) or Microsoft PowerPoint (Microsoft Corp): change transparency of top image and overlay to check similarity.
- • Check with STM Image Alterations and Duplications Resource Center.
- • Check that raw images reflect those in the manuscript/article.
- • Consult COPE image manipulation flowcharts.
Escalate to the institution?
- • Check whether escalation is needed: the institution is best placed to verify the provenance of research, scrutinize lab books/hardware, and conduct interviews.
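One narrow slice of the raw-data checks above, confirming that two supposedly independent image files are not byte-for-byte copies, can be approximated by hashing file content. This sketch is an assumption of ours, not a method shown at the workshop: it catches only exact duplicates, while manipulated, re-encoded, or spliced images still require the visual and forensic tools listed above.

```python
import hashlib
from collections import defaultdict

def find_exact_duplicates(files: dict[str, bytes]) -> list[list[str]]:
    """Group image files (name -> raw bytes) whose byte content is
    identical. Only exact duplicates are caught; edited or re-saved
    copies will hash differently and need visual inspection instead."""
    by_digest = defaultdict(list)
    for name, data in files.items():
        by_digest[hashlib.sha256(data).hexdigest()].append(name)
    return [names for names in by_digest.values() if len(names) > 1]

# Toy example: fig2b.png is a byte-for-byte copy of fig1a.png.
submitted = {
    "fig1a.png": b"\x89PNG...blot-A",
    "fig1b.png": b"\x89PNG...blot-B",
    "fig2b.png": b"\x89PNG...blot-A",
}
print(find_exact_duplicates(submitted))  # [['fig1a.png', 'fig2b.png']]
```

An empty result does not clear a submission; it simply means any duplication present is not a verbatim file copy.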
Welcome drinks in the conference venue
- Pre-dinner drinks were served on the roof terrace of the conference venue. After a full day of lectures indoors, it was refreshing to step outside, enjoy drinks and pretzel sticks, and have conversations with other attendees. We also met someone who had worked in a similar field. It was especially enjoyable to speak with people who were eager to visit Korea.
- Later, we went together to the Centropa restaurant for dinner, located near the Opera House. A fellow participant, a librarian, shared insights about the Oslo Public Library (the main branch of the Deichman Library), which was next to the restaurant. The gathering was so delightful that it lasted well into the night.
Keynote presentation: Author identity in the age of misinformation
- Alice Meadows (Co-founder of MoreBrains Cooperative) began her keynote by observing that misinformation is polluting the scholarly record (citing public figures such as Robert F. Kennedy Jr), and that both the scholarly community and the public must work to rebuild trust. Currently, most research integrity efforts focus on the final product, not on the starting point of the research process. Increasingly, publishers are supporting the use of verified information. For authors and reviewers, trusted identity in academic publishing is crucial. ORCID is widely considered the best tool for this purpose, though challenges remain: the user experience can be cumbersome (e.g., multiple verifications), the system is not foolproof (bad actors still exist), and there are risks of inequity, especially for unaffiliated researchers. The 2024 ORCID annual report highlighted key figures for ORCID [9]. The speaker presented a SWOT (strengths, weaknesses, opportunities, and threats) analysis of ORCID (Fig. 1). She also offered suggestions for how publishers can encourage the use of ORCIDs:
- (1) Use their ORCIDs (auto-updates by Crossref, DataCite, and others are especially valuable).
- (2) Add service information (e.g., editorial board membership, review contributions) to ORCID records.
- (3) Read and write trusted info from/to ORCID records.
- (4) Increase trust in the identity of your own authors, reviewers, and editors.
- (5) Improve your understanding of communities you serve.
- (6) Contribute to building trust in the wider scholarly and scientific community and its outputs.
Panel discussion: Trustworthy and quality journals: a multistakeholder perspective
- Academic journals in temporarily occupied territories: the Ukrainian case and a worldwide challenge
- Iryna Izarova (Professor of Law, Taras Shevchenko National University of Kyiv; Chair of the EASE Regional Chapter of Ukraine; Editor-in-Chief of Access to Justice in Eastern Europe Journal) discussed the significant threat posed by academic journals operating in temporarily occupied territories under questionable circumstances. She described how some such journals infringe on legitimate Ukrainian titles and archives, misuse ISSNs (International Standard Serial Numbers), and undermine scholarly credibility by disseminating manipulated or biased content. She outlined the ISSN International Centre’s updated rules for ISSN assignment:
- • Introduction, for the first time, of clear rules for assigning ISSNs to publications from occupied or disputed territories.
- • Implementation of detailed publication histories, including location, publisher names, and related ISSNs, along with explanations for special circumstances.
- • In cases where publication splits occur due to territorial conflicts, original ISSNs may be closed, and new ones assigned.
- • For regions without ISO (International Organization for Standardization) country codes, ISSNs may be issued with an “INT” designation.
- We also noted that the ISSN network is celebrating its 50th anniversary (https://50years.issn.org).
- DOAJ’s role in supporting quality and trust in scholarly publishing
- Joanna Ball, Managing Director of DOAJ (Directory of Open Access Journals), introduced DOAJ, an extensive and unique index of diverse, peer-reviewed OA journals across all disciplines, countries, and languages. DOAJ’s rigorous selection criteria have become a global gold standard for OA publishing. Currently, DOAJ covers 89 languages and 138 countries and includes more than 21,500 journals. The 2024 DOAJ editorial statistics indicate that of 8,298 applications received (alongside 3,860 update requests), only 1,901 (23%) were accepted. Ball explained DOAJ’s evaluation criteria: OA, copyright, and licensing; transparency regarding editorial boards, peer review, and ownership; editorial and publishing standards; website and functionality; and operational independence, business models, and archiving. To combat predatory practices, DOAJ’s quality team conducts the following actions:
- (1) The quality team of DOAJ investigates instances of suspected bad practice (Fig. 2).
- (2) Concerns are flagged by the community or identified by DOAJ as part of the application process.
- (3) The team conducts evidence-based investigations and qualitative measures.
- (4) Investigations can result in removal and/or exclusion for 1 to 3 years.
- DOAJ is used as an indicator of trust in several ways:
- (1) Libraries and researchers use DOAJ to find trusted venues for research articles.
- (2) Libraries integrate DOAJ’s journal and article metadata into their discovery systems.
- (3) Research funders include DOAJ indexing as a requirement in their OA policies (for example, cOAlition S).
- (4) Aggregators and other services, such as OpenAlex (https://openalex.org/), integrate DOAJ’s metadata.
- Learning the art of OA publishing in a time of ambiguity: Think. Check. Submit. as a teaching tool
- The next talk, given by a representative of Think. Check. Submit. (https://thinkchecksubmit.org), focused on the following two parts.
- (1) Characteristics of predatory publishers
- • Misleading website information
- • Similar names to those of reputable journals and publishers
- • Confusing and/or complicated structures for charging fees
- • Use of deceptive or fake metrics
- • Promises of (too) fast peer review
- • Use of aggressive marketing methods
- (2) Block lists and allow lists
- • Block lists
- o Early responses to predatory journals included “block lists” (unsafe journals) and “allow lists” (safe journals)
- o These lists alone do not help researchers develop the skills to evaluate journals for themselves
- • Allow lists
- o Lists are not always fully reliable, and the criteria used are not always clear
- o It is recommended to use the Think-Check-Submit checklist as a tool to develop critical thinking.
- Think. Check. Submit. is a global initiative that provides tools for researchers to assess journal credibility before submitting their work. Developed in response to the rise of predatory journals, which charge authors to publish without proper peer review or editorial standards, the initiative advocates a three-step process:
- (1) Think: Is the journal you’re considering appropriate for your research? Is it trustworthy? Does it align with your field or study?
- (2) Check: Look for key indicators of journal quality, such as peer review practices, membership in recognized organizations, and clear editorial policies.
- (3) Submit: Only submit your research once you are confident that the journal is reliable and meets ethical standards.
- The choices available to authors can vary significantly, with some facing the pressures of a publish-or-perish culture and others receiving little to no institutional guidance. Additional challenges include language barriers, gaps in knowledge, and an over-reliance on metrics. When evaluating journals, it is important to consider factors such as readership—who engages with the journal—its reach and openness, and the rigor of its editorial process. Authors should reflect on who their target audience is, where that audience typically publishes, and which sources are deemed trustworthy. Three common scenarios authors may encounter include seeking a new journal for their work, having funding but lacking a trusted publication venue, or striving to gain recognition within a scholarly community.
- Building trust in scholarly communication: a librarian’s perspective from Bangladesh
- The information landscape in scholarly communication is rapidly changing, marked by a transition from print to digital platforms, the rise of peer-reviewed publications, and a surge in predatory journals alongside OA growth. These developments bring new challenges, particularly regarding trust. Threats such as predatory journals, misinformation, false metrics, and the prevalence of closed research environments put scholarly integrity at risk. Common perceptions of OA journals include the lack of a peer review system or editorial board, the imposition of publication fees without proper quality checks, and unsolicited email invitations to submit articles or join editorial boards; such perceptions lead to a misconception that all OA journals are predatory.
- Library and information science (LIS) professionals in Bangladesh face unique challenges in this environment. The role of librarians remains unclear, and feelings of insecurity when engaging with scholars can hinder their ability to provide effective support for publishing. Despite generally positive attitudes toward OA among librarians, practical challenges within their work environments remain significant.
- To meet these challenges, LIS professionals are focusing on capacity building through collaboration. One of the roles is prioritizing transparency through OA advocacy and information literacy programs. They are also tasked with upholding integrity by curating reliable resources and supporting academic honesty. Enhancing accessibility is also essential, with efforts directed toward creating inclusive services and spaces. Digital literacy programs are increasingly important, aiming to bridge the digital divide and empower users with necessary skills.
- Future plans for LIS professionals include boosting research and innovation within universities, embedding research integrity into curricula, building multilingual OA platforms, and joining regional knowledge-sharing initiatives. Efforts to foster the use of OA resources are also needed, which could be complemented by policy development. LIS professionals must adapt to the evolving scholarly communication landscape, which demands emerging skills such as technological proficiency, advocacy, leadership, and networking, as well as expertise in metadata, knowledge organization, copyright, and licensing.
Scientific poster presentations
- Winners and runners-up in each category presented their posters at the conference, either in person or online. One of the authors, Cheol-Heui Yun, was recognized as a runner-up in the Community Engagement and Collaboration Posters category for the poster titled “Stepping stones for the development of scientific journal publishing in Korea: the role and vision of the Korean Council of Science Editors” (Figs. 3, 4).
How can you contribute to your journal’s sustainability?
- This session featured three speakers: Jo Wixon (Director, External Analysis, Wiley), Are Brean (Editor-in-Chief of the Journal of the Norwegian Medical Association), and Iryna Izarova (Chair of the EASE Regional Chapter of Ukraine; Editor-in-Chief of Access to Justice in Eastern Europe Journal). The speakers discussed completing the sustainability checklist, recommending that organizations include practical actions relevant to their context and useful mechanisms for recording progress, whether for individual journals, publishers, societies, or institutions. The session also addressed equity, diversity, and inclusion guidelines: (1) the SAGER guidelines establish procedures for reporting sex and gender information in study design, data analysis, results, and interpretation, supported by a checklist [8]; and (2) GIST (Guidelines for Intersectional Analysis in Science and Technology) provides recommendations for researchers, peer-reviewed journals, and funders to systematically incorporate intersectional perspectives, likewise supported by a checklist [10].
- Intersectionality describes interdependent systems of inequality related to sex, gender, race, age, class, and other sociopolitical dimensions. By focusing on the compounded effects of social categories, intersectional analysis can improve the accuracy and experimental rigor of scientific research. Similar to the SAGER guidelines, integrating GIST into the peer review process will help journal editors standardize terminology, support best practices in reporting intersectional analyses, and raise researcher awareness of best methods in this area. The next step may involve EASE developing a rapid checklist to support uptake by authors and editors.
Regional chapter meeting
- During the tea break on the final day, a regional chapter meeting was held, bringing together representatives from various regions, including Korea, to share thoughts and experiences.
Epilogue
- The meetings concluded with a strong sense of satisfaction. Attendees took group photos, exchanged contact information, and said their farewells. On our last evening in Oslo, we enjoyed a boat tour around the islands. Although tired after the short trip to Scandinavia, we felt enriched and educated by the experience. We recalled the phrase from the meeting, “Information wants to be free,” famously stated by Stewart Brand at the Hackers Conference in 1984. Another saying was, “Nutritional value is good, but it is lacking the integrity issue, and perhaps it might be hard to find.”
Notes
Conflict of Interest
Cheol-Heui Yun has served as the ethics editor of Science Editing since 2020 and Jun-Beom Park serves as a member of the Korean Council of Science Editors, but were not involved in the peer review or decision-making process of this article. No other potential conflict of interest relevant to this article was reported.
Funding
This work was partially supported by a travel grant from the Korean Council of Science Editors for both Jun-Beom Park and Cheol-Heui Yun and from Seoul National University for Cheol-Heui Yun.
Data Availability
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Supplementary materials
The authors did not provide any supplementary materials for this article.
Fig. 1. SWOT (strengths, weaknesses, opportunities, and threats) analysis of ORCID (Open Researcher and Contributor ID). API, application programming interface.
Fig. 2. Quality investigation conducted by the Directory of Open Access Journals.
Fig. 3. Jun-Beom Park, Bahar Mehmani, and Cheol-Heui Yun (from left to right) in front of their poster at the 18th European Association of Science Editors General Assembly and Conference.
Fig. 4. Poster winners and committee members. From left: Cem Uzun (former President of the European Association of Science Editors [EASE]), Sigmar de Mello Rode (EASE Council member), Cheol-Heui Yun, Iryna Izarova (Chair of the EASE Regional Chapter Committee), and Ksenija Baždarić (EASE Council member). Cheol-Heui Yun won the Community Engagement and Collaboration Posters category for the poster titled “Stepping stones for the development of scientific journal publishing in Korea: the role and vision of the Korean Council of Science Editors.”