Science Editing

Volume 9(1), February 2022
Editorial
Metaverse in journal publishing
Kihong Kim
Sci Ed. 2022;9(1):1-2.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.256
  • 5,673 View
  • 229 Download
  • 2 Web of Science
  • 4 Crossref
PDF

Citations to this article as recorded by Crossref:
  • Facing the challenges of metaverse: a systematic literature review from Social Sciences and Marketing and Communication
    Verónica Crespo-Pereira, Eva Sánchez-Amboage, Matías Membiela-Pollán
    El Profesional de la información.2023;[Epub]     CrossRef
  • Emergence of the metaverse and ChatGPT in journal publishing after the COVID-19 pandemic
    Sun Huh
    Science Editing.2023; 10(1): 1.     CrossRef
  • Advances in Metaverse Investigation: Streams of Research and Future Agenda
    Mariapina Trunfio, Simona Rossi
    Virtual Worlds.2022; 1(2): 103.     CrossRef
  • What the Literature on Medicine, Nursing, Public Health, Midwifery, and Dentistry Reveals: An Overview of the Rapidly Approaching Metaverse
    Muhammet DAMAR
    Journal of Metaverse.2022; 2(2): 62.     CrossRef
Review
Types, limitations, and possible alternatives of peer review based on the literature and surgeons’ opinions via Twitter: a narrative review
Sameh Hany Emile, Hytham K. S. Hamid, Semra Demirli Atici, Doga Nur Kosker, Mario Virgilio Papa, Hossam Elfeki, Chee Yang Tan, Alaa El-Hussuna, Steven D. Wexner
Sci Ed. 2022;9(1):3-14.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.257
  • 5,628 View
  • 308 Download
Abstract PDF
This review aimed to illustrate the types, limitations, and possible alternatives of peer review (PR) based on a literature review together with the opinions of a social media audience via Twitter. This study was conducted via the #OpenSourceResearch collaborative platform and combined a comprehensive literature search on the current PR system with the opinions of a social media audience of surgeons who are actively engaged in the current PR system. Six independent researchers conducted a literature search of electronic databases in addition to Google Scholar. Electronic polls were organized via Twitter to assess surgeons’ opinions on the current PR system and potential alternative approaches. PR can be classified into single-blind, double-blind, triple-blind, and open PR. Newer PR systems include interactive platforms, prepublication and postpublication commenting or review, transparent review, and collaborative review. The main limitations of the current PR system are its allegedly time-consuming nature and inconsistent, biased, and non-transparent results. Suggestions to improve the PR process include employing an interactive, double-blind PR system, using artificial intelligence to recruit reviewers, providing incentives for reviewers, and using PR templates. The above results offer several concepts for possible alternative approaches and modifications to this critically important process.
Original Articles
The opinions of Indian dental faculty members on harmonizing manuscript preparation and the submission guidelines of journals
Gadde Praveen, Harsha GVD, Swati G Naidu, Dharani ASD
Sci Ed. 2022;9(1):15-21.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.258
  • 5,516 View
  • 268 Download
  • 2 Web of Science
  • 2 Crossref
Abstract PDF Supplementary Material
Purpose: Authors of scholarly writing are underrepresented in discussions about improving the academic publishing system. The objective of this study was to assess the possibility of harmonizing manuscript preparation and the submission guidelines of journals by assessing the opinions of dental faculty members who worked in the state of Andhra Pradesh, India.
Methods
A cross-sectional survey of 1,286 participants from 16 dental schools in Andhra Pradesh was conducted from March 15, 2021 to April 15, 2021. The questionnaire addressed the participants’ demographic details and perspectives on the guidelines for manuscript preparation and the need to harmonize those guidelines with the publication process. The online questionnaire was generated using Google Forms and consisted of six dichotomous, one multiple-choice, and seven Likert scale items. Descriptive statistics were obtained.
Results
Of the 894 (69.5%) dental faculty members who responded, 448 (50.1%) were not aware of the International Committee of Medical Journal Editors’ guidelines for manuscript preparation and submission. During the manuscript revision process, 792 (95.5%) had experienced difficulty with the variation in author guidelines across journals, especially the guidelines for table formatting, reference style, and in-text citation of references. The idea of a standardized template for manuscript preparation and submission was supported by 800 respondents (86.7%).
Conclusion
Dental faculty members in India experienced difficulty in manuscript preparation for medical journals due to the differing editorial policies among journals. Therefore, a standardized template providing uniformity in style and format is needed.

Citations to this article as recorded by Crossref:
  • Research publications of Australia’s natural history museums, 1981–2020: Enduring relevance in a changing world
    Tayla A. Green, Pat A. Hutchings, Fiona R. Scarff, James R. Tweedley, Michael C. Calver, Claudia Noemi González Brambila
    PLOS ONE.2023; 18(6): e0287659.     CrossRef
  • Why consistent, clear, and uniform instructions for authors are required
    Jean Iwaz
    Science Editing.2022; 9(2): 142.     CrossRef
Development of a decision-support tool to quantify authorship contributions in clinical trial publications
Sam T. Mathew, Habeeb Ibrahim Abdul Razack, Prasanth Viswanathan
Sci Ed. 2022;9(1):22-29.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.259
  • 4,915 View
  • 338 Download
Abstract PDF
Purpose: This study aimed to develop a decision-support tool to quantitatively determine authorship in clinical trial publications.
Methods
The tool was developed in three phases: consolidation of authorship recommendations from the Good Publication Practice (GPP) and International Committee of Medical Journal Editors (ICMJE) guidelines, identifying and scoring attributes using a 5-point Likert scale or a dichotomous scale, and soliciting feedback from editors and researchers.
Results
The authorship criteria stipulated by the ICMJE and GPP recommendations were categorized into two modules. Criterion 1 and the related GPP recommendations formed Module 1 (sub-criteria: contribution to design, data generation, and interpretation), while Module 2 was based on criteria 2 to 4 and the related GPP recommendations (sub-criteria: contribution to manuscript preparation and approval). The two modules with their sub-criteria were then differentiated into attributes (n = 17 in Module 1, n = 12 in Module 2). An individual contributor can be scored for each sub-criterion by summing the related attribute values, and the sum of the sub-criterion scores constitutes the module score (Module 1 score: 70 [contribution to conception or design of the study, 20; data acquisition, 7; data analysis, 27; interpretation of data, 16]; Module 2 score: 50 [content development, 27; content review, 18; accountability, 5]). The concept was implemented in Microsoft Excel with appropriate formulae and macros. A threshold of 50% for each sub-criterion and each module, together with an overall score of 65%, was predefined as qualifying for authorship.
Conclusion
This authorship decision-support tool would be helpful for clinical trial sponsors to assess and provide authorship to deserving contributors.
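The scoring logic described in this abstract (sub-criterion scores summed into module scores of 70 and 50, with 50% thresholds per sub-criterion and per module and a 65% overall threshold) can be illustrated in code. The sketch below is not the authors’ Excel tool; it is a minimal Python rendering of that thresholding logic, the example contributor’s scores are hypothetical, and the 65% threshold is read here as 65% of the combined maximum of 120.

```python
# Minimal sketch (not the authors' Excel tool) of the qualification logic
# described in the abstract above. Maximum scores per sub-criterion are taken
# from the abstract; the example contributor's scores are hypothetical.
MODULES = {
    "Module 1": {"conception or design": 20, "data acquisition": 7,
                 "data analysis": 27, "interpretation of data": 16},   # max 70
    "Module 2": {"content development": 27, "content review": 18,
                 "accountability": 5},                                  # max 50
}

def qualifies_for_authorship(scores: dict) -> bool:
    """scores maps each sub-criterion to the points awarded to one contributor."""
    total = total_max = 0
    for maxima in MODULES.values():
        # every sub-criterion must reach 50% of its maximum ...
        if any(scores.get(name, 0) < 0.5 * m for name, m in maxima.items()):
            return False
        module_score = sum(scores.get(name, 0) for name in maxima)
        module_max = sum(maxima.values())
        # ... and so must each module as a whole
        if module_score < 0.5 * module_max:
            return False
        total += module_score
        total_max += module_max
    # overall threshold of 65% (interpreted here as 65% of the combined maximum)
    return total >= 0.65 * total_max

contributor = {"conception or design": 15, "data acquisition": 4,
               "data analysis": 20, "interpretation of data": 10,
               "content development": 20, "content review": 12,
               "accountability": 4}
print(qualifies_for_authorship(contributor))  # True
```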
Changes in article share and growth by publisher and access type in Journal Citation Reports 2016, 2018, and 2020
Sang-Jun Kim, Kay Sook Park
Sci Ed. 2022;9(1):30-36.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.260
  • 4,567 View
  • 277 Download
  • 2 Web of Science
  • 2 Crossref
Abstract PDF
Purpose: This study explored changes in the journal publishing market by publisher and access type using the major journals that publish about 95% of Journal Citation Reports (JCR) articles.
Methods
From JCR 2016, 2018, and 2020, a unique journal list by publisher was created in Excel and used to analyze the compound annual growth rate by pivot tables. In total, 10,953 major JCR journals were analyzed, focusing on publisher type, open access (OA) status, and mega journals (publishing over 1,000 articles per year).
Results
In JCR 2020, among the 19 publishers that each published over 10,000 articles per year, six large publishers accounted for 59.6% of the articles and the other 13 for 22.5%; publishers outside this group accounted for the remaining 17.9%. Large and OA publishers increased their article share through leading mega journals, whereas the remaining publishers showed the opposite tendency. In JCR 2020, mega journals had a 26.5% article share and an excellent distribution across Journal Impact Factor quartiles. Despite the high growth (22.6%) and share (26.0%) of OA articles, the natural growth of non-OA articles (7.3%) and of total articles (10.7%) caused a rise in journal subscription fees. Articles, citations, the impact factor, and the immediacy index all increased gradually, and the compound annual growth rate of the average immediacy index was almost double that of the average impact factor in JCR 2020.
Conclusion
The influence of OA publishers has grown under the dominance of large publishers, and mega journals may substantially change the journal market. Journal stakeholders should pay attention to these changes.
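The compound annual growth rate (CAGR) analysis mentioned in the Methods was carried out with Excel pivot tables; for reference, the underlying calculation is shown below as a minimal Python sketch. The article counts are placeholders, not JCR figures.

```python
# Minimal sketch of the compound annual growth rate (CAGR) calculation that
# underlies the analysis above; the authors worked in Excel pivot tables, and
# the article counts here are placeholders, not JCR data.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two observations `years` apart."""
    return (end_value / start_value) ** (1 / years) - 1

# JCR 2016 and JCR 2020 are four years apart.
articles_2016 = 1_000_000   # placeholder count
articles_2020 = 1_400_000   # placeholder count
print(f"CAGR: {cagr(articles_2016, articles_2020, 4):.1%}")  # CAGR: 8.8%
```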

Citations to this article as recorded by Crossref:
  • Publishing trends of journals and articles in Journal Citation Reports during the COVID-19 pandemic: a descriptive study
    Sang-Jun Kim, Kay Sook Park
    Science Editing.2023; 10(1): 78.     CrossRef
  • Citation beneficiaries of discipline-specific mega-journals: who and how much
    Jing Li, Qiushuang Long, Xiaoli Lu, Dengsheng Wu
    Humanities and Social Sciences Communications.2023;[Epub]     CrossRef
Comparing the accuracy and effectiveness of Wordvice AI Proofreader to two automated editing tools and human editors
Kevin Heintz, Younghoon Roh, Jonghwan Lee
Sci Ed. 2022;9(1):37-45.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.261
  • 5,012 View
  • 326 Download
  • 1 Crossref
Abstract PDF
Purpose: Wordvice AI Proofreader is a recently developed web-based, artificial intelligence-driven text processor that provides real-time automated proofreading and editing of user-input text. This study aimed to compare its accuracy and effectiveness with expert proofreading by human editors and with two other popular proofreading applications: the automated writing analysis tools of Google Docs and Microsoft Word. Because the tool was primarily designed to help academic authors proofread their manuscript drafts, the comparison was intended to establish its usefulness for these authors.
Methods
We performed a comparative analysis of proofreading completed by the Wordvice AI Proofreader, by experienced human academic editors, and by two other popular proofreading applications. The number of errors accurately reported and the overall usefulness of the vocabulary suggestions were measured using a General Language Evaluation Understanding (GLEU) metric and open dataset comparisons.
Results
In the majority of the texts analyzed, the Wordvice AI Proofreader achieved performance at or near the level of the human editors, identifying similar errors and offering comparable suggestions in most sample passages. It also showed higher performance and greater consistency than the other two proofreading applications evaluated.
Conclusion
We found that the overall functionality of the Wordvice artificial intelligence proofreading tool is comparable to that of a human proofreader and equal or superior to that of two other programs with built-in automated writing evaluation proofreaders used by tens of millions of users: Google Docs and Microsoft Word.
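The abstract refers to a General Language Evaluation Understanding (GLEU) metric for scoring the competing tools. The exact evaluation pipeline is not described here; the sketch below assumes NLTK’s sentence-level GLEU implementation and invented sentences, purely to show how an automated edit can be scored against a human editor’s reference edit.

```python
# Hedged sketch: assumes NLTK's GLEU implementation, which is not necessarily
# the authors' tooling; the sentences are invented for illustration.
from nltk.translate.gleu_score import sentence_gleu

human_edit = "the results suggest that the tool performs well"   # reference edit
ai_edit = "the results suggest that the tool performs good"      # hypothesis

# GLEU compares the hypothesis (the AI-edited sentence) with one or more
# reference edits (here, the human editor's version) over overlapping n-grams.
score = sentence_gleu([human_edit.split()], ai_edit.split())
print(f"GLEU of AI edit against human edit: {score:.2f}")
```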

Citations to this article as recorded by Crossref:
  • Navigating the impact: a study of editors’ and proofreaders’ perceptions of AI tools in editing and proofreading
    Islam Al Sawi, Ahmed Alaa
    Discover Artificial Intelligence.2024;[Epub]     CrossRef
Essays
Role of academic publishers in 10 years: a perspective from the Chairman of Elsevier
Youngsuk Chi
Sci Ed. 2022;9(1):46-52.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.262
  • 3,402 View
  • 254 Download
PDF Supplementary Material
Role of Crossref in journal publishing over the next decade
Ed Pentz
Sci Ed. 2022;9(1):53-57.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.263
  • 3,162 View
  • 241 Download
  • 2 Web of Science
  • 2 Crossref
PDF

Citations to this article as recorded by Crossref:
  • Why do editors of local nursing society journals strive to have their journals included in MEDLINE? A case study of the Korean Journal of Women Health Nursing
    Sun Huh
    Korean Journal of Women Health Nursing.2023; 29(3): 147.     CrossRef
  • Scopus, cOAlition S, and Crossref’s views on scholarly publishing in the next 10 years
    Tae-Sul Seo
    Science Editing.2022; 9(1): 74.     CrossRef
Diplomacy in six patterns of reviewers’ queries during manuscript revision in scholarly publishing
Jean Iwaz
Sci Ed. 2022;9(1):58-61.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.264
  • 2,547 View
  • 238 Download
PDF
Increased number of Scopus articles from Indonesia from 1945 to 2020, an analysis of international collaboration, and a comparison with other ASEAN countries from 2016 to 2020
Prakoso Bhairawa Putera, Suryanto Suryanto, Sinta Ningrum, Ida Widianingsih, Yan Rianto
Sci Ed. 2022;9(1):62-68.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.265
  • 3,339 View
  • 319 Download
  • 1 Crossref
PDF Supplementary Material

Citations to this article as recorded by Crossref:
  • Performance of Indonesian Scopus journals in the area of agricultural and biological sciences
    Prakoso Bhairawa Putera, Ida Widianingsih, Sinta Ningrum, Suryanto Suryanto, Yan Rianto
    Science Editing.2023; 10(1): 100.     CrossRef
Reflections on 4 years in the role of a Crossref ambassador in Korea
Jae Hwa Chang
Sci Ed. 2022;9(1):69-73.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.266
  • 8,728 View
  • 235 Download
  • 1 Web of Science
  • 1 Crossref
PDF

Citations to this article as recorded by Crossref:
  • Role of Crossref in journal publishing over the next decade
    Ed Pentz
    Science Editing.2022; 9(1): 53.     CrossRef
Meeting Reports
Scopus, cOAlition S, and Crossref’s views on scholarly publishing in the next 10 years
Tae-Sul Seo
Sci Ed. 2022;9(1):74-76.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.267
  • 2,866 View
  • 248 Download
PDF
Local editors have no time to lose for building their journals’ reputations
Byung-Mo Oh
Sci Ed. 2022;9(1):77-78.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.268
  • 2,324 View
  • 233 Download
PDF
Training Materials
The evolution, benefits, and challenges of preprints and their interaction with journals
Pippa Smart
Sci Ed. 2022;9(1):79-84.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.269
  • 4,892 View
  • 306 Download
  • 8 Web of Science
  • 12 Crossref
Abstract PDF
This article presents the growth and development of preprints to help authors, editors, and publishers understand them and adopt appropriate strategies for incorporating preprints into their scholarly communication. The article considers preprint history and evolution, the integration of preprints and journals, and the benefits, disadvantages, and challenges that preprints present. It discusses the two largest and most established preprint servers, arXiv.org (established in 1991) and SSRN (1994); the OSF (Open Science Framework) initiative that supported preprint growth (2010); bioRxiv (2013); and medRxiv (2019). It then discusses six levels of acceptance of preprints by journals: an uneasy relationship, acceptance of preprinted articles, encouragement of authors to preprint their articles, active participation with preprints, submerger by reviewing preprints, and finally merger and overlay models. Notably, most journals now accept submissions that have been posted as preprints. The benefits of preprints include fast circulation, priority publication, increased visibility, community feedback, and contribution to open science. The disadvantages include information overload, inadequate quality assurance, citation dilution, information manipulation, and inflation of results. As preprints become mainstream, they are likely to benefit authors but to disadvantage publishers and journals. Authors are encouraged to preprint their own articles, but to be cautious about using preprints as the basis for their own research. Editors are encouraged to develop preprint policies, to recognize that double-blind review is not possible once an article has been posted as a preprint, and to allow citations to preprints. In conclusion, journal-related stakeholders should treat preprints as an unavoidable development, taking into consideration both their benefits and disadvantages.

Citations to this article as recorded by Crossref:
  • Seeing the forest for the trees and the changing seasons in the vast land of scholarly publishing
    Soo Jung Shin
    Science Editing.2024; 11(1): 81.     CrossRef
  • To preprint or not to preprint: A global researcher survey
    Rong Ni, Ludo Waltman
    Journal of the Association for Information Science and Technology.2024;[Epub]     CrossRef
  • Open publishing of public health research in Africa: an exploratory investigation of the barriers and solutions
    Pasipanodya Ian Machingura Ruredzo, Dominic Dankwah Agyei, Modibo Sangare, Richard F. Heller
    Insights the UKSG journal.2024;[Epub]     CrossRef
  • Recent Issues in Medical Journal Publishing and Editing Policies: Adoption of Artificial Intelligence, Preprints, Open Peer Review, Model Text Recycling Policies, Best Practice in Scholarly Publishing 4th Version, and Country Names in Titles
    Sun Huh
    Neurointervention.2023; 18(1): 2.     CrossRef
  • Most Preprint Servers Allow the Publication of Opinion Papers
    Jaime A. Teixeira da Silva, Serhii Nazarovets
    Open Information Science.2023;[Epub]     CrossRef
  • The rise of preprints in earth sciences
    Olivier Pourret, Daniel Enrique Ibarra
    F1000Research.2023; 12: 561.     CrossRef
  • Sharing the wealth: a proposal for discipline-based repositories of shared educational resources
    Ellen Austin
    Perspectives: Policy and Practice in Higher Education.2023; 27(4): 131.     CrossRef
  • The experiences of COVID-19 preprint authors: a survey of researchers about publishing and receiving feedback on their work during the pandemic
    Narmin Rzayeva, Susana Oliveira Henriques, Stephen Pinfield, Ludo Waltman
    PeerJ.2023; 11: e15864.     CrossRef
  • An attempt to explain the partial 'silent' withdrawal or retraction of a SAGE Advance preprint
    Jaime A. Teixeira da Silva
    Publishing Research.2023;[Epub]     CrossRef
  • The use and acceptability of preprints in health and social care settings: A scoping review
    Amanda Jane Blatch-Jones, Alejandra Recio Saucedo, Beth Giddins, Robin Haunschild
    PLOS ONE.2023; 18(9): e0291627.     CrossRef
  • Dissemination of Registered COVID-19 Clinical Trials (DIRECCT): a cross-sectional study
    Maia Salholz-Hillel, Molly Pugh-Jones, Nicole Hildebrand, Tjada A. Schult, Johannes Schwietering, Peter Grabitz, Benjamin Gregory Carlisle, Ben Goldacre, Daniel Strech, Nicholas J. DeVito
    BMC Medicine.2023;[Epub]     CrossRef
How to share data through Harvard Dataverse, a repository site: a case of the World Journal of Men’s Health
Hyun Jun Park
Sci Ed. 2022;9(1):85-90.   Published online February 20, 2022
DOI: https://doi.org/10.6087/kcse.270
  • 3,236 View
  • 253 Download
  • 2 Web of Science
  • 2 Crossref
Abstract PDF
Data are a highly valuable asset for researchers. In the past, the researchers who conducted a study owned their data permanently; now, however, these data can be used as a source for further research. In 2018, the International Committee of Medical Journal Editors introduced data sharing statements for clinical trials. Although this recommendation was limited to clinical trials published in medical journals, it was a meaningful change that formalized the concept of data sharing. The trend of data sharing is expected to spread beyond medical journals to a wider range of scientific journals in the near future, and the platforms that provide storage and services for sharing data will diversify correspondingly. The World Journal of Men’s Health has adopted a clinical data sharing policy. The data deposit process to Harvard Dataverse, a well-known data repository, is as follows: first, select the type of article for data sharing; second, create an account; third, write a letter to the corresponding author; fourth, receive and validate the data from the authors; fifth, upload the data to Harvard Dataverse; and sixth, add a data sharing statement to the paper. Scientific journal editors are recommended to select an appropriate platform and to participate in this new trend of data sharing.
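The six-step deposit process above is described for the Harvard Dataverse web interface. For editors who prefer to script the upload step, the sketch below uses the Dataverse native API via Python requests; the endpoint and header names are taken from the public Dataverse API documentation, the DOI, token, and file name are placeholders, and the whole snippet should be treated as an illustrative assumption rather than part of the author’s workflow.

```python
# Illustrative sketch of step five (uploading a data file), scripted against
# the Dataverse native API rather than the web interface described above. The
# DOI, API token, and file name are placeholders; verify the endpoint against
# the current Dataverse API documentation before use.
import requests

SERVER = "https://dataverse.harvard.edu"
API_TOKEN = "xxxx-xxxx-xxxx"                 # placeholder: depositor's API token
DATASET_DOI = "doi:10.7910/DVN/EXAMPLE"      # placeholder: target dataset DOI

with open("trial_data.csv", "rb") as fh:
    resp = requests.post(
        f"{SERVER}/api/datasets/:persistentId/add",
        params={"persistentId": DATASET_DOI},
        headers={"X-Dataverse-key": API_TOKEN},
        files={"file": ("trial_data.csv", fh)},
    )
print(resp.status_code, resp.json().get("status"))
```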

Citations to this article as recorded by Crossref:
  • Korean scholarly journal editors’ and publishers’ attitudes towards journal data sharing policies and data papers (2023): a survey-based descriptive study
    Hyun Jun Yi, Youngim Jung, Hyekyong Hwang, Sung-Nam Cho
    Science Editing.2023; 10(2): 141.     CrossRef
  • The utilisation of open research data repositories for storing and sharing research data in higher learning institutions in Tanzania
    Neema Florence Mosha, Patrick Ngulube
    Library Management.2023; 44(8/9): 566.     CrossRef
