Exploring Metaverse Literacy: Immersive Technologies in Library Environments A. Subaveerapandiyan, Arunima Baiju, Naved Ahmad, Manoj Kumar Verma, Priyanka Sinha Journal of Web Librarianship.2024; 18(2): 39. CrossRef
Facing the challenges of metaverse: a systematic literature review from Social Sciences and Marketing and Communication Verónica Crespo-Pereira, Eva Sánchez-Amboage, Matías Membiela-Pollán El Profesional de la información.2023;[Epub] CrossRef
Emergence of the metaverse and ChatGPT in journal publishing after the COVID-19 pandemic Sun Huh Science Editing.2023; 10(1): 1. CrossRef
Advances in Metaverse Investigation: Streams of Research and Future Agenda Mariapina Trunfio, Simona Rossi Virtual Worlds.2022; 1(2): 103. CrossRef
What the Literature on Medicine, Nursing, Public Health, Midwifery, and Dentistry Reveals: An Overview of the Rapidly Approaching Metaverse Muhammet DAMAR Journal of Metaverse.2022; 2(2): 62. CrossRef
Sameh Hany Emile, Hytham K. S. Hamid, Semra Demirli Atici, Doga Nur Kosker, Mario Virgilio Papa, Hossam Elfeki, Chee Yang Tan, Alaa El-Hussuna, Steven D. Wexner
Sci Ed. 2022;9(1):3-14. Published online February 20, 2022
This review aimed to illustrate the types, limitations, and possible alternatives of peer review (PR) based on a literature review combined with the opinions of a social media audience on Twitter. The study was conducted via the #OpenSourceResearch collaborative platform and paired a comprehensive literature search on the current PR system with the opinions of a social media audience of surgeons actively engaged in that system. Six independent researchers conducted a literature search of electronic databases in addition to Google Scholar. Electronic polls were organized via Twitter to assess surgeons’ opinions on the current PR system and potential alternative approaches. PR can be classified into single-blind, double-blind, triple-blind, and open PR. Newer PR systems include interactive platforms, prepublication and postpublication commenting or review, transparent review, and collaborative review. The main limitations of the current PR system are its allegedly time-consuming nature and its inconsistent, biased, and non-transparent results. Suggestions to improve the PR process include employing an interactive, double-blind PR system, using artificial intelligence to recruit reviewers, providing incentives for reviewers, and using PR templates. These findings offer several concepts for possible alternative approaches and modifications to this critically important process.
Purpose Authors of scholarly works are underrepresented in discussions about improving the academic publishing system. The objective of this study was to assess the feasibility of harmonizing journals’ manuscript preparation and submission guidelines by surveying the opinions of dental faculty members working in the state of Andhra Pradesh, India.
Methods A cross-sectional survey of 1,286 participants from 16 dental schools in Andhra Pradesh was conducted from March 15, 2021 to April 15, 2021. The questionnaire addressed the participants’ demographic details and perspectives on the guidelines for manuscript preparation and the need to harmonize those guidelines with the publication process. The online questionnaire was generated using Google Forms and consisted of six dichotomous, one multiple-choice, and seven Likert scale items. Descriptive statistics were obtained.
Results Of the 894 (69.5%) dental faculty members who responded, 448 (50.1%) were not aware of the International Committee of Medical Journal Editors’ guidelines for manuscript preparation and submission. During the manuscript revision process, 792 (95.5%) had experienced difficulty with the variation in author guidelines across journals, especially the guidelines for formatting tables, reference style, and in-text citation of references. The idea of a standardized template for manuscript preparation and submission was supported by 800 respondents (86.7%).
Conclusion Dental faculty members in India experienced difficulty in manuscript preparation for medical journals due to the differing editorial policies among journals. Therefore, a standardized template providing uniformity in style and format is needed.
Citations to this article as recorded by
Research publications of Australia’s natural history museums, 1981–2020: Enduring relevance in a changing world Tayla A. Green, Pat A. Hutchings, Fiona R. Scarff, James R. Tweedley, Michael C. Calver, Claudia Noemi González Brambila PLOS ONE.2023; 18(6): e0287659. CrossRef
Why consistent, clear, and uniform instructions for authors are required Jean Iwaz Science Editing.2022; 9(2): 142. CrossRef
Purpose This study aimed to develop a decision-support tool to quantitatively determine authorship in clinical trial publications.
Methods The tool was developed in three phases: consolidating authorship recommendations from the Good Publication Practice (GPP) and International Committee of Medical Journal Editors (ICMJE) guidelines, identifying and scoring attributes using a 5-point Likert scale or a dichotomous scale, and soliciting feedback from editors and researchers.
Results The authorship criteria stipulated by the ICMJE and GPP recommendations were categorized into two modules. Criterion 1 and the related GPP recommendations formed Module 1 (sub-criteria: contribution to design, data generation, and interpretation), while Module 2 was based on criteria 2 to 4 and the related GPP recommendations (sub-criteria: contribution to manuscript preparation and approval). The two modules with their relevant sub-criteria were then differentiated into attributes (n = 17 in Module 1, n = 12 in Module 2). An individual contributor is scored for each sub-criterion by summing the related attribute values, and the sum of the sub-criterion scores constitutes the module score (Module 1 score: 70 [contribution to conception or design of the study, 20; data acquisition, 7; data analysis, 27; interpretation of data, 16]; Module 2 score: 50 [content development, 27; content review, 18; accountability, 5]). The concept was implemented in Microsoft Excel with appropriate formulae and macros. A threshold of 50% for each sub-criterion and each module, together with an overall score of at least 65%, was predefined as qualifying for authorship.
Conclusion This authorship decision-support tool would be helpful for clinical trial sponsors in assessing contributions and assigning authorship to deserving contributors.
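A short sketch may help illustrate the scoring logic summarized above. The Python fragment below encodes the maximum sub-criterion scores reported in the Results (Module 1: 70 points; Module 2: 50 points) and applies the stated thresholds of 50% per sub-criterion and per module and 65% overall; the attribute-level detail, the variable names, and the rounding behavior of the actual Excel tool are not given in the abstract, so treat this as an illustrative reconstruction under those assumptions rather than the authors' implementation.

# Illustrative reconstruction of the authorship scoring logic (not the authors' Excel tool).
# Maximum scores per sub-criterion are taken from the Results above; names are assumed.

MODULES = {
    "module_1": {                       # contribution to the study (max 70)
        "conception_or_design": 20,
        "data_acquisition": 7,
        "data_analysis": 27,
        "interpretation_of_data": 16,
    },
    "module_2": {                       # contribution to the manuscript (max 50)
        "content_development": 27,
        "content_review": 18,
        "accountability": 5,
    },
}

def qualifies_for_authorship(scores: dict) -> bool:
    """Return True if a contributor reaches 50% of every sub-criterion and module
    maximum and at least 65% of the overall maximum score."""
    overall, overall_max = 0, 0
    for module, sub_maxima in MODULES.items():
        module_score = 0
        for sub_criterion, maximum in sub_maxima.items():
            score = scores.get(module, {}).get(sub_criterion, 0)
            if score < 0.5 * maximum:                      # 50% threshold per sub-criterion
                return False
            module_score += score
        if module_score < 0.5 * sum(sub_maxima.values()):  # 50% threshold per module
            return False
        overall += module_score
        overall_max += sum(sub_maxima.values())
    return overall >= 0.65 * overall_max                   # 65% threshold overall

# Hypothetical contributor who clears every sub-criterion and the overall threshold.
example = {
    "module_1": {"conception_or_design": 15, "data_acquisition": 4,
                 "data_analysis": 20, "interpretation_of_data": 10},
    "module_2": {"content_development": 20, "content_review": 12, "accountability": 4},
}
print(qualifies_for_authorship(example))  # True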
Purpose This study explored changes in the journal publishing market by publisher and access type using the major journals that publish about 95% of Journal Citation Reports (JCR) articles.
Methods From JCR 2016, 2018, and 2020, a unique journal list by publisher was created in Excel and used to analyze the compound annual growth rate with pivot tables. In total, 10,953 major JCR journals were analyzed, focusing on publisher type, open access (OA) status, and mega journals (those publishing over 1,000 articles per year).
Results In JCR 2020, 19 publishers each published over 10,000 articles per year; of these, six large publishers accounted for 59.6% of articles and the other 13 for 22.5%, while all remaining publishers accounted for 17.9%. Large and OA publishers increased their article share through leading mega journals, whereas the remaining publishers showed the opposite tendency. In JCR 2020, mega journals had a 26.5% article share and an excellent distribution across Journal Impact Factor quartiles. Despite the high growth (22.6%) and share (26.0%) of OA articles, the natural growth of non-OA articles (7.3%) and of total articles (10.7%) drove a rise in journal subscription fees. Articles, citations, the impact factor, and the immediacy index all increased gradually, and in JCR 2020 the compound annual growth rate of the average immediacy index was almost double that of the average impact factor.
Conclusion The influence of OA publishers has grown under the dominance of large publishers, and mega journals may substantially change the journal market. Journal stakeholders should pay attention to these changes.
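For readers unfamiliar with how a compound annual growth rate (CAGR) is derived from article counts across JCR years, the short Python sketch below shows the formula that the study's pivot-table analysis relies on; the counts used here are placeholders, not figures from the paper.

# Minimal sketch of a compound annual growth rate (CAGR) calculation of the kind
# the study performed with Excel pivot tables; the article counts are hypothetical.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """CAGR = (end / start) ** (1 / years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical article counts for one publisher in JCR 2016 and JCR 2020 (4 years apart).
articles_2016 = 120_000
articles_2020 = 165_000
print(f"CAGR 2016-2020: {cagr(articles_2016, articles_2020, 4):.1%}")  # about 8.3%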
Citations to this article as recorded by
Publishing trends of journals and articles in Journal Citation Reports during the COVID-19 pandemic: a descriptive study Sang-Jun Kim, Kay Sook Park Science Editing.2023; 10(1): 78. CrossRef
Citation beneficiaries of discipline-specific mega-journals: who and how much Jing Li, Qiushuang Long, Xiaoli Lu, Dengsheng Wu Humanities and Social Sciences Communications.2023;[Epub] CrossRef
Purpose Wordvice AI Proofreader is a recently developed web-based, artificial intelligence-driven text processor that provides real-time automated proofreading and editing of user-input text. This study aimed to compare its accuracy and effectiveness with expert proofreading by human editors and with two other popular proofreading applications: the automated writing analysis tools of Google Docs and Microsoft Word. Because the tool was primarily designed to help academic authors proofread their manuscript drafts, the comparison was intended to establish its usefulness for these authors.
Methods We performed a comparative analysis of proofreading completed by the Wordvice AI Proofreader, by experienced human academic editors, and by two other popular proofreading applications. The number of errors accurately reported and the overall usefulness of the vocabulary suggestions were measured using a General Language Evaluation Understanding (GLEU) metric and open dataset comparisons.
Results In the majority of texts analyzed, the Wordvice AI Proofreader achieved performance levels at or near those of the human editors, identifying similar errors and offering comparable suggestions in most sample passages. The Wordvice AI Proofreader also showed higher performance and greater consistency than the other two proofreading applications evaluated.
Conclusion We found that the overall functionality of the Wordvice artificial intelligence proofreading tool is comparable to that of a human proofreader and equal or superior to that of two other programs with built-in automated writing evaluation proofreaders used by tens of millions of users: Google Docs and Microsoft Word.
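The GLEU-based comparison described in the Methods can be illustrated in a few lines of code. The sketch below uses NLTK's sentence-level GLEU implementation to score an automated correction (hypothesis) against a human editor's correction (reference); the paper does not state which implementation or data it used, and the sentences here are invented, so this is only a sketch of the general setup.

# Illustrative GLEU scoring of an automated correction against a human editor's
# correction, using NLTK's sentence-level GLEU; the study's exact tooling is not
# specified, and these sentences are invented examples.

from nltk.translate.gleu_score import sentence_gleu

reference = "The results were consistent with previous findings .".split()   # human editor
hypothesis = "The results was consistent with previous findings .".split()   # automated tool

score = sentence_gleu([reference], hypothesis)
print(f"Sentence-level GLEU: {score:.3f}")  # closer to 1.0 means closer to the reference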
Citations to this article as recorded by
Navigating the impact: a study of editors’ and proofreaders’ perceptions of AI tools in editing and proofreading Islam Al Sawi, Ahmed Alaa Discover Artificial Intelligence.2024;[Epub] CrossRef
Exploring students’ perspectives on Generative AI-assisted academic writing Jinhee Kim, Seongryeong Yu, Rita Detrick, Na Li Education and Information Technologies.2024;[Epub] CrossRef
Why do editors of local nursing society journals strive to have their journals included in MEDLINE? A case study of the Korean Journal of Women Health Nursing Sun Huh Korean Journal of Women Health Nursing.2023; 29(3): 147. CrossRef
Scopus, cOAlition S, and Crossref’s views on scholarly publishing in the next 10 years Tae-Sul Seo Science Editing.2022; 9(1): 74. CrossRef
Performance of Indonesian Scopus journals in the area of agricultural and biological sciences Prakoso Bhairawa Putera, Ida Widianingsih, Sinta Ningrum, Suryanto Suryanto, Yan Rianto Science Editing.2022; 10(1): 100. CrossRef
This article presents the growth and development of preprints to help authors, editors, and publishers understand preprints and adopt appropriate strategies for incorporating them within their scholarly communication practices. The article considers preprint history and evolution, the integration of preprints and journals, and the benefits, disadvantages, and challenges that preprints present. It discusses the two largest and most established preprint servers, arXiv.org (established in 1991) and SSRN (1994), the OSF (Open Science Framework) initiative that supported preprint growth (2010), bioRxiv (2013), and medRxiv (2019). It then discusses six levels of acceptance of preprints within journals: an uneasy relationship, acceptance of preprinted articles, encouraging authors to preprint their articles, active participation with preprints, submerger by reviewing preprints, and finally merger and overlay models. Notably, most journals now accept submissions that have been posted as preprints. The benefits of preprints include fast circulation, establishing priority, increased visibility, community feedback, and a contribution to open science. The disadvantages include information overload, inadequate quality assurance, citation dilution, information manipulation, and inflation of results. As preprints become mainstream, they are likely to benefit authors but disadvantage publishers and journals. Authors are encouraged to preprint their own articles but to be cautious about using preprints as the basis for their own research. Editors are encouraged to develop preprint policies, to recognize that double-blind review is not possible once an article has been posted as a preprint, and to allow citations to preprints. In conclusion, journal-related stakeholders should treat preprints as an unavoidable development, taking into consideration both their benefits and disadvantages.
Citations to this article as recorded by
Seeing the forest for the trees and the changing seasons in the vast land of scholarly publishing Soo Jung Shin Science Editing.2024; 11(1): 81. CrossRef
To preprint or not to preprint: A global researcher survey Rong Ni, Ludo Waltman Journal of the Association for Information Science and Technology.2024; 75(6): 749. CrossRef
Open publishing of public health research in Africa: an exploratory investigation of the barriers and solutions Pasipanodya Ian Machingura Ruredzo, Dominic Dankwah Agyei, Modibo Sangare, Richard F. Heller Insights the UKSG journal.2024;[Epub] CrossRef
Exploring the current dynamics of preprints Raj Rajeshwar Malinda, Dipika Mishra, Ruchika Bajaj, Alin Khaliduzzaman Current Medical Research and Opinion.2024; 40(6): 1047. CrossRef
Publishing Embargoes and Versions of Preprints: Impact on the Dissemination of Information Jaime A. Teixeira da Silva, Chun-Kai (Karl) Huang, Maryna Nazarovets Open Information Science.2024;[Epub] CrossRef
Accelerated acceptance time for preprint submissions: a comparative analysis based on PubMed Dan Tian, Xin Liu, Jiang Li Scientometrics.2024; 129(7): 3787. CrossRef
Are Preprints a Threat to the Credibility and Quality of Artificial Intelligence Literature in the ChatGPT Era? A Scoping Review and Qualitative Study Michael Agyemang Adarkwah, A. Y. M. Atiquil Islam, Käthe Schneider, Rose Luckin, Michael Thomas, Jonathan Michael Spector International Journal of Human–Computer Interaction.2024; : 1. CrossRef
A perspective on the Center for Open Science (COS) preprint servers J. A. Teixeira da Silva Science Editor and Publisher.2024; 9(1): 86. CrossRef
Post-Publication Review: Evolution of the Scientific Publishing Workflow D. M. Kochetkov Economics of Science.2024; 10(3): 8. CrossRef
Recent Issues in Medical Journal Publishing and Editing Policies: Adoption of Artificial Intelligence, Preprints, Open Peer Review, Model Text Recycling Policies, Best Practice in Scholarly Publishing 4th Version, and Country Names in Titles Sun Huh Neurointervention.2023; 18(1): 2. CrossRef
Most Preprint Servers Allow the Publication of Opinion Papers Jaime A. Teixeira da Silva, Serhii Nazarovets Open Information Science.2023;[Epub] CrossRef
The rise of preprints in earth sciences Olivier Pourret, Daniel Enrique Ibarra F1000Research.2023; 12: 561. CrossRef
Sharing the wealth: a proposal for discipline-based repositories of shared educational resources Ellen Austin Perspectives: Policy and Practice in Higher Education.2023; 27(4): 131. CrossRef
The experiences of COVID-19 preprint authors: a survey of researchers about publishing and receiving feedback on their work during the pandemic Narmin Rzayeva, Susana Oliveira Henriques, Stephen Pinfield, Ludo Waltman PeerJ.2023; 11: e15864. CrossRef
An attempt to explain the partial 'silent' withdrawal or retraction of a SAGE Advance preprint Jaime A. Teixeira da Silva Publishing Research.2023;[Epub] CrossRef
The use and acceptability of preprints in health and social care settings: A scoping review Amanda Jane Blatch-Jones, Alejandra Recio Saucedo, Beth Giddins, Robin Haunschild PLOS ONE.2023; 18(9): e0291627. CrossRef
Dissemination of Registered COVID-19 Clinical Trials (DIRECCT): a cross-sectional study Maia Salholz-Hillel, Molly Pugh-Jones, Nicole Hildebrand, Tjada A. Schult, Johannes Schwietering, Peter Grabitz, Benjamin Gregory Carlisle, Ben Goldacre, Daniel Strech, Nicholas J. DeVito BMC Medicine.2023;[Epub] CrossRef
Data are a highly valuable asset for researchers. In the past, researchers who conducted a study permanently owned their data; currently, however, such data can serve as a source for further research. In 2018, the International Committee of Medical Journal Editors presented data sharing statements for clinical trials. Although this recommendation was limited to clinical trials published in medical journals, it is a meaningful change that formalized the concept of data sharing. The trend toward data sharing is expected to spread beyond medical journals to a wider range of scientific journals in the near future, and the platforms that provide storage and services for sharing data will gradually diversify accordingly. The World Journal of Men’s Health has adopted a clinical data sharing policy. The data deposit process to Harvard Dataverse, a well-known data repository, is as follows: first, select the type of article for data sharing; second, create an account; third, write a letter to the corresponding author; fourth, receive and validate the data from the authors; fifth, upload the data to Harvard Dataverse; and sixth, add a data sharing statement to the paper. Scientific journal editors are encouraged to select an appropriate platform and participate in this new trend of data sharing.
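To make step five of the workflow above concrete, the sketch below shows how a data file might be attached to an existing Harvard Dataverse dataset through the Dataverse native API. The server URL is Harvard Dataverse's public address, but the API token, dataset DOI, and file name are placeholders, and the request format reflects the file-upload endpoint as commonly documented; editors should consult the current Dataverse API guide rather than rely on this fragment.

# Illustrative upload of one validated data file to an existing Dataverse dataset
# (step five of the workflow above). The token, DOI, and file name are placeholders;
# the request format follows the Dataverse native API's file-upload endpoint as
# commonly documented and should be checked against the current API guide.

import json
import requests

SERVER = "https://dataverse.harvard.edu"
API_TOKEN = "REPLACE-WITH-YOUR-API-TOKEN"     # issued to the journal's Dataverse account
DATASET_DOI = "doi:10.7910/DVN/EXAMPLE"       # hypothetical dataset identifier

def upload_file(path: str, description: str) -> requests.Response:
    """Attach a single data file, with a short description, to the dataset."""
    url = f"{SERVER}/api/datasets/:persistentId/add"
    with open(path, "rb") as handle:
        return requests.post(
            url,
            params={"persistentId": DATASET_DOI},
            headers={"X-Dataverse-key": API_TOKEN},
            files={"file": handle},
            data={"jsonData": json.dumps({"description": description})},
        )

response = upload_file("trial_data.csv", "De-identified clinical trial dataset")
print(response.status_code, response.json())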
Citations to this article as recorded by
Korean scholarly journal editors’ and publishers’ attitudes towards journal data sharing policies and data papers (2023): a survey-based descriptive study Hyun Jun Yi, Youngim Jung, Hyekyong Hwang, Sung-Nam Cho Science Editing.2023; 10(2): 141. CrossRef
The utilisation of open research data repositories for storing and sharing research data in higher learning institutions in Tanzania Neema Florence Mosha, Patrick Ngulube Library Management.2023; 44(8/9): 566. CrossRef