Meeting Report
2021 Council of Science Editors annual meeting
Kihong Kim
Science Editing 2021;8(2):177-179.
DOI: https://doi.org/10.6087/kcse.252
Published online: August 20, 2021

Department of Physics, Ajou University, Suwon, Korea

Correspondence to Kihong Kim khkim@ajou.ac.kr
• Received: August 5, 2021   • Accepted: August 5, 2021

Copyright © 2021 Korean Council of Science Editors

This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The 2021 Council of Science Editors annual meeting was held online from May 3 to 5. The main sessions ran daily from midnight to 6:15 a.m. Korea time. On each day, two roundtable discussion sessions were held in parallel for one hour, followed by a 30-minute training session. Next, a one-hour keynote session was held, and then four presentation sessions were conducted simultaneously for one hour, which was repeated once after a short break. In total, six discussion sessions, three training sessions, three keynote sessions, and 24 presentation sessions were held over the three days. As one can guess from the large number of sessions, presentations covered a wide variety of topics and the overall program ran smoothly. Since the sessions took place after midnight Korea time, it was difficult for me to attend all of them, so I selected and participated in a few sessions on topics of particular interest to me. It was very helpful that registrants were allowed to watch recordings of many sessions and to download many of the presentation materials; this was especially useful for sessions that I could not attend because they ran in parallel. In addition, many of the recorded videos displayed transcripts of what the presenters said, generated using speech recognition software.
The keynote speaker on the first day was Jessica Malaty Rivera, Science Communication Lead of the COVID Tracking Project, an organization that collects and communicates various data related to coronavirus disease 2019 (COVID-19) in the United States. Her topic was how to communicate science effectively to the public. She presented her views on science communication in general, not limited to COVID-19. I felt that some of her methods for presenting data effectively were very similar to those used when writing scientific papers. It was interesting to hear that a person in charge of science communication should know not only the languages of scientists and non-scientists but also that of pseudo-scientists, and should be familiar with emotional and cultural language as well.
Among the presentation sessions on the first day, the session entitled “Managing information from preprints” was particularly interesting. Preprints were discussed frequently at this meeting, presumably because the role of preprints has expanded greatly as a large number of research papers on COVID-19 have been posted as preprints during the pandemic. The first speaker, John Inglis, co-founder of bioRxiv and medRxiv, the leading preprint servers in biology and medicine, presented various numerical data on the usage of those sites during the pandemic. He also explained the screening process for papers on those sites, and it was impressive that they underwent more rigorous screening by far more personnel than arXiv, a preprint server in the field of physics. The second speaker was Bruce Rosenblum of Inera, a journal editing software company, who presented issues related to the citation and metadata of preprints. He gave examples of the problems that can occur in this area, such as when the sites that post preprints do not state clearly that the posted papers are preprints, when the same paper is posted on multiple preprint sites, or when the relationship between the original preprint and a revised preprint is not clearly stated. I thought this was an important issue that deserved further consideration. The third speaker was Iratxe Puebla of ASAPBio, an organization that supports the expansion of preprints in the life sciences, who presented ways to provide high-quality metadata for preprints and to expand the screening of preprints to increase their reliability. On the first day, there were presentations on many other topics, including XML fundamentals, open access, and the diversity of editorial boards.
Among the presentation sessions on the second day, the “Artificial intelligence-assisted editorial tools” session was interesting to me. The application of artificial intelligence technologies such as machine learning, data mining, and natural language processing to the editing and publishing of academic journals has recently attracted much attention as a rapidly developing area. In the first presentation, Robyn Mudgridge and Hannah Hutt of Frontiers, a journal publisher, introduced AIRA, artificial intelligence-based editorial software developed by Frontiers. This software automatically examines the quality of submitted manuscripts, checks for violations of research ethics in various respects, and helps find appropriate reviewers and editors. The speakers said that with the use of AIRA, the quality of reviews and the satisfaction of authors improved significantly. In the second presentation, Jennifer Chapman of the American Society of Civil Engineers used concrete examples to describe the experience of using artificial intelligence software called UNSILO Evaluate in four of the society’s journals. Its main function is to automatically examine the quality of sentences, the accuracy of references, self-citations, and the accuracy of tables and figures in submitted papers, and to assist editors in their judgment. In the third presentation, Daniel Evanko of the American Association for Cancer Research introduced SciScore, an artificial intelligence-based tool whose main purpose is to enhance the reproducibility of the results of published papers. When authors submit a paper, SciScore is applied to its methods section and produces a score, awarded by automatically examining the rigor and consistency of various items related to research methods and data sources in the medical field. If the score is less than 4 out of 10, the authors are asked to revise the methods section. The speaker said that the overall quality of published papers improved through this process.
Another session of particular interest on the second day was “Research misconduct corrections/retractions: how to avoid getting sued,” presented by Debra Parrish, an attorney at Parrish Law Offices. She presented judicial precedents involving various kinds of civil lawsuits that may arise in relation to papers judged to violate research ethics and retracted from publication. In particular, she gave examples of situations in which journal publishers could be sued, such as copyright infringement, plagiarism, research fraud, and defamation, along with ways to minimize the risk of such lawsuits. On the second day, additional sessions were held on topics such as policies on modifying author lists, fast-track publishing processes, and overlay journals.
The keynote session on the third day was titled “Ethics whistleblowers and the responsibilities of journal editors.” Two speakers gave presentations on how editors should deal with serious research ethics violations, such as fabrication of data or figures in published papers, and then discussed the issue with each other. Elisabeth Bik advocated expeditious action in an open manner when clear violations are discovered. On the other hand, Daniel Bolnick of the University of Connecticut pointed out the negative side effects that can occur when such cases are handled openly and suggested that a more cautious approach would be better for academia and journals. I felt that the validation and evaluation of academic papers should be done cautiously, so Bolnick’s arguments made more sense to me.
Among the sessions on the third day, “The ethics of data sharing” was particularly interesting. The first presentation was made by Trevor Lane, a council member of the Committee on Publication Ethics, who summarized the basic principles proposed by the committee in relation to responsible data sharing. The second speaker was Shelly Stall of the American Geophysical Union, who used real examples to discuss the problems that can arise with papers written using publicly available data. The third speaker was Matt Cannon of Taylor & Francis, who presented proper ways to deal with various cases related to data sharing, including the use of data containing personal information. I felt that the issues raised in this session need to be discussed continuously, since situations can arise in which several principles conflict with each other. On the third day, other sessions were held on topics such as author-friendly submission methods and journal management.
I found the 2021 Council of Science Editors annual meeting very useful: numerous presentations on timely topics were delivered efficiently. It was impressive that many of the sessions were designed for professional editors working at journal publishing houses. Even in the midst of a pandemic, I could see that academic journal publishing was being carried out in good health and was developing in new directions.

Conflict of Interest

Kihong Kim has been the editor of Science Editing since 2014, but had no role in the decision to publish this article. No other potential conflicts of interest relevant to this article were reported.

Funding

The author received no financial support for this article.
