Purpose The peer review process is essential for maintaining the quality of scientific publications. However, identifying reviewers who possess the necessary expertise can be challenging. In Open Journal Systems (OJS), a platform widely used by journals, inviting reviewers is most effective when they are already registered in the system. This study aims to improve the efficiency and accuracy of the reviewer selection process to ensure high-quality peer review.
Methods We introduced a process innovation that analyzes users within OJS and recommends potential reviewers with expertise relevant to the manuscript under review. The study collected user data from OJS as the pool of potential reviewers and drew on the Scopus search application programming interface (API). Author records were extracted from the Scopus API to obtain Scopus IDs, which were then used to collect the publication data of potential reviewers. The system matched the previous works of reviewers against the title and abstract of the manuscript using term frequency-inverse document frequency (TF-IDF) weighting and the cosine similarity algorithm.
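The matching step described above can be sketched in a few lines of plain Python. This is a minimal illustration of TF-IDF weighting and cosine similarity, not the study's implementation; the manuscript and reviewer texts are invented placeholders.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def tfidf(docs):
    """Build a TF-IDF vector (dict of term -> weight) for each token list."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Illustrative data: one manuscript and three candidate reviewers' past work.
manuscript = "reviewer recommendation using tfidf and cosine similarity"
candidates = {
    "reviewer_a": "recommender systems and cosine similarity for text ranking",
    "reviewer_b": "soil chemistry of tropical wetlands",
    "reviewer_c": "document similarity with tfidf weighting",
}

vecs = tfidf([tokenize(manuscript)] + [tokenize(t) for t in candidates.values()])
scores = {name: cosine(vecs[0], v) for name, v in zip(candidates, vecs[1:])}
ranking = sorted(scores, key=scores.get, reverse=True)
```

Reviewers whose publications share weighted terms with the manuscript rank first; a candidate with no vocabulary overlap (reviewer_b here) scores zero and falls to the bottom of the list.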
Results The system was evaluated by comparing its recommendations with the assessments made by the editorial team. This evaluation yielded precision, mean average precision, and mean reciprocal rank values of 0.47, 0.77, and 0.87, respectively.
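The three evaluation measures reported above can be computed per manuscript as follows; mean average precision and mean reciprocal rank are then the means of the per-manuscript values. The relevance judgments in this sketch are illustrative, not the study's data.

```python
def precision_at_k(relevant, ranked, k):
    """Fraction of the top-k recommendations judged relevant by the editors."""
    return sum(1 for r in ranked[:k] if r in relevant) / k

def average_precision(relevant, ranked):
    """Mean of the precision values at each rank where a relevant item appears."""
    hits, total = 0, 0.0
    for i, r in enumerate(ranked, start=1):
        if r in relevant:
            hits += 1
            total += hits / i
    return total / len(relevant) if relevant else 0.0

def reciprocal_rank(relevant, ranked):
    """1 / rank of the first relevant recommendation (0 if none appears)."""
    for i, r in enumerate(ranked, start=1):
        if r in relevant:
            return 1.0 / i
    return 0.0

# Illustrative case: editors judged r1 and r3 relevant for one manuscript.
relevant = {"r1", "r3"}
ranked = ["r1", "r2", "r3", "r4"]
```

For this example, precision at 4 is 0.5, average precision is 5/6, and the reciprocal rank is 1.0 because the first recommendation is relevant.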
Conclusion The results demonstrate the system’s ability to provide relevant reviewer recommendations. The system offers significant benefit by assisting editors in identifying suitable reviewer candidates for a manuscript from the existing user database in OJS.
A reporting guideline can be defined as “a checklist, flow diagram, or structured text to guide authors in reporting a specific type of research, developed using explicit methodology.” A reporting guideline outlines the minimum information that must be presented in a research report to provide a transparent and understandable account of what was done and what was found. Many reporting guidelines have been developed, and it has become important to select the most appropriate one for a given manuscript. Herein, I propose an algorithm for the selection of reporting guidelines. This algorithm was developed based on the research design classification system and the content presented for major reporting guidelines through the EQUATOR (Enhancing the Quality and Transparency of Health Research) Network. The algorithm asks 10 questions: “is it a protocol,” “is it secondary research,” “is it an in vivo animal study,” “is it qualitative research,” “is it economic evaluation research,” “is it a diagnostic accuracy study or prognostic research,” “is it quality improvement research,” “is it a non-comparative study,” “is it a comparative study between groups,” and “is it an experimental study?” According to the responses, 16 appropriate reporting guidelines are suggested. Using this algorithm makes it possible to select reporting guidelines rationally and transparently.
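The ten-question algorithm can be sketched as a simple decision chain. The question order follows the abstract, but the guideline name attached to each branch is a common EQUATOR Network association and is my assumption, not the article's exact mapping (the article maps to 16 guidelines in total).

```python
def suggest_guidelines(a):
    """Walk the ten questions in order; `a` maps question keys to booleans.
    Guideline names per branch are assumed common EQUATOR associations,
    not the article's exact table."""
    if a.get("protocol"):
        return "SPIRIT / PRISMA-P"
    if a.get("secondary_research"):
        return "PRISMA"
    if a.get("in_vivo_animal"):
        return "ARRIVE"
    if a.get("qualitative"):
        return "SRQR / COREQ"
    if a.get("economic_evaluation"):
        return "CHEERS"
    if a.get("diagnostic_or_prognostic"):
        return "STARD / TRIPOD"
    if a.get("quality_improvement"):
        return "SQUIRE"
    if a.get("non_comparative"):
        return "CARE"
    if a.get("comparative_between_groups"):
        # An experimental comparison points to a trial guideline.
        return "CONSORT" if a.get("experimental") else "STROBE"
    return "consult the EQUATOR Network library"
```

For example, answering yes only to “is it an in vivo animal study” yields ARRIVE, while a comparative experimental study yields CONSORT.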