
Science Editing

Meeting Report
Report on the Crossref LIVE19 annual meeting
Jae Hwa Chang
Science Editing 2020;7(1):82-84.
DOI: https://doi.org/10.6087/kcse.197
Published online: February 20, 2020

Infolumi, Seongnam, Korea

Correspondence to Jae Hwa Chang jhchang@infolumi.co.kr
• Received: February 1, 2020   • Accepted: February 4, 2020

Copyright © 2020 Korean Council of Science Editors

This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Date: November 13–14, 2019
Venue: Tobacco Theater, Amsterdam, the Netherlands
Theme: Have your say
Organizer: Crossref
URL: https://www.crossref.org/crossref-live-annual/
Crossref’s 2019 annual meeting was held in Amsterdam, the Netherlands, on November 13 and 14. At this meeting, Crossref celebrated its 20th anniversary and took the opportunity to look back on the last 20 years and to explore, together with the participants, the directions that Crossref should take in the rapidly changing environment of scholarly research and communications (Fig. 1).
The first day of the event began with a glimpse of Crossref’s history from its inception to today, presented by Ed Pentz, Crossref’s Executive Director. This was followed by a presentation entitled “Perceived value of Crossref” by Ginny Hendricks, Director of Member and Community Outreach. She explored perceptions of Crossref, its mission, and its services through telephone interviews and surveys. Based on the results, she presented Crossref’s value in the field of scholarly communications as an infrastructure organization, metadata distributor, open scholarship supporter, and community hub. Next was Ed Pentz’s presentation on “Strategic scene-setting.” He briefly addressed the following issues: the evolution of Similarity Check services, adding preprint as a content type, membership growth by fee tier by year, the distribution of revenues (comparing 2011 to 2019), total registered content distribution by annual membership fee tier, average time spent by constituency, and income and expense history during 2010–2019.
The next session was composed of “In their own words” talks, for which members were divided into five groups: large publishers, medium publishers, small publishers, research funders, and academic groups. Representatives of each group talked about Crossref’s role and their desires for Crossref’s future vision. One of the most impressive talks was “Crossref’s value in an era of open science,” presented by Todd Toler of Wiley. He defined Crossref’s value in four ways: first, Crossref is in a unique position to solve major problems in the field of research communications; second, publishers are in greater need of cooperation, shared infrastructure, and standards than ever before; third, Crossref’s revenue growth is driven by content deposition fees, but its ambitions are increasingly driven by the goal of providing infrastructure beyond its core linking service; and fourth, the sustainability of Crossref’s future growth depends on aligning its roadmap with its funding sources.
Another interesting talk was “Researcher and metadata user view” by Ludo Waltman from the University of Leiden, who represented the perspective of academic researchers. He introduced examples of how Crossref’s data are used in academic fields, and his recommendations resonated with many participants: first, it is necessary to ensure that the basic infrastructure works well; second, Crossref should work together with publishers to increase the completeness of metadata (abstracts, affiliations, license data, etc.) and participate in initiatives to improve and enrich metadata; and third, fair models should be developed for funding and sustaining such initiatives.
On day 2, a roundtable discussion continued throughout the day. The participants were divided into 11 discussion groups of about 10 people each. They discussed topics based on an examination of the 2018–2019 annual report, entitled “Fact File” [1]. The process for each topic was as follows. Each participant wrote three critical points, one on each of three sticky notes, then summarized the three items aloud, one by one, placing each note in the middle of the table. Notes expressing the same opinion were stacked together, and the stacks were counted; the opinion with the most sticky notes was identified as the representative opinion of the group. After this exchange of viewpoints, the facilitator of each group presented that opinion to all participants. Each group created its presentation in Google Slides, and the session organizer shared the updated files in real time with all participants. There was no single correct answer in the discussion, but it provided an opportunity to hear what the other participants thought and to share one’s own views.
The discussion consisted of three topics. The first was “What is our mission and who do we serve?” The 11 groups articulated several different opinions in response to this question. Fig. 2 presents the opinions of my group. The opinions on “What is our mission” were as follows: 1) a critical function is missing; 2) it is better to describe the mission as “effective and efficient” than as “new and innovative technologies”; and 3) the definition of “community” was not clear. The opinions on “Who do we serve” were as follows: 1) the expansion of board representation in response to changes in scholarly communications is positive; 2) in the future, more diverse board configurations should consider personal characteristics, organizations, age, gender, the global north/south, and specific aspects of candidate board members; 3) metadata consumers are missing from committees; and 4) too much staff time is dedicated to large publishers.
The theme of the second session was, “How are we sustained?” First, for the question, “Does anything surprise you about Crossref’s revenue streams?”, the members of my group identified the following factors: 1) the Similarity Check change in 2020; 2) the jump in revenue from the $275 category, which accounted for the highest proportion of total revenue in 2019; and 3) the 10-to-1 imbalance in the relationship between content registration and metadata users. Next, the answers to the question, “If there was one thing you could change about Crossref’s revenue streams, what would it be?” were as follows: 1) more revenue should be requested from metadata users (not for data, but for services); and 2) a more in-depth analysis of the $275 category is needed.
The topic of the third session was how to find a balance between members and the community, as reflected by the question “How should our priorities change?” We discussed what to consider first out of four categories: “Simplify and enrich existing services,” “Improve our metadata,” “Adapt to expanding constituencies,” and “Collaborate and partner.” The opinions gathered in my group are presented in Fig. 2. The opinions of the 11 groups are summarized in Table 1. The interest in the ROR (Research Organization Registry) was the highest, followed by the need for metadata principles and best practices and the rapid growth of membership in countries where English is not spoken as the first language.
The group discussion process, which was dynamic and involved the use of exciting tools, promoted active participation by all attendees. This annual meeting was an opportunity for participants of various backgrounds—with different membership types and affiliations, and from different countries—to experience multiple aspects of Crossref and to deepen their understanding of Crossref. Based on the invaluable opinions of the participants, I look forward to the ongoing development of Crossref, which has served as crucial infrastructure for sharing, preserving, and evaluating research information.

Conflict of Interest

Jae Hwa Chang is the Crossref Ambassador in Korea; otherwise, no potential conflict of interest relevant to this article was reported.

Fig. 1.
Participants in the meeting room of Crossref LIVE19 (photo provided by Crossref).
Fig. 2.
Opinion board on Crossref’s priorities from the author’s discussion group.
Table 1.
Ordered table of opinions on Crossref's priorities from 11 discussion groups
Priorities Group 1 Group 2 Group 3 Group 4 Group 5 Group 6 Group 7 Group 8 Group 9 Group 10 Group 11 Cumulative votes
ROR 4 7 3 4 - 4 4 - 6 3 3 38
Metadata best practices/principles including health checks 7 6 2 - 2 2 - 3 5 2 5 34
Non-English issues - - 2 4 3 4 2 8 6 5 - 34
Technical and operational debt - 8 6 - 5 7 - 7 - - - 33
Schema updates including JATS/CRediT/Schema.org - 6 - 4 - - 11 4 - - 4 29
Funder outreach - - 13 3 - - 2 - - 3 2 23
Data citations - 4 3 - - - - - 3 7 - 17
Public metadata feedback loop 7 - - 3 4 - - 2 - - - 16
DUL with COUNTER 2 3 3 5 - - - - - - 1 14
Joint search with DataCite through FREYA - 3 1 2 3 - - - 3 - 2 14
Foundational infrastructure with ORCID & DataCite 3 - - - - - - 4 - 5 - 12
EPEC with COPE, DOAJ, INASP - 5 - - - - - - 4 2 - 11
Ambassador program 3 - - 2 - 3 - - - 1 1 10
Participation reports phase 2/improvements/support - - - - - - 9 - - - - 9
Preflight checks - - - - - 9 - - - - - 9
Sponsor program - 3 - - 1 - - - 4 - 1 9
REST API - - - - - - - 6 - - 2 8
Citation classification - - - - - - - - 7 - - 7
Crossref (DOI display) widget - - - - - - 3 - 4 - - 7
Incentivize (fees)/advocate for more robust metadata - - - - 7 - - - - - - 7

Data provided by Crossref.

ROR, Research Organization Registry; JATS, Journal Article Tag Suite; DUL, distributed usage logging; EPEC, Emerging Publisher Education Coalition; COPE, Committee on Publication Ethics; DOAJ, Directory of Open Access Journals; INASP, International Network for the Availability of Scientific Publications.


References



