
BioCreative VII

Track 4 - COVID-19 text mining tool interactive demo [2021-04-06]

Note for BioCreative participants: To register for a track, please use the Google form.
Do not use the "Team page" tab, as it is non-functional.

COVID-19 text mining tool interactive demo

Introduction
The BioCreative Interactive Text Mining Track (IAT) was introduced in BioCreative III [22151968] and has served as a means to observe the approaches, standards, and functionalities of state-of-the-art text mining systems for specific applications. The track gives system developers the opportunity to receive detailed feedback and functionality testing directly from a variety of end users.
In light of the COVID-19 pandemic, there has been a rapid increase in SARS-CoV-2-related research, accompanied by the emergence of a number of literature datasets [33166392][CORD-19], bioinformatics resources [33147627], and text mining tools/systems [33279995] developed specifically for SARS-CoV-2 and COVID-19 research. The motivation for this track is to give developers a more formal evaluation of the usefulness of their tools through user-in-the-loop feedback.
In addition, the track aims to promote text mining web applications conceived to support COVID-19 research. As in previous interactive tasks (e.g., [27589961]), these TM systems will be reviewed by appropriate users, who will provide feedback on effectiveness and usability.

Task Description
The interactive track is a demonstration task and follows a format similar to previous IAT tracks, which have focused on exploring interactions specifically between biocurators and web/API interfaces with a text mining back-end. The effectiveness and usability of TM systems, as well as the generation of use cases, have been the focal points of previous IAT tracks. However, the COVID-19 interactive track focuses on a broader set of applications of interest to a wider range of users, including bench biologists, biomedical researchers, clinicians, pharma, biocurators, policy makers and funders, as well as the public at large. Selected TM tools will be evaluated remotely, and results will be reported at the BioCreative workshop. The organizers will recruit appropriate users to evaluate the TM tools.

Participating teams should submit a 2-page document by June 7, 2021 describing the system and the tasks that it can perform, provide a URL for accessing the system (it does not need to be the final version), and address the following aspects:

  • What class of TM tool is this (e.g., discovery, question answering, hypothesis generation, information retrieval, curation, topic clustering, relation extraction)?
  • Relevance and Impact: The teams should clearly describe: i) the target user community; ii) interoperability (e.g., input and output formats, standards adopted); iii) example use cases for the application.
  • Data: What are the sources of data? What is the update frequency?
  • User Interactivity: We are asking for web-based text mining systems with user interactivity (such as highlighting, sorting, filtering, editing, and exporting results).
  • System Performance: Report on any benchmarking performed.
  • The web server must be functional, contain help pages or tutorials, and browser compatibility must be indicated.
  • Proposals will be chosen based on their relevance to COVID-19 research and the reported maturity of the system. Teams interested in participating in this track should register here.

Send your submission via email to biocreativechallenge_AT_gmail.com with the subject: TRACK4 proposal. In the proposal, please indicate your name and team name and, if you already know the composition of your team, provide their names. Teams will be informed of acceptance by June 20, 2021.

User Recruitment

To make this activity successful, it is essential to recruit users who are representative of the target user community. We expect teams to be familiar with their user community and to recruit/commit two users external to the developer team. The names and email addresses of these users should be provided in the system document. Additional users will be recruited by the organizers.

User Feedback
In the IAT track activity, systems will be reviewed by users, who will perform basic tasks (both free-form and pre-defined) and report on their success via a survey. The user survey covers six main topics: (1) overall reaction; (2) the system's ability to help complete tasks; (3) design of the application; (4) learning to use the application; (5) usability of the system; and (6) whether the user would recommend the system.
Our goal in organizing this task is to provide developers with detailed user feedback, expose users to new tools, and expand user adoption of text mining tools. We expect participation in the IAT task to create new collaborations between text miners and users. The usability test will give the participating teams insight into their interfaces, and the feedback can be used to improve the user-system experience.
Page for user review available here: Demo page for users

Track timeline

  • Registration: now closed
  • Proposal submission: June 7, 2021
  • Notification of acceptance to track: June 20, 2021
  • System ready for evaluation and 2-page technical system description paper due: July 23, 2021
  • User testing and feedback report returned to teams: October 5, 2021
  • Short technical system description paper due: October 15, 2021
  • Reviews sent to teams: October 20, 2021
  • Final revised version due: October 28, 2021
  • Demo at workshop: November 2021

Track organizers

  • Cecilia Arighi, University of Delaware, USA
  • Andrew Chatr-Aryamontri, University of Montreal, Canada
  • Lynette Hirschman, MITRE Corporation, USA
  • Tonia Korves, MITRE Corporation, USA
  • Martin Krallinger, Barcelona Supercomputing Center, Spain
  • Karen Ross, Georgetown University Medical School, USA