HURON: A Quantitative Framework for Assessing Human Readability in Ontologies

The increasing use of ontologies requires their quality assurance. Ontology quality assurance consists of a set of activities that allow analyzing the ontology, identifying strengths and weaknesses, and proposing improvement actions. Human readability is a quality aspect that improves the use and reuse of ontologies. Human-readable content refers to the natural language content consumed by humans and by the growing number of embedding methods applied to ontologies. The ontology community has proposed best practices for human readability, but there is no standardized framework for its evaluation. We aim to provide a framework for analyzing human readability based on quantitative metrics to support ontology developers’ decisions. We present the HURON framework, which consists of the specification of five quantitative metrics related to the human readability of ontology content and a software tool that implements them. The metrics take into account the number of names, descriptions, or synonyms, and also assess the application of systematic naming conventions and the ‘lexically suggest, logically define’ principle. Target values are provided for each metric to help interpret them. HURON can also be used to assess compliance with best practices. We have applied HURON to a representative set of biomedical ontologies, the OBO Foundry repository. The results showed that, in general, the OBO Foundry ontologies comply with the expected number of descriptions and names in their classes, and the lexical and the semantically formalized contents are aligned. However, most of the ontologies did not follow a systematic naming convention. In general, the ontologies in this repository show adherence to some of the best practices, although areas for improvement were identified. A number of recommendations are made for ontology developers and users.
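
HURON's metric definitions and target values are specified in the article itself; purely as a rough illustration of the kind of quantitative readability check the abstract describes, the Python sketch below (not the authors' tool; it assumes the rdflib library, a hypothetical local file ontology.owl in RDF/XML, and the OBO Foundry convention of using IAO_0000115 for textual definitions) computes the fraction of named classes that carry an rdfs:label and the fraction that carry a textual definition.

    # Minimal sketch, NOT the HURON implementation: simple label/definition
    # coverage ratios for an OWL ontology, as a stand-in for the kind of
    # quantitative check HURON formalizes.
    from rdflib import Graph, Namespace, OWL, RDF, RDFS, URIRef

    OBO = Namespace("http://purl.obolibrary.org/obo/")
    DEFINITION = OBO.IAO_0000115  # 'textual definition' annotation property used by OBO Foundry ontologies

    def readability_ratios(path):
        """Return the share of named classes with an rdfs:label and with a textual definition."""
        g = Graph()
        g.parse(path, format="xml")  # assuming RDF/XML; adjust the format for Turtle, etc.
        # Count only named (IRI) classes; anonymous class expressions are skipped.
        classes = {c for c in g.subjects(RDF.type, OWL.Class) if isinstance(c, URIRef)}
        if not classes:
            return {"classes": 0, "label_ratio": 0.0, "definition_ratio": 0.0}
        labelled = sum(1 for c in classes if (c, RDFS.label, None) in g)
        defined = sum(1 for c in classes if (c, DEFINITION, None) in g)
        return {
            "classes": len(classes),
            "label_ratio": labelled / len(classes),
            "definition_ratio": defined / len(classes),
        }

    print(readability_ratios("ontology.owl"))  # hypothetical local ontology file

Such coverage ratios capture only the counting aspect mentioned in the abstract; HURON's five metrics additionally cover synonyms, systematic naming conventions, and the alignment between lexical and logically formalized content.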

Bibliographic Details
Main Authors: Francisco Abad-Navarro (ORCID: 0000-0003-0201-3115), Catalina Martinez-Costa (ORCID: 0000-0003-1857-1744), Jesualdo Tomas Fernandez-Breis (ORCID: 0000-0002-7558-2880)
Affiliation (all authors): Departamento de Informática y Sistemas, Universidad de Murcia, CEIR Campus Mare Nostrum, IMIB-Arrixaca, Murcia, Spain
Format: Article
Language: English
Published: IEEE, 2023-01-01
Series: IEEE Access, vol. 11, pp. 101833-101851
DOI: 10.1109/ACCESS.2023.3316512
ISSN: 2169-3536
Subjects: Knowledge engineering; ontologies; quality assurance; readability metrics; semantic web
Collection: DOAJ (Directory of Open Access Journals), record doaj.art-5139c4ccc95f40f2b243ee7fd4d19396
Online Access: https://ieeexplore.ieee.org/document/10254200/