ontoScorer

    ontoScorer is a Python library for evaluating semantic extraction processes. It compares an ontology produced by an extraction process against a reference ontology and reports metrics that quantify the quality of the extraction.

    Features

    Loading Ontologies: Load ontologies from local files or URLs.

    Ontology Manipulation: Explore the classes, individuals, relations, and hierarchical structure of ontologies.

    Ontology Comparison: Compare two ontologies based on classes, individuals, relations, and hierarchical structure, and obtain standard evaluation measures such as precision, recall, and F1 score (see the sketch after this list).

    Report Generation: Generate evaluation reports containing all the calculated measures.
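
    The evaluation measures follow the standard set-based definitions of precision, recall, and F1. The sketch below shows how such scores are computed for a single entity type; it only illustrates the definitions and is not the library's internal implementation:

    from typing import Set, Tuple

    def precision_recall_f1(extracted: Set[str], reference: Set[str]) -> Tuple[float, float, float]:
        # True positives: entities present in both the generated and reference ontologies.
        tp = len(extracted & reference)
        precision = tp / len(extracted) if extracted else 0.0
        recall = tp / len(reference) if reference else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        return precision, recall, f1

    # Example: class names from a generated ontology vs. the reference.
    p, r, f1 = precision_recall_f1({"Person", "City", "Event"}, {"Person", "City", "Organization"})
    print(f"precision={p:.2f} recall={r:.2f} f1={f1:.2f}")  # precision=0.67 recall=0.67 f1=0.67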

    Environment Setup

    The Python code has been tested with Python 3.7 on Manjaro Linux, but it should run on most common systems (Linux, Windows, macOS). All dependencies are listed in requirements.txt.

    We recommend creating a virtual environment before proceeding:

    1. Create a virtual environment: python3 -m venv env

    2. Activate the virtual environment: source env/bin/activate

    3. Install the dependencies: pip install -r requirements.txt

    Usage

    Here's an example of how to use ontoScorer:

    from ontoScorer import OntoScorer
    
    # Initialize the scorer
    scorer = OntoScorer("path/to/reference_ontology.ttl", "path/to/generated_ontology.ttl")
    
    # Compare the ontologies
    scorer.compare()
    
    # Print the scores
    scorer.print_scores()
    
    # Generate a report
    scorer.generate_report("path/to/report.txt")
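
    Ontologies can be loaded from URLs as well as local files (see Features); the variant below assumes the same constructor accepts a URL in place of a local path (the URL shown is hypothetical):

    from ontoScorer import OntoScorer

    # Assumption: per the Features list, a URL can stand in for a local path;
    # the reference URL below is hypothetical.
    scorer = OntoScorer(
        "https://example.org/reference_ontology.ttl",
        "path/to/generated_ontology.ttl",
    )
    scorer.compare()
    scorer.print_scores()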