
    TENET - Terminology Extraction using Net Extension by (semantic) Transduction


    This tool exploits an intermediate semantic representation (UNL-RDF graphs) to construct an ontology representation of NL sentences. [TODO: complete]

    The processing is carried out in the following stages:

    1. Initialization: TODO.
    2. UNL Sentence Loading: TODO.
    3. Transduction Process: the UNL-RDF graphs are extended to obtain semantic nets.
    4. Classification / Instantiation
    5. Reasoning

    [TODO: complete the description]
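The staged pipeline above can be sketched as follows; the function names and data shapes are illustrative assumptions, not TENET's actual API:

```python
# Hypothetical sketch of the five-stage pipeline described above.
# Function names and data structures are illustrative assumptions only.

def initialize():
    """Stage 1: set up working structures (details TBD in this README)."""
    return {"graphs": []}

def load_unl_sentences(state, sentences):
    """Stage 2: load UNL sentences as (placeholder) UNL-RDF graphs."""
    state["graphs"].extend({"sentence": s, "nets": []} for s in sentences)

def transduce(state):
    """Stage 3: extend each UNL-RDF graph into semantic nets."""
    for g in state["graphs"]:
        g["nets"].append(f"net({g['sentence']})")

def classify_and_instantiate(state):
    """Stage 4: derive ontology classes/instances from the semantic nets."""
    state["ontology"] = [net for g in state["graphs"] for net in g["nets"]]

def reason(state):
    """Stage 5: apply reasoning over the resulting ontology."""
    return state["ontology"]

state = initialize()
load_unl_sentences(state, ["The cat sleeps."])
transduce(state)
classify_and_instantiate(state)
print(reason(state))  # → ['net(The cat sleeps.)']
```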

    1 - Implementation

    This implementation is written in Python, with UNL as the pivot structure.

    [TODO: talk about UNL-RDF graph (obtained using UNL-RDF schemas)]

    The following module is included as the main process:

    1. Semantic Transduction Process (stp) for semantic analysis with transduction schemes

    The Python script `tenet.py` is used to manage the tool's commands, using components from the `scripts` directory. The data to be processed must be placed in the `corpus` directory. All working data, including the results, is processed in the `workdata` directory.
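As a rough illustration of this layout, a command dispatcher over the `corpus` and `workdata` directories might look like the sketch below; the subcommand name, option, and behaviour are assumptions, not the actual `tenet.py` interface:

```python
# Hypothetical sketch of a tenet.py-style command dispatcher.
# The "extract" subcommand and its behaviour are illustrative assumptions.
import argparse
from pathlib import Path

CORPUS_DIR = Path("corpus")      # input data to be processed
WORKDATA_DIR = Path("workdata")  # working data, including results

def run_extraction(args):
    """Process each matching corpus file into the working directory."""
    WORKDATA_DIR.mkdir(exist_ok=True)
    for source in sorted(CORPUS_DIR.glob(args.pattern)):
        print(f"processing {source} -> {WORKDATA_DIR / source.name}")

def main(argv=None):
    parser = argparse.ArgumentParser(prog="tenet.py")
    sub = parser.add_subparsers(dest="command", required=True)
    extract = sub.add_parser("extract", help="run the transduction process")
    extract.add_argument("--pattern", default="*.rdf")
    extract.set_defaults(func=run_extraction)
    args = parser.parse_args(argv)
    args.func(args)

if __name__ == "__main__":
    main()
```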

    The transduction process configuration includes an ontology definition for the semantic nets, and several transduction schemes expressed as SPARQL requests.
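For instance, a transduction scheme might take the shape of a SPARQL CONSTRUCT request like the sketch below; the prefixes and property names are placeholders, not TENET's actual vocabulary:

```sparql
# Illustrative sketch of a transduction scheme: a CONSTRUCT request that
# extends a UNL-RDF graph with a semantic-net triple.
# All prefixes and property names here are assumptions.
PREFIX unl: <http://unl.example.org/rdf#>
PREFIX net: <http://tenet.example.org/semantic-net#>

CONSTRUCT {
  ?node net:hasClassCandidate ?word .
}
WHERE {
  ?node unl:isAttributeOf ?other ;
        unl:hasWord ?word .
}
```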

    2 - Environment Setup

    [TODO: external module sources?]

    The Python code has been tested on Python 3.7. All dependencies are listed in `requirements.txt`; they cover the external modules used.

    The input directories contain evaluation files with some test corpora.

    3 - Execution

    The application runs in a terminal via the `tenet.py` script: `python3 tenet.py`.

    This prototype was tested with a standard computer configuration. The processing time is reasonable for all processing steps.