Implementation of evaluation measures for Named Entity Recognition task for use in the Matriciel project (PEPS)

cvbrandoe/NEREval

Presentation slides: https://docs.google.com/presentation/d/1WMa8Sc1RudVV2qeW7YxiXan1WUEWk_FISczUQA62bPE/edit?usp=sharing

NER Evaluation

This repository contains several source-code components for use in the Matriciel project (PEPS CNRS), in particular:

  • an implementation of evaluation measures for the Named Entity Recognition task (classification is not considered), following the state of the art (Nouvel et al. 2015). The algorithm aligns named-entity annotations (and their context) given two input texts.

  • a client for annotating named entities in French texts using DBpedia Spotlight (either the remote public server or your own server).
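In the simplest strict-matching case, alignment-based evaluation reduces to span-level precision, recall, and F1 over annotation offsets. The following is a minimal sketch of that idea, not the repository's actual algorithm (function and variable names are illustrative):

```python
def span_prf(gold, pred):
    """Strict-span precision/recall/F1 over (start, end) offset pairs.

    Simplified sketch: entity classification is ignored (as in the
    project), and a predicted span counts as correct only when its
    offsets exactly match a gold span.
    """
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)  # exact-match true positives
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

The repository's algorithm additionally aligns annotations with their context between two input texts, so partial and shifted matches can be handled more finely than this exact-match sketch.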
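For the Spotlight client, a typical request targets the `/annotate` endpoint with `text` and `confidence` parameters and asks for JSON via the `Accept` header. Below is a hedged sketch of how such a request could be built; the helper name is hypothetical and the public endpoint URL is an assumption that may differ from the server the project uses:

```python
from urllib.parse import urlencode

def build_annotate_request(text, lang="fr", confidence=0.5,
                           base="https://api.dbpedia-spotlight.org"):
    """Return (url, headers) for a DBpedia Spotlight /annotate request.

    Hypothetical helper: `base` defaults to the public Spotlight API,
    but can point to a self-hosted server instead.
    """
    query = urlencode({"text": text, "confidence": confidence})
    url = f"{base}/{lang}/annotate?{query}"
    headers = {"Accept": "application/json"}  # request JSON output
    return url, headers
```

Pointing `base` at your own Spotlight instance covers the "own server" case mentioned above.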

For any inquiries, please contact us.

Magali Capeyron, Catherine Domingues, Carmen Brando
