Active Learning for Sequence Labeling | This might apply in particular to NER, where long stretches of sentences contain no entity mention at all, or only trivial instances of an entity class that the current model can already predict easily. |
Experiments and Results | By the nature of this task, the sequences (in this case, sentences) are only sparsely populated with entity mentions, and most of the tokens belong to the OUTSIDE class, so SeSAL can be expected to be very beneficial. |
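The benefit described above can be sketched in code: when most tokens are confidently labeled OUTSIDE by the current model, only the few low-confidence tokens need manual annotation. The sketch below is illustrative, not the paper's implementation; the threshold, the confidence values, and the function name `split_tokens` are assumptions.

```python
# Sketch of SeSAL-style token selection: tokens whose model confidence falls
# below a threshold are routed to a human annotator, while the remaining
# (mostly OUTSIDE) tokens keep the current model's own predictions.
# Threshold and example data are illustrative assumptions.

def split_tokens(tokens, confidences, threshold=0.98):
    """Return (manual, auto): indices to hand-label vs. auto-label."""
    manual = [i for i, c in enumerate(confidences) if c < threshold]
    manual_set = set(manual)
    auto = [i for i in range(len(tokens)) if i not in manual_set]
    return manual, auto

# A sparsely populated sentence: only "Smith" is a (hard) entity token;
# confidences are hypothetical per-token marginals from the current model.
sentence = ["Mr", "Smith", "visited", "the", "city", "yesterday", "."]
conf = [0.97, 0.62, 0.999, 0.999, 0.95, 0.999, 0.999]

manual, auto = split_tokens(sentence, conf)
# Only a small fraction of tokens requires manual labeling.
```

With these illustrative numbers, only 3 of 7 tokens would go to the annotator, which is the annotation saving SeSAL exploits on OUTSIDE-dominated sequences.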
Introduction | In the NER scenario, e.g., large portions of the text do not contain any target entity mention at all. |
Summary and Discussion | In our experiments on the NER scenario, those regions were mentions of entity names, or linguistic units whose surface appearance resembled entity mentions but which the model could not yet correctly distinguish. |