Roi Reichart, Katrin Tomanek, Udo Hahn, and Ari Rappoport. 2008. Multi-Task Active Learning for Linguistic Annotations. In Proceedings of ACL-08: HLT, pages 861–869.
In the multi-task active learning (MTAL) paradigm, we select examples for several annotation tasks rather than for a single one, as is usually done in the context of active learning (AL). It is shown that MTAL outperforms random selection and a stronger baseline, one-sided example selection, in which one task is pursued using AL and the selected examples are also provided to the other task.
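To make the baseline concrete, here is a minimal sketch of one-sided example selection against the random baseline, assuming entropy-based uncertainty sampling as the single-task AL criterion; the toy pool, the model posteriors, and all function names are illustrative, not taken from the paper.

```python
import numpy as np

def entropy(probs):
    """Predictive entropy per example; probs has shape (pool_size, n_classes)."""
    return -(probs * np.log(probs + 1e-12)).sum(axis=1)

def one_sided_selection(probs_task_a, batch_size):
    """One-sided selection: rank the unlabeled pool by task A's uncertainty
    only; the chosen examples are then annotated for *both* tasks."""
    return np.argsort(-entropy(probs_task_a))[:batch_size]

def random_selection(pool_size, batch_size, seed=0):
    """Random baseline: draw examples uniformly from the unlabeled pool."""
    return np.random.default_rng(seed).choice(pool_size, batch_size, replace=False)

# Toy pool: fake class posteriors from a task-A model over 3 labels.
probs_a = np.random.default_rng(42).dirichlet(np.ones(3), size=100)
print(one_sided_selection(probs_a, batch_size=5))    # most uncertain for task A
print(random_selection(pool_size=100, batch_size=5))
```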
One candidate MTAL selection strategy is to directly estimate the joint label probability (a sketch follows this list):
- Recognizes the correlation between labels.
- Needs more labeled examples.
- Open question: what if the number of tasks is large?
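A hedged sketch of what directly estimating the joint label probability buys and costs, for two tasks: the joint entropy captures label correlation that summed per-task entropies miss, but the joint table grows exponentially in the number of tasks. The toy distribution below is illustrative.

```python
import numpy as np

def joint_entropy(joint):
    """Entropy of a joint distribution over (task-1 label, task-2 label) pairs."""
    p = joint.ravel()
    return -(p * np.log(p + 1e-12)).sum()

def sum_of_marginal_entropies(joint):
    """Each task's marginal entropy, summed; ignores label correlation."""
    h = lambda p: -(p * np.log(p + 1e-12)).sum()
    return h(joint.sum(axis=1)) + h(joint.sum(axis=0))

# Perfectly correlated labels: knowing task 1's label fixes task 2's.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(joint_entropy(joint))              # ~0.69 nats (one coin flip)
print(sum_of_marginal_entropies(joint))  # ~1.39 nats: overestimates uncertainty
# For k tasks with c labels each, the joint table has c**k cells,
# which is why direct estimation becomes impractical as k grows.
```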
Multi-task learning, in which several tasks are jointly learned by a single model, allows NLP models to share information from multiple annotations and may facilitate better predictions when the tasks are inter-related.
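A minimal sketch of the shared-model setup that makes such information sharing possible: one encoder updated by every task's loss, plus one small head per task. The papers use pre-trained Transformers; a tiny LSTM encoder stands in here so the example is self-contained, and all names and sizes are assumptions.

```python
import torch
import torch.nn as nn

class MultiTaskTagger(nn.Module):
    """One shared encoder with one lightweight head per task: the encoder is
    updated by every task's loss, so the tasks share information."""
    def __init__(self, vocab_size, hidden=128, labels_per_task=(5, 3)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.heads = nn.ModuleList([nn.Linear(hidden, n) for n in labels_per_task])

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))
        return [head(states) for head in self.heads]  # one logits tensor per task

model = MultiTaskTagger(vocab_size=1000)
logits_t1, logits_t2 = model(torch.randint(0, 1000, (2, 7)))  # batch 2, length 7
print(logits_t1.shape, logits_t2.shape)  # (2, 7, 5) and (2, 7, 3)
```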
In this paper, we are the first to systematically explore MT-AL for large pre-trained Transformer models. Naturally, our focus is on closely related NLP tasks.
In this paper, we introduce an active learning framework consisting of a data selection strategy that identifies the most informative unlabeled samples and a ...
Our results suggest that MT-AL can be effectively used to minimize annotation efforts for multi-task NLP models.
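As a closing illustration, one simple way to turn per-task uncertainty scores into a single multi-task acquisition order is to average each example's rank across tasks; this is a generic heuristic for exposition, not necessarily the selection strategy proposed in the paper.

```python
import numpy as np

def aggregate_by_rank(per_task_scores, batch_size):
    """Average each example's uncertainty rank across tasks (rank 0 = most
    uncertain), then select the examples that are uncertain for all tasks."""
    ranks = [np.argsort(np.argsort(-s)) for s in per_task_scores]
    return np.argsort(np.mean(ranks, axis=0))[:batch_size]

rng = np.random.default_rng(0)
scores = [rng.random(100), rng.random(100)]  # fake per-task uncertainty scores
print(aggregate_by_rank(scores, batch_size=5))
```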