In this paper, we consider the question of whether it is possible to pre-train a bilingual model for two remotely related languages without compromising ...
Language models based on deep neural networks have facilitated great advances in natural language processing and understanding tasks in recent years.
Here, we take steps towards closing various parts of the gap between languages with dedicated deep neural models, ones that share capacity with others in a ...
The book introduces key notions in minimalism and distributed morphology, making them accessible to readers with different scholarly foci. This book is of ...
We propose two methods to learn cross-lingual language models (XLMs): one unsupervised that only relies on monolingual data, and one supervised that leverages ...
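The snippet above breaks off after naming the two XLM training regimes. As a hedged illustration only, the sketch below shows the kind of BERT-style token masking commonly used for the unsupervised (masked language modeling) variant; the function name mask_tokens, the mask id, and the vocabulary size are assumptions made for this example and are not taken from the paper.

```python
import random

MASK_ID = 103        # assumed mask-token id, illustrative only
VOCAB_SIZE = 30000   # assumed vocabulary size, illustrative only

def mask_tokens(token_ids, mask_prob=0.15, seed=None):
    """BERT-style corruption for a masked-LM objective: roughly 15% of
    positions become prediction targets; of those, 80% are replaced by
    MASK_ID, 10% by a random id, and 10% are left unchanged."""
    rng = random.Random(seed)
    inputs, targets = list(token_ids), [None] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            targets[i] = tok                 # the model must recover this token
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = MASK_ID
            elif roll < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)
            # otherwise keep the original token as the input
    return inputs, targets

if __name__ == "__main__":
    corrupted, labels = mask_tokens([101, 2023, 2003, 1037, 7953, 102], seed=0)
    print(corrupted)
    print(labels)
```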
This paper describes ongoing work on a new approach for language modeling for large vocabulary continuous speech recognition. Almost all state-of-the-art ...
To address issues of cross-linguistic access to such models and reduce energy consumption for sustainability during large-scale model training, this study ...
In this section, we present a multimodal in-context framework that can seamlessly integrate the strengths of language models with the specific requirements of ...