Computational terminology and filtering of terminological information: Introduction to the special issue
Article outline
- Content of this special issue
- Acknowledgements
- References
References
Bengio, Yoshua, Réjean Ducharme, and Pascal Vincent. 2001. "A Neural Probabilistic Language Model." In Advances in Neural Information Processing Systems 13, ed. by Todd K. Leen, Thomas G. Dietterich, and Volker Tresp, 932–938. Cambridge: MIT Press.
Bengio, Yoshua, Réjean Ducharme, Pascal Vincent, and Christian Jauvin. 2003. "A Neural Probabilistic Language Model." Journal of Machine Learning Research 3: 1137–1155.
Hinton, Geoffrey E. 1986. "Learning Distributed Representations of Concepts." In Proceedings of the Eighth Annual Conference of the Cognitive Science Society, Amherst, Massachusetts, 1–12. Hillsdale: Erlbaum.
Kilgarriff, Adam, Frieda Charalabopoulou, Maria Gavrilidou, Janne B. Johannessen, Saussan Khalil, Sofie J. Kokkinakis, Robert Lew, Serge Sharoff, Ravikiran Vadlapudi, and Elena Volodina. 2014. "Corpus-based Vocabulary Lists for Language Learners for Nine Languages." Language Resources and Evaluation 48(1): 121–163.
Mikolov, Tomas, Kai Chen, Greg Corrado, and Jeffrey Dean. 2013. Efficient Estimation of Word Representations in Vector Space. arXiv:1301.3781. ([URL]). Accessed 8 February 2018.
Young, Tom, Devamanyu Hazarika, Soujanya Poria, and Erik Cambria. 2017. Recent Trends in Deep Learning Based Natural Language Processing. arXiv:1708.02709v4. ([URL]). Accessed 8 February 2018.
Cited by 2 other publications
Rigouts Terryn, Ayla, Véronique Hoste & Els Lefever. 2020. "In no uncertain terms: a dataset for monolingual and multilingual automatic term extraction from comparable corpora." Language Resources and Evaluation 54:2, pp. 385 ff.
This list is based on CrossRef data as of 8 April 2024. Please note that it may not be complete. Sources presented here have been supplied by the respective publishers. Any errors therein should be reported to them.