Top Research Papers on Language Learning
Explore the top research papers on Language Learning right here. These insightful studies delve into various aspects of language acquisition, from the cognitive processes involved to effective teaching methodologies. Whether you're a researcher, educator, or simply enthusiastic about learning new languages, these papers provide valuable knowledge and inspiration to advance your understanding.
The role of the first language in foreign language learning
135 Citations · 2020 · Paul Nation
journal unavailable
No description supplied
Chatbots for language learning—Are they really useful? A systematic review of chatbot‐supported language learning
660 Citations · 2021 · Weijiao Huang, Khe Foon Hew, Luke K. Fryer
Journal of Computer Assisted Learning
Abstract Background The use of chatbots as learning assistants is receiving increasing attention in language learning due to their ability to converse with students using natural language. Previous reviews mainly focused on only one or two narrow aspects of chatbot use in language learning. This review goes beyond merely reporting the specific types of chatbot employed in past empirical studies and examines the usefulness of chatbots in language learning, including first language learning, second language learning, and foreign language learning. Aims The primary purpose of this review is to di...
ChatGPT for Language Teaching and Learning
881 Citations · 2023 · Lucas Kohnke, Benjamin Luke Moorhouse, Di Zou
RELC Journal
The digital competencies teachers and learners require to use this chatbot ethically and effectively to support language learning are presented.
Learning Vocabulary in Another Language
911 Citations · 2022 · Paul Nation
Cambridge University Press eBooks
Vocabulary is now well recognized as an important focus in language teaching and learning. Now in its third edition, this book provides an engaging, authoritative guide to the teaching and learning of vocabulary in another language. It contains descriptions of numerous vocabulary learning strategies, which are supported by reference to experimental research, case studies, and teaching experience. It also describes what vocabulary learners need to know to be effective language users. This new edition has been updated to incorporate the wealth of research that has come out of the past decade. It...
Statistical Language Learning in Infancy
100 Citations · 2020 · Jenny R. Saffran
Child Development Perspectives
A brief review of the infant statistical language learning literature is presented, and broader questions concerning why infants are sensitive to statistical regularities are raised.
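A statistic central to this literature is the transitional probability between adjacent syllables: within-word transitions are reliable, while transitions across word boundaries dip, giving infants a cue for segmentation. A minimal sketch of that computation (the syllable stream and "words" below are invented for illustration):

```python
from collections import Counter

def transitional_probabilities(syllables):
    """P(next syllable | current syllable) for each adjacent pair."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# A stream built from two invented "words" (ba-bu and ti-go) in varying order.
stream = ["ba", "bu", "ti", "go", "ti", "go", "ba", "bu", "ba", "bu", "ti", "go"]
tp = transitional_probabilities(stream)

print(tp[("ba", "bu")])  # 1.0   -- within-word transition is certain
print(tp[("bu", "ti")])  # ~0.67 -- word-boundary transitions dip
```

Boundaries fall where the transitional probability drops, exactly the regularity Saffran's experiments show infants track.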
Counterfactual Vision and Language Learning
116 Citations · 2020 · Ehsan Abbasnejad, Damien Teney, Amin Parvaneh + 2 more
journal unavailable
This work proposes a method that addresses the problem of visual question answering by introducing counterfactuals into training, and shows that simulating plausible alternative training data through this process results in better generalization.
Large Language Models Demonstrate the Potential of Statistical Learning in Language
122 Citations · 2023 · Pablo Contreras Kallens, Ross Deans Kristensen‐McLachlan, Morten H. Christiansen
Cognitive Science
It is suggested that the most recent generation of Large Language Models (LLMs) might finally provide the computational tools to determine empirically how much of the human language ability can be acquired from linguistic experience.
Variability and Consistency in Early Language Learning
180 Citations · 2021 · Michael C. Frank, Mika Braginsky, Daniel Yurovsky + 1 more
The MIT Press eBooks
A data-driven exploration of children's early language learning across different languages, providing an empirical reference and a new theoretical framework. This book examines variability and consistency in children's language learning across different languages and cultures, drawing on Wordbank, an open database with data from more than 75,000 children and twenty-nine languages or dialects. This big data approach makes the book the most comprehensive cross-linguistic analysis to date of early language learning. Moreover, its data-driven picture of which aspects of language learning are consi...
Coreferential Reasoning Learning for Language Representation
161 Citations · 2020 · Deming Ye, Yankai Lin, Jiaju Du + 4 more
journal unavailable
The CorefBERT model is presented, a novel language representation model designed to capture the relations between noun phrases that co-refer to each other, and has made significant progress on several downstream NLP tasks that require coreferential reasoning.
Teaching and Learning Second Language Listening
358 Citations · 2021 · Christine C. M. Goh, Larry Vandergrift
journal unavailable
Now in its second edition, this reader-friendly text offers a comprehensive treatment of concepts and knowledge related to teaching second language (L2) listening, with a particular emphasis on metacognition. This book advocates a learner-oriented approach to teaching listening that focuses on the process of learning to listen. It applies theories of metacognition and language comprehension to offer sound and reliable pedagogical models for developing learner listening inside and outside the classroom. To bridge theory and practice, the book provides teachers with many examples of research-inf...
Survey on reinforcement learning for language processing
133 Citations · 2022 · Víctor Uc-Cetina, Nicolás Navarro-Guerrero, Anabel Martín-González + 2 more
Artificial Intelligence Review
This survey reviews state-of-the-art RL methods and their possible use for different NLP problems, focusing primarily on conversational systems due to their growing relevance.
Learning the language of viral evolution and escape
335 Citations · 2021 · Brian Hie, Ellen D. Zhong, Bonnie Berger + 1 more
Science
This study modeled viral escape with machine learning algorithms originally developed for human natural language, and identified escape mutations as those that preserve viral infectivity but cause a virus to look different to the immune system, akin to word changes that preserve a sentence’s grammaticality but change its meaning.
Innovations and Challenges in Language Learning Motivation
247 Citations · 2020 · Zoltán Dörnyei
journal unavailable
"Innovations and Challenges in Language Learning Motivation provides a cutting-edge perspective on the latest challenges and innovations in language learning motivation, incorporating numerous examples and cases in mainstream psychology and in the field of second language acquisition. Drawing on over three decades of research experience as well as an extensive review of the latest psychological and SLA literature, Dörnyei provides an accessible overview of these cutting-edge areas and covers novel topics that have not yet been addressed in L2 motivation research, such as: fundamental theoreti...
Learning to Prompt for Vision-Language Models
2326 Citations · 2022 · Kaiyang Zhou, Jingkang Yang, Chen Change Loy + 1 more
International Journal of Computer Vision
Context Optimization (CoOp) is proposed, a simple approach specifically for adapting CLIP-like vision-language models for downstream image recognition that achieves superb domain generalization performance compared with the zero-shot model using hand-crafted prompts.
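CoOp's core idea is to replace hand-written prompt words ("a photo of a …") with learnable context vectors prepended to each class-name embedding, optimized while the vision-language encoders stay frozen. A toy sketch of the forward pass; the dimensions, random embeddings, and mean-pool "encoder" below are invented stand-ins, since the real method runs inside CLIP's text transformer:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_ctx = 8, 4                          # toy embedding dim, context length

# Learnable context vectors shared across classes (the part CoOp optimizes).
context = rng.normal(size=(n_ctx, d))

# Frozen class-name embeddings (stand-ins for CLIP token embeddings).
class_embs = {"cat": rng.normal(size=(1, d)), "dog": rng.normal(size=(1, d))}

def encode_text(ctx, cls_emb):
    """Toy stand-in for the frozen text encoder: mean-pool the sequence."""
    return np.concatenate([ctx, cls_emb]).mean(axis=0)

def logits(image_feat, ctx):
    """Cosine similarity between the image feature and each class prompt."""
    scores = []
    for cls_emb in class_embs.values():
        t = encode_text(ctx, cls_emb)
        scores.append(image_feat @ t / (np.linalg.norm(image_feat) * np.linalg.norm(t)))
    return np.array(scores)

image_feat = rng.normal(size=d)
scores = logits(image_feat, context)     # one similarity score per class
print(scores.shape)                      # (2,)
```

Training then backpropagates a classification loss through `logits` into `context` only, which is why so few labeled images suffice.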
Curriculum Learning for Natural Language Understanding
154 Citations · 2020 · Benfeng Xu, Licheng Zhang, Zhendong Mao + 3 more
journal unavailable
By reviewing the training set in a crossed way, this work distinguishes easy examples from difficult ones and arranges a curriculum for language models, obtaining significant and universal performance improvements on a wide range of NLU tasks.
The social brain of language: grounding second language learning in social interaction
118 Citations · 2020 · Ping Li, Hyeonjeong Jeong
npj Science of Learning
A blueprint for the brain network underlying social L2 learning is provided, enabling the integration of neurocognitive bases with social cognition of second language while combining theories of language and memory with practical implications for the learning and teaching of a new language in adulthood.
Becoming a Language Teacher A Practical Guide to Second Language Learning and Teaching
189 Citations · 2020 · Elaine K. Horwitz
Castledown Publishers eBooks
The author found that teachers had limited preparation at both the TFLAS and ESL levels, and that they struggled with the intensity of their anxiety and the complexity of the material.
Feature Extraction and Analysis of Natural Language Processing for Deep Learning English Language
162 Citations · 2020 · Dongyang Wang, Junli Su, Hongbin Yu
IEEE Access
This paper proposes a multi-modal neural network that applies BI-GRU (Bidirectional Gated Recurrent Unit) to English word segmentation and uses the CRF (Conditional Random Field) model to annotate sentences in sequence, effectively handling the long-distance dependency of text semantics and shortening network training and prediction time.
Federated Learning for Vision-and-Language Grounding Problems
103 Citations · 2020 · Fenglin Liu, Xian Wu, Shen Ge + 2 more
Proceedings of the AAAI Conference on Artificial Intelligence
This work proposes a federated learning framework to obtain various types of image representations from different tasks, which are then fused together to form fine-grained image representations that are much more powerful than the original representations alone in individual tasks.
Investment and motivation in language learning: What's the difference?
195 Citations · 2021 · Ron Darvin, Bonny Norton
Language Teaching
The year 2020 marked the 25th year since Bonny Norton published her influential TESOL Quarterly article, ‘Social identity, investment, and language learning’ (Norton Peirce, 1995) and the fifth year since we, Darvin and Norton (2015), co-authored ‘Identity and a model of investment in applied linguistics’ in the Annual Review of Applied Linguistics. From the time Norton's 1995 piece was published, investment and motivation have been conceptually imbricated and often collocated, as they hold up two different lenses to investigate the same reality: why learners choose to learn an additional lang...
Reflexion: Language Agents with Verbal Reinforcement Learning
250 Citations · 2023 · Noah Shinn, Beck Labash, Ashwin Gopinath + 3 more
arXiv (Cornell University)
Large language models (LLMs) have been increasingly used to interact with external environments (e.g., games, compilers, APIs) as goal-driven agents. However, it remains challenging for these language agents to quickly and efficiently learn from trial-and-error as traditional reinforcement learning methods require extensive training samples and expensive model fine-tuning. We propose Reflexion, a novel framework to reinforce language agents not by updating weights, but instead through linguistic feedback. Concretely, Reflexion agents verbally reflect on task feedback signals, then maintain the...
Teaching and learning languages online: Challenges and responses
159 Citations · 2022 · Jian Tao, Xuesong Gao
System
The outbreak of COVID-19 generated an unprecedented global push towards remote online language teaching and learning. In most contexts, language teachers and learners underwent a rapid switch to online instruction with limited resources and preparation. Their experiences demonstrate resilience, perseverance, and creativity under highly challenging conditions. This collection of studies examines the challenges that language teachers and learners have experienced in teaching and learning online, explores how they have addressed these challenges, and identifies critical lessons to help language e...
Web-based language learning and speaking anxiety
155 Citations · 2020 · Muzakki Bashori, Roeland van Hout, Helmer Strik + 1 more
Computer Assisted Language Learning
This study investigates the presence of foreign language speaking anxiety (FLSA) in Indonesian vocational high school students and whether web-based language learning might help reduce it; the results showed that students felt less anxious when speaking to the ASR-based websites than when speaking to peers or other people.
Complex Dynamic Systems Theory in Language Learning
105 Citations · 2021 · Phil Hiver, Ali H. Al‐Hoorie, Reid Evans
Studies in Second Language Acquisition
A scoping review of the heterogeneous body of research adopting a framework for dynamic method integration at the levels of study aim, unit of analysis, and choice of method finds insights that will help enhance the methodological rigor and the substantive contribution of future research.
Conditional Prompt Learning for Vision-Language Models
1327 Citations · 2022 · Kaiyang Zhou, Jingkang Yang, Chen Change Loy + 1 more
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Conditional Context Optimization (CoCoOp) extends CoOp by learning a lightweight neural network that generates an input-conditional token (vector) for each image, yielding stronger domain generalization performance as well.
Language Learning Strategies of Undergraduate EFL Students
113 Citations · 2020 · Mega Lestari, Achmad Yudi Wahyudin
Journal of English Language Teaching and Learning
This study explores the language learning strategies used by students in an English Literature study program in an English as a foreign language (EFL) setting. It involves 76 participants who completed the Strategy Inventory for Language Learning (SILL) questionnaire developed by Oxford (1990). The results showed that metacognitive strategies were the most frequently used, followed by social and compensation strategies, while affective strategies were the least used. This research could be a meaningful insight for other ...
Learning functional properties of proteins with language models
198 Citations · 2022 · Serbülent Ünsal, Heval Ataş, Muammer Albayrak + 3 more
Nature Machine Intelligence
A benchmarking study is described to compare the performances and advantages of recent deep learning approaches in a range of protein prediction tasks, to help researchers to apply machine/deep learning-based representation techniques to protein data for various predictive tasks, and inspire the development of novel methods.
Learning the protein language: Evolution, structure, and function
480 Citations · 2021 · Tristan Bepler, Bonnie Berger
Cell Systems
This work considers how these models can be enriched with prior biological knowledge and introduces an approach for encoding protein structural knowledge into the learned representations, to improve downstream function prediction through transfer learning.
Artificial Intelligence in Foreign Language Learning and Teaching
109 Citations · 2022 · Torben Schmidt, T. Strassner
Anglistik
The study combines a quantitative study of approximately 30 classrooms in Baden-Württemberg and North Rhine-Westphalia with a qualitative analysis of two classrooms in Hamburg to explore the conditions and effects of adaptive, individualized practice with intelligent feedback and scaffolding on classroom-based communicative task performance and language learning processes in general.
Second language reading and incidental vocabulary learning
146 Citations · 2020 · Paul Nation, P. Waring
journal unavailable
No description supplied
Foreign language learning boredom: Conceptualization and measurement
265 Citations · 2021 · Chengchen Li, Jean‐Marc Dewaele, Yanhong Hu
Applied Linguistics Review
Abstract This article reports on a two-step investigation of foreign language learning boredom amongst Chinese university non-English-major EFL students and English teachers. In Study 1, 22 students and 11 English teachers were interviewed and 659 students responded to an open questionnaire, recalling and describing their experiences and perceptions of boredom in learning English. The data allowed a multidimensional conceptualization of Foreign Language Learning Boredom (FLLB), empirically supporting the control-value theory in educational psychology. Based on the conceptualization of FLLB, in...
Representation Learning with Large Language Models for Recommendation
137 Citations · 2024 · Xubin Ren, Wei Wei, Lianghao Xia + 5 more
journal unavailable
A model-agnostic framework, RLMRec, is proposed that aims to enhance existing recommenders with LLM-empowered representation learning; it integrates RLMRec with state-of-the-art recommender models while also analyzing its efficiency and robustness to noisy data.
True Few-Shot Learning with Language Models
190 Citations · 2021 · Ethan Perez, Douwe Kiela, Kyunghyun Cho
arXiv (Cornell University)
Pretrained language models (LMs) perform well on many tasks even when learning from a few examples, but prior work uses many held-out examples to tune various aspects of learning, such as hyperparameters, training objectives, and natural language templates ("prompts"). Here, we evaluate the few-shot ability of LMs when such held-out examples are unavailable, a setting we call true few-shot learning. We test two model selection criteria, cross-validation and minimum description length, for choosing LM prompts and hyperparameters in the true few-shot setting. On average, both marginally outperfo...
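The cross-validation criterion studied here is simple: with only a handful of labeled examples, score each candidate prompt by held-out accuracy within those same examples, using no external validation set. A sketch under stated assumptions; `lm_accuracy` is a hypothetical stand-in for querying a real language model:

```python
def cross_val_score(prompt, examples, lm_accuracy, k_folds=4):
    """Average held-out accuracy of `prompt`: each fold is held out once,
    the remaining few examples condition the (hypothetical) LM."""
    folds = [examples[i::k_folds] for i in range(k_folds)]
    scores = []
    for i, held_out in enumerate(folds):
        train = [ex for j, fold in enumerate(folds) if j != i for ex in fold]
        scores.append(lm_accuracy(prompt, train, held_out))
    return sum(scores) / len(scores)

def select_prompt(prompts, examples, lm_accuracy):
    """Pick the prompt with the best cross-validated score."""
    return max(prompts, key=lambda p: cross_val_score(p, examples, lm_accuracy))

# Toy stand-in scorer: pretend prompt "B" conditions the LM better.
fake_accuracy = lambda prompt, train, dev: {"A": 0.4, "B": 0.7}[prompt]
best = select_prompt(["A", "B"], list(range(8)), fake_accuracy)
print(best)  # B
```

The paper's point is that even this honest procedure only marginally beats random prompt choice, so few-shot results tuned on large held-out sets overstate true few-shot ability.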
Reproducible Scaling Laws for Contrastive Language-Image Learning
430 Citations · 2023 · Mehdi Cherti, Romain Beaumont, Ross Wightman + 6 more
journal unavailable
This work investigates scaling laws for contrastive language-image pre-training (CLIP) with the public LAION dataset and the open-source OpenCLIP repository and finds that the training distribution plays a key role in scaling laws, as the OpenAI and OpenCLIP models exhibit different scaling behavior.
Review of Studies on Technology-Enhanced Language Learning and Teaching
338 Citations · 2020 · Rustam Shadiev, Mengke Yang
Sustainability
This review study can serve as a guide for teaching and research communities who plan on designing language learning and teaching activities supported by technologies.
Continual Lifelong Learning in Natural Language Processing: A Survey
109 Citations · 2020 · Magdalena Biesialska, Katarzyna Biesialska, Marta R. Costa-jussà
journal unavailable
This work looks at the problem of CL through the lens of various NLP tasks, and discusses major challenges in CL and current methods applied in neural network models.
Universals in Learning to Read Across Languages and Writing Systems
126 Citations · 2021 · Ludo Verhoeven, Charles A. Perfetti
Scientific Studies of Reading
In this article, we provide a cross-linguistic perspective on the universals and particulars in learning to read across seventeen different orthographies. Starting from the assumption that reading reflects a learned sensitivity to the systematic relationships between the surface forms of words and their meanings, we chose a broad group of seventeen languages, representing syllabic, morphosyllabic, alphasyllabic (abugida), abjad, and alphabetic writing systems. We investigated the systematic variation among these languages in their written forms and in their mapping of writing units to language...
CERT: Contrastive Self-supervised Learning for Language Understanding
198 Citations · 2020 · Hongchao Fang, Pengtao Xie
journal unavailable
Pretrained language models such as BERT, GPT have shown great effectiveness in language understanding. The auxiliary predictive tasks in existing pretraining approaches are mostly defined on tokens, thus may not be able to capture sentence-level semantics very well. To address this issue, we propose CERT: Contrastive self-supervised Encoder Representations from Transformers, which pretrains language representation models using contrastive self-supervised learning at the sentence level. CERT creates augmentations of original sentences using back-translation. Then it finetunes a pretrained langu...
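The sentence-level contrastive objective CERT applies to its back-translated pairs is, in essence, the standard NT-Xent loss: pull the embeddings of two augmentations of the same sentence together and push all other sentences in the batch away. A numpy sketch on made-up embeddings (a real run would obtain them from a Transformer encoder over back-translated sentence pairs):

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss for a batch of paired embeddings.
    z1[i] and z2[i] are two augmentations (e.g. back-translations)
    of the same sentence; all other rows act as negatives."""
    z = np.concatenate([z1, z2])                      # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # cosine space
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)                    # exclude self-similarity
    n = len(z1)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])  # each row's positive
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

rng = np.random.default_rng(0)
anchor = rng.normal(size=(4, 16))
positive = anchor + 0.01 * rng.normal(size=(4, 16))   # near-identical augmentations
loss_matched = nt_xent(anchor, positive)
loss_random = nt_xent(anchor, rng.normal(size=(4, 16)))
print(loss_matched < loss_random)  # matched pairs give the lower loss
```

Minimizing this loss over many batches is what nudges the encoder toward sentence-level semantics that token-level pretraining objectives may miss.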
Deciphering the language of antibodies using self-supervised learning
149 Citations · 2022 · Jinwoo Leem, L. Mitchell, James H. R. Farmery + 2 more
Patterns
To the authors' knowledge, AntiBERTa is the deepest protein family-specific language model, providing a rich representation of BCRs; it is primed for multiple downstream tasks and can improve our understanding of the language of antibodies.