Evaluating Computational Models of Language Acquisition

How do children learn language? This question is one of the most important and mysterious challenges facing the modern linguistics community. If answered, language acquisition curricula could be optimized, neural networks could be designed to learn human language more effectively, language corpora could be modeled after human language comprehension, and our understanding of the complex… Continue reading “Evaluating Computational Models of Language Acquisition”

What is the Language of Time?

The third course I had the opportunity to attend at the 33rd European Summer School in Logic, Language, and Information was titled Temporal Logic: Priorian Perspectives. The lectures were delivered by Professor Patrick Blackburn, one of the best public speakers I have ever met. The class represented my first formal lessons in complex logic, and I… Continue reading “What is the Language of Time?”

Language Resources: Optimal Neural Network Training

This summer, I had the pleasure of attending the 33rd European Summer School in Logic, Language, and Information, a fascinating series of multidisciplinary courses focusing on the intersection of computational linguistics, semantics, and logic. Today I’d like to discuss one of the classes I participated in, called Creating and Maintaining High-quality Language Resources, which did… Continue reading “Language Resources: Optimal Neural Network Training”

The Modern State of Deep Neural Networks

The second lecture I was able to attend at ACL 2022 was about the state of deep nets (deep neural networks) in the modern world. The presenters brought up many interesting points about accessibility, cost of production, and even carbon emissions, and I’ll discuss some of the key points they made here today. One of… Continue reading “The Modern State of Deep Neural Networks”

Training Neural Networks with Limited Text Data

This weekend, I had the privilege of virtually attending the 60th Annual Meeting of the Association for Computational Linguistics (ACL). I was able to attend two fantastic lectures, and I will be discussing the first today. Its topic was one of great importance to modern computational linguistics: how to train neural networks accurately with limited… Continue reading “Training Neural Networks with Limited Text Data”

Modeling Language: Regular Expressions

Spring is finally here, and the warm weather and clear skies are a welcome change from the harsh winter snow. Today, in the spirit of overcoming the winter’s challenges, I’d like to discuss a common challenge faced by computational linguists, along with a fascinating solution. One of the most common programming languages in the modern… Continue reading “Modeling Language: Regular Expressions”
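As a taste of the solution the full post covers, here is a minimal sketch (not taken from the post itself) of a regular expression used for a common computational-linguistics task: splitting raw text into word and punctuation tokens. The pattern is illustrative only, not a complete tokenizer.

```python
import re

# Match a word (optionally with an internal apostrophe, as in "isn't"),
# or any single non-word, non-space character (punctuation).
TOKEN_PATTERN = re.compile(r"\w+(?:'\w+)?|[^\w\s]")

def tokenize(text):
    """Split text into word and punctuation tokens."""
    return TOKEN_PATTERN.findall(text)

print(tokenize("Spring's finally here, isn't it?"))
```

Running this prints each word and punctuation mark as a separate token, which is often the first step before any further linguistic modeling.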

Neural Networks in Linguistics Research

In the modern technological age, machine learning algorithms have been applied to almost every field of study, and linguistics is no exception. In this post, I’ll give a brief explanation of how neural networks work and then demonstrate how they can be used to assist linguistics research. Neural networks, or NNs for short, are machine… Continue reading “Neural Networks in Linguistics Research”
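To preview the explanation the full post gives, here is a minimal sketch (my own illustration, not code from the post) of a single artificial neuron, the building block of a neural network: it weights its inputs, adds a bias, and passes the sum through a sigmoid activation. The weights and bias below are hypothetical values chosen purely for demonstration.

```python
import math

def sigmoid(x):
    # Squash any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, then a nonlinear activation.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Hypothetical inputs and weights, for illustration only.
print(neuron([1.0, 0.5], [0.4, -0.2], 0.1))
```

A full network simply stacks many such neurons in layers and learns the weights from data rather than setting them by hand.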

A Word On Natural Language Processing

A decade ago, speaking computers and advanced artificial intelligence seemed like topics for science fiction novels, but now we live in a world where these advancements are normal parts of our day-to-day lives. Have you ever wondered how devices like Google Home, Alexa, and Siri understand what you’re saying? Ever wondered how text-to-speech bots work? Continue reading “A Word On Natural Language Processing”