Dr. Christopher Conway – How the Brain Learns Language
Dr. Christopher Conway is the Director of the Brain, Learning and Language Laboratory at Boys Town National Research Hospital. Dr. Conway is a distinguished psychologist with a specialty in speech and sensory communication and research focused on how the brain learns language. Before joining the team at Boys Town, he served as an Associate Professor of Psychology at Georgia State University. He received his PhD in psychology from Cornell University.
Here’s a glimpse of what you’ll learn:
- Dr. Christopher Conway explains the plasticity of the brain and how it applies to language
- What part does hearing play in learning language?
- Dr. Conway shares the important milestones of cognitive development and hearing
- Testing sequence processing and how it applies to language
- The crossover between visual and auditory senses
- Where is the field of auditory and language research going?
- Accurately diagnosing and helping kids with auditory issues
In this episode…
For many of us, language is something we never had to think about after early childhood. Yet the process itself is complex, involving both sight and sound to learn language over time. It also involves a great deal of psychology, along with barriers that can inhibit the acquisition of speech.
This is how Dr. Christopher Conway approaches his research. He recently took over the Brain, Learning and Language Laboratory at Boys Town National Research Hospital. Dr. Conway and his team seek to understand the mechanisms of language acquisition so they can help the children who struggle with it. Want to know more about his research?
Dr. Mark Syms has an in-depth conversation with Dr. Christopher Conway, the Director of the Brain, Learning and Language Laboratory at Boys Town National Research Hospital, to learn about how the brain acquires language. They discuss the overlap of audiology and psychology and how they both play a role in language acquisition. They also go into topics like sequence processing, the plasticity of the brain, and cognitive development.
Resources mentioned in this episode
- Arizona Hearing Center
- The Listen Up! website
- Listen Up!: A Physician’s Guide to Effectively Treating Your Hearing Loss by Dr. Mark Syms
- Dr. Mark Syms on LinkedIn
- Boys Town National Research Hospital
- Dr. Christopher Conway
- Dr. Christopher Conway’s email: Christopher.Conway@boystown.org
- David P. Pisoni
- Morten H. Christiansen
Sponsor for this episode…
This episode is brought to you by the Arizona Hearing Center.
The Arizona Hearing Center is a cutting-edge hearing care facility providing comprehensive, family-focused care. Approximately 36 million Americans suffer from some sort of hearing loss, more than half of whom are younger than the age of 65. That’s why the team at the Arizona Hearing Center is focused on providing the highest-quality care using innovative technologies and inclusive treatment plans.
As the Founder of the Arizona Hearing Center, Dr. Mark Syms is passionate about helping patients effectively treat their hearing loss so that they can stay connected with their family and friends and remain independent. He knows first-hand how hearing loss can impact social connection and effective communication. By relying on three core values—empathy, education, and excellence—Dr. Syms and his team of hearing loss experts are transforming the lives of patients.
So what are you waiting for? Stop missing out on the conversation and start improving your quality of life today!
Welcome to the ListenUp! podcast where we explore hearing loss, communication, connections and health.
Dr. Mark Syms 0:14
Hi, this is Dr. Mark Syms, host of the ListenUp! podcast, where I feature top leaders in healthcare. This episode is brought to you by the Arizona Hearing Center. I help patients effectively treat their hearing loss so they can connect better with their family and friends and remain independent. The reason I’m so passionate about helping people with hearing loss is that I lost my brother Robbie twice: first to his hearing loss from radiation to his brain, and then when he passed away. I’m an ENT physician; I’ve performed over 10,000 ear surgeries over the past 20 years and cared for thousands of patients with hearing loss. I’m the founder of the Arizona Hearing Center and the author of Listen Up: A Physician’s Guide to Effectively Treating Your Hearing Loss. If you want to learn more about the book, go to listenuphearing.com, and for my clinical practice, go to azhear.com. Today, I’m excited to have Dr. Christopher Conway. He’s the Director of the Brain, Learning and Language Laboratory at Boys Town National Research Hospital. He obtained his PhD from Cornell University, his Master of Arts in Psychology from Southern Illinois University, and his undergraduate degree from Duke University. His research is focused on how the brain learns language. This is a really interesting topic that I’m excited to learn about, and I’m excited to have him on Listen Up! today. Chris, welcome to the show. Thanks for coming on.
Dr. Christopher Conway 1:28
Thanks, thanks for inviting me.
Dr. Mark Syms 1:30
So tell me about your journey. I was asking you in the warm-up, and you were telling me you did a dual undergraduate degree in electrical engineering and biomedical engineering, and now you study how the brain learns language. I’m sure there’s a connection, but tell me about your pathway from the one to the other. It’s always fascinating to me how people end up in these areas of research.
Dr. Christopher Conway 1:55
Yeah, that’s a great question. I always liked math and science, but I was also drawn to the humanities in high school, so I’ve kind of always liked a lot of different things. I settled on engineering because that seemed like a good degree to have; a lot of people gave advice like, you can’t go wrong with an engineering degree. I got the degree and got a job as an engineer after college, working in a dog food factory. I wouldn’t have thought I’d end up there, but factories need engineers to program things and run equipment. I was just not happy, though; it was not what I wanted to do. So I took some time off, and since I had saved up money, I started taking some classes at Southern Illinois University. I took an intro to psych class for the first time and was just kind of blown away, because I really liked the marriage of science with people and the more humanistic, subjective kinds of things. That got me interested in cognitive science and the cognitive neuroscience of the brain, and from there I gradually became more interested in issues related to language.
Dr. Mark Syms 3:12
So your PhD was in psychology? That’s right. And what did you work on; what was your area of focus in your PhD?
Dr. Christopher Conway 3:21
Yeah, so around the time I started my PhD, there was really a revolution in cognitive science related to the power of learning. Traditionally, in cognitive psychology, there was an emphasis, or assumption, that language, for instance, was innate: children are born with knowledge of what language is, and it’s a matter of mapping their language environment onto the knowledge that’s already there. But in the late ’90s, there was an increasing realization, based on empirical data, that learning is actually more powerful than we think; the brain can do a lot more than we previously thought. I entered graduate school around the time of those initial studies and became fascinated, kind of captured by the whole idea of trying to understand: what are the limits? What are the constraints on the brain’s ability to adapt?
Dr. Mark Syms 4:20
So like neuroplasticity, right? There are functions that can be gained through learning; it’s not just that you’re born with it, right?
Dr. Christopher Conway 4:27
That’s right. The thinking was that early in development there can be a fair amount of plasticity and change in the brain, but after a certain point in maturation things get rigid. As an adult, you can do less; it’s harder to learn a language, for instance, as an adult, or to shed an accent, right? Yeah, exactly, exactly. And those things are all true. But it turns out the brain is constantly changing and adapting, generating new neurons and making new connections among neurons. So there’s a lot more plasticity happening than I think we thought 20 or 30 years ago.
Dr. Mark Syms 5:05
Yeah, I always wondered if it was just, you know, I mean, as we all know, as we transition to adulthood, our responsibilities become greater. And so the ability to dedicate as much of our cognitive load to learning language or shedding an accent just kind of goes down.
Dr. Christopher Conway 5:19
That could be Yeah, exactly. Like just sort of-
Dr. Mark Syms 5:22
You’re living and feeding your family, so you don’t have time to spend-
Dr. Christopher Conway 5:25
Less time being open to new experiences. Right, exactly. So in a way, the more you learn, the harder it is to learn something new, because you’ve built yourself up with other knowledge.
Dr. Mark Syms 5:37
Habits and normal daily activities, right? You know what you have to get done each day. I mean, I think about the beauty of kindergarten, right? It’s playtime; you just go play at whatever you want. Unfortunately, my days, and perhaps yours too, are a little bit different than circle time. Yeah, yeah. Well, tell me, what role does hearing play in learning language?
Dr. Christopher Conway 6:01
Yeah, that’s a good question. I wasn’t initially interested in hearing, hearing loss, or deafness until I started my postdoc work. The person I worked with was David Pisoni at Indiana University, and he was one of the first researchers to look at deafness, and specifically language outcomes in children with cochlear implants, in terms of cognitive and brain factors that might help us understand why there’s so much variability in language outcomes among children with cochlear implants. I was interested in bringing in my experience doing research on learning to come up with some insights into what was happening with these children. It’s a complicated issue. Clearly, hearing loss degrades auditory input, language input, so it makes learning language a formidable challenge. That’s clear. But beyond that, it turns out that even if you control for audiological and demographic factors, like how well the device, the hearing aid or cochlear implant, is actually working, what kind of input the child is getting, or when they got their implant; even if you look at those things and control for them, there seems to be different learning happening in children with cochlear implants and with hearing aids. I think there’s something going on in their brains that is leading them to process information differently.
Dr. Mark Syms 7:47
So when you say there’s different learning, you mean the way they get there? In other words, whatever the learning mechanism is for language, it’s different than in a normal-hearing person.
Dr. Christopher Conway 7:57
That’s right, yeah. That’s the idea I’ve been exploring in some of my research. You can think of it two ways. Take a child who’s born profoundly deaf and receives a cochlear implant at, let’s say, age two. They’ve gone two years with very little auditory input. They’ve probably also gone two years with very little language, unless their parents or caregivers use sign language and have exposed them to that. But let’s say in this case they’ve had very little experience with language. So the question is, how does the brain react to that? You’ve got one whole sensory modality that’s just not there, and the brain is not learning language, which is something that’s super crucial in those first few years of life. So there are going to be changes, that goes without saying, and I think those changes affect the way they process sound after it becomes available. There’s a lack of experience dealing with an auditory signal, and a linguistic signal, because of that period of deprivation.
Dr. Mark Syms 9:08
Right? Which is longer than two years, right? Because we know in utero they’re hearing. The teaching for us clinically has always been that the two-year mark is some sort of mile marker, that implanting before versus after it changes the potential outcome. I’m not sure it’s really true. It might have been in the past, when it took us that long to get children through all the evaluation to get them ready for a cochlear implant, and I think we’re getting better at it. But is there some documented change that occurs before versus after age two? Is there some milestone there?
Dr. Christopher Conway 9:47
Hmm. Um, I don’t know if it is a milestone, but I, you know, clearly earlier seems better. For sure. Right. And-
Dr. Mark Syms 9:57
Is there a back end to that milestone, do you think, in terms of plasticity, just based on the research? I’m just curious. The other thing that was always kind of the teaching was that at five, the outcomes are largely set. And I think that’s retrospective, right? The problem is that by the time you learn a child’s language outcomes, the way we treat hearing loss has changed. So it’s hard.
Dr. Christopher Conway 10:25
Yeah. And it becomes, I think, increasingly difficult the later the implantation occurs, just because of what we were saying: as you get older, your brain becomes entrenched and used to doing certain things. By then, those language centers and auditory centers are being used to do different things. And so-
Dr. Mark Syms 10:43
It’s not like there’s brain capacity sitting there waiting for that input. So is there any modeling of what happens if you do become proficient? I’m not a neuroscientist, so forgive me, but do you take CPU power from some other process you’re running? Has any loss been shown anywhere else, or is the brain so plastic that you can do both?
Dr. Christopher Conway 11:06
Yeah, there are different theories about how plasticity plays out. One is that, for example, the auditory centers might be taken over by visual or tactile senses.
Dr. Mark Syms 11:17
And people always talked about how your other senses are keener when you’re deaf.
Dr. Christopher Conway 11:21
Right, right. I think there’s some evidence to support that.
Dr. Mark Syms 11:25
But it’s not a total substitute. I think that’s probably the misconception, right? If you had 100 units of sensation and you lose hearing, it’s not like those 20 units get spread across the other four senses so you get 25 units of each. It doesn’t work that way.
Dr. Christopher Conway 11:40
Yeah, in a way. A different perspective would be that you don’t have as much experience dealing with multimodal, audiovisual processing.
Dr. Mark Syms 11:51
Putting it together, right, the hearing and the visual input.
Dr. Christopher Conway 11:55
Right, and for people with typical hearing, that’s a big part of our experience: not just hearing, and not just seeing, but matching things together. So their brains aren’t doing that either, and all these systems are getting shaped in different ways.
Dr. Mark Syms 12:10
So does that catch up, the integration between the visual and the auditory? Do you know? That, I don’t know. Because when I talk to patients, I talk about the difference between communication and hearing, and obviously visual speech reading, all of that, is so important. So it’s interesting whether that actually catches up and normalizes compared to hearing. There you go, we’ll figure that out. Let me know by next week.
Dr. Christopher Conway 12:35
Another thing related to this that might be of interest: some of my work touches on sequence processing, the ability to learn the order of events, the order of the stimuli that you hear or that you see.
Dr. Mark Syms 12:49
Interesting. So how do you test that, and how does it practically apply?
Dr. Christopher Conway 12:53
Yeah, so one type of paradigm, and this was some of the early work done in the ’90s that I mentioned in passing, is to expose people to auditory input, nonsense syllables or auditory tones, and it just sounds like gibberish. But actually, embedded in the input sequences are regularities and patterns, so some stimuli, tones or syllables if that’s what the stimuli are, tend to follow other ones.
Dr. Mark Syms 13:29
So there’s a repetitive part, even though it sounds random overall. Right.
Dr. Christopher Conway 13:33
Exactly. So there’s some amount of redundancy and pattern in there. You don’t tell people that that’s the case, and after the experiment people will say, I didn’t know there was anything unusual about the input, except that it sounded weird. But there are lots of different ways to measure that people are actually picking up the patterns. So why is that important? It’s important for language: a lot of language acquisition is learning what these patterns are, what words follow other words, or which sound syllables cluster together to form words. Right. And the research we’ve done suggests that children with hearing loss are not picking up these patterns as well. So it’s like auditory syntax. Yeah, it’s related to syntax, and also to phonology and phonotactics; each language has its own rules for sounds.
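The pattern-tracking Dr. Conway describes is often modeled with transitional probabilities between adjacent syllables. The sketch below is a generic illustration of that idea with made-up nonsense words; it is not the stimuli or analysis from Dr. Conway’s actual studies.

```python
import random
from collections import defaultdict

# Hypothetical nonsense "words" (illustration only).
words = [["tu", "pi", "ro"], ["go", "la", "bu"], ["bi", "da", "ku"]]

# Build a continuous stream by concatenating words in random order,
# as in classic statistical-learning experiments.
random.seed(0)
stream = [syl for _ in range(300) for syl in random.choice(words)]

# Count transitions between adjacent syllables.
pair_counts = defaultdict(int)
first_counts = defaultdict(int)
for a, b in zip(stream, stream[1:]):
    pair_counts[(a, b)] += 1
    first_counts[a] += 1

def tp(a, b):
    """Transitional probability P(b | a)."""
    return pair_counts[(a, b)] / first_counts[a]

print(tp("tu", "pi"))  # within-word transition: 1.0
print(tp("ro", "go"))  # across a word boundary: roughly 1/3
```

Within a made-up word the next syllable is fully predictable, while across word boundaries it is not; learners are thought to exploit exactly this contrast to segment words out of continuous input.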
Dr. Mark Syms 14:32
So is that a lost skill or a lagging skill?
Dr. Christopher Conway 14:37
It’s an open question, but I think it’s a lagging skill. And the children who seem to retain that skill to some extent show better language outcomes. So it goes back to my interest in the brain’s ability to learn: it looks like how some children learn patterns is different, and that difference plays out in real-world things like language. That’s part of the reason why they’re not picking up language in the same way as children with typical hearing.
Dr. Mark Syms 15:12
Especially because then, you know, perhaps rehabilitation would be more targeted towards those skills, because our rehabilitation has always just been: let’s expose you to language and you’ll kind of acquire it, if that makes sense, right? It’s kind of like the elite athletes who try cross-training in other sports to get better at their own core sport, because it helps them develop skills that have a cross-application.
Dr. Christopher Conway 15:40
That’s right. I mean, you hit the nail on the head. If there are these deeper things going on in some children’s brains, then it might change how we think about rehabilitation, intervention, and language therapy. Instead of just doing the normal practice of, say, a speech-language pathologist, maybe we’ve got to think about how we can help them learn these patterns, or what kinds of input we need to give them to facilitate that.
Dr. Mark Syms 16:06
Do you think there’s any modeling in mathematics education that could be extrapolated to this? Because in some ways that’s a more concrete kind of learning. I mean, multiplication tables are the bane of every young child’s, and every parent of a young child’s, existence until the kid masters them, right? That’s a simplistic example, but it seems similar to that type of learning.
Dr. Christopher Conway 16:30
I think this type of learning plays out in many, many things, maybe more so in informal learning situations. It may not be as relevant to a teacher explicitly saying two times two equals four, with memorization; this is a little different, this is more about passively picking up regularities and patterns. I think it does happen in math, but maybe not in all situations, right?
Dr. Mark Syms 16:58
Yeah. Well, it certainly might even feed into some of the higher executive skills: noticing patterns in a different way, not from an auditory point of view but from a repetitive human-interaction point of view, or something like that. I don’t know if it’s of the same nature, but it’s an interesting thing.
Dr. Christopher Conway 17:13
Yeah. The appeal of this area of research, which is referred to as statistical learning because the thought is that you’re learning statistical regularities in the world, is that it seems to cross over into so many things. Like you said, social interactions: even now, in our interaction, our brains are picking up on cues about the other person, and that changes how we might interact. We’re not consciously thinking of these things, but our brains are doing it.
Dr. Mark Syms 17:43
Well, yeah, it just shows you how important language is, right, because it’s the commerce of human relationships, and until you have access to it, that doesn’t happen. Right. So are you doing this work in other languages, other auditory languages, or is it all in English? And how about sign language? Are you doing any work in that as well?
Dr. Christopher Conway 18:07
Yes, some of my work was actually with visual stimuli, as a way to divorce it from the auditory component, and from language as much as possible; to pare it down to just the pattern learning.
Dr. Mark Syms 18:22
So you’re taking people with a hearing impairment and then trying to measure it as a visual skill and seeing how they do. Okay, that’s fascinating.
Dr. Christopher Conway 18:29
Yeah, and then seeing whether that relates at all to how well they learn language. It turns out there is some relationship there, even though these are visual paradigms; there’s nothing overtly linguistic about them. But one of-
Dr. Mark Syms 18:44
The auditory skill tracks the visual skill; there’s some crossover. Is that what you’re saying?
Dr. Christopher Conway 18:50
Yeah, exactly. It’s a little counterintuitive, but to some extent this learning ability seems to cut across perceptual modalities and domains. Not entirely, but if I’m good at visual learning, I tend to be good at auditory learning. Again, not 100%, but there’s some relationship there. So there’s some sort of central mechanism guiding these processes across different perceptual modalities.
Dr. Mark Syms 19:21
Are you measuring it in the visually impaired?
Dr. Christopher Conway 19:24
I haven’t, but it’s something I would like to do. Yeah.
Dr. Mark Syms 19:26
I mean, obviously that’s the antithesis of the experiment, because it would isolate whether it’s auditory. And then obviously there are some people who have the misfortune of both visual and auditory impairment, right? So, boy, you could... yeah. All right, well, you’ve got at least a couple of decades of work there, I think. Yeah. And how about people who use sign language? Is there a temporal, a sequencing issue there that develops? Is that different, or is that not known yet?
Dr. Christopher Conway 20:06
That’s a great question. I’m not going to pretend to be an expert, but sign language is interesting because, at least in terms of the brain, a lot of the same brain networks are involved in sign and spoken language: the classic language networks, frontal areas like Broca’s area, and more posterior areas like Wernicke’s area. Production and comprehension is one way to think of it. Those same areas are active in sign language. There are differences, though: sign language uses parts of the brain that spoken language doesn’t generally rely on as much, like spatial areas, which makes sense, since sign language is partly a spatial skill.
Dr. Mark Syms 20:54
So how do you measure this stuff?
Dr. Christopher Conway 20:59
So I haven’t personally done that.
Dr. Mark Syms 21:01
I mean, what tools does your lab use for your experiments? What’s actually used to perform these types of measurements?
Dr. Christopher Conway 21:08
So to look at different brain areas, you would generally use functional magnetic resonance imaging, fMRI. If you read a news article or see a blog with an image of a brain, 90% of the time it’s probably an fMRI study. Are you using that in your studies? I’ve done one study, but that’s not the primary tool we use. I’ve mostly used neurophysiological measurements: looking at electrical activity that’s generated in the brain but measured at the scalp. Okay, so sophisticated EEG, basically? Exactly. What that gives you, unlike fMRI, is a nice temporal time course, so you can look at processing that’s happening at the millisecond level, whereas with fMRI you’re dealing with way bigger windows, seconds or more.
Dr. Mark Syms 22:05
Right, the delay between the imaging and what’s actually going on; there’s a bigger lag.
Dr. Christopher Conway 22:10
Exactly. So you have a really nice, precise measurement online, as it’s happening, looking at the brain’s information processing in real time. That’s the appeal of EEG, and of event-related potentials, which is a type of EEG technique. We’ve used that to look at these types of pattern learning and sequence processing and how they might differ between different clinical populations.
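The event-related potential technique Dr. Conway mentions rests on a simple idea: average many EEG epochs time-locked to a stimulus, and random noise cancels while the stimulus-locked response survives. A minimal sketch with simulated data (purely illustrative; not code or data from his lab):

```python
import math
import random

random.seed(1)
n_epochs, n_samples = 200, 50

# A hypothetical stimulus-locked response (the "true" ERP waveform).
def true_erp(t):
    return math.sin(2 * math.pi * t / n_samples)

# Each recorded epoch = signal + independent noise much larger than the signal.
epochs = [[true_erp(t) + random.gauss(0, 3) for t in range(n_samples)]
          for _ in range(n_epochs)]

# Averaging the time-locked epochs cancels the noise and recovers the ERP.
avg = [sum(ep[t] for ep in epochs) / n_epochs for t in range(n_samples)]

# Residual error shrinks roughly as 1/sqrt(n_epochs).
err = max(abs(avg[t] - true_erp(t)) for t in range(n_samples))
print(round(err, 2))
```

Note how the single-trial noise (standard deviation 3) dwarfs the signal (amplitude 1), yet the average is close to the true waveform; that is why ERP studies need many trials per condition.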
Dr. Mark Syms 22:38
It must be a ton of data that you’re dealing with.
Dr. Christopher Conway 22:40
It’s a fair amount of data, which can be difficult. But with a lot of data, there are a lot of things to look at.
Dr. Mark Syms 22:51
Which, I mean, I think is the power as computing gets better. There are things we do now that we never would have even thought about doing years ago, in terms of crunching those kinds of numbers. It’s pretty nice.
Dr. Christopher Conway 23:02
Yeah. Now, the downside is that you have so much data that you can find almost anything you want if you look hard enough. There have been studies suggesting that we really have to be rigorous and careful with the procedures we use and be transparent about everything, because the more data you have, the more likely you are to find something in it. But it will most likely be what’s referred to as a Type I error, meaning it happened just by chance. If you look hard enough, you’ll eventually see the thing you’re looking for, but that doesn’t mean it’s a consistent, reliable result.
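The Type I error problem Dr. Conway describes can be made concrete: with a 5% false-positive rate per test, the chance of at least one spurious "finding" climbs rapidly with the number of tests run on the same data. A quick back-of-the-envelope calculation, assuming independent tests:

```python
# Probability of at least one Type I error (false positive) when running
# n independent tests, each with a 5% false-positive rate.
alpha = 0.05
for n_tests in (1, 10, 100):
    p_any = 1 - (1 - alpha) ** n_tests
    print(n_tests, round(p_any, 3))
# 1 test -> 0.05; 10 tests -> ~0.40; 100 tests -> ~0.99
```

This is why rich EEG datasets call for pre-registered analyses and corrections for multiple comparisons, such as Bonferroni or false-discovery-rate control.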
Dr. Mark Syms 23:33
Well, that’s why people are trained to do high-quality research and undergo mentorship and all that. I’ll just leave that to you and your peers. So where do you see your field going in five or ten years?
Dr. Christopher Conway 23:47
Hmm. Yes, so with this type of learning, statistical learning, we’re actually at a crossroads, sort of an identity crisis; I think we’re hitting adolescence as a subfield. There was a lot of excitement initially, and a lot of interesting research. But now people are starting to look back and see the flaws and difficulties: the way we’ve operationalized and measured this type of learning has not always been consistent across studies, which makes it really hard to compare; it’s almost like comparing apples and oranges. I might call something statistical learning, and somebody else also calls it statistical learning, but they’re measuring it in a very different way. And you just wonder, are we measuring the same things? Are these-
Dr. Mark Syms 24:38
Making generalizations about the progress of the field is difficult because people are looking at it in different ways, so you’re not sure you can draw an overall conclusion?
Dr. Christopher Conway 24:46
That’s right. And there have been criticisms that some of these laboratory tests aren’t very reliable; they don’t show robust results. I might measure your ability to do this kind of learning today, and then a week later measure you again, and you look very different. So you’ve got to wonder, what’s up with that? Now, I don’t think it’s as doom and gloom as some people make it.
Dr. Mark Syms 25:11
People have good days and bad days. I mean, that’s one of the big things people have to understand: humans aren’t consistent every day, right?
Dr. Christopher Conway 25:20
That’s very true.
Dr. Mark Syms 25:21
Especially when it comes to learning. I mean, there are days where I think I can learn a lot and other days where I’m just not optimized for, you know, integrating nearly the same amount of information.
Dr. Christopher Conway 25:33
Well, and I think that is another potential criticism of this work, because people don’t always consider things like motivation or sleep. I mean, you know when the undergraduate in your lab is falling asleep during the task, but there might be other things that aren’t as easy to detect. Somebody might just be a little more careless that day, or tired, or worn out, and it’s not showing the same result.
Dr. Mark Syms 26:00
So what do you base your studies on, young adults, or-
Dr. Christopher Conway 26:04
A combination. Young adults are available; in a university setting, they’re much more available. That’s why most psychology research and human neuroscience research involves young adults, college students: they’re available and they’re willing, for credit or maybe a little bit of money on the side. So about half of my research up until now has involved young adults, but I’ve also been doing work, as I said, with children who are deaf and hard of hearing, and we did a study with children with developmental dyslexia. I’ve also been interested in brain development and language learning in children from different socioeconomic backgrounds. That’s been kind of a hot topic in psychology and language development as well.
Dr. Mark Syms 27:06
But I believe, if you look at the CI literature, the factors the mother has are, by and large, the greatest determinant of outcome in pediatrics. Yeah.
Dr. Christopher Conway 27:13
Yeah, that's a big one. Yep, it's definitely a big factor. And it's true across the board, you know, not just children with CIs. Take any given child from poverty and a child from a more affluent background, and there are just going to be a lot of differences in terms of test scores and educational attainment and language ability. And, you know, there's a multitude of reasons for that.
Dr. Mark Syms 27:41
Yeah, so where do you see your field going in the next five to 10 years?
Dr. Christopher Conway 27:47
Yeah, I think, to go back to what I was saying, the field needs to come to a consensus about what this construct really is, this statistical learning ability. Is it a single thing, or is it composed of subcomponents? I think it's actually an umbrella term that involves a lot of different cognitive processes. And so the way we measure it in my lab might involve cognitive processes A, B, and C, while somebody else might be measuring it in a different way that involves C, D, and E. So we have to come to a consensus about what it is these different labs and different researchers are doing, what's common and what's different. And then we can really understand to what extent these learning abilities are different or atypical in children with hearing loss, or children with dyslexia, or children with a language disorder, because honestly, the findings can be a little discrepant. There's evidence, for example, that children with hearing loss do have problems, or show difficulties, with some of these kinds of pattern learning abilities, and then other studies show that they don't. And it just makes you wonder, you know, is it that the sample of children is different? Is it that the way they're measuring the learning ability is different? So if this field is going to continue, it has to answer these really fundamental questions.
Dr. Mark Syms 29:16
So in a lot of ways, it's actually defining what that learning is, right, in a high-specificity way, so you can compare different circumstances. That's what it sounds like to me.
Dr. Christopher Conway 29:27
That's right. And that's been something I've been thinking about. I just published a paper last year that put forth one sort of idea to resolve what this is. So yeah, it's something we're still kind of working out.
Dr. Mark Syms 29:43
That's great. You're on the forefront of defining what the forefront is going to develop. I mean, I know that's kind of a double thing, but obviously common vocabulary in research is very important. When people are talking about two different things with the same term, it becomes very difficult to draw any conclusions. But hopefully that'll happen, so then we can move this, you know, into the clinical realm and have applications for it.
Dr. Christopher Conway 30:09
That's right. And that would be the hope: once we get these basic questions sorted out, then we can use it for diagnosis, or for identification of children who will likely struggle with language learning because their statistical pattern learning abilities are operating differently. And like we mentioned earlier, the hope would be that it might inform therapy.
Dr. Mark Syms 30:34
Well, interestingly, you might find kids who are non-performers in other aspects of their educational process, and this could be a map for that. I mean, maybe it's not just as simple as saying, "Well, I'm not good at math." It might be, "I am actually good at math, but it's some of the patterns," or some of the deficits. You know, reading is an example. I mean, people who don't track lines well don't read well, right? They know how to read; they just actually don't track the lines well. So it's kind of an interesting concept in learning.
Dr. Christopher Conway 31:04
That's right. And reading is another great example where pattern learning is important. You're learning what letters seem to go together, and then you're learning what the letters map onto in terms of sounds. And one idea is that people with dyslexia are having problems learning these patterns. So this is definitely, sort of, at the heart of all these issues we're talking about.
Dr. Mark Syms 31:30
Yeah, that's great. Because if you can crack the code, you might be cracking a lot of educational learning struggles, right? So this is a great field that I'm going to keep my eye on, because obviously I'll see it pretty quickly in cochlear implant children if it has application. And I think if you guys get good evidence, adoption will not be slow, right? Because people want more tools to be better at this stuff. So yeah.
Dr. Christopher Conway 31:57
Right. I would hope so.
Dr. Mark Syms 31:59
Yeah. So Chris, I always ask people about their mentorship. Like, if you were at an awards banquet and they said, okay, who do you thank for getting you here? Who are those people who have been important to you and your journey?
Dr. Christopher Conway 32:14
Yeah, well, for sure my PhD advisor at Cornell, Morten Christiansen. He took me in. You know, like I said, I was a dropout engineer and didn't really know what I was doing, but I think he must have seen some potential in me. And I thought what he was doing was fascinating. And mostly, I trusted him as a person, as a human being. I just felt like he was going to take care of me, that he was going to shepherd me through this process. And he did. He was always very supportive and helpful. So him for sure. And David Pisoni, who I mentioned earlier, just sort of opened my eyes, because up until that point I was really interested in more of the basic questions about learning, not tying it to clinical, real-life sorts of issues. He opened my eyes to that, and it sent my research on a completely different path than I would have taken otherwise. So for sure those people.
Dr. Mark Syms 33:13
Sure, getting a smaller subset actually makes it a little easier to study, right? A more focused population.
Dr. Christopher Conway 33:20
Yeah, yeah. And, you know, I would never really have thought to work with children with cochlear implants until I started working with him. And I found it pretty fascinating, just because of all these unanswered questions about what's going on, and why. You know, why shouldn't we know by now how to help facilitate hearing and language learning? In short, I think there are a lot of questions.
Dr. Mark Syms 33:47
Yeah, I think we have to get beyond "we put them in, and they work," right? I mean, when parents come to us, that's kind of what we say, right? Our experience says they work. And so it's, you know, not the best, not the most granular answer for people. But interestingly, as an aside, we don't actually have good granular data for, you know, postlingually deafened adults and how well they'll do from a stratified point of view. So there's a lot of work to do on getting good predictive data across the whole population of cochlear implant patients, right? Because it's actually interesting to ask: do those skills degrade in people with a progressive hearing loss? Right? And on the cognitive decline side of long-term hearing loss in older adults, you know, is one of the contributors actually that some of this pattern recognition is gone? Because some of the cognitive load is actually doing pattern recognition on the context of the conversation to fill in for the hearing loss. Knowing what people are talking about, your brain fills in the blanks. So it's interesting how important that pattern recognition is.
Dr. Christopher Conway 34:54
It is. And there's not a whole lot of work in older adults with this kind of learning that I'm talking about, but I definitely wouldn't be surprised if there were something different in how they process patterns, just based on, like I said, what we know about cognitive decline and changes that happen in the brain. So-
Dr. Mark Syms 35:14
I think you're going to see a lot more work, given that the connection between hearing loss and cognitive decline is becoming clearer and clearer. So I think you'll see more and more work in that area. The other question I always ask people is, what's your favorite sound?
Dr. Christopher Conway 35:29
Mmm hmm. I’ve always liked the sound of a cat purring.
Dr. Mark Syms 35:37
That's a nice one. I just love to ask that question, because it really helps us appreciate how important hearing is to us and how much pleasure it really brings us. So that's great. That's a great answer. I love that. So, folks, we've been talking to Dr. Christopher Conway. He's the Director of the Brain, Learning and Language Laboratory at Boys Town National Research Hospital. Chris, where can people contact you and learn more about you?
Dr. Christopher Conway 36:03
Sure, yeah. You know, the easiest thing is to Google my name plus PhD, and I'll probably pop up. But I have a lab page on the Boys Town National Research Hospital website, so that would be another way. And my email is firstname.lastname@example.org
Dr. Mark Syms 36:21
That's great. Thank you so much for coming on the podcast. I really appreciate it, interesting stuff. And I look forward to learning more as I watch, as we all do watch, this field develop. Thanks so much for that, and thanks for the great work. It's really important stuff to do. Thanks again.
Dr. Christopher Conway 36:37
Thanks, appreciate it. I enjoyed it.
Thanks for tuning in to the ListenUp! podcast. We’ll see you again next time and be sure to click subscribe to get updates on future episodes.