Posts tagged "children"

Note:

At present, I write here infrequently. You can find my current, regular blogging over at The Deliberate Owl.

Randy, Elian at 8 months (sporting his lab t-shirt!), and me

Starting a family in grad school

I wasn't married when I got to MIT, but I had a boyfriend named Randy who moved up to Boston with me. Two years in, we discover that it is, in fact, possible to simultaneously plan a wedding and write a master's thesis! Two years after that? I'm sitting uncomfortably in a floppy hospital gown at Mt. Auburn Hospital using my husband's phone to forward the reviews I'd just received on a recent journal paper submission, hoping labor doesn't kick in full force before I finish canceling all my meetings and telling people that I'll be taking maternity leave a month sooner than expected.

Baby Elian is born later that night, tiny and perfect. The next three weeks are spent writing my PhD proposal from the waiting room while we wait for Elian to grow big enough to leave the hospital's nursery.

Our decision to have a baby during grad school was not made lightly. For a lot of students, grad school falls smack in the middle of prime mate-finding and baby-making years. But my husband and I knew we wanted kids. We knew fertility decreases over time, and didn't want to wait too long. In 2016, I was done with classes, on to the purely research part of the PhD program. My schedule was as flexible as it would ever be. Plus, I work with computers and robots—no cell cultures to keep alive, no chemicals I'd be concerned about while pregnant. Randy did engineering contract work (some for a professor at MIT) and was working on a small startup.

Was it the perfect time? As a fellow grad mom told me once, there's never a perfect time. Have babies when you're ready. That's it.

Okay, we agreed, now's the time. It'd be great, right? We'd have this adorable baby, then Randy would stay home most of the time and play with the baby while I finished up school. He'd even have time in the evenings and on weekends to continue his work.

Naiveté, hello.

Since my pregnancy was relatively easy (I got lucky—even my officemate's pickled cabbage and fermented fish didn't turn my stomach), we were optimistic that everything else would go well, too. The preterm birth was a surprise, sure, but maybe that was a fluke in our perfectly planned family adventure. Then it came time for me to go back to the lab full time. I'd read about attachment theory in psychology papers—i.e., the idea that babies form deep emotional bonds to their caregivers, in particular, their mothers. Cool theory, interesting implications about social relationships based on the kind of bond babies formed, and all that. It wasn't until the end of my maternity leave, when I handed our wailing three-month-old boy to my husband before walking out the door, that I internalized it: Elian wasn't just sad that I was going away. He needed me. I mean, looking at it from an evolutionary perspective, it made perfect sense. There I was, his primary source of food, shelter, and comfort, walking in the opposite direction. He had no idea where I was going or whether I'd be back. If I were him, I'd wail, too.

Us: 0. Developmental psychology: 1.

Finding a balance

This was going to be more difficult than we'd thought. For various financial and personal reasons, we had already decided not to put the baby in daycare. Other people's stories ("when he started daycare, he cried for a month, but then he got used to it") weren't our cup of tea. But our plan of me spending my days in the lab while the baby stayed home? That didn't pan out, either. In addition to Elian's distress at my absence, he generally refused pumped breast milk in favor of crying, hungry and sad.

So, we made new plans. These plans involved bringing Elian to the lab a lot (pretty easy at first: he'd happily wiggle on my desk for hours, entertained by his toes). That's also when I began to feel pressure to prove that what we're doing works. That I can do it. That I can be a woman, who has a baby, who's getting a PhD at MIT, who's healthy and happy and "having it all." "Having it all." No matter what I pick, kids or work or whatever, I'm making a choice about what's important. We all have limited time. What "all" do I want? What do I choose to do with my time? And am I happy with that choice?

Now, Elian's grown up wearing a Media Arts & Sciences onesie and a Personal Robots Group t-shirt. I'm fortunate that I can do this—I have a super supportive lab group and I know this definitely wouldn't work for everyone. Not only does our group do a lot of research with young kids, but my advisor has three kids of her own. My officemate has a six-year-old who I've watched grow up. Several other students have gotten married or had kids during their time here. As a bonus, the Media Lab has a pod for nursing mothers on the fifth floor, and a couple bathrooms even have changing tables. (That said, it's so much faster to just set the baby on the floor, whip off the old diaper, on with the new. If he tries to crawl away mid-change, as is his wont these days, he can only get as far as under my desk.)

Randy comes to campus more now, too. It's a common sight to see him from the Media Lab's glass-walled conference rooms, pacing the hallway with a sleeping baby in a carry pack while he answers emails on his tablet. I feed the baby between meetings, play for a while when Randy needs to run over to the Green Building for a contractor meeting, and it works out okay. We keep Elian from licking the robots and Elian makes friends from around the world, all of whom are way taller than he is. The best part? He's almost through the developmental stage in which he bursts into tears when he sees them!

I also have the luxury of working from home a lot. That's helped by two things: first, right now, I'm either writing code or writing papers—i.e., laptop? Check. Good to go. Second, my lab has undergone construction multiple times in the past year, so with all the hammering and paint fumes, no one else wants to work there, either.

Stronger, faster, better?

But it's not all sunshine, wobbly first steps, and happy baby coos. I think it's harder to be a parent in grad school as a woman. I know several guys who have kids; they can still manage a whole day—or three—of working non-stop, sleeping on a lab couch, all-night hacking sessions, attending conferences in Europe for a week while the baby stays home. Me? Sometimes, if I'm out of sight for five minutes, Elian loses it. Sometimes, we make it three hours. Some nights, waking up to breastfeed a sad, grumpy, teething baby, it's like I'm also pulling all-nighters, but without the getting work done part.

When I'm feeling overwhelmed, I remember a fictional girl named Keladry. The protagonist of Tamora Pierce's Protector of the Small quartet, she was the first girl in the kingdom to openly try to become a knight—traditionally a man's profession (see the parallel to academia?). She followed in the footsteps of another girl, Alanna, who opened the ranks by pretending to be a boy throughout her training, revealing her identity only when she was knighted. I remember Keladry because of the discipline and perseverance she embodied.

I remember her feeling that she had to be stronger, faster, and better than all the boys, because she wasn't just representing herself, she was representing all girls. Sometimes, I feel the same: that as a grad mom, I'm representing all grad moms. I have to be a role model. I have to stick it out, show that not only do I measure up, but that I can excel, despite being a mother. Because of being a mother. I have to show that it's a point in our favor, not a mark against us.

I remember Keladry's discipline: getting up early to train extra hard, working longer to make sure she exceeded the standard. I remember her standing tall in the face of bullies, trying to stay strong when others told her she wasn't good enough and wouldn't make it.

So I get up earlier, writing paper drafts in the dawn light with a sleeping baby nestled beside me. I debug code when he naps (even at 14 months, he still naps twice a day, lucky me). I train UROPs, run experimental studies, analyze data, and publish papers. I push on. I don't have to face down bullies like Keladry, and I'm fortunate to have a lot of support at MIT. But sometimes, it's still a struggle.

When I was talking through my ideas for this blog with other writers, one person said, "I'm not sure how you do it." I didn't have a good answer then, but here's what I should have said: I do it with the help of a super supportive husband, a strong commitment to the life choices I've made, and a large supply of Earl Grey tea.

This article originally appeared on the MIT Graduate Student Blog, February 2018



child leans over tablet showing a storybook, in front of a fluffy robot who is listening

Does the robot's expressivity affect children's learning and engagement?

Reading books is great. Reading picture books with kids is extra great, especially when kids are encouraged to actively process the story materials through dialogic reading (i.e., asking questions, talking about what's happening in the book and what might happen next, connecting stuff in the book to other stuff the kid knows). Dialogic reading can, for example, help kids learn new words and remember the story better.

Since we were already studying how we could use social robots as language learning companions and tutors for young kids, we decided to explore whether social robots could effectively engage preschoolers in dialogic reading. Given that past work has shown that children can and do learn new words from social robots, we decided to also look at what factors may modulate their engagement and learning—such as the verbal expressiveness of the robot.

fluffy robot tells a story to a child, who leans in over a tablet storybook listening

Tega robot

For this study, we used the Tega robot. Designed and built in the Personal Robots Group, it's a squash-and-stretch robot specifically designed to be an expressive, friendly creature. An Android phone displays an animated face and runs control software. The phone's sensors can be used to capture audio and video, which we can stream to another computer so a teleoperator can figure out what the robot should do next, or, in other projects, as input for various behavior modules, such as speech entrainment or affect recognition. We can stream live human speech, with the pitch shifted up to sound more child-like, to play on the robot, or play back recorded audio files.
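For the technically curious, the pitch-shifting step is conceptually simple. Below is a minimal offline sketch in Python using librosa; the real system shifted a live audio stream, and the shift amount and file names here are purely illustrative.

```python
# Minimal sketch: shift a teleoperator's recorded speech up a few
# semitones so it sounds more child-like. Offline for simplicity;
# the real system processed live audio. The 4-semitone shift and
# file names are illustrative, not the actual production values.
import librosa
import soundfile as sf

# Load the operator's speech at its native sample rate.
speech, sample_rate = librosa.load("operator_speech.wav", sr=None)

# Shift the pitch up so the voice sounds more child-like.
child_like = librosa.effects.pitch_shift(speech, sr=sample_rate, n_steps=4)

# Save the shifted audio so it can be played on the robot.
sf.write("robot_speech.wav", child_like, sample_rate)
```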

Here is a video showing one of the earlier versions of Tega. Here's research scientist Dr. Hae Won Park talking about Tega and some of our projects, with a newer version of the robot.

Study: Does vocal expressivity matter?

We wanted to understand how the robot's vocal expressiveness might impact children's engagement and learning during a story and dialogic reading activity. So we set up two versions of the robot. One used a voice with a wide range of intonation and emotion. The other read and conversed with a flat voice, which sounded similar to a classic text-to-speech engine and had little dynamic range. Both robots moved and interacted the exact same way—the only difference was the voice.

This video shows the robot's expressive and not-so-expressive voices.

Half of the 45 kids in the study heard the expressive voice; the other half heard the flat voice. They heard a story from the robot that had several target vocabulary words embedded in it. The robot asked dialogic questions during reading. Kids were asked to retell the story back to a fluffy purple toucan puppet (who had conveniently fallen asleep during the story and was so sad to have missed it).

We found that all children learned new words from the robot, emulated the robot's storytelling in their own story retells, and treated the robot as a social being. However, children who heard the story from the expressive robot showed deeper engagement, increased learning and story retention, and more emulation of the robot's story in their story retells.

This study provided evidence that children will show peer-to-peer modeling of a social robot's language. They will also emulate the robot's affect, and they will show deeper engagement and learning when the robot is expressive.

child smiling and looking up, beside fluffy robot and fluffy toucan puppet

Links

Publications

  • Kory-Westlund, J., Jeong, S., Park, H. W., Ronfard, S., Adhikari, A., Harris, P. L., DeSteno, D., & Breazeal, C. (2017). Flat versus expressive storytelling: young children's learning and retention of a social robot's narrative. Frontiers in Human Neuroscience, 11. [PDF] [online]


a girl reaches her hand toward the face of a fluffy red robot, which sits on the table in front of her

Socially Assistive Robotics

This project was part of the Year 3 thrust for the Socially Assistive Robotics: An NSF Expedition in Computing grant, which I was involved in at MIT in the Personal Robots Group.

The overall mission of this expedition was to develop the computational techniques that could enable the design, implementation, and evaluation of "relational" robots, in order to encourage social, emotional, and cognitive growth in children, including those with social or cognitive deficits. The expedition aimed to increase the effectiveness of technology-based interventions in education and healthcare and to enhance the lives of children who may require specialized support in developing critical skills.

The Year 1 project targeted nutrition; Year 3 targeted language learning (that's this project!); Year 5 targeted social skills.

Second-language learning companions

This project was part of our effort at MIT to develop robotic second-language learning companions for preschool children. (We did other work in this area too: e.g., several projects looking at what design features positively impact children's learning as well as how children learn and interact over time.)

The project had two main goals. First, we wanted to test whether a socially assistive robot could help children learn new words in a foreign language (in this case, Spanish) more effectively by personalizing its affective/emotional feedback.

Second, we wanted to demonstrate that we could create and deploy a fully autonomous robotic system at a school for several months.

a boy sits at a table with a fluffy robot on it and leans in to peer at the robot's face, while the robot looks down at a tablet

Tega Robot

We used the Tega robot. Designed and built in the Personal Robots Group, it's a squash-and-stretch robot specifically designed to be an expressive, friendly creature. An Android phone displays an animated face and runs control software. The phone's sensors can be used to capture audio and video, which we can stream to another computer so a teleoperator can figure out what the robot should do next, or, in other projects, as input for various behavior modules, such as speech entrainment or affect recognition. We can stream live human speech, with the pitch shifted up to sound more child-like, to play on the robot, or play back recorded audio files.

Here is a video showing one of the earlier versions of Tega. Here's research scientist Dr. Hae Won Park talking about Tega and some of our projects, with a newer version of the robot.

A fluffy red robot sits behind a tablet, which is laying on a table

Language learning game

We created an interactive game that kids could play with a fully autonomous robot and the robot's virtual sidekick, a toucan shown on a tablet screen. The game was designed to support second language acquisition. The robot and the virtual agent each took on the role of a peer or learning companion and accompanied the child on a make-believe trip to Spain, where they learned new words in Spanish together.

Two aspects of the interaction were personalized to each child: (1) the content of the game (i.e., which words were presented), and (2) the robot's affective responses to the child's emotional state and performance.
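To make those two knobs concrete, here's a toy sketch in Python. Every name, rule, and threshold in it is my own invention for illustration; the actual system used much richer models of the child's emotional state and word knowledge.

```python
# Toy sketch of the two personalization knobs described above.
# All names and rules are invented for illustration; the deployed
# system's policies were more sophisticated.

def pick_next_word(word_attempts):
    """Content personalization: revisit Spanish words the child
    hasn't gotten right yet before moving on to new ones."""
    unmastered = [word for word, correct in word_attempts.items() if not correct]
    return unmastered[0] if unmastered else None  # None -> introduce a new word

def pick_affective_response(child_valence, answered_correctly):
    """Affective personalization: choose the robot's emotional
    reaction based on the child's mood and performance."""
    if answered_correctly:
        # Mirror an excited child; warmly praise a subdued one.
        return "excited" if child_valence > 0 else "warm_praise"
    # Stay gentle and encouraging after a wrong answer.
    return "encouraging"
```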

This video shows the robot, game, and interaction.

scene from a tablet app showing a toucan looking at things in a bedroom: a suitcase, a closet, shirts, balls, a hat

Study

We conducted a 2-month study in three "special start" preschool classrooms at a public school in the Greater Boston Area. Thirty-four children ages 3-5, with 15 classified as special needs and 19 as typically developing, participated in the study.

The study took place over nine sessions: initial assessments, seven sessions playing the language learning game with the robot, and a final session of goodbyes with the robot and posttests.

We found that children learned new words presented during the interaction, that they mimicked the robot's behavior, and that the robot's affective personalization led to more positive responses from the children. This study provided evidence that children will engage a social robot as a peer over time, and that personalizing a robot's behavior to children can lead to positive outcomes, such as greater liking of the interaction.

a girl mimics the head tilt and expression shown by a fluffy robot

Links

Publications

  • Kory-Westlund, J., Gordon, G., Spaulding, S., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2015). Learning a Second Language with a Socially Assistive Robot. In Proceedings of New Friends: The 1st International Conference on Social Robots in Therapy and Education. (*equal contribution). [PDF]

  • Kory-Westlund, J. M., Lee, J., Plummer, L., Faridia, F., Gray, J., Berlin, M., Quintus-Bosz, H., Harmann, R., Hess, M., Dyer, S., dos Santos, K., Adalgeirsson, S., Gordon, G., Spaulding, S., Martinez, M., Das, M., Archie, M., Jeong, S., & Breazeal, C. (2016). Tega: A Social Robot. In S. Sabanovic, A. Paiva, Y. Nagai, & C. Bartneck (Eds.), Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction: Video Presentations (p. 561). Best Video Nominee. [PDF] [Video]

  • Gordon, G., Spaulding, S., Kory-Westlund, J., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2016). Affective Personalization of a Social Robot Tutor for Children's Second Language Skills. Proceedings of the 30th AAAI Conference on Artificial Intelligence. AAAI: Palo Alto, CA. [PDF]

  • Kory-Westlund, J. M., Gordon, G., Spaulding, S., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2016). Lessons From Teachers on Performing HRI Studies with Young Children in Schools. In S. Sabanovic, A. Paiva, Y. Nagai, & C. Bartneck (Eds.), Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction: alt.HRI (pp. 383-390). [PDF]



a pair of bright, fluffy dragon robots sitting beside each other on a table

Social robots as language learning companions for children

Language learning is, by nature, a social, interactive, interpersonal activity. Children learn language not only by listening, but through active communication with a social actor. Social interaction is critical for language learning.

Thus, if we want to build technology to support young language learners, one intriguing direction is to use robots. Robots can be designed to use the same kinds of social, interactive behaviors that humans use—their physical presence and embodiment give them a leg up in social, interpersonal tasks compared to virtual agents or simple apps and games. They combine the adaptability, customizability, and scalability of technology with the embodied, situated world in which we operate.

The robot we used in these projects is called the DragonBot. Designed and built in the Personal Robots Group, it's a squash-and-stretch robot specifically designed to be an expressive, friendly creature. An Android phone displays an animated face and runs control software. The phone's sensors can be used to capture audio and video, which we can stream to another computer so a teleoperator can figure out what the robot should do next, or, in other projects, as input for various behavior modules, such as speech entrainment or affect recognition. We can stream live human speech, with the pitch shifted up to sound more child-like, to play on the robot, or play back recorded audio files.

Here is a video showing the original DragonBot robot, with a brief rundown of its cool features.

A child and a woman sit in front of a small table, looking at and talking with two fluffy dragon robots that are on the table

Social robots as informants

This was one of the very first projects I worked on at MIT! Funded by an NSF cyberlearning grant, this study and the ones that followed explored several questions regarding preschool children's word learning from social robots, namely:

  • What can make a robot an effective language learning companion?
  • What design features of the robots positively impact children's learning and attitudes?

In this study, we wanted to explore how different nonverbal social behaviors impacted children's perceptions of the robot as an informant and social companion.

We set up two robots. One was contingently responsive to the child—e.g., it would look at the child when the child spoke, it might nod and smile at the right times. The other robot was not contingent—it might be looking somewhere over there while the child was speaking, and while it was just as expressive, the timing of its nodding and smiling had nothing to do with what the child was doing.

For this study, the robots were both teleoperated by humans. I was one of the teleoperators—it was like controlling a robotic muppet!

Each child who participated in the study got to talk with both robots at the same time. The robots presented some facts about unusual animals (i.e., opportunities for the child to learn). We did some assessments and activities designed to give us insight into how the child thought about the robots and how willing they might be to learn new information from each robot—i.e., did the contingency of the robot's nonverbal behavior affect whether kids would treat the robots as equally reliable informants?

We found that children treated both robots as interlocutors and as informants from whom they could seek information. However, children were especially attentive and receptive to whichever robot displayed the greater nonverbal contingency. This selective information seeking is consistent with other recent research showing that children are quite sensitive to their interlocutor's nonverbal signals and use those signals as cues when determining which informants to question or endorse.

In sum: This study provided evidence that children are sensitive to a robot's nonverbal social cues, just as they are with humans, and that they use this information when deciding whether a robot is a credible informant.

Links

Publications

  • Breazeal, C., Harris, P., DeSteno, D., Kory, J., Dickens, L., & Jeong, S. (2016). Young children treat robots as informants. Topics in Cognitive Science, pp. 1-11. [PDF]

  • Kory, J., Jeong, S., & Breazeal, C. L. (2013). Robotic learning companions for early language development. In J. Epps, F. Chen, S. Oviatt, & K. Mase (Eds.), Proceedings of the 15th ACM International Conference on Multimodal Interaction (pp. 71-72). ACM: New York, NY. [on ACM]

Word learning with social robots

We did two studies specifically looking at children's rapid learning of new words. Would kids learn words with a robot as well as they do from a human? Would they attend to the robot's nonverbal social cues, like they do with humans?

Study 1: Simple word learning

This study was pretty straightforward: Children looked at pictures of unfamiliar animals with a woman, with a tablet, and with a social robot. The interlocutor provided the names of the new animals—new words for the kids to learn. In this simple word-learning task, children learned new words equally well from all three interlocutors. We also found that children appraised the robot as an active, social partner.

In sum: This study provided evidence that children will learn from social robots, and will think of them as social partners. Great!

With that baseline in place, we compared preschoolers' learning of new words from a human and from a social robot in a somewhat more complex learning task...

Two panels: In the first, a child looks at a dragon robot, which looks at her while saying a word; in the second, the child watches the robot look down at a tablet

Study 2: Slightly less simple word learning

When learning from human partners, children pay attention to nonverbal signals, such as gaze and bodily orientation, to figure out what a person is looking at and why. They may follow gaze to determine what object or event triggered another's emotion, or to learn about the goal of another's ongoing action. They also follow gaze in language learning, using the speaker's gaze to figure out what new objects are being referred to or named. Would kids do that with robots, too? Children viewed two images of unfamiliar animals at once, and their interlocutor (human or robot) named one of the animals. Children needed to monitor the interlocutor's non-verbal cues (gaze and bodily orientation) to determine which picture was being referred to.

We added one more condition. How "big" did the interlocutor's actions need to be for the child to figure out which picture was being referred to? Half the children saw the images close together, so the interlocutor's cues were similar regardless of which animal was being attended to and named. The other half saw the images farther apart, which made the interlocutor's cues "bigger" and more distinct.

As you might expect, when the images were presented close together, children subsequently identified the correct animals at chance level with both interlocutors. So ... the nonverbal cues weren't distinct enough.

When the images were presented further apart, children identified the correct animals at better than chance level from both interlocutors. Now it was easier to see where the interlocutor was looking!
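(For the statistically inclined: "better than chance" here means comparing children's identification accuracy against the 50% you'd expect from guessing between two pictures. Here's a toy version of such a test in Python, with made-up numbers rather than the study's actual data.)

```python
# Toy illustration of testing identification accuracy against chance
# (50% when guessing between two pictures). The counts below are
# invented for illustration, not the study's data.
from scipy.stats import binomtest

n_trials = 20    # hypothetical identification trials
n_correct = 15   # hypothetical correct identifications

result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"p-value against chance guessing: {result.pvalue:.3f}")
```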

Children learned equally well from the robot and the human. Thus, this study provided evidence that children will attend to a social robot's nonverbal cues during word learning as a cue to linguistic reference, as they do with people.

Links

Publications

  • Kory-Westlund, J., Dickens, L., Jeong, S., Harris, P., DeSteno, D., & Breazeal, C. (2015). A comparison of children learning from robots, tablets, and people. In Proceedings of New Friends: The 1st International Conference on Social Robots in Therapy and Education. [talk] [PDF]

  • Kory-Westlund, J. M., Dickens, L., Jeong, S., Harris, P. L., DeSteno, D., & Breazeal, C. L. (2017). Children use non-verbal cues to learn new words from robots as well as people. International Journal of Child-Computer Interaction. [PDF]



a young girl hugging a fluffy dragon robot behind a little play table

Click here to see the video showing this project!

Study Overview

For my master's thesis at the MIT Media Lab, I created a social robotic learning companion that played a storytelling game with young kids.

Children’s oral language skills in preschool can predict their academic success later in life. Helping children improve their language and vocabulary skills early on could help them succeed later. Furthermore, language learning is a highly social, interactive activity. When creating technology to support children's language learning, technology that leverages the same social cues and social presence that people do—such as a social robot—will likely provide more benefit than using technology that ignores the critical social aspects of language learning.

As such, in this project, I examined the potential of a social robotic learning companion to support children's early long-term language development.

Boy sitting on the floor across a mini table from a dragon robot, looking at the robot intently

Study

The robot was designed as a social character, engaging children as a peer, not as a teacher, within a relational, dialogic context. The robot targeted the social, interactive nature of language learning through a storytelling game that the robot and child played together. The game was on a tablet—the tablet showed a couple characters that the robot or child could move around while telling their story, much like digital stick puppets. During the game, the robot introduced new vocabulary words and modeled good story narration skills.

Girl moving a picture on a tablet screen, with the tablet inset in a mini table that is between her and a dragon robot

Furthermore, because children may learn better when appropriately challenged, we asked whether a robot that matched the "level" of complexity of its language to the child's general language ability might help children improve more. For half the children (the Matched condition), the robot told easier or harder stories based on an assessment of the child's general language ability, as sketched below.
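The core idea of the matching is simple enough to sketch in a few lines of Python; the threshold, names, and normalized score here are invented for illustration, not the actual assessment we used.

```python
# Minimal sketch of the Matched condition's core idea: pick the
# robot's story level from an assessment of the child's language
# ability. The threshold and names are invented for illustration.

def choose_story_level(language_score, threshold=0.5):
    """Return which story set the robot tells, given the child's
    assessed language ability normalized to 0..1."""
    return "harder" if language_score >= threshold else "easier"

# Example: a child scoring 0.3 on the assessment gets easier stories.
print(choose_story_level(0.3))  # -> "easier"
```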

Seventeen preschool children played the storytelling game with the robot eight times each over a two-month period.

I evaluated children's perceptions of the robot and the game, as well as whether the robot's matching influenced (i) whether children learned new words from the robot, (ii) the complexity and style of stories children told, and (iii) the similarity of children’s stories to the robot’s stories. I expected that children would learn more from a robot that matched, and that they would copy its stories and narration style more than they would with a robot that did not match. Children’s language use was tracked across sessions.

Boy touching a screen that is in a mini table that is between him and a dragon robot, the robot is also looking at the table

Results

I found that all children learned new vocabulary words, created new stories during the game, and enjoyed playing with the robot. In addition, children in the Matched condition maintained or increased the amount and diversity of the language they used during interactions with the robot more than children who played with the Unmatched robot.

Understanding how the robot influences children's language, and how a robot could support language development, will inform the design of future learning/teaching companions that engage children as peers in educational play.

Girl looking intently over a mini table at a dragon robot

Links

Publications

  • Kory, J. (2014). Storytelling with robots: Effects of robot language level on children's language learning. Master's Thesis, Media Arts and Sciences, Massachusetts Institute of Technology, Cambridge, MA. [PDF]

  • Kory, J., & Breazeal, C. (2014). Storytelling with Robots: Learning Companions for Preschool Children’s Language Development. In P. A. Vargas & R. Aylett (Eds.), Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE: Washington, DC. [PDF]

  • Kory-Westlund, J., & Breazeal, C. (2015). The Interplay of Robot Language Level with Children's Language Learning during Storytelling. In J. A. Adams, W. Smart, B. Mutlu, & L. Takayama (Eds.), Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction: Extended Abstracts (pp. 65-66). [on ACM]

  • Kory-Westlund, J. (2015). Telling Stories with Green the DragonBot: A Showcase of Children's Interactions Over Two Months. In J. A. Adams, W. Smart, B. Mutlu, & L. Takayama (Eds.), Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction: Extended Abstracts (p. 263). [on ACM] [PDF] [Video] Winner of Best Video Award.

  • Kory-Westlund, J. M., & Breazeal, C. (2019). Exploring the effects of a social robot's speech entrainment and backstory on young children's emotion, rapport, relationships, and learning. Frontiers in Robotics and AI, 6. [PDF] [online]

