Posts tagged "projects"


At present, I write here infrequently. You can find my current, regular blogging over at The Deliberate Owl.

A girl grins at a red and blue fluffy robot and puts her arm around it

Relational AI: Creating long-term interpersonal interaction, rapport, and relationships with social robots

Children today are growing up with a wide range of Internet of Things devices, digital assistants, personal home robots for education, health, and security, and more. With so many AI-enabled socially interactive technologies entering everyday life, we need to deeply understand how these technologies affect us—such as how we respond to them, how we conceptualize them, what kinds of relationships we form with them, the long-term consequences of use, and how to mitigate ethical concerns (of which there are many).

In my dissertation, I explored some of these questions through the lens of children's interactions and relationships with social robots that acted as language learning companions.

Many of the other projects I worked on at the MIT Media Lab explored how we could use social robots as a technology to support young children's early language development. When I turned to relational AI, instead of focusing simply on how to make social robots effective as educational tools, I delved into why they are effective—as well as the ethical, social, and societal implications of bringing social-relational technology into children's lives.

Here is a précis of my dissertation. (Or read the whole thing!)

a girl looks at the dragonbot robot as it tells a story

Exploring children's relationships with peer-like social robots

In earlier projects in the Personal Robots Group, we had found evidence that children can learn language skills with social robots—and the robot's social behaviors seemed to be a key piece of why children responded so well! One key strategy children used to learn with the robots was social emulation—i.e., copying or mirroring the behaviors used by the robot, such as speech patterns, words, and even curiosity and a growth mindset.

My hunch, and my key hypothesis, was this: Social robots can benefit children because they can be social and relational. They can tap into our human capacity to build and respond to relationships. Relational technology, thus, is technology that can build long-term, social-emotional relationships with users.

I took a new look at data I'd collected during my master's thesis to see if there was any evidence for my hypothesis. Spoiler: There was. Children's emulation of the robot's language during the storytelling activity appeared to be related both to children's rapport with the robot and their learning.

Assessing children's relationships

Because I wanted to measure children's relationships with the robot and gain an understanding of how children treated it relative to other characters in their lives, I created a bunch of assessments. Here's a summary of a few of them.

We used some of these in another longitudinal learning study where kids listened to and retold stories with a social robot. I found correlations between measures of engagement, learning, and relationships. For example, children who reported a stronger relationship or rated the robot as a greater social-relational agent showed higher vocabulary posttest scores. These were promising results...

So, armed with my assessments and hypotheses, I ran some more experimental studies.

a boy sits across a table from a red and blue robot

Evaluating relational AI: Entrainment and Backstory

First, I performed a one-session experiment that explored whether enabling a social robot to perform two rapport- and relationship-building behaviors would increase children's engagement and learning: entrainment and self-disclosure (backstory).

In positive human-human relationships, people frequently mirror or mimic each other's behavior. This mimicry (also called entrainment) is associated with rapport and smoother social interaction. I gave the robot a speech entrainment module, which matched vocal features of the robot's speech, such as speaking rate and volume, to the user's.
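
In code, the gist of such an entrainment module is a loop that nudges each matched text-to-speech parameter partway toward the child's measured value, clamped to a safe range. This is a minimal sketch with hypothetical names, feature sets, and bounds—not the actual module:

```python
def entrain_speech(robot_params, user_features, weight=0.5):
    """Move each matched vocal feature partway toward the user's value.

    robot_params / user_features: dicts with keys like 'rate'
    (syllables/sec) and 'volume' (normalized 0..1).
    weight: 0 = no entrainment, 1 = full mimicry.
    """
    bounds = {"rate": (2.0, 6.0), "volume": (0.3, 1.0)}  # TTS safety limits
    entrained = {}
    for key, robot_value in robot_params.items():
        target = user_features.get(key, robot_value)
        value = robot_value + weight * (target - robot_value)
        lo, hi = bounds[key]
        entrained[key] = max(lo, min(hi, value))  # clamp to safe range
    return entrained

# A fast-talking, quiet child pulls the robot's speech faster and quieter:
params = entrain_speech({"rate": 3.0, "volume": 0.9},
                        {"rate": 5.0, "volume": 0.5})
```

The partial weight is the point of the sketch: rather than copying the child outright, the robot shifts gently toward the child's speech, closer to how people entrain to each other in conversation.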

I also had the robot disclose personal information about its poor speech and hearing abilities in the form of a backstory.

Eighty-six kids played with the robot in a 2x2 study (entrainment vs. no entrainment, backstory vs. no backstory). The robot engaged the children one-on-one in conversation, told a story embedded with key vocabulary words, and asked children to retell the story.

I measured children's recall of the key words and their emotions during the interaction, examined their story retellings, and asked children questions about their relationship with the robot.

I found that the robot's entrainment led children to show more positive emotions and fewer negative emotions. Children who heard the robot's backstory were more likely to accept the robot's poor hearing abilities. Entrainment paired with backstory led children to emulate more of the robot's speech in their stories; these children were also more likely to comply with one of the robot's requests.

In short, the robot's speech entrainment and backstory appeared to increase children's engagement and enjoyment in the interaction, improve their perception of the relationship, and contribute to their success at retelling the story.

A girl smiles at a red and blue fluffy robot

Evaluating relational AI: Relationships through time

My goals in the final study were twofold. First, I wanted to understand how children think about social robots as relational agents in learning contexts, especially over multiple encounters. Second, I wanted to see how adding relational capabilities to a social robot would impact children's learning, engagement, and relationship with the robot.

Long-term study

Would children who played with a relational robot show greater rapport, a closer relationship, increased learning, greater engagement, more positive affect, and more peer mirroring, and treat the robot as more of a social other, than children who played with a non-relational robot? Would children who reported feeling closer to the robot (regardless of condition) show more learning and peer mirroring?

In this study, 50 kids played with either a relational or a non-relational robot. The relational robot was situated as a socially contingent agent, using entrainment and affect mirroring; it referenced shared experiences, such as past activities performed together, and used the child's name; it took specific actions to manage the relationship; and it told stories personalized in both level (i.e., syntactic difficulty) and content (i.e., the similarity of the robot's stories to the child's).

The non-relational robot did not use these features; it simply followed its script. It did, however, personalize stories by level, since this is beneficial but not specifically related to the relationship.
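
To illustrate what level personalization can look like, here is a minimal mastery-style heuristic in the spirit of that feature; the thresholds, scoring, and level range are hypothetical, not the robot's actual algorithm:

```python
def next_story_level(current_level, retell_score, max_level=10):
    """Pick the syntactic difficulty level of the next story.

    retell_score: fraction (0..1) of story elements the child
    reproduced in their retelling of the previous story.
    """
    if retell_score >= 0.75:              # comfortable: step up
        return min(current_level + 1, max_level)
    if retell_score < 0.40:               # struggling: step down
        return max(current_level - 1, 1)
    return current_level                  # in the sweet spot: hold steady
```

The idea is to keep each child in a productive zone: stories hard enough to introduce new language, but not so hard that retelling breaks down.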

Each child participated in a pretest session; eight sessions with the robot, each of which included a pretest, the robot interaction (greeting, conversation, story activity, and closing), and a posttest; and a final posttest session.

graph showing that children who rated robot as more social and relational also showed more learning

Results: Relationships, learning, and ... gender?

I collected a unique dataset about children's relationships with a social robot over time, which enabled me to look beyond whether children liked the robot or not or whether they learned new words or not. The main findings include:

  • Children in the Relational condition reported that the robot was a more human-like, social, relational agent and responded to it in more social and relational ways. They often showed more positive affect, disclosed more information over time, and reported becoming more accepting of both the robot and other children with disabilities.

  • Children in the Relational condition showed stronger correlations between their scores on the relationship assessments and their learning and behavior, such as their vocabulary posttest scores, emulation of the robot's language during storytelling, and use of target vocabulary words.

  • Regardless of condition, children who rated the robot as a more social and relational agent were more likely to treat it as such, as well as showing more learning.

  • Children's behavior showed that they thought of the robot and their relationship with it differently than their relationships with their parents, friends, and pets. They appeared to understand that the robot was an "in between" entity that had some properties of both alive, animate beings and inanimate machines.

The results of the study provide evidence for links between children's imitation of the robot during storytelling, their affect and valence, and their construal of the robot as a social-relational other. A large part of the power of social robots seems to come from their social presence.

In addition, children's behavior depended on both the robot's behavior and their own personalities and inclinations. Girls and boys seemed to imitate, interact, and respond differently to the relational and non-relational robots. Gender may be something to pay attention to in future work!

Ethics, design, and implications

I include several chapters in my dissertation discussing the design implications, ethical implications, and theoretical implications of my work.

Because of the power social and relational interaction has for humans, relational AI has the potential to engage and empower not only children across many domains—such as education, therapy, and long-term pediatric health support—but also other populations: older children, adults, and the elderly. We can and should use relational AI to help all people flourish, to augment and support human relationships, and to enable people to be happier, healthier, more educated, and more able to lead the lives they want to live.

Further reading



  • Kory-Westlund, J. M. (2019). Relational AI: Creating Long-Term Interpersonal Interaction, Rapport, and Relationships with Social Robots. PhD Thesis, Media Arts and Sciences, Massachusetts Institute of Technology, Cambridge, MA. [PDF]

  • Kory-Westlund, J. M., & Breazeal, C. (2019). A Long-Term Study of Young Children's Rapport, Social Emulation, and Language Learning With a Peer-Like Robot Playmate in Preschool. Frontiers in Robotics and AI, 6. [PDF] [online]

  • Kory-Westlund, J. M., & Breazeal, C. (2019). Exploring the effects of a social robot's speech entrainment and backstory on young children's emotion, rapport, relationships, and learning. Frontiers in Robotics and AI, 6. [PDF] [online]

  • Kory-Westlund, J. M., & Breazeal, C. (2019). Assessing Children's Perception and Acceptance of a Social Robot. Proceedings of the 18th ACM Interaction Design and Children Conference (IDC) (pp. 38-50). ACM: New York, NY. [PDF]

  • Kory-Westlund, J. M., Park, H., Williams, R., & Breazeal, C. (2018). Measuring Young Children's Long-term Relationships with Social Robots. Proceedings of the 17th ACM Interaction Design and Children Conference (IDC) (pp. 207-218). ACM: New York, NY. [talk] [PDF]

  • Kory-Westlund, J. M., Park, H. W., Williams, R., & Breazeal, C. (2017). Measuring children's long-term relationships with social robots. Workshop on Perception and Interaction Dynamics in Child-Robot Interaction, held in conjunction with Robotics: Science and Systems XIII (pp. 625-626). [Workshop website] [PDF]


child leans over tablet showing a storybook, in front of a fluffy robot who is listening

Does the robot's expressivity affect children's learning and engagement?

Reading books is great. Reading picture books with kids is extra great, especially when kids are encouraged to actively process the story materials through dialogic reading (i.e., asking questions, talking about what's happening in the book and what might happen next, connecting stuff in the book to other stuff the kid knows). Dialogic reading can, e.g., help kids learn new words and remember the story better.

Since we were already studying how we could use social robots as language learning companions and tutors for young kids, we decided to explore whether social robots could effectively engage preschoolers in dialogic reading. Given that past work has shown that children can and do learn new words from social robots, we decided to also look at what factors may modulate their engagement and learning—such as the verbal expressiveness of the robot.

fluffy robot tells a story to a child, who leans in over a tablet storybook listening

Tega robot

For this study, we used the Tega robot. Designed and built in the Personal Robots Group, it's a squash-and-stretch robot specifically designed to be an expressive, friendly creature. An Android phone displays an animated face and runs control software. The phone's sensors can be used to capture audio and video, which we can stream to another computer so a teleoperator can figure out what the robot should do next, or, in other projects, as input for various behavior modules, such as speech entrainment or affect recognition. We can stream live human speech, with the pitch shifted up to sound more child-like, to play on the robot, or play back recorded audio files.
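
The pitch-shifting idea can be illustrated with a toy resampler: reading samples back at a higher rate raises the pitch, though this naive version also shortens the audio, so a live teleoperation pipeline would use a duration-preserving shifter instead. This is purely an illustration, not the robot's actual audio code:

```python
def pitch_shift_naive(samples, semitones):
    """Resample a list of audio samples to shift pitch by `semitones`."""
    factor = 2 ** (semitones / 12)  # frequency ratio for the shift
    out = []
    position = 0.0
    while position < len(samples) - 1:
        i = int(position)
        frac = position - i
        # linear interpolation between neighboring samples
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        position += factor
    return out

# Shifting up one octave (12 semitones) roughly halves the length:
tone = [float(i % 10) for i in range(1000)]
shifted = pitch_shift_naive(tone, 12)
```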

Here is a video showing one of the earlier versions of Tega. Here's research scientist Dr. Hae Won Park talking about Tega and some of our projects, with a newer version of the robot.

Study: Does vocal expressivity matter?

We wanted to understand how the robot's vocal expressiveness might impact children's engagement and learning during a story and dialogic reading activity. So we set up two versions of the robot. One used a voice with a wide range of intonation and emotion. The other read and conversed with a flat voice, which sounded similar to a classic text-to-speech engine and had little dynamic range. Both robots moved and interacted the exact same way—the only difference was the voice.

This video shows the robot's expressive and not-so-expressive voices.

Half of the 45 kids in the study heard the expressive voice; the other half heard the flat voice. They heard a story from the robot that had several target vocabulary words embedded in it. The robot asked dialogic questions during reading. Kids were asked to retell the story back to a fluffy purple toucan puppet (who had conveniently fallen asleep during the story and was so sad to have missed it).

We found that all children learned new words from the robot, emulated the robot's storytelling in their own story retells, and treated the robot as a social being. However, children who heard the story from the expressive robot showed deeper engagement, increased learning and story retention, and more emulation of the robot's story in their story retells.

This study provided evidence that children will show peer-to-peer modeling of a social robot's language, emulate the robot's affect, and show deeper engagement and learning when the robot is expressive.

child smiling and looking up, beside fluffy robot and fluffy toucan puppet



  • Kory-Westlund, J., Jeong, S., Park, H. W., Ronfard, S., Adhikari, A., Harris, P. L., DeSteno, D., & Breazeal, C. (2017). Flat versus expressive storytelling: young children's learning and retention of a social robot's narrative. Frontiers in Human Neuroscience, 11. [PDF] [online]


three white clay bowls sitting on a plank of wood

Clay bowls!

Because I was having so much fun in the fall making bowls, I signed up for more classes during the January Independent Activities Period (IAP) and the Spring semester. And I made more bowls.

(I should acknowledge that we were taught how to make a variety of different forms, including mugs, vases, and little jars with lids... but I have a fondness for bowls. They're the most useful.)

My goal during the first couple classes was to get better at centering my clay on the wheel. It's a critical step. If the clay isn't centered, you will get something very lopsided and uneven as a result. It can take a lot of practice to get the feel for it. Here are my bowls from January:

bowl half pale blue and half yellow-gold, with an hourglass-esque pattern where the colors overlap

side view of a blue and off-white glazed bowl, middle roundly bulging out

top down view of a bowl with a purple rim and purple spots on top of pale blue and a streak of pinkish red

two white clay bowls, taller than they are wide, unglazed

top down bowl with half matte yellow glaze and half shiny pale blue glaze

During one of the later classes in the spring, we did timed trials, an exercise aimed at helping you get faster at each step once you've gotten the basics down. We were given a set number of minutes or seconds to perform each step in making a form -- like centering the clay, making an opening, forming the walls, and so on. We were also required to make a certain shape, such as a form that was taller than it was wide, or with an opening smaller than the width of its base. These are the bowls I made during these trials:

four brown clay bowls, unglazed

bowl with an opening smaller at the top, brown and blue glazes with white spots

side view of a bowl, white and green and yellow

side view of a bowl that is narrow at the bottom, bulges out, and is somewhat narrower at the top, glazed in sea green, rusty brown, and blue

side view of a round, flat bowl, purple inside, white and yellow-gold matte outside

Another thing I was working on was making the walls of the forms a uniform thickness. Because you draw the clay up to make the form taller, it was pretty easy to end up with thicker clay near the base (where you didn't draw enough of it up) and thinner clay around the rim. This meant I had to do a lot of trimming later to fix the bases.

side view of the base of a bowl that has five small ridges circling the bottom before the bowl flares up and out

bottom of a white clay bowl, showing my initials

I also wanted to experiment with the various glazes available. What interesting combinations could I come up with? This was an interesting challenge, since before firing, glazes generally look nothing like their final forms... as you can see in these before and after images:

five glazed bowls before firing, in various dull shades of brown

five glazed bowls after firing, shiny and brightly colored

I really liked the glaze effects on the brown and purple one in the bottom right, so I tried to duplicate it in another bowl later:

top down view of a bowl, brown and purple


Here's another sequence of bowls, from start to finish:

four brown clay bowls

four glazed, unfired bowls

four glazed, fired, colorful shiny bowls

Bonus bowls from later in the semester:

side view of a bowl with a rounded base and straight sides, glazed half sea green and half white, with brown along the rim

side-top view of a brown bowl with turquoise and blue polka dots inside

side view of a bowl with a round bulging base, fairly straight sides and a thin rim, glazed in browns and blues

We also played with marbling two clay bodies together -- using both white and brown clay in the same form. Here are my two bowls with marbled clay after their bisque firing:

unglazed bowls with two clays so you can see the swirling of the white and brown clays together

Same bowl, two views so you can see how the glaze patterns are asymmetrical:

side view of a round marbled clay bowl, yellow-gold with a blue rim

marbled clay bowl seen from the side and top, blue on the rim dripping down to mix with the brown and white sides

The other bowl, with a close up of the cool dripping glaze on the inside:

side view of a marbled clay bowl with gold and pale blue-green glazes, with a brown-white glaze dripping around the rim

close up of dripping glaze on the rim of a bowl



Bowls, bowls, and ... bowls!

I took a ceramics class! Hunks of clay, a spinning pottery wheel, mud, the whole nine yards. It was really fun taking a proper art class again. I haven't done that in a while. Making things is a nice break from the writing and programming that's been my academic life of late, with the extra awesome bonus that the pretty things I made are also functional.


The first two were kind of lopsided. As you can see, it took a few tries to get the hang of making the clay form a bowl shape. The turquoise glaze on this one, however, makes it look like it's made of old copper with a patina layer on the surface, like the Statue of Liberty. Pretty cool effect.


The next two bowls I threw looked nice at first, but they dried out between the initial throwing and when I came back to trim them later. So, I got to smash them with a hammer. The remnants got put into the "leftovers" bucket that eventually gets remixed into useable clay.

Later in the semester, we learned how to marble two clay bodies together -- using both white and brown clay. Here's a photo of my two marbled bowls, drying out before their first firing:


After the first firing, you apply glaze, then fire again. Interesting thing about glaze: it's a bucket of thick sediment in water. It's nothing like paint and the colors are nothing like the final product. Sediment + high heat = different colors! Chemistry is fascinating like that.

The glaze on the rim of this marbled bowl turned out to have very interesting effects - see the light, cloudy, feathery features as it ran down the inside of the bowl?


Here are two other bowls waiting for their first firing, nice and round. Focusing on shape and form was a fun change to explore -- much of the other art I've done lately (like painting) has had an emphasis on color. I really like the shape of the bowl on the right:


Bottom of that righthand bowl, after glazing. I've been signing them all with my initials!



a girl reaches her hand toward the face of a fluffy red robot, which sits on the table in front of her

Socially Assistive Robotics

This project was part of the Year 3 thrust for the Socially Assistive Robotics: An NSF Expedition in Computing grant, which I was involved in at MIT in the Personal Robots Group.

The overall mission of this expedition was to develop the computational techniques that could enable the design, implementation, and evaluation of "relational" robots, in order to encourage social, emotional, and cognitive growth in children, including those with social or cognitive deficits. The expedition aimed to increase the effectiveness of technology-based interventions in education and healthcare and to enhance the lives of children who may require specialized support in developing critical skills.

The Year 1 project targeted nutrition; Year 3 targeted language learning (that's this project!); Year 5 targeted social skills.

Second-language learning companions

This project was part of our effort at MIT to develop robotic second-language learning companions for preschool children. (We did other work in this area too: e.g., several projects looking at what design features positively impact children's learning as well as how children learn and interact over time.)

The project had two main goals. First, we wanted to test whether a socially assistive robot could help children learn new words in a foreign language (in this case, Spanish) more effectively by personalizing its affective/emotional feedback.

Second, we wanted to demonstrate that we could create and deploy a fully autonomous robotic system at a school for several months.

a boy sits at a table with a fluffy robot on it and leans in to peer at the robot's face, while the robot looks down at a tablet

Tega Robot

We used the Tega robot. Designed and built in the Personal Robots Group, it's a squash-and-stretch robot specifically designed to be an expressive, friendly creature. An Android phone displays an animated face and runs control software. The phone's sensors can be used to capture audio and video, which we can stream to another computer so a teleoperator can figure out what the robot should do next, or, in other projects, as input for various behavior modules, such as speech entrainment or affect recognition. We can stream live human speech, with the pitch shifted up to sound more child-like, to play on the robot, or play back recorded audio files.

Here is a video showing one of the earlier versions of Tega. Here's research scientist Dr. Hae Won Park talking about Tega and some of our projects, with a newer version of the robot.

A fluffy red robot sits behind a tablet, which is lying on a table

Language learning game

We created an interactive game that kids could play with a fully autonomous robot and the robot’s virtual sidekick, a Toucan shown on a tablet screen. The game was designed to support second language acquisition. The robot and the virtual agent each took on the role of a peer or learning companion and accompanied the child on a make-believe trip to Spain, where they learned new words in Spanish together.

Two aspects of the interaction were personalized to each child: (1) the content of the game (i.e., which words were presented), and (2) the robot's affective responses to the child's emotional state and performance.
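
As a purely illustrative sketch of the second kind of personalization, a response policy can map the child's estimated affect and performance to an expressive action. The actual system's personalization was more sophisticated and tuned to each child over time; the labels and thresholds below are hypothetical:

```python
def affective_response(child_valence, answered_correctly):
    """Choose an expressive action for the robot.

    child_valence: estimated affect in [-1, 1], negative = upset.
    """
    if answered_correctly:
        # celebrate with a happy child; stay warm with a frustrated one
        return "excited_cheer" if child_valence >= 0 else "warm_praise"
    # wrong answer: comfort an upset child, otherwise invite another try
    return "gentle_support" if child_valence < 0 else "curious_retry"
```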

This video shows the robot, game, and interaction.

scene from a tablet app showing a toucan looking at things in a bedroom: a suitcase, a closet, shirts, balls, a hat


We conducted a 2-month study in three "special start" preschool classrooms at a public school in the Greater Boston Area. Thirty-four children ages 3-5, with 15 classified as special needs and 19 as typically developing, participated in the study.

The study took place over nine sessions: initial assessments, seven sessions playing the language learning game with the robot, and a final session with goodbyes to the robot and posttests.

We found that children learned new words presented during the interaction, that they mimicked the robot's behavior, and that the robot's affective personalization led to greater positive responses from the children. This study provided evidence that children will engage with a social robot as a peer over time, and that personalizing a robot's behavior to children can lead to positive outcomes, such as greater liking of the interaction.

a girl mimics the head tilt and expression shown by a fluffy robot



  • Kory-Westlund, J., Gordon, G., Spaulding, S., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2015). Learning a Second Language with a Socially Assistive Robot. In Proceedings of New Friends: The 1st International Conference on Social Robots in Therapy and Education. (*equal contribution). [PDF]

  • Kory-Westlund, J. M., Lee, J., Plummer, L., Faridia, F., Gray, J., Berlin, M., Quintus-Bosz, H., Harmann, R., Hess, M., Dyer, S., dos Santos, K., Adalgeirsson, S., Gordon, G., Spaulding, S., Martinez, M., Das, M., Archie, M., Jeong, S., & Breazeal, C. (2016). Tega: A Social Robot. In S. Sabanovic, A. Paiva, Y. Nagai, & C. Bartneck, Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction: Video Presentations (pp. 561). Best Video Nominee. [PDF] [Video]

  • Gordon, G., Spaulding, S., Kory-Westlund, J., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2016). Affective Personalization of a Social Robot Tutor for Children's Second Language Skills. Proceedings of the 30th AAAI Conference on Artificial Intelligence. AAAI: Palo Alto, CA. [PDF]

  • Kory-Westlund, J. M., Gordon, G., Spaulding, S., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2016). Lessons From Teachers on Performing HRI Studies with Young Children in Schools. In S. Sabanovic, A. Paiva, Y. Nagai, & C. Bartneck (Eds.), Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction: alt.HRI (pp. 383-390). [PDF]