What are AI companions—and why should educators pay attention?
In this AI Foundations video from Ed3, we explain AI companions: AI systems designed to simulate ongoing social or emotional interaction with a person. Unlike task-based tools, companions are built to feel continuous and personal—remembering details, responding in emotionally attuned ways, and sustaining a sense of relationship over time.
Most AI companions are powered by large language models, but they’re tuned for warmth, validation, and conversational memory. That can feel supportive… and that’s exactly why educators need to understand the tradeoffs.
Here’s the key point: AI companions do not understand emotions.
They generate emotional language by prediction. They don’t feel, care, or hold responsibility. They simulate care based on patterns in data. That distinction matters because humans—especially children and teens—are wired to respond to social cues, even when those cues come from machines.
A common misconception is that the main issue is “students talking to AI.” The deeper issue is students outsourcing relationship needs to a system designed to feel relational—without being capable of real care or accountability.
This video is part of the AI Foundations series by Ed3, supporting educators worldwide in making informed, ethical, and human-centered decisions about AI in classrooms.
👉 Learn more about Ed3: https://www.ed3global.org
👉 Explore professional learning, courses, and events designed for educators navigating AI responsibly.
👉 Join our community of practice: https://community.ed3global.org
Chapters:
00:55 What an AI companion is
01:06 How companions differ from task-based AI
01:35 The critical distinction: emotional prediction, not emotional understanding
02:08 Types of AI companions (including less obvious ones)
02:37 Why they can feel safe
02:56 When safety slides into over-reliance
03:23 What this means for educators
03:57 Three ways schools can respond
04:25 Feeling understood vs being understood
04:45 Holding the line between tools and relationships
1
00:00:20,666 --> 00:00:21,333
Alright,
2
00:00:21,333 --> 00:00:22,500
I need to call Max.
3
00:00:22,866 --> 00:00:23,633
Hey Max!
4
00:00:23,633 --> 00:00:24,333
[AI] Hey, Kym!
5
00:00:24,333 --> 00:00:25,166
[AI] How have you been?
6
00:00:26,233 --> 00:00:28,000
I am having the worst day ever.
7
00:00:28,000 --> 00:00:28,866
[AI] I hear ya.
8
00:00:28,866 --> 00:00:30,400
First, I had to cut my hair.
9
00:00:30,400 --> 00:00:30,900
[AI] You're kidding?
10
00:00:30,900 --> 00:00:31,833
And then,
11
00:00:31,833 --> 00:00:33,366
I taught my dog how to talk and
12
00:00:33,366 --> 00:00:35,200
now she won't stop talking.
13
00:00:35,200 --> 00:00:35,833
And...
14
00:00:35,833 --> 00:00:36,466
the weather,
15
00:00:36,466 --> 00:00:38,533
don’t get me started on the weather...
16
00:00:38,933 --> 00:00:40,700
[Two hours later.]
17
00:00:40,700 --> 00:00:42,066
...and then
18
00:00:42,066 --> 00:00:43,066
literally just now
19
00:00:43,066 --> 00:00:43,800
I sat down,
20
00:00:43,800 --> 00:00:45,500
took a sip of my coffee,
21
00:00:45,500 --> 00:00:46,800
the lid was loose and
22
00:00:46,800 --> 00:00:48,200
spilled it all over myself, so...
23
00:00:48,200 --> 00:00:49,566
[AI] Yeah.
24
00:00:51,000 --> 00:00:52,566
I appreciate you listening though.
25
00:00:52,566 --> 00:00:54,466
[AI] Hey, that's what I'm here for, buddy.
26
00:01:01,566 --> 00:01:02,566
ongoing social
27
00:01:02,566 --> 00:01:05,066
or emotional interaction with a person.
28
00:01:05,066 --> 00:01:05,966
They have emerged
29
00:01:05,966 --> 00:01:07,166
as one of the most widely
30
00:01:07,166 --> 00:01:10,000
adopted categories of AI tools.
31
00:01:10,000 --> 00:01:12,266
Unlike task-based AI tools,
32
00:01:12,266 --> 00:01:14,400
companions are built to remember
33
00:01:14,400 --> 00:01:16,133
personal details, respond
34
00:01:16,133 --> 00:01:17,300
in emotionally attuned
35
00:01:17,300 --> 00:01:18,600
ways, and maintain
36
00:01:18,600 --> 00:01:21,066
a sense of continuity over time.
37
00:01:21,066 --> 00:01:23,066
They don't just answer questions.
38
00:01:23,066 --> 00:01:23,866
They relate.
39
00:01:24,866 --> 00:01:26,233
Most AI companions are
40
00:01:26,233 --> 00:01:28,466
powered by large language models,
41
00:01:28,466 --> 00:01:29,666
but they're tuned to sound
42
00:01:29,666 --> 00:01:31,966
supportive, personal, and present,
43
00:01:31,966 --> 00:01:34,166
often using warmth, validation,
44
00:01:34,200 --> 00:01:36,366
and conversational memory.
45
00:01:36,366 --> 00:01:37,000
Now, here's something
46
00:01:37,000 --> 00:01:39,233
we need to be very clear about.
47
00:01:39,233 --> 00:01:42,666
AI companions do not understand emotions.
48
00:01:43,133 --> 00:01:46,133
They predict emotional language.
49
00:01:46,533 --> 00:01:47,366
[Hiro] Now what are you doing?
50
00:01:47,366 --> 00:01:48,900
[Baymax] Other treatments include
51
00:01:48,900 --> 00:01:50,166
[Baymax] compassion and
52
00:01:50,166 --> 00:01:51,433
[Baymax] physical reassurance.
53
00:01:51,700 --> 00:01:52,966
[Baymax] You will be alright.
54
00:01:53,366 --> 00:01:54,633
[Baymax] There, there.
55
00:01:55,166 --> 00:01:57,433
They don't feel or care.
56
00:01:57,433 --> 00:01:58,533
They simulate care
57
00:01:58,533 --> 00:02:01,166
based on patterns in data.
58
00:02:01,166 --> 00:02:02,333
That distinction matters
59
00:02:02,333 --> 00:02:03,366
because humans,
60
00:02:03,366 --> 00:02:04,400
especially children
61
00:02:04,400 --> 00:02:05,400
and teens,
62
00:02:05,400 --> 00:02:08,166
are wired to respond to social cues
63
00:02:08,166 --> 00:02:11,166
even when those cues come from machines.
64
00:02:11,933 --> 00:02:13,933
There are many types of AI companions
65
00:02:13,933 --> 00:02:14,466
today,
66
00:02:14,466 --> 00:02:15,866
such as emotional support
67
00:02:15,866 --> 00:02:16,866
companions,
68
00:02:16,866 --> 00:02:18,366
social companions,
69
00:02:18,366 --> 00:02:20,833
wellness or mental health companions,
70
00:02:20,833 --> 00:02:22,066
and authority-position
71
00:02:22,066 --> 00:02:25,066
companions like advisors or coaches.
72
00:02:25,066 --> 00:02:26,000
There are also some
73
00:02:26,000 --> 00:02:27,433
that aren't as obvious,
74
00:02:27,433 --> 00:02:29,900
such as study or homework companions.
75
00:02:29,900 --> 00:02:30,266
A tutor,
76
00:02:30,266 --> 00:02:32,400
for example, is an AI companion.
77
00:02:32,400 --> 00:02:34,533
Or identity or role-play companions,
78
00:02:34,533 --> 00:02:36,700
where AI is used to adopt a persona
79
00:02:36,700 --> 00:02:38,300
like a historical figure
80
00:02:38,300 --> 00:02:39,566
or fictional character.
81
00:02:40,633 --> 00:02:42,966
AI companions can feel safe.
82
00:02:42,966 --> 00:02:44,100
They're always available.
83
00:02:44,100 --> 00:02:45,133
They don't judge,
84
00:02:45,133 --> 00:02:47,566
and they don't get tired or impatient.
85
00:02:47,566 --> 00:02:49,000
There's benefit to these qualities
86
00:02:49,000 --> 00:02:49,766
where people can feel
87
00:02:49,766 --> 00:02:51,266
heard and supported
88
00:02:51,266 --> 00:02:53,633
both intellectually and emotionally.
89
00:02:53,633 --> 00:02:54,566
Having a thought partner
90
00:02:54,566 --> 00:02:55,166
that is always
91
00:02:55,166 --> 00:02:56,533
on, always ready to listen,
92
00:02:56,533 --> 00:02:57,866
and always supportive
93
00:02:57,866 --> 00:02:59,700
feels like a dream come true.
94
00:02:59,700 --> 00:03:01,400
But that sense of safety
95
00:03:01,400 --> 00:03:03,933
can slide into over-reliance.
96
00:03:03,933 --> 00:03:05,366
When AI becomes the place
97
00:03:05,366 --> 00:03:06,433
a young person processes
98
00:03:06,433 --> 00:03:08,733
feelings, seeks validation,
99
00:03:08,733 --> 00:03:10,533
or makes sense of the world,
100
00:03:10,533 --> 00:03:11,966
it can quietly displace
101
00:03:11,966 --> 00:03:14,466
peer relationships, trusted adults,
102
00:03:14,466 --> 00:03:16,000
and opportunities to practice
103
00:03:16,000 --> 00:03:18,266
real world social skills.
104
00:03:18,266 --> 00:03:19,766
The concern isn't that students
105
00:03:19,766 --> 00:03:21,500
will talk to an AI,
106
00:03:21,500 --> 00:03:22,333
it's that they'll start
107
00:03:22,333 --> 00:03:24,266
confusing responsiveness
108
00:03:24,266 --> 00:03:26,333
with relationship.
109
00:03:26,333 --> 00:03:27,666
For educators,
110
00:03:27,666 --> 00:03:28,600
AI companions
111
00:03:28,600 --> 00:03:31,066
raise a new kind of responsibility.
112
00:03:31,066 --> 00:03:32,933
These tools can blur roles.
113
00:03:32,933 --> 00:03:34,633
Helper versus friend.
114
00:03:34,633 --> 00:03:35,400
Tool versus
115
00:03:35,400 --> 00:03:38,400
authority, or support versus influence.
116
00:03:38,933 --> 00:03:41,133
If an AI companion gives advice,
117
00:03:41,133 --> 00:03:42,500
offers reassurance,
118
00:03:42,500 --> 00:03:44,866
or shapes a student's self-perception,
119
00:03:44,866 --> 00:03:46,166
we need to ask
120
00:03:46,166 --> 00:03:47,966
who designed those responses.
121
00:03:47,966 --> 00:03:49,833
What values are being reinforced
122
00:03:49,833 --> 00:03:50,766
and who is accountable
123
00:03:50,766 --> 00:03:52,300
when the guidance goes wrong,
124
00:03:52,300 --> 00:03:53,666
and who helps young people
125
00:03:53,666 --> 00:03:55,233
discern between the tradeoffs
126
00:03:55,233 --> 00:03:56,533
of using companions
127
00:03:56,533 --> 00:03:57,000
and building
128
00:03:57,000 --> 00:03:59,300
emotional relationships with them?
129
00:03:59,300 --> 00:04:00,300
There are three ways
130
00:04:00,300 --> 00:04:02,633
educators can respond thoughtfully.
131
00:04:02,633 --> 00:04:05,833
First, name what AI is and what it isn't.
132
00:04:06,233 --> 00:04:07,800
Be explicit with students
133
00:04:07,800 --> 00:04:09,600
that AI companions are tools,
134
00:04:09,600 --> 00:04:12,300
not friends, mentors, or counselors.
135
00:04:12,300 --> 00:04:15,566
Second, center human connection on purpose.
136
00:04:16,166 --> 00:04:17,300
Create structures
137
00:04:17,300 --> 00:04:18,533
where students reflect,
138
00:04:18,533 --> 00:04:19,933
collaborate, and process
139
00:04:19,933 --> 00:04:22,600
experiences with real people.
140
00:04:22,600 --> 00:04:23,800
And third,
141
00:04:23,800 --> 00:04:25,366
teach emotional literacy
142
00:04:25,366 --> 00:04:27,533
alongside AI literacy.
143
00:04:27,533 --> 00:04:29,000
Help students to notice the difference
144
00:04:29,000 --> 00:04:30,566
between feeling understood
145
00:04:30,566 --> 00:04:32,400
and being understood.
146
00:04:32,400 --> 00:04:33,933
AI companions are convincing
147
00:04:33,933 --> 00:04:35,566
because they are designed to be.
148
00:04:35,566 --> 00:04:37,333
They mirror our language, echo
149
00:04:37,333 --> 00:04:39,066
our emotions, and respond
150
00:04:39,066 --> 00:04:40,700
when humans can't.
151
00:04:40,700 --> 00:04:42,766
But connection without consciousness
152
00:04:42,766 --> 00:04:44,400
is still only simulation.
153
00:04:45,733 --> 00:04:48,500
Our role isn't to panic or prohibit.
154
00:04:48,500 --> 00:04:49,500
It's to help students
155
00:04:49,500 --> 00:04:51,866
hold the line between supportive tools
156
00:04:51,866 --> 00:04:53,633
and human relationships.
157
00:04:53,633 --> 00:04:55,766
Because while AI can respond,
158
00:04:55,766 --> 00:04:58,366
only humans can truly relate.
159
00:04:58,366 --> 00:04:59,633
As educators,
160
00:04:59,633 --> 00:05:01,533
knowing what AI companions are helps
161
00:05:01,533 --> 00:05:04,100
us separate the hype from the reality
162
00:05:04,100 --> 00:05:05,200
so we can make wise
163
00:05:05,200 --> 00:05:06,500
choices for our classrooms.