Dancey uses tried-and-true methods such as asking students to identify their “muddiest point” — a concept or idea she said students still struggle with — following a lecture or discussion. “I ask them to write it down, share it and we address it as a class for everyone’s benefit,” she said.

But Intel and Classroom Technologies, which sells virtual school software called Class, think there might be a better way. The companies have partnered to integrate an AI-based technology developed by Intel with Class, which runs on top of Zoom. Intel claims its system can detect whether students are bored, distracted or confused by assessing their facial expressions and how they’re interacting with educational content.

“We can give the teacher additional insights to allow them to better communicate,” said Michael Chasen, co-founder and CEO of Classroom Technologies, who said teachers have had trouble engaging with students in virtual classroom environments throughout the pandemic.

His company plans to test Intel’s student engagement analytics technology, which uses a computer’s camera and computer vision technology to capture images of students’ faces, then combines those images with contextual information about what a student is working on at that moment to assess the student’s state of understanding. Intel hopes to transform the technology into a product it can distribute more broadly, said Sinem Aslan, a research scientist at Intel who helped develop the technology.
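Neither company has published implementation details, but the description suggests a pipeline along these lines. The following minimal Python sketch is purely illustrative: every name, label and threshold in it is an assumption, not Intel’s published design.

    # Hypothetical sketch of the pipeline described above. Per-frame
    # facial-expression scores from a vision model are combined with
    # context about what the student is doing at that moment to produce
    # a coarse, teacher-facing engagement label. All names, labels and
    # thresholds here are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Context:
        activity: str        # e.g. "quiz", "video", "reading"
        seconds_idle: float  # time since the student last interacted

    def classify_engagement(face_scores: dict[str, float], ctx: Context) -> str:
        """Map emotion confidences plus context to a coarse label."""
        top = max(face_scores, key=face_scores.get)  # strongest emotion signal
        if top == "bored" and ctx.seconds_idle > 30:
            return "disengaged"
        if top == "confused" and ctx.activity == "quiz":
            return "may need help"
        return "engaged"

    # Example: a student rated mostly "bored" who has been idle 45 seconds.
    print(classify_engagement({"bored": 0.7, "confused": 0.2}, Context("video", 45.0)))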

“We are trying to enable one-on-one tutoring at scale,” said Aslan, adding that the system is intended to help teachers recognize when students need help and to inform how they might alter educational materials based on how students interact with the educational content. “High levels of boredom will lead [students to] completely zone out of educational content,” said Aslan.

But critics argue that it is not possible to accurately determine whether someone is feeling bored, confused, happy or sad based on their facial expressions or other external signals.

Some researchers have found that because people express themselves through tens or hundreds of subtle and complex facial expressions, bodily gestures or physiological signals, categorizing their state with a single label is an ill-suited approach. Other research indicates that people communicate emotions such as anger, fear and surprise in ways that vary across cultures and situations, and how they express emotion can fluctuate on an individual level.

“Students have different ways of presenting what’s going on inside of them,” said Todd Richmond, a longtime educator and the director of the Tech and Narrative Lab and a professor at the Pardee RAND Graduate School. “That student being distracted at that moment in time may be the appropriate and necessary state for them in that moment in their life,” he said, if they’re dealing with personal issues, for example.

Controversial emotion AI seeps into everyday tech

The classroom is just one arena where controversial “emotion AI” is finding its way into everyday tech products and generating investor interest. It’s also seeping into delivery and passenger vehicles and virtual sales and customer service software. After Protocol’s report last week on the use of this technology on sales calls, Fight for the Future launched a campaign urging Zoom not to adopt the technology in its near-ubiquitous video-conferencing software.

At this early stage, it’s not clear how Intel’s technology will be integrated with the Class software, Chasen said; he expects the company will partner with one of the colleges it already works with to evaluate the Intel system. Chasen told Protocol that Classroom Technologies is not paying Intel to test the technology. Class is backed by investors including NFL quarterback Tom Brady, AOL co-founder Steve Case and Salesforce Ventures.

Intel has established partnerships to help distribute other nascent forms of AI it has built. For example, the company has partnered with Purdue University and soccer scouting app AiScout in the hopes of productizing a system that turns data on joints and skeletal movements into analytics for monitoring and improving athletic performance.


Educators and advocacy groups have raised alarms regarding excessive student surveillance and privacy invasions associated with facial recognition deployed in schools for identification and security purposes. Those concerns have accelerated as AI-based software has been used more than ever during the pandemic, including technologies that monitor student behavior in the hopes of preventing cheating during virtual testing and systems that track content that students view on their laptops in an effort to detect whether they are at risk of self-harm.

Class already tracks how often students raise their hands during a session, and offers a “proctor view” feature that lets teachers monitor what students are viewing on their computers if the students agree to share their desktop screen with instructors.

“I think we have to be very sensitive about people’s personal rights and not being overly intrusive with these systems,” said Chasen.

Cameras as a social-justice issue

As virtual classes became the norm over the past couple of years, a debate emerged among educators over whether to require students to turn on their cameras during class. Today in Dancey’s English program, cameras are optional, in part because in virtual settings students can communicate with instructors through their microphones or via chat.

But in order to capture students’ facial expressions, Intel’s technology would need those cameras turned on.

“The thing about turning cameras on, it became almost like a social-justice issue,” Dancey said. Not only are some students concerned about others seeing where or how they live, but enabling the cameras drains power, which can be a problem for students using a mobile hotspot to connect for class, she said.

“It’s kind of an invasion of privacy, and there are accessibility issues, because having your camera on uses up a huge amount of bandwidth. That could literally be costing them money to do that,” Dancey said.

“Students shouldn’t have to police how they look in the classroom,” said Nandita Sampath, a policy analyst with Consumer Reports focused on algorithmic bias and accountability issues, who said she wondered whether students would have the ability to contest inaccurate results if Intel’s system leads to negative consequences. “What cognitive and emotional states do these companies claim they are able to assess or predict, and what is the accountability?” she said.

Aslan said the goal of Intel’s technology is not to surveil or penalize students, but rather to coach teachers and provide additional information so they can better understand when students need help. “We did not start this technology as a surveillance system. In fact, we don’t want this technology to be a surveillance system,” Aslan said.

Sampath said Intel’s technology could be used to judge or penalize students even if that is not the intent. “Maybe they might not intend for this to be the ultimate decision-maker, but this doesn’t mean the teacher or administrator can’t use it in that way,” she said.

Dancey said teachers worry about surveillance being used against them, too. “Often surveillance is used against instructors really unfairly,” she said. “I don’t think it would be paranoid to say, especially if it’s going to measure ‘student engagement’ — TM, in quotes — that if I go up for promotion or tenure, is that going to be part of my evaluation? Could they say, ‘So-and-so had a low comprehension quotient?’”

When Intel tested the system in a physical classroom setting, some teachers who participated in the study suggested it provided useful information. “I was able to witness how I could catch some emotional challenges of the students that I could not have anticipated [before],” said one teacher, according to a document provided by Intel.

But while some teachers may have found it helpful, Dancey said she would not want to use the Intel system. “I think most teachers, especially at the university level, would find this technology morally reprehensible, like the panopticon. Frankly, if my institution offered it to me, I would reject it, and if we were required to use it, I would think twice about continuing to work here,” she said.

AI data prep by psychologists

At this early stage, Intel aims to find the best ways to implement the technology so it is most useful for teachers, Aslan said: “How do we make it in a way that it is aligned with what the teacher does on a daily basis?”

Intel developed its adaptive learning analytics system by incorporating data gathered from students in real-life classroom sessions using laptops with 3D cameras. To label the ground truth data used to train its algorithmic models, the researchers hired psychologists who viewed videos of the students and categorized the emotions they detected in their expressions.

“We don’t want to start with any assumptions. That’s why we hired the subject matter experts to label the data,” said Nese Alyuz Civitci, a machine-learning researcher at Intel. The researchers only used data when at least two of the three labelers agreed on how a student’s expressions should be categorized.
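In rough terms, that two-of-three agreement rule amounts to a simple majority-vote filter over the annotators’ labels. The short Python sketch below illustrates the idea under those assumptions; the data and names are invented, not Intel’s actual tooling.

    # Minimal sketch of the two-of-three labeler agreement rule described
    # above: keep a training sample only if at least two of its three
    # annotators assigned the same emotion label. Data here is invented.
    from collections import Counter

    def majority_label(labels, min_agree=2):
        """Return the majority label if enough annotators agree, else None."""
        label, count = Counter(labels).most_common(1)[0]
        return label if count >= min_agree else None

    samples = [
        ["bored", "bored", "confused"],    # kept: two annotators agree
        ["bored", "engaged", "confused"],  # dropped: no two agree
    ]
    kept = [(s, majority_label(s)) for s in samples if majority_label(s)]
    print(kept)  # [(['bored', 'bored', 'confused'], 'bored')]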

“It was really interesting to see those emotions — the states are really subtle, they are really tiny differences,” Civitci said. “It was really hard for me to identify those differences.”

Rather than assessing Intel’s AI models on whether they accurately reflected the actual emotions of students, the researchers “positioned it as how instrumental or how much a teacher can trust the models,” Aslan said.

“I don’t think it’s tech that’s fully reached its maturity yet,” Chasen said regarding Intel’s system. “We need to see if the results are relevant to the performance of the students and see if we can’t get useful data for the instructors out of it. This is what we’re testing to find out.”

Ultimately, he said, the Intel system will provide one piece of data that Classroom Technologies and its customers will combine with other signals to form a holistic assessment of students.

“There’s never one piece of data,” he said. He also suggested that the information revealed by the Intel technology should not be used on its own without context to judge a student’s performance, such as, “if the AI says they’re not paying attention and they have all straight As.”