Google tries to fix crappy webcam quality with artificial intelligence

Chances are you’ve spent most of the past two-plus years sitting at home, looping through endless virtual meetings, staring at your laptop’s webcam and speaking into your built-in microphone.

That means for the past two-plus years, you’ve appeared to other people as a bunch of poorly lit pixels, sounding like you were shouting into a tin can. Of course, it’s not your fault: your laptop’s webcam just sucks, and so does its microphone. But Google thinks it can solve both problems with artificial intelligence.


Google announced Wednesday at its annual I/O developer conference that its Workspace team has been working on some artificial intelligence-driven ways to improve your virtual meetings. The most impressive is “Portrait Fix,” which Google says can automatically improve and sharpen your image, even over a poor connection or through a bad camera.

Likewise, Portrait Lighting gives you a suite of AI-based controls over how you’re lit. You may not be able to move the window to your left, Google seems to be saying, but you can make Google Meet look as if you have one on your right, too. On the audio side, Google has introduced a de-reverberation tool designed to minimize the echoes that come from talking into your laptop in a cramped home office.

A lot of the underlying technology here comes from the artificial intelligence and machine learning work Google does on its Pixel phones. The hardware in those phones is much better than your average laptop webcam, but the principle is the same, says Prasad Setty, the company’s vice president of digital work experience. “We want to make sure the underlying software does the same thing, and we can use it on a wide range of hardware devices,” he said.

With the growth of hybrid and remote work, the Google Workspace team has spent the past few years thinking about how to make work a little easier, Setty said. “We want technology to be an enabler, we want it to be helpful, we want it to be intuitive, we want it to solve real problems.” That pushed the Workspace team to think more about collaboration — hence the Meet tools — but also about how to make asynchronous work more workable.

Google plans to roll out a new tool that generates automated summaries of Spaces activity, so you can log in in the morning and catch up without having to read hundreds of messages. It has also launched an automated transcription service for Meet meetings and plans to eventually summarize those meetings as well.

“We want to be able to help people deal with this information overload and use artificial intelligence to do that,” Setty said. Google has thought a lot about “collaborative equity” and “representation equity” — trying to keep the playing field level for everyone, no matter where they are, what technology they use, or how they work.

One of Google’s trickier challenges is helping people without being overly intrusive, or making employees feel like they’re being watched by Google or their employer. “The way we think about the problem is, we start by empowering users,” Setty said, “then let them choose how to expose that information to their teams, and so on.”

As people return to the office, Google faces an even bigger meeting challenge: solving the problem of hybrid meetings, with some people in a room and others on a screen. That will take more than good lighting and noise cancellation.
