Image of a woman on the toilet, taken by a robot vacuum, shared on Facebook

Roomba robot vacuum maker iRobot says sharing images recorded by special test versions of its devices on social media violates its agreements (iRobot)

Intimate footage captured by robot vacuums during testing has been shared on social media, according to a report. The sensitive photos included several images of a woman sitting on the toilet with her shorts pulled down to her thighs. Another shows a child lying on the floor, his face clearly visible.

The images were part of 15 screenshots taken from recordings made by special development versions of the Roomba J7 series robot vacuums in 2020, which were obtained by MIT Technology Review. The images were reportedly shared in private groups on Facebook and the chat app Discord by gig workers in countries such as Venezuela, whose job is to label the audio, photo and video data used to train artificial intelligence (AI).

Roomba maker iRobot – which Amazon is acquiring – said the recordings came from robots with hardware and software modifications that are not sold to the public. These devices are provided to testers and employees who must explicitly agree to share their data, including video recordings, with iRobot.

The images are intended to be used to improve the artificial intelligence of robots (MIT Technology Review)

However, the incident exposes cracks in a system in which troves of data are exchanged between technology manufacturers and the companies that help improve their AI algorithms. This information, including photos and videos, can sometimes end up in the hands of low-paid contract workers in far-flung locations around the world. These workers handle everything from removing harmful posts on social networks to transcribing audio recordings used to improve voice assistants.

In iRobot’s case, the company works with Scale AI, a San Francisco-based startup that relies on remote contract workers to review and label the audio, photo and video data used to train artificial intelligence. iRobot has previously said it shared more than two million images with Scale AI, and an unknown quantity with other data annotation platforms.

iRobot and Scale AI said sharing screenshots on social media violates their agreements. The images in question included a mix of personal photos and everyday shots of home interiors, including furniture, decorations and objects on walls and ceilings. They also contained descriptive labels such as “tv”, “plant_or_flower” and “ceiling light”.

The labellers had discussed the images on Facebook, Discord and other online groups they had created to share tips on handling payments and labelling tricky content, according to MIT Technology Review. iRobot told the publication that the images came from its devices in countries including France, Germany, Spain, the United States and Japan.

This process of recording and labelling images is used to improve a robot vacuum’s computer vision, allowing the devices to accurately map their surroundings using high-definition cameras and an array of laser sensors. The technology helps them determine the size of a room, avoid obstacles such as furniture and wires, and adjust their cleaning regimen.

Computer vision is limited to the most premium robots on the market, including the Roomba J7, a device that currently costs £459.

iRobot said it collects the vast majority of its image datasets from real homes occupied by its employees or by volunteers recruited by third parties, with the latter offering incentives for participation. The rest of the training data comes from “staged data collection”, in which the company builds models that it then records. Its consumer devices capture mapping and navigation information and share some feature usage data with the cloud, though that does not include images, according to the company’s support page.

iRobot said it had ended its relationship with the “service provider who leaked the images”, is actively investigating the matter, and is “taking measures to help prevent a similar leak by any service provider in the future”.
