Facebook has started feeding its AI with private, unpublished photos


For years, Meta trained its AI programs on the billions of public images that users uploaded to Facebook and Instagram servers. But now, it seems, Meta has decided to try training its AI on the billions of images that users never uploaded to those servers at all.

On Friday, TechCrunch reported that Facebook users trying to post to the Stories feature have encountered pop-up messages asking whether they want to opt into "cloud processing," which would allow Facebook to "select media from your camera roll and upload it to our cloud on a regular basis" in order to generate "ideas like collages, recaps, AI restyling, or themes like birthdays or graduations."

By opting into the feature, the message continues, users agree to Meta's AI terms, which allow its AI to analyze "media and facial features" in those unpublished photos, along with the date each photo was taken and the presence of people or objects in it. Users also grant Meta the right to "retain and use" that personal information.

Meta recently admitted that it scraped the data from all content published on Facebook and Instagram since 2007 to train its AI models. Although the company has stated that it only used public posts uploaded by adult users over the age of 18, it has long been vague about what exactly "public" meant, as well as what counted as an "adult user" back in 2007.

Unlike Google, which expressly states that it does not train its generative AI models on personal data gathered from Google Photos, Meta's current AI usage terms, in effect since June 23, 2024, offer no clarity on whether unpublished photos accessed through "cloud processing" are exempt from use as training data. Meta did not respond to TechCrunch's request for comment; Futurism has also reached out for comment.

Fortunately, Facebook users can turn off cloud processing of their camera roll in their settings, which, once disabled, will also start removing unpublished photos from the cloud after 30 days. But the workaround, dressed up as a feature, signals a new incursion into our private data, one that bypasses the point of friction known as consciously deciding to post a photo for public consumption. According to Reddit posts found by TechCrunch, Meta is already offering AI restyling on previously uploaded photos, even when users were unaware of the feature: one user reported that Facebook had Studio Ghibli-fied her wedding photos without her knowledge.
