- Apple has added more child-protection features to FaceTime in iOS 26
- The latest one pauses video when it detects nudity
- It currently affects adult accounts too, but that could be a bug
Apple has been adding parental control features designed to protect minors for years, and a new one appears to have just been discovered in the iOS 26 beta.
Specifically, the new feature has been added to the FaceTime video-calling app. When FaceTime detects that someone is undressing on a call, it pauses the call and instead shows a warning message that reads: “Audio and video are paused because you may be showing something sensitive. If you feel uncomfortable, you should end the call.” Below it are buttons labeled “Resume Audio and Video” and “End Call”.
At WWDC 2025 in June, Apple published a press release covering new ways its systems will protect children and teens online. The announcement included a feature that aligns with the new FaceTime behavior: “Communication Safety expands to intervene when nudity is detected in FaceTime video calls, and to blur out nudity in shared albums in Photos.”
The actual implementation was spotted by iDeviceHelp on X. Under the post, @user_101524 added that the feature can be found in the app settings in iOS 26 by going to Apps > FaceTime > Sensitive Content Warning.
By default, the feature is disabled, so users need to turn it on themselves, but that hasn’t stopped a heated debate from breaking out online…
Generating controversy
While this new feature may seem sensible, it has actually generated a degree of controversy. That’s because, at the moment, it appears to affect all iOS 26 users, not just those using a child account. This has ruffled some feathers among people who feel Apple is potentially censoring the behavior of consenting adults.
Beyond that, some users have questioned how Apple knows what is shown on screen, and whether the company has access to customers’ video calls. On this point, Apple has said the following:
“Communication Safety uses on-device machine learning to analyze photo and video attachments and determine if a photo or video appears to contain nudity. Because the photos and videos are analyzed on your child’s device, Apple doesn’t receive an indication that nudity was detected and doesn’t get access to the photos or videos as a result.”
As with many of Apple’s features, on-device processing means the content is never sent to Apple’s servers, and the company cannot access it. Instead, the device uses machine learning to flag video content that likely contains nudity and then censors it locally.
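For context, Apple exposes a similar on-device classifier to third-party developers through its public SensitiveContentAnalysis framework (iOS 17 and later). FaceTime’s internal implementation is not public, so the sketch below is only a hypothetical illustration of what on-device flagging looks like in code; the file URL and function name are assumptions for the example:

```swift
import Foundation
import SensitiveContentAnalysis

// Hypothetical sketch: check a local image file for nudity entirely
// on-device using Apple's public SensitiveContentAnalysis framework.
// This is NOT FaceTime's actual pipeline; it only illustrates the
// on-device approach Apple describes.
func checkForSensitiveContent(at fileURL: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer does nothing unless the user has enabled
    // Sensitive Content Warning (or parental controls) in Settings.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive Content Warning is turned off in Settings")
        return
    }

    do {
        // Analysis runs locally; nothing is uploaded to Apple.
        let analysis = try await analyzer.analyzeImage(at: fileURL)
        if analysis.isSensitive {
            print("Flagged: blur the content and show a warning")
        } else {
            print("No sensitive content detected")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```

Note that the app only ever learns a yes/no verdict on the device itself, which is consistent with Apple’s statement that it “doesn’t receive an indication that nudity was detected.” This code requires an Apple device with the framework enabled and cannot run elsewhere.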
The fact that Apple’s Communication Safety features are aimed at protecting minors suggests this latest FaceTime feature may not be intended to apply to adults as well as children. Its inclusion on all accounts, therefore, may be an oversight or a bug. While we can’t know for certain, we should find out in September, when iOS 26 leaves beta and is released to the public.