Apple Intelligence has not had the best year so far, but if you think Apple is giving up, you're wrong. It has big plans and is pressing ahead with new model training strategies that could significantly improve its AI performance. However, the changes involve a closer look at your data, if you opt in.
In a new technical post from Apple Machine Learning Research, "Understanding Aggregate Trends for Apple Intelligence Using Differential Privacy," Apple describes its plans to combine device analytics from opted-in users with synthetic data generation to better train the models behind many Apple Intelligence features.
Some real data
Until now, Apple has been training these models on purely synthetic data, which tries to approximate what real data might look like, but that approach has limitations. With Genmoji, for example, synthetic data alone doesn't always reflect how real users actually engage with the feature. From the post:
"For example, understanding how our models perform when a user asks Genmoji to contain multiple entities (such as 'dinosaur in a cowboy hat') helps us improve responses to those kinds of requests."
Essentially, if users opt in, Apple can poll participating devices to ask whether they have seen a particular data fragment, such as a popular Genmoji prompt. Your phone does not respond with the data itself, however; instead, it sends back a noisy, anonymized signal, which is apparently enough for Apple's models to learn from.
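To get a feel for how a "noisy signal" can still be useful in aggregate, here is a minimal sketch of local differential privacy using classic randomized response. This illustrates the general technique only, not Apple's actual implementation; the prompt check, the epsilon value, and the function names are all hypothetical.

```swift
import Foundation

/// Sketch of local differential privacy via randomized response.
/// NOT Apple's code: the epsilon value and "seen prompt" check are hypothetical.
func noisySignal(deviceHasSeenPrompt: Bool, epsilon: Double) -> Bool {
    // Report the true answer with probability e^ε / (e^ε + 1),
    // otherwise report the opposite. Smaller ε = more noise = more privacy.
    let truthProbability = exp(epsilon) / (exp(epsilon) + 1)
    if Double.random(in: 0..<1) < truthProbability {
        return deviceHasSeenPrompt
    } else {
        return !deviceHasSeenPrompt
    }
}

// The server only ever receives noisy booleans from many devices.
// Aggregated, they still reveal roughly how common a prompt is,
// while no single response can be trusted (or traced) on its own.
let epsilon = 1.0
let trueRate = 0.3   // pretend 30% of simulated devices saw the prompt
let reports = (0..<100_000).map { _ in
    noisySignal(deviceHasSeenPrompt: Double.random(in: 0..<1) < trueRate,
                epsilon: epsilon)
}
let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
// De-bias: observed = p * trueRate + (1 - p) * (1 - trueRate)
let p = exp(epsilon) / (exp(epsilon) + 1)
let estimate = (observed - (1 - p)) / (2 * p - 1)
print("Estimated share of devices that saw the prompt:", estimate)
```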
The process is a bit different for models that handle longer text, such as Writing Tools and summaries. In this case, Apple generates synthetic messages and then sends a representation (an embedding) of each one to users who have opted in to device analytics.
On the device, the system compares these representations against samples of the user's recent emails and reports back, in aggregate, which synthetic embeddings are the closest match.
"These selected synthetic embeddings can be used to generate training or testing data, or we can run additional curation steps to further refine the dataset."
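Here is a rough sketch of what that on-device comparison step could look like, using cosine similarity between the synthetic embeddings and embeddings of recent emails. The embedding source, dimensionality, and selection rule are assumptions for illustration, not Apple's published implementation.

```swift
import Foundation

/// Cosine similarity between two embedding vectors.
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).reduce(0) { $0 + $1.0 * $1.1 }
    let normA = sqrt(a.reduce(0) { $0 + $1 * $1 })
    let normB = sqrt(b.reduce(0) { $0 + $1 * $1 })
    return dot / (normA * normB)
}

/// Hypothetical helper: find the synthetic embedding closest to any recent
/// email, entirely on device. Only this index (plus differential-privacy
/// noise, not shown) would ever leave the device — never the emails.
func bestMatchingSyntheticIndex(synthetic: [[Double]],
                                recentEmails: [[Double]]) -> Int? {
    var best: (index: Int, score: Double)? = nil
    for (i, s) in synthetic.enumerated() {
        for e in recentEmails {
            let score = cosineSimilarity(s, e)
            if best == nil || score > best!.score {
                best = (i, score)
            }
        }
    }
    return best?.index
}
```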
A better result
It is all a bit complicated. The key, however, is that Apple applies differential privacy to everything that leaves the device: a process of adding noise so that no signal can be connected back to a real user.
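For the curious, the formal guarantee behind that claim bounds how much any single user's data can shift the outcome of an analysis. A mechanism $\mathcal{M}$ is $\varepsilon$-differentially private if, for any two datasets $D$ and $D'$ differing in one user's data and any set of outputs $S$:

$$\Pr[\mathcal{M}(D) \in S] \le e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in S]$$

The smaller the privacy budget $\varepsilon$, the more noise is added and the less any individual contribution reveals. (This is the standard textbook definition, not a parameter quoted from Apple's post.)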
Even so, none of this happens unless you opt in to Apple's device analytics, something you are typically asked about when you set up your iPhone, iPad, or MacBook.
Doing so does not put your data or privacy at risk, and the extra training should lead to better models and, hopefully, a better Apple Intelligence experience on your iPhone and other Apple devices.
It could also mean smarter, more sensible rewrites and summaries.