Apple ML researcher Vitaly Feldman presenting at the workshop – Image Credit: Apple
Apple has shared recordings of talks from its workshop about privacy and machine learning, demonstrating how it is considering how to protect user data while it is processed using AI.
Apple has repeatedly insisted that it is a privacy-forward company, including in its artificial intelligence and machine learning efforts. Following a workshop on privacy preservation in machine learning, Apple has shared details and published work that was presented at the event.
The workshop on Privacy-Preserving Machine Learning (PPML) was a two-day event earlier in 2025. It played host to researchers both within and outside of Apple, to discuss PPML in general.
The presentations list participants from various universities, as well as Google Research, Google DeepMind, and Microsoft Research.
The workshop focused on four areas: Private Learning and Statistics, Attacks and Security, Differential Privacy Foundations, and Foundation Models and Privacy.
Apple explains that the presentations and discussions were to explore the intersection of privacy, security, and the AI landscape as it evolves. There were discussions about the challenges of building AI systems with privacy protections.
The privacy discussions aim to “foster innovation while safeguarding user privacy,” writes Apple.
Deep Thought
Apple’s published presentations and work from the event cover quite a number of areas in the field of AI and privacy.
One topic that was discussed multiple times was ways to protect users more directly, such as creating privacy-conscious conversational agents. Due to the potential threat of malicious actors taking advantage of the contextual knowledge skills of chatbots, the AirGapAgent is proposed as a way to prevent leaks via limited data access.
The paper claims that it is powerful than currently available agents. A “single-query context hijacking attack” on a Gemini Ultra agent reduced its protectiveness of user data from 94% to 45%, while AirGapAgent maintained a 97% rate.
Similar in concept, “User Inference Attacks on Large Language Models” discusses how a malicious actor can determine if the responses from an LLM are fine-tuned using a user’s data. If so, the paper asks what can be discovered and how such an attack could be defended.
Another presented a scalable private search system called Wally, which supports efficient semantic and keyword queries. The paper discusses how the system can do better at scale than others, which can get bogged down due to the processing-intensive cryptographic operations used for each database entry.
Other talks include “A Generalized Binary Tree Mechanism for Differentially Private Approximation of All-Pair Distances,” “Nearly Tight Black-Box Auditing of Differentially Private Machine Learning,” and “Elephants Do Not Forget: Differential Privacy with State Continuity for Privacy Budget.”
Privacy vs Innovation
This is not the first workshop Apple has held dedicated to machine learning subjects. In 2024, it held workshops on “Human-Centred Machine Learning,” and released talks from it in July 2025.
The release of published papers from a privacy-focused workshop is also quite apt, considering the constant criticism the machine learning industry has to deal with.
In July, Apple had to insist that its AI training is ethical, in that it won’t scrape data from sources if the publisher doesn’t agree to the practice.
However, in August, AI startup Perplexity was revealed to be actively working around restrictions like robots.txt. A report determined that it used a second browser agent to crawl webpages, even if robots.txt said it could not.
Apple’s own efforts in machine learning are seemingly faltering in public, with extended delays affecting the long-awaited upgrade of Siri under Apple Intelligence.
By continuing to push its message that privacy is paramount and demonstrating that it is walking the walk, Apple at least shows its work in the field is at least as ethical as possible compared to rivals.
Even if it’s quite late in comparison.
This story originally appeared on Appleinsider