May 21, 2024 MDG

On-device

When Google announces it can monitor phone conversations for scams, and Microsoft claims it can search everything you’ve ever touched on your computer, the importance of on-device AI becomes clear. Apple is likely to unveil its own on-device AI capabilities at its upcoming developer event, solidifying this trend.

The question is, how much of this AI processing will truly stay on-device? Will learnings and user preferences always remain local? It’s crucial to understand the implications of on-device AI for personal OPSEC.

On-device AI can be more nuanced than the label suggests. Some examples:

  1. Model Updates: While the AI model itself runs on your device, updates to that model might be downloaded from external servers. These updates could include new features, improved accuracy, or bug fixes.

  2. Anonymized Data: Some on-device AI systems may collect anonymized data about usage patterns or performance to help developers improve the model. This data is stripped of personally identifiable information before being sent.

  3. User Consent: Certain applications might request your permission to share specific data for specific purposes, even with an on-device AI model. For example, a voice assistant might ask to send your voice commands to a server for better speech recognition.

  4. Hybrid Models: Some AI systems use a combination of on-device and cloud-based processing. In these cases, certain parts of the AI model might run on your device, while others rely on external servers for more computationally intensive tasks (see the sketch after this list).
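
To make points 2 through 4 concrete, here is a minimal Python sketch of the routing logic a hybrid assistant might use. Every name in it (the endpoint, the threshold, the functions) is hypothetical and invented for illustration; the point is simply that “on-device” can still involve consent-gated cloud calls and anonymized telemetry.

    import hashlib
    import json
    from urllib import request

    # Hypothetical values, purely for illustration; not a real service.
    CLOUD_ENDPOINT = "https://example.com/transcribe"
    LOCAL_MAX_AUDIO_SECONDS = 30  # assumed limit of the on-device model

    def anonymize_telemetry(event: dict) -> dict:
        """Strip direct identifiers before any usage metrics leave the device."""
        cleaned = {k: v for k, v in event.items()
                   if k not in ("user_id", "email", "raw_audio")}
        if "user_id" in event:
            # One-way hash: events can be grouped, but not traced back to a person.
            cleaned["user_hash"] = hashlib.sha256(event["user_id"].encode()).hexdigest()[:16]
        return cleaned

    def run_local_model(audio: bytes) -> str:
        """Stand-in for an on-device model call (e.g. a quantized ASR model)."""
        return f"(local transcript of {len(audio)} bytes)"

    def transcribe(audio: bytes, seconds: float, cloud_consent: bool) -> str:
        """Prefer the local model; fall back to the cloud only with explicit consent."""
        if seconds <= LOCAL_MAX_AUDIO_SECONDS:
            return run_local_model(audio)  # never leaves the device
        if cloud_consent:
            req = request.Request(CLOUD_ENDPOINT,
                                  data=json.dumps({"seconds": seconds}).encode(),
                                  headers={"Content-Type": "application/json"})
            with request.urlopen(req) as resp:  # this is the part that leaves the device
                return resp.read().decode()
        return "[too long for the local model; cloud processing declined]"

The interesting question for any real product is which branch runs by default, and whether the consent flag is a genuine choice or a pre-checked box.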

What’s possible now? (Or: what should always stay local, and if it isn’t, why not?)

The optimist in me says: this is the most important trend if done correctly. It solves for latency, works offline, and keeps data on-device.

The pessimist in me says: privacy is always the first victim of convenience. Creep happens, and this feature creep will lead to more and more requests for user consent in the guise of utility and convenience, likely delivering neither in the short term.

I came across this post from Clem at Hugging Face that summed it up nicely while pointing to four models to use: “No cloud, no cost, no data sent to anyone, no problem. Welcome to local AI on Hugging Face!” Open source FTW.
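
If you want to try that yourself, here is a minimal sketch using the transformers library. The model name below is only a tiny stand-in, not necessarily one of the four Clem points to; after the initial weight download, generation runs entirely on your machine.

    # pip install transformers torch
    from transformers import pipeline

    # "gpt2" is only a small placeholder model; swap in any open model you prefer.
    generator = pipeline("text-generation", model="gpt2", device=-1)  # -1 = CPU

    result = generator("On-device AI matters because", max_new_tokens=40)
    print(result[0]["generated_text"])

    # To guarantee no further calls to the Hub after the first download,
    # set the environment variable HF_HUB_OFFLINE=1 before running.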
