Cloud Storage Privacy

Why Lock-In Is a Bigger Problem Than You Think

written by Peter Spiess-Knafl on November 29, 2025

Not all cloud storage is created equal. Google Photos actively analyzes your images to train AI models, while Apple’s iCloud offers better privacy but traps you in their ecosystem. The real problem? Once you’re locked in with years of photos, you lose control over your data’s future—including how it might be used by AI systems that don’t exist yet. End-to-end encryption combined with true data portability is the only way to future-proof your privacy.




When we say “cloud storage,” most people think of Google Photos or Apple iCloud. Both offer convenience—your photos accessible everywhere, automatic backups, smart organization. But here’s what often gets overlooked: these services work very differently under the hood, and that difference matters more than ever.

The Google Problem: Your Photos Are Training Tomorrow’s AI

Google Photos is convenient, but it’s also reading everything you upload. When you store photos with Google, the company performs server-side analysis on your images to power features like search, face recognition, and automatic categorization. This isn’t just about organizing your library—your photos are actively being used to improve Google’s AI models.


Google has been transparent about this to some degree. In July 2023, Google updated its privacy policy to explicitly state they use publicly available information and user content to “train Google’s AI models and build products and features like Google Translate, Bard, and Cloud AI capabilities.” A Google spokesperson confirmed that uploads to Google Photos are used to train AI models designed to help users manage their image libraries.

What makes this concerning isn’t just what AI can do today—it’s what it will be capable of tomorrow. AI capabilities are advancing exponentially. Features that seemed impossible two years ago are now routine: understanding complex scenes and emotions, generating realistic images, inferring relationships from photo collections. Your photos from 2020 are still sitting on Google’s servers, ready to be analyzed by whatever AI capabilities emerge next year, or in five years.

The key point: whatever AI can do in 2027, it will have access to your entire photo history if it’s stored unencrypted on Google’s servers. You can’t retroactively un-share data you uploaded in 2019. The business model is clear—Google offers storage at low or no cost because your data has value for training their systems.

The Apple Paradox: Good Privacy Today, but You’re Trapped

Apple deserves credit where it’s due. Their privacy practices are substantially better than Google’s. iCloud now offers Advanced Data Protection with end-to-end encryption for photos and other data types. When enabled, even Apple can’t access your encrypted photos.

But here’s the problem: try leaving.


Once you’re deep in the Apple ecosystem with years of photos, family sharing, and integrated workflows, migration becomes prohibitively expensive and complex. This is classic vendor lock-in—when the cost of switching to an alternative is so high that customers are essentially stuck with the original vendor, regardless of future changes to pricing, features, or policies.

The concern isn’t just technical difficulty. It’s about control over your data’s future. Apple’s CSAM (Child Sexual Abuse Material) scanning controversy in 2021 illustrates this perfectly. Apple announced plans to scan photos on users’ devices before uploading to iCloud, using on-device matching technology. The intention was noble—combating child exploitation—but security researchers and privacy advocates raised serious concerns about how the technology could be misused by authoritarian governments or expanded to scan for other types of content.

After widespread backlash from privacy researchers, digital rights groups, and even Apple employees, Apple abandoned the CSAM scanning feature in December 2022. Apple’s director of user privacy stated that “scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit” and could “inject the potential for a slippery slope of unintended consequences.”

The episode demonstrates an important point: even companies with good privacy practices today can face pressure to change direction. When you’re locked into a platform with all your photos, you’re betting that their priorities will stay aligned with yours indefinitely. Leadership changes, business pressures, or government requirements can shift policies faster than you can migrate years of data.

Why Lock-In Makes the Privacy Problem Worse

When you can’t easily leave a service, several risks compound:

Policy changes become non-negotiable. Terms of service updates happen regularly. When you’re locked in with years of data, you have limited ability to vote with your feet when policies change in ways you don’t like. The 2023 Google privacy policy update is a perfect example—the company explicitly added AI training language, and users with extensive photo libraries had little choice but to accept it or undertake a massive migration.

Price increases can’t be avoided. Once migration is too painful, price hikes become something you have to accept rather than an opportunity to shop around.

Future capabilities are unknown. Both Google and Apple are racing to build AI capabilities. We don’t know what will be possible in 18 months, but we do know that training data is valuable. Terms of service you agreed to in 2022 may cover AI uses that don’t exist yet. One former Google engineer, who left the company to found the privacy-focused alternative Ente, explained that he did so after realizing Google could train its AI on data stored in its cloud; he didn’t want his personal photos contributing to that practice.

Data gravity makes it worse. The longer you stay, the more trapped you become. Moving tens of thousands of photos is far harder than moving a few dozen.

What True Privacy Plus Portability Looks Like

End-to-end encryption without ecosystem imprisonment is technically possible. The key is combining three elements:


Client-side encryption. Photos are encrypted on your device before they’re ever uploaded. This means that whatever AI exists in 2027 or 2030, it can’t access your encrypted photos. Future-proof privacy requires that providers never have access to unencrypted data in the first place.
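
To make the flow concrete, here is a toy sketch of encrypt-before-upload in Python, using only the standard library. The keyed-BLAKE2b keystream and the single shared key for encryption and integrity are simplifications for illustration only; a real client would use a vetted AEAD cipher such as AES-GCM or XChaCha20-Poly1305 with properly separated keys. All function names here are hypothetical.

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Pseudorandom keystream: keyed BLAKE2b run in counter mode.
    Toy construction for illustration -- use a vetted AEAD in practice."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.blake2b(nonce + counter.to_bytes(8, "big"), key=key).digest()
        counter += 1
    return out[:length]

def encrypt_photo(key: bytes, plaintext: bytes) -> bytes:
    """Runs on the device; only nonce + ciphertext + tag are uploaded."""
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    # Integrity tag so tampering is detected. (A real design would derive
    # separate encryption and MAC keys from a master key.)
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def decrypt_photo(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("ciphertext was tampered with")
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

key = os.urandom(32)                      # in practice: derived from the user's passphrase
photo = b"\x89PNG... raw image bytes ..."
blob = encrypt_photo(key, photo)          # this is all the server ever sees
assert decrypt_photo(key, blob) == photo  # only the key holder can read it back
```

The essential property is where the key lives: it is generated and kept on the device, so the blob the provider stores is opaque to them today and to any analysis system they build later.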

Local processing for smart features. Modern devices are powerful enough to run face recognition, object detection, and other AI features locally. You get the benefits of smart organization without uploading readable data to external servers.
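
As a toy illustration of the on-device approach, here is a tiny perceptual "average hash" in pure Python, the kind of primitive a local photo indexer might use to group near-duplicate shots. Real apps run on-device neural networks for faces and objects; this sketch (names and sample values are hypothetical) only shows that such analysis needs no server.

```python
def average_hash(pixels):
    """Perceptual 'average hash' of a small grayscale thumbnail: one bit per
    pixel, set when that pixel is brighter than the mean. Runs entirely
    on-device -- no pixel data ever leaves the machine."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance suggests near-duplicates."""
    return bin(a ^ b).count("1")

# Two 4x4 thumbnails differing in one pixel get nearby hashes, so a local
# indexer can group them as near-duplicates without any server round-trip.
img1 = [[10, 200, 30, 220], [15, 190, 25, 210],
        [12, 205, 35, 215], [8, 195, 28, 225]]
img2 = [row[:] for row in img1]
img2[0][0] = 14
assert hamming(average_hash(img1), average_hash(img2)) <= 1
```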

Standard formats and real export tools. Under the EU’s GDPR Article 20, individuals have the right to receive their personal data in a structured, commonly used, machine-readable format and to have it transmitted directly to another provider. This right to data portability is designed to prevent lock-in, but its effectiveness depends on providers actually implementing robust export tools rather than minimal compliance.
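
A minimal sketch of what a lock-in-free export can look like, assuming a hypothetical helper: the original image bytes are written out unchanged, next to a plain JSON sidecar carrying the metadata, so any other service or a simple script can ingest the result.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def export_photo(photo_bytes: bytes, metadata: dict, out_dir: Path, name: str) -> None:
    """Write the original image bytes unchanged, plus a plain JSON sidecar
    describing them -- both readable by any other service or script."""
    out_dir.mkdir(parents=True, exist_ok=True)
    (out_dir / f"{name}.jpg").write_bytes(photo_bytes)
    sidecar = {
        "file": f"{name}.jpg",
        "exported_at": datetime.now(timezone.utc).isoformat(),
        **metadata,  # e.g. captured_at, location, album, tags
    }
    (out_dir / f"{name}.json").write_text(json.dumps(sidecar, indent=2))

export_photo(b"... jpeg bytes ...", {"album": "Holidays", "tags": ["beach"]},
             Path("export"), "IMG_0001")
```

The point of the sidecar is that it contains no proprietary container or schema: losing access to the original service loses you nothing.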

zeitkapsl takes this approach seriously. Photos are encrypted on your device before upload—the server only sees encrypted data. Face recognition and other smart features run entirely locally. And because data portability matters, export uses standard formats without proprietary lock-in. You control your data, not just today, but in whatever future emerges.

The European Advantage: GDPR and the CLOUD Act Problem

European alternatives to U.S. tech giants offer additional benefits beyond just business model differences. The regulatory landscape itself creates fundamentally different privacy guarantees.

The GDPR provides stronger protections. The EU’s General Data Protection Regulation requires explicit consent for data processing and grants individuals meaningful rights over their data. But there’s another critical factor: jurisdiction.

The U.S. CLOUD Act reaches across borders. Enacted in March 2018, the Clarifying Lawful Overseas Use of Data (CLOUD) Act allows U.S. law enforcement to compel U.S.-based technology companies to provide data stored on their servers regardless of where that data is physically located—including on servers in Europe. This means that even if your photos are stored in a European data center, if the company operating that service is subject to U.S. jurisdiction (like Google, Apple, Microsoft, or Amazon), U.S. authorities can access that data through legal process.

The CLOUD Act applies to all electronic communication service providers that operate or have a legal presence in the U.S., even if they’re headquartered elsewhere. A European company with U.S. operations can be compelled to turn over data under the CLOUD Act. As AWS notes in their compliance documentation, encrypted content without decryption keys remains protected—but only if the provider genuinely cannot access those keys.

Why this matters for privacy. The European Data Protection Supervisor (EDPS) has viewed the CLOUD Act as potentially conflicting with GDPR, and Germany’s Commissioner for Data Protection has warned against using U.S.-based services for sensitive federal police data. The concern isn’t just hypothetical—it’s about legal jurisdiction and who can ultimately compel access to your data.

The European solution. Companies headquartered and operating entirely within the EU are subject to European law and GDPR protections, not the CLOUD Act. This isn’t just about different privacy standards—it’s about legal independence from U.S. government reach. When storage isn’t subsidized by data mining and ad targeting, and when the legal framework prioritizes individual privacy rights over law enforcement convenience, the alignment of interests between provider and user becomes fundamentally different.

For zeitkapsl, operating under European jurisdiction means your encrypted photos remain protected by GDPR standards, without exposure to CLOUD Act demands. Since the encryption happens client-side and we never have access to your unencrypted data, even a theoretical legal demand couldn’t produce readable content.

The Choice You’re Actually Making

When you choose a cloud photo service, you’re not just deciding on features and price today. You’re making a bet about:

  • Who will control your photos in five years
  • What AI capabilities will exist and whether they’ll have access to your data
  • Whether you’ll be able to leave if policies or leadership change
  • What privacy will mean as technology evolves


Google has your entire history ready for whatever AI capabilities come next. Apple might keep their privacy promises, but can you afford to assume they will? The only safe bet is data that providers never have access to in the first place.

That’s not about paranoia—it’s about understanding that technology changes faster than migration, and privacy decisions made today have implications for years to come. In an era where AI capabilities are advancing exponentially, the only truly future-proof approach is encryption that can’t be undone retroactively.

Your photos are yours. The question is whether they’ll stay that way.


Learn more about how zeitkapsl approaches privacy-first photo storage at zeitkapsl.eu


References

  1. Google Photos AI Training - PetaPixel
  2. Google Privacy Policy Update 2023 - 9to5Google
  3. Google Privacy Policy AI Training - Engadget
  4. Apple CSAM Scanning Controversy - Wired
  5. GDPR Article 20 - Data Portability
  6. US CLOUD Act - Department of Justice Resources
  7. CLOUD Act Overview - Wikipedia
  8. AWS CLOUD Act Compliance Guide

Keywords

cloud storage, privacy, end-to-end encryption, vendor lock-in, Google Photos, iCloud, data portability, GDPR, AI training, photo backup, zero-knowledge encryption, client-side encryption, photo privacy, European cloud storage, Apple Advanced Data Protection, photo storage alternatives