Apple’s new device fingerprinting rules send clear privacy message to developers

Apple’s crusade to tighten privacy in its App Store is shifting into another gear. The company is cracking down on developers who use a technique called device fingerprinting to track user activity – even when users have opted out. Specifically, the company wants devs to explain why they’re using certain application programming interfaces (APIs) and third-party software development kits (SDKs) in their apps. As part of its App Store review process, it is also requiring developers to update their privacy documentation. Those who fail to comply could find their apps denied approval in the App Store.

NOT A ONE-TIME CHANGE

Apple is no stranger to the privacy game, as this move is only the latest in a series of policy updates designed to strengthen data access and usage policies across its platforms and devices.

The most significant of these changes was the introduction of App Tracking Transparency in iOS 14.5 in 2021. That update required apps to request end-user permission to track activity across other apps and websites. Opting in allowed developers to use Apple’s IDFA identifier to share activity with third-party data brokers and marketing agencies. Opting out should have stopped tracking cold – but like all privacy policies designed to return control to end-users, there was a loophole.
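In practice, the opt-in prompt described above is triggered through Apple’s AppTrackingTransparency framework. A minimal sketch (iOS 14.5+; UI timing and error handling omitted):

```swift
import AppTrackingTransparency
import AdSupport

// Ask the user for permission to track across apps and websites.
// iOS shows the system prompt only once; later calls return the stored status.
func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Tracking allowed: the IDFA is available to share with partners.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // Tracking refused or unavailable: the IDFA reads as all zeros,
            // and falling back to fingerprinting is prohibited.
            print("Tracking not permitted")
        @unknown default:
            break
        }
    }
}
```

When the user declines, the returned identifier is zeroed out – which is precisely the gap that fingerprinting was being used to route around.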

By design, a number of Apple’s APIs that deliver core functionality – known as required reason APIs – also allow developers and marketing partners to track end-user activity even if those users have specifically opted out of data sharing.

By using a technique called fingerprinting, or data fingerprinting, developers can keep the data flowing – even after end-users have explicitly denied the apps permission to do so. Apple does not allow fingerprinting regardless of whether or not the user has granted permission to track their activities.

To close the gap, developers who use required reason APIs will have to provide additional information as part of the approval process.

The updated policy guidance – published on Apple’s developer website – confirms that starting this autumn, Apple will email developers if they “upload an app to App Store Connect that uses required reason API without describing the reason in its privacy manifest file.”

The rules get even stricter next year:

“From Spring 2024, apps that don’t describe their use of required reason API in their privacy manifest file won’t be accepted by App Store Connect,” the statement continues.
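The privacy manifest itself is a property list named PrivacyInfo.xcprivacy bundled with the app or SDK. A minimal sketch declaring a single required reason API is shown below; the category and reason code used here (UserDefaults access for the app’s own data, code CA92.1) follow Apple’s published format, but developers should check Apple’s current documentation for the categories and codes that apply to their own apps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Declare each required reason API the app uses, with an approved reason code. -->
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array>
                <!-- CA92.1: read/write user defaults accessible only to the app itself -->
                <string>CA92.1</string>
            </array>
        </dict>
    </array>
</dict>
</plist>
```

Omitting an entry for an API the app actually calls is exactly the condition that triggers the warning emails now, and rejection from Spring 2024.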

PREPARE NOW FOR THE INEVITABLE

Apple’s move is a clear signal to developers that the privacy landscape continues to tighten – as it should. It’s also a message to all developers, on any platform, that vendors are increasingly tying privacy compliance to initial approvals, and to ongoing support within their app stores and online marketplaces.

To remain compliant, developers must not only adopt a privacy-first approach to their work but also educate themselves on current data stewardship rules and best practices to ensure they set appropriate expectations with stakeholders. Keep the following in mind as you initiate and manage your own development projects:

  1. Follow a privacy by design approach. Don’t just tack privacy on at the end of a development project. Rather, build it directly into the architecture from the moment the project has been approved. Proactively identify anticipated risks and build out detailed response frameworks to minimize exposure and maximize business continuity.
  2. Conduct a data privacy assessment. Before the first line of code is written, developers and stakeholders must agree on what data will be collected, under what circumstances, how it will be processed, where it will be moved and stored, and who will have access to it. They must also agree on how data will be categorized – for example, by sensitivity and compliance requirements – and how the proposed solution will ensure compliance with applicable legislation. This assessment should form the critical framework for the entire development process.
  3. Minimize what you need. Just because you can collect certain types of data doesn’t mean you should. Place limits on the breadth and scope of personally identifiable information (PII) required to meet identified business requirements. Collect and retain only what is absolutely required to meet minimum core functionality – and no more. Ensure that users are able to provide explicit consent for their data to be collected, retained, and processed.
  4. Secure what you’ve captured. Use strong end-to-end encryption throughout the data lifecycle, and incorporate secure communication protocols, such as HTTPS, to minimize risks both at-rest and in-transit. Incorporate tighter, role-based access controls when designing database structures and procedures and include authentication details for end-user and administrator roles as part of the documentation process.
  5. Conduct regular security audits. Revisit protocols throughout the development process to ensure what was initially proposed is, in fact, supported by the evolving code base. As part of the quality assurance testing process, conduct penetration tests to identify – and resolve – vulnerabilities. Include regularly scheduled audits and reviews in documentation and maintenance procedures and incorporate these protocols into initial and ongoing end-user training.
  6. Build in robust authentication. Use two- or multi-factor authentication (2FA or MFA) wherever possible to minimize the potential for unauthorized system access and data leakage. Regularly review chosen protocols to ensure they are developed to the latest, most secure standards.
  7. Assess APIs and third-party integrations. Sure, you can build your own super-secure, privacy-aware code – but are you just as confident when incorporating another vendor’s code into your project? Assess all APIs, SDKs, and toolsets to ensure permissions and data accesses are all within project requirements. Only share data that is absolutely necessary, and review vendor performance and reputation to minimize the potential for data exposure.
  8. Document your privacy policy. Be transparent about how user data is collected, processed, and used. Get explicit consent from end-users for any and all data required by the proposed system and make privacy tools and documentation easily accessible throughout the final solution.
  9. Ensure data is regularly deleted and anonymized. Build data retention and periodic purging procedures into the project plan. Publish detailed rules for data management, and ensure data is aggregated and anonymized when used for reporting, dashboards, analytics, and research.
  10. Become a regulatory expert. You don’t have to live in Europe, California, or the U.S. to fall under the compliance requirements of GDPR, CCPA, or HIPAA, respectively. Data easily crosses borders, and solutions – and their developers – must be aware of the increasingly stringent legal requirements in the jurisdictions where their solutions will be used. 
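As a concrete illustration of points 3 and 9, one common pattern is to pseudonymize identifiers on-device before they are sent to analytics, so raw PII never leaves the user’s control. A hypothetical sketch using Apple’s CryptoKit (the salt handling is deliberately simplified; a production design would generate, store, and rotate the salt securely):

```swift
import CryptoKit
import Foundation

/// Replace a raw identifier (e.g. an email address) with a salted SHA-256
/// digest before it is logged or exported for analytics.
/// NOTE: illustrative only. Real deployments need secure salt storage and a
/// documented retention and deletion policy for any mapping that is kept.
func pseudonymize(_ identifier: String, salt: String) -> String {
    let digest = SHA256.hash(data: Data((salt + identifier).utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Usage: the analytics event carries the digest, never the raw address.
let token = pseudonymize("jane@example.com", salt: "per-install-random-salt")
```

The design choice here is that the transformation is one-way: analytics can still count distinct users, but no downstream system can recover the original identifier.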

THE BOTTOM LINE

Apple’s move to force developers to explain why they must use certain APIs and SDKs may elicit some grumbling, but it’s being implemented for good reason.

The age of data free-for-all is thankfully drawing to a close as vendors get serious about tightening the rules, enforcing compliance, and returning the power of data stewardship to end-users.

Devs who proactively comply will reap the rewards – including streamlined app store approvals, higher levels of trust from stakeholders, and reduced support costs over the product lifecycle. The rest of us will have more positive control over our data, and a more secure app experience.

Everybody wins.

Connect with us on LinkedIn if you’re wondering about your own data privacy best practices.