Threads privacy policies reveal need for better development best practices

Meta’s wannabe-Twitter-killer, Threads, is having quite the ride. After charging out of the gate with 100 million downloads in its first five days – the fastest in history, beating the previous record-holder, ChatGPT, which took two full months – engagement fell by half during its second week. Normal new-product ups and downs notwithstanding, the new app’s appetite for end-user data is raising privacy concerns. Given Meta’s track record on securing end-user data, it’s easy to understand why. Developers should take notice.


Now that the first few heady weeks of Threads’ existence are giving way to the cold, harsh reality of long-term use, we’re beginning to get a better look at what lies underneath. And what we’re seeing – at least from the perspective of end-user privacy – merits additional discussion, because Threads collects significantly more data than competing apps in the microblogging space. Twitter, Mastodon, Hive Social, and Bluesky each have their own privacy policies, and all of them are far less aggressive in their data collection.

It’s difficult to imagine anyone being surprised that Threads leads the way in aggressive data collection. After all, the new app comes to us from Meta, parent company of Facebook, architect of the Cambridge Analytica scandal that laid bare just how far a social media company might go in the pursuit of its customers’ data.

In a digital economy largely fueled by data culled from billions of devices, apps, browsers, and services, Meta is hardly the only player pushing hard to learn as much as possible about the people who subscribe to its services. But in a comparison with Twitter and its other direct competitors, Threads was found to be collecting data across a much broader range of categories – such as third-party advertising, developer marketing, analytics, product personalization, and app functionality – than any of them.

In that respect, the scope and breadth of the data collected by Threads echoes that of Facebook. And while Threads falls under the common privacy policy that covers Meta’s other social media platforms, including Facebook, Instagram, and WhatsApp, the new app also has its own supplemental privacy policy.

Among the clauses that apply uniquely to the new app is what happens if you eventually decide to delete it: the close architectural relationship between Threads and Instagram means users who delete their Threads account will be forced to delete their Instagram account as well.


If your business – like ours – involves software development, this issue is critically important. Consider the following best practices to limit your exposure to data-related concerns throughout the software development lifecycle:

  1. Implement strong data protection measures. Encrypt critical data at every stage – in processing, in transit, and at rest. Use secure transport protocols such as TLS (HTTPS) when developing for the web. Bolster access controls with two-factor (2FA) or multi-factor authentication (MFA).
  2. Follow Privacy by Design principles throughout the development process. Don’t just tack data privacy on at the end of the project. Consider it right from the outset, then assess and reassess at every major decision point throughout the development process. Instead of casting the widest possible data collection net, minimize data collection and retention by focusing only on what is essential to meet the documented business needs – nothing more – and ensure end-user consent is both explicit and transparent.
  3. Sanitize your data inputs. Never process user-supplied or third-party data that your software hasn’t validated and sanitized; unsanitized input is the root cause of SQL injection and Cross-Site Scripting (XSS) attacks.
  4. Regularly conduct penetration testing and security audits to better identify vulnerabilities and minimize the risks of potential security breaches. Incorporate third-party input into development and maintenance efforts to provide unbiased guidance and rapid resolution of identified weaknesses.
  5. Build transparent privacy policies clearly outlining how end-user data is collected, why it is collected, how it is used, who has access to it, and the context within which it is shared. Publish these policies in easily accessible locations, including the app itself, the app store, and your website. Maintain ongoing communication to update stakeholders on future policy updates.
  6. Use role-based access controls to limit access to sensitive data to only those individuals who need it for their jobs. Conduct regularly scheduled assessments to review the role matrix and revoke access as needs evolve.
  7. Ensure data is anonymized, wherever possible, to limit the risk of exposure. Where possible, remove identifiers or replace them with artificial identifiers like tokens to protect individual privacy while maintaining the utility of aggregated data.
  8. Implement robust patch management processes to ensure security and feature updates are efficiently deployed to stakeholder devices and systems. Review analytics to measure compliance and adjust processes as needed. Maintain thorough and frequent communications with the end-user community to ensure no one is running outdated – and possibly vulnerable – code.
  9. Audit third-party partners to ensure their products and platforms – such as APIs and SDKs – are built to the same security standards as your own homegrown solutions. Hold partners to the same privacy and data stewardship standards you’ve outlined in your own privacy policies.
  10. Train your people. Data privacy is only as effective as the people who are responsible for it. Ensure you and your entire development team have access to regular training to stay current with the latest best practices and privacy regulations. 
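The access-control piece of point 1 can be sketched concretely. Time-based one-time passwords (TOTP, RFC 6238) are what most 2FA authenticator apps generate, and the core algorithm fits in a few lines of standard-library Python. This is a minimal illustration, not production code; the secret used in the test vectors below is the RFC’s published example key, never one to reuse:

```python
import hashlib
import hmac
import struct
import time


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Counter-based one-time password (RFC 4226)."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation: low nibble picks the slice
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """Time-based OTP (RFC 6238): HOTP over a 30-second time counter."""
    now = time.time() if for_time is None else for_time
    return hotp(secret, int(now // step), digits)
```

A server verifies a submitted code by computing `totp()` for the current window (and usually the adjacent windows, to tolerate clock skew) and comparing with `hmac.compare_digest`.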
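The data-minimization principle in point 2 is easiest to enforce mechanically: filter incoming data through an allowlist at the point of collection, rather than trying to blocklist what you don’t want. A minimal sketch; the field names are hypothetical stand-ins for whatever your documented business need actually covers:

```python
# Fields with a documented business justification (hypothetical names).
ESSENTIAL_FIELDS = {"user_id", "email", "locale"}


def minimize(payload: dict) -> dict:
    """Keep only allowlisted fields; anything without a documented need is dropped."""
    return {k: v for k, v in payload.items() if k in ESSENTIAL_FIELDS}
```

The allowlist doubles as documentation: adding a field forces someone to justify the new collection, rather than data accumulating by default.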
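Point 3 in code: parameterized queries defeat SQL injection because the database driver binds values as data rather than splicing them into the statement text, and escaping output before it reaches the page defeats XSS. A sketch using Python’s built-in sqlite3 and html modules (the `users` table is illustrative):

```python
import html
import sqlite3


def find_user(conn: sqlite3.Connection, username: str):
    # The "?" placeholder makes the driver bind username as a value, so a
    # crafted input like "x' OR '1'='1" is matched literally, not executed.
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (username,))
    return cur.fetchone()


def render_comment(text: str) -> str:
    # Escape user content before embedding it in HTML to block stored XSS.
    return f"<p>{html.escape(text)}</p>"
```

The same two habits carry over to any stack: bind, don’t concatenate, on the way into the database; escape for the output context on the way out.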
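Role-based access control (point 6) reduces, at its core, to a role-to-permission mapping consulted before every sensitive operation. A deliberately tiny sketch with hypothetical roles and permissions; a real system would back this with your identity provider:

```python
# Hypothetical role matrix; in production this lives in your IAM system.
ROLE_PERMISSIONS = {
    "analyst":  {"read:reports"},
    "engineer": {"read:reports", "read:logs"},
    "admin":    {"read:reports", "read:logs", "read:pii", "manage:users"},
}


def can(role: str, permission: str) -> bool:
    """True if the role grants the permission; unknown roles get nothing."""
    return permission in ROLE_PERMISSIONS.get(role, set())


def require(role: str, permission: str) -> None:
    """Fail loudly rather than silently proceed when access is not granted."""
    if not can(role, permission):
        raise PermissionError(f"role {role!r} may not {permission!r}")
```

Keeping the matrix in one auditable place is what makes the scheduled access reviews mentioned above practical.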
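For point 7, one common pseudonymization technique is a keyed hash: the same identifier always maps to the same token, so aggregate analytics still work, but reversing the mapping requires the key, and rotating or destroying the key severs every linkage. A sketch using the standard library’s hmac module:

```python
import hashlib
import hmac


def tokenize(identifier: str, key: bytes) -> str:
    """Deterministic, non-reversible token for a personal identifier.

    HMAC-SHA256 rather than a bare hash: without the key, an attacker
    can't brute-force tokens from a dictionary of likely identifiers
    (emails, phone numbers). Truncated here for readability; keep the
    full digest if collision resistance matters at your scale.
    """
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()[:16]
```

Note that deterministic tokens are pseudonymous, not anonymous: records for the same person remain linkable by design, which is exactly the utility/privacy trade-off the technique makes.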


The arrival of Threads is prompting long-overdue discussions around digital privacy and the risks of oversharing on social media platforms. This is a healthy process that spotlights developers’ accountability when building software, as well as the best practices development professionals need to follow to minimize the potential for data leakage – and damage to the brand.

If you’re looking for answers on your own privacy journey, give us a call.