Privacy is an emotional concept. We often value privacy most when we feel powerless or vulnerable in the face of intrusive data practices. Courts, however, don't always consider emotion to be a harm, or a reason to change the way privacy is legally codified. To catalyze privacy improvements, we may need to take a material view of the widening privacy gaps and their implications for wider social inequality.

In 2020, Apple announced App Tracking Transparency (ATT), a framework that gives iOS users the option to refuse any app's ability to track their activity across other apps and websites. Since ATT's rollout, a staggering 73% of iOS users, nearly three-quarters, have opted out of cross-app tracking. (I'll show below what that choice looks like from the developer's side.)

When a user base is this concerned about privacy protections, companies simply follow the least-expensive route. With less data available to target advertising at iOS users, targeted ads there become less effective and therefore less appealing to agencies. New findings show that advertisers now spend roughly one-third less on iOS advertising and are redirecting that capital to Android, which holds 42.06% of the mobile market against iOS's 57.62%.

Privacy disparities increasingly threaten material harm, not just a feeling of creepiness. If privacy belongs to everyone, as most tech companies claim, why is it so expensive? Companies simply shift their data practices onto the most vulnerable users, that is, those with fewer legal or technical resources.

Advertisements are not all there is

As more money flows into Android advertising, advertising techniques there could become more sophisticated or aggressive. Targeted advertising is legal, provided companies honor users' rights to opt out under applicable laws such as California's CCPA.

Two immediate problems arise from this. First, residents of every state other than California have no such opt-out rights. Second, granting certain users the right to opt out of targeted ads strongly implies that targeted advertising carries risks or harms. And it can.

Targeted advertising depends on third parties creating and maintaining behind-the-scenes profiles of users based on their behavior. By gathering data from apps, such as shopping habits and fitness habits, these parties can infer further, sensitive aspects of a user's life. At this point, a representation of the user exists in an under-regulated system, containing correctly or incorrectly inferred data that the user never consented to share. (Unless the user is located in California. But let's suppose they reside elsewhere in the U.S.)

Research also shows that the detailed profiles targeted advertising creates can be used to discriminate in employment and housing opportunities, sometimes in violation of federal law. Targeted advertising can also limit individual autonomy, preemptively narrowing a person's purchasing options whether they want that or not. On the other hand, it can help niche or grassroots organizations connect with interested audiences. Whatever your stance on targeted advertising, most users have no control over whether they are subject to it.

And although targeted advertising is a huge and growing practice, it is just one of many business activities that do not respect users' data. In many parts of the U.S., these practices aren't illegal. Your pocketbook, rather than the law, is what protects you from data abuse.
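That said, the ATT prompt described earlier is the rare meaningful control users do get, and it exists only because a platform vendor chose to ship it. Here is a minimal Swift sketch of roughly how an app surfaces that prompt; the function name and logging are illustrative, not Apple's prescribed pattern:

```swift
// Minimal sketch (not production code) of an iOS app requesting
// tracking permission via Apple's App Tracking Transparency framework.
// Requires iOS 14+ and an NSUserTrackingUsageDescription in Info.plist.
import AppTrackingTransparency
import AdSupport

func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user opted in: the advertising identifier (IDFA)
            // is available for cross-app tracking.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // The user opted out (or was never asked): the IDFA
            // comes back as all zeros, and cross-app tracking is off.
            print("Tracking unavailable")
        @unknown default:
            print("Tracking unavailable")
        }
    }
}
```

When the user declines, the device's advertising identifier is zeroed out for that app. That single switch is the mechanism behind the 73% opt-out figure above.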
Privacy is a luxury

Prominent tech companies like Apple have framed privacy as a human right, which is also a good business decision. With the U.S. federal government yet to codify privacy rights for all consumers, a bold privacy commitment from a private company is appealing: if the government won't set a privacy standard, at least my phone manufacturer will. Yet only 6% of Americans say they understand how companies use their data, and it is those same companies taking the broadest privacy steps.

What does this say about human rights? Apple products are more popular with educated, wealthy consumers than those of its competitors. This suggests a worrying future of increasing privacy disparities between the haves and the have-nots, one in which a feedback loop sets in: those with fewer resources to obtain privacy protections are also those without the resources to navigate the legal and technical challenges that targeted advertising brings.

This is not to say I am siding with Facebook in the Apple vs. Facebook battle over privacy and affordability. (See: the systemic access control issues recently revealed there.) I believe neither side is winning this battle.

We all deserve privacy protections that everyone can afford. To turn the phrase around, we deserve privacy protections so meaningful that no company can afford to leave them out of its products. We need privacy that is both meaningful and broadly available.

Next steps

Two areas are key to advancing privacy from here: privacy legislation and privacy tooling. Again, I take the both/and approach. Tech companies should not be the ones setting reliable privacy standards for consumers; legislators should. And we need developer tools that are widely available and leave developers no excuse, financial, logistical, or otherwise, not to implement privacy at the product level.

Privacy legislation: policy professionals have already raised great points here, so I will direct you to some of my favorite recent writing from them. Stacey Gray and her Future of Privacy Forum team have created an excellent blog series on how a federal privacy statute could interact with the emerging patchwork of state laws. Joe Jerome wrote a great summary of the state-level privacy landscape in 2021 and the paths to widespread privacy protections for all Americans. One key takeaway: privacy regulation's effectiveness depends on harmonizing the needs of individuals and businesses. That doesn't mean regulation can't be business-friendly, but businesses should be able to reference clear privacy standards so they can confidently and respectfully manage everyday people's data.

Privacy tooling: if we make privacy tools easily accessible and affordable to all developers, tech will have no excuse not to comply with privacy standards. Consider access control as an example. Today, engineers hand-roll manual controls governing how personnel and end-users may access data within a complex data ecosystem.

This is a double-edged challenge. The first edge: the horse has already left the barn. Privacy has long remained outside the realm of software development practice, and the resulting technical debt is rapidly increasing.
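To make "manual controls" concrete, here is a hypothetical Swift sketch of the kind of hand-rolled access check that accumulates inside application code. Every role, field, and rule in it is invented for illustration:

```swift
// Hypothetical hand-rolled access control, written ad hoc inside app code.
// Every role, field, and rule here is invented for illustration.
enum Role { case supportAgent, analyst, admin }

struct DataRequest {
    let role: Role
    let field: String    // e.g. "email", "purchaseHistory"
    let purpose: String  // e.g. "customer_support", "ad_targeting"
}

// Rules accumulate as one-off branches. Nothing ties them back to a
// written policy, and nothing re-audits them when the policy changes.
func canAccess(_ request: DataRequest) -> Bool {
    switch request.role {
    case .admin:
        return true  // blanket access, rarely revisited
    case .supportAgent:
        return request.purpose == "customer_support"
    case .analyst:
        // Granted "temporarily" for one project, never revoked.
        return request.field == "purchaseHistory"
    }
}

let request = DataRequest(role: .analyst, field: "email", purpose: "analytics")
print(canAccess(request) ? "granted" : "denied")  // prints "denied"
```

The brittleness is the point: checks like these live far from any stated policy, drift as exceptions pile up, and are precisely the technical debt just described.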
Engineers need tools that let them build privacy features, such as nuanced access control, into their systems before those systems reach production. Which brings us to the second edge: even if engineers cleared away all that technical debt, what standards and tools are widely available for them to adopt?

A June 2021 report by the Future of Privacy Forum finds that privacy technology needs consistent definitions, which are essential for widespread adoption of trusted privacy tools. Technical transformations of this kind yield material improvements in how tech at large, not just Brand XYZ, gives users control over their data.

Privacy rules must be set by institutions that are not playing the game. Regulation alone will not save us from our modern privacy problems, but it is an essential ingredient in any solution.

Alongside regulation, every software engineering team needs privacy tools ready at hand. Civil engineers cannot build a bridge that is safe only for certain people; it must be accessible to all. The same must be true of data infrastructure, to avoid exacerbating disparities in the digital world.