What do Russian protesters have in common with people worried about the criminalization of abortion? Both would be better served by a more robust set of design practices at the companies developing their technologies.

Let's back up. Russian police have forced protesters to unlock their phones so officers can search for evidence of dissent, leading to arrests and fines. Telegram, one of the main chat apps used in Russia, is vulnerable to these searches: the mere presence of the app on a personal device can be read as implying that its owner does not support the war. Telegram's builders failed to design the app with personal safety in high-risk environments in mind, and not just in the Russian context. As a result, Telegram can be weaponized against its users.

Meanwhile, many people who use Twitter have expressed concerns over Elon Musk's bid to reshape algorithmic content moderation and make other design changes on the whim of his $44 billion purchase. Among them is an apparent push to remove online anonymity, something I have written about: it is harmful to those most at risk and backed by no evidence. Musk's previous actions, combined with the harms already caused by the platform's current structures, make it clear that we are heading toward further impacts on marginalized groups. At the same time, the leak of the draft Supreme Court opinion in Dobbs v. Jackson shows that legal protections for abortion are in danger. With the projected criminalization of those seeking or providing abortion services, it has become ever more apparent that the tools and technologies most commonly used to access vital health care data are unsafe.

The same steps could have been used to protect these users, if the builders of these tools had focused on safety in high-risk environments.

Making better, safer, less harmful tech requires design grounded in the lived realities of those who are most marginalized. These edge cases are often dismissed as falling outside the scope of a typical user's likely experience, yet they are powerful indicators of the flaws in our technologies. By understanding who is most affected by different social, political, and legal frameworks, we can understand who is most likely to have technology weaponized against them. Technology designed around these extreme cases will always generalize to the broader user base.

I led a research project at a human rights organization, in conjunction with local organizations in Iran, Lebanon, and Egypt and with support from international experts, examining the lived experiences of queer people who faced police persecution because of their personal technologies. A queer Syrian refugee in Lebanon was stopped at a police or army checkpoint and asked for papers. Their phone was searched, and the icon of a queer app identified them as queer. They were taken in for further interrogation and subjected to verbal and physical abuse. If sentenced under the Penal Code, they would face potential imprisonment, fines, and the revocation of their immigration status in Lebanon. This is just one case.

What would happen if the app's logo were hidden, so that the individual's sexuality was not readily apparent to the police, while still allowing the individual to keep the app and their connection to other queer people? That question, together with our research and a collaboration with the Guardian Project, resulted in a stealth mode for the product.

The company implemented our other recommendations as well. The Discreet App Icon let users make the app appear as a common utility, such as a calendar or calculator, so that an initial police search would not expose them through the apps visible on their phone. The feature was created based on the outcomes of extreme cases, such as that of the queer Syrian refugee, but it proved popular with users around the world. It went from being available only in high-risk countries to being free internationally in 2020, along with the popular PIN feature that was also introduced under this project. This was the first time a dating app had taken such drastic security measures for its users.
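For readers curious about the mechanics: on Android, a discreet-icon feature like this can be built with the platform's standard activity-alias mechanism, which lets an app ship several launcher entries with different icons and switch between them at runtime. The sketch below is illustrative only; the alias names and icon resources are hypothetical, and the source does not describe how the company actually implemented its feature.

```kotlin
// AndroidManifest.xml would declare two <activity-alias> entries pointing at
// the same launcher activity: one with the real icon, one disguised as a
// calculator. Only one alias is enabled at a time. (Names are hypothetical.)
//
//   <activity-alias android:name=".DefaultIcon"
//       android:targetActivity=".MainActivity"
//       android:icon="@mipmap/ic_launcher" ... />
//   <activity-alias android:name=".CalculatorIcon"
//       android:targetActivity=".MainActivity"
//       android:icon="@mipmap/ic_calculator"
//       android:label="Calculator"
//       android:enabled="false" ... />

import android.content.ComponentName
import android.content.Context
import android.content.pm.PackageManager

// Swap the launcher icon by enabling one alias and disabling the other.
// DONT_KILL_APP keeps the app running while the launcher refreshes its icon.
fun setDiscreetIcon(context: Context, discreet: Boolean) {
    val pm = context.packageManager
    val pkg = context.packageName
    val defaultAlias = ComponentName(pkg, "$pkg.DefaultIcon")
    val calculatorAlias = ComponentName(pkg, "$pkg.CalculatorIcon")

    fun setEnabled(component: ComponentName, enabled: Boolean) {
        pm.setComponentEnabledSetting(
            component,
            if (enabled) PackageManager.COMPONENT_ENABLED_STATE_ENABLED
            else PackageManager.COMPONENT_ENABLED_STATE_DISABLED,
            PackageManager.DONT_KILL_APP
        )
    }

    // Enable exactly one of the two aliases, hiding or restoring the real icon.
    setEnabled(calculatorAlias, discreet)
    setEnabled(defaultAlias, !discreet)
}
```

The design choice that matters here is that the disguise lives at the launcher level, so a cursory glance at the home screen during a phone search reveals only an innocuous utility, while the app and its connections remain intact behind it.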