Twitter user @capohai shared screenshots of a Google search whose top result was Palestinian keffiyeh scarves. The French Senate had just voted to ban women under 18 from wearing hijabs in public, and President Macron's LREM party had pulled its support from one of its own candidates, Sarah Zemmahi, after she wore a hijab in a campaign ad. How many people asked Google the same question and took its answer as an objective statement, or as confirmation of their prejudices? And how many people were affected by the results?
Although the outrage at Google's tacit equation of terrorists, Palestinians, and headscarves spread from social media to the news, the keffiyeh remains the top result for a similar search today.
It is understandable that @capohai turned to Twitter to call out a tech company's unethical behavior; that is how privacy breaches, the spread of disinformation and hate speech, and other harms routinely come to light. But this example also shows that retribution after the fact doesn't work for ethical violations.
Zoom out and the bigger picture shows growing calls for more regulation of the tech sector. But legislation takes time to pass and implement, and it may not be enough to prevent the many ethical failures that are endemic in technology. Because algorithms can express our (bad!) values in unexpected ways, many of those failures cannot be predicted, or stopped, by regulation.
There is another option, though, and it depends on neither social media outrage nor new regulation. Technology companies already have infrastructure in place to handle ethics issues at scale; they just need to modify their existing bounty systems.
Hundreds of organizations and companies offer bounties to anyone who finds vulnerabilities in their code that bad actors could exploit. Google's bounty program even covers apps sold through its Play Store. Apple, which started its program only recently, offers compensation of up to $1 million for the most serious exploits. In its program notes, the company says it rewards researchers who share critical issues and the techniques used to exploit them, and that it offers public recognition and will match bounty payments donated to charity.
Imagine how much better Silicon Valley's products and services could be if ethical violations counted among the critical issues, and the techniques used to exploit them, that earn corresponding bounties. Ethical violations can be just as harmful to a company as leaked code, so the language would not even need to be modified. An ethics bounty program could also borrow Apple's other rules: 1) you must be the first to report the issue; 2) you must explain it clearly and provide evidence; 3) you can't make it public before the company has a chance to fix it; and 4) you can receive a bonus for catching a problem the company accidentally introduces in a new patch.
A bounty system would encourage users to look for ethical violations and report them quickly, and it would help companies find and fix problems before they harm customers or generate negative press. And while some companies might not be moved by negative press, lost customers, or the propagation of prejudice, they should be motivated by the long-term stability, goodwill, and security such a program can bring. A track record of responding thoughtfully and professionally to ethical issues is a huge asset for a company looking to hire talented workers or expand into new markets.