Big tech companies have been moving into and disrupting traditional industries on the back of their competitive advantage in emerging technologies like artificial intelligence and machine learning. The world of finance is about to get a lot more interesting with the launch of the Apple Card, Apple Pay, and Amazon's loans for small and medium-sized businesses.
Artificial intelligence could help big tech gain an advantage in countering the ever-evolving threat from cybercriminals, detecting fraud, and automating processes like loan applications and credit checks, but it could also become one of their biggest challenges.
Biased artificial intelligence tools can skew results in much the same way as deeply flawed police profiling tools. The Apple Card, for instance, came under scrutiny over allegations that it was biased against women when determining the creditworthiness of applicants.
The data collection versus user privacy debate is already ongoing within the EU, and highly sensitive financial data will add a new dimension. In June of this year, the EU Court of Justice backed a ruling that gives national privacy watchdogs more room to scrutinize big tech.
What can traditional banks teach the new players about building ethical and regulated artificial intelligence capabilities?
Tackling bias takes more than lip service
Cybercriminals are continually improving their techniques, and challenges remain in the use of artificial intelligence against them, such as detecting transactions that differ from a customer's normal behavior.
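As a rough sketch of that kind of detection (a simplified illustration with made-up figures, not ABN AMRO's actual approach), a system can flag a transaction whose amount deviates sharply from the customer's own history:

```python
import numpy as np

def flag_unusual(history: np.ndarray, new_amount: float, threshold: float = 3.0) -> bool:
    """Flag a transaction whose amount deviates strongly from this customer's history.

    A simple z-score check: how many standard deviations does the new amount
    sit from the customer's historical mean?
    """
    mean, std = history.mean(), history.std()
    if std == 0:                       # no variation in the history yet
        return new_amount != mean
    return bool(abs(new_amount - mean) / std > threshold)

# Hypothetical history of one customer's transaction amounts (in euros)
history = np.array([24.50, 31.00, 18.75, 27.20, 22.10, 29.95])
print(flag_unusual(history, 26.00))    # False: in line with normal behavior
print(flag_unusual(history, 950.00))   # True: flagged for closer inspection
```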
Mark Wiggerman is a data scientist in the corporate information security office at ABN AMRO.
Bias is often hidden, he says, and it is not always clear how models learn from data and arrive at their predictions. It is difficult to explain the decisions of your own model.
Wiggerman says that a decision system's behavior is determined by its training data, and that a model will often have to be retrained, which can lead to new behavior and new biases.
You have to monitor it continuously to see whether it stays within the bounds of the applicable privacy framework. It's not just about making a different decision for different groups; it's also about what impact that has on people and how that can be balanced against the potential benefits.
An example of a negative impact could be a fraud detection system that puts transactions from young people in a queue for manual inspection. Their transactions are then processed a few hours later, and that disadvantage is structural for the group.
Responses to biased systems have largely been reactive, Wiggerman notes. It is important to monitor your systems regularly and to have clear guidelines on what constitutes ethical AI.
Big tech and non-commercial organizations are taking some positive steps to tackle bias, and bias can now be measured with the help of software packages.
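A minimal sketch of what such a measurement looks like (hypothetical data; open-source packages such as Fairlearn and IBM's AIF360 ship these metrics ready-made): compare how often a model flags each demographic group.

```python
import numpy as np

# Hypothetical output of a fraud model: 1 = routed to manual review, 0 = processed automatically
flagged   = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1, 0])
age_group = np.array(["<25", "25+", "<25", "<25", "25+", "25+",
                      "<25", "25+", "25+", "<25", "<25", "25+"])

# Selection rate per group: how often each group ends up in the manual queue
rates = {g: flagged[age_group == g].mean() for g in np.unique(age_group)}
print(rates)                                  # {'25+': 0.0, '<25': 0.83...}

# Demographic parity difference: the gap between the most and least flagged group
gap = max(rates.values()) - min(rates.values())
print(f"selection-rate gap: {gap:.2f}")       # a large gap signals possible bias
```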
Like other companies, ABN AMRO is assessing the potential of artificial intelligence to deliver value to its customers and staff, and is developing policies to ensure fairness in its applications.
Gathering personal data
Big tech companies such as Amazon, Facebook, and Google are prolific gatherers of personal data. According to recent findings, Apple performs best on privacy, while Google stores the most personal data; Facebook, meanwhile, holds more data than it needs, largely because users enter it themselves.
Big tech companies have a favored position and are able to use the data they already have on consumers. This collection of data raises questions about consumer privacy when it comes to financial services.
Wiggerman questions whether it is acceptable for big tech to process the type of data they are gathering.
Do they need the data to make their product work? Could they do it with less data? These questions are relevant to machine learning. I can pinpoint your exact location and predict your movements, but do I really need this information for my product? Maybe I don't even need to know whether you're in the Netherlands or Belgium.
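One way to apply that data-minimization principle in code (a simplified sketch with hypothetical field names, not a description of any specific company's pipeline) is to strip or coarsen location detail before an event is stored or fed to a model:

```python
def minimize_event(event: dict) -> dict:
    """Keep only the fields the product actually needs; drop or coarsen the rest.

    Exact coordinates and device identifiers are discarded; a coarse country
    code is kept here only as an example of a less invasive substitute.
    """
    return {
        "user_id": event["user_id"],
        "amount": event["amount"],
        "country": event["country"],   # coarse location instead of lat/lon
    }

raw_event = {
    "user_id": "u-1042",
    "amount": 42.50,
    "country": "NL",
    "lat": 52.3702,        # exact location: more than the product needs
    "lon": 4.8952,
    "device_id": "d-889",  # also dropped under data minimization
}
print(minimize_event(raw_event))   # {'user_id': 'u-1042', 'amount': 42.5, 'country': 'NL'}
```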
Big tech needs to take into account that data minimization, collecting only the data that is necessary, is part of the GDPR. Bias shouldn't be the only focus either; it should be one element of a larger, responsible artificial intelligence proposition. The US is introducing rulings similar to the EU's, but the regulations differ.
Privacy, ethics, and fairness should be built into the product development life cycle. Privacy and fairness have to be there by design; they aren't something to be added afterwards. Consider security by design: security measures are taken into account from the beginning. In the same way, ethics needs to trickle down into all of a company's risk policies.
Working together
Big tech is starting to offer more financial services, but these companies lack the historical financial data that the traditional banking sector enjoys, and they have little experience with financial processing. Wiggerman suggests that big tech organizations work with banks to understand how banks comply with audit requirements and to learn from their financial history.
If they want to become a bank, they need to comply with all of the rules. If you want to develop a good product, you need a lot of financial history such as transaction data, information about how customers are onboarded, and how they use financial services.
Can security and customer privacy be combined? Wiggerman believes they can.
I think they can be combined with a new class of technologies emerging from academia called privacy-enhancing technologies. Multi-party computation (MPC) is a relatively new technique that is used to do joint calculations on data that stays locked away.
If you have two companies that cannot share data with each other, MPC lets them achieve the same thing. You can run a joint computation over the data of the three or four banks that want to work together: the raw data remains secret, but the result of the computation is usable for each bank. It allows you to preserve user privacy.
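A toy sketch of the idea behind MPC, using simple additive secret sharing (an illustrative assumption, not the protocol the banks actually use): each bank splits its private figure into random shares, every party adds up only the shares it holds, and combining those partial sums reveals the joint total and nothing else.

```python
import secrets

MODULUS = 2**61 - 1   # all arithmetic is done modulo a large prime

def share(value: int, n_parties: int) -> list[int]:
    """Split a private value into n random additive shares that sum to it (mod MODULUS)."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Each bank's private total of suspicious-transaction amounts (hypothetical figures)
private_values = {"bank_a": 120_000, "bank_b": 75_500, "bank_c": 310_250}
n = len(private_values)

# 1. Every bank splits its own value into one share per participant.
all_shares = {name: share(v, n) for name, v in private_values.items()}

# 2. Each participant sees only one share from every bank and adds them up locally.
partial_sums = [sum(all_shares[name][i] for name in private_values) % MODULUS
                for i in range(n)]

# 3. Combining the partial sums reveals the joint total -- and nothing else.
joint_total = sum(partial_sums) % MODULUS
print(joint_total)   # 505750, computed without any bank revealing its own figure
```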
In March, it was reported that the Dutch scientific research organisation TNO and the Dutch financial services company Rabobank were collaborating on a project to detect financial crime.
Wiggerman explains that there is strong collaboration between banks when it comes to cybersecurity, and that ABN AMRO does not keep its anti-crime efforts to itself: information is shared between banks when something happens. This is where tech companies might differ.
When we think of big tech, we tend to think of large organizations that deliver social media platforms, not financial services. What is clear is that big tech is being driven forward by machine learning and artificial intelligence.
So it should come as no surprise that these same companies are now moving into financial services, and collaboration with traditional banks may be the way forward for big tech to enter the industry.