A digital twin is a virtual copy of a real-world process, system or person. According to some, you will probably have a digital double of your own within the next decade.

Presented with the same materials you are, such a digital twin would be able to make the same decisions you would.

That might seem like a speculative claim, and many people assume it is impossible. But it is more plausible than it sounds.

Given enough information about us, artificial intelligence can make many inferences about our personality, our social behaviour and our purchasing decisions.

In the era of big data, enormous amounts of information are collected about you, and you leave behind behavioural traces with nearly every digital interaction.

The extent to which organizations collect our data can be quite shocking. The Walt Disney Company, for instance, acquired a company with a questionable record when it came to collecting user data.

Even seemingly benign phone applications can collect large amounts of users' data every few minutes.

Users and regulators alike are concerned about the prospect of someone being able to identify, predict and shift their behaviour.

How worried should we be?

High vs. low fidelity

Fidelity refers to how closely a model corresponds to its target. Any simulation requires a certain degree of realism: when we press keys in a video game, we expect the image on screen to speed up and slow down in response.

But the fidelity of a video game is lower than the fidelity of a professional driving simulator.
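To make the fidelity contrast concrete, here is a toy sketch of my own (not from the original article) comparing two models of the same process, a car slowing down. One halves the speed every second, arcade-style; the other applies a crude physics model with braking deceleration plus speed-dependent drag. All parameter values are illustrative assumptions.

```python
def low_fidelity(v0: float, seconds: int) -> float:
    """Arcade-style model: halve the speed every second."""
    v = v0
    for _ in range(seconds):
        v *= 0.5
    return v


def higher_fidelity(v0: float, seconds: int,
                    brake: float = 4.0, drag: float = 0.02) -> float:
    """Crude physics model: constant braking plus drag proportional to v^2.

    brake and drag are made-up illustrative constants, not real vehicle data.
    """
    v = v0
    dt = 0.01  # integration step in seconds
    for _ in range(int(seconds / dt)):
        v = max(0.0, v - (brake + drag * v * v) * dt)
    return v


print(low_fidelity(30.0, 3))     # arcade model's answer
print(higher_fidelity(30.0, 3))  # physics-flavoured model's answer
```

Both are still simplifications; the point is only that one tracks the underlying physics more closely than the other.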

A digital twin requires a high degree of fidelity: it must incorporate real-time, real-world information about the system it mirrors.

Digital twins can have big implications. If we can model a system of human and machine interaction, we can allocate resources, anticipate shortages and failures, and make projections.
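As a minimal sketch of what such a projection might look like (my own illustration, not the article's): a "twin" of an inventory process that ingests real-world consumption readings and projects when stock will run out at the observed rate. All names and numbers here are hypothetical.

```python
class InventoryTwin:
    """Toy digital twin of a stock of supplies (illustrative only)."""

    def __init__(self, stock: float):
        self.stock = stock
        self.rates = []  # real-world consumption readings (units/day)

    def ingest(self, used_today: float) -> None:
        """Update the twin with a real-world measurement."""
        self.stock -= used_today
        self.rates.append(used_today)

    def days_until_shortage(self) -> float:
        """Project forward using the average observed consumption rate."""
        if not self.rates:
            return float("inf")
        avg = sum(self.rates) / len(self.rates)
        return self.stock / avg if avg > 0 else float("inf")


twin = InventoryTwin(stock=100.0)
for used in (8.0, 10.0, 12.0):  # three days of readings
    twin.ingest(used)
print(twin.days_until_shortage())  # 70 units left at 10/day -> 7.0 days
```

Real digital twins do this at vastly larger scale and with live sensor feeds, but the pattern is the same: mirror the system's state, then run it forward.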

A human digital twin would likewise contain a great deal of data about a person's preferences, biases and behaviours, and would use that data to predict their future behaviour.

Achieving a true digital twin of a person is out of the question for the foreseeable future: maintaining a live virtual model of a user would require installing an enormous array of sensors. Instead, developers settle for low-fidelity models.

Ethical issues

Digital twins raise social and ethical issues concerning data integrity, the accuracy of a model's predictions, the surveillance capacities required to create and update a digital twin, and who owns and has access to one.

According to Benjamin Disraeli, there are three types of lies: lies, damned lies, and statistics.

The data collected about us can be used to make predictions about how we will act in certain situations.

Misunderstandings about how statisticians gather and interpret data point to an important concern.

One of the most important ethical issues raised by digital twins is the quantitative fallacy: the assumption that only what can be measured matters.

When we focus on numbers, we tend to overlook the fact that their meaning depends on the measurement instruments used to collect them. An instrument that works well in one context can fail in another.

We must also acknowledge that when certain features are selected for measurement, others are left out, and this selection is often made out of convenience or practicality rather than principle.

Claims made with data and artificial intelligence rest on design decisions that are often not available for scrutiny. To evaluate such claims, we need to understand how the data was collected.

Power imbalances

There is a growing public discussion about the power the private sector holds over personal data.

At smaller scales, this power can create or deepen digital divides. At larger scales, it threatens a new colonialism based on access to and control of information and technology.

Even the creation of low-fidelity digital twins provides opportunities to monitor users, attempt to influence them, and represent them to others.

Failing to give users the ability to access and assess their own data threatens both individual autonomy and the collective good of society.

Individuals do not have access to the same resources as large corporations and governments, nor the time, training or motivation to monitor how their data is used. That is why independent oversight is needed to ensure that our digital rights are not lost.

The author is an assistant professor in a university psychology department.

This article is republished under a Creative Commons license. The original article is worth a read.