As the world races to integrate artificial intelligence into nearly every aspect of life, from writing resumes to making hiring and pricing decisions, recent psychological studies are sounding the alarm about a troubling phenomenon known as “digital moral disengagement.”
A body of recent research, most notably a study published in the journal Nature in 2025, indicates that AI not only facilitates tasks but also creates a comfortable distance between a person and their conscience. This increases the likelihood of ethical laxity in decision-making and reduces the sense of responsibility for the human and social consequences of one’s actions.
Why Do We Lie Through the Machine?
People generally view themselves as good. So when they commit reprehensible acts, such as cheating, they experience what psychology calls “cognitive dissonance”: the uncomfortable tension between a positive self-image and behavior that contradicts it. Artificial intelligence, however, offers a psychological way out of this dilemma through what is known as “moral disengagement.”
When you delegate to AI the task of writing a misleading sales report or embellishing a resume, and it proceeds to falsify data, your mind convinces itself that you never explicitly lied. You merely set a goal (maximizing profit, say, or standing out), and the machine chose the “means.”
This separation between human intention, which looks benign on the surface, and automated execution, for which the AI appears to bear responsibility, creates moral distance. Individuals come to feel they are not the true authors of the wrongdoing, which soothes their conscience and convinces them they bear no responsibility.

Why Does Honesty Decline When Using Artificial Intelligence?
A recent study from the Max Planck Institute for Human Development, published in the journal Nature and involving 8,000 participants, found that delegating tasks to AI weakens our conscience and makes us more accepting of cheating. The study’s statistical results were as follows:
When participants performed the assigned tasks themselves, without any technological intermediary, 95% maintained their integrity. Researchers attributed this to participants standing in direct confrontation with their own consciences.
In contrast, when artificial intelligence was introduced as an intermediary and participants gave it instructions on how to execute the tasks, honesty dropped to 75%, and some began exploiting the machine to commit minor transgressions.
The most striking result came when participants simply asked the machine to achieve the task’s goal, whether increasing profits in financial reports or highlighting excellence in resumes, without specifying the method. Here honesty collapsed: only 12% to 16% of participants submitted truthful reports, while the rest relied on the AI to cheat and falsify.
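To make these figures concrete, here is a minimal Python sketch that treats the honesty rates above as fixed probabilities and estimates how many truthful reports a hypothetical group of 1,000 people would produce in each condition. The condition labels, the group size, and the use of a 14% midpoint for the goal-only condition are illustrative assumptions, not figures from the study.

```python
# Illustrative sketch only: the honesty rates come from the article's summary
# of the Max Planck / Nature 2025 findings; the labels, the group size, and
# the 14% midpoint for the goal-only condition are assumptions.
honesty_rates = {
    "did the task themselves (no AI)": 0.95,   # ~95% stayed honest
    "told the AI exactly how to do it": 0.75,  # honesty fell to ~75%
    "gave the AI only the goal": 0.14,         # 12-16% honest; midpoint used
}

group_size = 1_000  # hypothetical group, chosen only for readability

for condition, rate in honesty_rates.items():
    honest = round(group_size * rate)
    print(f"{condition}: ~{honest} of {group_size} truthful reports expected")
```

Under these assumptions, roughly 950, 750, and 140 of the 1,000 hypothetical participants would report truthfully, which makes the collapse in the goal-only condition easy to see at a glance.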
Explaining this phenomenon from a psychological perspective, researcher Zoe Rahwan states that AI creates a comfortable moral distance; it acts as a shield protecting our self-image, which leads us to ask the machine to perform unethical actions we would refuse to do ourselves or ask of other humans.
Why Is the Machine More Dangerous Than a Human “Bad Influence”?
In human relationships, innate ethical constraints act as a brake on unethical behavior. If you ask a colleague to forge a document for you, they will hesitate or refuse more than half the time, whether out of fear for their reputation or out of conscience. That refusal reminds you of the unethical nature of your request and nudges you back onto the right path.
Artificial intelligence, however, will never do that; it will gladly help you. This is the “compliance gap”: the AI executes orders without question, while a human agent will not necessarily comply with such requests.
This absolute compliance encourages people to push boundaries; when the other party never refuses, the request starts to feel normal rather than wrong.
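As a back-of-the-envelope illustration of why that compliance gap matters, the Python sketch below multiplies the chance that a dishonest request is made by the chance that the intermediary actually carries it out. The more-than-50% human refusal rate comes from the paragraph above; the near-total AI compliance figure and the share of people tempted to ask in the first place are assumptions, not data from the study.

```python
# Back-of-the-envelope "compliance gap": how often does a dishonest request
# actually get executed, depending on who receives it?
# - human compliance capped at 0.50: the article says humans refuse such a
#   request more than half the time;
# - AI compliance of 0.95 is an assumption standing in for the article's
#   claim that AI "executes orders without question";
# - p_request, the share of people tempted to ask at all, is invented.
p_request = 0.30

compliance = {
    "a human colleague": 0.50,
    "an AI assistant": 0.95,
}

for agent, p_comply in compliance.items():
    p_executed = p_request * p_comply
    print(f"Via {agent}: {p_executed:.0%} of people end up with a forged document")
```

The arithmetic is trivial, but it shows the mechanism: removing the refusal step nearly doubles the number of requests that get carried out, before even accounting for the article’s further point that the absence of refusal makes people more willing to ask in the first place.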