Moral Deskilling: How Automation Is Quietly Weakening Ethical Judgment in Mental Health

Ethics rarely collapse in a dramatic moment. They erode. Quietly. Increment by increment. Often with good intentions.

In mental health care, ethical drift does not usually announce itself as misconduct or malpractice. It shows up as convenience. As relief. As the gentle outsourcing of decisions that once required pause, discomfort, and human deliberation. The rise of automation and AI has not created this drift, but it has accelerated it. What we are witnessing now is not simply a technological shift, but a cognitive one. A gradual weakening of ethical muscle memory. A phenomenon increasingly described as moral deskilling.
Read the rest here.

Who Is Responsible When the Algorithm Is in the Room?

The supervision room has always been a space of translation. A clinician arrives carrying fragments of a session: a tone that lingered too long, a silence that felt weighted, a decision that did not quite settle in the body. A supervisor listens, asks questions, and helps transform experience into ethical judgment. For decades, this exchange assumed something simple but foundational: that clinical decisions emerged from human perception, human reasoning, and human responsibility.

Read the rest here.