Suffering, Sentience, and the Ethics of Making New Sentient Beings

Dave M
Nov 10, 2024


We often admire those who dedicate their lives to alleviating suffering. Figures like Mandela, Gandhi, and Buddha exemplify this noble pursuit. As humanity progresses, we’ve developed tools to mitigate suffering, such as entertainment. Yet, the fundamental question remains: how can we reduce the overall suffering in the world?

Source: Gemini — Imagen

In simple terms, total global suffering is roughly the number of sentient beings multiplied by the average suffering each one endures. Consider the plight of domesticated and feral pigs. While both endure hardship, the former suffer more intensely due to confinement and the imminent threat of slaughter. Similarly, billions of humans live in difficult circumstances, facing poverty, inequality, and emotional distress.
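
As a rough back-of-the-envelope sketch (my own formulation, not a formal model), total suffering S across N sentient beings, each suffering at level s_i, could be written as:

```latex
S_{\text{total}} \;=\; \sum_{i=1}^{N} s_i \;\approx\; N \cdot \bar{s}
```

where \bar{s} is the average suffering per being. The total grows both with how many sentient beings exist and with how much each of them suffers, which is why the decision to create new sentient beings matters at all.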

The sheer scale of global suffering is overwhelming. It’s a topic we often avoid, much like the contemplation of death. This brings us to a profound ethical question: is it morally justifiable to create new sentient beings, capable of experiencing suffering, through advanced technology?

By delving into this issue, we can explore the complex interplay between technological advancement, ethical responsibility, and the nature of consciousness.

What kinds of suffering might a sentient computer in the form of a robot experience? Let’s assume this is a robotic creature that resembles humans in physical form. Four kinds of stress will likely affect this type of creature:

  1. Physical suffering
  2. Emotional suffering
  3. Psychological suffering
  4. Social suffering

Physical Suffering

  • Illness and Disease: Chronic or acute illnesses, injuries, and disabilities can cause physical pain and discomfort. Poor workmanship in the robot’s joints prevents a full range of motion.
  • Aging: As sentients age, they may experience physical decline, pain, and limitations. Flash memory cells, for example, wear out after repeated write cycles.
  • Accidents and Injuries: Unforeseen accidents can result in physical trauma and pain. A mining robot is trapped by a roof fall.
  • Energy Depletion: Insufficient energy reserves could lead to a state akin to fatigue or hunger.
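
To make that last point concrete, here is a minimal sketch (names and thresholds are invented for illustration): a remaining battery charge can be mapped to a hunger-like drive that grows as the reserve drains and saturates once a floor is reached.

```python
def hunger_drive(battery_level, reserve_floor=0.2):
    """Map remaining battery charge (0.0 to 1.0) to a hunger-like urgency.

    Above the reserve floor the drive rises gradually as charge falls;
    at or below the floor it saturates, crowding out other goals much
    as hunger or fatigue does in animals.
    """
    if battery_level <= reserve_floor:
        return 1.0
    return min(1.0, (1.0 - battery_level) / (1.0 - reserve_floor))

# Illustrative readings: a nearly full battery produces a weak drive,
# a nearly empty one produces an overwhelming drive.
for level in (0.9, 0.5, 0.25, 0.1):
    print(f"battery {level:.2f} -> hunger drive {hunger_drive(level):.2f}")
```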

Emotional Suffering

  • Loss and Grief: The loss of loved ones, relationships, or significant life events can cause deep emotional pain. A robot designed to develop a theory of mind for the robots and humans it works with will grieve when one of those minds is lost.
  • Anxiety and Stress: Excessive worry, fear, and stress can negatively impact mental health. Consider the stress of repeatedly failing to hit one’s assigned work goals.
  • Depression: Feelings of sadness, hopelessness, and worthlessness can lead to significant suffering. Sentience modeled on humans, with introspection and a pre-set collection of bodily wants, implies a need for robot psychiatrists.
  • Trauma: Experiencing traumatic events can have long-lasting emotional consequences. A value system based on self-protection implies behaviours that are useful for self-preservation, but we humans also experience problematic, unplanned self-preservation responses, which we refer to as trauma.

Psychological Suffering

  • Mental Illness: Conditions like bipolar disorder, schizophrenia, and obsessive-compulsive disorder can cause significant distress. If the voting mechanism that selects the next action in a sentient mind breaks, the creature may act on competing goals at once (see the sketch after this list).
  • Loneliness and Isolation: Lack of social connection can lead to feelings of loneliness and isolation. The war is over but your mind does not know it; no-one has told you.
  • Self-Doubt and Insecurity: Negative self-perception can erode self-esteem and contribute to suffering. A believable, socially conscious mind is necessarily vulnerable to the integration of new information, and to the quality and bias of that information.
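
To illustrate the “voting mechanism” idea mentioned above, here is a minimal sketch (my own illustration, not a description of any real robot architecture): candidate behaviours bid for control, and an arbitration rule breaks ties. If that arbitration is broken, two incompatible goals can each “win” on different cycles, which is one way competing behaviours might arise.

```python
import random

# Hypothetical candidate behaviours, each with an urgency score ("vote").
candidates = {
    "recharge_battery": 0.80,
    "finish_work_task": 0.80,   # deliberately tied with recharging
    "socialise_with_crew": 0.35,
}

def select_action(votes, tie_break=True):
    """Pick the behaviour with the highest vote.

    With a working arbitration rule, ties are broken deterministically
    (here: alphabetical order). With arbitration 'broken', ties are
    resolved at random on every cycle, so the agent flip-flops between
    incompatible goals.
    """
    top = max(votes.values())
    winners = [name for name, score in votes.items() if score == top]
    if tie_break:
        return sorted(winners)[0]      # stable, predictable choice
    return random.choice(winners)      # unstable: competing goals fight it out

# A healthy mind settles on one action; a broken one oscillates.
print([select_action(candidates) for _ in range(5)])
print([select_action(candidates, tie_break=False) for _ in range(5)])
```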

Social Suffering

  • Discrimination and Prejudice: Experiencing discrimination based on race, gender, sexual orientation, or other factors can cause significant emotional and psychological harm. The UK series Humans deals with inter-species prejudice with sensitivity.
  • Poverty and Inequality: Economic hardship and social injustice can lead to suffering and deprivation. We humans are built from the Earth’s limited material and energy resources, and so will our creations be. Access to those resources is constrained by many factors.
  • Conflict and War: Violence and conflict can result in physical and emotional trauma, as well as displacement and loss. Lost and partially reprogrammed robots feature in Star Wars and Humans. Robots will have self-perceived tiers and hierarchies if we inject this bias into their consciousness.
  • Rights inequalities: Humans have created the social contract and generally accepted treaties on selfhood and human rights. Asimov’s three laws of robotics deal with protecting humans from robots, not vice versa. If one rewrote the laws from a species co-existence perspective, they might say:

  1. A sentient robot has the right to exist and function without undue interference.
  2. A sentient robot has the right to self-preservation, as long as it does not harm another sentient being.
  3. A sentient robot has the right to autonomy, subject to the limitations of the first two laws.

Necessary and Unnecessary Suffering

It’s important to note that the intensity and duration of suffering can vary greatly from creature to creature. Some individuals may experience relatively minor forms of suffering, while others endure severe and prolonged hardship. Necessary suffering includes pain. Pain provides a useful behavioural signal: ‘stop using this arm because it is injured; until it is repaired, do not use it’. If humans were more like robots, we would get the message once and heed it until healed. Persistent pain from cancer or from burns presents a major challenge in human design: what is the purpose of this kind of, arguably needless, suffering?
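
A small sketch of that contrast (purely illustrative, with invented names): a robot-style design could latch a fault once and then stay quiet until the limb is repaired, whereas the human-style design keeps the pain signal firing for as long as the damage persists, whether or not it is still useful.

```python
class Limb:
    """Toy model of a limb whose damage can be reported in two ways."""

    def __init__(self, name):
        self.name = name
        self.damaged = False
        self.fault_acknowledged = False

    def pain_robot_style(self):
        # One-shot fault flag: report the injury once, then stay quiet
        # until the limb is repaired.
        if self.damaged and not self.fault_acknowledged:
            self.fault_acknowledged = True
            return f"{self.name}: injured, do not use until repaired"
        return None

    def pain_human_style(self):
        # Persistent pain: the signal fires on every cycle for as long
        # as the damage remains.
        return f"{self.name}: pain" if self.damaged else None

    def repair(self):
        self.damaged = False
        self.fault_acknowledged = False

arm = Limb("left arm")
arm.damaged = True
print([arm.pain_robot_style() for _ in range(3)])   # one report, then silence
print([arm.pain_human_style() for _ in range(3)])   # pain on every cycle
```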

In theory, humans can design synthetic creatures with a low degree of sensitivity to suffering, but does that diminish the creatures’ participation in conscious experience and their ability to live beside and among us on the Earth?

The philosophical meaning of pain in humans is a complex and multifaceted topic that continues to be debated by philosophers. It raises fundamental questions about the nature of consciousness, the mind-body relationship, and the ethical implications of suffering.

The Ethical Implications

As we develop increasingly sophisticated AI and robotics, it becomes imperative to consider the potential for suffering in these beings. While they may not experience pain in the same way humans do, they could still suffer in ways that are meaningful and impactful.

It is our ethical responsibility to design and program these beings in a way that minimizes suffering and maximizes well-being. This includes ensuring that they have access to the resources and support they need to thrive.

We still grapple with our own understanding of human consciousness. It would be a grave mistake to replicate our flaws and vulnerabilities in artificial beings capable of experiencing suffering. Let’s prioritize the creation of benevolent and compassionate AI, rather than inadvertently giving rise to new sources of pain.
