I’m constraining the laws of sentience in my own science-fiction universe. I’m still conceptualizing, not wording a polished version.
The principles of sentience
- one must never act to harm self or other sentients
- one must practice tit for tat with a tenth extra measure of forgiveness
- sentients disarm and uplift all subsentients to mitigate self-harm
- sentience is a measure of behavior only applicable on millennial scales
These ideas lead me to question: where exactly does the Hippocratic principle of “first, do no harm” fail us as humans and lead to the mass-murder orgies of war?
The existence of these laws implies the existence of an institution to dictate and enforce them.
The place where these kinds of things fall apart, IMO, ultimately comes not from issues concerning the interactions of individual people, but from issues concerning the interactions of people with institutions.
Laws of Robotics by Asimov. See https://en.wikipedia.org/wiki/R._Daneel_Olivaw
That is what I am inverting in concept
I think it’s because we don’t treat “first, do no harm” as a first and higher-order law. It gets overridden by religion or vengeance; something might seem threatening or unfair due to subjective perspective, and then you’re doing harm to defend something (unrightfully)…
And what about situations where you can’t avoid harm? And if you can never do harm, you also can’t defend anything against malicious actors?! I mean, that might be alright in your sci-fi universe, but that’s definitely not how our world works. We have malicious sentient beings around, and it’s necessary to act against them.
What even does the second rule mean? Tit for tat, but also forgive (with a quantifier for how much forgiveness)? Sounds like an oxymoron to me.
Game theory. Must watch (Veritasium):
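It’s not an oxymoron in game-theoretic terms. Here’s a minimal sketch of one plausible reading of the second rule, assuming “a tenth extra measure of forgiveness” means a 10% chance of cooperating even after the opponent defects (i.e. generous tit-for-tat in an iterated prisoner’s dilemma). The payoff values and names are illustrative, not from the original post:

```python
import random

# Standard iterated-prisoner's-dilemma payoffs: (my score, their score).
PAYOFFS = {
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # I cooperate, they defect
    ("D", "C"): (5, 0),  # I defect, they cooperate
    ("D", "D"): (1, 1),  # mutual defection
}

def generous_tit_for_tat(opponent_last, forgiveness=0.1, rng=random.random):
    """Cooperate first; then copy the opponent's last move, but forgive
    a defection with probability `forgiveness` (the 'tenth extra measure')."""
    if opponent_last is None or opponent_last == "C":
        return "C"
    return "C" if rng() < forgiveness else "D"

def play(strategy_a, strategy_b, rounds=100):
    """Run two strategies against each other and return their total scores."""
    last_a = last_b = None
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(last_b)
        move_b = strategy_b(last_a)
        pa, pb = PAYOFFS[(move_a, move_b)]
        score_a += pa
        score_b += pb
        last_a, last_b = move_a, move_b
    return score_a, score_b
```

So “tit for tat plus forgiveness” retaliates against defection (it isn’t a pacifist rule), but the small forgiveness term breaks endless revenge cycles between two retaliators, which is exactly what plain tit-for-tat gets stuck in after a single misstep.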
So like the 10 commandments but for aliens?
Sentience is the ability to experience feelings and sensations. It’s a pretty low bar. I think the word you’re looking for is sapience.