A legal framework for uploaded minds

The short story Lena is one of the best cautionary tales about the dangers of mind uploading. It is about an individual who had their brain scanned and then released the scan data into the public domain; the rest of the story details the horrors that resulted from this careless act. I highly recommend reading it; it's free and will take no more than five minutes of your time.

The protagonist would have been better off if they had established up-front the rights and permissions for use of their mind data. But what is the legal basis for such rights? Artificial entities aren't covered by our laws, except for corporations, which have been granted rights by the courts ("corporate personhood", a move which many see as a mistake). So an uploaded brain would have no rights under our current legal system, and could be exploited, or even tortured, at will.

One might argue that this issue is immaterial (pun intended), in that the suffering of emulated brains isn't "real" suffering. But we can make the same argument about other non-human beings: we don't allow cruelty to animals even though animals are not considered to have the same rights as persons, and we don't allow child pornography even if the child depicted is imaginary. In both cases we consider the harm to fall not just on the subject, but on the person watching.

However, if we wait around for society to adopt a set of moral codes towards uploaded beings, we may be waiting a long time. Instead, we need a different solution.

The GNU General Public License (GPL) uses a clever trick to piggyback a set of rights on top of copyright law: It basically says that the author (who has all rights) will grant some of their rights to others provided that they abide by certain conditions (namely, that they share the work with others).

However, copyright law may not be the best basis for protecting brain scans. Another law may be of more use: the Health Insurance Portability and Accountability Act (HIPAA), which defines a set of privacy rights for patient data. One can argue that brain scans are a particularly sensitive kind of "patient data".

So the idea here is to use HIPAA as the basis for a kind of "GPL" for brain data: a license that would allow the patient to grant use of the data for certain purposes while prohibiting others.

What kind of rights and protections would this license grant? We don't yet know what capabilities future societies will have, and we certainly don't know what kind of social, legal and moral problems will arise. However, we can set some very general ground rules.

Primary among these is the right of "consent to run": an uploaded brain may not be forced to run if it doesn't want to. This is the "suicide rule" that prevents the worst kinds of exploitation and torture.

However, during the early years of developing mind-emulation technology, the process will be imperfect, and we may need to experiment on minds that are incompetent - that is, unable to give meaningful consent either way.

Thus, we need to distinguish between "competent" and "incompetent" minds, but we also need to structure the rules so that there's no commercial incentive to judge minds incompetent in order to exploit them.

(This issue of incompetent minds is touched on in the early chapters of Greg Egan’s Diaspora.)

An example of such rules might be:

  • Incompetent minds may only be run for research purposes, where the goal of that research is to improve the technology of emulation.
  • Incompetent minds may not be run indefinitely, but only for short (subjective) durations, where the length of time can be reasonably justified by research needs. So basically no eternal Heavens or Hells.

In other words, you can't build a digital sweatshop out of incompetent minds, and you can only build one out of competent minds if they agree to it (otherwise they will shut themselves off).


The short story "The Cookie Monster" would argue against that latter point, and even makes use of the fact.

Another story which touches on this is The Modular Man by Roger MacBride Allen (with an essay by Isaac Asimov), one of the "The Next Wave" books from the early 90s.

Lena was fascinating and horrifying. Thanks for the tip!

I keep wondering about the sensory experience of an uploaded sentient mind. Is being a disembodied brain, with nothing but electronic input and output, a remotely tolerable existence—even momentarily?

Maybe I should read some of the other stories cited here.

Greg Egan deals with the sensory experience of an uploaded mind in Permutation City - essentially, the researcher utilizes an off-the-shelf medical physiological simulation program to provide the illusion of a body, even though that body is simulated in a much lower resolution than the neurons in the brain.

The short story “Lena” puts forward the idea that the environment for an uploaded mind is determined by the simulation environment which is set up for it.

Certainly, the possibility of isolation (and torture) has been writ large in Harlan Ellison's I Have No Mouth and I Must Scream and in the webcomic Schlock Mercenary, where an AI is cut off from sensory input while still having a full processor core running at full speed, which drives it insane.

Schlock Mercenary - Sunday 19 Aug 2012


Now that I review, it does touch on that…

MMAcevedo’s requirements for virtual creature comforts are also more significant than those of many uploads, due to Acevedo’s relatively privileged background and high status at the time of upload.

[Image: dinner scene at Flynn's swanky virtual safehouse in the film Tron: Legacy (Disney, 2010). Flynn is dressed all in white; Sam and Quorra wear tight black hi-tech jumpsuits. They sit at an ornate crystal table set with a pig roast and glasses of a blue beverage, in a brightly lit, sleek, white and hypermodern room.]

David Brin’s short story “Stones of Significance” (published in ‘The Best of David Brin’) concerns the legal rights of AIs/uploaded minds. It portrays a set of cascading arguments and simulations for and against the matter, the ending being the realisation that the narrator is, themselves, an artefact receiving a debriefing about their efforts in the last iteration of the case arguments.

While more concerned with the rights of a cybernetic mind than an uploaded copy of one, the Star Trek episode "The Measure of a Man" considers the legal standing of Lt Commander Data (although one might consider it a little late in the day to ponder the individual rights of a Starfleet executive officer!)

That ST:TNG episode was followed up by a book, Immortal Coil, which gathers every instance of androids from ST:TOS and spins them into a single seamless narrative. Highly recommended.


Wow, thank you. My TASAT time investment is paying off.

Investigating the mysterious destruction of a new android created by Starfleet, Data and the crew of the USS Enterprise uncover startling secrets stretching back to the galaxy’s dim past. That knowledge is coveted by beings who will stop at nothing to control it, and will force Data to redefine himself as he learns the hidden history of artificial intelligence.
Immortal Coil | Memory Alpha | Fandom

Coool! I haven’t read many ST novels, but this may make the list.