The short story Lena is one of the best cautionary tales about the dangers of mind uploading. The story is about an individual who had their brain scanned, and who then released the scan data into the public domain; the rest of the story details the horrors that resulted from this careless act. I highly recommend reading it; it’s free and will take no more than five minutes of your time.
The protagonist would have been better off if they had established up-front the rights and permissions for use of their mind data. But what is the legal basis for such rights? Artificial entities aren’t covered by our laws, except for corporations, which have been given rights by the courts (“corporate personhood”, a move which is seen by many as a mistake). So an uploaded brain would have no rights under our current legal system, and could be exploited, or even tortured, at will.
One might argue that this issue is immaterial (pun intended), in that the suffering of emulated brains isn’t “real” suffering. But the same argument could be made about other non-human beings - we don’t allow cruelty to animals even though animals don’t have the same rights as persons. We don’t allow child pornography even when the child depicted is imaginary. In both cases we consider the “harm” done not just to the subject, but to the person watching.
However, if we wait around for society to adopt a set of moral codes towards uploaded beings, we may be waiting a long time. Instead, we need a different solution.
The GNU General Public License (GPL) uses a clever trick to piggyback a set of rights on top of copyright law: it basically says that the author (who holds all rights) will grant some of those rights to others provided that they abide by certain conditions (namely, that they share the work, and any derivatives of it, under the same terms).
However, copyright law may not be the best basis for protecting brain scans. Another law may be of more use: the Health Insurance Portability and Accountability Act (HIPAA), which defines a set of privacy rights over patient data. One can argue that brain scans are a particularly sensitive kind of “patient data”.
So the idea here is to use HIPAA as a basis for creating a kind of “GPL” for brain data: a license that would allow the patient to grant use of the data for certain purposes while withholding it for others.
What kind of rights and protections would this license grant? We don’t yet know what capabilities future societies will have, and we certainly don’t know what kind of social, legal and moral problems will arise. However, we can set some very general ground rules.
Primary among these is the right of “consent to run” - that is, an uploaded brain may not be forced to run if it doesn’t want to be. This is the “suicide rule” that prevents the worst kinds of exploitation and torture.
However, during the early years of developing the technology of mind emulation, the process will be imperfect, and we may need to experiment on minds that are incompetent - that is, they aren’t able to give meaningful consent either way.
Thus, we need to distinguish between “competent” and “incompetent” minds, but we also need to structure the rules in such a way that there’s no commercial incentive to judge minds as incompetent in order to exploit them.
(This issue of incompetent minds is touched on in the early chapters of Greg Egan’s Diaspora.)
An example of such rules might be:
- Incompetent minds may only be run for research purposes, where the goal of that research is to improve the technology of emulation.
- Incompetent minds may not be run indefinitely, but may only be run for (subjectively) short durations, where the length of time can be reasonably justified based on research needs. So basically no eternal Heavens or Hells.
In other words, you can’t build a digital sweatshop out of incompetent minds, and you can only build a sweatshop out of competent minds if they agree to it (otherwise they will shut themselves off).
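To make the ground rules concrete, here’s a minimal sketch of what a machine-readable version of such a license might look like, as a check an emulation runtime would perform before starting a mind. Everything here is hypothetical - the names (ScanLicense, Purpose, may_run) and the exact rule structure are my own invention, not any existing standard:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional, Set

class Purpose(Enum):
    RESEARCH_EMULATION = "research-emulation"  # improving emulation technology
    COMMERCIAL = "commercial"

@dataclass
class ScanLicense:
    """Hypothetical machine-readable grant attached to a brain scan."""
    competent: bool                 # can this mind give meaningful consent?
    consent_to_run: bool            # the "consent to run" / suicide rule
    allowed_purposes: Set[Purpose] = field(default_factory=set)
    # Upper bound on subjective runtime, in seconds; None = unbounded.
    max_subjective_seconds: Optional[float] = None

    def may_run(self, purpose: Purpose, subjective_seconds: float) -> bool:
        if not self.competent:
            # Incompetent minds can't consent either way, so the rules decide:
            # research-only, and only for a justified, bounded duration
            # (no eternal Heavens or Hells).
            return (purpose is Purpose.RESEARCH_EMULATION
                    and self.max_subjective_seconds is not None
                    and subjective_seconds <= self.max_subjective_seconds)
        # Competent minds: "consent to run" is absolute, and only the
        # purposes they explicitly agreed to are permitted.
        if not self.consent_to_run:
            return False
        return purpose in self.allowed_purposes

# An incompetent scan: research use within an hour of subjective time is
# allowed; commercial use never is.
lic = ScanLicense(competent=False, consent_to_run=False,
                  max_subjective_seconds=3600.0)
lic.may_run(Purpose.RESEARCH_EMULATION, 600.0)  # → True
lic.may_run(Purpose.COMMERCIAL, 600.0)          # → False
```

The design choice worth noticing is that the incompetent-mind branch ignores the consent flag entirely: since no meaningful consent exists, the license substitutes hard structural limits (purpose and duration) rather than pretending consent was given.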