Sunday, May 18, 2014
I agree with Ronald Bailey in this article, in which he argues for allowing genetic testing of embryos. Some bioethicists want to ban such testing because they fear that parents will make poor decisions with the information, and because they think children should have an "'open future' unburdened by the knowledge of their genetic predispositions for adult onset illnesses."
This idea of an "open future" seems laughable to me. It made me think of an ostrich being stalked by hyenas. Does it have a more "open future" if it sticks its head in the ground and remains oblivious to the impending attack, or if it knows that an attack might come and has the opportunity to try to deal with it to the best of its ability?
I understand that many people think that "Ignorance is bliss" and that we're all better off not knowing some things, but I (and others) often disagree, and I don't want some people forcing their knowledge preferences on others. I think this tendency to prefer ignorance comes from an irrational attempt to justify, and romanticize, traditional ways of dealing with the world that occurred before modern information was available.
This line of thought led me to think of other related issues of information and its control.
People may prefer to remain unaware of "spoilers" before they've seen a work, because they want to experience it for the first time the way the creator intended, and social pressure and conventions can help them remain ignorant in this way. But I don't think they have a coercively enforceable right to prevent others from revealing things that they might not want to know. They don't own that information and don't have a right to control it.
If I don't think somebody has a right to prevent me from knowing test results (as long as I'm paying for the tests, and there are others who are willing to perform and help interpret the results for me), do I have a right to prevent other people from knowing things about me? What information is mine to control, and what isn't?
Do people have a right to control "private" information about themselves?
I think it depends. If the information was illegally obtained, overriding reasonable attempts to keep it private, then they probably do. If they allowed the information to get out by being unreasonably lax about controlling it, then they probably don't.
Should Google be able to publish personal details from my Gmail messages? Should it be able to sell information electronically gleaned from them to third parties? What about aggregated information? Should the NSA be able to read them (without a warrant)? My intuition is that the case against the government violating my privacy is stronger, because it has the power to cause me more harm with that information, and so privacy protections against it should be stronger.
I think this decision was wrongly decided. Apparently the European Union Court of Justice thinks people have a right for things about them to be "forgotten," and it thus places a huge burden on Google (and other search engines, I suppose) to comply with requests to remove links to things people don't want to be easily discoverable.
The line is fuzzy, and we'll have to evolve standards for where the default boundaries are (although people should be able to explicitly grant more rights over their information to others; but not just by clicking "I agree" to a long legalistic notice). As computer systems, and the methods of aggregating information, change, so too will reasonable conventions about what we can expect to remain private by default.
It would be nice if there were a simple rule that we could always apply and be sure about what information is ours to control, but I don't think we'll ever have such a rule (we might have laws that try to do it, but they'll be bad ones).
The world changes and I think the right way to live in it changes, too.