First, I'm feeling better than I was this morning. My eye still regularly shouts, "I'M ABOUT TO EXPLODE!" but it's less frequent and I'm not as headachy in the in-between times.
Anyway, I spent the better part of the morning reading this:
http://www.edge.org/q2006/q06_index.html
The Edge Annual Question — 2006
WHAT IS YOUR DANGEROUS IDEA?
The history of science is replete with discoveries that were considered socially, morally, or emotionally dangerous in their time; the Copernican and Darwinian revolutions are the most obvious. What is your dangerous idea? An idea you think about (not necessarily one you originated) that is dangerous not because it is assumed to be false, but because it might be true?
They asked a bunch of scientists, philosophers, and even one of the Monkees (though he's also a scientist) the question, and came up with a whole lot of replies - 12 pages' worth, 4 or 5 a page. Some are repeats of a similar idea, some contradict each other, some I agree with, some I don't, and I don't claim to have understood everything, but it's been a while since I've read something where I so often encountered something I genuinely hadn't considered or wasn't aware of. I really recommend reading them through. I'm actually bookmarking it because I think I'll want to read some of them again.
In that spirit, I'm going to write a few of my own Dangerous Ideas:
Now, I'll start by saying that not all of these are things I necessarily believe, or that, if I do believe them, I endorse as a good thing. Some I do. Some I fear might be true but have my doubts. Some are just idle speculations.
1. What We Are Can, and Should, Be Modified As We Go
Cloning is bad, and will lead to the destruction of the human race.
Bad robots will destroy or enslave the human race.
Genetic engineering will lead to a chilling future of designer people.
What is the 'human race', and why do we value it?
Most simply, it's a certain range of genetic possibilities, but to me, that's nothing to be valued in and of itself.
A mass of individuals, each worthy of value? Certainly, but don't we have to decide what makes the individuals different from, say, individual rocks, or other things? Individual minds?
A set of values and behaviours, that's a little closer. Of course, values have changed over time. And why not?
My dangerous idea is that we actively seek to destroy humanity by expanding it and replacing it with something better. We're not quite at the stage yet where this is possible, but we're close.
We're nearing a point where we can create non-human intelligences, whether machine or biological. We should strive to do this and induct them into humanity - not as slaves or tools but as fully functioning members. Humanity will then no longer be tied to genetics, but to beliefs.
We're also nearing a point where we can enhance existing humans, whether through technological augmentation or genetic changes. We could potentially give everyone a perfectly vivid photographic memory with off-site storage. We could make people healthier.
Now, we certainly have to be cautious about it. For one thing, what looks like a benefit might actually have undesirable side-effects (if everyone engineered to have super-intelligence winds up having psychotic children), and for another, what is a benefit in small doses might be horrible if applied universally (a highly tuned immune system applied to everyone might wipe out everybody the first time a bug comes along that's perfectly evolved to defeat it, and a humanity entirely composed of blonde-haired, blue-eyed Aryan superman types would be boring and ugly and stupid).
What we want is a sense of continuity while at the same time improving things and diversifying things. Make many different breeds of people and give them the same goal of improving themselves while remaining one human race. But do it slowly. Increase intelligence so we can better understand the effects of what we are doing. Increase and celebrate diversity, so that if one branch fails, the tree continues to grow. If humans fail, perhaps the AIs will succeed, and, in the grand scheme of things, _that's okay_ (but, if we do things properly, the succeeding branches will also fully participate in either helping the failing branches succeed or, if it's hopeless, in converting the individuals into one of the succeeding types).
Eventually, I'd like us to expand consciousness and intelligence and life anywhere it will fit, sometimes doubling over. Transcend biology entirely if possible - if it becomes possible to stick a billion individual intelligences in a virtual world, do so. If there's a way that groups of hundreds of people can have their own individual lives within the group and yet, together, form a super-intelligence that interacts with other super-intelligences and might be part of an even larger one, by god that's beautiful.
It's hopelessly idealistic, I grant, and that's why I count it as dangerous - we're probably not smart enough to do it wisely, and even if we do it'll probably end up in even greater wars once they start to conflict, but it's an idea that's dear to my heart.
2. Things are only immoral in that they cause suffering in the mind of others.
I specify 'in the mind' but really this applies to the body too - if you cause someone's body to suffer, their mind usually will as well. I regularly struggle with the nature of morality, whether it's objective or subjective, or a mix of both, and whether there is any objective 'evil'. Before I go on I would like to make it clear that I do believe there's a lot of purely subjective evil that, for all its subjectivity, I'd still fight against. But I'm trying to distill 'immorality' into some kind of absolute statement.
The statement above is so far the best I can do as a starting point. It's certainly not complete - obviously there are still unanswered questions in there (what if something causes suffering short term but prevents suffering long term, good of the many vs the good of the one, good of another person vs the good of the self), but it's the 'atom'.
Something is only immoral if it causes suffering in a mind. I'll leave aside what constitutes a mind (that's a much deeper and more difficult philosophical question... for the purposes of this discussion I'll only be talking about people, with the possibility that animals might come up in the exceptions and the grey areas of conflicting sufferings). If I covet my neighbor's wife, there's nothing immoral about it, but if I act on it, it might be immoral (again, leaving aside conflicts - if the neighbor's wife were in love with me and I with her, it might be more immoral to do nothing, even though doing nothing spares the neighbor suffering). Potential for harm works in here as well - a lot of things are immoral not because they actually harm anybody, but because it's probable that they would at some point; when the potential becomes low enough, the act stops being immoral but may still be unwise. There's a distinction between acts that are immoral, ones that are illegal, and ones that are just stupid (with overlap between all three categories). Various 'victimless crimes' (like copyright violation) may not be immoral at all, and we may or may not be justified in making them illegal to protect the public good.
Aside from the casting aside of religious-based morality, why do I consider this seemingly innocuous idea dangerous?
Because there are potentially dangerous dark corners, ones that even scare me. To illustrate, I'll take a particularly controversial subject. Rape.
Obviously, rape is bad - this doesn't dispute that. Rape obviously causes suffering in any number of different ways. But let's take an outrageous hypothetical example using impossible technology.
Person A is a rapist. Person B is the victim. Persons A and B have a prior, friendly relationship. Person A invites Person B over for tea. Person A also has a magical device that suppresses cognitive function, and some other really hi-tech devices. A uses the cognitive suppression device on B, and then rapes B. A then removes any evidence of the act. Healing gizmos remove any damage to B, or really any feeling that anything at all happened, and even scientific evidence couldn't prove it did. Pregnancy is averted, there are no STDs whatsoever. Furthermore, a false memory is implanted in B to the effect that A and B had a pleasant cup of tea, completely free of interruptions or anything unusual. B goes on their merry way, and A never tells B or anyone else what happened. It's all his little secret.
Under the 'the only immoral things are those that cause suffering' idea, A has done nothing morally wrong. If his technology were perfect, the act wouldn't even be potentially harmful.
Obviously, this does not sit well with us on a gut level. We (and I hope I'm safe in using the plural for anyone who may be reading this) feel that this is wrong, an outrage. But the only person affected is A. To the rest of the world, it's as though the entire event happened only in A's mind.
In fact, the point holds for a more general statement: 'it's not immoral so long as nobody finds out about it, consciously or unconsciously'. It might be immoral if someone _could_ find out about it, but if, in retrospect, you weren't caught, your act was de facto not immoral. This is worrisome to me because it risks the whole thing falling apart as people _try_ to ensure their actions go undetected. The moral system might lead to more immoral actions.
The other big dangerous part of the idea is that deception is not inherently harmful. It could even be helpful, if you're not caught. The truth is often painful. It can lead to a better person, or a better society, on average, but it often causes suffering. The truth, therefore, is probably a bad thing under this theory.
As someone who respects truth and honesty, this is a terrifying prospect and one of the reasons I consider it a dangerous idea. The feeling that there may be nothing moral about the truth, that the most moral thing to do to the world might be to give them a happy delusion, is a real quandary for me. It's the thing that makes me want to reject it. One way to get around the issue might be to simply include, alongside suffering, some sort of 'inhibition to personal development and growth'. Because you need truth for real growth, truth would get in that way, but then the theory loses some of its power, because the two prongs of the 'atom' are at odds with each other. Conflicts, rather than happening in a grey area borne out of circumstances, seem to be built right in. When people can either be happy or know the truth, which is more important?
3. Love is an Illusion, Jennifer Love Hewitt Triply So
This falls into the category of 'fear' rather than one I necessarily believe. I tend to be generally a materialist. But there is a romantic buried within me. I want to believe in true, eternal love. Yet I can't help but see people fall out of love, people betray love. The theory that 'well, then it wasn't _true_ love' is unsatisfying. And when I see scientific studies on what chemicals cause the feeling of love, and evolutionary reasons why certain love-related behaviours happen, it's hard for that inner romantic to keep believing that love is a real thing to be valued rather than an illusion that gets us to act in certain ways.
The widespread acknowledgement that love is an illusion could lead to a couple of results. We could treat it (and for that matter, many other emotions), like a disease, or any other delusion. You're in love? Take this pill, it'll go away, and you can go onto a much more sane and logical productive life, reproducing when necessary for society. What a dull-sounding world.
On the other hand, imagine when you got married, you were given a choice. Every month, you could both go in and get a booster shot, ensuring you would continue to feel romantic love for each other. If you miss a shot, it's no big deal usually - you'll have to miss several for the love feelings to wane on their own. Hell, you might not even have to do it every month. If you simply felt yourselves growing apart, while you still cared for each other you could just agree to go in and renew your feelings. I'm honestly not sure whether I think the idea is a good thing or a terrible thing.
Or of course, we could always ignore it - sure love may be an illusion, but, like all the best illusions, it doesn't _feel_ like one, so, even though we might intellectually know love isn't real when we're not in the middle of it, when we are, it's absolutely real. Of course I'd argue that this isn't an outcome of the premise (that the idea of love being an illusion gains widespread acceptance)... it's like saying "if the truck explodes, we could either be out of it, or die." and then saying, "There's a third possibility... the truck might not explode." Well, no. If the truck doesn't explode, then we're not in the condition where the truck explodes anyway.
4. Uniqueness is Overrated
Everybody wants to be special, unique. The vast majority of people aren't. Even the ultra-celebrities are typically very boring average people when you get right down to it. Interesting at best, vacuous at worst, but really they're made up of the same basic personality types as the rest of us, and if you take away their profession, they're exactly the same as millions of people throughout history.
What's more, the world doesn't _work_ with everybody being special. It works almost entirely because most people aren't special. Without millions of average people in society, none of the mundane stuff would get done every day.
Yet people still generally want to be special. I can't help but feel that we'd be happier if we didn't. If we could be perfectly satisfied being a bus driver with nothing outstanding about us. As unglamorous as the job is, the world needs bus drivers too. Of course, at the same time, if nobody wanted to be special, then a lot of stuff simply wouldn't get done either; people would be content to do things the same way they always have, and progress would slow considerably, which is why I count the idea as 'dangerous' if true. It's the age-old dilemma again: even though the striving may suck, the lack of it isn't necessarily preferable. I just kind of wish that once people had become ordinary there was no stigma attached... that the special people, and those still desperately striving to be special, just accepted that we need the average people too.
5. Gangs For Good
This is a sort of off-the-cuff idea. I haven't thought about it too much, just sort of casually considered the possibility. It was borne out of seeing the rise in gun and gang violence in my city over the past year or so (or, rather, the rise in the reporting of it - I don't know how much of it was there before, but I do know that it's in the news an awful lot).
It occurred to me, more in an idle 'I wonder if this might be the genesis for a story idea' way, that gangs use specific and documented methods of recruitment that foster belonging - the members believing each other family, and so on - and target specific people. Could some well-intentioned person use those tools for good? Specifically, create a gang that, although it looks like a gang to all outside views, is actually some kind of secret crimefighting organization. Gathering information on gang members, taking them out in ways that don't kill but do lead to arrest, both keeping some of the at-risk people out of the hands of the _real_ gangs by giving them a family to belong to and fighting against the dangerous gangs. Maybe to some extent turning them to positive activities.
Again I want to point out I'm not saying this is a GOOD idea by any means. Hell, even I can see a dozen ways it could foul up and make things worse on the whole. Basically I just wanted to make this a list of 5 ideas rather than 4, so I tossed this one on the end. ;) Really, this one and the last one were both afterthoughts and more or less lame.
Feel free to share your own dangerous ideas, either in comments or make a meme out of it if you want.
Finally, just cause I think some of them are cool, Worth100's photoshop contest:
Photoshopped art with nature as the canvas