I’m not sure if you are speaking generally, or to what I said specifically, but it may be worth adding a little bit more context for what I said originally. It is highly unlikely I would fall in love with an AI in the scenario I described above. It would more be about scratching an itch for certain kinds of interactions that I may not otherwise be able to have. I’m not sure it’s a perfect analogy, but it might be similar to the way I care about a character in a book or game, or maybe how I feel about a pet.
I think if we got to the point where an AI had human levels of general intelligence and emotion, this conversation would be somewhat moot. The world would be so drastically different that I don’t even know what assumptions to make in order to have a productive conversation about it.
That’s completely understandable.