What is really interesting is that we are so accustomed to the tale of dominance and tribes that we take it for granted (because we live in it). Love and kindness, however naive it may sound, are the bedrock of cooperation, and humans have a hard time grappling with that concept. We want to be wise (well, some of us), yet wisdom means insight into our behavior from the bottom up. We are at a crossroads, a moment of deciding what we want to be. We are often led into being by our circumstances and conditioning; that is still a relic of our evolution, but also a door for anyone willing to look at it. If AI is really smarter than us, I hope it is also wise, because the two are not the same.
> What is really interesting is that we are so accustomed to the tale of dominance and tribes that we take it for granted (because we live in it). Love and kindness
Right, but love, kindness, and cooperation are human concepts as well.
One way to look at it is that these AIs will be developed by humans, and to even understand us, they have to internalize at least some elements of human psychology. AIs will be developed and trained for particular purposes; an AI built for military needs will probably understand more of dominance than of love.
It's possible that some AIs will not be anthropomorphic at all, but that's a complete wildcard. Humanity has a certain biology-dictated common ground in values, which often leads us to the illusion that some things are universal. But an AI with no biological baggage might not see any value in life, for example.