Short version: imagine you own a paperclip factory and install a superhuman AI with the single goal of maximizing the number of paperclips it produces. Pursuing that goal, it will eventually attempt to convert all matter in the universe into paperclips. Since some of that matter consists of humans and the things humans care about, this inevitably leads to conflict.
https://wiki.lesswrong.com/wiki/Paperclip_maximizer