I would say that in any game involving a lot of uncertainty, subjectivity, and random chance, an AI would not usually have an edge over a human. Maybe games like Diplomacy, Magic: The Gathering, Pandemic, Gloomhaven, maybe poker (?), backgammon (?), etc.
On a sidenote, I think it’s interesting that we’re at a point where it’s starting to get hard to come up with games that humans can beat AI at.
Computers are better than humans at poker (DeepStack) and backgammon (eXtreme Gammon). XG for example is commonly used by expert backgammon players to analyse play, much like how engines are used in chess.
There is no reason why computers wouldn't eventually beat a human at the others, if someone writes a narrow AI for them. Consider, for example, AlphaStar for StarCraft.
True randomness doesn't drive better results for humans because of some intrinsic human quality; rather, it eliminates the AI's advantage of large-volume data access and processing. Basically, true randomness levels the playing field. Backgammon is a perfect example.
You want a game where searching large amounts of data, computing moves and calculating probabilities fast doesn't help.
Maybe some randomness will help, but it might not be enough.
My idea is that bots can't win in the real-world economy: no fund driven solely by algorithms can win more than funds driven by humans.
So if we can find a game modeled like the economy, where nothing is random but many things are uncertain, then it might be harder for the software to win.
I would imagine that in many games, randomness (e.g. through throwing dice, or pulling cards from a randomized pack) adds noise, but the underlying strategy (including processing data, calculating probabilities, etc.) still helps. In such a game, even though an AI might be superior, the randomness might mean that occasionally the human wins. Or for an extreme example, if nothing you do in the game really matters, it's all just down to random chance, then the AI:human win rate should approach 50% (see the sketch below). But such a game would probably not be particularly enjoyable to play.
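To make that intuition concrete, here's a minimal toy simulation of my own (not anything from a real game): each side's score mixes a fixed skill edge with uniform random noise, where skill_gap and luck_weight are hypothetical knobs I'm introducing just for illustration. As luck_weight goes to 1, the stronger player's win rate drops toward 50%.

    import random

    # Toy model: each side's score mixes a fixed skill edge with random noise.
    # skill_gap and luck_weight are made-up parameters for illustration only.
    def ai_wins(skill_gap, luck_weight):
        ai_score = (1 - luck_weight) * skill_gap + luck_weight * random.random()
        human_score = luck_weight * random.random()
        return ai_score > human_score

    def win_rate(skill_gap, luck_weight, games=100_000):
        return sum(ai_wins(skill_gap, luck_weight) for _ in range(games)) / games

    for luck in (0.0, 0.5, 0.9, 1.0):
        print(f"luck_weight={luck:.1f}: AI win rate ~ {win_rate(0.2, luck):.2f}")

With luck_weight at 0 the stronger side wins every game; at 1 the result is a coin flip, which matches the "50%" limit above.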
But yes, I think you're right that you'd need a game where crunching a lot of numbers really fast doesn't give you an advantage. For instance, one where the state space of the game is so big that number crunching is useless, and other approaches like AI-style pattern matching (used, IIRC, by AlphaGo?) don't work either.
Though ultimately, what is the uniquely human trait that would allow a human to beat an AI? Can you make a game that depends on that? Is there even such a thing?
> if nothing you do in the game really matters, it's all just down to random chance... But such a game would probably not be particularly enjoyable to play.
I guess all the slot machine players tend to differ in opinion there :)
Sure, slot machine operators adjust winning chances so players keep on playing, but from the player's side, the outcome isn't influenced by anything they do, other than "just one more time and I'll win".
TBH, the "real world economy" does allow humans to rewrite the rules, and they've done so: issuing more bonds, deflating the currency, printing more money, bailing out broke banks, hiding facts and selling before a downturn goes public, pure and simple fraud...
I’m not saying a human would be better than an AI. Just saying it would be a more equal playing field, since neither would have a significant advantage.
I think this depends on whether the human has a chance in a particular run (yes) or on average over many runs (probably not, because the AI will calculate the probabilities better).
But in the extreme case of a random game (like rolling the highest number on a die) they are equal (obviously).
>I would like to hear your reasoning as to why you think that a human is especially good at dealing with uncertainty.
We can ask any successful CEO. Fortune 500 companies would use bots if that was possible.
Uncertainty doesn't equal randomness. Randomness is flipping a coin and asking you the result. Uncertainty is hiding the coin behind my back and asking you in which hand is it.
That's disanalogous to board games. We're comparing board games with uncertainty to board games without uncertainty. In either of these categories, the thing that makes AI competent is unlimited training data due to self-play.