We already have true AI - it's a vast, thriving industry.
Or how about calling the products of that vast, thriving industry "weak AI," or "clever algorithms," which is what they really are. The original definition of AI was what we now call strong AI; only after some lesser problems were solved without actually creating strong AI did we need a name for those partial successes.
You know, if you're at the point where you can give a human-readable spec of the problem and the AI can make a passable attempt at it, that's basically the Turing Test -- which is why I think it deserves its status as the holy grail. Something that passed would really give the impression of "there's a ghost inside here".
The problem is that fundamentally all our AI techniques are heavily data-driven. It's not clear what sort of data to feed in to represent good/bad algorithm design.
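To make the "data-driven" point concrete, here is a minimal sketch (all names and data are hypothetical illustrations): a nearest-centroid classifier is defined entirely by its labelled training examples, so without a sensible way to encode "good vs bad algorithm design" as feature vectors, there is nothing to train on.

```python
# Minimal sketch: a nearest-centroid classifier. The model is nothing
# but a summary of its labelled training data -- change the data and
# you change the model. Data and labels here are made up.

def train(examples):
    """examples: list of (features, label). Returns per-label centroids."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [s / counts[label] for s in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Pick the label whose centroid is closest (squared distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, features))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Toy labelled points standing in for whatever features one would
# (somehow) extract from an algorithm's design.
data = [([1.0, 1.0], "good"), ([1.2, 0.9], "good"),
        ([5.0, 5.0], "bad"), ([4.8, 5.2], "bad")]
model = train(data)
print(predict(model, [1.1, 1.0]))  # → good
print(predict(model, [5.1, 4.9]))  # → bad
```

The classifier works fine on toy points; the hard part the comment identifies is upstream of any learning algorithm: nobody knows what those feature vectors should contain for something like algorithm quality.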