I don't think anybody from the SIAI is disagreeing with you here. The kind of prediction you're talking about isn't part of the intelligence explosion hypothesis. A superintelligence doesn't need literal omniscience, or even anything close to it, in order to be much, much more effective than humans at achieving its goals.