I think both swipe and pinch are easy to explain, and the author just misses the mark.
The device establishes a simple intuition: while your finger is on the screen, you expect whatever is under it to follow your finger until you lift it.
Within that intuition, the designer's job is to figure out which gestures are physically comfortable on a flat screen and which tasks the user needs to accomplish, and to come up with interactions that suitably bridge the two.
So it doesn't have anything to do with book reading. In fact, that framing misleads readers. In reality, designers establish a simple primitive intuition/mapping and build on top of it. Swipe and pinch are pretty much the only two gestures you can work with on a flat screen, and pinch is obviously zooming (per the above), so there was no other way to do page turning.
(If you're like me, you've probably noticed how this intuition breaks when you try to pinch or drag something only to find it's immovable. It's also why scroll lists let you scroll past the end and rubber-band back: the content keeps tracking your finger, just with resistance, so the intuition is never violated.)
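A rough sketch of that overscroll behavior, in TypeScript. The damping formula is the rubber-band curve commonly attributed to iOS; the function names, the 0.55 constant, and the viewport size are my own illustration, not any platform's actual API:

```typescript
const RUBBER_BAND = 0.55; // damping strength; ~0.55 is the oft-cited iOS-like value (assumed)
const VIEWPORT = 600;     // visible height in px (assumed for this sketch)

// Past the edge, content still moves under the finger, just with
// diminishing returns, so the drag never feels ignored.
function rubberBand(overshoot: number, viewport: number): number {
  return (1 - 1 / ((overshoot * RUBBER_BAND) / viewport + 1)) * viewport;
}

// Map a drag to a content offset for a list of height `contentHeight`.
function offsetForDrag(
  startOffset: number,
  dragDelta: number,
  contentHeight: number,
): number {
  const raw = startOffset + dragDelta; // 1:1 tracking while inside bounds
  const minOffset = Math.min(0, VIEWPORT - contentHeight);
  if (raw > 0) {
    return rubberBand(raw, VIEWPORT); // dragged past the top
  }
  if (raw < minOffset) {
    return minOffset - rubberBand(minOffset - raw, VIEWPORT); // past the bottom
  }
  return raw;
}

// Dragging 300px past the top moves the content only ~129px,
// and each additional pixel of drag moves it less and less.
console.log(offsetForDrag(0, 300, 2000).toFixed(0)); // ≈ 129
```

The point of the curve is exactly the intuition above: inside the bounds the mapping is 1:1, and at the edge it degrades smoothly instead of snapping to zero, so the screen never appears to stop listening to your finger.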