They note that the model has 12B parameters, which in terms of order of magnitude puts it right between GPT-2 and GPT-3 (1.5B and 175B, respectively). With some tricks you can run GPT-2 on good personal hardware, so this might be reachable as well with the latest hardware.
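To give a rough sense of why (my own back-of-envelope math, not from the paper): weight memory scales with parameter count times bytes per parameter, so quantization is the main "trick" that pulls 12B into consumer-GPU range:

```python
# Rough VRAM needed just to hold the weights at various precisions.
# Activations and KV cache add overhead on top of these figures.
models = {"GPT-2": 1.5e9, "12B model": 12e9, "GPT-3": 175e9}
bytes_per_param = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

for name, n_params in models.items():
    sizes = ", ".join(
        f"{prec}: {n_params * b / 2**30:.1f} GiB"
        for prec, b in bytes_per_param.items()
    )
    print(f"{name} -> {sizes}")
```

At fp16 a 12B model wants ~22 GiB for weights alone, which just fits a 24 GB consumer card; at int8 or int4 it's comfortable, while GPT-3 at ~326 GiB fp16 stays out of reach.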
EDIT: I'm assuming you mean for inference; for training it would be another kind of challenge entirely, and the answer would be a clear no.