This article is sponsored by AWS
It would have been really nice if the sponsorship disclaimer had been at the top of the article rather than at the very end, although something already started to smell fishy about a quarter of the way through the text.
Obviously, one can use the cloud for HPC. In some situations, mostly those involving very spiky demand and very granular budget controls, it even makes economic sense. In many other cases, it does not.
For an academic researcher, the killer argument against putting all your eggs in the cloud basket is very simple: what happens when my current grant runs out and I have spent all my money on the cloud? To have a realistic chance of getting another grant, I need to show some preliminary results, and to do that I need some facility. The cloud must be paid for up front, while with an in-house facility it is usually possible to defer the running costs for a while (though obviously not forever). What happens to my data if it is in the cloud and my grant runs out, so that I can no longer pay the ransom upkeep money? And what happens with the commercially-licensed software I might need, which is typically much more expensive for off-premises use?
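The grant-cycle arithmetic behind this argument can be sketched as a toy break-even calculation. Every figure below is a made-up placeholder for illustration, not a real AWS or hardware price:

```python
# Toy break-even sketch: cloud pay-as-you-go vs. an in-house cluster.
# All figures are hypothetical placeholders, not real prices.

CLOUD_MONTHLY = 8_000        # assumed cloud bill per month of steady use
ONPREM_CAPITAL = 150_000     # assumed upfront hardware cost
ONPREM_MONTHLY = 2_500       # assumed power/cooling/admin cost per month

def cumulative_cost(months: int) -> tuple[int, int]:
    """Return (cloud, on_prem) total spend after `months` of steady use."""
    cloud = CLOUD_MONTHLY * months
    on_prem = ONPREM_CAPITAL + ONPREM_MONTHLY * months
    return cloud, on_prem

def break_even_month(horizon: int = 120):
    """First month at which steady cloud use costs more than owning.

    Returns None if the cloud stays cheaper over the whole horizon.
    """
    for m in range(1, horizon + 1):
        cloud, on_prem = cumulative_cost(m)
        if cloud > on_prem:
            return m
    return None

print(break_even_month())  # with these placeholder numbers: month 28
```

With steady, sustained utilisation the upfront capital cost is amortised fairly quickly; the cloud's pay-as-you-go model only wins when usage is spiky enough that the hardware would otherwise sit idle, which is exactly the niche described below.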
The bottom line is that yes, there are some very important HPC niches the cloud could fill, but that's what they are: the niches. Or, as Sir Terry would put it, the nitches.