I'm sure it uses some very cherry-picked numbers.
Let's say a query uses 3000 watts for 10 seconds. That's probably within an order of magnitude: a 300 W consumer GPU can do quite a lot in 10 s, but the bigger models will be using rather more hardware.
If every bit of that 0.00833 kWh went into boiling some water, you could vaporize about 13 grams (water's latent heat of vaporization is about 2,257 J/g), or roughly 1/40th of a 500 mL bottle of water.
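To make the arithmetic explicit, here's a quick Python sketch; the wattage and duration are the assumed figures above, and the latent heat is the standard value:

    # Back-of-envelope sketch; 3000 W for 10 s is the assumed query cost.
    POWER_W = 3000
    DURATION_S = 10
    LATENT_HEAT_J_PER_G = 2257        # heat to vaporize 1 g of water at 100 C

    energy_j = POWER_W * DURATION_S   # 30,000 J
    energy_kwh = energy_j / 3.6e6     # ~0.00833 kWh

    grams = energy_j / LATENT_HEAT_J_PER_G   # ~13.3 g
    print(f"{energy_kwh:.5f} kWh vaporizes ~{grams:.1f} g "
          f"(~1/{500 / grams:.0f} of a 500 mL bottle)")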
We then need to consider that the data center's cooling system will add about 1/3 more, and that producing that electricity in a 40%-efficient (coal) power station cooled entirely by evaporation would slightly more than double it again.
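A sketch of that worst-case chain, using the multipliers from the text (one reading of the estimate: the plant-side evaporation is applied only to the query's own 30 kJ, ignoring the cooling overhead's share of generation):

    base_g = 13.3                 # grams evaporated by the query itself
    dc_cooling_g = base_g / 3     # evaporative data-center cooling: ~1/3 more

    # A 40%-efficient plant rejects 0.6/0.4 = 1.5 J of heat per joule of
    # electricity; assume all of it is dumped via evaporative cooling.
    plant_g = base_g * (0.6 / 0.4)

    total_g = base_g + dc_cooling_g + plant_g   # ~37.7 g
    print(f"worst case: ~{total_g:.0f} g, about 2.1x the pre-generation total")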
But we should also consider that not all thermal power stations are that inefficient, that even in evaporative cooling setups significant heat is removed by conductive means, that most thermal power stations in the US West are air cooled, and that a lot of power will be coming from renewables. On a cowboy-calculus basis, you're back to 1/40th of a bottle or less.
One could argue that this doesn't include the energy to train the model, but amortizing that over all the queries made during the economic lifetime of the model probably doesn't add much more than 1x the energy used by a single query. I would say it shouldn't be counted against a single query anyway.
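As a purely illustrative amortization, with made-up round numbers (neither the training energy nor the query count below is a real figure):

    TRAINING_KWH = 10_000_000           # hypothetical: ~10 GWh to train
    LIFETIME_QUERIES = 10_000_000_000   # hypothetical: 10 billion queries served

    per_query_kwh = TRAINING_KWH / LIFETIME_QUERIES   # 0.001 kWh
    print(f"training adds ~{per_query_kwh / 0.00833:.2f}x a single query")

With those placeholder numbers the training share comes out well under 1x per query, consistent with the point above.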
In any case it isn't anywhere near using a bottle of water.