

A license is the legal instrument that makes open source software/hardware/silicon possible, describing precisely which rights are granted or retained. The term “open source” usually refers to the definition propounded by the Open Source Initiative (OSI), though it is sometimes used more loosely. At a minimum, an OSI-compliant open source license allows the software to be redistributed without seeking additional permission from the author, requires that distribution be accompanied by access to the source code, and does not prohibit the software’s use for particular endeavors or fields.
That last point is about the “use” of the software, and is a crucial distinction between “open source” and “source available”. Source-available means the source code can be examined, but typically not freely used, modified, or redistributed. An open source license explicitly allows all uses, though possibly with additional obligations. For example, the AGPL allows software to be used to run a server, but creates an obligation to offer the server’s source code to every user that connects. Something like the MIT No Attribution (MIT-0) license, by contrast, imposes zero additional obligations while allowing the broadest use. A license that satisfies both the OSI’s Open Source Definition and the Free Software Foundation’s free software definition is commonly called a FOSS (Free and Open Source Software) license.
The exact verbiage of a license is the domain of lawyers, since it is a legal document. But the choice of license is down to the software’s author or corporate owner, and is a multifaceted consideration, covering marketability, compatibility with other software, and whether it’s more important that the code gets used widely or that it forever remains open.
The latter is the major battleground for advocates of permissive versus copyleft licenses. Some software (eg reference implementations of cryptographic algorithms) has the priority that the absolute greatest number of people should use it, so a permissive license makes sense. Other software (eg the desktop 3D rendering suite Blender) has the priority that nobody should ever be able to take it private by adding proprietary-only features.
Choosing open source is easy, but choosing a license to effect that choice can get tricky. For authors publishing their software, the choice may very well change the course of history (eg Linux adopting GPL-2.0). For consumers or businesses using software, the license dictates how changes can be distributed.



Other commenters correctly describe the cost analysis for using evaporative cooling, but I’ll add one more reason why it’s the preferred method when water is available: evaporating water can dissipate truly outlandish amounts of heat with very few moving parts.
Harkening back to high school physics class, water – like all other substances – has a specific heat capacity, meaning the energy needed to increase the temperature of 1 kg of it by 1 degree C. Water’s is already quite high, at 4184 J/(kg*C), besting all the common metals and losing only to a handful of substances like hydrogen, helium, and ammonia. In nature, this is why large bodies of water are natural moderators of temperature: a lake can absorb an entire day’s worth of sunlight energy without its temperature changing substantially.
But where water really trounces the competition is its “heat of vaporization”. This is the extra energy needed for liquid water to become vapor; simply bringing water to 100 C is not sufficient to make it airborne. For water that value is about 2257 kJ/kg. Simplifying so that 1 kg of water is 1 liter of water, we can convert this unit into something more familiar: about 0.627 kWh/L.
What these two physical properties of water tell us is that if our city water comes out of the pipe at 20 C, then to get it to 100 C to boil, we need the temperature difference (80 C) times the specific heat capacity (4184 J/(kg*C)), which is 334,720 J/kg. Using the same simplification from earlier, that comes out to 0.093 kWh/L. And then to actually make the boiling liquid become a vapor (so that it’ll float away), we need a further 0.627 kWh/L on top of that.
Let that sink in for a moment: the energy to turn water into vapor (0.627 kWh/L) is nearly seven times the energy (0.093 kWh/L) needed to raise liquid water from 20 C to 100 C. That’s truly incredible for a non-toxic, life-compatible substance that we can (but should we?) safely release into the environment. Totaling the two values, one liter of water can dissipate about 0.72 kWh of energy. Nice!
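The arithmetic above is easy to sanity-check. Here’s a short sketch using the standard reference values quoted earlier (the 1 kg = 1 L simplification is the same one used above):

```python
# Back-of-envelope check of the sensible and latent heat figures.
C_WATER = 4184        # specific heat capacity of water, J/(kg*C)
H_VAP = 2_257_000     # heat of vaporization at 100 C, J/kg
KG_PER_L = 1.0        # simplification: 1 kg of water ~= 1 liter
J_PER_KWH = 3_600_000 # joules in one kilowatt-hour

# Sensible heat: raise 1 L of water from 20 C to 100 C
sensible_kwh = C_WATER * (100 - 20) * KG_PER_L / J_PER_KWH  # ~0.093 kWh/L

# Latent heat: boil that liter away at 100 C
latent_kwh = H_VAP * KG_PER_L / J_PER_KWH                   # ~0.627 kWh/L

total_kwh_per_liter = sensible_kwh + latent_kwh             # ~0.72 kWh/L
print(f"{sensible_kwh:.3f} + {latent_kwh:.3f} = {total_kwh_per_liter:.2f} kWh/L")
```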
In the context of a 100 megawatt data center (which apparently is what the industry considers the smallest “hyperscale” data center), if that facility used only evaporative cooling, the water requirement would be roughly 139,000 L/hour. That is an Olympic-size swimming pool (2.5 million liters) every 18 hours. Not nice!
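For anyone who wants to follow the division, a minimal sketch (assuming 0.72 kWh dissipated per liter and the standard 2,500,000 L Olympic pool volume):

```python
# Water draw for a 100 MW facility cooled purely by evaporation.
POWER_KW = 100_000       # 100 MW, expressed in kW (so kWh per hour)
KWH_PER_LITER = 0.72     # total sensible + latent heat per liter
POOL_LITERS = 2_500_000  # nominal Olympic-size swimming pool

liters_per_hour = POWER_KW / KWH_PER_LITER      # ~139,000 L/hour
hours_per_pool = POOL_LITERS / liters_per_hour  # ~18 hours per pool
print(f"{liters_per_hour:,.0f} L/hour -> one pool every {hours_per_pool:.0f} hours")
```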
And AI datacenters are only getting larger, with some reaching into the low single digits of gigawatts. But what is the alternative for cooling even the more modest data center from earlier? The reality is that the universe only provides three forms of heat transfer: conduction, convection, and radiation. The heat from data centers cannot be concentrated into a laser and radiated into space, and we don’t have some underground granite mountain that the data centers can conduct their heat into. Convection is precisely the idea of loading the heat into a moving substance (eg water, air) and then jettisoning that substance.
So if we don’t want to consume water, then we have to use air. But on the two qualities that make water an excellent substance for evaporative cooling, air doesn’t come close – a specific heat of about 1003 J/(kg*C) and no heat of vaporization to exploit, because air is already a gas. That means we need to move ungodly amounts of air to dissipate 100 megawatts. But humanity has already invented the means to do this, by a clever structure that naturally encourages air to flow through it.
The only caveat is that the clever structure is a cooling tower, most famously associated with nuclear power stations (non-nuclear plants use them too), where the generators are well into the gigawatt range. Note that the classic hyperboloid towers are usually wet towers that still evaporate water; a purely dry, air-only tower rejecting the same heat would be at least as imposing. Should AI datacenters raise nuclear-scale air cooling towers instead of evaporating water? It would work, but even as someone who is not anti-nuclear, I think the optics of erecting a cooling tower in rural America just to cool a datacenter would be untenable. And that’s probably why no AI datacenter has done it.
To be abundantly clear, I’d rather not have AI datacenters at all. But since the question was why water consumption is such a big deal, the best answer may be that it’s a physics problem: there isn’t any other readily available way to dissipate 100+ megawatts without building a 100+ meter tower, and water is always going to be cheaper and more on-hand than concrete.