Let's say a single desktop with an ATI x7xx (or better) graphics card (anything less isn't cost-effective for mining) consumes ~250 watts at idle (a reasonable estimate).
Start mining with it and consumption rises to ~350-550 watts per desktop; call it an average increase of 150 watts per desktop.
With 18,000 desktops, mining 8 hours per night, the increase is (8 hours x 150 watts x 18,000 desktops) / 1000 = ~21,600 kWh per night, or ~648,000 kWh per month.
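If you want to play with the numbers yourself, here is a quick sketch of the arithmetic above (all figures are this post's own rough estimates, not measured values):

```python
# Extra power draw from mining, using the post's estimates.
EXTRA_WATTS_PER_DESKTOP = 150   # assumed average increase while mining
DESKTOPS = 18_000
HOURS_PER_NIGHT = 8
NIGHTS_PER_MONTH = 30

kwh_per_night = EXTRA_WATTS_PER_DESKTOP * DESKTOPS * HOURS_PER_NIGHT / 1000
kwh_per_month = kwh_per_night * NIGHTS_PER_MONTH
print(kwh_per_night)   # 21600.0 kWh per night
print(kwh_per_month)   # 648000.0 kWh per month
```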
Now, as things stand (and this is a floating, quicksand value), 1 GH/s earns you ~2-2.5 BTC per month.
A 150-watt low-to-mid-range ATI card gives you ~100 MH/s, which you want to run at 50% load, or your offices will catch fire (and no, I am not kidding). So that's an optimistic ~50 MH/s per desktop (again, provided they are optimally equipped, i.e. ATI x7xx cards or better).
50 MH/s x 18,000 desktops is an aggregate 900 GH/s. At ~2 BTC per GH/s per month, that's roughly 1,800 BTC per month; BTC goes for $70-80 right now, say $75, so ~$135,000 of revenue per month, if you can somehow convert it to a usable (spendable) form.
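The same revenue estimate in code, again using only this post's assumed rates (hashrate per desktop, BTC yield per GH/s, and BTC price are all rough guesses):

```python
# Revenue sketch: ~50 MH/s per desktop (100 MH/s card throttled to 50%),
# ~2 BTC per GH/s per month, ~$75 per BTC. All assumptions from the post.
MHS_PER_DESKTOP = 50
DESKTOPS = 18_000
BTC_PER_GHS_PER_MONTH = 2.0     # the "floating, quicksand value"
USD_PER_BTC = 75

total_ghs = MHS_PER_DESKTOP * DESKTOPS / 1000
btc_per_month = total_ghs * BTC_PER_GHS_PER_MONTH
print(total_ghs)                    # 900.0 GH/s aggregate
print(btc_per_month)                # 1800.0 BTC per month
print(btc_per_month * USD_PER_BTC)  # 135000.0 USD per month
```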
Now take that $135,000 per month and subtract the cost of the 648,000 kWh (where I live, 1 kWh costs about 20 US cents, so $129,600 for electricity). That leaves $5,400 of profit per month, provided you have the godlike ability to convert that many bitcoins to real, usable currency at those rates.
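And the bottom line, plugging the two earlier results together (electricity price is my local rate, yours will differ):

```python
# Net result: monthly revenue minus the electricity bill at $0.20/kWh.
KWH_PER_MONTH = 648_000
USD_PER_KWH = 0.20
REVENUE_USD = 135_000

electricity_usd = KWH_PER_MONTH * USD_PER_KWH
print(electricity_usd)                # 129600.0 USD for electricity
print(REVENUE_USD - electricity_usd)  # 5400.0 USD net, best case
```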
So, under perfect-world conditions, yes, it breaks even, but barely.
Sorry if there were any crude errors, but you get the point.