Don't forget electrical costs. At $0.10 per kWh, a continuous 100 W load costs $0.24 per day (2.4 kWh over 24 hours), which works out to $7.20 per 30-day month or $87.60 per year. Scale up or down for your own rate and power draw (e.g., 120 W at $0.12/kWh = 1.2 × 1.2 = a 1.44x adjustment).
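For anyone who wants to plug in their own numbers, here's a small sketch of that arithmetic (the function name and stock figures are just for illustration):

```python
def electricity_cost(watts, usd_per_kwh, hours=24):
    """Cost in USD of running a continuous load of `watts` for `hours`."""
    return watts / 1000 * hours * usd_per_kwh

daily = electricity_cost(100, 0.10)   # 100 W at $0.10/kWh -> $0.24/day
monthly = daily * 30                  # $7.20 per 30-day month
yearly = daily * 365                  # $87.60 per year

# Adjustment factor for 120 W at $0.12/kWh relative to the baseline:
adjustment = (120 / 100) * (0.12 / 0.10)  # 1.44x
```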
Believe me, I don't. After electricity costs I make around $4 per day mining Bitcoin or Litecoin on 4 video cards across 2 gaming systems I rarely use. When I'm actually using my main gaming system it's slightly less.
Now add to this the waste heat vented into your house during the months you're cooling it.
Living in a colder climate, these costs roughly offset, though I have no hard numbers: the slightly higher electricity bill in the summer months is offset by savings on natural gas in the winter months.
+ the depreciation (and wear and tear) on the computer hardware you've tied up mining Bitcoins
The goal is to maximize profit, not necessarily the number of bitcoins/litecoins mined. Thanks to the power curve of most cards, it's actually more profitable to slightly underclock the core and/or memory, which also minimizes wear and tear on the cards. The cards I've had since 2009 are still running and producing the same MH/s they always have.
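To see why underclocking can win, here's a toy model (all numbers are assumptions for illustration, not measurements): hash rate is taken to scale roughly linearly with core clock while power draw grows roughly quadratically, so revenue is linear in clock but electricity cost is not.

```python
def profit_per_day(clock_mhz, stock_mhz=900, stock_mhs=0.4, stock_watts=150,
                   usd_per_mhs_day=1.5, usd_per_kwh=0.10):
    """Hypothetical daily profit at a given core clock.

    Assumed model: hash rate ~ clock (linear), power ~ clock^2.
    Stock figures and payout rate are made up for the example.
    """
    scale = clock_mhz / stock_mhz
    mhs = stock_mhs * scale                       # revenue side: linear
    watts = stock_watts * scale ** 2              # cost side: superlinear
    revenue = mhs * usd_per_mhs_day
    cost = watts / 1000 * 24 * usd_per_kwh
    return revenue - cost

# Sweep clocks from 600 to 1000 MHz; the most profitable point
# lands below the 900 MHz stock clock under these assumptions.
best = max(range(600, 1001, 10), key=profit_per_day)
```

The exact optimum depends entirely on your card's real power curve and the coin's payout rate; the point is only that the maximum sits below stock when power grows faster than hash rate.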
Thanks to the difficulty rise from FPGAs and ASICs, many of the people still mining Bitcoin on GPUs are people who don't pay for their electricity. That pushed me out of Bitcoin profitability, but I'm still profitable on Litecoin, a similar cryptocurrency.
Even if there were no profit and I were just breaking even, I'd still do it: it gives my gaming machines a use, since I rarely game anymore but still like to sit down and play every couple of weeks.