Simulation studies of residential buildings in Seattle and other northern US heating climates generally find that about 50% of lighting energy (and computer energy) is useful in offsetting space heat on an annual basis. For every kilowatt-hour saved in lighting, heating energy increases by half a kilowatt-hour (or the heat equivalent if heating with gas, oil, or another fuel). Also, if the building has cooling (again in a northern US climate) there is a reduction in cooling equivalent to around 10% of the lighting change; that is, for every kilowatt-hour saved in lighting there is a further savings of 0.1 kilowatt-hour in cooling. In warmer climates the heating interaction gets smaller and the cooling interaction gets larger. In Florida there would be a large benefit from decreased cooling and almost no impact from the heating change.
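The arithmetic above can be sketched as a small calculation. The 50% heating penalty and 10% cooling bonus are the illustrative northern-climate factors from the paragraph; the function name and 1000 kWh example are my own for illustration.

```python
# Net-energy arithmetic for a lighting reduction in a northern US climate.
# Assumed factors (from the discussion above): 50% of saved lighting energy
# reappears as extra heating demand, and a further 10% of the lighting
# change is saved in cooling (if the building has cooling).

HEATING_PENALTY = 0.5   # fraction of lighting savings added back as heating
COOLING_BONUS = 0.1     # fraction of lighting savings also saved in cooling

def net_savings_kwh(lighting_savings_kwh, has_cooling=True):
    """Net annual energy change (kWh) from a given lighting reduction."""
    heating_increase = HEATING_PENALTY * lighting_savings_kwh
    cooling_decrease = COOLING_BONUS * lighting_savings_kwh if has_cooling else 0.0
    return lighting_savings_kwh - heating_increase + cooling_decrease

# Example: saving 1000 kWh/yr of lighting in a cooled northern building
# nets 1000 - 500 + 100 = 600 kWh/yr.
print(net_savings_kwh(1000))  # 600.0
```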
To figure the economics you need to factor in the difference between your electricity and heating fuel costs; this difference is often very significant.
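One way to see why the fuel-price difference matters: the lighting savings are valued at the electric rate, but the heating penalty is paid at the heating fuel price. The prices and the 80% furnace efficiency below are illustrative assumptions, not figures from the discussion.

```python
# Sketch of the cost comparison for a northern climate with gas heat.
# All rates and the furnace efficiency are assumed example values.

ELECTRIC_RATE = 0.12      # $/kWh of electricity (assumed)
GAS_RATE = 0.04           # $/kWh of gas fuel energy (assumed)
FURNACE_EFFICIENCY = 0.8  # fraction of fuel energy delivered as heat (assumed)

def net_dollar_savings(lighting_savings_kwh):
    """Net annual dollars saved from a lighting reduction (gas-heated home)."""
    # Lighting savings plus the 10% cooling bonus are saved at the electric rate.
    electric_saved = 1.1 * lighting_savings_kwh * ELECTRIC_RATE
    # Half the saved lighting heat must be replaced by the furnace,
    # burning extra fuel at the furnace's efficiency.
    extra_fuel_kwh = 0.5 * lighting_savings_kwh / FURNACE_EFFICIENCY
    return electric_saved - extra_fuel_kwh * GAS_RATE
```

With these example rates, saving 1000 kWh of lighting is worth about $132 in electricity but costs about $25 in extra gas, for roughly $107 net. Because electricity usually costs several times as much per kWh as gas heat, the lighting savings still dominate, but less than a naive calculation would suggest.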
Seasonally it varies as one would expect. The "waste" heat is nearly 100% useful in the winter and near 0% in the summer (northern climate).
All of this assumes a small amount of electric use relative to the heat loss of the space, and that the space is a home or small office. If one is operating more than one computer in a small room, or two or three in a larger one, then the available heat is likely to be more than the space requires. Likewise if the computers are located in a warm climate. In any of these cases the winter utilization can approach zero, and if your cooling equipment is running the effect will even be negative (it increases the cooling load). Larger work settings with multizone heating/cooling systems are completely different and difficult to generalize, but basically the cooling reduction is very important.
A good rule of thumb might be: if your heater is operating during the day, then the computer heat is useful. If your heater is not operating at all, then the utilization of the computer heat is somewhere between 0 and 50%.
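That rule of thumb can be sketched as a tiny helper that returns a rough utilization range. The exact range bounds for the "heater runs during the day" case are my assumption; the 0-50% range for the other case comes from the rule above.

```python
# Rule-of-thumb estimate of how much computer waste heat offsets heating.
# Returns a (low, high) fraction rather than a single number, since the
# rule only gives rough bounds. The 0.5-1.0 range for a daytime-running
# heater is an assumed reading of "the computer heat is useful".

def heat_utilization_range(heater_runs_during_day):
    """Approximate useful fraction of computer waste heat, as (lo, hi)."""
    if heater_runs_during_day:
        return (0.5, 1.0)  # waste heat is directly displacing heater output
    return (0.0, 0.5)      # heat only sometimes coincides with a heating need
```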