The heat loss varies with the difference between the inside temperature and the outside temperature. So Hometech and John.g are looking through opposite ends of a lens but effectively seeing the same thing (what kind of mixed metaphor is that?).
I've worked it out this way, so feel free to correct me if I've made a mistake somewhere... To take John's example, if the external temperature is 13°C, a room at 22°C is 9 degrees above it, whereas a room at 18°C is only 5 degrees above it. Heat losses for the warmer room will be 80% greater, i.e. if it takes 1000W to maintain 18°C, it will take 1800W to maintain 22°C. There is a linear relationship between heat lost and the temperature difference: in this example, 200W per degree above the external ambient.
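The arithmetic above can be sketched in a few lines (the 200W/degree figure is just the post's own numbers: 1000W ÷ 5 degrees):

```python
# Sketch of the linear heat-loss arithmetic, using the figures from the post.
T_EXTERNAL = 13.0        # °C, John's example external temperature
LOSS_PER_DEGREE = 200.0  # W per degree of inside/outside difference (1000 W / 5 K)

def heat_loss(t_room: float) -> float:
    """Heat loss in watts, assumed linear in the inside/outside difference."""
    return LOSS_PER_DEGREE * (t_room - T_EXTERNAL)

print(heat_loss(18.0))  # 1000.0 W to hold 18 °C
print(heat_loss(22.0))  # 1800.0 W to hold 22 °C, i.e. 80% more
```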
This is consistent with heat loss calculations (radiator output is the heat lost from the emitter into the room), which are based on building elements each having a heat loss expressed in W/m²K (K = the kelvin difference between inside and outside).
Going back to post 9, I therefore don't follow the example of a radiator at T30 having to be 2.4x the size; I make it 1.67x. Divide the output of a rad rated at T50 by 50 (the temperature difference between the mean emitter temperature and the room temperature to be maintained) to get output per degree; the same rad at T30 would then have:
T50 rated output ÷ 50 × 30 = 60% of the T50 output
So if we want, say, 1000W output from a radiator run at T30, surely we need a radiator rated at 1667W (T50)? 60% of 1667W is 1000W.
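Putting that linear logic into a sketch, to show where the 1667W figure comes from:

```python
# The post's linear scaling logic: output proportional to delta-T.
def linear_output(rated_t50: float, delta_t: float) -> float:
    """Radiator output at a given delta-T, assuming strictly linear scaling
    from the T50 rated output (the post's assumption, not the manufacturers')."""
    return rated_t50 / 50.0 * delta_t

print(linear_output(1000.0, 30.0))  # 600.0 W, i.e. 60% of the T50 rating
print(linear_output(1667.0, 30.0))  # ~1000 W from a 1667 W (T50) rad run at T30
```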
The above logic conflicts with radiator manufacturers (e.g. Stelrad), who claim a radiator run at T30 will have an output of 52% of the same radiator run at T50, and who would therefore select a 1923W model rather than the 1667W one my logic dictates. We cannot both be correct.
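For what it's worth, the manufacturers' 52% figure can be reproduced if radiator output is not linear in delta-T but follows a power law, output ∝ ΔT^n, with an exponent of roughly 1.3 for panel radiators (the correction used in EN 442-style ratings). A sketch, with n = 1.3 as an assumed typical value:

```python
# Reproducing the manufacturers' ~52% correction factor, assuming a power-law
# relationship between output and delta-T (exponent n ~ 1.3 is an assumption,
# typical for panel radiators; individual models vary).
N = 1.3

def power_law_output(rated_t50: float, delta_t: float) -> float:
    """Radiator output at a given delta-T from its T50 rating, power-law scaling."""
    return rated_t50 * (delta_t / 50.0) ** N

factor = power_law_output(1.0, 30.0)  # ~0.515, close to Stelrad's quoted 52%
required = 1000.0 / factor            # ~1943 W (T50) needed for 1000 W at T30
print(factor, required)
```

On these assumptions a rad at T30 gives about 51-52% of its T50 output, which would put the required T50 rating in the region of Stelrad's 1923W selection rather than 1667W.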