You could try to reason your way through this one: efficient commercial A/C units have SEER ratings of, let's say, 25. That means they can expel heat from buildings using about 0.04 W per (Btu/hr of cooling afforded). That compares, incidentally, with ~0.3 W per Btu/hr for electrical resistance heating. Also, buildings in cities are kept at roughly 65 F. If winter conditions are around 25 F, that's a 40 F gradient driving heat out passively. Whereas in summer, at 90 F, there's only a 25 F gradient for heat leaking into the building, plus any internal heat (from people, lighting, etc.) that is expelled actively at that high SEER.
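If you want to check that arithmetic yourself, here's a minimal sketch. The SEER figure (25), the 65 F setpoint, and the 25 F / 90 F outdoor temperatures are just the illustrative assumptions from above, not measured data:

```python
# Rough arithmetic behind the efficiency comparison above.
# Assumed inputs: SEER-25 unit, 65 F indoor setpoint, 25 F winter / 90 F summer.

SEER = 25  # Btu/hr of cooling delivered per watt of electricity consumed
watts_per_btu_hr_cooling = 1 / SEER
print(f"A/C electrical input: {watts_per_btu_hr_cooling:.3f} W per Btu/hr removed")

# Resistance heating converts electricity to heat 1:1; 1 W = 3.412 Btu/hr.
watts_per_btu_hr_resistance = 1 / 3.412
print(f"Resistance heating: {watts_per_btu_hr_resistance:.2f} W per Btu/hr delivered")

indoor = 65.0   # F, assumed setpoint
winter = 25.0   # F, assumed winter outdoor temperature
summer = 90.0   # F, assumed summer outdoor temperature
print(f"Winter gradient (heat leaks out): {indoor - winter:.0f} F")
print(f"Summer gradient (heat leaks in):  {summer - indoor:.0f} F")
```

The point of the comparison: in winter, every watt of resistance heat ends up outdoors across a 40 F gradient, while in summer the A/C moves each Btu/hr outdoors for only ~0.04 W of electrical input.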
But, all that "stuff" misses even bigger effects:
(1) Diurnal variation is MUCH larger than the net average effect -- the urban vs. rural temperature difference peaks at night, as buildings release the heat they stored during the day.
(2) Seasonal variation of the effect depends heavily on city location, though it's generally larger in winter. Big contributors here: cities melt (or clear) their snow, which lowers their albedo and increases the sunlight energy they absorb; the geometry of the city (street canyons trapping sunlight); and maybe also the age of the buildings (newer, reflective-painted roofs absorb far less sunlight).
So, there's no absolute rule, but on the whole you'd see the largest urban/rural difference at night and in winter.
-- Cheers, -- Mr. d.