Conductive heat transfer might be the same (not counting the floor being colder), but radiative heat transfer will be significantly higher, since it scales with the 4th power of temperature.
Radiative heat transfer is almost always the smallest of the three main mechanisms (conduction, convection, and radiation - and convection is pretty well just another kind of conduction, except that one or more of the media is moving), even to the point of disregarding it completely in a lot of scenarios here on Earth. (In space... not so much.)
If your walls are reasonably well-insulated, and you're not surrounded by windows everywhere, then your environment is going to be irradiating you with only slightly less EM radiation than you're emitting back, so the net energy transfer is still very low. The convective heat transfer between you and the air in the room is going to far exceed anything you'll get from radiation.
Yes, you're dealing with the 4th power on the temperature differential, but it's also being multiplied by a very tiny constant.
Edit: Redacted. See the bottom of this post, where I decide to be slightly less stupid.
If your walls are uninsulated, and they're quite terribly frigid, there would still only be a difference of 40°C between you and your walls. (In which case, you'd have ice crystals forming on your interior walls, or else you've got one hell of a nightmarishly-fatal fever.)
Put that up against Stefan's Constant and the relatively small surface area your body presents, and there's not much heat to be lost to radiation.
P = emissivity * Stefan's Constant * body surface area * ΔT⁴
Emissivity is going to be less than 1, but what the heck, call it 1.
P = 1 * 5.67*10^-8 W/(m²·K⁴) * 2 m² * (40 K)⁴
P = 0.29 W
Not a heck of a lot of power loss there. Now, assuming that the air temperature is the same in summer or winter, you will indeed incur more losses in winter due to radiation. But it looks like it's not going to be a whole heck of a lot. Keep in mind too that you're (probably) not walking around naked, so your body's full surface area is not being directly exposed to your cold walls.
And I'd also hope that your walls have better insulation than that, unless you live in a steel shack.
If you're outside in an open field though, then you've got a nice view of the sky, which presents a much lower temperature as far as a thermal analysis is concerned. Checking... values for a clear night sky seem to be a bit scattered, though I seem to remember something around -70°C being used for an example problem a while ago, so let's try that. (It's not the much lower temperature of the Universe's background radiation, since the warmer atmosphere emits its own EM radiation back at you.)
So now you'd be talking about a difference of around 107 K.
That'd be close to 15 W, assuming you're standing around naked outside, showing off what you've got to Google Earth. And as your skin cools from convection with the air, you'll lose less heat to radiation.
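If anyone wants to poke at the arithmetic, here's a quick Python sketch of the numbers as I ran them above. Fair warning: it uses the ΔT⁴ form from my formula up there, which the edit at the bottom corrects - this is just to show where the 0.29 W and ~15 W figures came from. The 2 m² area and emissivity of 1 are the same assumptions as above.

```python
# Quick check of the numbers above, using the (incorrect) delta-T^4 form.
# The edit at the bottom of the post corrects the formula itself.
SIGMA = 5.67e-8   # Stefan's Constant, W/(m^2*K^4)
AREA = 2.0        # assumed body surface area, m^2
EMISSIVITY = 1.0  # skin is really a bit below 1, but close enough

def radiated_power_naive(delta_t_kelvin):
    """The delta-T^4 version: looks plausible, isn't the Stefan-Boltzmann law."""
    return EMISSIVITY * SIGMA * AREA * delta_t_kelvin ** 4

print(radiated_power_naive(40))   # 40 K colder walls      -> ~0.29 W
print(radiated_power_naive(107))  # 107 K colder night sky -> ~14.9 W
```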
Edit: Ok, let me be less stupid here. (That's what I get for not writing out the equation properly.) Some of those numbers were bothering me, so here we go again, with better (correct) math this time:
P = emissivity * Stefan's constant * area * (T_body⁴ - T_walls⁴)
P = 1 * 5.67*10^-8 W/(m²·K⁴) * 2 m² * ((310 K)⁴ - (270 K)⁴)
P = 444.62 W
...yeah, ok. So I hope your walls are well-insulated.
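For good measure, here's the corrected math as a Python sketch, this time with the actual Stefan-Boltzmann law (difference of 4th powers, not 4th power of the difference). Same assumed 2 m² area and emissivity of 1 as before; the 203 K is the -70°C clear-sky estimate from above.

```python
# The corrected math: net radiative loss via the Stefan-Boltzmann law.
SIGMA = 5.67e-8   # Stefan's constant, W/(m^2*K^4)
AREA = 2.0        # assumed body surface area, m^2
EMISSIVITY = 1.0  # assumed; real skin emissivity is a bit lower

def radiated_power(t_body_kelvin, t_surroundings_kelvin):
    """Net radiative heat loss in watts, both temperatures in kelvin."""
    return EMISSIVITY * SIGMA * AREA * (t_body_kelvin ** 4 - t_surroundings_kelvin ** 4)

print(radiated_power(310, 270))  # 37°C body vs -3°C walls      -> ~444.6 W
print(radiated_power(310, 203))  # 37°C body vs -70°C clear sky -> ~855 W
```

(Which also means the ~15 W clear-sky figure above was way low - run through the correct formula, it's more like 855 W. All the more reason to keep your clothes on.)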

Guess how often I also screwed up things like that in college. :\
And apparently I've forgotten quite a bit of my Heat Transfer class. Sucks how much is lost when it's not used for a few years.
Yes, it does indeed look like radiation transfer is a significant factor in summer/winter temperature preferences.