• teh_shame@infosec.pub

    Condensation would be a huge problem. Put a cold glass out on a humid day and it collects a lot of condensation. Now imagine it’s a radiator dripping on your floor

    • PetDinosaurs@lemmy.world

      Lots of correct answers here, but this is probably the best.

      The real benefit of AC is dehumidification. That’s the most important part. The cooling is a side effect of that, which also improves comfort.

      There’s a very good reason that people say “but it’s a dry heat”.
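      To put a rough number on the humidity side: a big chunk of an A/C’s work goes into condensing water out of the air. A quick sketch in Python (the 10 L/day figure is a made-up example, and 2450 kJ/kg is the approximate latent heat of vaporization of water near room temperature):

      ```python
      # Rough latent-load estimate: energy an A/C spends just condensing
      # moisture out of the air (all values approximate / illustrative).
      LATENT_HEAT_KJ_PER_KG = 2450   # latent heat of vaporization near room temp

      liters_condensed = 10          # hypothetical humid-day condensate from one unit
      energy_kj = liters_condensed * LATENT_HEAT_KJ_PER_KG   # 1 L of water ~ 1 kg
      energy_kwh = energy_kj / 3600

      print(f"Condensing {liters_condensed} L of water takes ~{energy_kwh:.1f} kWh of cooling")
      # -> ~6.8 kWh, before any actual air-temperature drop
      ```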

      • XeroxCool@lemmy.world

        I’ve been in 115°F dry heat in Paso Robles, California. I was comfortable outside until my feet started burning through my shoes. Totally unexpected, being from a place where the common 90°/90% days just knock me out

        • PM_Your_Nudes_Please@lemmy.world

          A few years ago, I had an argument about this with a buddy from Arizona. He was claiming that his 115° heat was worse than my Dallas 105° heat. I pointed out that his was at like 20% humidity, while Dallas was at like 70-80%. He didn’t believe me. Swore up and down that it wasn’t as bad.

          Then he flew in to visit for a week, and got heat exhaustion on the second day of his trip. He went to an amusement park when it was like 105° and humid. He originally wanted me to tag along too, since I live in the area. I told him he was crazy, and that he shouldn’t go. He called me a pussy and went. A few hours later, I get a phone call asking me to come pick him up, because he’s so hot that the on-site EMTs don’t trust him to drive back to his hotel.

          He hasn’t talked shit about humid heat ever since. Wet bulb thermometers don’t lie.
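          For anyone curious, here is a sketch of those two days using the Stull (2011) wet-bulb approximation (valid roughly for 5–99% RH and -20 to 50°C at sea-level pressure; the RH values are the ones from the story above):

          ```python
          import math

          def wet_bulb_c(temp_c: float, rh_pct: float) -> float:
              """Stull (2011) wet-bulb temperature approximation."""
              return (temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
                      + math.atan(temp_c + rh_pct)
                      - math.atan(rh_pct - 1.676331)
                      + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
                      - 4.686035)

          def f_to_c(f): return (f - 32) * 5 / 9
          def c_to_f(c): return c * 9 / 5 + 32

          # Arizona-style dry heat vs. Dallas-style humid heat
          for label, temp_f, rh in [("115F / 20% RH", 115, 20), ("105F / 70% RH", 105, 70)]:
              wb = c_to_f(wet_bulb_c(f_to_c(temp_f), rh))
              print(f"{label}: wet-bulb ~{wb:.0f}F")
          # -> ~80F for the dry case vs ~96F for the humid one: the "cooler" day
          #    is far more dangerous, because sweat barely evaporates.
          ```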

      • Today@lemm.ee

        I worked at a golf course in Florida in college. People would come from much hotter places and start out walking. After 9 holes (or less) they were begging for a cart.

      • owatnext@lemmy.world

        Spent the day in Alamogordo, New Mexico visiting White Sands desert. It was 110°F and basically no humidity. I stayed hydrated, wore a hat and sunnies and was fine. Now, where I live it is like 85-90° and 90% humidity. I feel worse in that when sitting in the shade than I did in the middle of a literal desert.

    • alvvayson@lemmy.world

      This is the most important answer. Radiators are actually used in combination with heat pumps to cool by up to a few degrees Celsius in a non-condensing mode.

      The problem is, it isn’t really effective. To really cool down, the radiators would need to get properly cold, but that requires cold water, which leads to condensation everywhere: in the radiator, but also around the piping in wall cavities, where it will feed mold growth.

      A/Cs don’t have this problem because most of the piping doesn’t get cold; only the heat exchanger (the indoor fan-coil unit) gets very cold, and its condensate gets captured in a drip tray and pumped away.
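      The limit for “non-condensing” is the dew point, which you can estimate with the Magnus approximation (a standard formula; the 25°C / 60% RH room is just an example):

      ```python
      import math

      def dew_point_c(temp_c: float, rh_pct: float) -> float:
          """Magnus approximation for dew point (Sonntag coefficients)."""
          a, b = 17.62, 243.12
          gamma = a * temp_c / (b + temp_c) + math.log(rh_pct / 100)
          return b * gamma / (a - gamma)

      # Hypothetical muggy summer room: 25 C at 60% RH
      td = dew_point_c(25, 60)
      print(f"Dew point: {td:.1f} C")
      # -> ~16.7 C: chilled water below this sweats on the radiator and pipes,
      #    which is why non-condensing radiant cooling only buys a few degrees.
      ```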

      • Natanael@slrpnk.net

        Here in Sweden we are now building out “remote cooling” (as the direct translation would be), in addition to the decades-old remote-heating infrastructure we already have. It’s literally dedicated warm-water and cold-water lines running from a central location in the city (usually a heat plant, often burning garbage, and now also central chillers) to various buildings. They’re properly insulated all the way and connected to the central heating system in each building.

        A building manager can hook it up to their general ventilation / A/C system to increase both heating and cooling capacity, often much more cost-effectively than using electricity locally for the same capacity. Remote heating is already hooked up to radiators.

      • rufus@discuss.tchncs.de

        There are radiators with fans that blow air through them to mitigate the condensation. That seems to work. It’s obviously more expensive than normal radiators, but you can buy them and connect them to a heat pump that supports that.

    • Cheems@lemmy.world

      Just put a drip tray on the floor with a heating element that evaporates the condensation.

      • sploosh@lemmy.world

        Just pump it outside. There’s no reason to dump a kilowatt or more into a heater strip when removing moisture from the air on a hot day makes our sweat work better, cooling us more efficiently.

    • Zippy@lemmy.world

      I did this exact thing with my boiler forced-air system. I piped it so that in the summer, when watering the lawn, all the cold water would go through the coil. It worked quite well for chilled air, but I’m sure the condensation would have destroyed the coil, and mold would rapidly have become a problem. I could easily produce liters of water a day. It was not viable.

  • nottheengineer@feddit.de

    Because cold water isn’t free. If you want to create something cold, you want to be using a compressor, and at that point you can just skip the water step and use an AC.

    • OptimusPhillip@lemmy.world

      Yeah, if you want a single system for heating and cooling, you’d be better off getting a heat pump. It’s the most energy efficient thing for both anyway, from what I’ve been told.

      • PM_Your_Nudes_Please@lemmy.world

        It’s the most energy efficient because you’re not using the power to produce heat; you’re just moving heat from A to B. Imagine a heating coil that is 100% efficient: for every watt of power you put in, you get one watt of heat. Now imagine being able to move heat from outside instead. For every watt you put into the system, you can move two watts of heat into the room. It’s not using the energy to create heat, so it can actually beat anything that is made to produce heat.

        The issue with heat pumps is that they need heat in the outside air to actually be able to pump indoors. As temperatures get lower and lower outside, they become less efficient at heating your house because there is less heat outside to extract. At a certain point, it becomes more efficient to just use the power to produce heat directly, instead of trying to pump it around.

        Most of the world doesn’t ever need to worry about that, but it can be a consideration in particularly cold areas. The tipping point for efficiency is usually around 0-10°F, so it’s not something that equatorial areas need to worry about. But up north, it becomes more and more of a consideration.
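        A toy illustration of why the efficiency falls off, using the idealized (Carnot) limit on heating COP; real units only reach a fraction of these numbers:

        ```python
        # Idealized (Carnot) heating COP: an upper bound on watts of heat moved
        # per watt of electricity. Real units achieve maybe a third of this.
        def carnot_heating_cop(indoor_c: float, outdoor_c: float) -> float:
            t_hot = indoor_c + 273.15   # Kelvin
            t_cold = outdoor_c + 273.15
            return t_hot / (t_hot - t_cold)

        for outdoor in (10, 0, -10, -20):
            print(f"outdoor {outdoor:>3} C: Carnot COP = {carnot_heating_cop(21, outdoor):.1f}")
        # COP shrinks as the outdoor temperature falls; resistive heat is always
        # COP 1, so below some temperature a real heat pump stops being worth it.
        ```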

        • IchNichtenLichten@lemmy.world

          Some newer ones can operate down to -22°F. I’m in a place that hits those kinds of temps, so I’d want a wood stove as a backup. I guess a ground-source heat pump might be a better fit around here.

          • PM_Your_Nudes_Please@lemmy.world

            For what it’s worth, heat pump manufacturers know this, and usually include a backup way to generate heat. My parents use a heat pump system, and it has an electric resistance heater that only turns on when the outside temperature drops below whatever the efficiency threshold is. Resistance heating elements are cheap and easy to build, so they’re not difficult to include in an existing heat pump setup.

  • FuglyDuck@lemmy.world

    It would be called a heat pump.

    And yes, they’re used all the time. Modern heat pumps are more or less radiators with fans through which liquid is pumped. Air blown over is either heated (hot liquid) or cooled (chilled liquid).

    In the US, you’ll see them more in commercial buildings than in residential homes. They’re quite a bit more efficient, since moving heat as liquid beats moving air the same distance.

    • m0darn@lemmy.ca

      I think OP is talking specifically about hot water radiators.

      He is asking why those rads specifically can’t have cold water piped through them in the summer.

      I think the answer for that is: they weren’t designed to manage the condensation that occurs when the radiator (or pipe) temperature is lower than the dew point. Also, hot water can be a lot further above room temperature than cold water can be below it. And a lot of radiators are actually supplied with steam, not hot water, which also lets them use the latent heat of condensation of the steam for even more heat transfer.
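      To put rough numbers on that last point (approximate textbook values; the 80°C → 60°C drop is just an example):

      ```python
      # Why steam radiators punch above their weight: heat delivered per kg of
      # working fluid (approximate values).
      H_FG_STEAM = 2257          # kJ/kg released when steam condenses at 100 C
      CP_WATER = 4.186           # kJ/(kg*K), specific heat of liquid water

      steam_kj = H_FG_STEAM               # 1 kg of steam condensing
      water_kj = CP_WATER * (80 - 60)     # 1 kg of hot water cooling 80 C -> 60 C

      print(f"1 kg condensing steam: ~{steam_kj} kJ")
      print(f"1 kg water dropping 20 K: ~{water_kj:.0f} kJ")
      print(f"ratio: ~{steam_kj / water_kj:.0f}x")
      # -> condensation delivers roughly 27x more heat per kg than a typical
      #    hot-water temperature drop; there is no such trick on the cold side.
      ```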

      • Iron Lynx@lemmy.world

        From what I’ve gathered, water-based heating systems pump around hot water, not steam. Otherwise the radiators, which anyone can easily touch, would be insanely hot, and thus a significant fire and injury risk.

        In order to use them for cooling, I suppose you’re going to need a different transport medium, i.e. something like glycol instead of water. This would make the system harder and potentially a bit more dangerous to maintain, would limit power when heating (water has a higher specific heat than glycol), and you’re still stuck dealing with condensation at the radiators.

        Using an air system for centralised cooling works. Using a water system is much more problematic.
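        Roughly what the glycol trade-off costs, per kg of fluid (approximate room-temperature values; the 50/50 mix figure varies with temperature and exact mixture):

        ```python
        # Heat carried per kg per degree by the two candidate fluids.
        CP_WATER = 4.19        # kJ/(kg*K)
        CP_GLYCOL_50_50 = 3.3  # kJ/(kg*K), ~50/50 ethylene glycol/water (approx.)

        drop_k = 20  # example: fluid cools 20 K across the radiator
        print(f"water:  {CP_WATER * drop_k:.0f} kJ/kg")
        print(f"glycol: {CP_GLYCOL_50_50 * drop_k:.0f} kJ/kg")
        # -> ~84 vs ~66 kJ/kg: the glycol mix moves ~20% less heat for the
        #    same flow, so pumps and pipes must work harder for the same output.
        ```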

    • ElderWendigo@sh.itjust.works

      Modern heat pumps are more or less radiators with fans through which liquid is pumped. Air blown over is either heated (hot liquid) or cooled (chilled liquid).

      Heat pumps are more like air conditioners run in reverse. Air conditioners and refrigerators are heat pumps. They operate on the same thermodynamic cycle.

      https://youtu.be/7J52mDjZzto

    • rifugee@lemmy.world

      I think they’re becoming more common in US residential builds, at least that’s the impression I get from the number of articles and videos that get pushed my way. They’re evidently not as good in colder climates, but hybrid systems, where there is still a furnace for heat but a heat pump handles cooling, seem to be a good alternative in those situations.

  • TheOneCurly@lemmy.theonecurly.page

    Hot water radiators are designed to work with temperature deltas in the 110°F range (target 70° room temp, 180° water temp). In the summer your temperature deltas are much tighter: the water can only get down to 32°F at best before it freezes, and with a target of 70° that’s only 38 degrees of delta trying to cool the room. They simply won’t work efficiently enough for it to be worth it, not to mention that sitting at floor level is very poor positioning for summer cooling.
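    Since heat flow scales roughly linearly with that delta, the same radiator has only about a third of its heating punch when cooling. A quick sketch (32°F is the theoretical floor from the comment; a real system would stop well above the dew point anyway):

    ```python
    # Heat flow through a radiator scales roughly with the water-to-room
    # temperature difference (delta T), all else being equal.
    ROOM_F = 70
    HEATING_WATER_F = 180
    COOLING_WATER_F = 32   # theoretical limit: plain water freezes below this

    heating_dt = HEATING_WATER_F - ROOM_F   # 110 F of driving force
    cooling_dt = ROOM_F - COOLING_WATER_F   # 38 F of driving force

    print(f"heating dT: {heating_dt} F, cooling dT: {cooling_dt} F")
    print(f"cooling rate ~{cooling_dt / heating_dt:.0%} of heating rate")
    # -> ~35%: matching winter output would need roughly 3x the surface area,
    #    before even considering condensation.
    ```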

      • PM_Your_Nudes_Please@lemmy.world

        Delta is commonly used to refer to a difference between two points. So in this case, a delta of 110 degrees means whatever your target temperature is, the radiator should be 110 degrees away from that temperature. Trying to reach 70° means a temp of 180 at the radiator when heating, or -40° when cooling. OP was pointing out that -40° obviously isn’t a feasible temperature for a water-based radiator, so they simply aren’t great for cooling.

      • TheOneCurly@lemmy.theonecurly.page

        Energy transfer is proportional to the difference in temperature between the two things (delta T), their contact surface area (in this case the length of the radiator and the size of the fins), and time. If you want a room to change temperature quickly, with radiators that don’t take up an entire wall, then you need the water temperature to be very different from the room temperature.
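        In symbols (the standard lumped-model simplification, not anything specific to this thread):

        ```latex
        % Heat delivered over time t through surface area A:
        %   Q = h * A * (T_water - T_room) * t
        % where h is an effective heat-transfer coefficient for the radiator.
        Q = h \, A \, \underbrace{(T_{\text{water}} - T_{\text{room}})}_{\Delta T} \, t
        ```

        Halve the delta T and you need double the area (or double the time) for the same Q.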

  • Deestan@lemmy.world

    It technically would work, and in countries like Norway, where cold water out of the supply costs next to nothing, it’s economically feasible.

    The thing that makes it not worth the effort is similar to how you can blow away a pea from arm’s length, but you can’t suck up a pea from the same distance.

    Radiators, when heating, expel heat not just through convection but also through radiation.

    When cold, the net radiative exchange reverses and is far weaker (the panel absorbs a bit more than it emits), so you’re essentially left with convection. That is very slow, and won’t cool the room the radiator was sized for.

    To make it work, you have to boost the convection by blowing air through it, and now you basically have a regular air conditioner, which already exists. :)
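    The radiation asymmetry in numbers, via the Stefan-Boltzmann law (idealized: emissivity 1, panel fully exposed to the room; the temperatures are illustrative, with the cool panel kept above a typical dew point):

    ```python
    # Net radiative exchange between a panel and the room, per Stefan-Boltzmann.
    SIGMA = 5.67e-8  # W/(m^2 K^4)

    def net_radiation_w_m2(panel_c: float, room_c: float) -> float:
        tp, tr = panel_c + 273.15, room_c + 273.15
        return SIGMA * (tp**4 - tr**4)

    print(f"60 C panel in 20 C room: {net_radiation_w_m2(60, 20):+.0f} W/m^2")
    print(f"16 C panel in 26 C room: {net_radiation_w_m2(16, 26):+.0f} W/m^2")
    # -> roughly +280 vs -58 W/m^2: the hot radiator radiates ~5x more
    #    strongly than a safely-above-dew-point cool panel can absorb.
    ```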

  • ScornForSega@lemmy.world

    The question is what you do with the warm waste water.

    There are A/C systems that use groundwater to cool the coils and then discharge into a pond. But obviously that only works in a place where you have ample free water and a place to dispose of it.

  • morphballganon@lemmy.world

    Warm air rises. Radiators are typically positioned near the floor so that heat fills the whole room. The ideal place for a cooling element would be on the ceiling. But even then, the effect would be hardly noticeable.

    Better to just drink the cold water, or put it in a place where you are touching it, like a wet t-shirt around your neck.

  • Hildegarde@lemmy.world

    Because it wouldn’t be effective. Either you’re running new water from the tap through the radiator, which is expensive and wasteful, or you’re sending the warmed water back into underground pipes to cool off again. If you’re going to spend a bunch of money burying pipes, it’s better to bury something that works instead.

    A ground source heat pump is an effective way of getting the cold from the ground into your living space. It’s basically an air conditioner with the hot side buried in the ground, and they are very efficient.

    Because it uses refrigeration, it’s far more effective and efficient than running radiators backwards.

    • FuglyDuck@lemmy.world

      You don’t use plain water in modern radiators. It’s glycol or similar, and the fluid is generally recirculated.

      A radiator is a heat pump, by the way. Or rather, a component of one.

  • Hotspur@lemmy.ml

    There is a cooling strategy like this called “chilled beam”, but it has all the issues other posters have listed: condensation management and power usage.

  • Darthjaffacake@lemmy.world

    Interestingly, heat spreads better than cold (I’m not an entirely reliable source for this, so take it with a big grain of salt), but essentially, since hot things are more energetic, they have a tendency to spread, while cold things are more static, so cooling is more difficult. Also, radiators emit heat as blackbody radiation, the same way hot metal glows, whereas there’s no cold equivalent of that.

    • Eranziel@lemmy.world

      It’s more because of the temperature differential. The bigger the difference in temperature between two objects, the faster heat flows between them. A radiator with 50°C water is ~30 degrees warmer than the room (or 80+ degrees for a steam rad), while cold water is only going to be 10–15 degrees cooler than the room. Any colder and you need to use something other than water so it doesn’t freeze. Condensation or frost is also a big concern, to avoid property damage.