
recycling heat

Discussion in 'Electronics & Electrics' started by weak beta male, Dec 7, 2018.

  1. weak beta male

    weak beta male Member

    Joined:
    Aug 21, 2018
    Messages:
    40
    Why isn't there a device to recycle the heat from computers to produce more electricity? I didn't major in physics, but I know that as long as there's a temperature differential you can do work. With the amount of heat PCs and monitors produce, you'd think they'd make a device to save on electricity bills.
     
  2. Hive

    Hive Member

    Joined:
    Jul 8, 2010
    Messages:
    5,012
    Location:
    ( ͡° ͜ʖ ͡°)
    Yes
     
    Pugs likes this.
  3. Hater

    Hater Member

    Joined:
    Nov 19, 2012
    Messages:
    2,774
    Location:
    Canberra
    oh boy...
     
    Pugs likes this.
  4. Hive

    Hive Member

    Joined:
    Jul 8, 2010
    Messages:
    5,012
    Location:
    ( ͡° ͜ʖ ͡°)
    Fine i'll bite.

    The things that power satellites in the absence of solar/battery are radioisotope thermoelectric generators. You could apply the same fundamental technology: replace the radioactive decay heat source with computer heat (maybe heat pipes would carry it over nicely).

    There, you're generating a few watts and it's cost you a shitload in materials. You will never get even remotely near 100% efficiency though - try more like 1-5%.
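
    Those 1-5% figures are easy to sanity-check with a back-of-envelope sketch (the 100 W heat load is an assumed example, not a measurement from any particular PC):

    ```python
    # Rough TEG harvest estimate: waste heat times module efficiency.
    cpu_heat_w = 100.0  # assumed waste heat coming off the PC, in watts

    for efficiency in (0.01, 0.05):  # the 1-5% range quoted above
        harvested_w = cpu_heat_w * efficiency
        print(f"{efficiency:.0%} efficient TEG on {cpu_heat_w:.0f} W -> {harvested_w:.1f} W")
    ```

    So even at the optimistic end you recover about 5 W - enough to run a fan, not enough to dent a power bill.
    
    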
     
    Last edited: Dec 7, 2018
    Pugs likes this.
  5. Supplanter

    Supplanter Member

    Joined:
    Jun 30, 2005
    Messages:
    437
    Location:
    Wobbies World
    You could use the excess heat to power a Stirling engine, and then use the output of that to drive a fan blade
    at 5 RPM.
     
  6. SLATYE

    SLATYE SLATYE, not SLAYTE

    Joined:
    Nov 11, 2002
    Messages:
    26,820
    Location:
    Canberra
    MSI actually did make a mainboard with a Stirling-cycle fan (which was then used to cool the chipset).

    The problem is that any thermodynamic machine works better when there's a big temperature difference between the hot side and the cold side ... and the devices generating the heat (eg. CPUs and GPUs) work best when they're pretty close to room temperature. Say you've got a CPU at 343K (70°C) (which is pretty hot!) and ambient temperature is 293K (20°C). The Carnot efficiency for this system is about 15% - so if you've got a 100W CPU then you can generate at most about 15W from it. Realistically, you're going to lose energy at every stage, so 5W is probably a better bet for how much power actually goes back into the mains.

    Working against this is the fact that silicon devices actually use more power when they're hot (leakage current in a semiconductor rises with temperature). From the plots I can quickly find on Google, it looks like fitting a bigger heatsink to your 100W CPU to bring the temperature down from 70°C to 55°C will save you about 7W of power.
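
    The Carnot bound above falls straight out of the two temperatures; a quick sketch using the same figures from the post (70°C CPU, 20°C ambient, 100 W heat):

    ```python
    # Carnot efficiency: the hard upper limit for any heat engine
    # operating between a hot source and a cold sink.
    T_hot = 343.0    # CPU temperature in kelvin (70 C)
    T_cold = 293.0   # ambient temperature in kelvin (20 C)
    heat_in_w = 100.0  # assumed CPU heat output, in watts

    eta_carnot = 1.0 - T_cold / T_hot  # ~0.146

    print(f"Carnot efficiency: {eta_carnot:.1%}")
    print(f"Max recoverable power: {eta_carnot * heat_in_w:.1f} W")
    ```

    Note that 15% is the theoretical ceiling; real thermoelectric hardware at this tiny temperature difference lands well below it, which is where the "5W is probably a better bet" estimate comes from.
    
    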
     
    RobRoySyd likes this.
  7. Sunder

    Sunder Member

    Joined:
    Apr 26, 2012
    Messages:
    2,744
    Wait, what? You mean you haven't been installing a Peltier between your CPU/GPU and the heatsink for the last 20+ years? The frigging things cost $1 each plus shipping. To me, that's like leaving the light on because you couldn't be bothered getting up to turn it off - you're just wasting money and killing the planet.

    https://www.ebay.com.au/itm/New-TEC...h=item28705ebae8:g:rIkAAOSwe9FcC5u7:rk:1:pf:0

    You're welcome.

    (Disclaimer: there's a missing sarcasm tag there... While this would work in theory, just like putting a hydroelectric generator in a stream slows down the flow of water, putting a Peltier in a "stream" of heat slows down the transfer of heat, so your hot side would probably keep thermally throttling your CPU/GPU.)
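
    The throttling point is just added thermal resistance in series. A sketch with hypothetical round numbers (the 0.25 K/W cooler and 1.5 K/W module figures are assumptions for illustration, not specs for any real part):

    ```python
    # Steady-state CPU temperature = ambient + heat * total thermal resistance.
    # Sticking a TEC/TEG module in the heat path adds its resistance in series.
    power_w = 100.0     # assumed CPU heat output
    ambient_c = 20.0    # room temperature
    r_heatsink = 0.25   # K/W, assumed figure for a decent air cooler
    r_module = 1.5      # K/W, assumed figure for a cheap thermoelectric module

    t_without = ambient_c + power_w * r_heatsink
    t_with = ambient_c + power_w * (r_heatsink + r_module)

    print(f"CPU at {t_without:.0f} C without the module")
    print(f"CPU at {t_with:.0f} C with the module in the heat path")
    ```

    With those numbers the CPU goes from a comfortable 45°C to a (theoretical) 195°C - in practice it would throttle or shut down long before that, which is exactly the problem.
    
    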
     
