Power usage on PCs
In this week's Micro Mart there's a review of a four-gigabyte (:-o) PC graphics card, which is apparently the fastest PC GFX card in existence. Until next week, of course... But what caught my eye was its power intake: it draws 375W normally, and at full whack eats up over 475W!
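To put that in perspective, here's a rough back-of-the-envelope sum (the hours of use and the price per unit are my own guesses, not figures from the review):

# Rough running cost of a 475W graphics card. The usage hours and
# tariff below are assumed for illustration, not from the review.
watts_full_load = 475      # peak draw quoted in the review
hours_per_day = 4          # assumed gaming time
pence_per_kwh = 15         # assumed UK tariff

kwh_per_day = watts_full_load / 1000 * hours_per_day
pounds_per_year = kwh_per_day * 365 * pence_per_kwh / 100
print(f"{kwh_per_day:.2f} kWh/day, roughly £{pounds_per_year:.0f} a year for the card alone")

That's about £100 a year just to feed the graphics card, before the rest of the PC draws a single watt.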
I've put together a lot of PCs whose power supplies couldn't feed just that card, let alone the rest of the system!
Given how we're supposed to be cutting back on our electrical usage (and therefore our national carbon footprint), you can't help but question the advisability of such power demands from a graphics card. Especially since, from what I understand, heat is simply waste energy and a sign of bad engineering. Not that I understand anything at all about electronics, you understand, but I read or heard that somewhere, and it does make sense.
So why do PCs have to generate so much waste heat? Is it bad engineering, or what? Should we be concerned by all the power PCs use, and should PC manufacturers be looking at ways of reducing power consumption, or is the impact on the planet negligible (albeit just as part of the larger picture)?
And how much power do consoles use, come to that?
It just seems that if we are supposed to be reducing our energy usage, then surely manufacturers of electrical equipment should lead by example? They make the things that use up so much power, so you'd think they'd be trying to reduce the power consumption and/or waste energy of electrical goods.
Comments
Like soundcards, there are GFX cards at different levels for different people. I used to fit pro video cards in Macs, must be 11 years ago. The Mac cost £7000, and the GFX/rendering card on its own was another £7000; we used to sell them to the BBC for titling etc. Said card is probably £500-1000 now, relatively speaking. Again, a bog-standard soundcard is £5-10 on a chip, vs pro audio ones going into the £1000s.
If you think of a rendering farm and how many GPUs it would take, it is probably better energy-wise to run the job on one powerful PC rather than on 3-4 PCs chasing the same overall benchmark.
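As a toy illustration of that trade-off (all the wattages and render times below are invented, purely to show the shape of the sum):

# Same render job done two ways; all figures invented for illustration.
# Energy used (Wh) = power draw (W) * time taken (h).
big_pc_watts, big_pc_hours = 600, 2             # one PC with a hungry, fast GPU
farm_watts, farm_hours, farm_count = 300, 3, 4  # four slower PCs splitting the job

energy_big = big_pc_watts * big_pc_hours
energy_farm = farm_watts * farm_hours * farm_count
print(f"one fast PC: {energy_big} Wh, farm of four: {energy_farm} Wh")

1200 Wh against 3600 Wh: the hungrier card can still be the greener option if it finishes the work enough faster.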
the war against the pan-dimensional interstellar transwarp alien battlecruisers?
I know absolutely nothing about electronics (other than using the things, of course), so you'll have to keep the explanation simple (so please no "since r=a*b^d along the thermal quazmatics" type stuff), but why can't PCs be built that produce no excess heat, because they draw in exactly the amount of power that they need? Is it that above a certain speed, CPUs have to produce a given amount of heat, no matter how they are designed? If so, then why?
Also, is it true that electronics run slower the hotter they are? If so, then surely a modern CPU, which runs very hot, is not being used to its full potential.
It just seems strange to me (as a layman).
no wonder the licence fee is so high! :-o
'cause of the thermal quazmatics ;)
It all comes down to the laws of thermodynamics. Heat is work and work is heat: pretty much every watt a PC draws from the wall ends up as heat sooner or later, so the only way to make less heat is to use less power.
no... they get hotter the faster they run
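For anyone who wants a bit more than thermal quazmatics: the usual back-of-an-envelope formula for the switching power of a chip is P = a*C*V^2*f (activity * switched capacitance * voltage squared * clock frequency), and essentially all of that power leaves the chip as heat. A toy sketch, with made-up constants rather than real CPU figures:

# Dynamic (switching) power of CMOS logic: P = a * C * V^2 * f.
# All values below are illustrative, not from any real CPU datasheet.
def dynamic_power(activity, capacitance_farads, volts, hertz):
    """Approximate switching power in watts."""
    return activity * capacitance_farads * volts ** 2 * hertz

full_speed = dynamic_power(0.2, 50e-9, 1.2, 3e9)  # ~3 GHz at 1.2 V
throttled = dynamic_power(0.2, 50e-9, 1.0, 2e9)   # eased back: 2 GHz at 1.0 V
print(f"{full_speed:.0f} W vs {throttled:.0f} W")

Because the power goes with V squared times f, easing the voltage and clock back a little cuts the heat a lot, which is exactly why faster chips run hotter and why there's no building a fast chip that makes no heat at all.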
Believe it or not, that's cheap. My dad used to lug around Quantels amongst other things, CMI Fairlights etc., and didn't know their true value when he worked there :)
Quantels were what, £250k+, back in the '80s? I used one elsewhere once in some downtime... very nice thing, but Photoshop etc. surely put a dent in their sales.
There was talk of selling BBC TV Centre for flats, but they couldn't; it's a mass of land. But even still, the Blue Peter garden is REALLY small (and no, I didn't trash it, boozy / mile :p )
(Apologies if you already know this)
Always check out www.hotukdeals.co.uk as they usually post up the latest bargains (search for "psu" without the quotes).
I cannot stress enough that you get what you pay for with a power supply. If you're installing a decent GFX card, say a GTX 260 or better, then you need something of 600W at least IMO, and make sure it has those GFX card power adapters (can't remember their name). I would say £50 should be the minimum spend.
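If it helps, the usual rough way to size a PSU is to add up the worst-case draw of each component and leave decent headroom. The wattages below are my own ballpark guesses for a GTX 260-class build, not manufacturer figures:

# Crude PSU sizing: sum worst-case component draw, then add headroom.
# Component wattages are ballpark guesses, not official specs.
components = {
    "cpu": 125,
    "gfx_card": 200,
    "motherboard_and_ram": 60,
    "drives_and_fans": 50,
}
total_watts = sum(components.values())
headroom = 1.4  # run the PSU well below its rating for efficiency and lifespan
print(f"estimated load {total_watts} W, so buy at least a {total_watts * headroom:.0f} W PSU")

That comes out at 435 W of load and a recommended 600W-or-so supply, which squares with the advice above.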