I'm starting to realize how pointless upgrading your computer is. (gaming too)


Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
Power consumption is like high MPG vehicles. You shouldn't sell what you have and buy a new one just for better MPG- but when you replace what you have you have a good chance to do better the next time.

You're definitely doing more net environmental harm by encouraging production when you buy new parts than you are saving on energy, which is efficiently produced at the local power plant. Use the old power-hungry beast until it dies; it's better for the environment.

I love low-power devices myself. That was one of the main selling points for me when I went with an AppleTV for my XBMC box. 6 watts, really? I see no reason to go higher than 6 watts when it can run XBMC. Seems more futuristic than a 300-watt box that does essentially the same thing.
I'm also interested in iPads, WHEN I need one (when my laptop dies)- and I certainly wouldn't replace it until I had a truly compelling reason to replace it.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
Power consumption is like high MPG vehicles. You shouldn't sell what you have and buy a new one just for better MPG- but when you replace what you have you have a good chance to do better the next time.
I somewhat agree with that. But what if the power consumption of your current devices is eating you out of house and home?

I love low power devices myself. One of the main bullet points for me when I went with an AppleTV for my XBMC box. 6 watts, really? I see no reason to go higher than 6watts when it can run XBMC. Seems more futuristic than a 300watt box that does essentially the same thing.
I'm also interested in iPads, WHEN I need one (when my laptop dies)- and I certainly wouldn't replace it until I had a truly compelling reason to replace it.

Funny you should mention Apple products, especially since they are more or less "disposable" as far as components go. Landfill material for sure. (Non-removable batteries, etc.)

Apple products certainly aren't very "environmental".
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
I somewhat agree with that. But what if the power consumption of your current devices is eating you out of house and home?
If your vehicle and home electronics are forcing you out of your house and off the road, you have a larger, more systemic problem on your hands:
either your income, the economy, your habits (a drinking habit, a gambling habit, whatever), or a copiously expanded family.

But I doubt that problem will be solved by spending the equivalent of ~20 years' worth of the energy it would take to power all those things, just to replace them with (in reality) moderately more efficient devices - and then still be left paying the bills on them.

Funny you should mention Apple products, especially since they are more or less "disposable" as far as components. Landfill material for sure. (Non-removable batteries, etc.)

Apple products certainly aren't very "environmental".

Of all my old phones over the years, I could never find new replacement batteries that hadn't been sitting on a shelf for years after production ended anyway. Thus, replacing the battery in my old Motorola Z6C with a "new" one led to less-than-stellar results. It was still time for a new phone, much to my dismay.

Non-Apple products are just as disposable if not more so; they just don't appear to be. Apple products are typically of higher build quality and therefore at least last a long time (unibody construction, copious use of Gorilla Glass and now "liquid metal" - while the competition relies on plastic, like my crappy HTC Thunderbolt). Other than the batteries, which as I noted is more a production issue than anything, I find Apple devices to be of much better build quality.
And I should note, I would NOT be surprised if the batteries in iPhones are higher quality than what's in Android phones anyway, and last longer before needing replacement. At least with the way Apple has the whole supply chain on lock (Tim Cook is a supply chain god).
Not a fanboy - my only Apple device is an ATV.. just my observation. Old unibody MacBook Pros hold up damn well too - at least all my coworkers' do, while the Dell I'm typing on creaks occasionally. I wouldn't ever buy a MacBook or a desktop Mac, but I am done buying laptops - in favor of iPads, when this laptop dies. It suits my needs, does what I need better, and has high build quality.

Not to mention, my AppleTV doesn't have a battery - and is higher quality than any other competing set-top box. A5X CPU, 8GB NAND, 512MB RAM for $99 is virtually impossible to beat. I got an AppleTV gen 2 with an A4, 8GB NAND, and 256MB RAM for $40. I don't see any need to dispose of that $40 device that consumes 6 watts max, has a great interface, and supports XBMC.. in fact, by buying it used I recycled it for the guy who sold it to me. :)

I would have to respectfully disagree that Apple devices are 'more' disposable than their competitors. I would trade my HTC Thunderbolt for an iPhone4S in a heartbeat if anyone would like to make that swap. I won't be throwing out the AppleTV either, until the day it stops working and I'm forced to.
 

beginner99

Diamond Member
Jun 2, 2009
5,315
1,760
136
Power consumption is like high MPG vehicles. You shouldn't sell what you have and buy a new one just for better MPG- but when you replace what you have you have a good chance to do better the next time.

Exactly. I once read that 50% of a car's total "energy usage" comes from its production. Hence it is easy to see that it will never make sense to buy a new one just to lower emissions.

And with current energy prices (even here where I live, much higher than in the US), in most cases you still don't save enough to make it a good choice in terms of money.
Exceptions are if you have something like 10 servers with multiple GPUs running 24/7 at 100% load. You get the point. Otherwise, common appliances like the refrigerator, freezer, dishwasher, washing machine, and boiler (if electric) make your PC more or less irrelevant.
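To put rough numbers on that point, here is a quick sketch comparing annual energy cost; all wattages, duty cycles, and the 15-cent rate are illustrative assumptions, not measurements:

```python
# Illustrative annual energy use; wattages and duty cycles are rough
# assumptions for comparison only, not measurements.
RATE = 0.15  # assumed electricity price, euro per kWh

appliances = {
    # name: (average draw in watts, hours on per day)
    "refrigerator":    (100, 24),   # compressor duty cycle folded into the average
    "electric boiler": (2000, 2),
    "desktop PC":      (150, 4),
}

for name, (watts, hours) in appliances.items():
    kwh_year = watts * hours * 365 / 1000
    print(f"{name:16s} {kwh_year:6.0f} kWh/yr  ~{kwh_year * RATE:5.0f} EUR/yr")
```

Even with generous PC hours, the always-on and high-draw appliances dominate the bill.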

Still, I believe that when you buy a new car, you shouldn't buy an SUV, and if you have the money, get a hybrid or a full electric (Tesla). If I won the lottery, a Tesla Roadster would be pretty high on my shopping list.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I rarely visit the CPU forum but was researching a CPU earlier today. (Bought it, and a Z77 mobo - a 3570K which I intend to leave at stock, or maybe even undervolt/underclock. I'll get a Hyper 212 Evo for it just to preserve the option of overclocking if I somehow need to someday. Definitely won't spend $$$ for bigger coolers or watercooling.)

I thought GPU guys were a little nuts, but what on earth do you guys do with severely overvolted and oc'd CPUs? They eat wattage like no tomorrow, require stronger mobos and cooling, require larger PSUs, etc. - and for what? At least with GPUs you gain some tangible benefit, like triple-screen gaming or 3D or 120Hz smoothness. GPUs are almost always the bottleneck in games. But other than hobby purposes, why would one so severely stress a CPU? I just don't get it, unless you run programs for work (but shouldn't work pay for it?) or certain programs that need CPU cycles and not GPU cycles (but couldn't you just run stuff overnight or at other times when you aren't using the PC for something else?).
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Exactly. I once read that 50% of a cars total "energy usage" comes from it's production. Hence it is easy to see, that it will never make sense to buy a new one for lowering emissions.

He didn't say lowering emissions, he said MPG. Not the same thing. Most people who are looking at MPG are doing so to save $$$ at the pump, not to save the planet. Although there are exceptions.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
Apple products are typically of higher build quality, therefore at least last a long time (unibody construction, copious use of gorilla glass and now "liquid metal"- while the competition relies on plastic like my crappy HTC Thunderbolt).
I haven't touched the liquid-metal cases, but speaking of glass versus plastic: plastic can be much more shock-resistant than glass, so I don't see it as much of a con.
 

anikhtos

Senior member
May 1, 2011
289
1
0
Power consumption is like high MPG vehicles. You shouldn't sell what you have and buy a new one just for better MPG- but when you replace what you have you have a good chance to do better the next time.

You're definitely doing more net environmental harm by encouraging production buying the new parts, than you are by saving on energy which is efficiently produced at the local power plant. Use the old power hungry beast until it dies, it's better for the environment.

I love low power devices myself. One of the main bullet points for me when I went with an AppleTV for my XBMC box. 6 watts, really? I see no reason to go higher than 6watts when it can run XBMC. Seems more futuristic than a 300watt box that does essentially the same thing.
I'm also interested in iPads, WHEN I need one (when my laptop dies)- and I certainly wouldn't replace it until I had a truly compelling reason to replace it.
By the way, depending on the difference, the new product may actually pay for itself over time.
I had the old bulbs in my living room,
so when I was sitting there I was burning 4x60 = 240 watts.
I replaced them: 4x12W = 60 watts.
So I save about 180 watts, and I keep them on roughly 5 hours a day, so that's
about 1 kWh per day.
At 15 cents per kWh, that is 0.15 euro per day;
365 days = ~54.75 euro a year.
Each bulb cost 6 euro, so 24 euro total.
So according to you, I would save money by waiting for the old bulbs to burn themselves out?!
In one year I save over 30 euro, even counting the 24 I spent.
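The break-even arithmetic above can be sanity-checked in a few lines, using the post's own figures (4 bulbs, 60W replaced by 12W, 5 hours/day, 15 cents/kWh, 6 euro per bulb):

```python
# Bulb-replacement payback, using the post's own figures.
OLD_W, NEW_W, COUNT = 60, 12, 4   # watts per bulb, number of bulbs
HOURS_PER_DAY = 5
PRICE_PER_KWH = 0.15              # euro
BULB_COST = 6.0                   # euro each

saved_kwh_per_day = COUNT * (OLD_W - NEW_W) * HOURS_PER_DAY / 1000
saved_per_year = saved_kwh_per_day * 365 * PRICE_PER_KWH
upfront = COUNT * BULB_COST
payback_days = upfront / (saved_kwh_per_day * PRICE_PER_KWH)

print(f"saves {saved_per_year:.2f} EUR/yr on a {upfront:.0f} EUR outlay")
print(f"pays for itself in about {payback_days:.0f} days")
```

The payback works out to well under a year, which is why lighting is one of the rare cases where replacing working hardware for efficiency alone makes financial sense.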
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Seems high, but I pay ten cents per kWh. Fifteen, and you are getting there.

I pay 10 cents per kWh too. 9 cents, actually. But when you add in all the bullcrap fees on each monthly bill, then divide that out over your total kWh used during that month, it raises your final, real price per kWh to about 15 cents. The more you use, the lower this final, real price will be. For the average user it ends up being around 15-18 cents per kWh. If you have a very low electric bill (< $30) then you are probably paying more than 20 cents per kWh.
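The effect of spreading fixed fees over monthly usage can be sketched like this; the 9 cents/kWh base rate is from the post, while the $18/month of fees and the usage tiers are illustrative assumptions:

```python
def effective_rate(base_rate, fixed_fees, kwh_used):
    """Real price per kWh once fixed monthly fees are spread over usage."""
    return (base_rate * kwh_used + fixed_fees) / kwh_used

# 9 cents/kWh base rate from the post; $18/month of fees is an assumption.
for kwh in (100, 300, 1000):
    cents = effective_rate(0.09, 18.0, kwh) * 100
    print(f"{kwh:5d} kWh/month -> {cents:.1f} cents/kWh effective")
```

With these numbers a light user pays an effective 27 cents/kWh while a heavy user pays close to the nominal rate, matching the post's point.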
 

blckgrffn

Diamond Member
May 1, 2003
9,676
4,308
136
www.teamjuchems.com
I pay 10 cents a kw/h too. 9 cents actually. But see when you add in all the bullcrap fees that are on each monthly bill, then divide that out into your total kwh used during that month, it raises your final, real price per kwh to about 15 cents. The more you use, the less this final, real price will be. For the average user it ends up being around 15-18 cents per kwh. If you have a very low electric bill (< $30) then you are probly paying more than 20 cents per kwh.

Mmm, I see your point. Our bills are almost $200 lately, so the fees represent less of the total.

I guess I consider the fees to be a locked charge, independent of usage and since I am unlikely to abandon electricity altogether, they are largely irrelevant.

Electricity is neat in that, like social security in the U.S., once you use enough (make enough) it gets cheaper. Interesting social connotations.

We just replaced our electric water heater with a gas one (and what a chore that was! venting with a finished basement... no existing chimney) so we are hoping that the investment will have an ROI (for us) in 3-5 years. Natural gas, stay cheap...

As you alluded to before, going for lower power bills is not synonymous with going green :p Granted, our power is largely from coal here, so it's a wash...
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
By the way, depending on the difference, the new product may actually pay for itself over time.
I had the old bulbs in my living room,
so when I was sitting there I was burning 4x60 = 240 watts.
I replaced them: 4x12W = 60 watts.
So I save about 180 watts, and I keep them on roughly 5 hours a day, so that's
about 1 kWh per day.
At 15 cents per kWh, that is 0.15 euro per day;
365 days = ~54.75 euro a year.
Each bulb cost 6 euro, so 24 euro total.
So according to you, I would save money by waiting for the old bulbs to burn themselves out?!
In one year I save over 30 euro, even counting the 24 I spent.

Whatever your application, if the calculation works out in your favor to replace what you have to save money over time- then by all means. That goes without saying.

I'm only saying in most cases, especially pricier items like vehicles or PCs- the calculation does not usually work out in your favor. Even more so if you consider not just your own pocket, but the energy/emissions from producing the replacement you chose.
 

rickon66

Golden Member
Oct 11, 1999
1,824
16
81
I am back to the idea that the OP was trolling in an enthusiasts' forum - just as if I posted in a gun owners' forum that buying more guns was a useless exercise and a waste of money. The fact is, owning more guns is probably a waste of money for 99% of people, but the folks who frequent such forums don't want to hear that. Common sense has very little to do with what most AnandTech members want when they visit here; most people don't NEED the latest electronics, but we sure do like them.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Even more so if you consider not just your own pocket, but the energy/emissions from producing the replacement you chose.

For me when I upgrade, I usually sell my old thing to someone who needs it, then I buy the upgraded thing.

So, from an environmental standpoint, shouldn't you get a bonus for re-using your old item? The impact of your replacement is much less, or even a net positive, because the new item is usually more power-efficient to operate, and its "production" cost is offset by preventing an item from having to be produced for the person who bought your old one. Looking at it like that, I can see how it makes sense to upgrade to something more efficient.

Especially something like going from an overclocked 6950 to a 7850 - you save a lot of energy, and the cost increase is easier to recoup. Say you sell your 6950 for $200 (it's the 2GB version with dual BIOS), then purchase the 7850 for $250. You've spent $50 more, and ensured that the buyer of your 6950 will not require a new card to be produced.
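A rough payback sketch for that kind of sell-and-upgrade swap; the $200/$250 prices are from the post, while the card wattages, gaming hours, and electricity rate are assumptions for illustration:

```python
# Sell-old, buy-new payback. Prices are from the post; wattages, hours,
# and the electricity rate are assumptions for illustration.
SELL_OLD, BUY_NEW = 200.0, 250.0   # dollars
OLD_W, NEW_W = 200, 110            # assumed load draw, overclocked 6950 vs 7850
HOURS_PER_DAY = 3                  # assumed gaming time
RATE = 0.15                        # assumed dollars per kWh

net_cost = BUY_NEW - SELL_OLD
saved_per_year = (OLD_W - NEW_W) * HOURS_PER_DAY * 365 / 1000 * RATE
print(f"net upgrade cost ${net_cost:.0f}, saves ${saved_per_year:.2f}/yr "
      f"-> payback ~{net_cost / saved_per_year:.1f} years")
```

Under these assumptions the energy savings alone take a few years to cover the $50 difference - the resale value, not the power bill, is what makes the swap cheap.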
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
For me when I upgrade, I usually sell my old thing to someone who needs it, then I buy the upgraded thing.

So, from an environmental standpoint, should you get a bonus by re-using your old item? So the impact of your replacement is much less, or a net positive, because it's usually more power efficient to operate, and the "production" cost for your new item is offset by preventing a need for an item to be produced for the person who bought your old item. Looking at it like that, I can see how it makes sense to upgrade to something more efficient.

This. Most definitely. I generally sell or give away my old rigs. I find someone that needs a computer, so that my old cast-offs (well, generally not too old) find a home.

Unfortunately, that often costs me money, for parts that may be necessary to make a complete machine, or to replace parts that have failed. (And HDs aren't cheap anymore.)
 

Destiny

Platinum Member
Jul 6, 2010
2,270
1
0
I rarely visit the CPU forum but was researching a CPU earlier today. (Bought it, and a Z77 mobo - a 3570K which I intend to leave at stock, or maybe even undervolt/underclock. I'll get a Hyper 212 Evo for it just to preserve the option of overclocking if I somehow need to someday. Definitely won't spend $$$ for bigger coolers or watercooling.)

I thought GPU guys were a little nuts, but what on earth do you guys do with severely overvolted and oc'd CPUs? They eat wattage like no tomorrow, require stronger mobos and cooling, require larger PSUs, etc. - and for what? At least with GPUs you gain some tangible benefit, like triple-screen gaming or 3D or 120Hz smoothness. GPUs are almost always the bottleneck in games. But other than hobby purposes, why would one so severely stress a CPU? I just don't get it, unless you run programs for work (but shouldn't work pay for it?) or certain programs that need CPU cycles and not GPU cycles (but couldn't you just run stuff overnight or at other times when you aren't using the PC for something else?).

Yeah... the CPU forum is crazier than the GPU forum, in my opinion... that is why Intel specifically developed the K series for overclockers... these people basically do the same thing as GPU overclockers: squeeze performance out of a lower-cost CPU to meet or exceed the next higher-end CPUs... I see some guys here overclock i5 CPUs by 1GHz (4.5GHz) to 1.5GHz (5GHz) and I'm thinking that's nuts... the max I ever overclocked a CPU was by 200MHz, then I freaked out and returned it to normal (and that was on a Pentium almost 8 years ago)... I haven't tried overclocking a CPU since...

sometimes it is just funny to read the posts and wonder how the hell they do that...

Kind of reminds me of my rice rocket days, when my friends and I fixed up our cars for looks and street/drag racing... but obviously PC stuff is cheaper... I'm still a noob so I can't comment more than the basics or what I read in the forums... :sneaky:
 

Makaveli

Diamond Member
Feb 8, 2002
4,961
1,557
136
Yeah... the CPU forum is crazier than the GPU forum, in my opinion... that is why Intel specifically developed the K series for overclockers... these people basically do the same thing as GPU overclockers: squeeze performance out of a lower-cost CPU to meet or exceed the next higher-end CPUs... I see some guys here overclock i5 CPUs by 1GHz (4.5GHz) to 1.5GHz (5GHz) and I'm thinking that's nuts... the max I ever overclocked a CPU was by 200MHz, then I freaked out and returned it to normal (and that was on a Pentium almost 8 years ago)... I haven't tried overclocking a CPU since...

sometimes it is just funny to read the posts and wonder how the hell they do that...

Kind of reminds me of my rice rocket days, when my friends and I fixed up our cars for looks and street/drag racing... but obviously PC stuff is cheaper... I'm still a noob so I can't comment more than the basics or what I read in the forums... :sneaky:

The difference between you and them is that they have gotten over the fear!

All of us who do this were like you when we started, but after so many years you get better at it; once you do the proper reading and research, and with some trial and error, it becomes easy.

I know some people fear overclocking, thinking they will burn out a processor. That has never happened to me, because I also do my research about safe vcore limits, etc.

Plus, these days it's so much easier than when I started, back in the day before even ATX power supplies.

Also, I find the need to overclock is less these days, because processors are so fast even at their stock speeds. Back in the day you would overclock because you actually needed the extra performance; today I find it's just more about adding additional performance.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
If I had not overclocked my Q6600, I'd have needed to upgrade well before IB came around. IB today does not need to be overclocked for most people, but for folks like me who keep their CPUs for many years before upgrading, it can extend the useful life of the CPU a good amount, which is sometimes all you need to get over the hump until the next new CPU is out. ;)
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I rarely visit the CPU forum but was researching a CPU earlier today. (Bought it, and a Z77 mobo - a 3570K which I intend to leave at stock, or maybe even undervolt/underclock. I'll get a Hyper 212 Evo for it just to preserve the option of overclocking if I somehow need to someday. Definitely won't spend $$$ for bigger coolers or watercooling.)

I thought GPU guys were a little nuts, but what on earth do you guys do with severely overvolted and oc'd CPUs? They eat wattage like no tomorrow, require stronger mobos and cooling, require larger PSUs, etc. - and for what? At least with GPUs you gain some tangible benefit, like triple-screen gaming or 3D or 120Hz smoothness. GPUs are almost always the bottleneck in games. But other than hobby purposes, why would one so severely stress a CPU? I just don't get it, unless you run programs for work (but shouldn't work pay for it?) or certain programs that need CPU cycles and not GPU cycles (but couldn't you just run stuff overnight or at other times when you aren't using the PC for something else?).

I somewhat agree. I mildly overclock my CPUs. I do not use extreme voltage; I try to get the most OC possible out of stock voltage. Going 1.4V+ is asking for more heat, less efficiency, and a reduced lifespan. A mild OC at stock voltage adds performance with only a roughly linear increase in power draw. The great thing about newer CPUs is that they downclock and downvolt when not needed, so the OC is only in effect when the system is under load. There's no reason not to get the maximum potential out of stock voltage when the system is loaded.
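The heat cost of overvolting follows from the usual CMOS dynamic-power approximation, P roughly proportional to f x V^2. A quick sketch with illustrative clocks and voltages (not measured figures for any particular chip):

```python
def relative_power(f_ghz, vcore, base_f=3.4, base_v=1.10):
    """Dynamic power relative to stock, using the P ~ f * V^2 approximation."""
    return (f_ghz / base_f) * (vcore / base_v) ** 2

# A stock-voltage OC vs a heavy overvolt (illustrative numbers):
print(f"4.2 GHz @ 1.10 V: {relative_power(4.2, 1.10):.2f}x stock power")
print(f"4.8 GHz @ 1.40 V: {relative_power(4.8, 1.40):.2f}x stock power")
```

Because voltage enters squared, the heavy-overvolt case more than doubles power for a modest extra clock gain, which is the heat/efficiency trade-off described above.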
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Back in the day you would overclock because you actually needed the extra performance; today I find it's just more about adding additional performance.

That's half of my point. WHY oc an already-fast CPU that heavily, when the bottleneck is almost always the GPU? I can understand why people overvolt and oc GPUs more than I can understand why people do that to CPUs, unless they are running pretty weak CPUs. But there are people running 2500Ks at 5GHz when even a stock 2500K will be more than enough for almost all games, unless you are going for 3D/120Hz, which very few people have. If it's for work or heavy video editing then it's more understandable, but I doubt many of those 5GHz'ers are doing it for work or professional editing. Am I missing something?


I somewhat agree. I mildly overclock my CPUs. I do not use extreme voltage. I try to get the most OC out of stock voltage as possible. Going 1.4V+ is asking for more heat, less efficiency, and reduced life. A mild OC can give linear performance/watt increase in performance. The great thing about newer CPUs is that they downclock and downvolt when not needed. So the OC is only used when the system is under load. No reason not to get the maximum potential out of stock voltage when the system is loaded.

Yeah I agree, a moderate oc is fine, I am not arguing about that at all. Especially if it's at stock voltage anyway, or close to it.
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
For me when I upgrade, I usually sell my old thing to someone who needs it, then I buy the upgraded thing.

So, from an environmental standpoint, should you get a bonus by re-using your old item? So the impact of your replacement is much less, or a net positive, because it's usually more power efficient to operate, and the "production" cost for your new item is offset by preventing a need for an item to be produced for the person who bought your old item. Looking at it like that, I can see how it makes sense to upgrade to something more efficient.

Especially something like going from 6950 overclocked to a 7850 - you save a lot of energy, but the cost increase is easier to recoup. Say you sell your 6950 for $200 (it's the 2GB version with dual bios), then purchase the 7850 for $250. So, you've spent $50 more, and ensured that the buyer of your 6950 will not require a new card to be produced.

I wasn't really focusing on the environmental aspect. From that point of view- using everything as long as possible is the best way to go.

I was speaking mostly from a "MPG"/$avings standpoint, though the two 'usually' go hand in hand.

If you can recoup most of your money by selling, and then get a more efficient, faster product, then by all means it's obviously the better deal for you. Goes without saying.

Many times, though, the principal value of the item in question is a fraction of what it once was. I could use my rig as an example (it's less desirable than what you sold), or a car. A 30k car tends to drop to, say, 8k or 10k within its best years. Replacing that car with another 30k car JUST to get better MPG is silly; it takes years to make up the difference. The difference is more stark with a car than a computer because the money involved is on a different level. Add in the production used to create the new car, and the incentive you're giving the marketplace to continue mass production, and there's an environmental side to it as well.
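The car break-even point can be sketched the same way as the lighting math earlier in the thread; every figure here (mileage, fuel price, MPG values, price difference) is an illustrative assumption:

```python
# Car MPG break-even; every figure here is an illustrative assumption.
MILES_PER_YEAR = 12000
PRICE_PER_GAL = 3.50               # dollars
OLD_MPG, NEW_MPG = 25, 40
PRICE_DIFF = 20000.0               # new-car cost minus the old car's resale value

gal_saved = MILES_PER_YEAR / OLD_MPG - MILES_PER_YEAR / NEW_MPG
saved_per_year = gal_saved * PRICE_PER_GAL
print(f"saves ${saved_per_year:.0f}/yr in fuel "
      f"-> ~{PRICE_DIFF / saved_per_year:.0f} years to break even")
```

Even with a generous MPG jump, the fuel savings are small next to the purchase price, so the break-even horizon runs to decades - which is the point being made about replacing big-ticket items for efficiency alone.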

I replace things when I need to, if I need more CPU or GPU power (which I don't today with my 5870 and Q9450)- or if something breaks.

I tend to stick with what I have longer than most, because I kind of fall in love with a reliable machine. I value something that has worked and continues to work every day for me, rather than replacing it with new, untested hardware that might need to be RMA'd or cost me time troubleshooting.

I don't enjoy troubleshooting and configuring things as much as I did when I was younger. I started on a Commodore 128, did small upgrades on my 486 DX2/33, a ton of upgrades on my P133, and built my first computer top to bottom with my Athlon 700MHz (Thunderbird). I'm a burned-out enthusiast. I just want it to work, and I tend to stick with what I've found over the past 15 years to be the most reliable hardware: generally Intel chipsets; NV and AMD GPUs are basically a wash at this point (NV used to be the better choice); a great track record with Seagate, Corsair PSUs, and now Intel SSDs. I just want to replace something when it either breaks or I have a true need to replace it.
The last thing I replaced was my WD Passport external HDD in favor of a Seagate external (something wasn't right with the WD), and before that my Antec 900 with a Lian Li (smaller, quieter, no lights - done with the youth-fanboy case phase).
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
I agree the original post is kind of a troll post; interesting to see others arriving at the same point I've been at for years now. This probably isn't the primary place for us burned-out enthusiasts to discuss our loss of faith.
 

cytg111

Lifer
Mar 17, 2008
25,661
15,160
136
I agree the original post is kind of a troll post; interesting to see others arriving at the same point I've been at for years now. This probably isn't the primary place for us burned-out enthusiasts to discuss our loss of faith.

- No way... or maybe yes way, because faith = something stupid in my book. I have been with the OC'ing/upgrading crowd since before Celerons at 300MHz clocked to 504 (on air) if you got a good one from Costa Rica, when we made dual-CPU Celeron rigs by drilling physical holes in the dies... And it was ALWAYS to get ahead in something: more FPS in Quake, giving us an edge vs the competition, more CPU power for training neural nets, etc. We always had a use for those added cycles. I feel that reason is gone, at least for the moment.

I believe it is because the software industry is struggling with those multicores, and I am beginning to doubt that there will ever be a generic platform that will let your software stack scale linearly with the number of cores. Software development as a concept will probably have to be reinvented. That stuff takes time.
 

anikhtos

Senior member
May 1, 2011
289
1
0
- No way... or maybe yes way, because faith = something stupid in my book. I have been with the OC'ing/upgrading crowd since before Celerons at 300MHz clocked to 504 (on air) if you got a good one from Costa Rica, when we made dual-CPU Celeron rigs by drilling physical holes in the dies... And it was ALWAYS to get ahead in something: more FPS in Quake, giving us an edge vs the competition, more CPU power for training neural nets, etc. We always had a use for those added cycles. I feel that reason is gone, at least for the moment.

I believe it is because the software industry is struggling with those multicores, and I am beginning to doubt that there will ever be a generic platform that will let your software stack scale linearly with the number of cores. Software development as a concept will probably have to be reinvented. That stuff takes time.
Yes, in the old days it made sense to have a stronger CPU to play games more smoothly,
but in the multicore era, CPU power and software started losing touch with each other.
CPUs rely on cores to deliver more power,
while the software cannot really make use of those cores.
In the 4c/8t CPU era, even a 2c/4t CPU will be fine for gaming, because games are still very poorly threaded.
Then there is the issue of non-linear scaling across the added cores in software.
I remember a few years back, when the first 6-core CPU came out, they wanted to test how much faster it was, but there wasn't any 12-thread program to test it with, lol. Even video encoding crashed trying to use 12 threads (by the way, so far video encoding is the only software that uses all cores well).
And what about people who want to use old software? A CPU with more cores means nothing to them, whereas an increase in IPC does.
More cores may make sense for servers and some environments, but for home users they seem to bring more problems than power.
Ah, the glorious old days, when buying a sound card was a step toward freeing up CPU resources, lol.