- Mar 24, 2005
- 135
- 0
- 0
First, the Linkage.
Main Point of this Rant is in BOLD down near the bottom, for those who aren't bored enough to read this whole post.
I've been using the same computer for over six years now. I built it for around $800, plus $350 for the now 4+ year old 17-inch LCD display. I consider that a pretty good run, and I've had to sacrifice upgrades along the way in favor of saving up for a new uber machine. I was just about ready to purchase a new computer, so I started paying attention to the hardware market again to figure out what I wanted to buy.
I went from being a strong Intel supporter who only used AMD to build cheaper systems for a couple of friends, to someone who now thinks AMD is just as good as, and arguably better than, Intel at the moment (I'm not trying to start an AMD vs. Intel debate; that's just how I see it. Intel still has impressive technology [see the Pentium M]; they've just screwed up their desktop and workstation/server CPUs).
I decided it would be wise to save my money and invest in another computer that I would use for a long time -- like the one I'm posting from now. 64-bit x86 CPUs were coming out, and I figured it would also be wise to wait for dual-core or multi-core CPUs before buying my next machine. That would give me plenty of processing power plus 64-bit support in case the industry headed that way.
I really thought the time was now. A pair of AMD Opteron 275s on one of Tyan's nForce4 Professional chipset workstation boards, which support NVIDIA SLI, is a heck of a machine.
So why not just break down and buy that if I have the money?
AMD's roadmaps (Xbit link at the top) show AMD moving to DDR2 in 2006 and then to DDR3 in 2007. I can't see buying a DDR-based machine now, knowing that DDR2 will become the standard next year and that DDR will get more expensive as the memory manufacturers phase it out in favor of DDR2 and DDR3. And since AMD's memory controller is on the CPU itself, this means AMD will go from DDR memory controllers to DDR2 controllers and finally to DDR3 controllers by 2007, which screams lots of new sockets and new motherboards -- assuming one wants to keep an upgrade path.
I've gone from believing 2005 would finally be the time to take the plunge to wanting to wait until 2007 to see offerings from both Intel and AMD that support DDR3 memory. Early 1999 to 2007 is a LONG time to wait between upgrades.
At this point I might actually buy a cheap machine and wait to see where everything settles. DDR to DDR2 to DDR3 in two years is a lot of change. I'm also going to hold off on buying the two Dell 2405FPWs, because if I do wait until 2007 there will probably be newer and better display technology available at the same price anyway.
AMD's DDR2 platform sounds dead before it even arrives. Who's going to want to buy a DDR2-based machine that will only have an upgrade path for one year (assuming AMD changes to a new socket for DDR3 and you'll need a new motherboard that supports DDR3 DIMMs)?
Anyway, those are my thoughts. I hope no one else is having the same dilemma that I am.