Would it be possible to convert a CPU chip into a GPU? Then you would have a core clock of 800MHz or more. I know there is more to it than just soldering, but would it be possible with the proper equipment and all?
Originally posted by: TerryMathews
Because:
A) Intel's graphics chip designs are usually pretty bad. AFAIK, their current integrated graphics solution is based on the old i740, which was not top-of-the-line even when it was released back in the heyday of the Voodoo2.
B) Considering Intel's past achievements in CPU design, <sarcasm>a combination GPU/P4 would prolly have a die the size of an Eggo waffle and run at 4.6GHz, yet still be slower than a 1.4GHz P3.</sarcasm>
Originally posted by: Asuka
Image quality in software rendering is basically disregarded. Whereas, your GPU can do these calculations much faster.
This comment strikes me as odd. Why would software rendering disregard image quality? The CGI you see in movies and TV shows is done in software, using server farms of hundreds or thousands of CPUs in parallel. You couldn't get that level of realism and quality on a GPU, even with the R300 and NV30 - you could get close, but not equal.
Originally posted by: KnightBreed
Originally posted by: Asuka
Image quality in software rendering is basically disregarded. Whereas, your GPU can do these calculations much faster.
This comment strikes me as odd. Why would software rendering disregard image quality? The CGI you see in movies and TV shows is done in software, using server farms of hundreds or thousands of CPUs in parallel. You couldn't get that level of realism and quality on a GPU, even with the R300 and NV30 - you could get close, but not equal.
A CPU can do everything a GPU can, albeit slowly, but modern GPUs can't do everything a CPU can.
It could be done if designers wanted to do so, but there would be absolutely no point. The whole reason GPUs were made in the first place is that CPUs weren't good enough. A GPU has the mathematical formulae and algorithmic stages of rendering hardwired into it. A CPU, on the other hand, has general-purpose binary logic, arithmetic, and control flow hardwired into it.
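To make the "hardwired stages" point concrete, here is a minimal sketch in C of one such stage done entirely in software on the CPU: transforming a few vertices by a 4x4 projection matrix and doing the perspective divide. The matrix and vertex values are made up purely for illustration and don't come from any real renderer; a GPU does this same multiply-add work in dedicated transform hardware, which is why it is so much faster at it. Nothing here uses a graphics API - it is just the arithmetic, which is the point: a CPU can run it, only more slowly and one element at a time.

```c
/* Minimal sketch: one "hardwired" GPU pipeline stage done in software.
   A GPU has dedicated hardware for this transform; a CPU just loops over
   the same multiply-adds, one vertex at a time. Data below is made up. */
#include <stdio.h>

typedef struct { float x, y, z, w; } Vec4;

/* Multiply a column vector by a 4x4 matrix stored row-major. */
static Vec4 transform(const float m[16], Vec4 v)
{
    Vec4 r;
    r.x = m[0]*v.x  + m[1]*v.y  + m[2]*v.z  + m[3]*v.w;
    r.y = m[4]*v.x  + m[5]*v.y  + m[6]*v.z  + m[7]*v.w;
    r.z = m[8]*v.x  + m[9]*v.y  + m[10]*v.z + m[11]*v.w;
    r.w = m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w;
    return r;
}

int main(void)
{
    /* A trivial "projection" matrix: w takes on -z, so the divide below
       shrinks distant points (illustrative only). */
    const float proj[16] = {
        1, 0,  0, 0,
        0, 1,  0, 0,
        0, 0,  1, 0,
        0, 0, -1, 0
    };
    Vec4 verts[3] = { {1, 1, -2, 1}, {-1, 1, -4, 1}, {0, -1, -8, 1} };
    int i;

    for (i = 0; i < 3; i++) {
        Vec4 p = transform(proj, verts[i]);
        /* Perspective divide: another step GPUs do in fixed hardware. */
        printf("vertex %d -> screen-space (%.3f, %.3f)\n",
               i, p.x / p.w, p.y / p.w);
    }
    return 0;
}
```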
Originally posted by: bizmark
Originally posted by: KnightBreed
Originally posted by: Asuka
Image quality in software rendering is basically disregarded. Whereas, your GPU can do these calculations much faster.
This comment strikes me as odd. Why would software rendering disregard image quality? The CGI you see in movies and TV shows is done in software, using server farms of hundreds or thousands of CPUs in parallel. You couldn't get that level of realism and quality on a GPU, even with the R300 and NV30 - you could get close, but not equal.
A CPU can do everything a GPU can, albeit slowly, but modern GPUs can't do everything a CPU can.
Yes, but here we're talking about real-time rendering for gaming and such. Give a CPU as much time as it needs to render a 3D scene, and it'll be as complex and as beautiful as you can imagine. But make a CPU, even a fast one, render 3D in real time, and it'll look like sh|t because it can only get so much done per unit time.
Originally posted by: KnightBreed
Well, duh. My point was, Asuka's comment was a bit misleading in that it implied that software rendering had inherently flawed image quality compared to "hardware rendering."
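To put bizmark's "so much per unit time" argument in rough numbers, here is a back-of-the-envelope sketch in C. The resolution (1024x768), frame rate (60 fps), and CPU clock (2GHz) are assumed figures chosen only for illustration, not benchmarks from the thread or from any real game.

```c
/* Back-of-the-envelope sketch of the real-time budget argument: at a
   fixed frame rate, a software renderer only gets a tiny per-pixel
   cycle budget. All figures below are assumptions for illustration. */
#include <stdio.h>

int main(void)
{
    const double width  = 1024.0;   /* assumed resolution  */
    const double height = 768.0;
    const double fps    = 60.0;     /* assumed frame rate  */
    const double cpu_hz = 2.0e9;    /* assumed 2GHz CPU    */

    double pixels_per_second = width * height * fps;
    double cycles_per_pixel  = cpu_hz / pixels_per_second;

    printf("Pixels to shade per second: %.0f\n", pixels_per_second);
    printf("CPU cycles available per pixel: %.1f\n", cycles_per_pixel);
    /* Roughly 42 cycles per pixel, before any transforms, texturing,
       or game logic have been paid for. */
    return 0;
}
```

Roughly 42 cycles per pixel is all that is left before you have done any geometry, texturing, or game logic, which is why a real-time software renderer has to cut corners on image quality in ways an offline render farm, with effectively unlimited time per frame, never does.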