solofly
Banned
- May 25, 2003
Originally posted by: apoppin
pity the fool who plays Doom III [period]
"extreme gfx" for an "extreme[ly bad"] video game:disgust:
:roll:
Can you make one better?
Originally posted by: Rollo
Originally posted by: apoppin
pity the fool who plays Doom III [period]
We all have different tastes, Doom3 was my favorite game in 2004.
he is living his sig . . .
Originally posted by: fierydemise
Once again Rollo shows his class by responding humorously to attention whores who try to call him out
Originally posted by: apoppin
did you finish it? Did it hold your interest
Originally posted by: Rollo
Originally posted by: apoppin
did you finish it? Did it hold your interest
I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.
ok, it was my least favourite game [i finished] - ever . . . compared to just about any of the other 'big' titles of '03-'05, it was banal and uninspiring
i loved the gfx and hated the game . . . i will just let it go . . . and agree to disagree, but accept that some gamers really liked it [admittedly . . . grudgingly . . . i DID like the fight with the Guardian . . . but the rest was such a boring retread of FPS games . . . ]
i can link you to my multitude of reasons for hating it . . . fortunately for me the memory of D3 is fading . . . and i'll drop it . . . now.
Originally posted by: Nirach
Well.
Gigabyte released a quad SLI board a while back. Or announced it.. Or something.
Buy that instead of that Dell crud. Why do you think it's got a flame design?
It's warning you.
Who'd manage to use 2gb of video memory anyhow?
Originally posted by: apoppin
Originally posted by: Nirach
Well.
Gigabyte released a quad SLI board a while back. Or announced it.. Or something.
Buy that instead of that Dell crud. Why do you think it's got a flame design?
It's warning you.
Who'd manage to use 2gb of video memory anyhow?
Don't SLI'd or CrossFired GPUs share video memory? In other words, isn't it still 512MB of effective vRAM?
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".
i believe the OP was not intending it except in jest. . . .
Where have I heard a statement like that before? Hmmm... let me see...
Who'd manage to use 2gb of video memory anyhow?
Originally posted by: Wolfshanze
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".
i believe the OP was not intending it except in jest. . . .
Where have I heard a statement like that before? Hmmm... let me see...
Who'd manage to use 2gb of video memory anyhow?
Bill Gates 1981:
"640K should be enough for anybody!"
Laugh now, but there will come a time when you need MORE than 2GB of video RAM.
Originally posted by: Rollo
Originally posted by: apoppin
did you finish it? Did it hold your interest
I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.
Originally posted by: Corporate Thug
Originally posted by: Rollo
Originally posted by: apoppin
did you finish it? Did it hold your interest
I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.
wow, that must be a lonely boat
Originally posted by: Rollo
Originally posted by: Corporate Thug
Originally posted by: Rollo
Originally posted by: apoppin
did you finish it? Did it hold your interest
I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.
wow, that must be a lonely boat
:roll:
Dude, buy a life at KMart already. :laugh:
I'm not going to sit here and debate why it is or isn't smart to like or not like Doom3, because it's pointless to question taste in art.
I basically only listen to TOOL, Cracker, and Classical music these days- it's a "lonely boat" in my peer group, but it doesn't mean country music is better because I know more people who like it.
I think David Lynch movies are the best I've ever seen, and most people I know have never seen them. Does that mean "Titanic", that everyone has seen is "better"?
Stop trying to equate taste in art with intelligence- you won't be so befuddled.
Originally posted by: Wolfshanze
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".
i believe the OP was not intending it except in jest. . . .
Where have I heard a statement like that before? Hmmm... let me see...
Who'd manage to use 2gb of video memory anyhow?
Bill Gates 1981:
"640K should be enough for anybody!"
Laugh now, but there will come a time when you need MORE than 2GB of video RAM.
Originally posted by: apoppin
Originally posted by: Nirach
Well.
Gigabyte released a quad SLI board a while back. Or announced it.. Or something.
Buy that instead of that Dell crud. Why do you think it's got a flame design?
It's warning you.
Who'd manage to use 2gb of video memory anyhow?
Don't SLI'd or CrossFired GPUs share video memory? In other words, isn't it still 512MB of effective vRAM?
You follow me around disagreeing with what I say and flaming me.
Originally posted by: Corporate Thug
now rollo, was that called for? did i say ANYTHING about art or intelligence? c'mon now. I didn't personally attack you and there is no reason for you to do so.
BTW, we don't have KMARTs in HOLLYWOOD.
Originally posted by: Rollo
You follow me around disagreeing with what I say and flaming me.
Originally posted by: Corporate Thug
now rollo, was that called for? did i say ANYTHING about art or intelligence? c'mon now. I didn't personally attack you and there is no reason for you to do so.
BTW, we don't have KMARTs in HOLLYWOOD.
In this thread, I am asking you to get a life beyond that. While I'm glad you're proud of your address, certainly a "mover and shaker" such as yourself has better things to do with their time than follow a guy from the Midwest around arguing with him?
I've never been to Hollywood, but I have to doubt it's considered "chic" or "upscale" to hang out on AT flaming people? Or is that what the "A List" people do these days?
Originally posted by: Nirach
Originally posted by: Wolfshanze
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".
i believe the OP was not intending it except in jest. . . .
Where have I heard a statement like that before? Hmmm... let me see...
Who'd manage to use 2gb of video memory anyhow?
Bill Gates 1981:
"640K should be enough for anybody!"
Laugh now, but there will come a time when you need MORE than 2GB of video RAM.
Oh, I know what you mean.
But by the time games, programs, whatever need 2GB of vRAM, that board and setup will be so dated even hobos will be able to use it. Even with the rapid progress games and hardware are making now, there's only so far technology can progress before things get dirt cheap.
Originally posted by: apoppin
Originally posted by: Nirach
Well.
Gigabyte released a quad SLI board a while back. Or announced it.. Or something.
Buy that instead of that Dell crud. Why do you think it's got a flame design?
It's warning you.
Who'd manage to use 2gb of video memory anyhow?
Don't SLI'd or CrossFired GPUs share video memory? In other words, isn't it still 512MB of effective vRAM?
I honestly don't know. I couldn't even begin to tell you how it works, I've never looked into it.
I just figured there'd be no point in producing a system with 2gb actual vram and have it use only a quarter of it. I was under the impression these cards worked as flat out as supposedly possible, hence the bigger vram option.
I probably assume incorrectly, because as I said before, it's not a field I have any experience with.
Rollo, or another of the SLI users from way back (I don't know any other usernames off the top of my head), would probably have more idea and be better equipped to tell you, and me (especially me), what SLI utilises.
I'm still betting money on stupid air temps, at the very least. I mean, four high-end graphics cards, a sh*t-hot (excuse the pun) processor, and god knows what else they have in there. I vaguely recall it being somewhat cramped, too...
Wait and see is my best course of action for that theory, though.
Originally posted by: keysplayr2003
Originally posted by: Nirach
Originally posted by: Wolfshanze
<snip>
It looks as though the visible fan feeds both cards of the assembly. If you notice, the fan sits way toward the rear of the card and not over the GPU, so I think the one fan feeds air to both cards of the unit and vents out the back of the case. This is probably a noisy setup if one fan has to move enough air to cool two cards. Just a guess, because I can't imagine the other three cards getting anywhere near enough air to cool themselves. What is there, 1mm of air space between one card and the other? This thing just might be a furnace.
I'm not sure, I only have a vague (drunken) memory of the internals. Although, with all that gear in it, it's not going to sit there at thirty-odd degrees idle; I will bet vital parts of my anatomy (yeah, idle bet) on it.
I want to know how many will actually catch fire. Special hand-painted flame design? My arse. Special crummy cooling and planning flame design.
Originally posted by: Rollo
As far as the shared memory goes, I know the GPUs can't share it across video cards (so it's 512MB effective).
Seriously?
Man. That sounds like a total gyp to me. Two cards, I'd go for that. But four? Daamn. More money than sense, IMO.
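For anyone skimming the thread, the point Rollo confirmed can be put as a quick sketch. This assumes the classic mirrored-resource model for SLI/CrossFire (each GPU keeps its own full copy of textures and buffers, so capacity doesn't add up across cards); the function name and numbers are illustrative, not from any vendor API:

```python
def effective_vram_mb(cards: int, vram_per_card_mb: int) -> int:
    """Usable VRAM under the mirrored model: every card stores the
    same data, so only one card's worth holds unique resources."""
    physical_mb = cards * vram_per_card_mb  # what the spec sheet advertises
    # Mirroring means the usable pool is one card's VRAM,
    # regardless of how many cards are installed.
    usable_mb = vram_per_card_mb
    return usable_mb

# The Dell quad-SLI box discussed above: advertised as 4 x 512MB = "2GB",
# but effectively ~512MB for unique data.
print(effective_vram_mb(4, 512))  # 512
```

So the "2GB" on the spec sheet is physical memory, not addressable capacity for the game.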
Originally posted by: RichUK
Lol, I wonder when a single GFX card (non-SLI, single-slot cooling) will be able to match this setup's performance. A year or two? Bah, it's overkill without the 30-inch Dell, which is its sole purpose, unless you're the old "minimum playable FPS aka 100 FPS max settings/res" and "3DMark" junkie.
Originally posted by: Rollo
Originally posted by: Wolfshanze
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".
i believe the OP was not intending it except in jest. . . .
Where have I heard a statement like that before? Hmmm... let me see...
Who'd manage to use 2gb of video memory anyhow?
Bill Gates 1981:
"640K should be enough for anybody!"
Laugh now, but there will come a time when you need MORE than 2GB of video RAM.
I took no offense at you lampooning my colorful style of writing.:beer:
thanks for the confirmation
Originally posted by: Rollo
As far as the shared memory goes, I know the GPUs can't share it across video cards (so it's 512MB effective).
Originally posted by: n7
Originally posted by: jimmypage13
2 GTXs and an '05 car? He needs to sell those GTXs and buy a new car!
No, we must all sell our cars & buy teh Dell quad SLI system!
For those of us w/o cars... i dunno.