Rollo's system sucks


solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: apoppin
pity the fool who plays Doom III [period]

"extreme gfx" for an "extreme[ly bad"] video game:disgust:
:roll:

Can you make one better?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rollo
Originally posted by: apoppin
pity the fool who plays Doom III [period]

We all have different tastes; Doom3 was my favorite game in 2004.

did you finish it? Did it hold your interest?

i am only speaking negatively of the game . . . ;)


. . . the D3 engine is AWEsome!
:thumbsup:

------------------

Originally posted by: fierydemise
Once again Rollo shows his class by responding humorously to attention whores who try to call him out
he is living his sig . . . ;)

. . . now
:thumbsup:

i am against these types of threads . . . but it IS "Video", after all
:roll:


make one about me . . . :p

i'll show mine
[and tattle to the mods in FI :p]

:laugh:

i believe the OP was not intending it except in jest. . . .

seriously - if that is possible . . . for me - there are people that are envious . . . and it shows. . . . heck, it's only HW . . . eventually everyone will be able to afford quad sli. ;)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin
did you finish it? Did it hold your interest?

I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rollo
Originally posted by: apoppin
did you finish it? Did it hold your interest?

I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.

ok, it was my least favourite game [i finished] - ever . . . compared to just about any of the other 'big' titles of '03-05 . . . it was banal and uninspiring

i loved the gfx and hated the game . . . i will just let it go . . . and agree to disagree but accept that some gamers really liked it [admittedly . . . grudgingly . . . i DID like the fight with the Guardian . . . but the rest was such a boring retread of FPS games . . . ]

i can link you to my multitude of reasons for hating it . . . fortunately for me the memory of D3 is fading. . . . and i'll drop it. . . . now.
:)
 

Nirach

Senior member
Jul 18, 2005
415
0
0
Well.

Gigabyte released a quad SLI board a while back. Or announced it... Or something.

Buy that instead of that Dell crud. Why do you think it's got a flame design?

It's warning you. Warning.

Who'd manage to use 2gb of video memory anyhow?
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Originally posted by: apoppin
ok, it was my least favourite game [i finished] - ever . . . compared to just about any of the other 'big' titles of '03-05 . . . it was banal and uninspiring

i loved the gfx and hated the game . . . i will just let it go . . . and agree to disagree but accept that some gamers really liked it [admittedly . . . grudgingly . . . i DID like the fight with the Guardian . . . but the rest was such a boring retread of FPS games . . . ]

i can link you to my multitude of reasons for hating it . . . fortunately for me the memory of D3 is fading. . . . and i'll drop it. . . . now.
:)

For me all those comments apply to HL2. D3 wasn't all that great either, but I at least never felt like giving it up halfway through. Far Cry and SCCT have been by far the best singleplayer games in the last two years in my opinion. I haven't played Riddick yet though, which looks good from the reviews I've seen. I need to snag a cheap copy of that from ebay.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nirach
Well.

Gigabyte released a quad SLI board a while back. Or announced it... Or something.

Buy that instead of that Dell crud. Why do you think it's got a flame design?

It's warning you. Warning.

Who'd manage to use 2gb of video memory anyhow?



Don't SLI'd or CrossFired GPUs share video memory? In other words, isn't it still 512MB effective vRAM?
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Originally posted by: apoppin
Originally posted by: Nirach
Well.

Gigabyte released a quad SLI board a while back. Or announced it... Or something.

Buy that instead of that Dell crud. Why do you think it's got a flame design?

It's warning you. Warning.

Who'd manage to use 2gb of video memory anyhow?


Don't SLI'd or CrossFired GPUs share video memory? In other words, isn't it still 512MB effective vRAM?

Technically, there is 2gb of ram... but effectively, I think, only 512mb is usable.
Then again, doesn't SLI have both cards display a section of the video image? So if that's split into 4 parts... there's a chance that maybe games are using ram from each card.

Hmm, that's a good question. We need a pro to answer this!

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
As I understand it, each card has its own framebuffer and cannot share video memory with another, so the information is duplicated on each card. At least that is the way dual-card SLI works. I am guessing that holds true for quad SLI as well. If it didn't, it wouldn't make any sense at all to have 4x 512MB of vRAM.
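
To put that duplication point into concrete numbers, here is a minimal sketch in Python. It assumes the replication model described above (each card holds a full copy of the working set); the function name and figures are purely illustrative, not anything from a real SLI API:

```python
# Minimal sketch of the memory arithmetic described above, assuming each GPU
# in an SLI group keeps its own full copy of textures and framebuffer data
# (the replication model). Card counts and per-card sizes are illustrative.

def sli_vram(cards, vram_per_card_mb):
    """Return (installed_mb, effective_mb) for an SLI group."""
    installed = cards * vram_per_card_mb   # what the spec sheet advertises
    effective = vram_per_card_mb           # duplicated data -> one card's worth
    return installed, effective

if __name__ == "__main__":
    for cards in (2, 4):                   # dual SLI vs. quad SLI
        installed, effective = sli_vram(cards, 512)
        print(f"{cards}x 512MB: {installed}MB installed, ~{effective}MB effective")
```

Under that assumption, quad SLI with 512MB cards reports 2048MB installed but still behaves like a 512MB card for texture budgets, which is the "2GB on the box, 512MB effective" point being debated in this thread.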
 

Wolfshanze

Senior member
Jan 21, 2005
767
0
0
i believe the OP was not intending it except in jest. . . .
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".

Who'd manage to use 2gb of video memory anyhow?
Where have I heard a statement like that before? Hmmm... let me see...

Bill Gates 1981:
"640K should be enough for anybody!"


Laugh now, but there will come a time when you need MORE than 2gb of video ram.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Wolfshanze
i believe the OP was not intending it except in jest. . . .
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".

Who'd manage to use 2gb of video memory anyhow?
Where have I heard a statement like that before? Hmmm... let me see...

Bill Gates 1981:
"640K should be enough for anybody!"


Laugh now, but there will come a time when you need MORE than 2gb of video ram.

I took no offense at you lampooning my colorful style of writing. :):beer:
 
Corporate Thug

Apr 17, 2003
37,622
0
76
Originally posted by: Rollo
Originally posted by: apoppin
did you finish it? Did it hold your interest?

I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.

wow, that must be a lonely boat
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Corporate Thug
Originally posted by: Rollo
Originally posted by: apoppin
did you finish it? Did it hold your interest?

I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.

wow, that must be a lonely boat

:roll:

Dude, buy a life at KMart already. :laugh:

I'm not going to sit here and debate why it is or isn't smart to like or not like Doom3, because it's pointless to question taste in art.

I basically only listen to TOOL, Cracker, and classical music these days - it's a "lonely boat" in my peer group, but it doesn't mean country music is better because I know more people who like it.

I think David Lynch movies are the best I've ever seen, and most people I know have never seen them. Does that mean "Titanic", which everyone has seen, is "better"?

Stop trying to equate taste in art with intelligence and you won't be so befuddled.

 
Corporate Thug

Apr 17, 2003
37,622
0
76
Originally posted by: Rollo
Originally posted by: Corporate Thug
Originally posted by: Rollo
Originally posted by: apoppin
did you finish it? Did it hold your interest?

I played every level but the final one. I reformatted my system before I did that and forgot to back up my save games.
Nonetheless, my favorite game of the year.

wow, that must be a lonely boat

:roll:

Dude, buy a life at KMart already. :laugh:

I'm not going to sit here and debate why it is or isn't smart to like or not like Doom3, because it's pointless to question taste in art.

I basically only listen to TOOL, Cracker, and classical music these days - it's a "lonely boat" in my peer group, but it doesn't mean country music is better because I know more people who like it.

I think David Lynch movies are the best I've ever seen, and most people I know have never seen them. Does that mean "Titanic", which everyone has seen, is "better"?

Stop trying to equate taste in art with intelligence and you won't be so befuddled.


now Rollo, was that called for? did i say ANYTHING about art or intelligence? c'mon now. I didn't personally attack you and there is no reason for you to do so.

BTW, we don't have KMARTs in HOLLYWOOD.

 

Nirach

Senior member
Jul 18, 2005
415
0
0
Originally posted by: Wolfshanze
i believe the OP was not intending it except in jest. . . .
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".

Who'd manage to use 2gb of video memory anyhow?
Where have I heard a statement like that before? Hmmm... let me see...

Bill Gates 1981:
"640K should be enough for anybody!"


Laugh now, but there will come a time when you need MORE than 2gb of video ram.

Oh, I know what you mean.

But by the time that games, programs, whatever, need 2gb of vram, that board and setup will be so dated even hobos will be able to use it. Even with the rapid progress that games and hardware are making now, there is only so far technology can progress before things get dirt cheap.

Originally posted by: apoppin
Originally posted by: Nirach
Well.

Gigabyte released a quad SLI board a while back. Or announced it... Or something.

Buy that instead of that Dell crud. Why do you think it's got a flame design?

It's warning you. Warning.

Who'd manage to use 2gb of video memory anyhow?



Don't SLI'd or CrossFired GPUs share video memory? In other words, isn't it still 512MB effective vRAM?


I honestly don't know. I couldn't even begin to tell you how it works; I've never looked into it.

I just figured there'd be no point in producing a system with 2gb of actual vram and having it use only a quarter of it. I was under the impression these cards worked as flat out as possible, hence the bigger vram option.

I probably assume incorrectly, because as I said before, it's not a field I have any experience with.

Rollo, or another of the (I don't know any other usernames off the top of my head) SLI users from way back, would probably have more of an idea and be better equipped to tell you, and me (especially me), what SLI utilises.

I'm still betting money on stupid air temps, at the very least. I mean, four high-end graphics cards, a sh*t-hot (Excuse the pun) processor, and god knows what else they have in there. I vaguely recall it being somewhat cramped, too...

Wait and see, I guess, is my best course of action for that theory though :D
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Corporate Thug

now Rollo, was that called for? did i say ANYTHING about art or intelligence? c'mon now. I didn't personally attack you and there is no reason for you to do so.

BTW, we don't have KMARTs in HOLLYWOOD.
You follow me around disagreeing with what I say and flaming me.

In this thread, I am asking you to get a life beyond that. While I'm glad you're proud of your address, certainly a "mover and shaker" such as yourself has better things to do with their time than follow a guy from the Midwest around arguing with him?

I've never been to Hollywood, but I have to doubt it's considered "chic" or "upscale" to hang out on AT flaming people? Or is that what the "A List" people do these days?
 
Corporate Thug

Apr 17, 2003
37,622
0
76
Originally posted by: Rollo
Originally posted by: Corporate Thug

now Rollo, was that called for? did i say ANYTHING about art or intelligence? c'mon now. I didn't personally attack you and there is no reason for you to do so.

BTW, we don't have KMARTs in HOLLYWOOD.
You follow me around disagreeing with what I say and flaming me.

In this thread, I am asking you to get a life beyond that. While I'm glad you're proud of your address, certainly a "mover and shaker" such as yourself has better things to do with their time than follow a guy from the Midwest around arguing with him?

I've never been to Hollywood, but I have to doubt it's considered "chic" or "upscale" to hang out on AT flaming people? Or is that what the "A List" people do these days?


how was i flaming you, buddy? all i said is that i don't think most people think DIII was the best game that year. I didn't attack your intelligence or your taste per se. Sorry if you took it in that manner.

I do disagree with some of your opinions, but i assure you, i do not stalk your threads for the purpose of "flaming" you. BTW, disagreeing and flaming are NOT the same thing. RELAX, don't be so defensive. Nvidia is doing just fine; you can take a breath.

to be honest, there isn't much to do ATM. I won't be going to the car show for another few minutes, so I am a little bored ;).

And yes, i am proud of where i live. But there is one catch: people always seem to hate on Hollywood because we see more money in a month than most see in a year. I guess that's one of the drawbacks I just have to put up with ;)

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
As far as the shared memory goes, I know the GPUs can't share it across video cards. (so it is 512 effective)
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Nirach
Originally posted by: Wolfshanze
i believe the OP was not intending it except in jest. . . .
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".

Who'd manage to use 2gb of video memory anyhow?
Where have I heard a statement like that before? Hmmm... let me see...

Bill Gates 1981:
"640K should be enough for anybody!"


Laugh now, but there will come a time when you need MORE than 2gb of video ram.

Oh, I know what you mean.

But by the time that games, programs, whatever, need 2gb of vram, that board and setup will be so dated even hobos will be able to use it. Even with the rapid progress that games and hardware are making now, there is only so far technology can progress before things get dirt cheap.

Originally posted by: apoppin
Originally posted by: Nirach
Well.

Gigabyte released a quad SLI board a while back. Or announced it... Or something.

Buy that instead of that Dell crud. Why do you think it's got a flame design?

It's warning you. Warning.

Who'd manage to use 2gb of video memory anyhow?



Don't SLI'd or CrossFired GPUs share video memory? In other words, isn't it still 512MB effective vRAM?


I honestly don't know. I couldn't even begin to tell you how it works; I've never looked into it.

I just figured there'd be no point in producing a system with 2gb of actual vram and having it use only a quarter of it. I was under the impression these cards worked as flat out as possible, hence the bigger vram option.

I probably assume incorrectly, because as I said before, it's not a field I have any experience with.

Rollo, or another of the (I don't know any other usernames off the top of my head) SLI users from way back, would probably have more of an idea and be better equipped to tell you, and me (especially me), what SLI utilises.

I'm still betting money on stupid air temps, at the very least. I mean, four high-end graphics cards, a sh*t-hot (Excuse the pun) processor, and god knows what else they have in there. I vaguely recall it being somewhat cramped, too...

Wait and see, I guess, is my best course of action for that theory though :D

It looks as though the visible fan feeds both cards of the assembly. If you notice, the fan sits way toward the rear of the card and not over the GPU, so I think the one fan feeds air to both cards of the unit and is vented out of the case through the back. This is probably a noisy setup if one fan has to move enough air to cool 2 cards each. Just a guess, because I can't imagine the other 3 cards getting anywhere near enough air to cool themselves. What is there, 1 mm of air space between one card and the next? This thing just might be a furnace.

 

Nirach

Senior member
Jul 18, 2005
415
0
0
Originally posted by: keysplayr2003
Originally posted by: Nirach
Originally posted by: Wolfshanze

It looks as though the visible fan feeds both cards of the assembly. If you notice, the fan sits way toward the rear of the card and not over the GPU, so I think the one fan feeds air to both cards of the unit and is vented out of the case through the back. This is probably a noisy setup if one fan has to move enough air to cool 2 cards each. Just a guess, because I can't imagine the other 3 cards getting anywhere near enough air to cool themselves. What is there, 1 mm of air space between one card and the next? This thing just might be a furnace.

I'm not sure, I only have a vague (Drunken) memory of the internals. Although, with all that gear in it, it's not going to be sat there at thirty-odd degrees idle; I will bet vital parts of my anatomy (Yeah, idle bet.) on it.

I want to know how many will actually catch fire. Special hand-painted flame design? My arse. Special crummy cooling and planning flame design.

Originally posted by: Rollo
As far as the shared memory goes, I know the GPUs can't share it across video cards. (so it is 512 effective)


Seriously?

Man. That sounds like a total gyp to me... two cards, I'd go for that. But four? Daamn. More money than sense IMO.
 

RichUK

Lifer
Feb 14, 2005
10,341
678
126
Lol, I wonder when a single GFX card (non-SLI, single-slot cooling) will be able to match this GFX setup's performance. A year or two? Bah, it's overkill without using the 30-inch Dell, which is its sole purpose, unless you are the old "minimum playable FPS aka 100 FPS at max settings/res" and "3DMark" junkie.
 

Nirach

Senior member
Jul 18, 2005
415
0
0
Originally posted by: RichUK
Lol, I wonder when a single GFX card (non-SLI, single-slot cooling) will be able to match this GFX setup's performance. A year or two? Bah, it's overkill without using the 30-inch Dell, which is its sole purpose, unless you are the old "minimum playable FPS aka 100 FPS at max settings/res" and "3DMark" junkie.


The people who buy that XPS probably have the money for at least one 30" :D
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rollo
Originally posted by: Wolfshanze
i believe the OP was not intending it except in jest. . . .
I'm the OP... this thread is a light-hearted roast of our favorite "Modern Man", Rollo. As I stated right off the bat, "no hard feelings".

Who'd manage to use 2gb of video memory anyhow?
Where have I heard a statement like that before? Hmmm... let me see...

Bill Gates 1981:
"640K should be enough for anybody!"


Laugh now, but there will come a time when you need MORE than 2gb of video ram.

I took no offense at you lampooning my colorful style of writing. :):beer:

it's becoming more colourful . . .

. . . i notice you are using more emoticons :p
:Q

:D

=============
Originally posted by: Rollo
As far as the shared memory goes, I know the GPUs can't share it across video cards. (so it is 512 effective)
thanks for the confirmation
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: n7
Originally posted by: jimmypage13
2 GTXs and an '05 car? He needs to sell those GTXs and buy a new car :)


No, we must all sell our cars & buy teh Dell quad SLI system!

For those of us w/o cars...i dunno :(

Blood, plasma, semen, organs....