HardOCP Crossfire review up


nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: John Reynolds
As Anthony "Reverend" Tan so succinctly stated a few years ago: "The problem with fanboys is that they don't know they are just that."

And the [H.] article has caused what one poster at [H.] has described as a mutiny, because it's come under such heavy criticism by the site's readers.

The problem with the "Holier Than Thou" types is they think the "sinners" care about their preaching.....

:evil:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: John Reynolds
Originally posted by: RobertR1
Since HardOCP is so anal about only testing cards that are available to the general consumer, why use the 7800GTX 512MB? I can't recall the last time there were 7800GTX 512s for sale except at launch. Surely, they're well aware of this.
Maybe because this isn't true? :roll:
While you couldn't say 512 GTX availability is "good", I've seen it go in and out of stock at least three times since launch at Newegg alone, not to mention other places. Also, I've read about it being at B&M stores and seen it in stock at high-end system assemblers.

So while I'd be hard-pressed to come up with a 512 GTX today, I bet I could do it if motivated.


Of course they are, and I tried pinning Kyle down to an honest answer at B3D when he talked about refusing to review the X1800 XT until it was available, warning him this standard would bite him on the ass big-time if he didn't follow it for every last piece of review kit his site receives in the future. For most of PC hardware's history, announcements have been followed by availability only later, and it's the height of foolishness to change your review publication timeframe because of one "hard" launch.
Did he give you the :roll: too? Perhaps say, "Gee John, maybe your opinion matters at SIMHQ, but I'll probably run my site the way I want to."

But, hey, we're talking about a site that proclaims to the world it won't be the bitch of any PR dept. and then turns around and posts comments critical of one company's part(s) that were actually written by a competitor's PR personnel, claiming them as its own words (and, yes, this did happen at [H ] last October).
LOL and you know this how? Did ATI PR send you an email? :roll:

Stop and think about just how wrong that is for a minute. For me, that was the worst piece of so-called online journalism I've ever seen*, the absolute most unprofessional act in all my years of web surfing, and my e-mail scolding the guilty person was, curiously enough, never replied to, though this person almost always replies to my e-mails (such as when I told him how to get AA working in EverQuest 2 for all graphics boards, something he could've taught himself with a simple Google search).

Errr, why are you taking my thread so very FAR OFF TOPIC with some bullshi* hissy fit you threw last year with Kyle? If I could put the "roll" emoticon at 30pt., you'd see it here for this self-indulgent, pompous display. If you don't like the current article we're discussing, why not tell us why?

Was it as bad as Driver Heaven's contest to most creatively destroy a 6800U, or was that your suggestion?

I used to respect you, I think I see pretty clearly what you're about now.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: nitromullet
Hmmmm, interesting reply.... That is exactly the issue: unless ATI can fix these issues at the driver level, the only other option is for Xfire owners to purchase new (expensive) gear in the hope that it fixes them. Thus far, SLI has not relied on having to purchase new cards to fix any of the issues. The v-sync issue and widescreen support were both resolved with new drivers for the 6- and 7-series cards. ATI, conversely, resolved the 1600x1200 60Hz limitation by improving the hardware. Anyone who owns dual X850XTs in an Xfire rig (does someone like this exist?) is stuck with that limitation.


This is the heart of the matter, and the catalyst of my post.

Efficient Super AA won't make up for the screen turning green unless it's at 60Hz, or for it flickering like a strobe light.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: KeepItRed
Originally posted by: Rollo
I have no problems whatsoever with X1800s, only X1800Crossfire.

I thought you said you're not going to buy the X1800 series. So... you bought one? :confused:

Didn't say I did? :confused:

I said I have no problems with X1800s and think they're a good deal at their price point.

E.g., if I had $500 to spend, I'd buy an X1800XT. If I had $420, I'd buy a 7800GTX. If I wanted dual cards, I'd ONLY consider 7800s.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
SLI FTW!!!

Honestly, there is nothing wrong with Crossfire except that it is about a year behind SLI.

Not to mention that ATI screwed themselves by launching Crossfire so late, because everyone on this forum using SLI (myself included) would have been a potential Crossfire user had SLI and Crossfire been available at the same time. I bet 50% of SLI users would have gone with Crossfire if it had been available a year ago.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rollo
Originally posted by: apoppin
Rollo when it comes to 'spinning' you are THE FUD-master
my hat's off to you
:roll:

apoppin:
I assure you my only intent with this post was to reinforce the findings of the users on the H forums 5150Joker linked us to, and the findings in the bizarre "Guru 3d" article.

I think there's enough smoke here to warrant people NOT spending $1000+ on a rig like this, not to mention the stuff that just is what it is, like the horrible default tiling mode that you can't change.

I have no problems whatsoever with X1800s, only X1800Crossfire. I'd feel bad misleading the board if I didn't put a negative slant on my take of that review. Even the guy who wrote it isn't thrilled with Crossfire as evidenced in his forum posts.

of course . . you are creating the smoke . . . your negative slant/conclusions are COMPLETELY DIFFERENT from the HardOCP review you linked to. :p
:thumbsdown:

as usual.

and IF i were getting a dual-gpu setup i'd have no problem with X-fire as a reasonable competitor to SLI IF i wanted the x1800 . . .
 

John Reynolds

Member
Dec 6, 2005
119
0
0
Originally posted by: Rollo
But, hey, we're talking about a site that proclaims to the world it won't be the bitch of any PR dept. and then turns around and posts comments critical of one company's part(s) that were actually written by a competitor's PR personnel, claiming them as its own words (and, yes, this did happen at [H ] last October).
LOL and you know this how? Did ATI PR send you an email?

You know, for someone who goes crying to the mods, your behavior is pretty deplorable. I'm replying to a post that's really little more than a defensive, personal attack against me over a post I wrote that in no way concerns you.

What I pointed out was Brent's news post. On Oct 7th last year he wrote on the front page:

The claim here is that the feature "Vertex Texture Fetch" is actually an optional feature in Shader Model 3.0 and this specific feature is not supported directly with the Radeon X1000 family. They then go on to explain how there is a workaround to enable the same result as doing a texture lookup from the vertex shader using pixel shaders. While the technique is very intriguing from a technical standpoint, the fact is that ATI lacks a major Shader Model 3.0 feature in hardware.

Vertex Texture Lookups are necessary for some effects such as displacement mapping, hardware raytracing, advanced hardware skinning, hardware-based collisions and soft body deformations, bone system generation, and many other areas of research developers are working on. To not include a very standardized feature means the progression of games utilizing these features will be dwarfed. In fact, there is indeed already one game that supports the "Vertex Texture Fetch" feature and is using it to provide better image quality with water.

The way the workaround works with the Radeon X1000 family requires game content developers to specifically write in special code just for ATI X1000 family hardware if they want to gain the same result as doing a texture lookup from the vertex shader. This requires more time and effort on their part to specifically support these kinds of features within the X1000 family. With NVIDIA's GeForce 6 and 7 series, however, this same result can easily be done officially through the supported Shader Model 3.0 feature in DirectX. This workaround that ATI has in place reminds us of their backdoor method of performing Geometry Instancing in R420 hardware, which was not officially supported in DirectX. We all know how well that turned out: basically no developers used it in their games, instead opting for the Shader Model 3.0 implementation, and one example is FarCry. CryTek added in a patch the ability to do geometry instancing using the Shader Model 3.0 implementation and not ATI's specific implementation that only works on their hardware and is not part of the DirectX standard.
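
(For readers trying to follow the technical claim above: the Shader Model 3.0 path being described is simply a texture lookup issued from inside the vertex shader. Below is a minimal, illustrative Direct3D 9 sketch of that path; the function name, texture, and HLSL fragment are assumptions made for illustration only, not code from the [H] post or from either vendor.)

#include <d3d9.h>

// Sketch: detecting and using native vertex texture fetch under Shader Model 3.0.
// GeForce 6/7 expose vertex texturing for a small set of formats (e.g. R32F), while
// the Radeon X1000 family does not support it directly, which is why the news post
// above talks about a pixel-shader workaround instead.
bool BindVertexDisplacementMap(IDirect3D9* d3d, IDirect3DDevice9* device,
                               UINT adapter, D3DFORMAT adapterFormat,
                               IDirect3DTexture9* heightTexture)
{
    // Ask the runtime whether an R32F texture can be sampled from the vertex shader.
    HRESULT hr = d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, adapterFormat,
                                        D3DUSAGE_QUERY_VERTEXTEXTURE,
                                        D3DRTYPE_TEXTURE, D3DFMT_R32F);
    if (FAILED(hr))
        return false;   // no native VTF; a vendor-specific fallback would go here

    // Bind the texture to a vertex texture sampler instead of a pixel sampler.
    device->SetTexture(D3DVERTEXTEXTURESAMPLER0, heightTexture);
    return true;
}

// Matching vs_3_0 HLSL (also illustrative): displace each vertex by a height value
// fetched from the texture bound above, e.g. for a water surface.
//
//   float4x4 WorldViewProj;
//   sampler2D HeightMap : register(s0);   // vertex texture sampler 0
//
//   float4 main(float4 pos : POSITION, float2 uv : TEXCOORD0) : POSITION
//   {
//       pos.y += tex2Dlod(HeightMap, float4(uv, 0, 0)).r;
//       return mul(pos, WorldViewProj);
//   }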

Now, anyone who reads Brent's articles knows full well, without a shred of doubt, that he did not write the above news post, yet he published those words as if they were his own. I confronted Brent in an e-mail, yet got no answer. We're talking about someone whose technical knowledge of graphics barely extends to the difference between a clock cycle and a rendering pass, yet there he is going on about workarounds for texture lookups, hardware skinning, raytracing, etc. I suggest everyone hit [H's] news archives, read the October 7th post for themselves, and make their own decision. To me, it unequivocally smacks of text straight from a PR pdf file. And I would be just as appalled if this were being done to AMD, Intel, NVIDIA, or any other company. Again, for not wanting to be anyone's PR bitch, the boys at [H ] sure don't seem to hew to that line very well. I mean, why would one of the big hardware sites take it upon themselves to refer to VTF as a "major" DirectX 9 feature without at least talking to a developer or two first, especially since that *major* feature has only been used in one game over the past 1.5 years? Not very objective, IMO.

Errr, why are you taking my thread so very FAR OFF TOPIC with some bullshi* hissy fit you threw last year with Kyle? If I could put the "roll" emoticon at 30pt., you'd see it here for this self-indulgent, pompous display. If you don't like the current article we're discussing, why not tell us why?

Who said it was a hissy fit? It was a conversation. And my points tie into the ongoing pattern of bias a certain site manifestly displays toward a certain company, so it's certainly relevant for discussing this article and some of its claims.

I used to respect you, I think I see pretty clearly what you're about now.

Pffft, and I've never had one ounce of respect for you. The AEG program member who came out behind the scenes a few weeks ago? Flat-out named you as a member of the program, yet only in the context of saying that you were an embarrassment to the program and he and other members want you kicked out of it. The person who passed that info on to me? Laughing their butts off about it because they don't have an ounce of respect for you either. Like I told you in the PMs you kept sending me, you're one of the most infamous fanboys on the web and you have absolutely no one to blame but yourself for it.

Now, if you want to talk about certain sites that're very biased for ATI, I could name a few. Would probably just tickle you pink and I'd be your #1 hero for the day, like when you PMed me at B3D all happy when I flamed an ATI fanboy for making stupid claims against the 512MB 7800 GTX. Funny how you pick 'n choose when to be happy with me and when to get all indignant, eh?
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: Rollo
Originally posted by: John Reynolds
Originally posted by: RobertR1
Since HardOCP is so anal about only testing cards that are available to the general consumer, why use the 7800GTX 512MB? I can't recall the last time there were 7800GTX 512s for sale except at launch. Surely, they're well aware of this.
Maybe because this isn't true? :roll:
While you couldn't say 512 GTX availability is "good", I've seen it go in and out of stock at least three times since launch at Newegg alone, not to mention other places. Also, I've read about it being at B&M stores and seen it in stock at high-end system assemblers.

So while I'd be hard-pressed to come up with a 512 GTX today, I bet I could do it if motivated.

If the availability were decent, companies like EVGA wouldn't have to jump through hoops to find alternatives to their Step-Up program. The street price would also not be $100+ over MSRP if there were plenty to go around. A motivated person could also have found an X800 XT PE a year ago, but that doesn't mean availability was decent. The word to describe the availability of both cards is "pitiful."


 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
John, simmer down. Also, will you provide a link to support your claim that an AEG program member named Rollo? I may disagree with Rollo, but I'm not going to believe he is a shill without very solid proof.
 

John Reynolds

Member
Dec 6, 2005
119
0
0
Originally posted by: fierydemise
John, simmer down. Also, will you provide a link to support your claim that an AEG program member named Rollo? I may disagree with Rollo, but I'm not going to believe he is a shill without very solid proof.

Sorry, I'm not going to air private correspondence such as e-mails or PMs. Rollo wants to get ugly and sling mud and have a real hissy fit over a post that wasn't directed at him? Well, he can take it in stride like a big boy when it comes back his way.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
John, it's a little tough to believe anything you say because most of your claims are based on private correspondence that you will not share (not saying this is a wrong decision; I would probably do the same thing). Calling someone a marketing shill is a pretty big accusation, and before I'd go around making accusations like that, I'd make sure I had solid proof that I felt comfortable sharing.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: fierydemise
John, simmer down. Also, will you provide a link to support your claim that an AEG program member named Rollo? I may disagree with Rollo, but I'm not going to believe he is a shill without very solid proof.

i'd like to see it too . . . i would think AEG had better judgement than to pick someone whose arguments are SO bad and SO one-sided they actually make ATi look good to unbiased people who read the counter arguments. ;)


:D

edit: i see you won't or can't . . . then please don't post what you can't back up.

And Rollo's conclusions are SO different from the review he quotes . . . i wonder if he just 'skimmed' it looking for crap to post about xfire.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: RobertR1
Originally posted by: John Reynolds
Me, I'm not a big fan of dual PEG solutions. Too much heat, noise, and cost for too little linear performance scaling, and too much to invest in a single generation of parts when the graphics industry is still on such a fast schedule. Let things slow down to an 18-month architecture cycle and I might reconsider.

These are the exact same reasons I do not prefer dual-GPU solutions at the present time or even recommend them to anyone.

While I have to agree that there IS more heat, noise, and power consumption with a dual-GPU solution than with a single card, I don't think I could agree with you that it's not recommended for "anyone". Seriously, what if someone has an LCD with a native resolution of 1920x1200 (or higher) and wants to play the latest games (such as FEAR and COD2) with full details, AA/AF, at that native resolution? A single-card solution doesn't cut it right now. We're on a wait-and-see approach for R580 and G71.

I've been running a dual-GPU rig for 12 months now. Granted, I don't fixate upon my electric bill to determine whether the extra power usage is breaking my bank account. Texas summers do that with my air conditioner with or without a dual-GPU rig. Extra heat? A good case with proper airflow mitigates that. Again, I'm using an overclocked A64 with 2 overclocked 6800GTs. No heat issue here. Noise? My speakers already have to compete with 2 other kids' computers in the study, as well as a TV in the next room, and a 6-month-old. Noise isn't an issue for me, but I could see it being one for others.

But to NOT recommend it to ANYone??? That's going a bit far. If anything, dual-GPU systems allow for a lot of flexibility in finding the right price/performance ratio. A good example would be the recent releases of the 512GTX and the X1800XT. Both were priced at around $600 (and even higher for the 512GTX), whereas a 7800GT SLI config would set you back the same amount yet provide more power. Price/performance at its best.

Sorry for the diatribe, but when I see people dismiss SLI (or Crossfire) across the board for everybody, I have to wonder how that person comes to such a conclusion. Yes, dual-GPU is mostly geared to those who utilize resolutions at or above 1600x1200; yes, it consumes more power, gives out more heat, and makes more noise. But if you have a motherboard capable of using such a solution, have a way to combat the heat, can put up with the noise, couldn't care less how much power it eats, desire to play the latest games at the highest resolutions with all the eye candy at smooth frame rates, and, most importantly, have enough coin for the purchase, then a dual-GPU solution is perfect for you.
 

John Reynolds

Member
Dec 6, 2005
119
0
0
Originally posted by: fierydemise
John, it's a little tough to believe anything you say because most of your claims are based on private correspondence that you will not share (not saying this is a wrong decision; I would probably do the same thing). Calling someone a marketing shill is a pretty big accusation, and before I'd go around making accusations like that, I'd make sure I had solid proof that I felt comfortable sharing.

Dude, if the shoe fits, wear it. Look at how he's acting in this thread, how he started the thread in such a one-sided manner. Look at his antics over the years on numerous message boards. Is it really that big a stretch to believe someone like Rollo would be a member of such a program? When I was told, my reply back to the sender was, "Not a surprise to me." In fact, I'd already told Rollo in a PM weeks ago that he'd been fingered, but I wasn't going to air it until he decided to get all pissy with me and act like an ass.

And perhaps you should be asking Rollo for proof of his accusation that I got my info from ATI PR, don't you think? Not that I really lose a minute's sleep over this drama. The crux of Rollo's screed against me is my daring to suggest that [H ] borrowed words from a competitor's PR dept. to criticize the X1800's lack of VTF. Again, I'd suggest everyone read that news post and, if you're unsure, go ahead and read a handful of Brent's articles/reviews and decide for yourself whether those were his own words in that post.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: John Reynolds
I used to respect you, I think I see pretty clearly what you're about now.

Pffft, and I've never had one ounce of respect for you. The AEG program member who came out behind the scenes a few weeks ago? Flat-out named you as a member of the program, yet only in the context of saying that you were an embarrassment to the program and he and other members want you kicked out of it. The person who passed that info on to me? Laughing their butts off about it because they don't have an ounce of respect for you either. Like I told you in the PMs you kept sending me, you're one of the most infamous fanboys on the web and you have absolutely no one to blame but yourself for it.

Oh noes! You and your B3d cronies don't "respect me"! :laugh:

I am nobody's "shill". I post what I want to post, my agenda is my own. I make no secret of knowing people at nVidia; they do not control or attempt to influence what I post.

Glad to see I got through to you, though. Are you foaming at the mouth? ;)


 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: deadseasquirrel
Originally posted by: RobertR1
Originally posted by: John Reynolds
Me, I'm not a big fan of dual PEG solutions. Too much heat, noise, and cost for too little linear performance scaling, and too much to invest in a single generation of parts when the graphics industry is still on such a fast schedule. Let things slow down to an 18-month architecture cycle and I might reconsider.

These are the exact same reasons I do not prefer dual-GPU solutions at the present time or even recommend them to anyone.

While I have to agree that there IS more heat, noise, and power consumption with a dual-GPU solution than with a single card, I don't think I could agree with you that it's not recommended for "anyone". Seriously, what if someone has an LCD with a native resolution of 1920x1200 (or higher) and wants to play the latest games (such as FEAR and COD2) with full details, AA/AF, at that native resolution? A single-card solution doesn't cut it right now. We're on a wait-and-see approach for R580 and G71.

I've been running a dual-GPU rig for 12 months now. Granted, I don't fixate upon my electric bill to determine whether the extra power usage is breaking my bank account. Texas summers do that with my air conditioner with or without a dual-GPU rig. Extra heat? A good case with proper airflow mitigates that. Again, I'm using an overclocked A64 with 2 overclocked 6800GTs. No heat issue here. Noise? My speakers already have to compete with 2 other kids' computers in the study, as well as a TV in the next room, and a 6-month-old. Noise isn't an issue for me, but I could see it being one for others.

But to NOT recommend it to ANYone??? That's going a bit far. If anything, dual-GPU systems allow for a lot of flexibility in finding the right price/performance ratio. A good example would be the recent releases of the 512GTX and the X1800XT. Both were priced at around $600 (and even higher for the 512GTX), whereas a 7800GT SLI config would set you back the same amount yet provide more power. Price/performance at its best.

Sorry for the diatribe, but when I see people dismiss SLI (or Crossfire) across the board for everybody, I have to wonder how that person comes to such a conclusion. Yes, dual-GPU is mostly geared to those who utilize resolutions at or above 1600x1200; yes, it consumes more power, gives out more heat, and makes more noise. But if you have a motherboard capable of using such a solution, have a way to combat the heat, can put up with the noise, couldn't care less how much power it eats, desire to play the latest games at the highest resolutions with all the eye candy at smooth frame rates, and, most importantly, have enough coin for the purchase, then a dual-GPU solution is perfect for you.

I'm playing FEAR on my X1800 XT at 1920x1200 with everything on maximum and 2x Adaptive AA, 16x HQ AF, and the gameplay is just fine in single player: 27 min, 43 avg, 94 high on the stress test. Perhaps FEAR multiplayer could benefit from more GPU power, but at this point I couldn't logically justify another $500 for an extra 2x/4x AA. Most people can't. Also, when you go to upgrade, you're now taking the depreciation hit for two cards rather than just one, which adds to your overall cost of ownership. While that might not matter to some, it will matter to others and is something that should be considered when giving an overall opinion. You should also look at the games you play.

If I buy SLI/Crossfire for the sake of one or two games that I'll play through one time, but the game I play 90% of the time runs just as well with one card, I can't justify another $500 for an extra 20 hours of gameplay with just more AAA.

Ultra-high-end single cards are quite powerful in their own right.


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: John Reynolds
In fact, I'd already told Rollo in a PM weeks ago he'd been fingered, but I wasn't going to air it until he decided to get all pissy with me and act like an ass.
He's not lying about that - he said I'd been fingered, and I responded "What's a focus group?" with a "razz" emoticon, told him I had a friend who works at nVidia, and said that I'd forward his name as a web reviewer for SIMHQ who would like to be considered for review parts. (Which I did - heh - and hissy John returns the favor by joining AT and flaming me! LOL)


And perhaps you should be asking Rollo for proof of his accusation that I got my info from ATI PR, don't you think?
It looks like you just made assumptions, based on your post?

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rollo
Originally posted by: John Reynolds
I used to respect you, I think I see pretty clearly what you're about now.

Pffft, and I've never had one ounce of respect for you. The AEG program member who came out behind the scenes a few weeks ago? Flat-out named you as a member of the program, yet only in the context of saying that you were an embarrassment to the program and he and other members want you kicked out of it. The person who passed that info on to me? Laughing their butts off about it because they don't have an ounce of respect for you either. Like I told you in the PMs you kept sending me, you're one of the most infamous fanboys on the web and you have absolutely no one to blame but yourself for it.

Oh noes! You and your B3d cronies don't "respect me"! :laugh:

I am nobody's "shill". I post what I want to post, my agenda is my own. I make no secret of knowing people at nVidia; they do not control or attempt to influence what I post.

Glad to see I got through to you, though. Are you foaming at the mouth? ;)
so "that's" your purpose and your agenda
:Q

then don't complain when it gets thrown right back at you ;)

and WHY is your FUD so different than the article's conclusion you link to?

Did you just 'skim' it looking for anti-xfire crap to post?

 

Conky

Lifer
May 9, 2001
10,709
0
0
I find it entirely believable that Rollo is part of AEG's "viral marketing" for Nvidia as he is consistently offering only Nvidia's products as a solution to any noob who asks for help. link to explanation of AEG
AEG link

Furthermore, his sig refers to "Reference 7800GTXs" and these aren't commonly available outside of Nvidia marketing channels.

I wouldn't even take interest in this normally but once he branded me as a fanboy I started taking notes. Rollo is hereby branded by me as an employee of AEG's viral marketing department, lol. :laugh:

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Crazyfool
I find it entirely believable that Rollo is part of AEG's "viral marketing" for Nvidia as he is consistently offering only Nvidia's products as a solution to any noob who asks for help. link to explanation of AEG
AEG link

Furthermore, his sig refers to "Reference 7800GTXs" and these aren't commonly available outside of Nvidia marketing channels.

I wouldn't even take interest in this normally but once he branded me as a fanboy I started taking notes. Rollo is hereby branded by me as an employee of AEG's viral marketing department, lol. :laugh:


Errrr, my cards come from nVidia themselves, Santa Clara return address? I've never made any secret of that, and I probably wouldn't have "reference 7800GTXs" in my signature (reference 6800GTs before that, and a reference 6800 before that), or have posted pics of them all, if I were trying to hide this?

:laugh:

 

Conky

Lifer
May 9, 2001
10,709
0
0
Originally posted by: Rollo
Originally posted by: Crazyfool
I find it entirely believable that Rollo is part of AEG's "viral marketing" for Nvidia as he is consistently offering only Nvidia's products as a solution to any noob who asks for help. link to explanation of AEG
AEG link

Furthermore, his sig refers to "Reference 7800GTXs" and these aren't commonly available outside of Nvidia marketing channels.

I wouldn't even take interest in this normally but once he branded me as a fanboy I started taking notes. Rollo is hereby branded by me as an employee of AEG's viral marketing department, lol. :laugh:


Errrr, my cards come from nVidia themselves, Santa Clara return address? I've never made any secret of that, and I probably wouldn't have "reference 7800GTXs" in my signature (reference 6800GTs before that, and a reference 6800 before that), or have posted pics of them all, if I were trying to hide this?

:laugh:
So, you admit you work for AEG?

:laugh:

 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Rollo

Errrr, my cards come from nVidia themselves, Santa Clara return address? I've never made any secret of that, and I probably wouldn't have "reference 7800GTXs" in my signature (reference 6800GTs before that, and a reference 6800 before that), or have posted pics of them all, if I were trying to hide this?

:laugh:

So you do get free cards from Nvidia then? You don't buy them yourself? Interesting that you have not specifically denied being involved with AEG. Just saying that you get cards straight from Nvidia doesn't mean you're not part of the AEG program. AEG participants could very well receive cards directly from Nvidia.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rollo
Originally posted by: Crazyfool
I find it entirely believable that Rollo is part of AEG's "viral marketing" for Nvidia as he is consistently offering only Nvidia's products as a solution to any noob who asks for help. link to explanation of AEG
AEG link

Furthermore, his sig refers to "Reference 7800GTXs" and these aren't commonly available outside of Nvidia marketing channels.

I wouldn't even take interest in this normally but once he branded me as a fanboy I started taking notes. Rollo is hereby branded by me as an employee of AEG's viral marketing department, lol. :laugh:


Errrr, my cards come from nVidia themselves, Santa Clara return address? I've never made any secret of that, and I probably wouldn't have "reference 7800GTXs" in my signature (reference 6800GTs before that, and a reference 6800 before that), or have posted pics of them all, if I were trying to hide this?

:laugh:

as i said, you couldn't work for AEG . . . imo - impossible.

You are so one-sided, you just go round and round in circles . . . spinning and spinning and refusing to be pinned down to a logical discussion. . . . that's NOT what AEG looks for.

i believe that you actually do ATi a service by posting such FUD . . . it is SO obvious, so unbelievable and ridiculous that this board is actually turning from almost 80% pro-nVidia last year to closer to 55% pro-ATi in just a few weeks . . .

. . . perhaps you are really a double agent posting for ATi . . . making so many ridiculous posts that people go the 'other way'. . . :p
:Q

thanks, i guess
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: apoppin
as i said, you couldn't work for AEG . . . imo - impossible.

You are so one-sided, you just go round and round in circles . . . spinning and spinning and refusing to be pinned down to a logical discussion. . . . that's NOT what AEG looks for.

i believe that you actually do ATi a service by posting such FUD . . . it is SO obvious, so unbelievable and ridiculous that this board is actually turning from almost 80% pro-nVidia last year to closer to 55% pro-ATi in just a few weeks . . .

. . . perhaps you are really a double agent posting for ATi . . . making so many ridiculous posts that people go the 'other way'. . . :p
:Q

thanks, i guess

Well what about...

Originally posted by: John Reynolds
....
The AEG program member who came out behind the scenes a few weeks ago? Flat-out named you as a member of the program, yet only in the context of saying that you were an embarrassment to the program and he and other members want you kicked out of it.
...

As for Rollo, I do think he is in AEG or some other marketing program for NVIDIA.