AMD being smart about who gets a review sample

Status
Not open for further replies.

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
It's absolutely 100% right that he got taken to task for that video. It's amazingly bad, and the comparisons he stated as fact were ludicrous even by the standards of actual corporate PR. If NV had published speculation that Fury was below the 980, they'd get rightly mocked. Pretending that he's qualified is beyond farce when he can go fifteen minutes without even accidentally being right about a product's performance.

However, it is also 100% not the place of the companies to do this, and to phrase it in a way that leaves it ambiguous what coverage will be deemed overly negative and punished financially through the loss of free samples and timely deliveries of product.

Unfortunately that leaves me without a good solution for how to keep that sort of inanity in check, but I know without a doubt that reviewers should not be beholden to the companies whose products they review.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Basically KitGuru pre-emptively decided to trash the R9 300 series as a joke. It turns out that the top card ran even with the GTX 980 and sometimes even beat it. In some games it was a 20% improvement over the 290X, something that can't be pinned solely on overclocking.

But KitGuru was lazy and didn't want to wait for hardware; they bashed it ahead of time.

And they turned out to be exactly right. The R9 390X is a factory overclocked rebrand that uses TWICE as much power as a GTX 980 for roughly the same level of performance. That's not an exaggeration, and it's not based on synthetic benchmarks. In a normal game (Crysis 2), the R9 390X uses up to 370W, compared to just 184W for the GTX 980. Basically, AMD listened to all the complaints about Hawaii regarding power consumption... and gave a big fat middle finger to the complainers, cranking it up to 11. Just like the 220W FX processors that can't even compete with a 95W Intel. It seems to me that AMD has some serious cultural problems about rebranding and excessive power usage. They need to address that, rather than attacking review sites. They can cut off the pipeline of review samples, but they can't make anyone buy their products.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Slow your roll there fella. :cool:
When someone disagrees with every single solitary point of another, that leaves room for interpretation. RS did not agree with a single point the KitGuru guy made. That is impossible. There were 13-plus minutes of video, the guy covered many topics, and some were a little out there while some were pretty spot on. RS came down on everything the guy said.
It was more or less a huge excuse for AMD not sending KitGuru a card.

That is not what I said. I just highlighted the errors in his review. Of course there is some truth to his statements: some R9 200 series cards are rebadged HD 7000 series cards, some console games are limited to 720p, and some R9 200 cards aren't worth buying over Maxwell. But that's not how he phrased his preview.

And they turned out to be exactly right. The R9 390X is a factory overclocked rebrand that uses TWICE as much power as a GTX 980 for roughly the same level of performance. That's not an exaggeration, and it's not based on synthetic benchmarks.

That's 1 sample of 1 brand. Other review sites do not show an R9 390X using 370W of power. It could be that they got a really bad batch of MSI Gaming 390X cards, or it could be that most MSI Gaming R9 390X cards are unnecessarily overvolted out of the box, since they do ship with 1100MHz clocks. Can we correlate the MSI Gaming 390X directly to other after-market 390X cards that may have a different set of voltages and GPU clocks? We need more data.

It looks like you aren't reading reviews carefully, though, because there are plenty of reviews that do not show the 390X using 2X the power of a 980. It wasn't unusual to see some AIBs ship HD7950 925MHz V2 cards at 1.256V while my 7970 cards shipped with 1.174V. Different vendors do things differently.

Just like we don't judge the entire 980 stack of GPUs based on the Gigabyte G1 980's power usage, we shouldn't judge the entire R9 390 stack by MSI's Gaming series of cards. I do remember how you ignore the Gigabyte G1 980 results, quickly dismissing them as "Gigabyte screwed things up," but you sure are quick to judge the entire 390 series by 1 MSI Gaming card. Doesn't make you appear objective in your posts, to be honest.

I've noticed in your posts you tend to find the most negative things about AMD cards and completely ignore the other reviews which contradict your data. Why? I am not saying that the 390X is some amazing videocard (it's not), but to imply that ALL R9 390X cards use 370W of power just because that specific MSI Gaming 390X at TPU did is ludicrous.

TechSpot tested HIS IceQ2 390/390X cards and the results are completely different. Why are you ignoring those data points? Because they contradict your theme that 390X uses 2X the power of a 980?

[Attached images: Power_01.png, Power_02.png, Power_03.png]


Don't you think it's more reasonable to wait for more reviews from other sites, tests of other vendor cards too? Instead, you are quick to judge the entire 390/390X series by 1 single review data point from TPU.

BTW, can you type out your GPU history since the HD 4000 series? I openly state that I don't care about perf/watt unless it directly translates into more flagship performance, which is why I had GTX 470 SLI max overclocked and didn't buy HD 5850s. What's your GPU upgrade path exactly?

You keep going on and on about perf/watt and power usage, so I presume you had to own HD 4000, 5000 and 6000 series cards, since they beat NV in perf/watt for 3 consecutive generations. Also, if perf/watt mattered such a great deal, there is absolutely no way you waited for the GTX 400 series instead of buying an HD 5000 card in the 6 months before Fermi even dropped. Millions of NV owners waited, though, and now perf/watt is the most important metric in the universe: so important that they recommended a 960 over a 50% faster R9 290 with double the VRAM for $50 more, and a 750/750Ti over the 30-45% faster R9 270/270X that cost just $20-30 more.
 
Last edited:

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
He did constantly suggest that "things remain to be seen" and "I could be wrong" throughout the video.

That KitGuru Fiji speculation video is just atrocious. At first I was a bit concerned about AMD playing favorites, but after seeing that, this seems justifiable.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
And they turned out to be exactly right. The R9 390X is a factory overclocked rebrand that uses TWICE as much power as a GTX 980 for roughly the same level of performance. That's not an exaggeration, and it's not based on synthetic benchmarks. In a normal game (Crysis 2), the R9 390X uses up to 370W, compared to just 184W for the GTX 980. Basically, AMD listened to all the complaints about Hawaii regarding power consumption... and gave a big fat middle finger to the complainers, cranking it up to 11. Just like the 220W FX processors that can't even compete with a 95W Intel. It seems to me that AMD has some serious cultural problems about rebranding and excessive power usage. They need to address that, rather than attacking review sites. They can cut off the pipeline of review samples, but they can't make anyone buy their products.

I too use the highest outlier for the power consumption of an entire line of products, rather than the HIS card Techspot got, which pulls the same wattage as a 290X and cannot get past 333W for an entire system. Oh nooo, it pulls 40W more than a 980; it just lit my eyebrows on fire and caused a miniature Chernobyl in my case, which is sagging under the heat.

No wait, actually they managed to deliver 5% higher clocks, 20% higher memory clocks and twice the memory of the last generation for no increase in wattage, when smaller clock and memory boosts with no memory increase cost NV a 30W increase going from the 680 to the 770.

Good job!
 
Last edited:

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Utter nonsense. This very forum is absolutely convinced the 200 series tanked because of the poor thermal and noise performance of the reference design showcased in multiple reviews, even though buying a reference design was hard a year ago, much less today.

You can't read three posts without someone uttering the "perception is king" mantra.

But hey, AMD should allow someone to slit their throats, because reasons. Right?

The reference 200 series was dreadful from an ergonomics standpoint. That was not the opinion of a couple rogue biased reviews, it was universally panned because it deserved it. If the product is terrible, it should be ripped.

There are more than enough major sites doing reviews that readers can fact-check and analyze benchmarks among them and pick out questionable results. For that reason, the benchmarks cannot be intentionally fudged to any significant extent before the online community starts asking questions. So if 2 sites have similar benchmark results, and one trashes the product and the other praises it, it is up to the reader to decide which site is to be believed.

If the Fury X is slightly in front of the 980 Ti in benchmarks, and it is likely near silent with the water cooling, making the power usage rather irrelevant, it's not like KitGuru could rip the product in the conclusion without looking like a total tool to anyone who can understand the data from the rest of the review.

If the review is negative but the data does not support the negative conclusion, it does not reflect badly on the product, it reflects poorly on the reviewer.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
He also mentions how the reduction in power usage from HBM will mean AMD allocates more of the power budget towards the ASIC/GPU itself, which he says is worrying to him. This is actually a good thing, not a bad thing, since it allows AMD to make a much larger GPU and cram in more transistors: once you have enough memory bandwidth, you should spend the extra power on the component that makes the product faster, and that is the GPU. How he managed to put a negative spin on this is amazing.

Not everyone wants a blast furnace crammed into their case. This is probably why he was saying that AMD's insistence on cannibalizing all process improvements to roll back into MOAR POWER is worrisome. To be fair, the Fury Nano does indicate that AMD recognizes not everyone wants absolute top performance at all costs, and that many people care about perf/watt or are working within a modest TDP limit. On the other hand, the R9 390X (370W gaming, 424W FurMark: much worse than Fermi) is an absurd and even obscene product, much like the FX-9590: outdated silicon made to run way past its limits to camouflage the company's inability to provide a competitive design. AMD should be embarrassed to have ever released either one.

and he makes uneducated statements about how the entire R9 200 series is a re-badge of the HD 7000 series from 2011. The only cards that were paper launched in 2011 were the 7970/7950, but no HD 7700/7800/7900 series card was ever on sale in 2011; they only launched in Q1 2012, which makes them 3 years old, not 4. Another loss of credibility from fudging data.

Yes, he should have been a bit more clear (and he also seems to think Tahiti was being rebadged again for the 300 series, which thankfully is not true now that they've got Tonga). But while you can nitpick around the edges, the fact remains that of the five ASICs used in the retail 200 series cards in 2013, four were rebadged (Cape Verde, Bonaire, Pitcairn, Tahiti) and only one was new (Hawaii). Tonga was also new, but that was released in late 2014 and really didn't belong in the 200 series at all, which is why it got such an odd numerical designation.

He also talks about how none of the R9 200 series was competitive at all, which is utter rubbish, since the R9 270/270X smoked the 750/750Ti in gaming, and the same goes for the 280X and 290 vs. the 960 2GB. He also didn't even mention how in the UK one could get much cheaper AMD cards than NV, which made a lot of them worth buying.

Some of the 200 series cards were competitive, especially during the Kepler era. During that time, they didn't fall far behind in efficiency, and defeated Nvidia products in perf/$. But Maxwell absolutely demolished them, leaving nothing for AMD to do but cut prices to the bone (and later, rebadge in hopes of tricking some buyers into thinking they were getting something new).

Your specific comparisons are flawed. 270/270X requires at least one six-pin power connector (and more in the case of the 270X), while 750/750 Ti requires no power connector. That means the two cards are in separate classes and shouldn't really be compared to each other. For someone who has an OEM system with a crappy 300-350W PSU that lacks PCIe connectors, the 750 Ti is a realistic option, but Pitcairn cards aren't. Someone doing a similar upgrade with AMD would have to resort to a much worse performing Cape Verde card. (The OEM R9 360 Bonaire card fits under 75W, but for some reason, the consumer R7 360 doesn't.)

Likewise, in terms of die size and TDP, it's inappropriate to compare GM206 (GTX 960) to much larger, more power-hungry chips like Tahiti and Hawaii. The appropriate comparison is between GM206 and Pitcairn, and Pitcairn gets absolutely demolished in that contest (not surprising, given that it's a 2012 design up against a 2015 design).

And he wrote off the entire AMD CPU stack in his GPU preview of AMD's new cards.

Well, AMD put an Intel CPU in their Project Quantum PC. The Bulldozer family has been described by high-ranking AMD employees as an "unmitigated failure". Seems to me that even AMD has essentially written off their current CPU stack. With the possible exception of Carrizo (assuming it appears in anything but cheap craptops), there won't be anything worth seeing on that front until Zen.
 

Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
And they turned out to be exactly right. The R9 390X is a factory overclocked rebrand that uses TWICE as much power as a GTX 980 for roughly the same level of performance. That's not an exaggeration, and it's not based on synthetic benchmarks. In a normal game (Crysis 2), the R9 390X uses up to 370W, compared to just 184W for the GTX 980. Basically, AMD listened to all the complaints about Hawaii regarding power consumption... and gave a big fat middle finger to the complainers, cranking it up to 11. Just like the 220W FX processors that can't even compete with a 95W Intel. It seems to me that AMD has some serious cultural problems about rebranding and excessive power usage. They need to address that, rather than attacking review sites. They can cut off the pipeline of review samples, but they can't make anyone buy their products.

This has already been covered, so I don't know why you still harp on it, but the MSI model you point to in techpowerup is designed for overclocking and by default uses a lot more power (over 50 W more) than other 390x models with virtually no performance increase. It's a power consumption figure specific to that one model and is not indicative of 390x power consumption in general.

E.g., the HIS model runs slightly overclocked as well, but only uses about 30-40W more than a 980. Obviously the 980 is still a more power-efficient design, but it's significantly closer now than it was before.

[Attached image: Power_01.png]
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
The reference 200 series was dreadful from an ergonomics standpoint. That was not the opinion of a couple rogue biased reviews, it was universally panned because it deserved it. If the product is terrible, it should be ripped.

There are more than enough major sites out doing reviews, that readers can fact check and analyze benchmarks among them and pick out questionable results. For that reason, the benchmarks cannot be intentionally fudged to any significant extent before the online community starts asking questions. So if 2 sites have similar benchmark results, and one trashes the product and the other one praises it, it is up to reader to decide which site is to be believed. If the Fury X is slightly in front of the 980Ti in benchmarks, and it is likely near silent with the water cooling, making the power usage rather irrelevant, it's not like KitGuru could rip the product in the conclusion without looking like a total tool to anyone who can understand the data results from the rest of the review.

If the review is negative but the data does not support the negative conclusion, it does not reflect badly on the product, it reflects poorly on the reviewer.

Well, [H] did that in their "how much (6GB) VRAM is necessary for a GPU at 4K" article.
The conclusion was that 6GB is the minimum at 4K, but their results showed 4GB cards beating 12GB cards.
FUD by definition doesn't work on known and proven facts, but on the possibility of something unprovable occurring somewhere in the future.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
This has already been covered, so I don't know why you still harp on it, but the MSI model you point to in techpowerup is designed for overclocking and by default uses a lot more power (over 50 W more) than other 390x models with virtually no performance increase. It's a power consumption figure specific to that one model and is not indicative of 390x power consumption in general.

TPU has, by far, the best database of power consumption figures for GPUs (going back several generations and covering all aspects of GPU usage), and their figures, unlike those from many other sites, actually isolate the card's power consumption rather than using guesswork based on whole-system power usage figures measured at the wall. Thus, these are the only power usage numbers I generally follow.
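To see why at-the-wall figures involve guesswork, consider that a wall reading folds in PSU conversion losses and every other component's draw, both of which you have to assume before you can back out a card-only number. Here is a minimal sketch with made-up numbers (the `estimate_card_power` helper and all the wattage and efficiency values are illustrative assumptions, not data from any review):

```python
# Illustration only: all numbers below are assumptions, not review data.
# A wall-socket reading includes PSU conversion losses plus every other
# component, so isolating the card requires guessing both quantities.

def estimate_card_power(wall_watts, psu_efficiency, rest_of_system_watts):
    """Back out an estimated card-only draw from a whole-system wall reading."""
    dc_total = wall_watts * psu_efficiency    # DC power the PSU actually delivers
    return dc_total - rest_of_system_watts    # remainder gets attributed to the GPU

# The same 333 W wall reading yields different "card" numbers depending on
# which PSU-efficiency and rest-of-system assumptions you plug in:
print(round(estimate_card_power(333, 0.90, 110), 1))  # 189.7
print(round(estimate_card_power(333, 0.80, 90), 1))   # 176.4
```

The point being that two perfectly reasonable sets of assumptions give noticeably different answers from the same wall reading, which is why card-isolated measurements are preferable.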

AMD chose not to send out 300 series review samples, which meant they were at the mercy of their AIBs. TPU's W1zzard says in the comment thread that he will be testing a PowerColor 390X PCS+ soon. I guess we'll see if that has more reasonable figures or if the outrageous numbers shown on the MSI card are indeed representative.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Not everyone wants a blast furnace crammed into their case. This is why he was probably saying that AMD's insistence on cannibalizing all process improvements to roll back into MOAR POWER is worrisome. To be fair, the Fury Nano does indicate that AMD recognizes not everyone wants absolute top performance at all costs, and that many people care about perf/watt and are working within a modest TDP limit. On the other hand, the R9 390X (370W gaming, 424W FurMark: much worse than Fermi) is an absurd and even obscene product, much like the FX-9590: outdated silicon made to run way past its limits to camouflage the inability of the company to provide a competitive design. AMD should be embarrassed to ever have released either one.

I take it you ignored my post where I showed how flagrantly wrong you are? Thanks!

Yes, he should have been a bit more clear (and he also seems to think Tahiti was being rebadged again for the 300 series, which thankfully is not true now that they've got Tonga). But while you can nitpick around the edges, the fact remains that of the five ASICs used in the retail 200 series cards in 2013, four were rebadged (Cape Verde, Bonaire, Pitcairn, Tahiti) and only one was new (Hawaii). Tonga was also new, but that was released in late 2014 and really didn't belong in the 200 series at all, which is why it got such an odd numerical designation.

Weird how he keeps harping on 2011, and even weirder how he keeps ranting about the age of the products as if that has anything to do with their performance, rather than actually commenting on their performance or pricing.

Your specific comparisons are flawed. 270/270X requires at least one six-pin power connector (and more in the case of the 270X), while 750/750 Ti requires no power connector. That means the two cards are in separate classes and shouldn't really be compared to each other. For someone who has an OEM system with a crappy 300-350W PSU that lacks PCIe connectors, the 750 Ti is a realistic option, but Pitcairn cards aren't. Someone doing a similar upgrade with AMD would have to resort to a much worse performing Cape Verde card. (The OEM R9 360 Bonaire card fits under 75W, but for some reason, the consumer R7 360 doesn't.)

Likewise, in terms of die size and TDP, it's inappropriate to compare GM206 (GTX 960) to much larger, more power-hungry chips like Tahiti and Hawaii. The appropriate comparison is between GM206 and Pitcairn, and Pitcairn gets absolutely demolished in that contest (not surprising, given that it's a 2012 design up against a 2015 design).

Somehow I don't think the market is entirely composed of people incapable of changing power supplies and stuck with bad OEM ones. It's even funnier that in some of those comparisons you can get equal performance from AMD plus a brand-new power supply to run it for the price of the NV card, and better PSUs have value in other regards. Heck, compared to a lot of OEM firecrackers, AMD plus a new PSU would save power and reduce the chance of a dead computer. Talk about a bargain!

TPU has, by far, the best database of power consumption figures for GPUs (going back several generations and covering all aspects of GPU usage), and their figures, unlike those from many other sites, actually isolate the card's power consumption rather than using guesswork based on whole-system power usage figures measured at the wall. Thus, these are the only power usage numbers I generally follow.

AMD chose not to send out 300 series review samples, which meant they were at the mercy of their AIBs. TPU's W1zzard says in the comment thread that he will be testing a PowerColor 390X PCS+ soon. I guess we'll see if that has more reasonable figures or if the outrageous numbers shown on the MSI card are indeed representative.

So that's twice now that you've ignored the HIS card Techspot got, which as part of an entire system draws much less than that MSI card?
 

kawi6rr

Senior member
Oct 17, 2013
567
156
116
And they turned out to be exactly right. The R9 390X is a factory overclocked rebrand that uses TWICE as much power as a GTX 980 for roughly the same level of performance. That's not an exaggeration, and it's not based on synthetic benchmarks. In a normal game (Crysis 2), the R9 390X uses up to 370W, compared to just 184W for the GTX 980. Basically, AMD listened to all the complaints about Hawaii regarding power consumption... and gave a big fat middle finger to the complainers, cranking it up to 11. Just like the 220W FX processors that can't even compete with a 95W Intel. It seems to me that AMD has some serious cultural problems about rebranding and excessive power usage. They need to address that, rather than attacking review sites. They can cut off the pipeline of review samples, but they can't make anyone buy their products.

LMAO! Try looking at more than one card; this is too funny.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Their biggest release this year is a card that offers performance that has been available for the better part of 6 months. Damage control was inevitable.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Great post.
But it is nothing compared to a popular Polish review site (you know which one) that now has 2 Project Cars results and a whole 6 tests of The Witcher 3 in their suite.
Out of 20 tests, only 4 are not NV-sponsored titles. :D

Who gave them review samples!?
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Thread cleaned again.

The next ones to derail this thread are getting the ban hammer.

-- stahlhart
 

omek

Member
Nov 18, 2007
137
0
0
He did constantly suggest that "things remain to be seen" and "I could be wrong" throughout the video.

Same as saying "void if" or "could cause birth defects". It's a scapegoat, something he can point at to exempt himself if/when he ends up being wrong. It's fairly easy to see.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
That's 1 sample of 1 brand. Other review sites do not show an R9 390x using 370W of power. It could be that they have gotten a really bad batch of MSI Gaming 390X cards or it could be that most MSI Gaming R9 390X cards are unnecessarily overvolted out of the box since they do ship with 1100mhz clocks. Can we correlate MSI Gaming 390X directly to other 390X after-market cards that may have a different set of voltages and GPU clocks? We need more data. Looks like you aren't reading reviews carefully though because there are plenty of reviews that do not show 390X using 2X the power of a 980. It wasn't unusual to see some AIBs clock HD7950 925mhz V2 cards at 1.256V while my 7970 cards shipped with 1.174V. Different vendors do things differently. Just like we don't judge the entire 980 stack of GPUs based on Gigabyte G1 980's power usage, we shouldn't judge the entire R9 390 stack by MSI Gaming series of cards. I do remember how you ignore the Gigabyte G1 980 results as you are quick to dismiss them as "Gigabyte screwed things up" but you sure are quick to judge the entire 390 series of cards by 1 MSI Gaming card. Doesn't make you appear objective in your posts to be honest.

Yes, it's entirely possible that MSI screwed things up here. The difference is that, unlike with the GTX 980, there's no reference card (and we know the reason why: AMD doesn't have an adequate cooler appropriate for a card of that range). So there's no "gold standard" to compare it to. That said, TPU will be testing a PowerColor R9 390X soon (according to W1zzard's feedback in that review's comment thread), so we should see before long whether the MSI results are an aberration. If the PowerColor results are significantly lower, then I agree they would probably be more representative of performance.

Ultimately, AMD chose not to send out review samples for the 300 series cards. Had they done so, they could have gotten out ahead of the narrative by choosing to send cards that offered a good balance of performance, power usage, temperature, and noise. Since they did not do so, the reviews are coming out in a piecemeal fashion, and are all over the board. If AMD doesn't like that, they have only their own abysmal marketing department to blame.

TechSpot tested HIS IceQ2 390/390X cards and the results are completely different. Why are you ignoring those data points? Because they contradict your theme that 390X uses 2X the power of a 980?

Because those reviews cite "system power consumption", and I ignore such figures whenever possible. I want to see how much power the video card uses, not how hard the game is working the CPU.

BTW, can you type out your GPU history since HD4000 series. I openly state that I don't care about perf/watt unless it directly translates into more flagship performance, which is why I had GTX470 SLI max overclocked and didn't buy HD5850s. What's your GPU upgrade path exactly? You keep going on and on about perf/watt and power usage so I presume you had to own HD4000, 5000 and 6000 series as they beat NV in perf/watt for 3 consecutive generations.

Currently I'm using a PowerColor 7870 that I bought during the mining craze for $179. Undervolted, its power draw is only about 120W even with FurMark. At about the same time I also bought a HIS IceQ 7870 as a secondary mining card, though I'm not currently using it for anything.

Prior to that, I had a Sapphire Ultimate 7750 (fanless) for a while. Before that, an XFX 5670. And prior to that, integrated graphics on an AM2 chipset. At this time, I was using a small form factor case (Silverstone SG02) with an Antec EarthWatts (Delta) 380W power supply, so power efficiency was specifically in mind when selecting these cards. I stepped up to the Pitcairns after I switched to a Fractal Design Define R4 case (which I'm starting to get tired of now).

I may buy a FirePro W7100 this month; with the 50%-off offer that AMD has active, the price would be only $325, which isn't bad at all for a mid-tier pro card. I would have bought a Quadro instead if Nvidia offered one with the GM206 or GM204 GPU, which they don't. As it is, the only Maxwell Quadros are the low-end K2200 (GM107), one or two cut down even further, and the ultra-high-end M6000 (GM200).
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Nvidia do this too.

Infraction issued for thread crapping, and banned for one day for derailing thread.
-- stahlhart
 
Last edited by a moderator:

el etro

Golden Member
Jul 21, 2013
1,584
14
81
The 390X power consumption numbers differ wildly from one review to another. I guess someone is wrong here.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
The way I see it, since review samples are limited, AMD should be allowed to prioritize which review sites get a review sample. That priority "list" should be set at AMD's own discretion.

Too bad AMD's PR is horrible at conveying a politically sound reason (PR is a metagame within the industry) for not giving KitGuru a review sample for a launch-day review.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Same as saying "void if" or "could cause birth defects". It's a scapegoat, something he can point at to exempt himself if/when he ends up being wrong. It's fairly easy to see.

I thought that was obvious. He was offering his opinions and speculations but said, "I could be wrong". Doesn't sound like a scapegoat. Sounds like he is saying that he isn't perfect and could be incorrect.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
I thought that was obvious. He was offering his opinions and speculations but said, "I could be wrong". Doesn't sound like a scapegoat. Sounds like he is saying that he isn't perfect and could be incorrect.
Isn't that the same as lawyers who get out a damning question or statement and immediately recant it, and the judge orders it stricken from the record?

But the damage has been done; the jurors have heard it.

Saying "I could be wrong" after the fact doesn't excuse much.

AMD is being very smart about it.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
I finally saw the KitGuru video. Holy cow! That thing was straight-up biased. Even PcPer ain't that stupid to bash AMD that much. You really have to play the metagame here.

I don't think it really matters which company it is. If a company was getting bashed that hard (without clear facts), it would be extremely hesitant to send its limited supply of review samples to that reviewer. No doubt about it.

Man, that was hard to watch. His negativity was strong in that video.
 