
CPU Bottlenecking at Digit-Life: yes, we need faster CPUs as much as we need faster GPUs


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
#1: YES, ALL MODERN GAMES ARE CPU LIMITED.

#2: CPU limited is NOT the same thing as CPU bottlenecked. (I.e., if substituting a faster graphics card with the same CPU yields a large performance increase, the game is GPU bottlenecked, not CPU bottlenecked. It can still be CPU limited in that a faster CPU, had one been available, would bring marginal performance gains, just not substantial ones.) Providing examples at lower resolutions such as 1920x1200 on a 4870 X2 is irrelevant, since the card was released for 2560x1600 today. In tomorrow's games it'll become GPU limited at lower resolutions.

#3: Driver scaling factors into overall performance scaling and it is often impossible to separate CPU limitation vs. poor driver scaling until many versions of drivers have been released and results updated (hindsight).

On a Quad Core 3.0ghz system the results are as follows:

COD4 2560x1600 4AA/16AF
4870 = 44.4
4870 X2 = 83.6 (+88%)

HLF2 ET 2560x1600 4AA/16AF
4870 = 50.8
4870 X2 = 85.3 (+68%)

ET: QW 2560x1600 4AA/16AF
4870 = 65.2
4870 X2 = 100.7 (+54%)

Crysis 1920x1200 4AA/16AF
4870 = 18.5
4870 X2 = 24.6 (+33%) <-- a game known for poor SLI and CF scaling.

Assassin's Creed - 2560x1600 4AA
4870 = 41.8
4870 X2 = 53.8 (+29%)

Race Driver GRID - 2560x1600 4AA
GTX 280 = 42.1 (no data for 4870)
4870 X2 = 94.1 (+124% advantage)

#4: None of the above games, at the aforementioned settings and resolutions, will run faster with a 4.0GHz C2Q and a single 4870 than with a 3.0GHz C2Q and a 4870 X2. Therefore the only question is: depending on the games and resolutions one plays, are those gains worth the expense of the $550 4870 X2 vs. the $250 4870? That depends on the person.

To say the CPU isn't sufficient to utilize the card to its "full" (100%) potential would be correct. But to claim that a 4.0GHz C2Q is inadequate for the 4870 X2 (i.e. that the 4870 X2 isn't worth the upgrade because the CPU is too slow to utilize such a card) is just ignorance; whether the upgrade is worth it depends entirely on the person. The benchmarks clearly show a large gain in a lot of games.
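The percentage figures above are easy to re-derive from the fps pairs quoted in this post. A quick Python sketch (the helper name is my own; GRID is omitted since there is no single-4870 number):

```python
def scaling_gain(single_fps, dual_fps):
    """Percent frame-rate gain of the 4870 X2 result over the single 4870."""
    return round((dual_fps / single_fps - 1) * 100)

# (single 4870, 4870 X2) fps pairs quoted above, at the listed settings
results = {
    "COD4":             (44.4, 83.6),
    "ET: QW":           (65.2, 100.7),
    "Crysis":           (18.5, 24.6),
    "Assassin's Creed": (41.8, 53.8),
}

for game, (one, two) in results.items():
    print(f"{game}: +{scaling_gain(one, two)}%")   # +88, +54, +33, +29
```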
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: RussianSensation
...
On a Quad Core 3.0ghz system the results are as follows:

COD4 2560x1600 4AA/16AF
4870 = 44.4
4870 X2 = 83.6 (+88%) --> 4870 X2 = 122 (+177%) C2Q@3.85GHz

...

ET: QW 2560x1600 4AA/16AF
4870 = 65.2
4870 X2 = 100.7 (+54%) --> 4870 X2 = 101.5 (+54%) C2Q@3.85GHz

...

This is interesting. COD4 can go higher with the CPU clock speed. ET: QW is done, 3GHz is enough.
The CPU limitation is per game. It would be nice to create a scaling matrix for the most popular games.
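The scaling-matrix idea above could be prototyped in a few lines. The dictionary layout and the sensitivity metric below are my own illustration, using the two 4870 X2 data points quoted in this post:

```python
# fps measured for the same 4870 X2 at two CPU clocks (GHz), per game
matrix = {
    "COD4":   {3.0: 83.6, 3.85: 122.0},
    "ET: QW": {3.0: 100.7, 3.85: 101.5},
}

def cpu_sensitivity(fps_by_clock):
    """FPS gain (fractional) divided by CPU clock gain (fractional)."""
    (c1, f1), (c2, f2) = sorted(fps_by_clock.items())
    return (f2 / f1 - 1) / (c2 / c1 - 1)

for game, data in matrix.items():
    # near 0 -> CPU clock no longer matters; near (or above) 1 -> CPU limited
    print(f"{game}: sensitivity {cpu_sensitivity(data):.2f}")
```

On these numbers COD4 comes out strongly CPU limited while ET: QW is essentially flat, matching the "3GHz is enough" observation.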
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: RussianSensation
#1: YES, ALL MODERN GAMES ARE CPU LIMITED.
Yep, except we're seeing CPU bottlenecks now at traditionally GPU bound settings and resolutions.

#2: CPU limited is NOT the same thing as CPU bottlenecked. (I.e., if substituting a faster graphics card with the same CPU yields a large performance increase, the game is GPU bottlenecked, not CPU bottlenecked. It can still be CPU limited in that a faster CPU, had one been available, would bring marginal performance gains, just not substantial ones.) Providing examples at lower resolutions such as 1920x1200 on a 4870 X2 is irrelevant, since the card was released for 2560x1600 today. In tomorrow's games it'll become GPU limited at lower resolutions.
I've already told you once I'm not going to argue over semantics. If you keep insisting there's a difference and edit the wiki page and blog about it enough, you may get dictionary.com to make a distinction as well. Until then the terms are interchangeable and nothing more than semantics.

Wiki: Bottleneck
Bottlenecking (engineering)
In engineering, bottleneck is a phenomenon by which the performance or capacity of an entire system is severely limited by a single component. The component is sometimes called a bottleneck point. The term is metaphorically derived from the neck of a bottle, where the flow speed of the liquid is limited by its neck.

Formally, a bottleneck lies on a system's critical path and provides the lowest throughput. Bottlenecks are usually avoided by system designers, and a great amount of effort is directed at locating and tuning them. A bottleneck may be, for example, a processor, a communication link, data processing software, etc.

Dictionary.com: Bottleneck

bot·tle·neck [bot-l-nek]
noun
1. a narrow entrance or passageway.
2. a place or stage in a process at which progress is impeded.
3. Also called slide guitar. a method of guitar playing that produces a gliding sound by pressing a metal bar or glass tube against the strings.
verb (used with object)
4. to hamper or confine by or as if by a bottleneck.
verb (used without object)
5. to become hindered by or as if by a bottleneck.

But to humor you, even in your given example of placing faster GPU with the same CPU, there's plenty of instances where the faster GPU makes no difference with the same CPU. Just use that first Digit-Life tool, select 4800+ for both PC1 and PC2, then select 4850 and 4870.

Let's extend that to the Tech Report example with 4850CF and 4870CF at 1920 4xAA in COD4, 89.3 and 89.4 FPS respectively. Isn't the 4870CF solution supposed to be at least 25% faster than 4850CF? Sure doesn't look like it. Wonder why? :)

And of course none of that even begins to show how a faster CPU would let previously CPU-bottlenecked solutions yield higher frame rates, and produce differences between solutions where there were none before.

#3: Driver scaling factors into overall performance scaling and it is often impossible to separate CPU limitation vs. poor driver scaling until many versions of drivers have been released and results updated (hindsight).

On a Quad Core 3.0ghz system the results are as follows:

COD4 2560x1600 4AA/16AF
4870 = 44.4
4870 X2 = 83.6 (+88%)

HLF2 ET 2560x1600 4AA/16AF
4870 = 50.8
4870 X2 = 85.3 (+68%)

ET: QW 2560x1600 4AA/16AF
4870 = 65.2
4870 X2 = 100.7 (+54%)

Crysis 1920x1200 4AA/16AF
4870 = 18.5
4870 X2 = 24.6 (+33%) <-- a game known for poor SLI and CF scaling.

Assassin's Creed - 2560x1600 4AA
4870 = 41.8
4870 X2 = 53.8 (+29%)

Race Driver GRID - 2560x1600 4AA
GTX 280 = 42.1 (no data for 4870)
4870 X2 = 94.1 (+124% advantage)

The fact you have to go to 2560 with 4xAA and compare multi-GPU to a single card just proves my point: at lower resolutions and settings, current GPU solutions are CPU bottlenecked. In order to see performance differences at these CPU bottlenecked resolutions, you will need to 1) get a faster CPU, 2) test more demanding games, or 3) increase GPU load by raising resolution/settings or AA.

What happens when we compare a "faster" solution to a slower one at resolutions that aren't insanely GPU intensive?
COD4 1920x1200 4AA/16AF
4850CF = 89.3
4870CF = 89.4 (+0.1%)

HLF2 ET 1680x1050 4AA/16AF
4850CF = 120.7
4870CF = 135.6 (+12.3%)

ET: QW 1920x1200 4AA/16AF
4850CF = 116.3
4870CF = 121.7 (+4.6%)

etc......
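The comparison above is essentially a plateau test: when a clearly faster GPU tier lands within noise of the slower one, the limit is somewhere other than the GPU. A minimal sketch (the function name and the 2% tolerance are my choices):

```python
def looks_cpu_capped(slow_gpu_fps, fast_gpu_fps, tolerance_pct=2.0):
    """True if a faster GPU tier gains less than tolerance_pct percent,
    i.e. the frame rate appears to have hit a non-GPU ceiling."""
    gain_pct = (fast_gpu_fps / slow_gpu_fps - 1) * 100
    return gain_pct < tolerance_pct

print(looks_cpu_capped(89.3, 89.4))   # COD4 1920 4xAA, 4850CF vs 4870CF: True
print(looks_cpu_capped(44.4, 83.6))   # COD4 2560 4xAA, 4870 vs X2: False
```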

#4: None of the above games, at the aforementioned settings and resolutions, will run faster with a 4.0GHz C2Q and a single 4870 than with a 3.0GHz C2Q and a 4870 X2. Therefore the only question is: depending on the games and resolutions one plays, are those gains worth the expense of the $550 4870 X2 vs. the $250 4870? That depends on the person.
How do you know that? We've already seen a single 4870 with a 4GHz CPU outperform a 4870CF solution that was CPU bottlenecked at 3.0GHz. We also have Hardware Canucks' review showing there isn't any 89FPS cap in COD4 once you use a faster CPU. You're making a claim you frankly can't back up; based on what we do know, and the fact you have to go to 2560 with 4xAA to find separation between parts, it's not a claim I'd make.

To say the CPU isn't sufficient to utilize the card to its "full" (100%) potential would be correct. But to claim that a 4.0GHz C2Q is inadequate for the 4870 X2 (i.e. that the 4870 X2 isn't worth the upgrade because the CPU is too slow to utilize such a card) is just ignorance; whether the upgrade is worth it depends entirely on the person. The benchmarks clearly show a large gain in a lot of games.
Who's making the claim that 4.0GHz isn't adequate, and who's coming off as ignorant? Based on those results between a 4850CF and a 4870CF, would you recommend that someone running those resolutions upgrade? Or would you point to 2560 results with 4xAA to justify the price difference and the hassle of selling their old cards? Seriously.....
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: chizow

I'm not going to waste a lot of time replying to your nonsense on a point by point basis, about 15 reviews just popped up proving me to be 100% correct. Here's a few highlights:
Again, nobody is claiming there aren't any CPU limitations at play. The problem is with you trying to make such claims as the norm when in reality they're isolated to fringe cases.

You keep linking to UT3, Quake Wars and CoD 4 as evidence when in reality those titles are CPU limited, akin to HL2 and Painkiller of yesteryear.

Call of Juarez: http://www.xbitlabs.com/articl...hd4850-cf_5.html#sect0 (GPU bound even at 1280x1024 with 4xAA).

Bioshock: http://www.xbitlabs.com/articl...hd4850-cf_4.html#sect2 (GPU bound even at 1280x1024 with 4xAA, ignore CF not scaling).

Stalker: http://www.xbitlabs.com/articl...hd4850-cf_7.html#sect1 (GPU bound at 1280x1024 with no AA).

Crysis: http://www.xbitlabs.com/articl...hd4850-cf_6.html#sect0 (GPU bound at 1280x1024 with no AA, ignore CF not scaling).

Quake Wars: http://www.anandtech.com/video/showdoc.aspx?i=3372&p=8 (GTX280 SLI is 73% faster than single card at 1680x1050 with 4xAA).

The list goes on.

Also even your CoD 4 and UT3:

http://www.tomshardware.com/re...870-x2-amd,1992-4.html
http://www.tomshardware.com/re...870-x2-amd,1992-7.html

Clearly GPU bound at 1920x1200 with 4xAA, and these are CPU limited titles like I mentioned earlier.

But to humor you, even in your given example of placing faster GPU with the same CPU, there's plenty of instances where the faster GPU makes no difference with the same CPU. Just use that first Digit-Life tool, select 4800+ for both PC1 and PC2, then select 4850 and 4870.
We've already gone over why those Digit-life scores are useless for backing your claims; you can't draw any kind of inference from them.

It's like me benchmarking Bioshock at 320x240 with a Pentium 4 and then running around claiming the game is CPU limited.
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
One example of CPU limitation is Crysis at Very High Settings, combat performance is highly tied to CPU speed.

I'll dig up the benchmark, but the *average framerate* on an E8500 at 2.4GHz vs. 4GHz was identical (~17.4fps, within 0.1fps). The *minimum*, however, shifted from 7fps to 12fps, which is a near doubling. This is at 1680x1050 with everything set to Very High on an OCed GTX 280.
 

CP5670

Diamond Member
Jun 24, 2004
5,668
768
126
Originally posted by: Astrallite
One example of CPU limitation is Crysis at Very High Settings, combat performance is highly tied to CPU speed.

I'll dig up the benchmark, but the *average framerate* on an E8500 at 2.4GHz vs. 4GHz was identical (~17.4fps, within 0.1fps). The *minimum*, however, shifted from 7fps to 12fps, which is a near doubling. This is at 1680x1050 with everything set to Very High on an OCed GTX 280.

Where does this occur? Despite the game's generally poor performance, I never had the framerate drop anywhere near that low. The lowest I saw was something like 27 during one of the outdoor alien fights, and that increased into the 30s when I reduced the resolution. Does it happen on very high only?

There is no reason to set everything to very high anyway, at least on current hardware. Very high includes several settings that don't have much of a noticeable IQ impact but hurt performance a lot. It's better to turn on only some specific very high flags.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
A limitation is a function of the whole system (GPU and its speed, CPU and its speed, motherboard, memory, HDD, OS, driver, ...) and the resolution. The best way to determine whether a game is GPU or CPU limited at a specific resolution is to change each component's speed and compare the results. Unfortunately we don't have that kind of information; the reviews only change the GPUs.
That alone isn't enough information to conclude whether a game is GPU or CPU limited.

Here is an example: card A gets 50fps and card B gets 100fps in some game at some resolution. The first thought is that the game must be GPU limited. But if we increase the CPU clock by 30%, card A gets 65fps and card B gets 130fps. So is it GPU or CPU limited? Now it seems CPU limited. What's the correct answer?
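The thought experiment above can be made concrete with the two substitution tests discussed in this thread: swap the GPU at a fixed CPU, then raise the CPU clock with a fixed GPU. The function is mine, as is the assumption that card B has roughly twice card A's throughput:

```python
def substitution_test(fps_base, fps_new, resource_gain):
    """Fraction of a component upgrade that shows up as fps.
    resource_gain: fractional speedup of the swapped part (1.0 = +100%)."""
    return (fps_new / fps_base - 1) / resource_gain

# Card A -> card B at the same CPU (assuming B is ~2x A):
print(substitution_test(50, 100, 1.0))   # 1.0: fps tracks the GPU fully
# Same card A, CPU clock +30%:
print(substitution_test(50, 65, 0.30))   # ~1.0: fps tracks the CPU fully
```

In these constructed numbers both substitutions scale perfectly, so the label depends entirely on which part you hold fixed, which is exactly the "what's the correct answer?" problem; real systems usually show one ratio well below the other.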

 

Hugh H

Senior member
Jul 11, 2008
315
0
0
I have to agree with Chizow somewhat and go against almost everybody else.

In simple terms, playing World in Conflict @ 1900 x 1200 with settings as high as they go and 4xAA, my average FPS increased from 34 to 40 when I overclocked my E8500 from 3.8GHz to 4.0GHz. (I used the in-game benchmark tool a few times to verify this.)

If that is not a CPU limitation then you tell me what is. I understand that different games yield different results, but World in Conflict has already disrupted my peace and I can't wait for an overclocked Nehalem.

See signature for specs.
 

Hugh H

Senior member
Jul 11, 2008
315
0
0
Originally posted by: Janooo
So 5% CPU overclock gave you 20% fps increase? Is that correct?

I didn't do the math but, yes. Going from 3.8Ghz to 4Ghz got me an increase of 6 fps average in World in Conflict (1900 x 1200 resolution, high details).
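The arithmetic behind the question above, sketched out; an fps gain several times the clock gain usually points at run-to-run variance or another shifting limiter rather than pure CPU scaling:

```python
# E8500 overclock and the reported World in Conflict averages
clock_gain = (4.0 / 3.8 - 1) * 100   # CPU clock increase in percent
fps_gain   = (40 / 34 - 1) * 100     # average fps increase in percent

print(f"clock +{clock_gain:.1f}%, fps +{fps_gain:.1f}%")   # +5.3% vs +17.6%
```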

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: BFG10K
Again, nobody is claiming there aren't any CPU limitations at play. The problem is with you trying to make such claims as the norm when in reality they?re isolated to fringe cases.

You keep linking to UT3, Quake Wars and CoD 4 as evidence when in reality those tiles are CPU limited, akin to HL2 and Painkiller of yesteryear.
So then why do you keep trying to argue that CPU bottlenecks on current parts don't exist? I've never once said every game and every card solution is CPU bottlenecked, so what is your point? I'm pointing out specific examples in specific benchmarks that show this to be true, and you keep coming back with examples that prove otherwise?

Also, they're not fringe cases when every 4870 X2 review with a 3GHz CPU is showing similar symptoms at resolutions up to 1920 with AA. I've shown similar CPU bottlenecking in WiC, The Witcher, AC, AoC and even Crysis. I'm sure I could find more, but why would I need to? That's what reviewers are reviewing and what gamers are playing. Not only are these some of the best-looking titles on the PC currently, they're also some of the most recent and most popular.

 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Well, isn't i7 right around the corner? Enthusiasts should have their hands on them before the end of the year. The rest of us slobs can expect Q2 '09.

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
1. R700 > any Nvidia card & 95% of SLI setups.

2. CPU bottlenecking does exist. Get your hands on the fastest (MHz) Core 2 you can find.

3. A QX 45nm @ 4.2GHz+ or a Wolfie @ 4.4GHz+ are your best bets for the fastest gaming CPU.

4. Benchmark outcomes depend on CPU speed, FSB/RAM speed, PCI-E connection, system, drivers, benching method, run, game settings, driver settings, resolution, filter quality, etc. It's hard to find two sites where all of these are identical, which makes it easy for reviewers to draw misleading conclusions about card performance.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Genx87
Well, isn't i7 right around the corner? Enthusiasts should have their hands on them before the end of the year. The rest of us slobs can expect Q2 '09.

I'm pretty sure they bumped the release up to October, with 2 non-Extreme parts priced ~$300 and $500. I believe all initial chips will be tied to X58 boards though, which may cost an arm and a leg. There are still a few question marks I have about Core i7, specifically the number of DIMM slots, memory channels, and multiplier control now that there's no FSB.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: chizow
Originally posted by: Genx87
Well, isn't i7 right around the corner? Enthusiasts should have their hands on them before the end of the year. The rest of us slobs can expect Q2 '09.

I'm pretty sure they bumped the release up to October, with 2 non-Extreme parts priced ~$300 and $500. I believe all initial chips will be tied to X58 boards though, which may cost an arm and a leg. There are still a few question marks I have about Core i7, specifically the number of DIMM slots, memory channels, and multiplier control now that there's no FSB.

I'm also interested in multiplier control, QPI speed, and OC methods for i7. I'm almost positive someone will find a way to OC these things.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Hugh H
Originally posted by: Janooo
So 5% CPU overclock gave you 20% fps increase? Is that correct?

I didn't do the math but, yes. Going from 3.8Ghz to 4Ghz got me an increase of 6 fps average in World in Conflict (1900 x 1200 resolution, high details).

WiC is a very specialized example that benefits especially from CPU horsepower.

actually both BFG10K and Chizow are right
- it is hard to generalize about CPU bottlenecking and they both make good points ... i'd like to see an in-depth article cover this :p



 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
Improved combat performance in Crysis with high CPU speed.
Settings: 1680x1050, Very High Details, 4xFSAA and 16xAF.

So this is basically extreme GPU bottlenecking.

Average (exploration) framerate is limited by the GPU, but the choking in massive combat scenes improves with CPU scaling. The average framerate is within 0.1fps from 2GHz to 4GHz.

E8500 @ 2GHz - 5fps minimum, 17.4 average
E8500 @ 2.4GHz - 6fps minimum, 17.5 average
E8500 @ 3GHz - 7fps minimum, 17.5 average
E8500 @ 3.6GHz - 9fps minimum, 17.5 average
E8500 @ 4GHz - 12fps minimum, 17.5 average

At this rate, somewhere between 5GHz and 6GHz the game would become completely (and I mean 100%) GPU bottlenecked.

http://www.pcgameshardware.com...647744&image_id=839048

So in this case, if you were to play a game with a lot of action, say a LOTR-esque Dragon Age scene where a wave of orcs crashes into a thousand soldiers and you're fighting in the midst of it, you'll probably do pretty well with an overclocked CPU and feel no system strain at all, instead of the game turning into a slideshow.

Because that's sort of our goal--when you spend a ton of money on a computer, you'd like to play everything maxed out, regardless of the type of game. Otherwise you feel cheated. I mean imagine running your triple SLI GTX 280 system and it turns into a slide show because your CPU can't handle the polygon drawing or the collision detection--not fun at all.
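A least-squares line through the five minimum-fps points above lands almost exactly on the 5-6GHz estimate. A quick check (the linear model is my own simplification of the scaling):

```python
# (CPU clock GHz, minimum fps) from the E8500 runs above
points = [(2.0, 5), (2.4, 6), (3.0, 7), (3.6, 9), (4.0, 12)]

# Ordinary least-squares fit of min fps vs. clock
n = len(points)
mx = sum(x for x, _ in points) / n
my = sum(y for _, y in points) / n
slope = (sum((x - mx) * (y - my) for x, y in points)
         / sum((x - mx) ** 2 for x, _ in points))
intercept = my - slope * mx

# Clock at which the fitted minimum reaches the ~17.5fps GPU-bound average
clock_at_parity = (17.5 - intercept) / slope
print(f"{clock_at_parity:.1f} GHz")   # ~6.0 GHz
```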
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
I've posted my thoughts on CPU bottlenecking vs. CPU limitations before. My opinions change as new data becomes available, and I agree with apoppin that both sides make good points. Where this discussion goes wrong is when we focus on generalizations. Chizow is correct: we're seeing very powerful GPU configs become CPU-limited even at resolutions as high as a 24" monitor's. Yet, as BFG points out, this limitation is not widespread across the board and varies greatly.

I wish more review sites out there would test the new GPUs with older chips and boards (and lower memory configs)... not all of us have 4ghz C2Q with 8GB ram running Vista64.

I'm in a very different boat than many of you. I keep computers around for a long time and pass them down to kids, upgrading them rarely. Rig #4 is a 1.4ghz Tualatin with a 5900ultra.

My #1 rig, the one I play my games on, is an X2 3800+ at 2.8GHz with an X1900XTX in it. I would love to see the difference between a 4850 vs. 4870 vs. 4870 X2 at the res I play (1080p). But I can guaran-damn-tee there are no reviews anywhere close to that setup for me to compare against. I'd like to know if I could save $100, grab the 4850, and get the same framerate with my current setup.

I will likely grab a 4870. (The x2 is too big for my case anyway). Sure, that 4870 might be limited a bit by my 3800+, but, eventually, I will build a new machine (6 mos.), and bring that 4870 to it, slap this x1900xtx back in this one, and this one becomes rig #2.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Astrallite
Improved combat performance in Crysis with high CPU speed.
Settings: 1680x1050, Very High Details, 4xFSAA and 16xAF.

So this is basically extreme GPU bottlenecking.

Average (exploration) framerate is limited by the GPU, but the choking in massive combat scenes improves with CPU scaling. The average framerate is within 0.1fps from 2GHz to 4GHz.

E8500 @ 2GHz - 5fps minimum, 17.4 average
E8500 @ 2.4GHz - 6fps minimum, 17.5 average
E8500 @ 3GHz - 7fps minimum, 17.5 average
E8500 @ 3.6GHz - 9fps minimum, 17.5 average
E8500 @ 4GHz - 12fps minimum, 17.5 average

At this rate, somewhere between 5GHz and 6GHz the game would become completely (and I mean 100%) GPU bottlenecked.

http://www.pcgameshardware.com...647744&image_id=839048

So in this case, if you were to play a game with a lot of action, say a LOTR-esque Dragon Age scene where a wave of orcs crashes into a thousand soldiers and you're fighting in the midst of it, you'll probably do pretty well with an overclocked CPU and feel no system strain at all, instead of the game turning into a slideshow.

Because that's sort of our goal--when you spend a ton of money on a computer, you'd like to play everything maxed out, regardless of the type of game. Otherwise you feel cheated. I mean imagine running your triple SLI GTX 280 system and it turns into a slide show because your CPU can't handle the polygon drawing or the collision detection--not fun at all.

That article seems rubbish to me.
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
It's probably a long timedemo with lots of exploration. You aren't always in pitched battles in Crysis (given how much ammo it takes to kill a soldier, you wouldn't survive very long); the dips will come with lots of on-screen enemies. It's an issue of mathematics, not something you can settle with limited inductive logic.

For example, in the C-130 drop cutscene in the first 15 minutes of Crysis, the averages would easily be within 3 decimal places, but during the stretch where the engine starts smoking and the C-130 starts to dive it becomes a CPU limited operation for a span of maybe 10-15 seconds (say 12fps vs. 6fps for 4GHz vs. 2.4GHz). In that case the difference in average framerate would be ~0.005.
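The dilution effect described above is easy to model: a short CPU-limited span barely moves a long run's average, even when the minimum doubles. All the run lengths and fps values here are invented for illustration:

```python
def average_fps(segments):
    """segments: list of (duration_seconds, fps). Average = total frames / total time."""
    frames = sum(t * fps for t, fps in segments)
    time = sum(t for t, _ in segments)
    return frames / time

# Hypothetical 10-minute timedemo with a 12-second CPU-limited combat scene
slow_cpu = average_fps([(588, 17.5), (12, 6)])    # combat runs at 6 fps
fast_cpu = average_fps([(588, 17.5), (12, 12)])   # combat runs at 12 fps
print(f"{slow_cpu:.2f} vs {fast_cpu:.2f}")        # averages nearly identical
```

Even with the worst segment running twice as fast, the average moves by only about 0.1fps here, which is why minimums tell the CPU story that averages hide.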
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Astrallite
It's probably a long timedemo with lots of exploration. You aren't always in pitched battles in Crysis (given how much ammo it takes to kill a soldier, you wouldn't survive very long); the dips will come with lots of on-screen enemies. It's an issue of mathematics, not something you can settle with limited inductive logic.

For example, in the C-130 drop cutscene in the first 15 minutes of Crysis, the averages would easily be within 3 decimal places, but during the stretch where the engine starts smoking and the C-130 starts to dive it becomes a CPU limited operation for a span of maybe 10-15 seconds (say 12fps vs. 6fps for 4GHz vs. 2.4GHz). In that case the difference in average framerate would be ~0.005.

Seems to me you are basing your "assumption" on something you have no idea about. Until I see what they did and it makes some sense, I'm not going to blindly believe some article with strange results.
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
Originally posted by: Azn
Seems to me you are basing your "assumption" on something you have no idea about. Until I see what they did and it makes some sense, I'm not going to blindly believe some article with strange results.

I love the internet.

Everyone makes purchasing decisions based on different authority sources. It's our money, so we choose what we want to believe based on preconceptions we already had, plus whatever we choose to assimilate, and that leads us all to different conclusions.

Since it's obvious I'm not going to spend hours and days finding out exactly what they did and reviewing their benchmark, only to get back to you and have you never respond (you know it's true), and since you'll never bother to go out of your way to disprove my stance either, we'll obviously never learn the exact "truth" that meets your specifications. You could have questioned every benchmark in this thread, but you chose to question the one you didn't want to believe.

So that's what we have here--individual preferences for authority sources. To each their own.