
gamegpu: Tom Clancy’s The Division Beta Benchmarks

csbin

Senior member
http://gamegpu.ru/mmorpg-/-onlayn-igry/tom-clancy-s-the-division-beta-test-gpu.html




Nvidia GeForce/ION Driver Release 361.75

AMD Radeon Crimson Edition 16.1

[benchmark chart images from gamegpu.ru]
 
Well, my GTX 770 here is getting at least 40-50 FPS in the game on high settings. The Dark Zone, I will say, seems to hit your framerate harder, since there are actual players running around that you can fight.
 
Good showing for the Fury X. Just wondering why it's below the 980 Ti at 1440p, though. Same minimum, higher average, yet it's placed below it in the chart.
 
Is this running the modified Far Cry 4 engine from Ubisoft?

Edit: Nope.

SNOWDROP™ ENGINE
Powered by the fully next-gen Snowdrop engine, Tom Clancy’s The Division™ sets a new bar in video game realism and open world rendering. Experience a chaotic and devastated New York like you’ve never seen before.
 
Wonder why gamegpu isn't giving CPU data for Skylake, or at least overclocking the 4770K to 4790K clockspeeds. They must have a Skylake system, because they give HD 530 data in the iGPU tests.

Edit: it probably wouldn't matter for this game, because it looks GPU limited even with 980 Ti SLI. Also looks like the dual-card solution for AMD isn't working yet, although the framerate for single cards is very good.
 
Is this a GameWorks game???

If yes, then don't bother with the BETA; the performance will be completely different on release date.
 
Is this a GameWorks game???

If yes, then don't bother with the BETA; the performance will be completely different on release date.

Yes, and it's doubtful there will be a performance impact when the game's release is just over a month away ...

By next week the game should be gold ...
 
It is Gameworks. Odd that it is very, very heavily AMD favored right now. Either Beta is Beta and Nvidia will release Gameready drivers that will give parity, or the game just really likes GCN's strengths. Rainbow Six Siege was another recent Ubisoft Gameworks title that ironically preferred AMD.
 
Wonder why gamegpu isn't giving CPU data for Skylake, or at least overclocking the 4770K to 4790K clockspeeds. They must have a Skylake system, because they give HD 530 data in the iGPU tests.

Edit: it probably wouldn't matter for this game, because it looks GPU limited even with 980 Ti SLI. Also looks like the dual-card solution for AMD isn't working yet, although the framerate for single cards is very good.

On that note: Good showing for AMD's FX "sans Skylake"
 
If the performance remains the same on release, I will be pleasantly surprised.

The performance won't decrease for anything on the same maps, etc., so that much is certain once the rendering pipeline has already been built, but at the same time Nvidia can roll out driver optimizations too ...

It's interesting to see a 290 match a 980, so I guess AMD's vision of compute for graphics has finally paid off?
 
8350 looking good. Would love to see the more expensive Core i3-6100 compared to it. Here's yet another game / engine showing the strength of more cores.
 
The 980 Ti can't maintain 60 FPS on High, and Ultra struggles to get 30, running at 3440×1440.

my wallet may soon hate me
 
gamegpu.ru :sneaky: they test a 750 Ti at Ultra 4K (10 FPS) but can't be bothered to run tests with settings below 1080p Ultra :thumbsdown:

Other than that, CPU performance looks good for AMD.
 
Yes, and it's doubtful there will be a performance impact when the game's release is just over a month away ...

By next week the game should be gold ...

The beta build could be from any number of months ago. I think what might happen is Nvidia gives them code to put in the game late. We'll see, but I am expecting some late addition of Nvidia code.

On that note: Good showing for AMD's FX "sans Skylake"

Which is why I can't see Zen failing. FX is still hanging in there against Intel for gaming this late in the game. Ironically, if you go high end with graphics settings, your performance on FX processors matches Intel's highest end. So all those people buying $1000 Intel CPUs to power their 4K rigs are failing (for example).
 
Which is why I can't see Zen failing. FX is still hanging in there against Intel for gaming this late in the game. Ironically, if you go high end with graphics settings, your performance on FX processors matches Intel's highest end. So all those people buying $1000 Intel CPUs to power their 4K rigs are failing (for example).

A CPU clocked 33% higher is some kind of testament that Bulldozer is hanging out with Intel's top chips? Hell, with a 1.2 GHz advantage it still loses to Haswell. Could you imagine the power consumption charts? And then toss Skylake into the mix. Woof.

Zen had better be more competitive than that. If it takes Zen almost 35% more clocks to still lose, just pack it up, AMD. Don't even bother.
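For reference, here's the arithmetic behind the clock comparison. The stock clocks below are my assumption (4.7 GHz FX-9590 turbo vs. a 3.5 GHz base-clock Haswell i7-4770K); they aren't quoted in the benchmark article:

```python
# Clock-speed gap behind the "1.2 GHz advantage" claim.
# Assumed stock clocks (not from the benchmark article):
fx_9590 = 4.7    # GHz, FX-9590 turbo
i7_4770k = 3.5   # GHz, i7-4770K base

gap_ghz = fx_9590 - i7_4770k
gap_pct = (fx_9590 / i7_4770k - 1) * 100
print(f"{gap_ghz:.1f} GHz advantage, {gap_pct:.0f}% higher clock")
# prints: 1.2 GHz advantage, 34% higher clock
```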
 
A CPU clocked 33% higher is some kind of testament that Bulldozer is hanging out with Intel's top chips? Hell, with a 1.2 GHz advantage it still loses to Haswell. Could you imagine the power consumption charts? And then toss Skylake into the mix. Woof.

Zen had better be more competitive than that. If it takes Zen almost 35% more clocks to still lose, just pack it up, AMD. Don't even bother.

I rarely look at the 9590 numbers.

8350 = 95 average, 68 minimum
i5-4670K = 104 average, 65 minimum

All the i7s stopped at 104 average, with higher minimums (impact on experience not quantified).

That's hanging in there: a 9 FPS difference for a $100+ price difference, on chips years newer and 2 (?) manufacturing nodes ahead.

If you do OC, though, you are pretty much matching i7 performance at stock even with just a 400 MHz OC.

I don't see how a 14nm Zen won't clean up. They'd have to make a processor worse than the current FX (in games like this, anyway).
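For what it's worth, the gap in the quoted averages works out as follows (a quick check using only the numbers above):

```python
# Average-FPS gap between the quoted FX-8350 and i5-4670K results.
fx_8350_avg = 95     # FPS, as quoted
i5_4670k_avg = 104   # FPS, as quoted

gap_fps = i5_4670k_avg - fx_8350_avg
gap_pct = (i5_4670k_avg / fx_8350_avg - 1) * 100
print(f"{gap_fps} FPS gap, {gap_pct:.1f}% faster on average")
# prints: 9 FPS gap, 9.5% faster on average
```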
 
I rarely look at the 9590 numbers.

8350 = 95 average, 68 minimum
i5-4670K = 104 average, 65 minimum

All the i7s stopped at 104 average, with higher minimums (impact on experience not quantified).

That's hanging in there: a 9 FPS difference for a $100+ price difference, on chips years newer and 2 (?) manufacturing nodes ahead.

If you do OC, though, you are pretty much matching i7 performance at stock even with just a 400 MHz OC.

I don't see how a 14nm Zen won't clean up. They'd have to make a processor worse than the current FX (in games like this, anyway).

It still has a 600 MHz advantage and only a 3 FPS minimum advantage. You could simply OC it by the difference in clocks, have better mins, better max, and most likely still use less power.

I would not use Bulldozer as any kind of precursor for Zen. That's just expecting it to fail.
 
It is Gameworks. Odd that it is very, very heavily AMD favored right now. Either Beta is Beta and Nvidia will release Gameready drivers that will give parity, or the game just really likes GCN's strengths. Rainbow Six Siege was another recent Ubisoft Gameworks title that ironically preferred AMD.

Maybe after being trashed over and over again about how TERRIBLE Gameworks is in Ubisoft games, they decided to work on it?

Gameworks + Ubisoft = DO NOT BUY

For me anyway.

This can change my perception though of Gameworks and of Ubisoft.

Edit: Took a look at the actual images of the game and reviewed the benches.
Is it my eyes/allergies, or does the game look really meh?

I feel like there is a particular setting holding ALL the GPUs back that we would normally set to a non-Ultra value. Beta, meh.
 
I rarely look at the 9590 numbers.

8350 = 95 average, 68 minimum
i5-4670K = 104 average, 65 minimum

All the i7s stopped at 104 average, with higher minimums (impact on experience not quantified).

That's hanging in there: a 9 FPS difference for a $100+ price difference, on chips years newer and 2 (?) manufacturing nodes ahead.

If you do OC, though, you are pretty much matching i7 performance at stock even with just a 400 MHz OC.

I don't see how a 14nm Zen won't clean up. They'd have to make a processor worse than the current FX (in games like this, anyway).

On the same note, the i3 gets the same avg (94 to 95) as the 8350, with 108 FPS being the absolute top, so there's only ~10 FPS of headroom, which means the minimums are not even worth mentioning; maybe one frame out of all of them.
So, bottom line, the 8350 is "hanging in there" close to i3 performance...

Plus, gamegpu has a history of measuring the CPU in completely non-CPU-demanding scenarios... just look at the video showing what their GPU test consists of...
 
I rarely look at the 9590 numbers.

8350 = 95 average, 68 minimum
i5-4670K = 104 average, 65 minimum

All the i7s stopped at 104 average, with higher minimums (impact on experience not quantified).

That's hanging in there: a 9 FPS difference for a $100+ price difference, on chips years newer and 2 (?) manufacturing nodes ahead.

If you do OC, though, you are pretty much matching i7 performance at stock even with just a 400 MHz OC.

I don't see how a 14nm Zen won't clean up. They'd have to make a processor worse than the current FX (in games like this, anyway).

From "Bulldozer can't game" we now have a very performance/$-competitive CPU for today's games. I hope AMD learned their lesson, and that Zen will not need 2-3 years of software optimization again and will continue from where Vishera and Excavator are today.
 
From "Bulldozer can't game" we now have a very performance/$-competitive CPU for today's games. I hope AMD learned their lesson, and that Zen will not need 2-3 years of software optimization again and will continue from where Vishera and Excavator are today.

Something tells me this wasn't AMD's goal. "Congrats, you win the perf/$ crown, well, because no one will buy you at what you originally launched at."

You nailed it, Zen can't afford to wait 2-3 years to be considered a "good purchase."
 
On the same note, the i3 gets the same avg (94 to 95) as the 8350, with 108 FPS being the absolute top, so there's only ~10 FPS of headroom, which means the minimums are not even worth mentioning; maybe one frame out of all of them.
So, bottom line, the 8350 is "hanging in there" close to i3 performance...

Plus, gamegpu has a history of measuring the CPU in completely non-CPU-demanding scenarios... just look at the video showing what their GPU test consists of...

I would really like to see the frametimes of the Core i3 with all four threads at 90-100%.
 
Something tells me this wasn't AMD's goal. "Congrats, you win the perf/$ crown, well, because no one will buy you at what you originally launched at."

Do you actually believe that anyone would pay more than what the FX-8350 is being sold for TODAY to get an SB/Ivy Core i5??
People keep forgetting this is a 2012 CPU that is very close to Intel's 2013-2015 Core i5s in many of the latest games.

You nailed it, Zen can't afford to wait 2-3 years to be considered a "good purchase."

Definitely, if Zen needs 2-3 years for software to catch up, then say bye-bye. Zen needs to continue from where the current CPUs are in order to succeed.
 