
View Full Version : So DX12 looking like it's living up to the hype



baseline bum
03-11-2016, 05:50 PM
http://tombraider.tumblr.com/post/140859222830/dev-blog-bringing-directx-12-to-rise-of-the-tomb
Dev Blog: Bringing DirectX 12 to Rise of the Tomb Raider

Jurjen Katsman, Studio Head at Nixxes Software

[Our developer blogs lift the curtain on the creation of Lara's first tomb raiding expedition, and the technology we use to constantly improve it. Following the release of Rise of the Tomb Raider for PC, the title will be one of the first in the industry to integrate DirectX 12 support, allowing fans with older PCs or newer rigs to run at higher framerates and higher graphical settings. Nixxes Studio Head Jurjen Katsman dives deep into the new technology below.]

Pushing the boundaries of technology on PC has always been a passion of the development team at Nixxes, and Crystal Dynamics and Square Enix have been great partners for us in doing so. One of the challenges with PC development is guaranteeing that players on as many different PC configurations as possible can have a great experience. For us this means ensuring that users with older PCs can still get a great gameplay experience, but also that users with higher-end machines can get the most out of their hardware, including the highest quality visuals, frame-rate, and other technical enhancements.

One thing we are very excited about to help us further realize those goals is the new DirectX 12 graphics API that is available on Windows 10. In the patch released today on Steam – and coming soon to the Windows 10 Store – we will be adding DirectX 12 support to Rise of the Tomb Raider.

At Nixxes we have a long history of working with consoles as well, and one of the large differences between developing for consoles and developing for PCs is the level of access to the hardware available to us. On consoles we can leverage every single hardware feature and every bit of CPU power available in the most efficient way possible. With DirectX 12 we are taking a massive step forward in bringing a lot of that flexibility to the PC as well. For Rise of the Tomb Raider the largest gain DirectX 12 will give us is the ability to spread our CPU rendering work over all CPU cores, without introducing additional overhead. This is especially important on 8-core CPUs like Intel i7s or many AMD FX processors.

Let me explain how this helps the performance of your game. When using DirectX 11, in situations where the game is under heavy load – for example in the larger hubs of the game – the individual cores may not be able to feed a fast GPU like an NVIDIA GTX 980 or even NVIDIA GTX 970 quickly enough. This means the game may not hit the desired frame-rate, requiring you to turn down settings that impact CPU performance. Even though the game can use all your CPU cores, the majority of the DirectX 11 related work is all happening on a single core. With DirectX 12 a lot of the work is spread over many cores, and the framerate the game runs at can be much higher for the same settings. Check out the picture below for a visual example of how the CPU work is distributed:

http://56.media.tumblr.com/a85622525088836cca7f0eab11e4086d/tumblr_inline_o3vsjaoaAk1qij4lt_1280.png
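To give a rough idea of what "spreading CPU rendering work over all cores" means at the API level, here is a minimal, generic D3D12 sketch (an illustration only, not the actual Rise of the Tomb Raider renderer code): each worker thread records its own command list against its own allocator, and the main thread submits everything with a single cheap call.

// Illustrative sketch only: generic D3D12 multithreaded command-list recording,
// not Nixxes' actual renderer. Error checking omitted for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordAndSubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue, unsigned numThreads)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(numThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(numThreads);
    std::vector<std::thread>                       workers;

    for (unsigned i = 0; i < numThreads; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));

        // Each thread records its own slice of the scene's draw calls in parallel,
        // which is what lets the engine keep all CPU cores busy.
        workers.emplace_back([&lists, i] {
            // ... set pipeline state, root signature, issue draws for this slice ...
            lists[i]->Close();
        });
    }

    for (auto& t : workers) t.join();

    // One inexpensive submission on the main thread; unlike the DX11 immediate
    // context, the recording above was not serialized through the driver.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}

Under DX11 the equivalent work all funnels through the single immediate context, which is why one core becomes the bottleneck in heavy scenes.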

As an example to illustrate the point, below is a screenshot of a scene in the game running on an Intel i7-2600 processor with 1333MHz memory, paired with a GTX 970. Using DirectX 11 at High Settings we would only get 46 fps.

http://56.media.tumblr.com/c3605c699e8c23c4a12b4534481ab1f6/tumblr_inline_o3vsjnzHO71qij4lt_1280.jpg

Now, looking at the same location with the new DirectX 12 implementation, we can lift it up to 60!

http://56.media.tumblr.com/518d4ba8baa49eb37cbe26a9d52ac87c/tumblr_inline_o3vsk0mwJl1qij4lt_1280.jpg

The above advantage we feel is the most important one for Rise of the Tomb Raider, but there are many more advantages that make us excited about DirectX 12. Another big feature, which we are also using on Xbox One, is asynchronous compute. This allows us to re-use GPU power that would otherwise go to waste, and do multiple tasks in parallel. And there is a never-before-seen level of control over NVIDIA SLI and AMD CrossFireX configurations, which means that as a developer we can take full control over those systems and ensure users get a great experience with them.
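For the asynchronous compute part, the gist at the API level is that D3D12 lets you create a dedicated compute queue next to the graphics queue, so compute work can overlap with rendering. A minimal sketch, again generic and not taken from the game's code:

// Illustrative sketch: a separate D3D12 compute queue so compute work (particles,
// post-processing, etc.) can run alongside graphics instead of waiting behind it.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

struct Queues
{
    ComPtr<ID3D12CommandQueue> graphics;
    ComPtr<ID3D12CommandQueue> compute;
};

Queues CreateQueues(ID3D12Device* device)
{
    Queues q;

    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics queue (also accepts compute/copy work)
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&q.graphics));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute-only queue, can run asynchronously
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&q.compute));

    // Work submitted to q.compute can execute while q.graphics is busy, soaking up
    // GPU units that would otherwise idle; an ID3D12Fence is signaled/waited on
    // wherever the compute results feed back into the graphics frame.
    return q;
}

How much "wasted" GPU time this actually reclaims depends on how the hardware schedules the two queues, which is part of why results differ so much between GPU vendors right now.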

As one of the first titles out there using DirectX 12, we still have many more optimizations to make, and DirectX 11 remains available for the most predictable and proven experience. However, as seen above there are already large gains to be found, and we encourage you to check out DirectX 12 for yourself in our latest patch!

[Rise of the Tomb Raider is available now on Xbox One, Xbox 360, and PC. Secure your copy at BuyROTTR.com]

baseline bum
03-11-2016, 05:57 PM
Glad I got a hyperthreaded processor now tbh.

baseline bum
03-11-2016, 05:57 PM
I'll be even gladder when Bird Sister gets off her ass though tbh

ElNono
03-11-2016, 06:00 PM
The theory is sound... we'll see the implementations, especially on current-gen hardware. The API is a lot like Vulkan and what you find in consoles.

I assume newer hardware will have better out-of-the-box support, and it really looks like we might finally have a solid performance upgrade.

baseline bum
03-11-2016, 06:08 PM
The theory is sound... we'll see the implementations, especially on current-gen hardware. The API is a lot like Vulkan and what you find in consoles.

I assume newer hardware will have better out-of-the-box support, and it really looks like we might finally have a solid performance upgrade.

ROTR is a game that really hits lesser CPUs hard in DX11. It's the first game I have seen since Crysis 3 that rapes i3s.

Cry Havoc
03-11-2016, 06:24 PM
AMD cards have been seeing anywhere from 50-120% boosts using DX12. We'll have to see if that continues to translate to the "real world" of gaming.

DJR210
03-12-2016, 11:34 AM
AMD cards have been seeing anywhere from 50-120% boosts using DX12. We'll have to see if that continues to translate to the "real world" of gaming.

Stopped reading at AMD

baseline bum
03-12-2016, 12:04 PM
Stopped reading at AMD

Hey man, the new Pascal GP104 cards are supposed to release on May 27th. You gonna jump on a GTX 1080 or GTX 1800? (not sure which it's supposed to be called).

http://www.pcper.com/news/Graphics-Cards/Rumor-NVIDIAs-Next-GPU-Called-GTX-1080-Uses-GDDR5X

DJR210
03-12-2016, 02:30 PM
Hey man, the new Pascal GP104 cards are supposed to release on May 27th. You gonna jump on a GTX 1080 or GTX 1800? (not sure which it's supposed to be called).

http://www.pcper.com/news/Graphics-Cards/Rumor-NVIDIAs-Next-GPU-Called-GTX-1080-Uses-GDDR5X

Would be nice.. Hopefully I've got some extra cash around that time.

baseline bum
03-12-2016, 02:40 PM
Would be nice.. Hopefully I've got some extra cash around that time.

They're claiming 2x performance per watt over Maxwell. With a single 8-pin I'd imagine this is going to be around a 150W card again, so I'd guess we'd be looking at something around a 980 Ti in the 70 series and a little better than a 980 Ti in the 80 series. Hopefully they do the 70 series at $350 or so like with Maxwell; that would be insane again.

baseline bum
03-12-2016, 02:43 PM
Nvidia's asynchronous compute is supposed to be way stronger in Pascal than Maxwell too, so it looks like they're expecting DirectX 12 to come in a big way this year.

313
03-12-2016, 02:45 PM
They're claiming 2x performance per watt over Maxwell. With a single 8-pin I'd imagine this is going to be around a 150W card again, so I'd guess we'd be looking at something around a 980 Ti in the 70 series and a little better than a 980 Ti in the 80 series. Hopefully they do the 70 series at $350 or so like with Maxwell; that would be insane again.
:lol AMD

baseline bum
03-12-2016, 02:52 PM
:lol AMD

AMD looks pretty impressive right now. They had a GTX 950 level gpu they were showing off in January running on maybe 45W or so, which would put them right in line with what Nvidia claims. And who knows if Nvidia can deliver on their claim? AMD is also going to have HBM2 vram on their higher end cards, which could give a really big advantage over the GDDR5X Nvidia is putting on their GP104 cards when it comes to memory bandwidth. Nvidia will only put HBM2 on their Titan/80Ti class cards using GP100. Right now I think I'm expecting AMD to have the better cards this generation, as they have actually shown one of their new cards off while Nvidia got caught showing off a mobile 980 and calling it Pascal back in January. :lol

ElNono
03-12-2016, 03:23 PM
They're claiming 2x performance per watt over Maxwell. With a single 8-pin I'd imagine this is going to be around a 150W card again, so I'd guess we'd be looking at something around a 980 Ti in the 70 series and a little better than a 980 Ti in the 80 series. Hopefully they do the 70 series at $350 or so like with Maxwell; that would be insane again.

Don't forget the PCIe slot provides an extra 75W... a single 8-pin would allow a card of up to 225W to run (150W 8-pin + 75W PCIe)... so definitely more than 150W, but less than the Titan X or 980 Ti...

Pricing will be interesting, because the 960 and 970 are still tremendous value... I would suspect these cards to be priced around 980 Ti levels for a while...

313
03-12-2016, 03:28 PM
AMD looks pretty impressive right now. They had a GTX 950 level gpu they were showing off in January running on maybe 45W or so, which would put them right in line with what Nvidia claims. And who knows if Nvidia can deliver on their claim? AMD is also going to have HBM2 vram on their higher end cards, which could give a really big advantage over the GDDR5X Nvidia is putting on their GP104 cards when it comes to memory bandwidth. Nvidia will only put HBM2 on their Titan/80Ti class cards using GP100. Right now I think I'm expecting AMD to have the better cards this generation, as they have actually shown one of their new cards off while Nvidia got caught showing off a mobile 980 and calling it Pascal back in January. :lol
When are AMD supposedly releasing? Mid summer? I'm ready to upgrade so whichever comes first I'll probably pull the trigger on.

DJR210
03-12-2016, 04:21 PM
If my 680 was a 4 GB I'd be ok for another year or so tbh

baseline bum
03-12-2016, 04:29 PM
Don't forget the PCIe slot provides an extra 75W... a single 8-pin would allow a card of up to 225W to run (150W 8-pin + 75W PCIe)... so definitely more than 150W, but less than the Titan X or 980 Ti...

Pricing will be interesting, because the 960 and 970 are still tremendous value... I would suspect these cards to be priced around 980 Ti levels for a while...

I think they're doing the 8 pin so you have some overclocking headroom, and since their cards will still spike over the rated power (Maxwell sure did). There is no way they'd release a 200W card with only a single 8-pin. Also, why do you consider the 970 tremendous value now? It has been $330 since it dropped in September 2014. GPU value sucks right now with the 970 and R9 390 around $330 when they're both old chips. Maxwell GM204 came out September 2014 and Hawaii dates back to October 2013.

I don't agree with your pricing projections. I think the odds are better that Nvidia will undercut their Maxwell lineup like they did to Kepler, when they released the $550 GTX 980 that outperformed the $650 (at that time) GTX 780 Ti and when their $330 GTX 970 came within 6% of the $650 GTX 780 Ti. I'd be surprised to see the 70 series selling for any more than $400 and the 80 series for any more than $600. Nvidia made boatloads of money off their 970 by offering 780 Ti level performance. Plus they shifted the new normal for enthusiast gamers up from $250 (GTX 760) to $330 (GTX 970) by doing so. It would be dumb not to try to recreate that kind of enthusiasm and perhaps push the new normal even a little higher, to say $360 or so, by offering another strong 70 series card. Plus they hurt AMD so badly with the 970; the R9 290 dropped from $400 to $250 within a couple of weeks, and that was an extremely costly GPU for AMD to make. Why not kick AMD when they're down? The last thing Nvidia can want is AMD making a shitload of money they can dump into R&D to compete with Nvidia over the long term. Nvidia can bust their asses now and force them to focus only on APUs and shit. They're so close to destroying AMD as any kind of competent rival.

Also if Nvidia prices these cards over $650 like you're suggesting, no one is going to buy them. Everyone will just wait for Big Pascal GP100. It won't give Nvidia the chance to double dip by having people buy the 1080 and then a year later the 1080 Ti like so many did with the 980 and then 980 Ti.

baseline bum
03-12-2016, 04:34 PM
When are AMD supposedly releasing? Mid summer? I'm ready to upgrade so whichever comes first I'll probably pull the trigger on.

Probably May or June.

baseline bum
03-12-2016, 04:53 PM
If my 680 was a 4 GB I'd be ok for another year or so tbh

The 680 is one of the things that gives me hope Pascal might be really awesome. The 680 was when Nvidia did their last node shrink from 40 nm to 28 nm, and that card spanked their Titan equivalent GTX 580 from the previous generation even with shitty launch drivers. Just a few months later the 680 started mopping the fucking floor with the 580 as the drivers caught up to the hardware advances they made that generation.

Wild Cobra
03-12-2016, 05:06 PM
I'm contemplating putting a better graphics card in my tower. I don't really need it as I seldom play Blu-rays on it. However, with multi-monitor use and several windows operating at once, it would be nice to have full 1080p without glitches. My GTX 720 has wimped out a few times on me.

May as well get a DX12 card if I do, right?

ElNono
03-12-2016, 05:29 PM
I think they're doing the 8 pin so you have some overclocking headroom, and since their cards will still spike over the rated power (Maxwell sure did). There is no way they'd release a 200W card with only a single 8-pin. Also, why do you consider the 970 tremendous value now? It has been $330 since it dropped in September 2014. GPU value sucks right now with the 970 and R9 390 around $330 when they're both old chips. Maxwell GM204 came out September 2014 and Hawaii dates back to October 2013.

I don't agree with your pricing projections. I think the odds are better Nvidia will undercut their Maxwell lineup like they did to Kepler when they released the $550 GTX 980 that outperformed the $650 (at that time) GTX 780 Ti and when their $330 GTX 970 came within 6% of the $650 GTX 780 Ti. I'd be surprised to see the 70 series selling for any more than $400 and 80 series for any more than $600. Nvidia made boatloads of money off their 970 by offering 780 Ti level performance. Plus they shifted the new normal for enthusiast gamers up from $250 (GTX 760) to $330 (GTX 970) by doing so. It would be dumb to not try to recreate that kind of enthusiasm and perhaps push the new normal even a little higher to say $360 or so by offering another strong 70 series card. Plus they hurt AMD so badly with the 970; the R9 290 dropped from $400 to $250 within a couple of weeks, and that was an extremely costly gpu for AMD to make. Why not kick AMD when they're down? The last thing Nvidia can want is AMD making a shitload of money they can dump into R&D to compete with Nvidia over the long term. Nvidia can bust their asses now and make them go to focusing only on APUs and shit. They're so close to destroying AMD as any kind of competent rival.

Also if Nvidia prices these cards over $650 like you're suggesting, no one is going to buy them. Everyone will just wait for Big Pascal GP100. It won't give Nvidia the chance to double dip by having people buy the 1080 and then a year later the 1080 Ti like so many did with the 980 and then 980 Ti.

Because the 960/970 are great performers for the price... How much lower can they really go without completely cannibalizing them?... There's still going to be a ton of DX11 games for the next couple of years, and these cards still perform extremely well (even compared to console counterparts)... These cards will have to be priced somewhere between that and the 980/980 TI, IMO, at least until they unload enough old inventory...

baseline bum
03-12-2016, 06:06 PM
Because the 960/970 are great performers for the price... How much lower can they really go without completely cannibalizing them?... There's still going to be a ton of DX11 games for the next couple of years, and these cards still perform extremely well (even compared to console counterparts)... These cards will have to be priced somewhere between that and the 980/980 TI, IMO, at least until they unload enough old inventory...

The 970 has plenty of room to go down in price as they have nothing between $200 and $330. The prices will crash when Pascal releases, but why would Nvidia care when they won't be selling Maxwell chips anymore to their partners like EVGA, Asus, etc?

ElNono
03-12-2016, 07:10 PM
The 970 has plenty of room to go down in price as they have nothing between $200 and $330. The prices will crash when Pascal releases, but why would Nvidia care when they won't be selling Maxwell chips anymore to their partners like EVGA, Asus, etc?

I'm sure they'll keep making scaled down GMxxx for their sub-$300/fanless cards. But hey, if they do crash, that'd be great, tbh... I actually bought a 960 here at home (which runs pretty great, tbh), to eventually upgrade to a beefier Pascal...

baseline bum
03-12-2016, 07:28 PM
I'm sure they'll keep making scaled down GMxxx for their sub-$300/fanless cards. But hey, if they do crash, that'd be great, tbh... I actually bought a 960 here at home (which runs pretty great, tbh), to eventually upgrade to a beefier Pascal...

That'll be interesting if they keep producing GM206 cards like the 950 and 960 to go against AMD, since AMD already has a chip targeting that level of performance at half the power consumption. If you do want a card when the prices crash though, better get it ASAP when the new shit launches. When the 980 and 970 launched the price of the GTX 780 crashed like hell, but they sold so quickly that the 780 got really expensive almost overnight and stayed there if you wanted to buy new. Sucks for people who planned on doing SLI 780 who couldn't jump on that second card right away.

ElNono
03-12-2016, 08:05 PM
That'll be interesting if they keep producing GM206 cards like the 950 and 960 to go against AMD, since AMD already has a chip targeting that level of performance at half the power consumption. If you do want a card when the prices crash though, better get it ASAP when the new shit launches. When the 980 and 970 launched the price of the GTX 780 crashed like hell, but they sold so quickly that the 780 got really expensive almost overnight and stayed there if you wanted to buy new. Sucks for people who planned on doing SLI 780 who couldn't jump on that second card right away.

They can develop a proven design like the GMxxx on the new lithography too. That cuts down on cost and consumption. They'll probably be rebranded as a new model and require less dissipation, but there's still a market out there for sub-$300 and things like all-in-one and cheaper "gaming" laptops...

baseline bum
03-13-2016, 02:19 PM
I'm sure they'll keep making scaled down GMxxx for their sub-$300/fanless cards. But hey, if they do crash, that'd be great, tbh... I actually bought a 960 here at home (which runs pretty great, tbh), to eventually upgrade to a beefier Pascal...

Here's what I mean. Check out the prices for this MSI Gaming GTX 780 Ti. Maxwell launched I think September 28 2014, and you can see how the bottom fell out of the price of this 780 Ti a few days before that, when it had been constantly oscillating between $600 and $700. By the end of October it dropped to $370. By the beginning of November it was back to almost $600 and by December it was $700. So get in fast if you want to buy a 980 Ti at a fire-sale price when Pascal comes out.

http://pcpartpicker.com/part/msi-video-card-gtx780tigaming?history_days=730

http://i.imgur.com/9IXrMQM.png

baseline bum
03-17-2016, 12:03 PM
Rumored specs of Pascal:

http://cdn.videocardz.com/1/2016/03/NVIDIA-GeForce-X80-X80Ti-X80-TITAN.png

Looks like it could shit all over the 980 Ti.

Cry Havoc
03-17-2016, 02:03 PM
DX12 could be the leap that VR needs to run smoothly.

baseline bum
03-17-2016, 02:42 PM
DX12 could be the leap that VR needs to run smoothly.

Looks like AMD might be the way to go this upcoming generation if DX12 gets heavy adoption (and it's looking like it might this season). The floating point performance of that rumored X80 is slightly less than the Fury X's. I wonder what kind of memory AMD is looking at in their non-halo $300-$600 range of cards. HBM2 seems to be in way too short supply to put in this bracket, but isn't HBM1 still limited to 4GB? That'll be a tough sell then, like it was with the Fury X when the 980 Ti had 6GB.

baseline bum
03-17-2016, 02:44 PM
I just always worry about AMD going belly up. Ever since the mining craze of 2014 they haven't even come close to matching Nvidia's sales numbers, and I think the company will be truly fucked if the Zen processors don't sell like crazy in 2017. What kind of driver support are we gonna get if AMD declares bankruptcy in a couple of years? :lol