Page 1 of 2 (Results 1 to 25 of 30)
  1. #1 baseline bum
    http://tombraider.tumblr.com/post/14...se-of-the-tomb
    Dev Blog: Bringing DirectX 12 to Rise of the Tomb Raider

    Jurjen Katsman, Studio Head at Nixxes Software

    [Our developer blogs lift the curtain on the creation of Lara's first tomb raiding expedition, and the technology we use to constantly improve it. Following the release of Rise of the Tomb Raider for PC, the title will be one of the first in the industry to integrate DirectX 12 support, allowing fans with older PCs or newer rigs to run at higher framerates and higher graphical settings. Nixxes Studio Head Jurjen Katsman dives deep into the new technology below.]

    Pushing the boundaries of technology on PC has always been a passion of the development team at Nixxes, and Crystal Dynamics and Square Enix have been great partners for us in doing so. One of the challenges with PC development is guaranteeing that players on as many different PC configurations as possible can have a great experience. For us this means ensuring that users with older PCs can still get a great gameplay experience, but also that users with higher-end machines can get the most out of their hardware, including the highest quality visuals, frame-rate, and other technical enhancements.

    One thing we are very excited about to help us further realize those goals is the new DirectX 12 graphics API that is available on Windows 10. In the patch released today on Steam – and coming soon to the Windows 10 Store – we will be adding DirectX 12 support to Rise of the Tomb Raider.

    At Nixxes we have a long history of working with consoles as well, and one of the large differences between developing for consoles and developing for PCs is the level of access to the hardware available to us. On consoles we can leverage every single hardware feature and every bit of CPU power available in the most efficient way possible. With DirectX 12 we are taking a massive step forward in bringing a lot of that flexibility to the PC as well. For Rise of the Tomb Raider the largest gain DirectX 12 will give us is the ability to spread our CPU rendering work over all CPU cores, without introducing additional overhead. This is especially important on 8-core CPUs like Intel i7s or many AMD FX processors.
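    As a rough, self-contained C++ illustration of that scheduling difference (a simplified sketch, not the engine's actual code and not the real Direct3D API; the draw-call count and per-draw cost below are invented numbers purely for demonstration), the snippet splits a simulated frame's worth of command-recording work across every core the machine has, the way DirectX 12 lets an engine fill one command list per worker thread, and compares that to doing all of it on one core.

```cpp
// Illustration only: simulates why splitting per-frame CPU "submission" work
// across cores (the DirectX 12 model) beats doing it all on one core (typical
// DirectX 11 behaviour). Workload numbers are invented for the example.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for the CPU cost of recording one draw call.
static void record_draw_call() {
    volatile int sink = 0;
    for (int i = 0; i < 20000; ++i) sink += i;  // burn a little CPU time
}

// Time one simulated "frame" with the draw calls divided among N workers.
static double frame_time_ms(int draw_calls, unsigned workers) {
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([=] {
            // Each worker records an even share of the frame's draw calls,
            // the way one thread would fill its own command list.
            for (int i = w; i < draw_calls; i += static_cast<int>(workers))
                record_draw_call();
        });
    }
    for (auto& t : pool) t.join();
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}

int main() {
    const int draws = 4000;  // pretend draw calls per frame
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::printf("1 core : %.1f ms per 'frame'\n", frame_time_ms(draws, 1));
    std::printf("%u cores: %.1f ms per 'frame'\n", cores, frame_time_ms(draws, cores));
}
```

    On a multi-core machine the second number should come out several times smaller than the first, which is exactly the headroom that multi-threaded command submission opens up.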

    Let me explain how this helps the performance of your game. When using DirectX 11, in situations where the game is under heavy load – for example in the larger hubs of the game – the individual cores may not be able to feed a fast GPU like an NVIDIA GTX 980 or even NVIDIA GTX 970 quickly enough. This means the game may not hit the desired frame-rate, requiring you to turn down settings that impact CPU performance. Even though the game can use all your CPU cores, the majority of the DirectX 11 related work still happens on a single core. With DirectX 12 a lot of the work is spread over many cores, and the framerate the game runs at can be much higher for the same settings. Check out the picture below for a visual example of how the CPU work is distributed:

    [Image: CPU core utilization under DirectX 11 vs DirectX 12]

    As an example to illustrate the point, below is a screenshot of a scene in the game running on an Intel i7-2600 processor with 1333 MHz memory, paired with a GTX 970. Using DirectX 11 at High settings we would only get 46 fps.

    [Image: The scene rendering at 46 fps under DirectX 11]

    Now look at the same location with the new DirectX 12 implementation, where we can lift it up to 60!

    [Image: The same scene rendering at 60 fps under DirectX 12]

    The above advantage we feel is the most important one for Rise of the Tomb Raider, but there are many more advantages that make us excited about DirectX 12. Another big feature, which we are also using on Xbox One, is asynchronous compute. This allows us to re-use GPU power that would otherwise go to waste, and to do multiple tasks in parallel. And there is a never-before-seen level of control over NVIDIA SLI and AMD CrossFireX configurations, which means that as a developer we can take full control over those systems and ensure users get a great experience with them.
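    For the curious, here is roughly what the queue setup behind asynchronous compute looks like at the API level. This is a minimal, hedged sketch using the public Direct3D 12 interfaces (device creation on the default adapter, a direct queue, a separate compute queue, and a fence to order them), not the game's actual renderer; error handling and the real rendering and compute workloads are omitted.

```cpp
// Minimal Windows/D3D12 sketch: one graphics (direct) queue plus one compute
// queue. Work submitted to the compute queue can execute alongside graphics
// work, which is the basis of asynchronous compute. Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No DirectX 12 capable GPU/driver found.\n");
        return 1;
    }

    // Queue for normal rendering work (draws, copies, compute).
    D3D12_COMMAND_QUEUE_DESC directDesc = {};
    directDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> directQueue;
    device->CreateCommandQueue(&directDesc, IID_PPV_ARGS(&directQueue));

    // Separate queue that only accepts compute/copy work; submissions here
    // can overlap with whatever the direct queue is doing.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // A fence is how the two queues synchronize: the direct queue can be told
    // to wait until the compute queue has signalled a given fence value.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1);  // compute queue: "my work is done"
    directQueue->Wait(fence.Get(), 1);     // graphics queue: wait for it

    std::printf("Created direct + compute queues with a shared fence.\n");
    return 0;
}
```

    The key point is that the compute queue is a second, independent submission path: work placed on it can run while the direct queue is busy drawing, and the fence is how the two are ordered when one depends on the other.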

    Being one of the first game titles out there using DirectX 12, there are still many more optimizations to make, and DirectX 11 remains available for the most predictable and proven experience. However, as seen above there are large gains to be found already, and we encourage you to check out DirectX 12 for yourself in our latest patch!

    [Rise of the Tomb Raider is available now on Xbox One, Xbox 360, and PC. Secure your copy at BuyROTTR.com]

  2. #2 baseline bum
    Glad I got a hyperthreaded processor now tbh.

  3. #3 baseline bum
    I'll be even gladder when Bird Sister gets off her ass though tbh

  4. #4 ElNono
    The theory is sound... we'll see the implementations, especially on current gen hardware. The API is a lot like Vulkan and what you find in consoles.

    I assume newer hardware will have better out of the box support, and it really looks like we might finally have a solid performance upgrade.

  5. #5 baseline bum
    [Quoting ElNono, #4]
    ROTR is a game that really hits lesser CPUs hard in DX11. It's the first game I have seen since Crysis 3 that rapes i3s.

  6. #6 Cry Havoc
    AMD cards have been seeing anywhere from 50-120% boosts using DX12. We'll have to see if that continues to translate to the "real world" of gaming.

  7. #7 DJR210
    [Quoting Cry Havoc, #6]
    Stopped reading at AMD

  8. #8 baseline bum
    [Quoting DJR210, #7]
    Hey man, the new Pascal GP104 cards are supposed to release on May 27th. You gonna jump on a GTX 1080 or GTX 1800? (not sure which it's supposed to be called).

    http://www.pcper.com/news/Graphics-C...80-Uses-GDDR5X

  9. #9 DJR210
    [Quoting baseline bum, #8]
    Would be nice.. Hopefully I've got some extra cash around that time.

  10. #10 baseline bum
    [Quoting DJR210, #9]
    They're claiming 2x performance per watt over Maxwell. With a single 8 pin I'd imagine this is going to be around a 150W card again, so I'd guess we'd be looking at something around a 980 Ti in the 70 series and a little better than a 980 Ti in the 80 series. Hopefully they do the 70 series at $350 or so like with Maxwell; that would be insane again.

  11. #11 baseline bum
    Nvidia's asynchronous compute is supposed to be way stronger in Pascal than Maxwell too, so it looks like they're expecting DirectX 12 to come in a big way this year.

  12. #12 313
    [Quoting baseline bum, #10]
    AMD

  13. #13 baseline bum
    AMD looks pretty impressive right now. They had a GTX 950 level GPU they were showing off in January running on maybe 45W or so, which would put them right in line with what Nvidia claims. And who knows if Nvidia can deliver on their claim? AMD is also going to have HBM2 VRAM on their higher end cards, which could give a really big advantage over the GDDR5X Nvidia is putting on their GP104 cards when it comes to memory bandwidth. Nvidia will only put HBM2 on their Titan/80 Ti class cards using GP100. Right now I think I'm expecting AMD to have the better cards this generation, as they have actually shown one of their new cards off while Nvidia got caught showing off a mobile 980 and calling it Pascal back in January.

  14. #14 ElNono
    [Quoting baseline bum, #10]
    Don't forget the PCIe slot provides an extra 75W... a single 8 pin would allow a card up to 225W to run (150W 8-pin + 75W PCIe)... so definitely more than 150W, but less than the Titan X or 980 Ti...

    Pricing will be interesting, because the 960 and 970 are still tremendous value... I would suspect these cards to price around the 980 Ti value for a while..
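    For anyone keeping score, the connector arithmetic in this exchange works out as below. This is a trivial sketch: the 75W slot, 75W 6-pin and 150W 8-pin figures are the standard PCI Express limits, and the example card wattage is purely an assumption for illustration, not a leaked spec.

```cpp
// Quick sanity check of board power ceilings from standard connector limits:
// PCIe slot = 75W, 6-pin = 75W, 8-pin = 150W. Example TDP is made up.
#include <cstdio>

int main() {
    const int slot = 75, six_pin = 75, eight_pin = 150;

    int single_6pin = slot + six_pin;        // 150W ceiling
    int single_8pin = slot + eight_pin;      // 225W ceiling
    int dual_8pin   = slot + 2 * eight_pin;  // 375W ceiling

    std::printf("slot + 6-pin    = %dW max\n", single_6pin);
    std::printf("slot + 8-pin    = %dW max\n", single_8pin);
    std::printf("slot + 2x 8-pin = %dW max\n", dual_8pin);

    // A hypothetical ~180W card would still fit comfortably under a single
    // 8-pin budget, with headroom left for overclocking spikes.
    return 0;
}
```

    In other words, a board with a single 8-pin can legitimately be specced well above 150W before a second connector is required.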

  15. #15 313
    [Quoting baseline bum, #13]
    When are AMD supposedly releasing? Mid summer? I'm ready to upgrade so whichever comes first I'll probably pull the trigger on.

  16. #16 DJR210
    If my 680 was a 4 GB I'd be ok for another year or so tbh

  17. #17 baseline bum
    [Quoting ElNono, #14]
    I think they're doing the 8 pin so you have some overclocking headroom, and since their cards will still draw over the rated power (Maxwell sure did). There is no way they'd release a 200W card with only a single 8-pin. Also, why do you consider the 970 tremendous value now? It has been $330 since it dropped in September 2014. GPU value sucks right now with the 970 and R9 390 around $330 when they're both old chips. Maxwell GM204 came out September 2014 and Hawaii dates back to October 2013.

    I don't agree with your pricing projections. I think the odds are better Nvidia will undercut their Maxwell lineup like they did to Kepler when they released the $550 GTX 980 that outperformed the $650 (at that time) GTX 780 Ti, and when their $330 GTX 970 came within 6% of the $650 GTX 780 Ti. I'd be surprised to see the 70 series selling for any more than $400 and the 80 series for any more than $600. Nvidia made boatloads of money off their 970 by offering 780 Ti level performance. Plus they shifted the new normal for enthusiast gamers up from $250 (GTX 760) to $330 (GTX 970) by doing so. It would be dumb not to try to recreate that kind of enthusiasm, and perhaps push the new normal even a little higher to say $360 or so, by offering another strong 70 series card. Plus they hurt AMD so badly with the 970; the R9 290 dropped from $400 to $250 within a couple of weeks, and that was an extremely costly GPU for AMD to make. Why not kick AMD when they're down? The last thing Nvidia can want is AMD making a load of money they can dump into R&D to compete with Nvidia over the long term. Nvidia can bust their asses now and force them to retreat to focusing only on APUs. They're so close to destroying AMD as any kind of competent rival.

    Also if Nvidia prices these cards over $650 like you're suggesting, no one is going to buy them. Everyone will just wait for Big Pascal GP100. It won't give Nvidia the chance to double dip by having people buy the 1080 and then a year later the 1080 Ti like so many did with the 980 and then 980 Ti.
    Last edited by baseline bum; 03-12-2016 at 04:44 PM.

  18. #18 baseline bum
    [Quoting 313, #15]
    Probably May or June.

  19. #19 baseline bum
    [Quoting DJR210, #16]
    The 680 is one of the things that gives me hope Pascal might be really awesome. The 680 was when Nvidia did their last node shrink from 40 nm to 28 nm, and that card spanked their Titan-equivalent GTX 580 from the previous generation even with poor launch drivers. Just a few months later the 680 started mopping the floor with the 580 as the drivers caught up to the hardware advances they made that generation.

  20. #20 Wild Cobra
    I'm contemplating putting a better graphics card in my tower. I don't really need it as I seldom play Blu-rays on it. However, using multi monitor, and having several windows in operation at once, it would be nice to have full 1080p without glitches. My GTX 720 has wimped out a few times on me.

    May as well get a DX12 card if I do, right?

  21. #21 ElNono
    [Quoting baseline bum, #17]
    Because the 960/970 are great performers for the price... How much lower can they really go without completely cannibalizing them?... There's still going to be a ton of DX11 games for the next couple of years, and these cards still perform extremely well (even compared to console counterparts)... These cards will have to be priced somewhere between that and the 980/980 Ti, IMO, at least until they unload enough old inventory...

  22. #22 baseline bum
    [Quoting ElNono, #21]
    The 970 has plenty of room to go down in price as they have nothing between $200 and $330. The prices will crash when Pascal releases, but why would Nvidia care when they won't be selling Maxwell chips anymore to their partners like EVGA, Asus, etc?

  23. #23 ElNono
    [Quoting baseline bum, #22]
    I'm sure they'll keep making scaled down GMxxx for their sub-$300/fanless cards. But hey, if they do crash, that'd be great, tbh... I actually bought a 960 here at home (which runs pretty great, tbh), to eventually upgrade to a beefier Pascal...

  24. #24 baseline bum
    [Quoting ElNono, #23]
    That'll be interesting if they keep producing GM206 cards like the 950 and 960 to go against AMD, since AMD already has a chip targeting that level of performance at half the power consumption. If you do want a card when the prices crash though, better get it ASAP when the new cards launch. When the 980 and 970 launched the price of the GTX 780 crashed hard, but they sold so quickly that the 780 got really expensive almost overnight and stayed there if you wanted to buy new. Sucks for people who planned on doing SLI 780 who couldn't jump on that second card right away.

  25. #25 ElNono
    [Quoting baseline bum, #24]
    They can develop a proven design like the GMxxx on the new lithography too. That cuts down on cost and power consumption. They'll probably be rebranded as a new model and require less heat dissipation, but there's still a market out there for sub-$300 cards and things like all-in-ones and cheaper "gaming" laptops...
