1. #1 DisAsTerBot
    Where are my fellow spurstalk pc gamers at?!
    Thinking about upgrading my now 7-year-old PC. I'm just a casual gamer, but I've got some money to burn, so I'd like to get some pieces to keep my rig running for another 5-7 years. So let me know what y'all think I should go with. I'd like to keep the case. Current ancient parts list below:

    https://pcpartpicker.com/user/cccchris/saved/rWKPxr

  2. #2
    Take the fcking keys away baseline bum's Avatar
    My Team
    San Antonio Spurs
    Join Date
    Mar 2003
    Post Count
    93,371
    Originally Posted by DisAsTerBot (#1)
    You gotta tell us more about what you want out of the system. A few questions to answer to get a better idea of what makes sense here:

    1. What specific games and types of games are you interested in?
    2. What kind of monitor are you planning to play on? E.g. what resolution and what refresh rate?
    3. What's your budget?
    4. Do you care much about raytracing?
    5. Are you in the US? If not, where are you?

    5-7 years is unrealistic, as GPUs are ass right now for the price; nothing is even close to the value of what the GTX 1060 offered 7 years ago. E.g. Nvidia's current GTX 1060 equivalent is the $600 RTX 4070. TBH Nvidia is impossible to recommend right now unless you want the best of the best and are willing to spend $1600 for it in the RTX 4090. At least CPUs, RAM, and SSDs are amazing for the price. Motherboards are ass for the price, as are power supplies, but nowhere near as bad as GPUs.

3. #3 DisAsTerBot
    Originally Posted by baseline bum (#2)
    Thanks for the reply.
    1. I play NBA 2K, COD, RTS
    2. I play on my TV, which is 2160p I believe; not sure of the refresh rate
    3. I'm flexible. I don't mind spending $500 or so for a GPU
    4. I have to look up what that is
    5. In Texas

4. #4 DisAsTerBot
    I was hoping to reuse the case and power supply.

5. #5 baseline bum
    Originally Posted by DisAsTerBot (#3)
    1. NBA 2K on PC is still based on the previous-gen console versions. Sucks, but 2K pulled this crap years ago too, waiting until the second half of the gen. COD isn't too hard to run at good resolutions and framerates. Don't really know RTS too well other than they're decently CPU intensive.
    2. I'd probably go with an RX 7800 XT or RX 7900 XT in your shoes, because you'll want to render games at 1440p or 1800p and then use FSR or RSR to upscale to 4k. It's a little overkill for rendering 1440p, but assuming your TV doesn't support HDMI 2.1 VRR, you won't be able to use things like FreeSync that can let framerates still look great below 60 fps. If you do have HDMI 2.1 VRR I'd probably still go with one of those two GPUs, since that would mean you could target 120 fps, as I think that's the HDMI 2.1 standard.
    3. The 7800 XT is $500. The 7900 XT goes for around $600 sometimes, though usually $700 I think.
    4. None of the games you mention implement raytracing AFAIK. I wouldn't bother.
    5. Duh, I should have guessed US when you posted a pcpartpicker.com link.

    Personally I use an RX 6700 XT on a 4k monitor, usually rendering 1440p for harder-to-run games and 1800p for easier-to-run games, then upscaling to 4k. Though for something like Alan Wake 2 I have to render 1080p high or 1440p medium and upscale to 4k with FSR. But I have a monitor that does FreeSync smoothly down to 40 fps, so I don't have to chase 60 fps minimums to make games look good like I would on a display without variable refresh rate support.
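    To make the mode math concrete: the upscaler quality names map to fixed per-axis scale factors, so the internal render resolution is easy to compute. A quick sketch, assuming the published FSR 2 ratios (DLSS Quality/Balanced/Performance use the same 1.5/1.7/2.0); the helper itself is just illustrative:

    Code:
    # Per-axis scale factors for the common upscaler modes (published FSR 2 ratios)
    SCALE = {
        "ultra quality": 1.3,  # FSR only
        "quality": 1.5,
        "balanced": 1.7,
        "performance": 2.0,
    }

    def render_resolution(out_w, out_h, mode):
        """Resolution the GPU actually renders before the upscaler runs."""
        s = SCALE[mode]
        return round(out_w / s), round(out_h / s)

    for mode in SCALE:
        print(f"4k {mode}: {render_resolution(3840, 2160, mode)}")
    # 4k quality     -> (2560, 1440), i.e. "render 1440p, upscale to 4k"
    # 4k performance -> (1920, 1080), i.e. "1080p to 4k"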

    My expectation is that for this console gen you can probably get away with a $150 CPU like an i5-12400F or Ryzen 5 5600 if targeting 60 fps, though if targeting 120 fps I'd probably spend more on a Ryzen 5 7600 or Ryzen 7 7700. Mainly because the console CPUs are basically a Ryzen 7 3700X but underclocked to use less power. I don't know of any RTS that don't run well on an i5-12400F or Ryzen 5 5600, but I'd definitely look up benchmarks for the hardest-to-run RTS you're interested in. Sadly your motherboard is going to cost around $150 too if you want anything that isn't total crap. In 2016 you could run games great on a $70 budget motherboard, but for instance the Intel B660 boards below $130 tend to cause i5s to throttle because the VRMs heat up so much. You really need to spend double what boards used to cost to get something that isn't complete trash, sadly. At least with Intel. I think AMD is a little better now, but the boards still aren't cheap.

    Tell me if your TV supports HDMI 2.1 VRR. If it doesn't, then even if it's a 120Hz panel I probably wouldn't chase 120 fps, since on a panel without variable refresh rate support you're looking at either tearing or stuttering.
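    The tearing/stuttering point is easy to see with frame-time arithmetic. A toy sketch, assuming classic double-buffered vsync on the fixed-refresh side:

    Code:
    import math

    # Why a slightly missed frame hurts on a fixed-refresh panel but not with VRR.
    REFRESH_HZ = 60
    refresh_ms = 1000 / REFRESH_HZ                 # 16.67 ms per scanout

    def displayed_interval_ms(render_ms, vrr):
        if vrr:
            return render_ms                        # panel waits for the frame
        # fixed refresh + vsync: hold the frame until the next scanout boundary
        return math.ceil(render_ms / refresh_ms) * refresh_ms

    print(displayed_interval_ms(17.0, vrr=False))  # 33.33 ms -> a visible 30 fps hitch
    print(displayed_interval_ms(17.0, vrr=True))   # 17.0 ms  -> smooth ~59 fps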
    Last edited by baseline bum; 12-06-2023 at 06:52 PM.

6. #6 baseline bum
    Originally Posted by DisAsTerBot (#4)
    The PSU I probably wouldn't reuse considering it's 7 years old. The EVGA B2 is decent (otherwise they wouldn't have given it a 5-year warranty) but not something I'd be willing to trust beyond the 7 years you currently have on it, especially considering a 7800 XT or 7900 XT will put it under a pretty decent load gaming at 4k. If it was an EVGA G2 from that era that would be a little different, as those were pretty high-end PSUs. For a PSU I'd buy something C-Tier or higher. I ended up buying an A-Tier unit (Corsair RMx 750) since they barely cost any more than the C-Tier units a year ago when I upgraded my system, but the PSU market probably isn't as bad now as it was then.

    Cultists PSU tier list:

    https://cultists.network/140/psu-tier-list/
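    As a rough sanity check on sizing a replacement, here's a back-of-envelope headroom sketch. The wattage figures are ballpark assumptions (AMD's ~263 W board power for the 7800 XT, ~142 W package power for a 7700X-class chip, a crude 1.4x allowance for GPU transient spikes), not measurements:

    Code:
    # Back-of-envelope PSU headroom check; all figures are rough assumptions.
    parts_w = {
        "RX 7800 XT (board power)": 263,
        "Ryzen 7 7700X (package power)": 142,
        "board + RAM + SSD + fans (est.)": 75,
    }
    sustained = sum(parts_w.values())
    peak = sustained * 1.4   # crude allowance for GPU transient spikes

    for psu_w in (550, 650, 750, 850):
        verdict = "comfortable" if psu_w >= peak else "tight"
        print(f"{psu_w} W unit: {verdict} "
              f"(sustained ~{sustained} W, spikes up to ~{peak:.0f} W)")
    # By this rough math a 750 W unit like the RMx 750 has real headroom.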

7. #7 DisAsTerBot
    Originally Posted by baseline bum (#6)
    Thanks baseline, a lot of good info. I don't have HDMI 2.1. Honestly I'm fine playing at 1080p; anything more is a bonus in my eyes. I was playing on PCPartPicker the other day and came up with this: https://pcpartpicker.com/list/sVYQcH
    Overkill?

8. #8 baseline bum
    Originally Posted by DisAsTerBot (#7)
    I think you're spending way too much on board + CPU and too little on GPU, unless there are games you play that murder locked i5 or Ryzen 5 CPUs. I don't think I'd spend that much on an i7 plus board just to target 60 fps. Also, these days CPU overclocking is basically dead; you get minimal performance gain from it on current CPUs. Not like the days when you could push an i5-4690K to 25% more performance in Battlefield and such. And with that, Z-series boards usually aren't worth paying for IMO. Especially not at the opportunity cost of getting a better GPU.

    Also, if you really do want to OC the CPU, the Hyper 212 is pretty weak for the price, not enough cooler for a modern i7/i9/5800X3D/7800X3D pushed balls to the wall. Not even an unlocked i5, I think. I wanna say the Thermalright Peerless Assassin 120 and the Scythe Fuma are the best coolers in that price range. The Gamers Nexus youtube channel has some great videos on coolers in case there's one I forgot.

9. #9 baseline bum
    Also, SSDs are really cheap now, like $50 for a decent 1TB M.2 PCIe 4.0x4 drive, even good 2TB ones below $100. COD games on a 250GB SSD sound rough. I was using that same 850 EVO 250GB until a year and a half ago and couldn't fit RDR2 on it, for example.

10. #10 DisAsTerBot
    Originally Posted by baseline bum (#9)
    https://pcpartpicker.com/list/YFwKKX
    What do you think??

11. #11 DisAsTerBot
    Also, is there any market for my old parts if I end up building this new PC?

12. #12 baseline bum
    Should be able to run basically everything at 4k with FSR Quality or FSR Ultra Quality (or RSR at 1800p/1440p for games not supporting FSR) with high settings. Should even be able to run a decent number of games at native 4k without any upscaling. I made a few small changes:

    * Changed your cooler to the Peerless Assassin 120 since it's both $15 cheaper and a little better performing
    * Changed the RAM to a different G.Skill kit with the same timings for $8 less
    * Changed the SSD to something with a write cache and PCIe 4.0x4 instead of PCIe 3.0x4 for $16 more (quick bandwidth numbers below). If you'd rather save the $16, though, it's no big deal to stick with the original SSD you picked, as it's plenty fast too.
    * Changed the PSU to a slightly better one with a ten-year warranty that can deliver 100W more, for the same price after a $10 rebate.
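    For context on that SSD swap, the theoretical link numbers (sequential ceilings; real drives land below these):

    Code:
    # Theoretical PCIe link bandwidth; both gen 3 and gen 4 use 128b/130b encoding.
    def pcie_gb_per_s(gt_per_s, lanes):
        return gt_per_s * (128 / 130) / 8 * lanes

    print(f"PCIe 3.0 x4: {pcie_gb_per_s(8, 4):.2f} GB/s")   # ~3.94
    print(f"PCIe 4.0 x4: {pcie_gb_per_s(16, 4):.2f} GB/s")  # ~7.88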

    https://pcpartpicker.com/list/4N9PQP

    Gamers Nexus testing makes the Peerless Assassin look like a no-brainer buy for an air cooler on any CPU below an i9.



    If you'd like to cut $45 off the build without losing much, you might consider swapping down to the Ryzen 7 7700 (non-X) and using the stock cooler instead of buying the Peerless Assassin. Techpowerup's gaming benchmarks put the 7700 and 7700X at virtually the same performance, where they're testing at 720p to ensure it's a CPU test and not a GPU test. Though that is with a Noctua cooler.

    https://www.techpowerup.com/review/a...-non-x/18.html

    I'd probably stick with the 7700X and the Peerless Assassin though if you care about noise levels, especially given how ridiculously hot our last two summers have been.

    NOTE: I don't know AMD boards well enough to feel comfortable recommending a specific one. I know MSI Pro-A boards are usually pretty good value with competent VRM cooling on the Intel side, so I would expect the same on the AMD side too.
    Last edited by baseline bum; 12-06-2023 at 11:58 PM.

13. #13 ElNono
    I have nothing against the AMD GPUs, but if you're really looking to go long with them, there are some pluses that are NVidia RTX only.

    For example, DLSS is way, way superior to FSR2 or FSR3, and AMD is in no position to catch up with the current hardware, IMVHO.

    Similarly, things like ray reconstruction are NVidia only right now, and while you might not use ray tracing today, it's something I can see games supporting in the future.

    All that said, as bum said, it's difficult to recommend an NVidia GPU right now. Not saying your choices are bad or anything, just pointing this out.

14. #14 DisAsTerBot
    Originally Posted by ElNono (#13)
    Well damn, just when I thought I was narrowing the choices down. So if I go Nvidia, do I need to go with Intel? I know you said it's difficult to recommend an Nvidia GPU, BUT what do you recommend, lol?

15. #15 baseline bum
    Originally Posted by ElNono (#13)
    While DLSS is unquestionably better than FSR, you're also paying through the nose for it. So either pay $650 to $700 for an RTX 4070 that gets the same performance as the RX 7800 XT but has DLSS, or drop to the RTX 4060 Ti 16GB for $450 to $500 and end up with a card the 7800 XT crushes in performance. We're talking the 7800 XT beating the 4060 Ti 16GB in framerate by 37% at 1440p in Techpowerup's test suite of ~20 games, for example. In the former case of paying extra for the RTX 4070, you're also having your VRAM cut to 12GB from the 16GB on the 7800 XT, because Nvidia loves planned obsolescence. They have been doing this since the GTX 680 in 2012, putting too little VRAM on the card to get you to upgrade faster. I'm so glad I bought a 12GB 6700 XT instead of an 8GB RTX 3060 Ti seeing how horribly 8GB is aging these days.
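    The value argument reduces to simple arithmetic. A sketch using the ~37% figure above; the prices are the street prices mentioned in this thread, so treat them as assumptions:

    Code:
    # Relative raster performance normalized to the 7800 XT (assumed prices).
    cards = {
        "RX 7800 XT":       {"price": 500, "rel_perf": 1.00},
        "RTX 4060 Ti 16GB": {"price": 475, "rel_perf": 1.00 / 1.37},  # 7800 XT +37%
        "RTX 4070":         {"price": 650, "rel_perf": 1.00},         # ~same raster
    }
    base = cards["RX 7800 XT"]
    for name, c in cards.items():
        value = (c["rel_perf"] / c["price"]) / (base["rel_perf"] / base["price"])
        print(f"{name}: {100 * value:.0f}% of the 7800 XT's perf-per-dollar")
    # Both Nvidia cards land around ~77% of the 7800 XT's performance per dollar.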

    FSR is still pretty good when upscaling from higher resolutions like 1440p and especially 1800p, while DLSS's superiority really shows mostly when upscaling from low resolutions like 1080p and 720p. Hardware Unboxed did a good video on this, where 4k Quality (e.g. rendering 1440p and upscaling to 4k) would be the closest to DisAsTerBot's use case on a 7800 XT targeting 60 fps, though it's good enough hardware to target 4k Ultra Quality instead most of the time, where the difference will be even smaller.



    Even though DLSS is clearly better, on a 4060 Ti 16GB for the same money you're just not going to be able to upscale from as high a render resolution as you would on a 7800 XT, which completely negates the DLSS advantage IMO. As for RT, it requires taking such a hatchet to resolution that in games like Cyberpunk you're still not going to lock to 60 fps at 1080p even on a 4070, so DLSS Performance to 4k won't either. You're talking about dropping to DLSS Ultra Performance at 4k, which is going to drop your render resolution to 720p, where even DLSS can't save it if you want RT in Cyberpunk. So the RTX 4070 really doesn't seem like enough GPU for RT to be worthwhile now, much less in future games. Whereas for less RT-heavy games like Doom Eternal, Far Cry 6, Resident Evil 4, Spider-Man, and many others, you're not taking nearly the hit in performance on AMD vs Nvidia turning RT on. RT still seems like something I wouldn't bother too much with on anything lower than the $1000 RTX 4080.

    Nvidia definitely has the better technology in upscaling and raytracing, but between the much higher price and the planned obsolescence via insufficient VRAM, there is no way I could recommend the RTX 4070 over the 7800 XT. And between the 4060 Ti 16GB (there is also a trash 8GB 4060 Ti you should steer well clear of) and the RX 7800 XT, the 7800 XT can brute-force way beyond the 4060 Ti 16GB, completely overcoming Nvidia's better upscaling. Even in RT the 4060 Ti 16GB is only ~9% faster than the 7800 XT in Cyberpunk at 1080p in Techpowerup's benches, with the 7800 XT 11% faster than the 4060 Ti 16GB at 1080p in their raytracing test suite.

    https://www.techpowerup.com/review/a...800-xt/34.html

    Nvidia is just monopoly pricing right now.
    Last edited by baseline bum; 12-07-2023 at 01:02 PM.

16. #16 baseline bum
    Originally Posted by DisAsTerBot (#14)
    Nah, the biggest reason one might favor Intel over AMD is if you care a lot about emulation, where Intel tends to beat AMD by a little on single-core performance at the same price tier. Except for PS3 emulation; you want AMD there because the Ryzen 7000 series supports AVX-512, while Intel hasn't on its consumer CPUs since the 11000 series, and AVX-512 makes a night-and-day difference in some games on RPCS3. For PC gaming outside of emulation, Intel and AMD are pretty near parity. The AMD 7000 series is a little better than the Intel 13000 series IMO, whereas the Intel 12000 series was a little better than the AMD 5000 series. You have to go back a few years to see much difference, say when Intel's 9000/10000 series were considerably better than AMD's 3000 series.
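    If you want to check a chip for it yourself, a quick Linux-only sketch (this only checks the avx512f foundation flag; RPCS3 also benefits from some of the extensions):

    Code:
    # Read the CPU feature flags from /proc/cpuinfo and look for AVX-512F.
    with open("/proc/cpuinfo") as f:
        flags = next((line.split(":")[1].split()
                      for line in f if line.startswith("flags")), [])
    print("AVX-512F supported:", "avx512f" in flags)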

17. #17 ElNono
    Originally Posted by baseline bum (#15)
    I don't dispute the criticism about NVidia, pricing and whatnot. However, as a person that works with FSR2 and 3 a lot, I can tell you it's pure unadulterated garbage compared to DLSS, TSR, or TAAU.

    It's actually fairly amazing to me in this day and age how people just got accustomed to the artifacting and flickering introduced by that tech.

    However, it is true that if you're not upscaling much, you could probably get away with using TAAU.

    Anyways, just wanted to point that out. I understand both sides of the coin.

18. #18 baseline bum
    Originally Posted by ElNono (#17)
    FSR3/DLSS3 frame gen I have no interest in and can't imagine ever turning on. The first thing I always do with any TV I get is turn off all that frame-gen crap. So would you rather use DLSS from 1080p to 4k instead of FSR2 from 1440p to 4k? That's the kind of tradeoff you have to make going Nvidia vs AMD when holding the cost of the GPU fixed at ~$500.

19. #19 ElNono
    Originally Posted by baseline bum (#18)
    I don't really care about frame gen either, tbh, unless there's some game I'd really, really rather play at 120fps (but the input lag it adds is noticeable anyway).

    Just talking about upscaling and things like ray guiding. Look at the first 15 mins of the video you posted. FSR is basically ugly/unusable in half the tests he does. I would rather use DLSS for any resolution.

    That said, you're correct that it's difficult to justify the price NVidia charges these days. Not saying there's an easy answer, just pointing out that if you're looking for the next 3+ years, RTX probably has the better longevity value.

20. #20 baseline bum
    Originally Posted by ElNono (#19)
    IDK. The two games where FSR 4k Quality was pretty bad in that first 15 minutes were Forspoken and Dead Space, but DLSS was awful in Dead Space too (though it was ghosting on DLSS while it was shimmer on FSR). While DLSS Performance mode (e.g. when you're dropping resolution by a scale of 2x on each axis) murders FSR Performance mode (the same 2x scale on each axis), as is repeatedly shown in the video, I have doubts about DLSS saving a GPU with insufficient VRAM a few years from now, when you'd need to run DLSS/FSR Performance instead of DLSS/FSR Quality on a 7800 XT. For instance, Hardware Unboxed also showed full-resolution textures being repeatedly cycled out and replaced with immediately noticeable LOD textures in Forspoken and Hogwarts Legacy on the 8GB RTX 3070 while they stayed full res on the 12GB RX 6700 XT, I think at 1440p if I remember right. Are you not worried something similar will happen in the next few years with 12GB cards as 16GB becomes more mainstream? Nvidia's VRAM deficiencies seem like at least one step back on future-proofing, if DLSS can be considered one step forward.
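    To put rough numbers on why VRAM runs out: a back-of-envelope texture sketch, where the BC7 size is the real 1 byte/texel block-compression rate but the map count is an illustrative assumption, not any real game's numbers:

    Code:
    # Rough VRAM cost of texture data; assumes BC7 (1 byte/texel) and a full
    # mip chain (~4/3x the base level).
    def texture_mib(width, height, bytes_per_texel=1.0, mip_chain=True):
        base = width * height * bytes_per_texel
        return base * (4 / 3 if mip_chain else 1) / 2**20

    per_map = texture_mib(4096, 4096)
    print(f"{per_map:.1f} MiB per 4096x4096 BC7 texture")          # ~21.3 MiB
    print(f"{500 * per_map / 1024:.1f} GiB for ~500 resident maps")  # ~10.4 GiB
    # ...and that's before geometry, render targets, and BVH data for RT.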

21. #21 ElNono
    Originally Posted by baseline bum (#20)
    The shimmering is everywhere on almost every title. If you want to look for truly awful FSR2 upscales, check Immortals of Aveum and the latest Cyberpunk.
    Also, Hogwarts was Unreal 4. You're just not going to get future titles using that. Immortals is UE5, and because it's so demanding, the upscaler becomes more prominent. That title doesn't even use ray tracing (at least on consoles).

    I also don't think we're going to see 16GB cards as the 'standard' for a while, at least while current-gen consoles are still at 8GB. If anything, that mostly applies to texture quality, and while you do use LOD bias with upscalers, if you're really rendering at 1080p or 1440p it shouldn't matter as much.
    Lastly, for streaming, I think we're hopefully going to see some hardware support for DirectStorage soon, which should alleviate that issue quite a lot. That would bring parity with current-gen consoles and deal with non-CPU-driven texture loads.

    But we'll see. Again, not being contrarian to your point, just highlighting that I feel AMD is far behind in a lot of this tech.

22. #22 baseline bum
    Originally Posted by ElNono (#21)
    Is it standard for big-budget PS5 games to only use 8GB of the memory pool as VRAM? I figured using ~10GB would be more the sweet spot for at least multiplatform games, thanks to the Series X having segmented memory with a fast 10GB pool and a slower 6GB one. I have seen so many problems with 8GB on PC ports. E.g. games like RE Village, RE4, Doom Eternal, The Last of Us, Forspoken, Hogwarts, Deathloop, Far Cry 6, and a lot of others I forget, where VRAM-starved cards like the 3070/3070 Ti/3060 Ti can lose to the 3060 12GB when playing at higher res or turning on RT.

    Maybe I just don't notice it like you do since you're a gamedev deep in the weeds, or maybe it's because on PoorC I'm gaming on a 28-inch 4k monitor instead of say a 70-inch 4k TV, but I have been pretty pleased with FSR and even RSR in most games I have played. Though always using at least a 1440p native render res. I did hate how it looked in Starfield, where I had to render 1080p to stay in my panel's FreeSync range. Oh well, the game bored me to death anyways.

23. #23 ElNono
    Originally Posted by baseline bum (#22)
    So on the PS5 at least, you have ~12GB of total usable memory once you account for loading the executable, which on a AAA game can vary but is normally ~1GB in size. But this is unified memory (UMA), so the same RAM is used for game data, audio, etc. and as VRAM; there's a lot of stuff that's not strictly video that uses up memory. Some of that gets streamed, but some needs to be resident, so you sort of work it out depending on the game you're making. There are also techniques like virtual texturing, where you can stream chunks or parts of textures instead of having a fixed-size texture memory pool.
    The SSD helps a ton with that, and that's why I think DirectStorage is a bit of the PC answer to it, which should help things.
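    Roughly, the kind of budget being described looks like this. The split below is an invented illustrative example; only the ~12GB-usable and ~1GB-executable figures come from the description above:

    Code:
    # Illustrative UMA budget for a console-style system (invented split).
    USABLE_GIB = 12.0
    budget_gib = {
        "executable/code": 1.0,
        "resident game data": 2.5,
        "audio": 1.0,
        "texture streaming pool": 4.5,
        "render targets + GPU misc": 2.0,
        "headroom/transients": 1.0,
    }
    assert sum(budget_gib.values()) <= USABLE_GIB
    for name, gib in budget_gib.items():
        print(f"{name:<26} {gib:4.1f} GiB")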


    Originally Posted by baseline bum (#22)
    We're also pretty picky about clean/pristine looks. You own a PS5; you know what I'm talking about (I can't really speak for the PC ports, which I've heard have been pretty shoddy). You remember the big hoopla about the shimmering in HZD 2 in 4K. I was shocked that slipped through; we don't ship games like that.
    At any rate, I can't comment much more about this without getting in trouble, but we'll revisit in the future.

24. #24 baseline bum
    Originally Posted by ElNono (#23)
    Horizon Forbidden West in its first few months had shimmering unlike anything I have seen since the 360 gen. That's a pretty extreme example. Strange though, I thought the PS5 had 16GB of unified memory.

25. #25 Cry Havoc
    Originally Posted by ElNono (#19)
    There's a third option here, and that's that AMD cards support XeSS on some titles (a growing list). XeSS might still not be up to DLSS quality, but it's better than FSR 1/2, and at least equal to FSR 2.2 for now. It's a really solid option in Cyberpunk 2077; if FSR 2.1 is 85% of DLSS, then XeSS is probably 90-95%. I also don't see the artifacting you're talking about on FSR until you drop to at least Balanced at 1080p or Performance at 1440p. Balanced 1440p for most titles is really solid, if not as great as DLSS. But XeSS is even better and gives the Radeon cards more parity with Nvidia.

    DB, you can easily break this down into a few categories to determine what you want on your next GPU:

    NVidia chips:
    Newest tech
    Beats AMD solidly (to outright blowing them away) in AI work/Stable Diffusion
    Raytracing king now and for the foreseeable future
    Lower power draw (if electric bills hurt)
    DLSS is probably the best single piece of tech we've ever seen for improving a single GPU's shelf life or letting it overperform. It's better than overclocking to squeeze out another 5-15%, that's for sure.
    Much more expensive FLOP for FLOP
    Also includes less VRAM than AMD
    Chipsets historically haven't aged as well as AMD offerings (with the notable exception of the 1080 Ti)
    Great resale value, for what you're buying

    AMD GPUs:
    Much better performance/price ratio for rasterized gaming (not raytracing)
    Generally at least 4GB more VRAM at the same price point
    A much, much better software suite for accessing system settings. Seriously, NVidia's UI looks like it's from 2002.
    Arguably a more well-rounded set of features (AMD Anti-Lag, Chill, Super Res, Enhanced Sync, sharpening). None of these are as good as DLSS, but they're useful nonetheless
    Historically their cards outlive their usual lifespans because of driver optimization
    Very capable at the higher end for raytracing/AI work
    They run hotter
    Use more juice from the wall
    For enterprise applications they're way behind Nvidia

    A wash for me is drivers, as I feel like neither company can claim they're truly reliable. The /r/nvidia sub just flatly tells people to wait several days before upgrading to new versions now, but AMD's drivers don't seem to have the same magic to give their older cards the huge boosts they used to get (former 290X owner, heyo).

    I just built a new rig and went with the 7900 XT. I'd say the 7800 XT is the sweet spot right now for gaming, though maybe overkill for what you're playing. I decided that VRAM is going to be more of a concern in the future than which upscaler I'm using, as the upscalers are already good and will likely only be better in 2-3 years when my card really needs to use them.
