What's the word on the launch performance of GTA V on AMD cards? I don't recall too much moaning.
That really is a chicken move by the devs though, since an R9 290 has damn near the horsepower of a GTX 970 and not a piece of crap like the 960.
Not too bad. For a couple of days AMD was actually bundling GTA V with the R9 290 and R9 290x. Kind of surprised to see the GTX 970 outperforming the GTX 780 Ti here.
I never owned an AMD CPU, tbh, although some of our clients had boxes with them...
My main beef with Intel was how power-inefficient they were (I'm talking 386, 486, Pentium days here). They always had the raw processing power, but you had a single-core CPU chewing 70W at idle and the fan buzzing like crazy. Then with the Core 2 processors they added SpeedStep, which was an improvement, and now the i series is just phenomenal. AMD copied a lot of that with their PowerNow! tech, but even on their current procs, if you get near 3GHz you're going to be hitting 90-100W TDP...
Now I would agree most people might not care about power and noise, but I do put some value on design improvements like that. When you're surrounded by computers for 8+ hours a day for years on end, you start appreciating little things like that.
You should look at the Broadwell i7-5775C or i5-5675C then, since they're 65W TDP. They're supposed to release early next month. I wonder how low they could get the TDP on 14nm Broadwell Xeon E3s without the Iris Pro graphics.
I take that back, the Broadwell i7 is supposed to be $480! And the Broadwell i5 $350. Who is going to pay that just to get Iris Pro graphics when you could get an i7-4771 and an R9 280 for that price?
I hear those are more expensive than the Haswell because of the Iris Pro and the eDRAM, it's gonna take some time for the prices to drop. I'm also wondering how they stack up performance-wise against the Haswell, since the released Broadwell processors so far have been underwhelming, IIRC. I suppose they're great overclockers, but I'm not really into that.
Do you know if that eDRAM is supposed to be usable as another level of cache when the iGPU is disabled? Because they chopped 2MB of L3 cache off their desktop i5 and i7 for Broadwell.
AFAIK, the eDRAM is used as an L4 cache, so it's used by both the CPU and IGP. The thing is, CPU caches use SRAM, which takes more area on the die (and is thus costlier) and burns more power than the actual logic, hurting thermals and power draw. Intel has been complaining about how SRAM doesn't scale well, with a good chunk of the die nowadays used strictly for caches. DRAM is simply more compact and uses less power, but it needs to be constantly refreshed.
There's a more in-depth explanation here:
http://www.eetimes.com/author.asp?doc_id=1323410
That said, I suspect a good chunk of that eDRAM is likely reserved to cache framebuffer memory for the IGP.
Also, for Haswell, only parts with Iris Pro have eDRAM... this doesn't seem like it's going to change for Skylake either (although all Skylake parts that include an IGP will be Iris Pro, from what I've read).
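If you want to see the cache hierarchy for yourself, a quick pointer-chase sketch like this works on any machine (sizes and step counts here are just illustrative, and Python's object overhead makes the working-set estimate very rough, so treat the numbers as relative, not absolute):

```python
# Rough sketch: time random accesses over growing working sets to expose
# cache-level latency steps (L1/L2/L3, plus an extra plateau on parts with
# an L4/eDRAM). Sizes and loop counts are illustrative, not tuned.
import random
import time

def chase(size_bytes, steps=200_000):
    """Pointer-chase a random cycle spanning roughly `size_bytes`;
    returns average nanoseconds per access."""
    n = max(1, size_bytes // 8)          # ~8 bytes per slot (very rough in Python)
    perm = list(range(n))
    random.shuffle(perm)
    nxt = [0] * n
    for i in range(n):                   # link the permutation into one cycle
        nxt[perm[i - 1]] = perm[i]
    idx = 0
    t0 = time.perf_counter()
    for _ in range(steps):               # dependent loads: each access waits on the last
        idx = nxt[idx]
    elapsed = time.perf_counter() - t0
    return elapsed / steps * 1e9

for kib in (32, 256, 4096, 16384):       # ~L1-ish, ~L2-ish, ~L3-ish, past L3
    print(f"{kib:>6} KiB: {chase(kib * 1024):.1f} ns/access")
```

As the working set outgrows each cache level the ns/access figure steps up; on a Crystal Well-style part you'd expect one more intermediate plateau before falling off to main memory.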
LOL if Intel has all the overclockers who will never use Iris Pro subsidizing their mobile users by paying for a useless iGPU. I've heard the locked i7-6700 and i5-6500 are supposed to have prices more in line with the Haswell i7 and i5, but LMAO if they force Iris Pro on the K series. It would be worth it just to hear the butthurt from everyone who spends out the ass on Z boards, watercooling, and K CPUs for not much gaming gain having to spend $150 more.
AFAIK, all the Skylake SKUs they've listed have Iris Pro; some come with eDRAM and some don't. They'll probably make the unlocked parts very premium. At those power/thermal levels, you can probably overclock those processors like crazy.
Might as well go with the 140W enthusiast edition CPUs then for those prices. Maybe that's Intel's strategy though.
I don't think they did it just for console players. There are a lot of PC gamers still running 560s, 7770s, etc.
Why release a game that can only run on the highest of end PCs?
The benchmarks are disappointing for a game that doesn't look that great. Looks like the 290 = 960 rumor is bull, though.
Well, they're running with SSAO on, which is going to be as taxing as Ubersampling in the last Witcher. SSAO looks great, but who seriously uses that?
Even with it off and HBAO+ on, a Titan X is only getting 63.8 fps in that benchmark. I run basically every game I have with ambient occlusion maxed when available (e.g., Dragon Age: Inquisition, Dying Light, GTA V, Far Cry 4) and they all run at 60 fps on a GTX 970 at mostly ultra settings. I hear HairWorks murders performance too, which is disappointing since AMD's TressFX isn't too big a performance hit on Maxwell.
Yeah, I see that now... what an unoptimized mess. And hair tech? Such a minor detail to name a tech after. TressFX looked ridiculous in TR. It's crazy that three years later all these AAA titles still don't look as good as modded Skyrim.
I kind of liked shampoo commercial mode on TR tbh
LOL, texture pop-in at ultra settings using SLI Titan Xs
Wow, that's really annoying. So much for immersion.
The PS4 version of this game looks particularly bad. LOL, framerates in the teens in cutscenes, in the twenties in gameplay, and all the pop-in.
When did PS4 become the scrub console?
Stuttering mess on PC, even with a GTX 980 at 1080p
AMD ethered by Forbes over their crying about Witcher 3
http://www.forbes.com/sites/jasoneva...ias-hairworks/
AMD
they sabotaged our performance
it's not fair
forget business and market share, play nice and let us be successful too!!
Weird, I haven't had any issues so far