I mean, it depends. Yes, older cards can still push 1080p at acceptable framerates, but a lot of the newer games are really pushing the envelope. Modern effects like tessellation and ambient occlusion add a ton of detail to an image, but they're absolutely destroying older cards right now.
I would argue that the biggest reason CPUs have stayed viable so long is that so much more of the graphics work is being offloaded to the video card these days. In 2010, some games were probably around 50/50 in terms of how much your CPU mattered vs. your GPU. Now it's more like 10/90: as long as you have a modern quad core, it's going to handle most of what's going on in games, because the CPU is supplemental to the card.
Also consider that the 580 at launch was a monster $500 card. It will run Witcher 3 on low @ 1080p at around 40fps. It's definitely impressive that it can still run the game at all, but I'd argue it's on the verge of being obsolete. That said, a GTX 580's raw floating point throughput is still roughly in the same class as a PS4's.