  1. #201 (baseline bum)
    Quote: "That's false by the way. AMD said that quote was out of context and that the user-level cards will launch in June also."
    link? I hope Vega is awesome.

  2. #202 (TDMVPDPOY)
    remember when they launched the rx480 with limited supply? retailers were gold mining and reselling at higher prices... what did that do? those who missed out went and bought an NVidia card...

    I can sense the same thing repeating with the vega cards, 17k-20k units on release...

    ppl who couldn't afford the 1080 Ti will go buy it if NVidia drops a further 100 bucks off it...

  3. #203 (Cry Havoc)
    Quote (baseline bum): link? I hope Vega is awesome.
    http://vocaroo.com/i/s0Qz6Mlo7wHn

    Skip to around 11:00.

  4. #204 (TDMVPDPOY)
    RIP AMD...
    lol, vega release date 26/27 june

    at computex2017 their media conference was a fkn joke...
    when they talked about and tested vega, it was a load of bs.... wanted to show smooth 4k gaming by showcasing 2 VEGA cards in a threadripper cpu setup... = fail
    wtf, why would ppl buy 2 cards when they expect 1 to be enough?
    why were they testing 2 cards?
    why test it on a threadripper cpu system, when that cpu is just overkill for gamers.... why not test it on an r5 or r7 system?

    trying to hide something?

  5. #205 (baseline bum)
    Well, Vega looks dead on arrival.

    Needing two cards to get 60 fps on Prey at 4k? A single GTX 1080 can almost do that, and a single GTX 1080 Ti can easily do that. You can do 60 fps locked at 4k with two GTX 980s.
    Cry Havoc

    Get in here DJR210

  6. #206 (baseline bum)
    Quote: TDMVPDPOY's post #204 above.
    Dude, that's the Vega Frontier Edition launching June 26/27, the workstation card. Pat said the gaming card is launching at the end of July (see the 44:30 mark of the video).


  7. #207 (baseline bum)
    If AMD needs two Vega cards to run Prey at 60 fps at 4k, man, I don't see how they make a penny off Vega. Vega is going to be a disaster for them financially. HBM2 is crazy expensive; I mean, how are they going to make any money off these when Nvidia is already selling the GTX 1080 for $500 and could easily drop it to $450 to really fuck AMD over? I can't believe all we're going to see is them maybe matching the GTX 1080 fourteen months later, while using a memory tech that's going to ensure Vega is significantly more expensive to produce than GP104 is for Nvidia. And Nvidia already looks close to getting Volta out there after they demoed the Tesla V100 a few weeks ago.

  8. #208 (TDMVPDPOY)
    when they showed the 2 vega cards on the ripper system, did u see the game stuttering at 4k? lmao

  9. #209 (baseline bum)
    Quote: TDMVPDPOY's post #208 above.
    That wasn't stutter, it was tearing. It happens when you're rendering, say, 90 fps on a 60 Hz screen: buffer swaps land mid-refresh, so the panel shows parts of two different frames at once.
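    For anyone curious why the tearing is unavoidable there without vsync or FreeSync: the frame times and the refresh windows just don't line up. A toy Python sketch of the timing (it only counts buffer swaps landing inside each scanout window; real scanout is continuous, this is just the arithmetic):

[CODE]
# Why 90 fps on a 60 Hz panel tears without vsync: frames finish every
# ~11.1 ms but the panel refreshes every ~16.7 ms, so buffer swaps land
# mid-scanout and the screen shows parts of two frames at once.
REFRESH_HZ = 60.0
FPS = 90.0

scanout = 1000.0 / REFRESH_HZ   # 16.67 ms per refresh
frame = 1000.0 / FPS            # 11.11 ms per rendered frame

for i in range(6):              # first six refresh cycles
    start, end = i * scanout, (i + 1) * scanout
    # swaps that land strictly inside this scanout window cause visible tears
    swaps = [round(k * frame, 2) for k in range(1, 20) if start < k * frame < end]
    print(f"refresh {i}: swaps at {swaps} ms -> {len(swaps)} tear(s)")
[/CODE]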

  10. #210 (TDMVPDPOY)
    Quote: baseline bum's post #207 above.
    these fkn mutts didn't even showcase the card properly to persuade buyers, no mention or display of FreeSync 2..... spent 1hr talking about partnerships, while the NVidia conference went for like 2hrs in more detail about their lineup of products..

    if NVidia drops the 1080 Ti another 50 bucks, then it makes things interesting, especially for those who waited and didn't want to pay a premium for an NVidia card + monitor setup with that G-Sync option...

    I believe AMD's batch of HBM2 was a late order, NVidia having already secured their own supplies b4 AMD got to the table.... now that NVidia has worked out GDDR6 can match HBM2, they didn't go with HBM2 for the Volta gaming cards, yet AMD is still sticking with the expensive HBM2 route when they should have just gone with GDDR6 with supplies already available... fail. unless they think they can't hit HBM2-level bandwidth with GDDR6?
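    For what it's worth, the HBM2-vs-GDDR tradeoff he's gesturing at is easy to put rough numbers on: bandwidth is just bus width times per-pin data rate. A quick Python back-of-envelope; the Vega line is an assumption here (the 2048-bit HBM2 config being rumored at the time), while the Pascal figures are public specs:

[CODE]
# Back-of-envelope GPU memory bandwidth: bus width (bits) / 8 * data rate (Gbps).
cards = {
    "GTX 1080 (GDDR5X, 256-bit @ 10 Gbps)":       (256, 10.0),
    "GTX 1080 Ti (GDDR5X, 352-bit @ 11 Gbps)":    (352, 11.0),
    "Vega (HBM2, 2048-bit @ ~1.9 Gbps, assumed)": (2048, 1.89),
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
[/CODE]

    So HBM2 buys a very wide bus at low clocks; the catch, as the post says, is what it costs.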

  11. #211 (TDMVPDPOY)
    Quote: baseline bum's post #209 above.
    so why didn't FreeSync 2 or FreeSync 1 kick in to smooth it out?.... don't tell me they were using a 4k 60Hz panel and not a FreeSync 4k monitor...

    did u also notice they didn't show the fps on the test bench? pussies

    http://www.pcworld.com/article/31990...re-months.html

    https://www.techpowerup.com/reviews/...is/Prey/4.html
    here's a list of single-gpu cards at 4k....

    why didn't these clowns just show the gamer card instead of the workstation card? makes no fkn sense....
    Last edited by TDMVPDPOY; 05-30-2017 at 11:35 PM.

  12. #212 (baseline bum)
    Quote: TDMVPDPOY's post #211 above.
    Vega is going to be such a fucking disaster. The die size is more than 500 mm^2, it's using HBM2, and it's going to have to compete with a GTX 1080 whose die is 314 mm^2 and which uses much, much cheaper GDDR5X. Even the GP102 gpu in the Titan Xp and GTX 1080 Ti is only 471 mm^2. God, a 500 mm^2 die is going to have horrible yields early on, and HBM2 is already known to have terrible yields. How is this company going to make its loan payments in 2019 and not go bankrupt when Vega looks like it's going to be a big money loser for them? I can't imagine they'll break even on any Vega consumer card based on what we're hearing.
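    To put rough numbers on the die-size argument, here's a quick Python sketch using the classic dies-per-wafer approximation and a simple Poisson yield model. The defect density and wafer cost are made-up illustrative values, not actual foundry numbers; the point is the relative gap between a ~500 mm^2 die and GP104, not the absolute dollars:

[CODE]
# Rough cost-per-good-die comparison on 300 mm wafers.
import math

WAFER_DIAMETER = 300.0   # mm
D0 = 0.002               # assumed defects per mm^2 (0.2 per cm^2), illustrative
WAFER_COST = 6000.0      # assumed $ per wafer, illustrative

def dies_per_wafer(area):
    """Classic approximation: usable wafer area minus edge loss."""
    r = WAFER_DIAMETER / 2
    return math.pi * r**2 / area - math.pi * WAFER_DIAMETER / math.sqrt(2 * area)

for name, area in [("Vega (~500 mm^2)", 500.0),
                   ("GP104 (314 mm^2)", 314.0),
                   ("GP102 (471 mm^2)", 471.0)]:
    good = dies_per_wafer(area) * math.exp(-D0 * area)  # Poisson yield model
    print(f"{name}: {dies_per_wafer(area):.0f} candidates, "
          f"{good:.0f} good, ~${WAFER_COST / good:.0f} per good die")
[/CODE]

    With these assumed numbers the big die costs well over twice as much per good chip as GP104, before HBM2 and the interposer are even counted.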

  13. #213 (TDMVPDPOY)
    Quote: baseline bum's post #212 above.
    if the workstation card is already pure shit, u think the gamer card is going to be any different in performance? I doubt it gives the NVidia 1080-1080 Ti cards any competition, therefore they'll just stick with the current prices with no discounts...

    hence those new NVidia 120Hz G-Sync monitors cost an arm and a leg compared to the 100Hz G-Sync models...

  14. #214 (TDMVPDPOY)
    those r5, r7 and threadripper cpus are beasts... the only problem is the stupid ram compatibility shit that's stopping me from upgrading my comp....

    if they solve that ram memory shit, then there'd probably be more adopters

    speaking of threadripper vs intel's new-gen cpus... seems like the only difference is amd supporting more pci-e lanes and a cheaper price... that's about it..

  15. #215 (TDMVPDPOY)
    amd just cut prices on their ryzen cpus across the board 10-20%... not bad if u want an r7 1700 for USD $300

    just b4 threadripper and intel's new cpus are released, and how old are the r7 cpus? 2-3 months since release, not bad... bring on competition


    ps.

    the high-end threadripper cpu is $1200 cheaper than the highest-end new intel chips... wtf intel... would u still buy a threadripper?
    Last edited by TDMVPDPOY; 06-03-2017 at 07:04 AM.

  16. #216 (TDMVPDPOY)
    r5/r7 vs threadripper

    same clock speeds per core
    dual-channel vs quad-channel ddr

    so tell me, are there actually games out there that max out more than 2 cores? let alone games benefiting from dual vs quad channel memory? the only benefit so far is streaming or running some other intensive shit (streaming/rendering) while gaming....

    more than 4 cores is just future-proofing, but nothing uses more than 2 cores; even the quad-core cpus were never maxed out...

    why are they trying to push server cpus into the mainstream/enthusiast market?
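    One way to answer the scaling question for yourself: time a CPU-bound job at 1, 2, 4 and 8 workers and see where the speedup flattens. A minimal Python sketch; it measures raw multi-core throughput, not game behavior, and the chunk sizes are arbitrary:

[CODE]
# Time the same fixed pile of CPU-bound work with increasing worker counts.
import multiprocessing as mp
import time

def burn(n: int) -> int:
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    work = [2_000_000] * 16              # 16 equal chunks of busywork
    for workers in (1, 2, 4, 8):
        start = time.perf_counter()
        with mp.Pool(workers) as pool:
            pool.map(burn, work)
        print(f"{workers} workers: {time.perf_counter() - start:.2f}s")
[/CODE]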

  17. #217 (baseline bum)
    Quote: TDMVPDPOY's post #216 above.
    What year do you think this is, 2009? Go buy a dual-core cpu without hyperthreading and tell me how well your games run. I know if you play GTA V on a straight dual core you get constant stuttering and the framerate is often in the 20s (I tested this by disabling two cores and disabling hyperthreading on my Xeon E3-1231v3). Lots of modern games will pin an overclocked i5 at near 100% on all cores if you have a decent gpu and run with an unlocked framerate.
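    If you don't want to keep rebooting into the BIOS to rerun that kind of test, pinning a game to two logical cores approximates it from software. A rough Python sketch using psutil; the game path is hypothetical, and affinity pinning is only an approximation of truly disabled cores, since caches and turbo still behave differently:

[CODE]
# Restrict a process to two logical cores, then watch per-core load.
# Requires: pip install psutil
import subprocess
import psutil

proc = subprocess.Popen(["./game.exe"])   # hypothetical game binary
p = psutil.Process(proc.pid)
p.cpu_affinity([0, 1])                    # first two logical cores only

for _ in range(10):                       # sample per-core utilization
    print(psutil.cpu_percent(interval=1.0, percpu=True))
[/CODE]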

  18. #218 (TDMVPDPOY)
    Quote: baseline bum's post #217 above.
    hey, have u been watching reviews of the x299 platform? reviewers are taking a dump on intel atm over its latest platform... rushing it out just to tackle amd, when some of the chips won't be released till next year or some shit....

  19. #219 (baseline bum)
    Quote: TDMVPDPOY's post #218 above.
    Yeah, X299 is stupid for gaming unless Skylake-X has a big gain in IPC (which I can't see happening; why would it when it's Skylake?), because games are optimized for the weak octacore Jaguar cpus in the PS4/XBone. That's why no recent game runs worth a shit on a straight dual core: if devs optimized AAA games for dual cores, they'd never run on the PS4 or XBone, since those Jaguar cpus have awful IPC and really low clockspeeds. And that's the reason quadcore i7s have outperformed quadcore i5s in games since 2014 or so: the hyperthreading on a quadcore i7 is more than enough to make up for the core-count deficiency in AAA games, because the Jaguar octacores are so weak.

  20. #220 (TDMVPDPOY)
    seems like amd video cards are the preferred cards to hoard for use in mining systems, dunno why... ethereum or digital currency mining...

    the whole setup doesn't make sense to me as I continue to read into it and watch yt clips. I thought digital currency has no nominal value, since the international monetary banks don't recognize a currency they don't control/trade.... easy money mining compared to those LAN gaming cafes farming rare items to resell in games...

    but it looks like they'll be hoarding up amd cards again when the new ones release....
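    For what mining actually computes: it's brute-force hashing until you find a value under a difficulty target. A toy Python proof-of-work sketch (a simplified SHA-256 loop, not real Ethash; Ethash is deliberately memory-hard, which is a big part of why cards with strong memory subsystems like the RX 470/480 were the ones getting hoarded):

[CODE]
# Toy proof-of-work: find a nonce whose hash falls below a target.
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# ~2^20 (about a million) hashes on average at 20 bits of difficulty
print("found nonce:", mine(b"example block header", difficulty_bits=20))
[/CODE]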

  21. #221 (TDMVPDPOY)
    lol, apple jumping on vega for its next line of imac pros, still using intel xeon cpus

    https://www.theverge.com/2017/6/5/15...date-wwdc-2017

    specs of the vega card in the imac pro: http://wccftech.com/amd-debuts-radeo...-8-16-gb-hbm2/

    do u guys think apple made the correct choice getting out of the cpu market and using intel's... should amd follow suit, quit gpu production and stick to cpu production only?

    hence zen2 being backward compatible with am4 socket boards.... good continuity from amd
    Last edited by TDMVPDPOY; 06-06-2017 at 09:09 AM.

  22. #222 (TDMVPDPOY)
    mad for competition from amd

    rx vega consumer gaming cards > 1080, not bad https://www.youtube.com/watch?v=_ROpkIwaCC4

    as for the frontier edition for the professional sector, lmao, why is there a +500 bucks difference between the air-cooled version and the water-cooled version...



    now I'm just waiting for the new FreeSync 2 monitors... what sort of price should ppl be expecting?

  23. #223 (baseline bum)
    Looks like the i7-7740X on X299 may end up the king-of-the-hill gaming cpu. I saw a review where a dude had a delidded one running at 5.3 GHz on, I think, a cheap Corsair AIO liquid cooler.

  24. #224 (baseline bum)
    Quote: TDMVPDPOY's post #222 above.
    It's horrific news, if true, that Vega with its 500 mm^2 die and expensive HBM2 can only compete with a one-year-old GTX 1080 that uses a much smaller 314 mm^2 die. And Nvidia is already producing an 815 mm^2 Volta chip on a 12 nm process (V100). Man, if they put out a GTX 2080 at half that die size this year, Vega might be stuck trying to compete with a $400 GTX 2070. That is not a good result at all for AMD. They need to be beating the GTX 1080 Ti, considering Vega is likely more expensive to produce than the GP102 chip in the GTX 1080 Ti and the two Pascal Titans.

  25. #225 (TDMVPDPOY)
    Quote: baseline bum's post #224 above.
    well, the rx400 and rx500 series don't compete against 1070 cards or higher...
    I don't think amd had any plans during that 1yr gap to compete with NVidia, and now they have an answer 1yr later. even with volta releasing, those current 1080+ cards need to drop at least 100-150 bucks across the line to kill off the amd vega cards... but then again, even with the price drop, ur pockets will still hurt if ur upgrading ur monitor to a gsync model

    go vega if u want FreeSync 2 and the FreeSync 2 monitors coming out. dunno if FreeSync 2 monitors are any better than current-gen gsync monitors, but it's an improvement over FreeSync 1
