NVidia GeForce G80 DirectX 10 cards to hit the market in November



baseline bum
10-11-2006, 10:45 AM
An 800W power supply is the recommended minimum for a dual-card SLI configuration. The card is also supposed to have dedicated physics processing, similar to what's being done on the PS3.

NVIDIA "G80" Retail Details Unveiled (http://www.dailytech.com/article.aspx?newsid=4441)
Anh Huynh - October 5, 2006 12:39 AM

DailyTech's hands-on with the GeForce 8800 series continues with more information about the GPU and the retail boards. The new NVIDIA graphics architecture will be fully compatible with Microsoft’s upcoming DirectX 10 API, including support for Shader Model 4.0, and represents the company's 8th-generation GPU in the GeForce family.

NVIDIA has branded G80-based products as the GeForce 8800 series. While the 7900 and 7800 series launched with GT and GTX suffixes, G80 will do away with the GT suffix. Instead, NVIDIA has revived the GTS suffix for its second-fastest graphics product, a suffix that hasn't been used since the GeForce 2 days.

NVIDIA’s GeForce 8800GTX will be the flagship product. The core will be factory clocked at 575 MHz. All GeForce 8800GTX cards will be equipped with 768MB of GDDR3 memory clocked at 900 MHz. The GeForce 8800GTX will also have a 384-bit memory interface and deliver 86GB/second of memory bandwidth. GeForce 8800GTX graphics cards are equipped with 128 unified shaders clocked at 1350 MHz. The theoretical texture fill-rate is around 38.4 billion texels per second.

Slotted right below the GeForce 8800GTX is the slightly cut-down GeForce 8800GTS. These graphics cards will have a G80 GPU clocked at a slower 500 MHz. The memory configuration for GeForce 8800GTS cards differs slightly from the GeForce 8800GTX: GTS cards will be equipped with 640MB of GDDR3 graphics memory clocked at 900 MHz, the memory interface is reduced to 320-bit, and overall memory bandwidth is 64GB/second. GeForce 8800GTS cards also have fewer unified shaders: 96, clocked at 1200 MHz.

Additionally, GeForce 8800GTX and 8800GTS products are HDCP compliant with support for dual dual-link DVI, VIVO and HDTV outputs. All cards will have dual-slot coolers too. Expect GeForce 8800GTX and 8800GTS products to launch the second week of November 2006. This will be a hard launch, as most manufacturers should have boards ready now.
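
Quick back-of-the-envelope check on those bandwidth numbers (a rough Python sketch; bus widths and clocks taken straight from the article above). Peak bandwidth is just the bus width in bytes times the effective double-data-rate transfer rate:

def gddr3_bandwidth_gb_s(bus_width_bits, memory_clock_mhz):
    # peak theoretical bandwidth: bytes per transfer * transfers per second
    bytes_per_transfer = bus_width_bits / 8
    effective_mhz = 2 * memory_clock_mhz  # GDDR3 moves data on both clock edges
    return bytes_per_transfer * effective_mhz * 1e6 / 1e9

print(gddr3_bandwidth_gb_s(384, 900))   # 8800GTX: 86.4 GB/s -- matches the ~86GB/s above
print(gddr3_bandwidth_gb_s(320, 900))   # 8800GTS at 900 MHz: 72.0 GB/s
print(gddr3_bandwidth_gb_s(320, 800))   # 64.0 GB/s -- the figure the article quotes for the GTS

Interesting wrinkle: the GTX figures line up, but the quoted 64GB/second for the GTS corresponds to roughly an 800 MHz memory clock, not 900 MHz, so one of those two GTS numbers is probably off.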

leemajors
10-11-2006, 10:49 AM
what price range are you expecting? standard new launch $500 area?

TDMVPDPOY
10-11-2006, 10:54 AM
good, can't wait, i need a new comp soon, but this will be out of my budget

baseline bum
10-11-2006, 10:56 AM
Power and the NVIDIA "G80" (http://www.dailytech.com/article.aspx?newsid=4442)
Sven Olsen (Blog) - October 4, 2006 11:56 PM


Gentlemen, start your DirectX10 engines

DailyTech received its first look at a GeForce 8800 production sample today, and by the looks of it, the card is a monster, at least with regard to size and power requirements.

The GeForce 8800 comes in two flavors, which we will get into in more detail over the course of the next few days. The first card, the GeForce 8800GTX, is the full-blown G80 experience, measuring a little less than 11 inches in length. The GeForce 8800GTS is a cut-down version of the GTX, only 9 inches in length.

The marketing material included with the card claims NVIDIA requires at least a 450W power supply for a single GeForce 8800GTX, and 400W for the 8800GTS. Top tier vendors in Taiwan have already confirmed with DailyTech that GeForce 8800 cards in SLI mode will likely carry a power supply "recommendation" of 800W. NVIDIA's GeForce 7950GX2, currently the company's top performing video card, carries a recommendation of 400W to run the card in single-card mode.

NVIDIA is slated to launch both versions of the GeForce 8800 in November of this year. More details on the GeForce 8800 will be available later today on DailyTech.
Update 10/05/2006: We originally reported the GeForce 8800GTX and 8800GTS are 9" in length. The reference design for the 8800GTX is actually a little less than 11 inches. The GTX has two 6-pin power connectors; the GTS has only one.

baseline bum
10-11-2006, 10:59 AM
The 8800GTX should go for $649, with the 8800GTS @ $449-$499. (source: wikipedia (http://en.wikipedia.org/wiki/Geforce_8))

I will be buying at least one of the 8800GTX cards for sure, and will put off building my system until I can get my hands on them.

nForce 590 better freaking be available by then!

leemajors
10-11-2006, 11:01 AM
thanks for the info, looking to scoop up a dual core and new mb for christmas, then i can wait till the prices come down a bit and see if there are any new games i like round summertime.

baseline bum
10-11-2006, 11:02 AM
This is gonna kill the AGEIA PhysX processor before it barely got off the ground. Sucks to see a great idea, something as revolutionary as the PhysX PPU, dead within a year, but then again I'll have one more free PCI slot, and NVidia embracing the technology will ensure it'll be heavily used in game development.

baseline bum
10-11-2006, 11:05 AM
thanks for the info, looking to scoop up a dual core and new mb for christmas, then i can wait till the prices come down a bit and see if there are any new games i like round summertime.

What are you looking to do with it? I ask because Intel's quad-core Kentsfield will come out in January. Not a big deal right now if you're a gamer (since nothing on PC is written for more than 2 cores), but then again, quad-core games should be on the horizon, as the PS3 is forcing it on developers with its 7-core Cell, similar to how the 3-core Xenon (Xbox 360) forced PC developers to finally start writing for dual-core SMP.
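
To make "writing for more than 2 cores" concrete, here's a toy sketch (plain Python, every name made up for illustration, not from any real engine) of the kind of restructuring involved: instead of one big serial update loop, per-entity work gets farmed out across however many cores the machine has:

import multiprocessing as mp

def update_entity(entity_id):
    # stand-in for independent per-entity work (AI, pathfinding, etc.)
    return sum(i * i for i in range(10_000)) + entity_id

if __name__ == "__main__":
    entities = range(1_000)
    # the same code scales from 2 to 4+ cores with no changes
    with mp.Pool(mp.cpu_count()) as pool:
        results = pool.map(update_entity, entities)
    print(len(results))

The point is that work split this way scales with core count automatically, which is why engines already built for Xenon and Cell should carry over naturally to quad-core PCs.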

baseline bum
10-11-2006, 11:08 AM
Here's a pic of a GeForce 8800GTX. I forgot to mention it will be water-cooled. Not sure about the 8800GTS though.

http://img264.imageshack.us/img264/7826/gtx8800fw1.th.jpg (http://img264.imageshack.us/my.php?image=gtx8800fw1.jpg)

leemajors
10-11-2006, 11:46 AM
What are you looking to do with it? I ask because Intel's quad-core Kentsfield will come out in January. Not a big deal right now if you're a gamer (since nothing on PC is written for more than 2 cores), but then again, quad-core games should be on the horizon, as the PS3 is forcing it on developers with its 7-core Cell, similar to how the 3-core Xenon (Xbox 360) forced PC developers to finally start writing for dual-core SMP.

honestly, i am just looking to upgrade my cpu and mb right now because i can't think of anything else to get for christmas. i have a p4 3.2 ghz right now and was gonna hand that down to my brother. i don't need cutting edge stuff at the moment, i am too busy keeping up with my 19 month old to play many games anyway. but in another 6 months to a year i may need to bump up my old gfx card as well.

baseline bum
10-11-2006, 12:53 PM
Sorry, forgot to post the article about physics processing in the 8800GTX.
=============================================================


Documents Leak NVIDIA's Quantum Physics Engine (http://www.dailytech.com/article.aspx?newsid=4444)
Kristopher Kubicki (Blog) - October 5, 2006 1:21 AM

NVIDIA is ready to counter the Triple Play

With the release of the G80, NVIDIA will also release a new engine dubbed the Quantum physics engine. Quantum Effects Technology is similar (at least in spirit) to NVIDIA's PureVideo Technology -- a dedicated layer on the GPU for physics calculations. A few documents alluding to this new engine appeared on public FTP mirrors late last week.

Quantum utilizes some of the shaders from NVIDIA's G80 processor specifically for physics calculations. Physics calculations on GPUs are nothing new; ATI touts similar technology for its Stream Computing initiative and for its Triple Play physics.

NVIDIA and Havok partnered up this year, claiming that SLI systems would get massive performance gains by utilizing additional GeForce GPUs as physics processors. Quantum may be the fruit of that partnership, though NVIDIA documentation clearly states that Quantum will work just fine without SLI.

NVIDIA's documentation claims Quantum will specifically compete with AGEIA's PhysX, yet does not mention who is providing the middleware. Given that there are only two acts in town right now, it would be safe to say Havok has a hand in the Quantum engine.
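
The reason physics maps so well onto shader hardware is that it's massively data-parallel: the same small update runs independently over thousands of objects. A toy illustration (ordinary Python/NumPy standing in for what per-shader code would do; this is not NVIDIA's actual Quantum API, which hasn't been published):

import numpy as np

def step_particles(pos, vel, dt=1.0 / 60.0):
    # one Euler integration step for N particles under gravity
    gravity = np.array([0.0, -9.81, 0.0])
    vel = vel + gravity * dt              # identical op for every particle
    pos = pos + vel * dt
    pos[:, 1] = np.maximum(pos[:, 1], 0)  # crude ground-plane collision
    return pos, vel

pos = np.random.rand(100_000, 3) * 10.0   # 100k independent particles
vel = np.zeros_like(pos)
pos, vel = step_particles(pos, vel)

Each particle's update depends only on its own state, so the work splits cleanly across the G80's 128 shaders -- exactly the kind of job a "dedicated layer on the GPU" can soak up.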

dougp
10-11-2006, 01:04 PM
Ah, NVIDIA freakin owns ... I'll be building a desktop system next year because of this.

Has ATi even announced a DX10 card yet?

baseline bum
10-11-2006, 01:23 PM
The R600 is ATI's DirectX 10 chipset, but ATI still hasn't officially announced anything. I've gone very sour on ATI after seeing how their drivers cause things like undrawn textures, LODs being displayed too long, and so on. I have to hunt down different driver revisions to correct these problems in a few games. ATI is dead to me until they get some competent programmers.

Here's what wikipedia says about the R600:


Rumors about Radeon R600

* ATI's R600 is expected to launch in early 2007.[1]
* The R600 is estimated to be the largest GPU ever made. Production will be based on an 80 nm process, with ATI clearly having 65 nm in its sights. [2]
* 64 unified shader pipelines, 32 TMUs, and 32 ROPs.
* Its design is similar to the Xbox 360's GPU "Xenos", with the distinction of also being DX10 compliant; however, the R600 will not feature the 10 MB daughter-die embedded DRAM framebuffer.
* The R600 will support the upcoming GDDR4 memory interface with 512 MB RAM running higher than the X1950XTX's memory clock speeds of 1.0 GHz (2.0 GHz effective).
* Designed "from the ground up" for DX10, according to sources at ATI; it will, however, be backwards compatible with DX9.
* Designed for Windows Vista.
* PCI-E Interface.
* HDMI connector (may have support for DisplayPort).
* Anandtech reports that the next generation of GPUs will range in power consumption from 130W to 300W. This increase will push power supply units into the 1 kW-1.2 kW range and/or require internal, secondary PSUs solely for powering the GPUs.[3]
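
For a rough sense of where that 1 kW-1.2 kW figure comes from, a quick hypothetical sizing sketch (all component wattages below are assumptions for illustration, not measured values):

# hypothetical worst case: dual GPUs at the top of the quoted range
gpu_w = 300        # high end of the 130W-300W figure above (assumed)
cpu_w = 130        # high-end CPU (assumed)
rest_w = 100       # board, memory, drives, fans (assumed)

load_w = 2 * gpu_w + cpu_w + rest_w   # 830 W sustained draw
headroom = 1.3                        # ~30% margin for spikes and PSU efficiency
print(load_w * headroom)              # ~1080 W, right in the 1 kW-1.2 kW range

That also squares with the 800W SLI recommendation earlier in the thread once the GPUs sit somewhere below that 300W ceiling.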

dougp
10-11-2006, 01:40 PM
The R600 is ATI's DirectX 10 chipset, but ATI still hasn't officially announced anything. I've gone very sour on ATI after seeing how their drivers cause things like undrawn textures, LODs being displayed too long, and so on. I have to hunt down different driver revisions to correct these problems in a few games. ATI is dead to me until they get some competent programmers.

Here's what wikipedia says about the R600:
that rambuffer - I'm not too familiar with it, but I'm assuming it caches textures, right?

if it does, would this accelerate gameplay quite a bit where they don't use a single map texture like the new Quake Wars ... I'm just pullin this idea out of my ass though.

I went south on ATi after the Rage Fury 128. I found out that the lead driver programmer at the time was actually a kid in high school ...

leemajors
10-11-2006, 03:04 PM
that rambuffer - I'm not too familiar with it, but I'm assuming it caches textures, right?

if it does, would this accelerate gameplay quite a bit where they don't use a single map texture like the new Quake Wars ... I'm just pullin this idea out of my ass though.

I went south on ATi after the Rage Fury 128. I found out that the lead driver programmer at the time was actually a kid in high school ...

damn i hadn't seen anything about quake wars yet - looks awesome! carmack is god.

dougp
10-11-2006, 03:07 PM
damn i hadn't seen anything about quake wars yet - looks awesome! carmack is god.
Carmack isn't making it - he's working on the new id game.

leemajors
10-11-2006, 04:54 PM
Carmack isn't making it - he's working on the new id game.

he didn't have anything to do with the engine design?

leemajors
10-12-2006, 11:01 PM
hey bum, which motherboard would you recommend for a dual core intel chip?

dougp
10-13-2006, 08:01 AM
he didn't have anything to do with the engine design?
Well, of course he did - it's a heavily modified Doom3 engine ...

leemajors
10-13-2006, 08:48 AM
Well, of course he did - it's a heavily modified Doom3 engine ...

i thought he may have been overseeing, but i see splash damage is doing it. i had heard a while ago id may be trying out a strategy game, and assumed this was it. thanks for reminding me to check their site though :bang

baseline bum
10-13-2006, 02:05 PM
hey bum, which motherboard would you recommend for a dual core intel chip?

I wouldn't recommend any of them right now if you want to do SLI and have fast RAM at the same time. If you don't care about SLI, this one's (http://www.newegg.com/Product/Product.asp?Item=N82E16813131025) the king of Conroe CPU motherboards (it's a pretty hefty $259 though).

This one's (http://www.newegg.com/Product/Product.asp?Item=N82E16813131070) a pretty nice board, with 8-channel sound, FireWire, and 3 PCI slots; it's $158 and allows use of DDR2 800 at full speed without overclocking anything.

This one (http://www.newegg.com/Product/Product.asp?Item=N82E16813131030) will save you $14 and is similar, minus 1 SATA port and the FireWire.

I generally like Asus boards best for Intel CPUs, and Abit for AMDs.

resistanze
11-11-2006, 12:43 AM
Update: Geforce 8800GTX Review (http://www.anandtech.com/video/showdoc.aspx?i=2870)

DarkReign
11-11-2006, 11:52 AM
nVidia 8800 reviewed on actual games (http://www.gamespot.com/features/6161267/index.html?tag=topslot;action;4&om_act=convert&click=topslot)

TDMVPDPOY
11-11-2006, 12:12 PM
don't go out and buy this card yet, atm nvidia has no direct competition for it so they can price it however they want. wait till ATI brings out the R600, which will give some price competition, then you can decide..

if you wanna wait till the price goes down, just settle for a shit card for the moment... or buy one off ebay... a lot of geeks will be selling their cards...