MSI officially launches the N580GTX-M2D15D5, the fastest single-chip graphics card in the world. The N580GTX houses the new 40nm GF110 GPU with 512 CUDA cores, supporting DirectX 11 and NVIDIA 3D Vision Surround technology. 1536MB of on-board high-speed GDDR5 memory ensures you never run out of video memory when playing the latest games at ultra-high resolutions. An advanced vapour chamber cooler keeps the graphics card running silent and cool. With real-time monitoring, fan control, overvoltage and overclocking available through MSI's exclusive Afterburner utility, the N580GTX delivers everything you need to be the fastest in the latest games and benchmarks.
The new NVIDIA GF110 GPU is a powerful chip with advanced compute features, packing 512 CUDA cores and paired with 1536MB of GDDR5 graphics memory. The MSI N580GTX is cooled by an advanced custom Vapor Chamber cooler for the best possible heat dissipation, ensuring heat is quickly absorbed from the GPU and exhausted out the rear of the card. The cooler is designed specifically for optimum performance when running in SLI.
MSI's exclusive Afterburner utility allows for flexible overclocking and monitoring options. The GPU frequency can be adjusted, and GPU voltage control allows more headroom when you're looking for extra GPU performance. The advanced fan control function gives end users direct control of fan speeds, for maximum cooling when required or low noise when desired. MSI Kombustor helps users test the stability of their overclocks and monitor overclocking conditions.
The N580GTX-M2D15D5 allows a wide variety of display options. Two Dual Link DVI-I connectors and a mini HDMI 1.4a port give a wide range of choices when combining your favourite (3D) screen with the N580GTX-M2D15D5. Card stability is guaranteed by the use of All Solid Capacitors for an extended product lifespan.
| Specification | Value |
| --- | --- |
| GPU | NVIDIA GeForce GTX 580 |
| CUDA Cores | 512 Units |
| Core Clock | 772 MHz |
| Processor Clock | 1544 MHz |
| Memory Clock | 4008 MHz |
| Memory Size | 1536MB GDDR5 |
| Memory Bus | 384-bit |
| Output | Mini HDMI / DVI x 2 |
| Afterburner Support | Overclock / GPU Overvoltage / Fan Control |
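The table's figures combine in standard ways: peak memory bandwidth follows from the effective memory clock and bus width, and on GF110 the CUDA cores run at twice the core clock (the "hot clock"), which is why the processor clock is exactly double the core clock. A minimal sketch of that arithmetic, using only the published specifications above:

```python
# Spec-table figures for the N580GTX (published values, not measurements).
MEMORY_CLOCK_MHZ = 4008     # effective GDDR5 data rate
BUS_WIDTH_BITS = 384
CORE_CLOCK_MHZ = 772
PROCESSOR_CLOCK_MHZ = 1544

# Peak memory bandwidth = effective clock x bus width (in bytes), in GB/s.
bandwidth_gb_s = MEMORY_CLOCK_MHZ * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9
print(f"Peak memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # 192.4 GB/s

# GF110 shader ("processor") clock is twice the core clock.
print(PROCESSOR_CLOCK_MHZ / CORE_CLOCK_MHZ)  # 2.0
```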