					From Wikipedia, the free encyclopedia


NVIDIA Corporation

Company history
Jen-Hsun Huang (the present CEO), Curtis Priem, and Chris Malachowsky co-founded the company in 1993 with venture-capital funding from Sequoia Capital.[1] In 2000 Nvidia acquired the intellectual assets of its one-time rival 3dfx, one of the biggest graphics companies of the mid-to-late 1990s. On December 14, 2005, Nvidia acquired ULI Electronics, which at the time supplied third-party southbridge parts for chipsets to ATI, Nvidia's competitor. In March 2006, Nvidia acquired Hybrid Graphics,[2] and on January 5, 2007, it announced that it had completed the acquisition of PortalPlayer, Inc.[3] In December 2006 Nvidia, along with its main rival in the graphics industry, AMD (which had acquired ATI), received subpoenas from the Justice Department regarding possible antitrust violations in the graphics-card industry.[4] Forbes magazine named Nvidia its Company of the Year for 2007, citing its accomplishments during that period as well as during the previous five years.[5]

In February 2008 Nvidia acquired Ageia Technologies for an undisclosed sum. "The purchase reflects both companies' shared goal of creating the most amazing and captivating game experiences", said Jen-Hsun Huang, president and CEO of Nvidia. "By combining the teams that created the world's most pervasive GPU and physics engine brands, we can now bring GeForce-accelerated PhysX to twelve million gamers around the world." (The press release mentioned neither the acquisition cost nor specific products.)

The company's name combines an initial n (a letter usable as a pronumeral in mathematical statements) with the root of video, which comes from the Latin videre, "to see", implying "the best visual experience" or perhaps "immeasurable display". The name NVIDIA also suggests "envy" (Spanish envidia; Latin, Italian, and Romanian invidia), and Nvidia's GeForce 8 series product uses the slogan "Green with envy".
The company name appears entirely in upper case ("NVIDIA") in company technical documentation.

Type: Public (NASDAQ: NVDA)
Founded: 1993
Headquarters: 2701 San Tomas Expressway, Santa Clara, California, USA
Key people: Jen-Hsun Huang, Co-founder, President and CEO; Chris Malachowsky, Co-founder, NVIDIA Fellow, Senior Vice President, Engineering and Operations; Jonah M. Alben, Vice President, GPU Engineering; Debora Shoquist, Senior Vice President, Operations; Dr Ranga Jayaraman, CIO
Industry: Semiconductors (specialized)
Products: Graphics processing units; Chipsets
Revenue: ▲ $4.1 billion USD (2007)
Net income: ▲ $797.6 million USD (2007)
Employees: over 4,985 (as of June 2008)
Website: www.nvidia.com

Nvidia (NASDAQ: NVDA, pronounced /ɛnˈvɪ.di.ə/) is a multinational corporation specializing in the development of graphics processing units and chipset technologies for workstations, personal computers, and mobile devices. Based in Santa Clara, California, the company has become a major supplier of integrated circuits (ICs) such as graphics processing units (GPUs) and chipsets used in graphics cards, video-game consoles, and personal-computer motherboards. Notable Nvidia product lines include the GeForce series for gaming, the Quadro series for computer-aided design and digital content creation on workstations, and the nForce series of integrated motherboard chipsets.

Nvidia's product portfolio includes graphics processors, wireless-communications processors, PC platform (motherboard core-logic) chipsets, and digital media player software. The community of computer users arguably knows Nvidia best for its "GeForce" product line.



December 2004 saw the announcement that Nvidia would assist Sony with the design of the graphics processor (RSX) in the PlayStation 3 game console. In March 2006 it emerged that Nvidia would deliver the RSX to Sony as an IP core, and that Sony alone would be responsible for manufacturing it. Under the agreement, Nvidia will provide ongoing support to port the RSX to Sony's fabs of choice (Sony and Toshiba), as well as die shrinks to 65 nm. This is a departure from Nvidia's business arrangement with Microsoft, in which Nvidia managed production and delivery of the Xbox GPU through Nvidia's usual third-party foundry contracts. (Meanwhile, Microsoft chose to license a design by ATI and make its own manufacturing arrangements for the Xbox 360's graphics hardware, as did Nintendo for the Wii console, successor to the ATI-based GameCube.)

On February 4, 2008, Nvidia announced plans to acquire physics-software producer AGEIA, whose PhysX physics engine forms part of hundreds of games shipping or in development for PlayStation 3, Xbox 360, Wii, and gaming PCs.[6] The transaction completed on February 13, 2008,[7] and efforts to integrate PhysX into the GeForce 8800's CUDA system began.[8][9]

On June 2, 2008 Nvidia officially announced its new Tegra product line.[10] The Tegra is a system-on-a-chip (SoC) that integrates an ARM CPU, GPU, northbridge, and southbridge onto a single chip. Commentators expect Nvidia to target this product at the smartphone and mobile Internet device sector.

Nvidia headquarters in Santa Clara

Graphics chipsets
A graphics processing unit on an Nvidia GeForce 6600 GT

Nvidia not only offers a complete line of "discrete" graphics chips for AIB (add-in-board) video cards, but also provides core technology for both the Microsoft Xbox game console and nForce motherboards. In many respects Nvidia resembles its competitor ATI: both companies began with a focus on the PC market and later expanded into chips for non-PC applications. Nvidia does not sell graphics boards into the retail market, instead focusing on the development of GPU chips. Since Nvidia is a fabless semiconductor company, chip manufacturing is provided under contract by Taiwan Semiconductor Manufacturing Company, Ltd. (TSMC). As part of their operations, both ATI and Nvidia create "reference designs" (circuit-board schematics) and provide manufacturing samples to their board partners. Manufacturers of Nvidia cards include BFG, EVGA, Foxconn, and PNY. XFX, ASUS, Gigabyte Technology, and MSI exemplify manufacturers of both ATI and Nvidia cards.

• NV1 – Nvidia's first product, based on quadratic surfaces
• RIVA 128 and RIVA 128ZX – DirectX 5 and OpenGL 1 support; Nvidia's first DirectX-compliant hardware
• RIVA TNT, RIVA TNT2 – DirectX 6 and OpenGL 1 support; the series that made Nvidia a market leader
• Nvidia GeForce – desktop graphics acceleration solutions
• Nvidia Quadro – high-quality workstation solutions
• Nvidia Tesla – dedicated GPGPU processing for High Performance Computing systems
• Nvidia GoForce – media processors for PDAs, smartphones, and mobile phones, featuring nPower technology
• GPUs for game consoles:
• Xbox – GeForce 3-class GPU (on an Intel Pentium III/Celeron platform)
• PlayStation 3 – RSX "Reality Synthesizer"

Motherboard chipsets
• nForce series:
• nForce (AMD Athlon/Duron K7 line)


• nForce2 (AMD Athlon/Duron K7 line; SPP (system platform processor) or IGP (integrated graphics platform), plus MCP (media and communications processor); also features SoundStorm)
• nForce3 (AMD Athlon 64/Athlon 64 FX/Opteron; MCP only)
• nForce4 (AMD Athlon 64/Athlon 64 X2/Athlon 64 FX/Opteron, MCP only; Intel Pentium 4/Pentium D, SPP + MCP)
• nForce 500 (AMD Athlon 64 FX/Athlon 64 X2/Athlon 64/Sempron or Intel Core 2 Extreme/Core 2 Duo/Pentium 4/Celeron D/Pentium D)
• nForce 600 (AMD Quad FX or Intel Core 2 Quad/Core 2 Extreme/Core 2 Duo/Pentium 4/Celeron D/Pentium D)
• nForce 700 (Intel Core 2 and AMD Phenom)

Nvidia drivers cause known issues on computers running Windows Vista. The forums on the Nvidia homepage contain numerous threads in which users discuss the driver's failure-and-recovery errors without finding a solution. X.Org Foundation and Freedesktop.org have started the Nouveau project, which aims to develop free-software drivers for Nvidia graphics cards by reverse-engineering Nvidia's current proprietary drivers for Linux.

According to a survey[13] conducted in the third quarter of 2007 by Jon Peddie Research, a market-watch firm, Nvidia occupied the top slot in the desktop graphics-devices market with a market share of 37.8%. However, in the mobile space it remained third, with 22.8% of the market. Overall, Nvidia has maintained its position as the second-largest supplier of PC graphics shipments, including both integrated and discrete GPUs, with a 33.9% market share, its highest in many years, putting it just behind Intel (38%). According to the Steam hardware survey[14] conducted by the game developer Valve, Nvidia had 64.64% of the PC video-card market share as of 1 December 2008, while ATI had 27.12%. This may partly reflect Valve's release of trial versions of The Orange Box, which link to the survey, to Nvidia graphics-card users; however, free copies of The Orange Box were also released to ATI card purchasers, notably those who bought the Radeon 2900XT.

Documentation and drivers
Nvidia does not publish the documentation for its hardware, meaning that programmers cannot write appropriate and effective open-source drivers for Nvidia's products. Instead, Nvidia provides its own binary GeForce graphics drivers for X.Org and a thin open-source library that interfaces with the Linux, FreeBSD, or Solaris kernels and the proprietary graphics software. Nvidia also supports an obfuscated open-source driver that only supports two-dimensional hardware acceleration and ships with the X.Org distribution. Nvidia's Linux support has promoted mutual adoption in the entertainment, scientific-visualization, defense, and simulation/training industries, traditionally dominated by SGI, Evans & Sutherland, and other relatively costly vendors.

Because of their proprietary nature, Nvidia's drivers continue to generate controversy within the free-software communities. Some Linux and BSD users insist on using only open-source drivers, and regard Nvidia's insistence on providing nothing more than a binary-only driver as wholly inadequate, given that competing manufacturers like Intel offer support and documentation for open-source developers, and that others like ATI at least release partial documentation.[11] Because of the closed nature of the drivers, Nvidia video cards do not deliver adequate features on several platforms and architectures, such as FreeBSD on the x86-64 architecture and the other BSD operating systems on any architecture. Support for three-dimensional graphics acceleration in Linux on the PowerPC does not exist, nor does support for Linux on the hypervisor-restricted PlayStation 3 console. While some users accept the Nvidia-supported drivers, many users of open-source software would prefer better out-of-the-box performance[12] if given the choice. However, the performance and functionality of the binary Nvidia video-card drivers surpass those of open-source alternatives following VESA standards.

Market history
Before DirectX

An Nvidia RIVA 128 AGP video card

Nvidia released its first graphics card, the NV1, in 1995. Its design used quadratic surfaces, with an integrated playback-only sound card and ports for Sega Saturn gamepads. Because the Saturn also used forward-rendered quadratics, programmers ported several Saturn games to play on a PC with the NV1, such as Panzer Dragoon and Virtua Fighter Remix. However, the NV1 struggled in a marketplace full of competing proprietary standards. Market interest in the product ended when Microsoft announced the DirectX specifications, based upon polygons. Subsequently, NV1 development continued internally as the NV2 project, funded by several million dollars of investment from Sega, which hoped that an integrated sound-and-graphics chip would cut the manufacturing cost of its next console. However, Sega eventually realized the flaws in implementing quadratic surfaces, and the NV2 was never fully developed.

Transition to DirectX
Nvidia's CEO Jen-Hsun Huang realized at this point that, after two failed products, something had to change for the company to survive. He hired David Kirk as Chief Scientist from software developer Crystal Dynamics. Kirk combined the company's experience in 3D hardware with an intimate understanding of practical implementations of rendering. As part of the corporate transformation, Nvidia sought to fully support DirectX, and dropped multimedia functionality in order to reduce manufacturing costs. Nvidia also adopted the goal of an internal six-month product cycle, on the supposition that the failure of any one product could be mitigated by having a replacement waiting in the pipeline. However, since the Sega NV2 contract remained secret, and since Nvidia had laid off employees, it appeared to many industry observers that Nvidia had ceased active research and development. So when Nvidia first announced the RIVA 128 in 1997, the specifications were hard to believe: performance superior to the market leader, 3dfx's Voodoo Graphics, and a full hardware triangle-setup engine. The RIVA 128 shipped in volume, and the combination of its low cost and high performance made it a popular choice for OEMs.

Ascendency: RIVA TNT
Having finally developed and shipped in volume a market-leading integrated graphics chipset, Nvidia set itself the goal of doubling the number of pixel pipelines in its chip, in order to realize a substantial performance gain. The TwiN Texel (RIVA TNT) engine which Nvidia subsequently developed could either apply two textures to a single pixel or process two pixels per clock cycle. The former case allowed for improved visual quality, the latter for doubling the maximum fill rate. New features included a 24-bit Z-buffer with 8-bit stencil support, anisotropic filtering, and per-pixel MIP mapping. In certain respects (such as transistor count) the TNT had begun to rival Intel's Pentium processors for complexity. However, while the TNT offered an astonishing range of quality integrated features, it failed to displace the market leader, 3dfx's Voodoo 2, because the actual clock speed ended up at only 90 MHz, about 35% less than expected. Nvidia responded with a refresh part: a die shrink of the TNT architecture from 350 nm to 250 nm. A stock TNT2 now ran at 125 MHz, an Ultra at 150 MHz. Though the Voodoo 3 beat Nvidia to the market, 3dfx's offering proved disappointing: it was not much faster and lacked features that were becoming standard, such as 32-bit color and textures of resolution greater than 256 × 256 pixels. The RIVA TNT2 marked a major turning point for Nvidia. It had finally delivered a product competitive with the fastest on the market, with a superior feature set and strong 2D functionality, all integrated onto a single die with strong yields, that ramped to impressive clock speeds. Nvidia's six-month refresh cycle took the competition by surprise, giving it the initiative in rolling out new products.

Market leadership: GeForce

GeForce 4 MX 64 MB card, produced 2002–2003

The autumn of 1999 saw the release of the GeForce 256 (NV10), most notably bringing on-board transformation and lighting. It ran at 120 MHz; it implemented advanced video acceleration, motion compensation, and hardware sub-picture alpha-blending; and it had four pixel pipelines. The GeForce outperformed existing products, such as the ATI Rage 128, 3dfx Voodoo 3, Matrox G400 MAX, and RIVA TNT2, by a wide margin. Due to the success of its products, Nvidia won the contract to develop the graphics hardware for Microsoft's Xbox game console, which earned Nvidia a $200 million advance. However, the project drew away the time of many of Nvidia's best engineers. In the short term this was of no importance, and the GeForce 2 GTS shipped in the summer of 2000.

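The fill-rate figures implied by the pipeline counts and clock speeds cited for the RIVA TNT, TNT2, and GeForce 256 can be reproduced with a simple peak formula, pipelines × core clock. This is a sketch only: the two-pipeline count for the TNT/TNT2 is an assumption based on the "doubling the number of pixel pipelines" goal described in the text, and real-world fill rates fell below these theoretical peaks.

```python
# Peak theoretical pixel fill rate: pipelines * core clock.
# A sketch only -- the TNT/TNT2 pipeline count (2) is an assumption
# inferred from the text, and real chips fell short of these peaks.

def fill_rate_mpixels(pipelines: int, core_clock_mhz: int) -> int:
    """Peak pixel fill rate in megapixels per second."""
    return pipelines * core_clock_mhz

print(fill_rate_mpixels(2, 90))    # RIVA TNT at 90 MHz:     ~180 Mpixels/s
print(fill_rate_mpixels(2, 125))   # stock TNT2 at 125 MHz:  ~250 Mpixels/s
print(fill_rate_mpixels(2, 150))   # TNT2 Ultra at 150 MHz:  ~300 Mpixels/s
print(fill_rate_mpixels(4, 120))   # GeForce 256 at 120 MHz: ~480 Mpixels/s
```

This makes the 90 MHz disappointment concrete: at the expected clock (roughly 35% higher, about 138 MHz), the TNT's peak would have been near 275 Mpixels/s rather than 180.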

The GTS benefited from the fact that Nvidia had by this time acquired extensive manufacturing experience with its highly integrated cores, and as a result succeeded in optimizing the core for clock speeds. The volume of chips produced also enabled Nvidia to bin-split parts, picking out the highest-quality cores for its premium range. As a result, the GTS shipped at 200 MHz. The pixel fill rate of the GeForce 256 nearly doubled, and the texel fill rate nearly quadrupled, because multi-texturing was added to each pixel pipeline. New features included S3TC compression, FSAA, and improved MPEG-2 motion compensation.

Shortly afterward Nvidia launched the GeForce 2 MX, intended for the budget and OEM market. It had two pixel pipelines fewer, and ran at 165 MHz and later at 250 MHz. Offering strong performance at a mid-range price, the GeForce 2 MX became one of the most successful graphics chipsets. Nvidia also shipped a mobile derivative, the GeForce 2 Go, at the end of 2000.

Nvidia's success proved too much for 3dfx to recover its past market share. The long-delayed Voodoo 5, the successor to the Voodoo 3, did not compare favorably with the GeForce 2 in either price or performance, and failed to generate the sales needed to keep the company afloat. With 3dfx on the verge of bankruptcy near the end of 2000, Nvidia purchased most of 3dfx's intellectual property (in dispute at the time). Nvidia also acquired anti-aliasing expertise and about 100 engineers (but not the company itself, which filed for bankruptcy in 2002).

Nvidia developed the GeForce 3, which pioneered DirectX 8 vertex and pixel shaders, and then refined it with the GeForce 4 Ti line. After the GeForce 2 MX came the GeForce 4 MX. Nvidia announced the GeForce 4 Ti, MX, and Go in January 2002, one of the largest releases in Nvidia's history. Cleverly, the chips in the Ti and Go series differed only in chip and memory clock speeds.
(The MX series lacked pixel- and vertex-shader functionality; it derived from GeForce 2-level hardware.)

Stumbles with the FX series
At this point Nvidia's market position was dominant. However, ATI Technologies remained competitive due to its new Radeon product, which performed mostly on a par with the GeForce 2 GTS. Though ATI's answer to the GeForce 3, the Radeon 8500, came later to market and initially suffered from driver issues, the 8500 proved a superior competitor due to its lower price and untapped potential for growth. Nvidia countered ATI's offering with the GeForce 4 Ti line, but not before the 8500 carved out a niche. ATI opted to work on its next-generation Radeon 9700 rather than on a direct competitor to the GeForce 4 Ti.

During the development of the next-generation GeForce FX chips, many of Nvidia's best engineers focused on the Xbox contract, including the API used as part of the SoundStorm platform. Nvidia also had a contractual obligation to develop newer and more hack-resistant NV2A chips, and this requirement further shortchanged the FX project. The Xbox contract did not allow for falling manufacturing costs as processor technology improved, and Microsoft sought to renegotiate its terms, withholding the DirectX 9 specifications as leverage. Relations between the two companies, which had previously been very good, deteriorated as a result. Both parties later settled the dispute through arbitration; the terms were not released to the public.

Due to the Xbox dispute, no consultation with Nvidia took place during the development of the DirectX 9 specification. ATI limited rendering color support to 24-bit floating point and emphasized shader performance. Developers built the shader compiler using the Radeon 9700 as the base card. In contrast, Nvidia's cards offered 16- and 32-bit floating-point modes, delivering either lower visual quality (as compared to the competition) or slower performance. The 32-bit support made them much more expensive to manufacture, requiring a higher transistor count. Shader performance often remained at half or less of the speed provided by ATI's competing products. Having made its reputation by designing easy-to-manufacture DirectX-compatible parts, Nvidia had misjudged Microsoft's next standard and paid a heavy price: as more and more games started to rely on DirectX 9 features, the poor shader performance of the GeForce FX series became ever more obvious. With the exception of the FX 5700 series (a late revision), the FX series did not compete well against ATI cards.

Nvidia released an "FX only" demo called Dawn, but a hacked wrapper enabled it to run on a Radeon 9700, where it ran faster despite translation overhead. Nvidia began to use application detection to optimize its drivers. Hardware review sites published articles showing that Nvidia's drivers auto-detected benchmarks and produced artificially inflated scores that did not relate to real-world performance. Often it was tips from ATI's driver-development team that lay behind these articles. While Nvidia did partially close the performance gap with new instruction-reordering capabilities introduced in later drivers, shader performance remained weak and over-sensitive to hardware-specific code compilation. Nvidia worked with Microsoft to release an updated DirectX compiler that generated code optimized for the GeForce FX.

Furthermore, GeForce FX devices ran hot, drawing as much as double the power of equivalent parts from ATI. The GeForce FX 5800 Ultra became notorious for its fan noise, acquiring the nicknames "dustbuster" and "leafblower"; Nvidia jokingly acknowledged these accusations with a video in which the marketing team compares the cards to a Harley-Davidson motorcycle.[15] Although the quieter 5900 replaced the 5800 without fanfare, the FX chips still needed large and expensive fans, placing Nvidia's partners at a manufacturing-cost disadvantage compared to ATI. As a result of Microsoft's actions and the FX series' resultant weaknesses, Nvidia lost its market-leadership position to ATI.

GeForce 6 series and later

The old Nvidia logo, in use until 2006

With the GeForce 6 series, Nvidia had clearly moved beyond the DirectX 9 performance problems that plagued the previous generation. The GeForce 6 series not only performed competitively where Direct3D shaders were concerned, but also supported DirectX Shader Model 3.0, while ATI's competing X800-series chips only supported the previous 2.0 specification. This proved an insignificant advantage, mainly because games of that period did not employ Shader Model 3.0 extensions. However, it demonstrated Nvidia's willingness to design and follow through with the newest features and deliver them in a specific timeframe.

What became more apparent during this time was that the products of the two firms, ATI and Nvidia, offered equivalent performance. The two firms traded blows in specific titles and on specific criteria (resolution, image quality, anisotropic filtering/anti-aliasing), but differences were becoming more abstract, and the reigning concern became price-to-performance. The mid-range offerings of the two firms demonstrated consumers' appetite for affordable, high-performance graphics cards, and it is now in this price segment that much of the firms' profitability is determined.

The GeForce 6 series was released in an interesting period: the game Doom 3 had just been released, and ATI's Radeon 9700 struggled with its OpenGL performance. In 2004 the GeForce 6800 performed excellently, while the GeForce 6600GT became as important to Nvidia as the GeForce 2 MX had been a few years previously. The GeForce 6600GT enabled users to play Doom 3 at very high resolutions and graphical settings, which had been thought highly unlikely given its selling price. The GeForce 6 series also introduced SLI (similar to what 3dfx was using on the Voodoo 2). A combination of SLI and the resulting performance gain returned Nvidia to market leadership.

Badge displayed on products certified by Nvidia to utilize SLI technology

The GeForce 7 series represented a heavily beefed-up extension of the reliable 6 series. The industry's introduction of the PCI Express bus standard allowed Nvidia to release SLI (Scalable Link Interface), a solution that employs two similar cards to share the rendering workload. While these solutions do not equate to double the performance, and require more electricity (two cards vis-à-vis one), they can make a huge difference as higher resolutions and settings are enabled and, more importantly, they offer more upgrade flexibility. ATI responded with the X1000 series and its own dual-rendering solution, called "CrossFire". Sony chose Nvidia to develop the "RSX" chip used in the PlayStation 3, a modified version of the 7800 GPU.

Nvidia released the 8-series chips toward the end of 2006, making the 8 series the first to support Microsoft's next-generation DirectX 10 specification. The 8-series GPUs also featured the new unified shader architecture, which Nvidia leveraged to provide additional functionality for its graphics cards: better support for general-purpose computing on GPUs (GPGPU). A new product line of "compute-only" devices called Nvidia Tesla emerged from the G80 architecture, and Nvidia subsequently became the market leader in this new field by introducing the world's first C programming-language API for GPGPU: CUDA.

Nvidia released two models of the high-end 8-series (8800) chip: the 8800GTS (640 MB and 320 MB) and the 8800GTX (768 MB). Later, Nvidia released the 8800 Ultra (essentially an 8800GTX with a different cooler and higher clocks). All three cards derive from the 90 nm G80 core (with 681 million transistors). The GTS model had 96 stream processors and 20 ROPs; the GTX/Ultra had 128 stream processors and 24 ROPs. In early 2007 Nvidia released the 8800GTS 320 MB. This card resembles an 8800GTS 640, but with 32 MB memory chips instead of 64 MB (the cards contained 10 memory chips). In October 2007 Nvidia released the 8800GT, which used the new 65 nm G92 GPU and had 112 stream processors; it contained 512 MB of VRAM, operated on a 256-bit bus, and had several fixes and new features that the previous 8800s lacked. Later, in December 2007, Nvidia released the 8800GTS G92, in essence a larger 8800GT with higher clocks and all 128 stream processors of the G92 unlocked. Both the 8800GTS G92 and the 8800GT have full PCI Express 2.0 support.

In February 2008 Nvidia released the 9600-series chip, which supports Microsoft's DirectX 10 specification, in response to ATI's release of the Radeon HD3800 series. After March, Nvidia released the GeForce 9800 GX2, which, roughly put, packs two GeForce 8800 GTS G92s into a single card.

In June 2008 Nvidia released its new flagship GPUs, the GTX 280 and GTX 260. The cards used the same basic unified architecture deployed in the previous 8- and 9-series cards, but with a tune-up in power. Both cards are based on the GT200 GPU, which contains 1.4 billion transistors on a 65 nm process. The GTX 280 has 240 shaders (stream processors); the GTX 260 has 192. The GTX 280 has 1 GB of GDDR3 VRAM on a 512-bit memory bus; the GTX 260 has 896 MB of GDDR3 VRAM on a 448-bit memory bus (revised in September 2008 to include 216 shaders). The GTX 280 allegedly provides approximately 933 GFLOPS of floating-point throughput.

In January 2009, Nvidia released a 55 nm die shrink of the GT200 called the GT200b. This brought an update to the GTX 280 (a card called the GTX 285), allegedly providing 1062.72 GFLOPS; an update to the GTX 260 (still called the GTX 260) with 216 shaders; and a dual-chip card (the GTX 295) featuring two GT200b chips, a hybrid of the GT200 cores used on the original GTX 280 and GTX 260: each GPU has 240 stream processors but only a 448-bit memory bus. The GTX 295 has 1.75 GB (1792 MB, 896 MB per GPU) of GDDR3 VRAM and allegedly provides approximately 1788.48 GFLOPS of floating-point throughput.

March 2009 saw the release of the GTS 240 and GTS 250 mainstream chips, based on the previous-generation G92 with a 55 nm die shrink, code-named G92b. The GTS 240 (based on the 9800GT) has 112 shaders (stream processors) and a 256-bit memory bus; the GTS 250 (based on the 9800GTX+) has 128 shaders (stream processors), also with a 256-bit memory bus, and 0.5 GB or 1 GB of GDDR3 VRAM.
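The GFLOPS figures quoted for the GT200-based cards follow from a simple peak-throughput formula. As a sketch (not Nvidia's published method): the shader clocks used below (1296, 1476, and 1242 MHz) and the factor of 3 FLOPs per shader per cycle (a dual-issue multiply-add plus multiply) are assumptions not stated in the article, chosen because they reproduce the quoted numbers.

```python
# Peak single-precision throughput: shaders * shader clock * FLOPs/cycle.
# The shader clocks and the 3-FLOPs-per-cycle factor are assumptions;
# the article quotes only the resulting GFLOPS figures.

def theoretical_gflops(shaders: int, shader_clock_mhz: float,
                       flops_per_cycle: int = 3) -> float:
    """Peak throughput in GFLOPS (MHz * 1e6 * FLOPs per second / 1e9)."""
    return shaders * shader_clock_mhz * flops_per_cycle / 1000.0

print(theoretical_gflops(240, 1296))      # GTX 280: ~933.12 GFLOPS
print(theoretical_gflops(240, 1476))      # GTX 285: ~1062.72 GFLOPS
print(2 * theoretical_gflops(240, 1242))  # GTX 295 (two GPUs): ~1788.48 GFLOPS
```

Under the same assumptions, the revised 216-shader GTX 260 works out to 216 × 1242 MHz × 3 ≈ 805 GFLOPS.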


On May 12, 2009, Nvidia released images of a revised edition of the GTX 295. This design, similar to ATI's HD4870X2, differs from the original: in the first production run, the GTX 295 was literally two separate graphics accelerators sandwiched into the same casing and connected by a ribbon SLI cable, whereas the new design places both GPUs on a single PCB. The card retains the specifications of the first production run, though speculation suggests it will be less expensive, owing to the lower manufacturing costs of a more compact device.

Defective mobile video adapters
In July 2008, Nvidia noted increased rates of failure in certain mobile video adapters.[16] A writer for The Inquirer alleged that the problems potentially affect all G84 and G86 video adapters, mobile and desktop,[17] though Nvidia has denied this.[18][19] In response to this issue, Dell and HP released BIOS updates for all affected notebook computers which turn on the cooling fan earlier than before, in an effort to keep the defective video adapter at a lower temperature. Leigh Stark has suggested that this may lead to premature failure of the cooling fan.[20] It is also possible that this resolution may only delay component failure until after warranty expiration.

In August 2008 rumors emerged that these issues also affected G92 and G94 mobile video adapters.[21] But at the end of August 2008, Nvidia reportedly issued a product-change notification announcing plans to update the bump material of GeForce 8 and 9 series chips "to increase supply and enhance package robustness".[22] In response to the possibility of defects in some mobile video adapters from Nvidia, some notebook manufacturers have allegedly turned to ATI to provide graphics options on their new Montevina notebook computers.[23]

On 18 August 2008, according to the direct2dell.com blog, Dell began to offer a 12-month limited-warranty "enhancement" specific to this issue on affected notebook computers worldwide.[24] On 8 September 2008, Nvidia made a deal with large OEMs, such as Dell and HP, under which they will receive $200 per affected notebook.[25]

On 9 October 2008, Apple Inc. announced on a support page that MacBook Pro notebook computers had exhibited faulty Nvidia GeForce 8600M GT graphics adapters.[26] The affected computers were manufactured between approximately May 2007 and September 2008. Apple stated that it would repair affected MacBook Pros free of charge within two years of the original purchase date, and also offered refunds to customers who had paid for repairs related to this issue.

On 9 December 2008, The Inquirer conducted another series of tests to check whether the new MacBook Pro notebook computers used eutectic solder or high-lead solder.[27] It found that the 9400M chipset used eutectic solder, while the 9600M used a high-lead solder, which it associated with the "old process" responsible for the failures.

References

[8] "[Phoronix] PhysX For CUDA, Linux Support A Given?". Phoronix.
[9] "GeForce 8 graphics processors to gain PhysX support". The Tech Report.
[10] "NVIDIA Rolls out 'Tegra' Processors". TechTree. http://www.techtree.com/India/News/Nvidia_Rolls_out_Tegra_Processors/551-89833-581.html.
[11] "X.org, distributors, and proprietary modules". Linux Weekly News. Eklektix. 2006-08-14. http://lwn.net/Articles/195351/. Retrieved on 2008-11-03.
[12] LinuxQuestions.org, 20 September 2007.
[13] "NVIDIA Continues to Gain Graphics Market Share, AMD Keeps on Downfall – JPR". X-bit Labs. 2007-10-29. http://www.xbitlabs.com/news/video/display/20071029062106.html. Retrieved on 2007-12-02.
[14] Valve - Survey Summary Data.
[15] YouTube - Nvidia Hair Dryer.
[16] NVIDIA Corporation (2008-07-02). "NVIDIA Provides Second Quarter Fiscal 2009 Business Update". http://www.nvidia.com/object/io_1215037160521.html. Retrieved on 2008-10-05. "Certain notebook configurations with GPUs and MCPs manufactured with a certain die/packaging material set are failing in the field at higher than normal rates. To date, abnormal failure rates with systems other than certain notebook systems have not been seen."
[17] Demerjian, Charlie (2008-07-09). "All Nvidia G84 and G86s are bad". The Inquirer. http://www.theinquirer.net/gb/inquirer/news/2008/07/09/nvidia-g84-g86-bad. Retrieved on 2008-10-05. "The short story is that all the G84 and G86 parts are bad. Period. No exceptions. All of them, mobile and desktop, use the exact same ASIC, so expect them to go south in inordinate numbers as well."
[18] Hruska, Joel (2008-07-16). "NVIDIA denies rumors of faulty chips, mass GPU failures". Ars Technica. http://arstechnica.com/news.ars/post/20080716-nvidia-denies-rumors-of-mass-gpufailures.html. Retrieved on 2008-10-05. "This is a serious charge to level at any company, and we contacted NVIDIA for additional information. The company's response first affirms its intent to stand behind its customers and repair any and all notebooks that experience field failures. It then states: 1) The issue is limited to a few notebook chips only; we have not seen and don't expect to see this issue on any NVIDIA-based desktop systems. 2) Only a very small percentage of the notebook chips that have shipped are potentially affected, and the problem depends on a combination of environmental conditions, configuration, and usage model. 3) We continue to work closely with our partners and have taken the necessary steps to ensure that all NVIDIA chips currently in production do not exhibit the problem."
See also
• Graphics Processing Unit
• ATI Technologies
• Comparison of ATI graphics processing units
• Comparison of Nvidia graphics processing units
• Matrox
• Nvidia Demos
• Nvision
• Video In Video Out (VIVO)
• Molecular modeling on Nvidia GPUs
• NVIDIA Ion


[1] Forbes.com - Magazine Article.
[2] "NVIDIA acquires Hybrid Graphics". The Register Hardware.
[3] Press Release: Nvidia acquires PortalPlayer, dated January 5, 2007.
[4] "Justice Dept. subpoenas AMD, NVIDIA". New York Times. 2006-12-01. http://news.zdnet.com/2100-9584_22-6140041.html.
[5] Brian Caulfield (2008-01-07). "Shoot to Kill". Forbes.com. http://www.forbes.com/home/technology/forbes/2008/0107/092.html. Retrieved on 2007-12-26.
[6] "NVIDIA to Acquire AGEIA". DailyTech.com. 2008-02-04. http://www.dailytech.com/Update+NVIDIA+to+Acquire+AGEIA/article10573.htm.
[7] NVIDIA Completes Acquisition of AGEIA Technologies: Financial News - Yahoo! Finance.












[19] Kingsley-Hughes, Adrian (2008-07-17). "NVIDIA: Nothing to see here, move along". ZDNet. http://blogs.zdnet.com/hardware/?p=2250. Retrieved on 2008-10-05. "...So just how widespread are NVIDIA's GPU failure problem. According to NVIDIA, it's nothing to worry about...."
[20] Stark, Leigh (2008-08-18). "NVIDIA DISASTER: thousands of GPUs faulty". APC. ninemsn Pty Ltd. http://apcmag.com/Content.aspx?id=2750. Retrieved on 2008-08-18. "... updates that force your computer to cool itself down not only kill your battery life further but also leave you running the risk that now with the extra needed fan cycles, that cooling system built into your laptop might die sooner than expected."
[21] Demerjian, Charlie (2008-08-12). "Nvidia G92s and G94 reportedly failing: Desktop boards this time". The Inquirer. Incisive Media Investments Ltd. http://www.theinquirer.net/gb/inquirer/news/2008/08/12/nvidia-g92s-g94-reportedly. Retrieved on 2008-08-18. "A little digging revealed what this, and more, is all about, and it's far uglier than just the 'notebook' version. It seems that four board partners are seeing G92 and G94 chips going bad in the field at high rates... From the look of it, all G8x variants other than the G80, and all G9x variants are defective"
[22] Shilov, Anton (2008-08-29). "Nvidia Updates Bump Material of GeForce 8800, 9800 Chips". X-bit Labs. http://xbitlabs.com/news/video/display/20080829133428_Nvidia_Updates_Bump_Material_of_GeForce_8800_9800_Chips.html. Retrieved on 2008-09-29. "Nvidia Corp. has reportedly issued yet another product change notification (PCN) document, informing its customers that it plans to change bump material on its code-named G92 chips, which power a great amount of GeForce graphics cards. Potentially, this may mean that those graphics processing units are also subject to failures similar to [sic] already confirmed by Nvidia."
[23] O'Brien, Kevin (2008-08-12). "More Defective NVIDIA Graphics Chipsets". NotebookReview.com. TechTarget. http://www.notebookreview.com/default.asp?newsID=4554. Retrieved on 2008-08-18. "Expect to see more BIOS updates released to increase cooling fan cycles, and more ATI graphics options from notebook manufacturers. We are already seeing a spike in high-end ATI options on almost all new Montevina notebooks, with fewer NVIDIA options day by day."
[24] Menchaca, Lionel (2008-08-18). "NVIDIA GPU Update: Dell to Offer Limited Warranty Enhancement to All Affected Customers Worldwide". Direct2Dell Blog. http://direct2dell.com/one2one/archive/2008/08/18/nvidia-gpu-update-dell-to-offer-warranty-enhancement-to-all-affected-customers-worldwide.aspx. Retrieved on 2008-08-18.
[25] Abazovic, Fuad (2008-09-08). "Nvidia gives OEMs $200 per bad mobile GPU". Fudzilla. http://www.fudzilla.com/index.php?option=com_content&task=view&id=9297&Itemid=65. Retrieved on 2008-11-03. "Nvidia made a deal with big OEMs, such as Dell and HP, that they will get $200 per affected notebook and we are hearing that OEMs are quite happy about it. It turns out that this is more than generous and that this covers the cost of a new chip, the repair cost and all the other cost related to this issue."
[26] "MacBook Pro: Distorted video or no video issues". Apple Inc. 2008-10-10. http://support.apple.com/kb/TS2377. Retrieved on 2008-11-03. "Apple has determined that some MacBook Pro computers with the NVIDIA GeForce 8600M GT graphics processor may be affected."
[27] "INQUIRER confirms Apple Macbook Pros have Nvidia bad bump material". The Inquirer. 2008-12-09. http://www.theinquirer.net/inquirer/news/921/1049921/inquirer-confirms-apple-macbook-pros-have-nvidia-bad-bump-material. Retrieved on 2008-12-10. "The Inquirer reviews the new MacBook Pro 15" notebook's GPU solder."

External links

• nvidia.com – Corporate website
• nZone.com - Nvidia's gaming community site
• nvidiagraphicscards.com - Complete list of all Cards and GPUs by Nvidia

Coordinates: 37°22′14.62″N 121°57′49.46″W / 37.3707278°N 121.9637389°W / 37.3707278; -121.9637389

Retrieved from "http://en.wikipedia.org/wiki/Nvidia"
Categories: Companies listed on NASDAQ, Companies in the NASDAQ-100 Index, Nvidia, Companies established in 1993, Fabless semiconductor companies




This page was last modified on 21 May 2009, at 03:01 (UTC). All text is available under the terms of the GNU Free Documentation License. (See Copyrights for details.) Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc., a U.S. registered 501(c)(3) tax-deductible nonprofit charity.

