NVIDIA: We Are Going to Open a Can of Whoop Ass

I have read several articles about NVIDIA CEO’s latest comments on Intel integrated graphics. The coverage mostly attracted people looking for the “drama du jour” on a topic that is, quite frankly, not new: integrated graphics versus discrete graphics. I’ll limit the drama to the title, but I think the questions and issues raised in this controversy are interesting and worth blogging about:

Here’s my take: if you look at these two statements, you might ask yourself why Intel is building a “discrete graphics” product if people won’t need one in the future. Second, I see no evidence that rasterization is no longer scalable.

The reason graphics processing is so scalable is easy to understand:

The image that you see on your screen is made of pixels (2560×1600 pixels = 4 million). Each pixel can be treated independently from its neighbors, making it a perfect candidate for parallel computing. We would love to have as many processors as we have pixels so that all the pixels can be processed simultaneously. In reality, the number of pixel processors has gone from 1 to 128 in the span of a decade, which is already extraordinary. It is obvious that this trend can continue for some time, so the reason why Intel thinks that rasterization isn’t sustainable is rather obscure.
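To make the “one pixel per processor” idea concrete, here is a minimal sketch in CUDA. It is only an illustration, not any vendor’s actual pipeline, and the kernel and buffer names (shadePixels, image) are made up for the example. The point is simply that each thread computes its own pixel without reading any neighbor’s result, which is why the same algorithm runs unchanged whether you have 1 or 128 pixel processors.

// Illustration only: each thread owns one pixel, no thread depends on another.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void shadePixels(unsigned char* image, int width, int height)
{
    // Each thread picks one pixel from the 2D grid.
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Toy "shading": a gradient that depends only on this pixel's coordinates.
    // No pixel reads another pixel's result, so the work scales with the
    // number of processors without changing the algorithm.
    int idx = (y * width + x) * 3;
    image[idx + 0] = (unsigned char)(255 * x / width);   // R
    image[idx + 1] = (unsigned char)(255 * y / height);  // G
    image[idx + 2] = 128;                                // B
}

int main()
{
    const int width = 2560, height = 1600;               // ~4 million pixels
    unsigned char* image = nullptr;
    cudaMalloc(&image, width * height * 3);

    dim3 block(16, 16);                                   // 256 threads per block
    dim3 grid((width + 15) / 16, (height + 15) / 16);
    shadePixels<<<grid, block>>>(image, width, height);
    cudaDeviceSynchronize();

    printf("Shaded %d pixels in parallel\n", width * height);
    cudaFree(image);
    return 0;
}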

The Larrabee project itself is based on the idea of added parallelism: up to 80 cores by 2012. Another common argument for the end of rasterization is the idea that ray-tracing will completely replace rasterization as the way of displaying 3D graphics (in games). That’s an opinion, but most game programming professionals would tell you that there is no evidence that this will ever happen, or even that it would be a good thing. Interestingly, David Kirk from NVIDIA had a point of view on the subject that many graphics professionals would agree with, saying that ray-tracing is definitely “a tool” in the graphics toolbox, but not a panacea. As a graphics guy myself, I agree.

In the spirit of full disclosure, I have to tell you that I used to work for NVIDIA, but I’m not going to ask you to simply believe everything I just said above. Use your own judgment, keep your eyes open and see for yourself if discrete graphics goes away in the coming years and if a top game gets rendered solely using ray-tracing. My guess is: you will not see it.

