Cg: nVidious Plot or Graphical Revolution?
June 20th 2002, 01:55 CEST by m0nty
Slipping in amongst the news of Morrowind, NWN and WarCraft 3, nVidia and Microsoft have unveiled what they claim is the strongest push yet to bring cinematic-quality real-time graphics from the movie screen to the PC. The introduction of a High-Level Shading Language (HLSL) in version 9 of Microsoft's DirectX - revealed earlier in 2002, and only just gone into beta - has been trumped by nVidia, which has announced its own high-level programming language called Cg - short for "C for graphics".
In nVidia's words, the "common, familiar C-like syntax enables rapid development of stunning, real-time shaders and visual effects for graphics platforms", and the language "is compatible with Microsoft's recently announced High Level Shading Language for DirectX® 9.0".
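For those wondering what all this C-like syntax actually looks like, here is a minimal sketch of a Cg vertex shader in the style of nVidia's own introductory examples - POSITION and COLOR are Cg's standard binding semantics, while modelViewProj is a hypothetical matrix parameter the application would supply:

    struct VertOut {
        float4 pos   : POSITION;  // clip-space position
        float4 color : COLOR;     // vertex colour, interpolated downstream
    };

    VertOut main(float4 position : POSITION,
                 float4 color    : COLOR,
                 uniform float4x4 modelViewProj)  // set by the application
    {
        VertOut OUT;
        OUT.pos   = mul(modelViewProj, position);  // transform to clip space
        OUT.color = color;                         // pass the colour straight through
        return OUT;
    }

If you can read C, you can read that - which is precisely the point nVidia is making to time-pressed game programmers.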
With Microsoft around, you know there will always be standards issues (remember OpenGL?), and this is no different. Cg represents nVidia's latest salvo in its battle with ATI and other graphics card vendors for the hearts and minds of developers over their preferred programmable shaders and 3D APIs. The vendors have already started attacking each other's DX9 support, and Cg brings a whole new set of arguments about cross-platform implementation and proprietary technologies. Let the new standards war begin!
If there is to be a war, nVidia surely have the metaphorical SoF2 briefcase camped beyond all hope of the terrorists escaping with it. In this interview with nVidia svengali David Kirk, conducted by new Eurogamer offshoot Gamesindustry.biz, HLSL is even characterised as "Microsoft's own implementation of Cg", suggesting that nVidia is dictating terms to the dreaded Redmondians. Kirk fends off the inevitable question about incompatibility with non-nVidia hardware thusly:
"Our compiler generates shader code and sends it to DirectX or OpenGL, and shaders are a standard, so they should run on any card that supports the shader standards, including our competitors? Besides, I think it's in our interest to make sure that Cg runs well on everything - we want people to really use this technology, and that's all about taking away their reasons not to... Making the compiler so that it didn't work well on ATI cards, for example, would be really bad for us too."
Do you trust this man? Has nVidia created Cg out of the goodness of its heart, or is it really just another attack on its competitors? Does it matter, or should we just accept nVidia's dominance of the graphics card market? Will the majority of programmers choose Cg with all its supporting apps, or play it safe with the more limited HLSL? Will Cg become the new Glide - the troublesome rendering technology that was left behind when 3Dfx died? What does it mean for graphics engine makers like id, Epic and Lithtech? What does it mean for games already in development: will they have to redo parts of their code? (Hi George B!)