We thought DirectX 10 would be the one reason to consider holding our noses and upgrading to Vista. While there’s no reason why Microsoft could not release DirectX 10 for Windows XP, the company has so far insisted on keeping DX10 and Shader Model 4.0 exclusive to its new OS.
The new API gives game developers the tools to dramatically increase the visual complexity of their games. However, from what we’ve seen of DX10 games so far, there are too few compelling reasons to justify abandoning XP right now for anyone who does not fall into the “must early adopt” category. Vista’s slow adoption rate is one reason why developers have been reluctant to move to it. Valve recently released statistics culled from its Steam gaming service that revealed only three percent of its one million anonymous users had machines equipped with both a DX10-compatible videocard and Vista.
“[Microsoft’s] decision to couple DX10 with Vista was a mistake,” said Valve’s director of marketing, Doug Lombardi. “There is no difference between running Orange Box games [Half-Life 2: Episode 2, Team Fortress 2, and Portal] on Vista versus XP, but there are some benefits to having a DX10 GPU.”
But this is more than just a chicken-or-the-egg problem. DX10 and Shader Model 4.0 are also more complex to program than DX9 and SM 3.0, and most of the games that shipped last year were far along in their development cycles when Microsoft made these new tools available.
Lombardi, for example, told us that Valve’s developers do make use of the unified architecture that’s unique to DX10-class GPUs in order to deliver more sophisticated facial animation in Team Fortress 2, but you don’t need Vista for this because they didn’t tap DX10 or SM 4.0.
The few games we’ve seen that do make use of DX10 (both new games and previously released games with DX10 patches) don’t look significantly better running under Vista than they do under Windows XP. What’s worse is that they run slower on Vista. When we patched the RTS game Company of Heroes and ran it at 1920x1200 resolution in Windows XP (using an EVGA GeForce 8800 GTS with 640MB of memory), we achieved a playable 42.3 frames per second. When we played the same game on the same machine using Vista, the frame rate plummeted to a creaky 20.2 frames per second. It would be one thing if the trade-off resulted in supremely better graphics, but we couldn’t see any significant differences. We had a similar experience with World in Conflict.
Microsoft’s recent announcement of DirectX 10.1 and Shader Model 4.1 has rendered the situation even more complex. These new versions were released along with Vista Service Pack 1, but they’re supported only by AMD’s and NVIDIA’s very newest GPUs (we’re talking about the G92 and the RV670). So if you thought buying any Radeon 2000-series or any GeForce 8000-series card made you future-proof, you’re in for a rude awakening.
Microsoft, of course, insists these updates don’t render these cards obsolete. “The updated API,” said Microsoft’s Sam Glassenberg, lead DX10.1 programmer, “provides full support for all existing Direct3D 10 hardware and upcoming hardware that supports the extended feature set. The API is a strict superset. No hardware support has been removed in DirectX 10.1.” The new API does, however, make mandatory several features that were previously optional. Compliant GPUs must now support at least 4x antialiasing and 32-bit floating-point filtering, for instance.
Considering how slowly both consumers and developers are moving to Vista, we don’t expect these point releases to have much of an impact on the market.