My X2 has been amazing, I don't need any driver hacks and it's far from a "poor user experience". The problems people seem to have stem from earlier driver releases, but I've never had a single problem with my X2!
The X2 is a better attempt at dual GPUs than the GX2 IMO.
Erm, I'm talking about the tech itself, not NV or ATI. Both manufacturers basically have to write profiles for individual games and/or engines to make it work. These are the 'hacks', layered on top of the bigger 'hack' that is dual GPUs to start with. Rather than proper load balancing, each GPU takes it in turn to render a frame (with AFR) - see the rough sketch below. Of course the game doesn't know it's running on two GPUs - and even if it did, it wouldn't help much.
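To make the point concrete, here's a minimal sketch of the idea - not any vendor's actual driver code. The executable names and the profile table are invented for illustration; the point is just that the driver keys off a per-game profile and, when AFR is enabled, alternates frames between GPUs behind the game's back:

```cpp
#include <cstdio>
#include <map>
#include <string>

// Hypothetical per-game profile table: the driver keys off the
// executable name and falls back to single-GPU when no profile exists.
std::map<std::string, bool> kAfrProfiles = {
    {"crysis.exe",  true},   // profiled: AFR known to work
    {"oldgame.exe", false},  // profiled: AFR known to break
};

int gpuForFrame(const std::string& exe, int frame) {
    auto it = kAfrProfiles.find(exe);
    bool afr = (it != kAfrProfiles.end()) && it->second;
    return afr ? frame % 2 : 0;  // alternate GPUs, or stay on GPU 0
}

int main() {
    for (int f = 0; f < 6; ++f)
        std::printf("crysis.exe  frame %d -> GPU %d\n", f, gpuForFrame("crysis.exe", f));
    for (int f = 0; f < 3; ++f)
        std::printf("unknown.exe frame %d -> GPU %d\n", f, gpuForFrame("unknown.exe", f));
}
```

An unprofiled game just silently loses the second GPU, which is exactly why every new release needs driver attention before it scales.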
'Taking it in turns' to render a frame is a bit of a misnomer, since if that happened it would be no faster than a single GPU. Instead, both GPUs have to render a frame at the same time. The driver accomplishes this by saying 'rendering complete' to the game before it actually is, so the game sends another frame to be rendered. Of course this raises serious timing issues - for example, how do you sync up input? What if a future frame needs data from a previous frame? Worse still, since you're effectively rendering frames in pairs, how do you offset them so you get nice smooth frame delivery (i.e. avoid microstutter)? Other methods like SFR are even worse.
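Here's a toy model of that pacing problem, with invented numbers (30 ms per frame on each GPU, 2 ms of CPU submit overhead, no queue limits). Because the driver acknowledges frames early, the game submits back-to-back, both GPUs start almost together, and frames *finish* in pairs - a tiny gap inside each pair, a long gap between pairs:

```cpp
#include <cstdio>

// Toy AFR frame-pacing model (all timings hypothetical).
// The driver reports "complete" early, so the game submits frames
// back-to-back and both GPUs start nearly simultaneously. Frames then
// complete in pairs: short gap within a pair, long gap between pairs.
int main() {
    const double kGpuFrameMs  = 30.0;  // assumed render time per frame
    const double kSubmitGapMs = 2.0;   // assumed CPU submit overhead
    double gpuFree[2] = {0.0, 0.0};    // when each GPU next becomes idle
    double prevDone = 0.0;

    for (int frame = 0; frame < 8; ++frame) {
        double submit = frame * kSubmitGapMs;  // game submits immediately
        int gpu = frame % 2;                   // AFR: alternate GPUs
        double start = (submit > gpuFree[gpu]) ? submit : gpuFree[gpu];
        double done = start + kGpuFrameMs;
        gpuFree[gpu] = done;
        std::printf("frame %d on GPU %d done at %5.1f ms (gap %5.1f ms)\n",
                    frame, gpu, done, done - prevDone);
        prevDone = done;
    }
}
```

Running this, the gaps settle into an alternating ~2 ms / ~28 ms pattern: the *average* frame time halves, so a benchmark reports nearly double the framerate, but delivery is jerky unless the driver deliberately delays every other frame.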
Until proper load balancing is possible - that is, until GPUs are made far more modular, so the execution units themselves can be spread across chips - multiple GPUs will give wonderfully high framerates in benchmarks, but not massively improved actual gameplay, and certainly not performance increases commensurate with double the price and double the power consumption.
Not sure how the X2 can be classed as an 'elegant' design though. It's very clever, but not elegant imo. It's a massively complicated PCB with more layers than a single-GPU board - basically a single PCB with two GPUs side by side, memory around them and a PCI-E bridge in the middle. Worse, the heat from one GPU is blasted over the other, heating it up further and increasing noise. Can't see it being any more elegant than NV's new dual-GPU board, which doesn't suffer from heating one GPU more than the other, or from the excess noise. Neither solution is elegant imho.