Yeah, but if you follow the changes in system architecture going forward, the trend is away from separate controllers, with nVidia trying to commandeer a space in the processing pipeline from Intel - and Apple seems to have bought into it. I am not sure that the VRAM is as big an issue as the availability of the GPU itself to take on some of the chores farmed out in the past to additional controllers:

http://www.appleinsider.com/articles/09/02/21/how_intels_battle_with_nvidia_over_core_i7_impacts_apple.html

I would expect that Intel will be offering newer (and hopefully better, much better) graphics options, or probably already has them in the pipeline; to this point it is not clear how much of their muscle they have put into this development. But if they feel they are being challenged by another supplier (nVidia), I am sure they will respond. Based on the article referenced above, managing the buses for SATA, USB, audio, networking, and PCIe/Wi-Fi wouldn't appear to overtax the shared memory.

I still have two Cubes, both currently on hiatus, but they will be back up and running soon. The real difference, other than the limited USB, is the bus speed; even though you can get around the slower USB with FW400 alternatives, ever-accelerating bus and RAM speeds are making use of the Cube more and more a matter of nostalgia for me than anything else.

It still rules the world of cool, even though they can run hot! It will remain ahead of its time forever, if for no other reason than the radical change it brought to the rest of the computer world trapped inside the beige box. I am using a MacBook Pro these days, and my wife has a white iMac that is a couple of years old, and the ideas about managing space and heat in all the new machines are definitely the progeny of the Cube.

Dave Iverson

On Mar 5, 2009, at 10:28 AM, Luis Meleiro wrote:

snip!

I've never liked the idea of sharing GPU memory with the computer's main memory. Don't ask why, but I always get the feeling that users may lose computing performance with shared VRAM. Anyway, I appreciate the precious hint! LM