The DynAPI was created in the late 1990s by a student at MIT to let code run comparably well on both the Internet Explorer and Netscape browsers. It had a cool smoothing algorithm (a recursive 'setTimeout' call), and it focused on papering over the discrepancies in the two DOM bindings. Framework development continued, with some frameworks becoming more expansive (Tibet was one representative of this group, and one I worked on at the long-gone Digigroups company). But all this work was largely ignored by the business community.
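To illustrate the technique being referenced, here is a minimal sketch (not DynAPI's actual code; the function names are mine) of the classic recursive-setTimeout animation loop: each tick moves the value a fraction of the remaining distance, and the function re-schedules itself until it is close enough to the target.

```javascript
// Returns a stepper that moves `current` a fixed fraction of the
// remaining distance toward `target` -- this is what produces the
// ease-out "smoothing" effect.
function makeStepper(target, factor) {
  return function step(current) {
    return current + (target - current) * factor;
  };
}

// Hypothetical driver: slides an element's `left` style toward `target`.
function animate(el, target, factor) {
  var step = makeStepper(target, factor);
  var pos = parseFloat(el.style.left) || 0;
  (function tick() {
    pos = step(pos);
    el.style.left = pos + 'px';
    if (Math.abs(target - pos) > 0.5) {
      setTimeout(tick, 16); // the "recursive" setTimeout call
    }
  })();
}
```

Because the next tick is scheduled only after the current one finishes, the loop never piles up behind a slow frame, which is part of why it felt smooth even on the browsers of that era.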
Sadly, the tale is not one big happy hockey-stick climb into technical ecstasy. I guess that should not be surprising, since, after all, humans are involved! Today we have a truly degenerate form of "professional" software development going on in the form of incredibly sloppy jQueryUI-based work.
Debugging the code that powers that abstraction is difficult. I have a six-month-old quad-core Intel-based box, and Firebug is still sluggish stepping through it. I have even seen ExtJs behave like the Web browsers of yore that would simply "drop" key CSS values now and then. In that scenario, the developer has to cache the value, notice when it has been dropped, and plug it back in. And ExtJs is rooted in dynamic DOM rendering, which is a shaky foundation, since that is not the main Q.A. testing path for Web browsers. Another framework in this general basket - call it the "sledgehammer approach" basket - is GWT.
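The cache-and-restore workaround described above could be sketched like this. This is a hypothetical helper of my own, not part of ExtJs: it snapshots the style properties you care about and re-applies any that have gone missing.

```javascript
// Hypothetical workaround for browsers/frameworks that "drop" style
// values: snapshot selected properties, then restore any that vanish.
function makeStyleCache(el, props) {
  var cache = {};
  return {
    // Record the current value of each watched property.
    save: function () {
      props.forEach(function (p) { cache[p] = el.style[p]; });
    },
    // Re-apply a cached value only where the live value has been lost.
    restore: function () {
      props.forEach(function (p) {
        if (!el.style[p] && cache[p]) {
          el.style[p] = cache[p];
        }
      });
    }
  };
}
```

In practice one would call `save()` after laying an element out and `restore()` from whatever event suggests the value may have been dropped (a resize, a repaint, a timer).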
In the middle we have various attempts at a happy medium. Prototype/Scriptaculous would seem to be the ancestor of this line. Dojo is in there, but a lot of it looks to me like sloppy work. Its poor-man's aspect-oriented programming is a pain to work with in the debugger and is more a pollution of object-orientation than AOP. Google's Closure, which powers Google Docs, has been released as an open-source framework and, although I have yet to delve into it, I'm interested. I like MochiKit, although I have not had a chance to use it. I'd also include in this group the framework powering the site where this blog was originally published, Vox.com, of which I'm a co-author, and one (technical demos) used on etrade.com, of which I'm the primary author. Unfortunately, the latter is not available for use at this time.
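For readers who haven't met it, the "poor-man's AOP" style being criticized amounts to wrapping an existing method so extra "advice" runs around it. A minimal sketch, with names of my own choosing (this is the general pattern, not Dojo's actual implementation):

```javascript
// Wrap obj[methodName] so `advice` runs after the original method.
// Returns a disconnect function that undoes the wrapping.
function connectAfter(obj, methodName, advice) {
  var original = obj[methodName];
  obj[methodName] = function () {
    var result = original.apply(this, arguments); // run the real method
    advice.apply(this, arguments);                // then the "after" advice
    return result;
  };
  return function disconnect() {
    obj[methodName] = original;
  };
}
```

The debugger complaint follows directly from this design: every connected advice adds another anonymous wrapper frame to the call stack, so stepping into what looks like one method call means wading through layers of wrappers.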
That could make the bandwidth requirements of remote desktopping feasible. Already, I seem to get pretty good performance in my personal remote-desktop use: certainly not good enough for high-end games, but adequate for everything else. And virtual machines, especially ones that run on a hypervisor OS, are an effective way of providing many "computers" on a single physical machine.