Friday, June 16, 2006

Software Prognostications

I've regularly contemplated what the future holds for software on a macro level. Over the years, I have observed the cycle between centralized and decentralized computing, and every time I see centralized solutions come to the forefront, I look past them to see yet another round of decentralization on the horizon.

I believe that centralized computing paradigms, including the current push to offer "hosted solutions," reflect a lack of sufficient interconnection bandwidth to support what ultimately becomes the follow-on generation of more widely distributed computing solutions. Mainframes and minicomputers with attached terminals pre-dated the client-server push; client-server solutions came into their own as Ethernet bandwidth in the office expanded; HTML and "thin client" solutions then pushed processing back to the servers due to Internet (and inter-office intranet) bandwidth limitations, sacrificing user experience for widespread accessibility; and richer graphical web interfaces using Flash, client-side Java, AJAX, and the like have since pushed some processing back to the clients. So, what is next?

Since the late 1990s, with the advent of widespread high-speed Internet connections, I have said that the time will come (soon) when networking speeds will allow software with very powerful client-side processing and robust GUIs (graphical user interfaces) to dominate the desktop once again, supplanting lame or lackluster HTML web applications. Many programs now do this, using powerful client-side software with rich GUIs to exploit the processing power of the client machine (i.e., the PC) as well as the network bandwidth of the Internet and the power of servers on the Net. Some examples include BitTorrent clients, stock-trading interfaces, and so on. Again, what is next?

Well, we aren't quite where I wanted to see us by now. I envisioned a world of native-executable applications being downloaded on demand over the Internet as users need a particular bit of functionality. In late 2000, the Cleveland software development and consulting firm that I owned and served as CEO of (Intersoft Development, Inc.) actually created a rudimentary software infrastructure to support the hosting, distribution, verification (authenticated/secure software signatures), and automatic updating of native executable applications, calling the whole thing "Robust Internet," featuring the "Robust Widget/Package Manager." We stored full-blown single-file EXEs on a server, along with various information in an accompanying database (software description, owning company, version info, software dependencies including the OS, and so forth), coupled with the "Robust Widget Manager" GUI that allowed users to search (over the web) for particular software packages, download the desired executable, verify the issuer/certificate, track which downloaded software was currently on their machine, and update or remove it as desired.

Some companies have emulated this methodology to some extent since, though still not quite as I envisioned. I still think it is a viable method of robust client-side software distribution that would nearly eliminate all the hell that accompanies installing and removing programs, since any and all files needed by a program were to be kept in one directory tree "owned" by that downloaded program and only that downloaded program (making updates and removals a snap, since no DLL dependencies and no inter-program conflicts would exist). And, now that disk space is nearly free and RAM is also quite affordable, dynamic link libraries (DLLs) in general should be a thing of the past: they served their purpose, and they generally no longer make any sense on the client.
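For the curious, here is a rough sketch of the core idea in Python. To be clear, this is purely illustrative (the names, the digest-based verification standing in for real certificate checks, and the file layout are all my invention here, not the original Intersoft code), but it shows why single-directory ownership makes install and removal trivial:

```python
import hashlib
import json
import shutil
from pathlib import Path

# Illustrative root where every downloaded program owns exactly one directory tree.
INSTALL_ROOT = Path("robust_apps")

def verify_digest(payload: bytes, expected_sha256: str) -> bool:
    """Check the downloaded executable against its published digest
    (a stand-in for the issuer/certificate verification step)."""
    return hashlib.sha256(payload).hexdigest() == expected_sha256

def install(name: str, version: str, payload: bytes, expected_sha256: str) -> Path:
    """Install a single-file program into its own directory tree --
    no shared DLLs, no registry entries, nothing outside this tree."""
    if not verify_digest(payload, expected_sha256):
        raise ValueError(f"verification failed for {name}")
    app_dir = INSTALL_ROOT / f"{name}-{version}"
    app_dir.mkdir(parents=True, exist_ok=True)
    (app_dir / f"{name}.exe").write_bytes(payload)
    # Track what is installed, mirroring the server-side database entry.
    (app_dir / "manifest.json").write_text(
        json.dumps({"name": name, "version": version, "sha256": expected_sha256}))
    return app_dir

def remove(name: str, version: str) -> None:
    """Removal is just deleting the tree -- the whole point of the design."""
    shutil.rmtree(INSTALL_ROOT / f"{name}-{version}")
```

Because the program owns its entire tree, an upgrade is simply "remove, then install the new version"; there is no shared state to break.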

OK, so that is what I envisioned back in 2000, and it may still happen, but I am now seeing further into the future. And what do I see? Something quite similar to what I envisioned with the Robust Internet experience, but a step further down that path, especially now that it is obvious that processing power, disk space, RAM, and bandwidth can support what I have in mind.

The future, in short: "applications" will be completely and totally self-sufficient, relying on nothing outside themselves except a network connection and the hardware they run on. How can this be? Won't applications need an operating system? Yes, but in my future "applications," the OS will be an integral part of the "application." Thus the "application" will be completely autonomous. In essence, each application will be the software you desire, already installed in a completely configured operating system that contains exactly what is needed for that software to perform its functions. So, if you want a word processor, that "word processor application" will be the word processor plus the OS it needs to run (plus, as I mentioned, proper network/Internet connectivity built in). If by now you think I have lost my mind, consider that what I am really talking about is highly specialized virtual machines that are pre-configured and ready to run. Though not quite what I foresee, the VMware Virtual Appliances are a precursor to my vision.

Microsoft is one company that wants to successfully combat the hosted-application siege coming at it from all sides (including Google and the like). I say, take it up a notch, Microsoft! As bandwidth, storage, processing power, and the like increase exponentially and the price per unit of each falls, it will be possible for Microsoft to offer pre-configured, purpose-built Windows virtual machines that target specific user needs. This is a bet that MS executives would probably find incredibly tough to take, but perhaps the sum of all sales of task-specific pre-configured VMs could actually exceed the sales that their traditional (complete desktop domination) approach can maintain in the future as other players come online with hosted solutions or solutions similar to what I'm discussing.

You need just a word processor? Well, Google may well offer an online word processor soon, or Microsoft could offer a Word Processing VM (one that runs only Word). Better yet, Microsoft, think of this: every software developer that wants to offer its Microsoft-centric solution in a pre-configured Microsoft Windows VM would pay Microsoft a reasonable per-client-VM license fee to host its application on your OS inside a VM. Notice I have not mentioned Linux yet; well, if you looked at that VMware appliance directory, you will have noticed that it is, predictably, all Linux/open-source-based appliances, since only such open-source solutions can be freely distributed. Take notice now, MS.

What I am proposing would require a significant paradigm shift for Microsoft: adopting and welcoming a fee per VM-hosted application in order to maintain its OS dominance. Moreover, some grand software vision must be implemented to make this all possible, whereby the OS is marginalized a bit (it is no longer the focus; the applications are), a "usage governor" is placed in the OS to allow only licensed applications to run in the OS-VM, and a simple inter-VM-connectivity framework is implemented to facilitate standardized inter-program communication between applications hosted in various VMs (much of this already exists via TCP/IP and the like, though a clean, simple abstraction layer could make it much simpler and more standardized), along with data storage on one or more VMs (and/or a "host" OS if desired).
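To give a feel for what I mean by a clean abstraction layer over TCP/IP, here is a toy sketch in Python. This is not any real framework; the one-JSON-object-per-line wire format and the function names are entirely my own illustration of how one application-VM might expose a service that another calls without caring how either is hosted:

```python
import json
import socket

# Illustrative wire format: one JSON object per line over a plain TCP socket.

def serve_once(port: int, handler) -> None:
    """One application-VM exposes a service; handler maps a request dict
    to a reply dict. Serves a single request, for demonstration."""
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    with conn:
        request = json.loads(conn.makefile().readline())
        conn.sendall((json.dumps(handler(request)) + "\n").encode())
    srv.close()

def call(port: int, request: dict) -> dict:
    """Another application-VM calls the service; the address could just as
    easily name a VM on the far side of the Internet."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        conn.sendall((json.dumps(request) + "\n").encode())
        return json.loads(conn.makefile().readline())
```

The point of the abstraction is that neither side knows or cares whether the other is a process, a VM on the same host, or a VM across the Net; the framework would handle discovery, licensing checks, and transport underneath.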

I personally believe that my ideal application-VMs should contain only the programs and OS needed to run the application, and that all user settings, user data, and the like should be stored on a "host" OS (or a specialized user-data VM), since this allows the application-VM to be completely and totally replaced at will (with an upgrade or whatever). And if you have been thinking, "Gee, how will I perform a Windows update on 20 Windows VMs?", I say not to worry: just download the entire application-VM from MS (or any application-VM vendor) with all the latest OS and application updates already applied, since inevitably bandwidth will support this! And there we are: the ultimate evolution of decentralized processing!
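The key design point above is the split between disposable application-VM images and persistent user data. Here is a minimal Python sketch of that split (paths and function names are illustrative only): settings always live on the host side, so swapping the entire VM image for an updated one touches nothing the user cares about.

```python
from pathlib import Path

HOST_DATA = Path("host_user_data")   # persistent: user settings and documents
VM_STORE = Path("vm_images")         # disposable: one image per application-VM

def save_setting(app: str, key: str, value: str) -> None:
    """Settings are written to the host (or user-data VM), never inside the app-VM."""
    d = HOST_DATA / app
    d.mkdir(parents=True, exist_ok=True)
    (d / key).write_text(value)

def load_setting(app: str, key: str) -> str:
    return (HOST_DATA / app / key).read_text()

def upgrade_application_vm(app: str, new_image: bytes) -> None:
    """Upgrading means replacing the whole image; data in HOST_DATA survives."""
    VM_STORE.mkdir(parents=True, exist_ok=True)
    (VM_STORE / f"{app}.img").write_bytes(new_image)
```

Run a save, then "upgrade" twice: the image is replaced wholesale each time, and the setting is still there afterward, which is exactly why no in-place Windows update of twenty VMs is ever needed.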

If anyone wants me to further expand on this vision, I will gladly address questions and ideas in a future posting. I have much more to say about this vision, but this posting should be enough to stimulate discussion :)

1 comment:

Kirby said...

Mike, you're insane. Now that we've cleared that up… I think your vision is not insane, especially considering where virtualization seems to be going. Having a virtualized application that includes the OS running under something like VMware Player makes total sense to me. I already do this for customer development work. Each customer project I work on is isolated in its own virtual machine. The VM serves one purpose and one purpose only: to be the development environment for a particular project. And by hosting the virtual application on a server, I can access the environment from any machine on the network. A key advantage to this approach is that I don't have to worry about the requirements of one project causing side effects on another project.

The virtual machine is the virtual application and the application is the development environment. Your vision just takes this one scenario further. Here's hoping your vision becomes a reality.