Friday, June 30, 2006
Spend and Save - Credit-Card Insanity
The marketing tactics that credit-card companies use never cease to amaze me. One that recently caught my attention for its complete disregard of reality is the American Express "One" Card. The major selling point of this card, judging by what all their advertisements focus on, is the "high yield savings account" that comes with it - well, that comes with your spending when you use this card.
Yes, that is right: their marketing strategy for this credit-card is quite literally "spend and save". Wow! Now that sounds like the American Dream! To save money, all I have to do is spend. And, the more I spend, the more I save! I must be dreaming! The funny thing is, this logic-sequence is basically what the TV commercials sound like. They keep stressing how contributions to a "high yield savings account" are tied to your spending. And, they show people running around spending money on everything imaginable, and then checking out the "savings statement" that results from all this spending. Give me a break!
Ok, let's look at the reality behind this offer. First, their "high yield savings account" currently pays a whopping 4.0% interest rate, which quite honestly isn't that bad. But, if you are a typical credit-card user and carry a balance, that "high rate" doesn't seem so high when compared to the current interest rate you will pay Amex on your balances (Prime Rate + 5.99%), or 14.24% (Prime is now at 8.25%). Perhaps a better way to say this is that if you carry a balance, you will currently still be paying Amex a net 10.24% on every purchase! There are other caveats to the plan that can further reduce your "savings" as well.
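If you want to see the arithmetic spelled out, here is a quick Delphi console sketch (the rates are hardcoded from the figures above, and will obviously change as Prime moves):

```pascal
program AmexNetRate;
{$APPTYPE CONSOLE}
uses SysUtils;
const
  PrimeRate   = 8.25;   // current Prime rate (per the figures above)
  AmexSpread  = 5.99;   // Amex charges Prime + 5.99% on carried balances
  SavingsRate = 4.00;   // the advertised "high yield" savings rate
begin
  // 8.25 + 5.99 = 14.24% paid on carried balances
  WriteLn(Format('Rate paid on carried balances: %.2f%%',
    [PrimeRate + AmexSpread]));
  // 14.24 - 4.00 = 10.24% net cost, even with the "savings" credited back
  WriteLn(Format('Net rate after the savings offset: %.2f%%',
    [PrimeRate + AmexSpread - SavingsRate]));
end.
```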
So, keep in mind: as long as you spend money, you are not really saving money! Period. The only people who should consider the Amex deal are those who carry no balance on their credit-cards; they are the only ones who can make any credit-card offer really pay off.
A REAL "savings plan" is just that: save! In other words, cut your spending and place the money into a savings account or other investment vehicle. Plenty of good money-market rates are out there that beat Amex's "high yield" of 4% too. ING DIRECT currently pays 4.35% on their money market. Various banks in my area are running promotional short-term (13 month or so) CD rates of 5.25-5.35% now. Not bad for short-term yields.
But, if you really feel that you must spend to save, I'll make you a great offer. For every $1000 you send me, I'll put 50% of it into a high-yield savings account for you! Send as much as possible, because the more you send, the more you save! :)
Tuesday, June 27, 2006
Commercial Gluten-Free (GF) Desserts
So, I just went to an open-house/tasting for a baker who is entering the Gluten-Free (GF) foods market here in the Cleveland area. She (calling herself "Joi Gluten Free") baked up three different snacks/desserts that she is offering for sale at a local health-food and GF foods store in the area.
The selections, and my opinion of each, included:
- Creme-Filled Chocolate Cupcakes - the taste was just OK; the consistency wasn't bad, but was not exceptional either. Of the three items this lady presented, these were the best (which isn't saying much).
- Lemon-Filled White Cupcakes - horrendous! Absolutely disgusting. The "lemon filling" was like a lemon chewing gum that you would not want to chew.
- Individual "Cheesecakes" with Cherry Topping - lacking in taste, with sub-par consistency, and these "cakes" were a total cop-out, having no crust whatsoever. Lame. How hard is it to sell cheesecake filling and have it be GF (it's basically cream cheese, sugar, and eggs!)?
The thing that really burns me up is the COST of these GF products, which would be bad enough to swallow if the products tasted good, and is ridiculous for products that are at best "OK". For example, the lady was charging $11.29 for FIVE CUPCAKES! This is insane! How much would five "ho-ho's" cost these days -- $1.00? OK, so they are "specialty" items, and I could see them being as much as a buck apiece, but not $2.25/cupcake. And, they are made with ingredients that really aren't that much more expensive than regular wheat-containing ingredients (in fact, if you know where to buy the rice flours, tapioca, sorghum, and such, it can be just as cheap). If we were talking gourmet restaurant-quality foods here, the price would be another issue; but we are not.
The next issue I have with commercial GF stuff is this "need" that GF bakers seem to have for adding all sorts of gums, binders, preservatives, and other artificial this-and-that to the mix. This particular baker, with whom I discussed the issue, insisted that these things are needed or the stuff will just fall apart! Uh... WRONG! Once again, we do not use any gums at all to hold our desserts together, and they are just like the real thing. Sure, we don't have to have a 2-week shelf life, so preservatives are out for us too. Some of the things used in this particular batch of "commercial" desserts included: BHA, Guar Gum, Fumaric Acid, Mono- and Diglycerides, Potassium Sorbate, Partially Hydrogenated Vegetable Oil, Polysorbate 60, Yellow #5 & 6, Red #40, Erythorbic Acid, Xanthan Gum, and Carob Gum. Whew! That is one heck of a lot of ingredients that *I* do not feel like consuming! And, the sad fact is, all that stuff did nothing to make these desserts taste good, so why use them?
So, demand more from commercial Gluten-Free Foods producers. Do not purchase their sub-par goods; make them raise the level of their baking to where their products taste just like the "real" thing! If they can't figure out how to make something that is restaurant-quality, tell them to call me and we can work out a deal.
Sunday, June 25, 2006
Furthering the Open-Source and Free-Source-Code movement
As of a couple days ago, I began to release some of my programming source code, best-practices documentation, and other technology to the public, free of charge. I am a firm believer in the Open Source Software (OSS) movement, and have directly benefited from the likes of Linux and other OSS software projects. Now, I am joining the ranks of those contributing to free software and source-code in an attempt to further software development as a whole.
Free Software, Source Code and Best-Practices Documentation is now available.
[UPDATE: Jan-2017, the source code and other documents I used to host externally have been moved onto this blog directly where at all possible.]
I will be openly sharing aspects of my software development experience with Borland Delphi Programming, Microsoft SQL Server Programming and Optimization, and much more. At this time, only a few items are available, as I have just started this process of releasing free source code, but you can expect the list of source code, functions, procedures, objects, how-to articles, best-practices, whitepapers, techniques, methodologies, and the like to grow.
Needless to say, any and all source code, software, and the like provided freely to the public is without warranty or guarantee of any kind, and is provided 'as-is', without any express or implied warranty. In no event will the authors be held liable for any damages arising from the use of this software. Permission is granted to anyone to use this software for any purpose, including commercial applications, and to alter it and redistribute it freely, subject to certain restrictions (see individual source code files for details).
Continue to read this Software Development and Technology Blog for computer programming articles (including useful free / OSS source-code and algorithms), software development insights, and technology Techniques, How-To's, Fixes, Reviews, and News — focused on Dart Language, SQL Server, Delphi, Nvidia CUDA, VMware, TypeScript, SVG, other technology tips and how-to's, plus my varied political and economic opinions.
Friday, June 16, 2006
Software Prognostications
I've regularly contemplated what the future holds for software on a macro level. Over the years, I have observed the cycle between centralized and decentralized computing, and every time I see centralized solutions come to the forefront, I look past them to see yet another round of decentralization on the horizon.
I believe that centralized computing paradigms, including the current push to offer "hosted solutions", are reflective of a lack of sufficient system-interconnectivity bandwidth to support what ultimately becomes the follow-on generation of more widely distributed computing solutions. Mainframes and minicomputers with attached terminals pre-dated the client-server push; client-server solutions came into their own as Ethernet bandwidth in the office expanded; HTML and "thin client" solutions then pushed processing back to the servers due to Internet (and inter-office Intranet) bandwidth limitations (sacrificing user experience for widespread accessibility); and richer graphical web-interfaces using Flash, client-side Java, AJAX, and the like have since pushed some processing back to the clients. So, what is next?
Since the late 1990's, with the advent of widespread high-speed Internet connections, I have said that the time will come (soon) when networking speeds will allow for software with very powerful client-side processing and robust GUIs (Graphical User Interfaces) to dominate the desktop once again (supplanting lame or lackluster HTML and web-applications). Many programs now do this, using powerful client-side software with rich GUIs to exploit the processing power of the client machine (aka, PC) as well as the network bandwidth of the Internet and the power of Servers on the Net. Some examples include BitTorrent clients, stock-trading interfaces, and so on. Again, what is next?
Well, we aren't quite where I wanted to see us by now. I envisioned a world of native-executable applications being downloaded on demand over the Internet as users need a particular bit of functionality. In late 2000, the Cleveland software development and consulting firm (Intersoft Development, Inc) that I owned and was CEO of actually created a rudimentary software infrastructure to support the hosting, distribution, verification (authenticated/secure software signatures), and automatic updating of native executable applications -- calling the whole thing "Robust Internet", featuring the "Robust Widget/Package Manager". We stored full-blown single-file EXE's on a server, along with various information in an accompanying database (like software description, owning company, version info, software dependencies including OS, and so forth), coupled with the "Robust Widget Manager" GUI that allowed users to: search (over the web) for particular software/packages; download the desired software (executable); verify the issuer/certificate; track what downloaded software was currently available on their machine; and update/remove as desired. Some companies have emulated this methodology to some extent since, though still not quite as I envisioned. I still think it is a viable method of robust client-side software distribution that would eliminate nearly all the hell that accompanies installing and removing programs, since any and all files needed by a program were kept in one directory-tree "owned" by that downloaded program and only that program (making updates and removals a snap: no DLL dependencies, no inter-program conflicts, etc.). And, now that disk space is nearly free and RAM is also quite affordable, Dynamic Link Libraries (DLLs) should generally be a thing of the past - they served their purpose, and no longer make much sense on the client.
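Just to give a flavor of the idea (this is my own illustrative sketch, not the actual Robust Widget Manager code; all names and values here are hypothetical), a catalog entry and a naive update check might look something like this in Delphi:

```pascal
program CatalogSketch;
{$APPTYPE CONSOLE}
uses SysUtils;
type
  // Hypothetical catalog entry mirroring the metadata described above
  TPackageInfo = record
    Name          : string;
    Version       : string;
    Publisher     : string;
    RequiredOS    : string;
    DownloadURL   : string;
    CertificateID : string;  // ID of the publisher's signing certificate
  end;

// Naive stand-in for real version parsing and comparison
function UpdateAvailable(const Installed, Available: string): Boolean;
begin
  Result := CompareStr(Available, Installed) > 0;
end;

var
  Pkg: TPackageInfo;
begin
  Pkg.Name          := 'ExampleWidget';              // hypothetical package
  Pkg.Version       := '1.2.0';
  Pkg.Publisher     := 'Example Software, Inc.';
  Pkg.RequiredOS    := 'Windows 98SE or later';
  Pkg.DownloadURL   := 'http://example.com/widgets/ExampleWidget.exe';
  Pkg.CertificateID := 'CERT-0001';

  if UpdateAvailable('1.1.9', Pkg.Version) then
    WriteLn('Update available: ', Pkg.Name, ' ', Pkg.Version);
end.
```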
Ok, so that is what I envisioned back in 2000, and it may still happen. But I am now seeing further into the future. And, what do I see? Something quite similar to what I envisioned with the Robust Internet experience, but a step further down that path, especially now that it is obvious that processing power, disk space, RAM, and bandwidth potential can support what I have in mind.
The future, in short: "applications" will be completely and totally self-sufficient, relying on nothing outside themselves except a network connection and the hardware they run on. How can this be? Won't applications need an Operating System? Yes, but in my future "applications", the OS will be an integral part of the "application". Thus the "application" will be completely autonomous. In essence, each application will be the software you desire, already installed in a completely configured Operating System that contains exactly what is needed for that software to perform its functions. So, if you want a word-processor, that "word processor application" will be the word-processor plus the OS it needs to run (plus, as I mentioned, proper network/Internet connectivity built in). If by now you think I have lost my mind, consider that what I am really talking about is highly specialized virtual machines that are pre-configured and ready to run. Though not quite what I foresee, the VMWare Virtual Appliances are a precursor to my vision.
Microsoft is one company that wants to successfully combat the hosted-application siege that is coming at it from all sides (including Google and the like). I say: take it up a notch, Microsoft! As bandwidth, storage, processing power, and the like increase exponentially and the price per unit of each falls, it will be possible for Microsoft to offer pre-configured, purpose-built Windows Virtual Machines that target specific user needs. This is a bet that MS executives would probably find incredibly tough to take, but perhaps the sum of all sales of task-specific pre-configured VM's could actually exceed the sales that their traditional (complete desktop domination) approach is able to maintain in the future, as other players come on line with hosted solutions or solutions similar to what I'm discussing.
You need just a word processor? Well, Google will likely offer an online word processor soon, and Microsoft could counter with a Word Processing VM (one that runs only Word). Better yet, Microsoft, think of this: every software developer that wants to offer its Microsoft-centric solution in a pre-configured Microsoft Windows VM would pay Microsoft a reasonable per-client-VM license fee to host its application on your OS inside a VM. Notice I have not mentioned Linux yet -- well, if you looked at that VMWare appliance directory, you will have noticed that it is predictably all Linux / open-source operating-system-based appliances, since only such open-source solutions can be freely distributed. Take notice now, MS.
What I am proposing would require a significant paradigm shift for Microsoft - adopting and welcoming a fee-per-hosted-application-VM model in order to maintain its OS dominance. Moreover, some grand software vision must be implemented to make this all possible, whereby the OS is marginalized a bit (it is no longer the focus; the applications are), a "usage governor" is placed in the OS to allow only licensed applications to run in the OS-VM, and a simple inter-VM-connectivity framework is implemented to facilitate standardized inter-program communication between applications hosted in various VMs (much of this already exists via TCP/IP and such, though a clean, simple abstraction layer could make it much simpler and more standardized), along with data-storage on one or more VMs (and/or a "host" OS if desired).
I personally believe that my ideal application-VMs should contain only the programs and OS needed to run the application, and that all user-settings, user-data, and the like should be stored on a "host" OS (or a specialized user-data-VM), since this will allow the application-VM to be completely and totally replaced at will (with an upgrade or whatever). And if you have been thinking "gee, how will I perform a Windows update on 20 Windows VMs?", I say not to worry: just download the entire application-VM from MS (or any application-VM vendor) with all the latest OS and application updates already applied, since inevitably bandwidth will support this! And there we are: the ultimate evolution of decentralized processing!
If anyone wants me to further expand on this vision, I will gladly address questions and ideas in a future posting. I have much more to say about this vision, but this posting should be enough to stimulate discussion :)
Thursday, June 15, 2006
Resurgence in Native Windows Executable Demand? (Win32 EXE's)
Is it possible that demand for native Microsoft Windows executables (Win32 EXE's) is growing, even in the wake of Microsoft's DotNet 2.0 technology? Perhaps so.
I was speaking recently to a colleague of mine over at White Peak Software in Salem, Mass., who has witnessed firsthand a recent resurgence in demand for software development services to specifically create native (compiled) 32-bit executables for Windows (as opposed to the newer DotNet 1.1/2.0 technologies).
It seems there are a variety of reasons for this recent rebound in native-code development, including:
- Removal of programmatic dependencies. By contrast, if you develop with DotNet you automatically depend on the DotNet Framework, and therefore must count on all machines having that framework installed -- which isn't a safe assumption. Likewise with Java development, which often assumes the latest JRE (Java Runtime Environment), certain Java Swing libraries, and so on are available on every computer running the application.
- Simplicity of deployment. The beauty of a single-file executable (EXE) is hard to match: if any and all dependencies are compiled right into the EXE, there is nothing else to deploy. One file. End of story. (See the short Delphi sketch after this list.)
- Simplified upgrades. Your code is much less likely to be "broken" (or otherwise rendered incompatible) by upgrades to frameworks, runtime libraries, DLLs, and the like. And, if my executable changes, I have found that even non-technical users can quickly deal with replacing one EXE with another (heck, it is just a file!) if I Email them an update.
- Raw processing speed and small memory footprint. Keep in mind that when a DotNet or Java application appears to be quite small, don't discount the fact that behind the scenes that 500K "program" is also making use of a 20+MB framework loaded into memory! Suddenly those "lightweight" applications do not appear so lightweight when compared to a 1MB EXE that is just 1MB total - no framework dependencies.
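As a trivial illustration of the single-file idea mentioned above, a bare-bones Delphi console program like this compiles to one self-contained native EXE (a generic sketch, not from any client project):

```pascal
program SingleExeDemo;
{$APPTYPE CONSOLE}
uses SysUtils;
begin
  // The compiler links everything this program needs into one native EXE.
  // Copy the file to another Windows machine and it just runs --
  // no .NET Framework, no JRE, no external DLLs to install first.
  WriteLn('Running from: ', ParamStr(0));
  WriteLn('Current time: ', DateTimeToStr(Now));
end.
```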
Here are some of the things Kirby Turner, the CEO of White Peak Software, had to say about recent demand for Win32 EXE development, and how he has used Borland Delphi 2006 to solve customer demands and deliver top-notch native, compiled software to his clients:
"I’ve been doing non-stop Delphi programming for the last week [...] and, [CompanyName] isn’t the only Delphi customer work I’m doing. I just finished a small Delphi application for a Philladelphia customer, and I should hear by week’s end about another Delphi project. The trend in the last month has been the need for data import / data generation programs where the customer doesn’t want any outside dependencies in the program, i.e., no .NET Framework.
The call I had on Monday was probably the best. The customer said, “I’m sure you want to do this program using the latest .NET Framework, but I can’t guarantee the end user will have it installed. It’s unlikely .NET Framework 1.1 is even installed.” My response was, “Actually, for this type of program I prefer to write it as a native Win32 application. I can deliver a single EXE that will run on any Windows platform from Windows 98SE and up, and it does not require any additional installs or frameworks. In other words, no .NET Framework.” The customer got very excited.
I’m starting to sense that customers want native Win32 applications, but are:
- afraid to ask for it;
- believe it’s not possible any more;
- and, if it is possible, then it will cost a lot of money.
Yes, Kirby, I agree: we need to know whether this is more than a short-term trend, and market accordingly. But I believe that marketing the advantages of a natively compiled Windows executable (developed with Borland Delphi or another RAD environment that supports native output) can create a "trend" in itself. I have seen it so many times: software developers and development managers fall victim to trying to fit the "latest and greatest" technology (like DotNet) to every new project they have, whether it makes sense or not. So, the advantages of native executables must somehow be marketed to those in charge of software development projects in a manner that counters the prevailing market forces (like the massive Microsoft push for DotNet-only solutions).
Regardless, this recent demand for Win32 EXE development gives me hope that not everyone is falling victim to the hype surrounding DotNet and other framework-dependent and runtime-library dependent development methodologies. Sometimes "older" or "outdated" ways of doing things can still be the right way!
Thursday, June 08, 2006
BitTorrent for Software Distribution
Have you ever wanted to download the latest version of an Open Source Software release, or a free beta release of a commercial software product but been unable to do so (in a timely fashion) because the download site was overwhelmed? Well, this is where BitTorrent technology may be able to help out.
BitTorrent is the name of a peer-to-peer (P2P) file distribution client application, and also of a file-sharing protocol, designed to distribute large amounts of data widely without the corresponding cost in centralized server and bandwidth resources. It allows groups of computers to share files with one another, instead of having to access a central repository, thus reducing or eliminating the bottleneck of a central download site.
The other day, I went to download the latest and greatest Knoppix 5.0.1 DVD-edition ISO file (just over 4GB in size) using a direct-download link (non-torrent), only to have it take nearly a day and ultimately fail at around 98% complete! What a waste of time. So, I installed the latest Azureus Java BitTorrent client and used it to download that same Knoppix 5.0.1 DVD in about 4 hours - it went flawlessly!
The BitTorrent client (Azureus in this case) is an amazing piece of work. It provides you with an incredible amount of control over the entire file-sharing process, and gives you plenty of visual and text indicators to let you know exactly what is going on. You have to see it in action to appreciate it.
Basically, to download a desired torrent, like the openSUSE Linux 10.1 DVD, from the Azureus client you just go to File / Open Torrent / Add URL, and paste in the URL of the torrent (i.e., file) you wish to acquire. You can use the link to openSUSE that I provided as an example (which is a great piece of software in itself). Then, Azureus will locate other "seeds" and "peers" from which to download the software you want. Here is a link to a list of Azureus BitTorrent terminology and definitions you may find helpful.
Keep in mind that for this technology to work well, you must also share the file(s) once you have them in your possession, since, if others had not shared, you would never have had the opportunity to get them. So, return the favor, and allow the BitTorrent client to run and act as a "seed" for others once your download is complete. You can throttle the upload bandwidth with very fine-grained settings - including per-file upload limitations as well as global limitations. Also, if you are concerned about security, run the entire BitTorrent process from within a dedicated VM (Virtual Machine); if that is a new concept to you, see my prior posts regarding Virtual Machine Technology with VMWare.
After using BitTorrent (Azureus) to successfully download Knoppix and openSUSE, and doing so in much less time than would have been required if downloading from a single central server, I am a firm believer in the Torrent technology and sharing. And, I live by what I say: I keep a BitTorrent client running well after I am done downloading in order to help others quickly acquire these latest open-source offerings. My Share-Ratio is 1.5 currently, meaning I have uploaded 150% of the amount of downloading I have done. It is recommended that you at least achieve a 1.0 ratio, sharing at least the same amount of bandwidth that you have used from others. And, in the case of OSS (Open Source Software), it is a great way to further the cause of your favorite operating systems and applications.
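The share-ratio arithmetic is trivial; here is a throwaway Delphi sketch with made-up transfer totals (the numbers are hypothetical, chosen only to show how a 1.50 ratio arises):

```pascal
program ShareRatioCalc;
{$APPTYPE CONSOLE}
uses SysUtils;
const
  DownloadedMB = 4000;  // hypothetical: roughly one DVD-sized ISO
  UploadedMB   = 6000;  // hypothetical: amount seeded back to peers
begin
  // 6000 / 4000 = 1.50 -- i.e., 150% of the downloaded volume re-shared
  WriteLn(Format('Share ratio: %.2f', [UploadedMB / DownloadedMB]));
end.
```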
Thursday, June 01, 2006
Virtual Machine Advantages with VMWare
Stop wasting your time "setting up" and "fixing" your computers!
Ask yourself:
- How many times have I purchased a new computer, only to then waste hours (or days) installing the operating system (OS) and my favorite applications?
- How long did it take to restore my backup when my hard-drive crashed last time (better yet, how long did it take me to find a recent backup)?
- How difficult was it to move my application settings over to a different machine?
- When was the last time I installed an application on my system only to cause chaos for other existing applications or cause the system to crash completely?
- How long will it take me to setup 25 computers with a similar development environment for each member of our programming team?
- Oh no! Did I just get a Virus from that Word document a friend sent me?!!
- Finally: how many machines do I have to host all my favorite OS's and applications? Gee, I wonder why my electric bill is so high! :)
If you have yet to put VM technology to use, whether for your personal computing needs or for your business, you are really missing out on some incredible productivity and efficiency opportunities. VMs can save you some serious time, make better use of your computing hardware, and nearly eliminate the pain associated with setting up a new computer or computing environment. In more industry specific terms, VM technology enables:
- server / desktop consolidation;
- simplified development and testing environment setup;
- easier business continuity plan execution;
- simplified server / desktop environment management and security.
If you are still not following what the technology is, I suggest reading this article entitled "what is virtualization technology". Better yet, just dive right in. How? Simple. Download and install the free VMWare Player. Next, get ready for something really exciting! There is a super simple way to see a VM in action: just download a pre-configured VM! There are many pre-configured VMs available for download, referred to as Virtual Appliances, featuring all sorts of pre-configured applications and operating environments. VMWare hosts a directory of virtual appliances to help you quickly find one that fits your needs; currently over 50 choices exist, and more are showing up every day! I recommend trying one called "Damn Small Linux" for starters since, as its name implies, it is a reasonably small download (55MB). Once you "unzip" the download into a directory of your choosing, simply double-click on the ".vmx" file and watch your VM start to run in a window of its own, as if it were a computer running on your computer (which it is -- just a software-only computer!).
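For the curious, that ".vmx" file is just a plain-text settings file describing the virtual hardware. A minimal one looks roughly like the following (the keys are typical VMWare settings, but the values here are illustrative rather than copied from the actual Damn Small Linux appliance):

```
config.version = "8"
virtualHW.version = "3"
displayName = "Damn Small Linux"
guestOS = "linux"
memsize = "128"
ide0:0.present = "TRUE"
ide0:0.fileName = "dsl.vmdk"
ethernet0.present = "TRUE"
```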
Now, if the lightbulb in your brain has gone on and you think "now I get it!", that is great! And, perhaps you see how VM technology can make your life easier. But, just in case it is still a mystery to you: why would you want this VM anyhow? That's simple. You can keep the host-system installation and configuration to a bare minimum, and invest your time getting your VMs to contain the applications you need access to most. Then, you can easily move/copy that VM to another machine and run it from there on short notice. Because the "machine" is just a few files, copying it, backing it up, and moving it is quite simple. And, you can make "snapshots" of the VM at different points in time (and, using VMWare Workstation, you can easily manage these snapshots, roll back to prior versions, and so forth).
By way of examples, perhaps I can further encourage thinking about how VMs can help you:
- Let's say I have 4 different PCs running web servers that each host their own web site. These PCs are rarely at or near capacity in terms of computing power utilization. In fact, they are at a fraction thereof. Wouldn't it be nice if I could have all of these web-sites remain completely independent of each other (since one uses PHP, another uses Python, one DotNet 1.1, another DotNet 2.0, and that is just the web-dev language!) and yet get rid of some machines? This is where the "server consolidation" side of VM technology is a perfect fit. Replace the four PCs with one box capable of running all the sites, and have each of what was a physical web-server now be a virtual web-server. With ample NICs, each can even be bound to its own physical network connection.
- I have a complex developer-desktop configuration where each person in my team needs a Windows XP environment with Visual Studio, SQL-Server Client tools, OpenOffice, a few utilities, and so on. Well, this is a perfect time to create a VM with these required applications, and distribute the VM to each of the team members. When an update to Windows needs to be applied, or a Visual Studio Upgrade is ready for use, and so on, simply roll out the new developer-VM to the masses. Note: this implies that the VMs will not be altered by the team members lest their changes be lost when the new VM is put in place; I will discuss in the future ways to minimize issues with this approach.
- Ease movement and migration of entire systems;
- Simplify large-scale deployment of a common system configuration;
- Quickly restore to a particular state in the event of a system failure, program-installation chaos, or virus infection or other issue (notice: you still need to have a snapshot/copy of your VM stored somewhere safe, but it is as simple as copying a directory with a few files in it -- I burn DVDs with my VM images for safe keeping);
- Reduce the number of physical machines that are sitting in your home or office sucking electricity, taking up space, and producing excess heat.
- Easily test new programs and/or OS's, especially with pre-built "appliance" VMs and/or ISO-Images (see my prior post on Linux Live CDs that discusses running bootable-ISO images in VMWare).
Also, keep in mind that VMs do not get you out of proper licensing. I.e., if you distribute 25 copies of your development VM to 25 desktops, and that VM uses Windows XP or another commercial OS (plus any commercial applications you may have installed), you need to purchase as many of the appropriate licenses as you are using simultaneously. Consult with your software provider(s) for the specifics of how they treat licensing for VMs.
A final note: there are even ways to convert a physical machine into a virtual machine (P2V), and vice-versa. That is a bit out of scope for now, but it is possible, and lends even more flexibility to computer utilization scenarios.
Consider the possibilities for simplifying your computing life with VM / virtualization technology now, and act soon - you have nothing to lose, and much to gain!