Wednesday, March 11, 2009

Software Bloat: Acrobat Reader Executable Size Growth over the years

Does this image say it all? Since I develop various software applications that, among other things, create reports in Adobe's Acrobat/PDF format, I tend to keep quite a few versions of Acrobat Reader around for testing my program output. And I just could not help laughing when I looked at my file-server's Acrobat Reader executables directory, where, since 1997, I have accumulated versions of Acrobat Reader 3.2, 4.0, 5.0, 6.01, 7.05, 8.12, 9.0, and now the newest addition, Acrobat Reader 9.1.

Check out the Software Bloat factor in Acrobat Reader over the years:

[Chart: Acrobat Reader installer download size by version, 1997 through 2009]

What started as a simple utility to display PDF files (Portable Document Format files from Adobe) has grown from an installation program of under 4 megabytes (MB) to a whopping 42MB installer for Adobe Acrobat Reader 9.1 - which now includes things like Adobe AIR and Acrobat.com and all sorts of other ridiculous crap.

I find this all crazy... how a PDF-reader application can grow to more than 10-TIMES its size (for the installer executable download) in a period of 12 years. Sure, it is no worse than Microsoft Windows or most other major applications that also seem to grow by orders of magnitude over the years, but why?

Is there really that much more functionality in the applications these days to justify this explosive file-size increase, or is it lazy software development, a lack of tuning, a lack of focus on clean and efficient code and re-use, or a combination of all the above? It is just hard for me to accept that a program designed to let me view PDF files on any computer requires a 40MB download (I will not even get into how insanely large the post-installation size requirements for this application are!).

I remember the days when I had an entire operating system (on a TRS-80) running in a mere 32K of RAM, with simple word-processing programs, games, and the like. Sure, those programs were much simpler, and did not have modern GUI desktops and flash, but the entire OS AND PROGRAMS ran in roughly 1/1000th the memory of this latest Acrobat Reader installation program download. Unreal.

This raises the question: what will Acrobat Reader 10 be like, or 11, or 12, or 13, or 14? Will we eventually hit a ONE GIGABYTE download for their future PDF reader? Given the history, the installer has grown to nearly 11 times its original size in about 12 years, and the version numbers go up by about one every 2 years... so, in roughly another decade, expect a 1/2-GB install of Acrobat Reader 14! It is coming! And you will be using it (note: I use Acrobat Reader nearly daily; many people do).
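
For anyone who wants to check my back-of-the-envelope math, here is a quick sketch (Python, purely illustrative) that takes the 4MB and 42MB installer figures quoted above, computes the implied compound annual growth rate, and extrapolates it forward; the assumption of steady compound growth is mine, and the figures are rough.

```python
# Rough extrapolation of Acrobat Reader installer-size growth.
# Assumes the ~4 MB (Reader 3.2, 1997) and ~42 MB (Reader 9.1, 2009)
# download sizes mentioned above and steady compound growth.

size_1997_mb = 4.0     # approximate installer size of Reader 3.2
size_2009_mb = 42.0    # approximate installer size of Reader 9.1
years_elapsed = 2009 - 1997

# Implied compound annual growth rate of the installer download.
cagr = (size_2009_mb / size_1997_mb) ** (1.0 / years_elapsed) - 1.0
print(f"Implied growth rate: {cagr:.1%} per year")    # about 21.6% per year

# Project the same growth rate forward.
for years_ahead in (10, 12):
    projected_mb = size_2009_mb * (1.0 + cagr) ** years_ahead
    print(f"In {years_ahead} more years: ~{projected_mb:.0f} MB")
# Another 12 years of the same growth lands around 440 MB -- in the
# neighborhood of the half-gigabyte install guessed at above.
```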

Tuesday, March 10, 2009

Mark-to-Market Solution : Mark-to-Moving-Average

I am of the opinion that, although mark-to-market (M2M) valuation principles make sense in some accounting situations where asset and financial-instrument market prices are readily obtained (on open exchanges - like commodity exchanges, ForEx, etc.), this M2M accounting methodology does not make much sense for assets like homes and property, which are hard to accurately value at any given time. I'd go so far as to put that *value* in quotation marks; i.e., these are assets for which a "value" is hard to determine at any point in time, because there is not always a willing seller and/or buyer for the assets, or any agreement on the price at which a buyer and a seller would both "value" the asset.

Mark-to-Market accounting is in the news a lot these days due to requirements from the SEC and other regulatory bodies that banks and other financial institutions and the like "value" their assets on a mark-to-market (M2M) basis. Well, this is creating a LOT of problems for banks, insurance companies, and many other firms, as it is often difficult (if not impossible) to ascertain the current "value" of every asset held in a portfolio (be it loans, property, or even bonds for which there is no *current* market).

How does one determine the value of assets (like sub-prime mortgages and such) for which there is no current buyer? Is a financial firm simply to be forced to "value" all of its loans at what are essentially fire-sale (or lesser) values of barely pennies on the dollar because there is no *current* buyer for those instruments? Many of these loans have physical property behind them (i.e., homes), and most of those homes are NOT worth just a few cents on the dollar. Sure, there has been a huge decline in underlying home prices and in the value of other assets subject to M2M accounting, but there were also nearly equally inflated prices in these assets over the past few years (which helped encourage the "Bubble" and the ensuing financial crisis).

FIXING MARK-TO-MARKET ACCOUNTING!
So, let's address the pitfalls of mark-to-market accounting now, with a solution that recognizes that quickly-rising asset prices (which can inflate financial "bubbles") are just as damaging as the quickly-falling asset prices that follow when those bubbles burst.

I see an EASY solution to all of this mess: simply value assets using a MOVING AVERAGE, and perform a "mark to moving average" accounting strategy to "smooth" values over time. Why do assets always need to be valued as of "right now"? That approach encourages panic: whenever there is an imbalance between the supply side and demand side (i.e., sellers and buyers) of an asset, rapid oscillations produce excessively inflated or deflated "values" when "value" is treated as a point-in-time snapshot.
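
To make the idea concrete, here is a minimal sketch of what a "mark to moving average" calculation could look like. The function name, the window length, and the quarterly price series are all hypothetical choices of mine for illustration, not any regulator's actual method.

```python
# Hypothetical "mark to moving average" valuation sketch.
# Instead of marking an asset at its latest observed price,
# mark it at the average of the last N observed prices.

def mark_to_moving_average(prices, window):
    """Return the smoothed 'value': the mean of the last `window` prices."""
    if len(prices) < window:
        window = len(prices)      # not enough history yet; use what exists
    recent = prices[-window:]
    return sum(recent) / len(recent)

# Invented quarterly prices for one asset: a run-up, then a crash in
# which the only trades are fire-sale prints.
quarterly_prices = [100, 108, 121, 140, 165, 180, 90, 40, 25]

spot_mark = quarterly_prices[-1]                               # classic mark-to-market
smoothed_mark = mark_to_moving_average(quarterly_prices, window=8)

print(f"Mark-to-market (latest price):   {spot_mark}")         # 25
print(f"Mark-to-moving-average (8 qtrs): {smoothed_mark:.1f}")  # ~108.6
```

The smoothed mark neither credits the bubble-era peak of 180 nor collapses to the fire-sale print of 25, which is exactly the "smoothing" effect described above.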

Recently, Federal Reserve chairman Ben Bernanke suggested that regulators need to examine mark-to-market accounting during financial crises like the one we are currently experiencing, but he also rejected calls for an immediate suspension of mark-to-market practices. He discussed how, when markets for certain financial vehicles/assets "dry up", mark-to-market can be misleading (note to Ben: during upward bubbles, mark-to-market can contribute to over-enthusiasm too, as balance sheets start looking exceptionally great while "value" rises extraordinarily fast; see the past 7 years for reference).

So, please, someone from the Federal Reserve, SEC, FDIC, FASB (the keepers of GAAP), or the other regulatory and accounting-policy-setting players: consider something like a "mark-to-market-moving-average" strategy. The moving-average TERM will certainly need to be debated (e.g., does a 5-year moving-average home sales price in a region work well, or a 3-year, or a 10-year?), as will the data used to calculate such averages (various home-price surveys, sales data, etc.). The longer-term TRENDS and moving-average historical values matter more than anything for determining a fair "value" for many assets.
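
As a toy illustration of why the TERM matters, here is one more small sketch; the annual regional price-index numbers are invented purely to show how the choice of averaging window changes the resulting mark.

```python
# Illustrative only: how the choice of averaging TERM changes the mark.
# The annual home-price index values below are invented data.
from statistics import mean

annual_index = [100, 104, 112, 125, 150, 185, 210, 160, 120]

for term_years in (3, 5, 10):
    window = min(term_years, len(annual_index))   # cap at available history
    mark = mean(annual_index[-window:])
    print(f"{term_years}-year term: mark = {mark:.1f}")

# Roughly: 3-year ~163, 5-year ~165, 10-year ~141, versus a latest
# "spot" value of 120. The longer the term, the less the mark is
# dominated by the bubble-era run-up -- and choosing that term is
# exactly the debate described above.
```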

The fact is, right now, average selling price is nearly useless as an indicator, because it reflects the sale of a boatload of foreclosed properties at fire-sale prices, and it does not mean most other people would consider selling (or need to sell) their homes at similar discounts. Again, this is where the "smoothing" effect of a moving-average M2M approach would be VERY helpful for everyone involved.

I cannot help thinking how, over the years, we have all become so used to ever-faster and ever more "instant" information that we have thrown out common sense in favor of trying to always obtain a "right now" look at everything (be it asset pricing or many other things). Mark-to-market in its current form is a perfect example of where data can be TOO INSTANT or current, and such data is actually misleading (if not outright harmful) compared to longer-term trends.

RESTORING REASON TO ASSET VALUATION
The entire goal of my proposed mark-to-moving-average accounting is to remove these wild oscillations that are so damaging to our economy and financial system (both during "boom" times upward and "crash" times downward).

The idea should help prevent situations like people in San Diego bidding 30-percent over asking price (and other such ridiculous behavior as we all witnessed over the past "boom" years) and then having that newly-paid price become the representative "value" of a home, as well as prevent situations where a $100K house (or the mortgage on one) is now only "valued" at $2K just because there are no current buyers interested in it. Each of these extremes needs to be avoided. And mark-to-market has only encouraged each of these scenarios, by allowing banks to watch their own balance-sheet "values" explode during boom times (and feed further insane lending based on that "value"), and go down the toilet during the bust cycles (bringing lending to a standstill as well). This can all be fixed.