Posted on 12/17/2001 4:33:52 AM PST by damnlimey
I do. Smaller is better.
Every line of code has a finite chance of having a defect. Every defect has a finite chance of remaining undetected in the released product. Every released defect has a finite chance of causing unintended operation. But you and Mr. Gates apparently aren't too worried about that.
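To put rough numbers on that compounding, here's a back-of-the-envelope sketch in C++ (the per-line defect rate is a made-up figure for illustration, not a measured one):

#include <cmath>
#include <iostream>

int main() {
    // Assumed, purely illustrative: one latent defect per 10,000 lines.
    const double p = 0.0001;
    const long sizes[] = { 10000, 100000, 1000000, 10000000 };
    for (long n : sizes) {
        // P(at least one defect survives) = 1 - (1 - p)^n
        double prob = 1.0 - std::pow(1.0 - p, static_cast<double>(n));
        std::cout << n << " lines -> P(at least one defect) = " << prob << "\n";
    }
    return 0;
}

At that rate, a 10,000-line utility has roughly a 63% chance of shipping with at least one defect, and a million-line product is effectively certain to.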
Everyone wants behemoth-sized, gadget-packed goods rather than efficient, simple wares.
Seems like more of a status symbol than a practicality. Get XP and brag in your hood today!
Storage was a tape recorder.
It had many useful (at the time) programs, and using BASIC I was able to write my own, well within the limits of the machine.
However, I wanted more. Each new machine gave me more hard drive space. When I had my first computer with a 4 MB hard drive, I wondered how I would ever fill the space. Well, I did. My previous computer had a 9 GB hard drive, and when I got it I did not think it would be possible to ever fill the space. Well, I did.
My current computer has 80 GB, and I no longer wonder if I will ever fill the space, but how long it will be before I have to start deleting files.
I am looking forward to my first "80 Terabyte" hard drive.
The point is, just as any project expands to fill the time allotted to it, any hard drive, no matter how big, will get filled. As long as the price of hard drives keeps falling as their capacity increases, I no longer worry about bloatware.
:-}
#include "iostream.h"
int main(int argc,char **argv) { cout << "Hello world"; }
The executable is 8800 bytes in Solaris 2.6. However, most of that is part of the necessary overhead to communicate with the OS via the linked-in library. If you wrote all the library code yourself, you'd be wasting a lot of time and introducing unnecessary bugs.
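If you want to see where those bytes go, here's a quick check (a sketch, assuming a Unix-like system with g++ and the usual binutils; Solaris's native CC behaves much the same, and exact numbers vary by platform):

g++ hello.cpp -o hello   # build it
ls -l hello              # total bytes on disk
size hello               # text/data/bss breakdown
strip hello              # drop symbol tables; the file shrinks, behavior doesn't change
ldd hello                # the shared libraries doing the real OS-facing work

Most of the on-disk weight turns out to be startup scaffolding, symbol tables, and references into libraries you didn't write.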
The whole point of OO is really to optimize the efficiency of writing code and the likelihood that it will work correctly. If you write clean C++, the executables might be somewhat larger, but they will run very fast.
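To make "clean C++" concrete, a minimal sketch (the names are invented for illustration): callers get written once against an abstract interface and reused for every concrete type.

#include <iostream>

// One abstract interface, many implementations. report() is written
// once and works for every Shape; the cost is a vtable pointer per
// object and one indirect call per invocation.
class Shape {
public:
    virtual ~Shape() {}
    virtual double area() const = 0;
};

class Circle : public Shape {
public:
    explicit Circle(double r) : r_(r) {}
    double area() const { return 3.141592653589793 * r_ * r_; }
private:
    double r_;
};

class Square : public Shape {
public:
    explicit Square(double s) : s_(s) {}
    double area() const { return s_ * s_; }
private:
    double s_;
};

void report(const Shape& shape) {
    std::cout << shape.area() << '\n';
}

int main() {
    Circle c(2.0);
    Square s(3.0);
    report(c);
    report(s);
    return 0;
}

The indirection costs a few bytes and a pointer chase; compilers are good at making the rest run as fast as the hand-rolled equivalent.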
Streaming video.
Quake III.
Video conferencing.
MP3s.
DVDs.
CD burning.
Video editing.
Pause/rewind live TV.
...and plenty more, all doable within minutes of unboxing a new PC.
Oh sure, there's plenty of bloat. You don't need a large percentage of what disk space gets allocated for. Pounding out a quick memo doesn't need more than the first word processor that ran on an Apple II. But...compare the size proportions between commonly used applications and the data files they handle. The music you're listening to is likely a 5MB .MP3 (out of a collection spanning gigabytes); the LotR trailer you'll watch during lunch is about 50MB; the Unreal Tournament session you'll play this evening chews up 500MB in maps & related data files ... and none of these apps could reasonably or conveniently be run on the relatively svelte machines of the past.
Complaints of "bloat" have appeared with every upgrade. Sure it all gets bigger...yet there's no question that there's more functionality in the straining load in my 700MHz, 20GB laptop than the PS/2 collecting dust in my brother's basement - that's WHY the older machines are mostly collecting dust: the "bloatware" machines actually do more.
I actually have one more thing to add to this discussion -- OO programming can also increase code size tremendously, in return for added flexibility, stability, and code reusability.
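One concrete mechanism in C++ (strictly speaking it's generic programming rather than OO proper, but it's the classic code-size example, and how much the binary actually grows depends on your compiler and linker): every template instantiation stamps out its own compiled copy of the code. A sketch:

#include <iostream>

// Each distinct T used below gets its own compiled copy of clamp()
// in the binary: three instantiations, three copies of the same logic.
template <typename T>
T clamp(T v, T lo, T hi) {
    if (v < lo) return lo;
    if (v > hi) return hi;
    return v;
}

int main() {
    std::cout << clamp(5, 0, 3) << '\n';           // instantiates clamp<int>
    std::cout << clamp(2.5, 0.0, 2.0) << '\n';     // instantiates clamp<double>
    std::cout << clamp(1.7f, 0.0f, 0.5f) << '\n';  // instantiates clamp<float>
    return 0;
}

Flexibility paid for in binary size - exactly the trade described above.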
Tweaking a program for optimal speed usually means doing serious damage to your OO architecture. Until about two or three years ago, good OO design was typically nonexistent. The new languages, Java and C#, are forcing developers to begin to understand and use solid OO design, which yields slightly slower code than purely hand-optimized code. But the benefits are through the roof!
So I would say that, in and of itself, code size isn't a very useful way to measure software quality.
Features, ease of use, stability, flexibility, scalability, solid componentized architecture . . . these are the 'measurements' of software quality.
If you just apply these to MS software, I think you'll find the *real* proof that their software is low quality.
But just being 'big' doesn't necessarily mean 'bad'. In good software, 'bigger' should mean more functional.
Use vi...it works fine </grin>
Many people might not remember this, but one of the main arguments for graphical interfaces was that, with all that code built into the OS, programs would share & re-use code so effectively that they would GET SMALLER!
Hmmm. Didn't seem to happen.
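For the record, here's how to see what the shared code actually saves any one program (a sketch, assuming g++ on a Unix-like box; exact sizes vary widely):

g++ hello.cpp -o hello_dynamic           # system libraries stay out in shared objects
g++ -static hello.cpp -o hello_static    # copies of those libraries folded into the binary
ls -l hello_dynamic hello_static         # the static build is typically many times larger

The per-program savings are real - the total footprint grew anyway.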
Mark W.
Serious question for you. Do you sell or market Microsoft products?
I use the same word processor program I used ten years ago. The new machine is ten times faster and has ten times the memory and storage capacity of the old machine.
However, the new machine does not feel ten times faster, or ten times better. While I can certainly pick up 'obsolete' machines on the cheap, no one is making new machines that can do what my old one did for one-tenth the cost. Indeed, to do exactly what I do now with my current machine, I'd have to spend just as much -- though two or three Moore generations have passed since I bought my last machine.
Is this 'running twice as fast to stay in the same place' what Moore had in mind?
Just because you don't use new capabilities doesn't mean new machines aren't more capable.