Free Republic
Rethinking software bloat.
InformationWeek.com ^ | 12/17/01 | Fred Langa

Posted on 12/17/2001 4:33:52 AM PST by damnlimey

Rethinking 'Software Bloat'

Fred Langa takes a trip into his software archives and finds some surprises--at two orders of magnitude.
By Fred Langa

 
Reader Randy King recently performed an unusual experiment that provided some really good end-of-the-year food for thought:
I have an old Gateway here (120 MHz, 32 Mbytes RAM) that I "beefed up" to 128 Mbytes and loaded with--get ready--Win 95 OSR2. OMIGOD! This thing screams. I was in tears laughing at how darn fast that old operating system is. When you really look at it, there's not a whole lot missing from later operating systems that you can't add through some free or low-cost tools (such as an Advanced Launcher toolbar). Of course, Win95 is years before all the slop and bloat was added. I am saddened that more engineering for good solutions isn't performed in Redmond. Instead, it seems to be a "code fast, make it work, hardware will catch up with anything we do" mentality.
It was interesting to read about Randy's experiment, but it started an itch somewhere in the back of my mind. Something about it nagged at me, and I concluded there might be more to this than meets the eye. So, in search of an answer, I went digging in the closet where I store old software.

Factors Of 100
It took some rummaging, but there in a dusty 5.25" floppy tray was my set of install floppies for the first truly successful version of Windows--Windows 3.0--from more than a decade ago.

When Windows 3.0 shipped, systems typically ran at around 25 MHz. Consider that today's top-of-the-line systems run at about 2 GHz. That's two orders of magnitude--100 times--faster.

But today's software doesn't feel 100 times faster. Some things are faster than I remember in Windows 3.0, yes, but little (if anything) in the routine operations seems to echo the speed gains of the underlying hardware. Why?

The answer--on the surface, no surprise--is in the size and complexity of the software. The complete Windows 3.0 operating system was a little less than 5 Mbytes total; it fit on four 1.2-Mbyte floppies. Compare that to current software. Today's Windows XP Professional comes on a setup CD filled with roughly 100 times as much code, a little less than 500 Mbytes total.

That's an amazing symmetry. Today, we have a new operating system with roughly 100 times as much code as a decade ago, running on systems roughly 100 times as fast as a decade ago.
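As a quick sanity check, the ratios behind that symmetry can be computed from the article's own round numbers (sketched here in Python; the figures are the ones quoted above, not independent benchmarks):

```python
# The article's round numbers, not independent measurements.
clock_1990_mhz = 25      # typical system when Windows 3.0 shipped
clock_2001_mhz = 2000    # ~2 GHz top-of-the-line system today

size_win30_mb = 5        # Windows 3.0: just under 5 Mbytes on floppies
size_winxp_mb = 500      # Windows XP Pro: just under 500 Mbytes on CD

speed_ratio = clock_2001_mhz / clock_1990_mhz
size_ratio = size_winxp_mb / size_win30_mb

print(speed_ratio)  # 80.0 -- "roughly 100 times" as fast
print(size_ratio)   # 100.0 -- 100 times as much code
```

Strictly, 25 MHz to 2 GHz is a factor of 80, not 100, which is why the article hedges with "roughly"; the code-size ratio is the one that lands exactly on 100.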

By themselves, those "factors of 100" are worthy of note, but they raise the question: Are we 100 times more productive than a decade ago? Are our systems 100 times more stable? Are we 100 times better off?

While I believe that today's software is indeed better than that of a decade ago, I can't see how it's anywhere near 100 times better. Mostly, that two-orders-of-magnitude increase in code quantity is not matched by anything close to an equal increase in code quality. And software growth without obvious benefit is the very definition of "code bloat."

What's Behind Today's Bloated Code?
Some of the bloat we commonly see in today's software is, no doubt, due to the tools used to create it. For example, a decade ago, low-level assembly-language programming was far more common. Assembly-language code is compact and blazingly fast, but is hard to produce, is tightly tied to specific platforms, is difficult to debug, and isn't well suited for very large projects. All those factors contribute to the reason why assembly language programs--and programmers--are relatively scarce these days.

Instead, most of today's software is produced with high-level programming languages that often include code-automation tools, debugging routines, the ability to support projects of arbitrary scale, and so on. These tools can add an astonishing amount of baggage to the final code.

This real-life example from the Association for Computing Machinery clearly shows the effects of bloat: A simple "Hello, World" program written in assembly comprises just 408 bytes. But the same "Hello, World" program written in Visual C++ takes fully 10,369 bytes--that's 25 times as much code! (For many more examples, see http://www.latech.edu/~acm/HelloWorld.shtml. Or, for a more humorous but less-accurate look at the same phenomenon, see http://www.infiltec.com/j-h-wrld.htm. And, if you want to dive into Assembly language programming in any depth, you'll find this list of links helpful.)
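The 25x figure follows directly from the two byte counts (a minimal check, using only the numbers quoted above):

```python
# Byte counts quoted from the ACM "Hello, World" comparison.
asm_bytes = 408        # hand-written assembly-language executable
vcpp_bytes = 10369     # same program built with Visual C++

bloat_factor = vcpp_bytes / asm_bytes
print(round(bloat_factor))  # 25 -- "25 times as much code"
```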

Human skill also affects bloat. Programming is wonderfully open-ended, with a multitude of ways to accomplish any given task. All the programming solutions may work, but some are far more efficient than others. A true master programmer may be able to accomplish in a couple lines of Zen-pure code what a less-skillful programmer might take dozens of lines to do. But true master programmers are also few and far between. The result is that code libraries get loaded with routines that work, but are less than optimal. The software produced with these libraries then institutionalizes and propagates these inefficiencies.
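As a hypothetical illustration of that point (not an example from the article), here are two equally correct Python routines for the same small task; one leans on the language, the other spells everything out:

```python
def sum_evens_concise(numbers):
    """The compact version: one line, relying on a generator expression."""
    return sum(n for n in numbers if n % 2 == 0)

def sum_evens_verbose(numbers):
    """The same result, written the long way with manual indexing."""
    total = 0
    index = 0
    while index < len(numbers):
        value = numbers[index]
        remainder = value % 2
        if remainder == 0:
            total = total + value
        index = index + 1
    return total

data = [1, 2, 3, 4, 5, 6]
print(sum_evens_concise(data), sum_evens_verbose(data))  # 12 12
```

Both work, and in a library only the interface is visible, so the verbose version can quietly propagate through everything built on top of it.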

You And I Are To Blame, Too!
All the above reasons matter, but I suspect that "featuritis"--the tendency to add feature after feature with each new software release--probably has more to do with code bloat than any other single factor. And it's hard to pin the blame for this entirely on the software vendors.

Take Windows. That lean 5-Mbyte version of Windows 3.0 was small, all right, but it couldn't even play a CD without add-on third-party software. Today's Windows can play data and music CDs, and even burn new ones. Windows 3.0 could only make primitive noises (bleeps and bloops) through the system speaker; today's Windows handles all manner of audio and video with relative ease. Early Windows had no built-in networking support; today's version natively supports a wide range of networking types and protocols. These--and many more built-in tools and capabilities we've come to expect--all help bulk up the operating system.

What's more, as each version of Windows gained new features, we insisted that it also retain compatibility with most of the hardware and software that had gone before. This never-ending aggregation of new code atop old eventually resulted in Windows 98, by far the most generally compatible operating system ever--able to run a huge range of software on a vast array of hardware. But what Windows 98 delivered in utility and compatibility came at the expense of simplicity, efficiency, and stability.

It's not just Windows. No operating system is immune to this kind of featuritis. Take Linux, for example. Although Linux can do more with less hardware than can Windows, a full-blown, general-purpose Linux workstation installation (complete with graphical interface and an array of the same kinds of tools and features that we've come to expect on our desktops) is hardly what you'd call "svelte." The current mainstream Red Hat 7.2 distribution, for example, calls for 64 Mbytes of RAM and 1.5-2 Gbytes of disk space, which also happens to be the rock-bottom minimum requirement for Windows XP. Other Linux distributions ship on as many as seven CDs. That's right: Seven! If that's not rampant featuritis, I don't know what is.

Is The Future Fat Or Lean?
So: Some of what we see in today's huge software packages is indeed simple code bloat, and some of it also is the bundling of the features that we want on our desktops. I don't see the latter changing any time soon. We want the features and conveniences to which we've become accustomed.

But there are signs that we may have reached some kind of plateau with the simpler forms of code bloat. For example, with Windows XP, Microsoft has abandoned portions of its legacy support. With fewer variables to contend with, the result is a more stable, reliable operating system. And over time, with fewer and fewer legacy products to support, there's at least the potential for Windows bloat to slow or even stop.

Linux tends to be self-correcting. If code-bloat becomes an issue within the Linux community, someone will develop some kind of a "skinny penguin" distribution that will pare away the needless code. (Indeed, there already are special-purpose Linux distributions that fit on just a floppy or two.)

While it's way too soon to declare that we've seen the end of code bloat, I believe the signs are hopeful. Maybe, just maybe, the "code fast, make it work, hardware will catch up" mentality will die out, and our hardware can finally get ahead of the curve. Maybe, just maybe, software inefficiency won't consume the next couple orders of magnitude of hardware horsepower.

What's your take? What's the worst example of bloat you know of? Are any companies producing lean, tight code anymore? Do you think code bloat is the result of the forces Fred outlines, or is it more a matter of institutional sloppiness on the part of Microsoft and other software vendors? Do you think code bloat will reach a plateau, or will it continue indefinitely? Join in the discussion!



TOPICS: Editorial; Miscellaneous
KEYWORDS:
More fodder for all the MS haters out there, but it does raise an interesting question: What do you think are the real reasons for "bloatware"?
1 posted on 12/17/2001 4:33:52 AM PST by damnlimey
[ Post Reply | Private Reply | View Replies]

To: damnlimey
Greed & laziness. Pure and simple.
2 posted on 12/17/2001 4:42:18 AM PST by Exnihilo
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
I think a lot of it has to do with people wanting the OS to do everything for them, in order to avoid having to do any detail work. In Win3.1, you had to go in and set drivers, and load drivers from manufacturers disks, set exactly how you wanted the hardware to perform. More and more, people are wanting to do Plug and Play, rather than get into the details of how a piece of hardware is supposed to work. That takes more of the coding as well. But I'm late for work (help desk is fun, help desk is fun...if I say it enough, I almost believe it....)
3 posted on 12/17/2001 4:42:21 AM PST by Tennessee_Bob
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
I'm an MS hater, but I must say that a 32-bit OS improves the hell out of networking with older programs. There are software areas where a GUI is just unsuitable for data entry, and until another level for that data entry is reached, the fastest networking and user entry is on Win9x with character-based programming. And customers can still have pretty graphical printouts and graphs through the "OS".

Course XP blows that away.

4 posted on 12/17/2001 4:43:50 AM PST by jammer
[ Post Reply | Private Reply | To 1 | View Replies]

To: jammer
Just for comparison, this site (Tiny Apps) shows just how small and tight code can get.
5 posted on 12/17/2001 4:51:47 AM PST by damnlimey
[ Post Reply | Private Reply | To 4 | View Replies]

To: damnlimey
More fodder for all the MS haters

This isn't more fodder. This is one of the reasons I am an MS hater. The other is that, up until Windows 2000, application software could crash and lock up the OS. That is inexcusable.

6 posted on 12/17/2001 4:54:25 AM PST by AndyJackson
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
I am not an MS "hater", but I can see the obvious too.
Perhaps this is a good time to ask the question that "intrigues" me.

A long time ago we had text-based BASIC, which allowed anyone to write great, fast, and useful programs to solve all types of problems, including astronomical, engineering, and technical ones.
HP BASIC in particular was extremely rich in commands that made almost anything easy to program and output.
Where did it go?
With today's machines, that interpreted language would be incredibly fast. And useful.

Why is there no current version?
(That I know of...)

7 posted on 12/17/2001 4:57:02 AM PST by Publius6961
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey;bush2000;innocentbystander
Single data point: I run NT 4.0 Server and XP Home on my machine -- a PPro 200, 64MB RAM. My wife runs 98, and 2000 Pro on her machine -- a PIII 933, 128MB RAM.

Frankly, there's darn little performance difference I can detect between the two machines. In fact, I suspect I could do a blind test with a dozen random users and have a 50-50 distribution of accurate guesses as to which was the "fast" machine and which was the "dog".

I used to think my 8 MHz "Turbo" XT w/640KB RAM was the bee's knees (compared to the 4 MHz (and slower!) 8 bit 64KB iron I'd cut my teeth on). Then I got my 10 (or was it 12?) MHz 286, and I was pickin' bugs from my teeth every time I got up from the keyboard. That sucker was fast.

Now, it seems that the iron has gotten so much faster than the apps that it's a "paper competition" with little real-world meaning for the vast majority of users.

Who needs the super iron? I see two classes of users, and for one class, the term "need" is applied in the loosest of all possible senses. The two classes are "gamers", and "network admins".

The only time I start feeling "cramped" on my machine is when I'm running multiple concurrent major apps, i.e., one or two instances of Visual Studio (running an app or two), SQL Server, IIS, and IE. IOW, when I'm doing that, I'm essentially running a whole network in one cramped little box. Most people don't do that.

To chime in on the author's theme, it wasn't that long ago (at least not at the rate that the years seem to keep peeling by at my age) that a 10 - 16 mhz 286-386 class machine, with 1-3MB of XMS memory was a high end network server, and cost a pile of money. Nowadays, we've got secretaries using machines that would have literally cost millions (and occupied rooms) a few years ago -- as glorified typewriters.

So, my two cents is that the "bloatware" thing is overblown. When 128 megs of RAM costs less than fifty bucks, and a 60 gig hard drive costs a buck a gig (I remember paying $275 for a 20 megabyte drive -- wholesale!), and no mix of OS and apps comes anywhere near taxing the capabilities for 99% of the users, "bloatware" is a non-issue.

8 posted on 12/17/2001 4:58:30 AM PST by Don Joe
[ Post Reply | Private Reply | To 1 | View Replies]

To: Bush2000; innocentbystander; SolitaryMan; Don Joe; lelio; Smogger; Dominic Harr; Rodney King...
Ping - let me know if you want to be added/removed from ping list!
9 posted on 12/17/2001 5:03:13 AM PST by stainlessbanner
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
You want small? We use the K language (a descendant of APL) from Kx Systems. (Total size < 200KB - that's right, kilobytes!) We've built a complete database management system that fits onto a single floppy!
10 posted on 12/17/2001 5:05:04 AM PST by ZeitgeistSurfer
[ Post Reply | Private Reply | To 5 | View Replies]

To: jammer
"There are software areas where a GUI is just unsuitable for data entry, and until another level for that data entry is reached, the fastest networking and user entry is on Win9x with a character based programming."

Two things: one, there's nothing preventing you from deploying character-mode apps to an NT/2K/XP platform, and two, if a GUI-based data entry app has worse usability than a character-mode counterpart, it's the programmer's fault, not the GUI's. Granted, too many people do little more than drag and drop textboxes and then bind them to fields, but that's their fault. I can drive my car into a brick wall. If I do, that's not an indictment of Toyota.

11 posted on 12/17/2001 5:07:31 AM PST by Don Joe
[ Post Reply | Private Reply | To 4 | View Replies]

To: damnlimey
Greed, laziness, ignorance, and stupidity.

For more than 10 years I have watched with amusement as PCs that are much faster than the old ones take LONGER to boot up -- typically, a 1.2 MHz PC XT vintage 1986 would be fully ready to go in a few seconds, and the current models are more than 1000 times as fast and take nearly 10 times as long because they are doing 10000 times as much computational work!

But the REAL problem with software bloat is not the slowness, it is the complexity which makes applications almost impossible to properly debug. NOBODY I know, and I know a LOT of computer types, makes any attempt to fix Microsoft-related errors themselves as they would with Unix or Linux, nor do they bother trying to get Microsoft to fix them because it just won't happen; instead they just shrug, reboot, and work around. A certain level of "Your program has performed an illegal operation and will be shut down" and a (lower) level of total freeze-ups and blue screens of death are simply accepted as a tolerable inconvenience.

But every single time this happens, there are one or more theoretically identifiable HUMANS who made specific MISTAKES that could be tracked down and blamed on them. The practical difficulties of this are sufficient that most of us are willing to simply let them be condemned to hand-simulate the infinite loops of their own programs in programmers' hell after they pass on.

12 posted on 12/17/2001 5:14:30 AM PST by VeritatisSplendor
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
Techno-Bump

This is an interesting "discovery." I guess the author is saying that the latest h/w resources allow the older os to provide better-than-ever performance because older os's don't have the unneeded overhead.

I hate unneeded overhead anyway.

Russ

13 posted on 12/17/2001 5:16:21 AM PST by kinsman redeemer
[ Post Reply | Private Reply | To 1 | View Replies]

To: ZeitgeistSurfer
--this software you have, this could be a forum style software as well, correct? If not please excuse, I am far from being any sort of alpha geek on these matters.
14 posted on 12/17/2001 5:17:43 AM PST by zog
[ Post Reply | Private Reply | To 10 | View Replies]

Comment #15 Removed by Moderator

To: damnlimey
Yeah, there's bloat all right...

But there NEEDS to be another Hardware solution...

Simultaneous calls and virtual multiple clocks or something...

Then it'll all work!

Something big, yes sir... that's it!

16 posted on 12/17/2001 5:35:58 AM PST by No!
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
bump
17 posted on 12/17/2001 5:43:51 AM PST by 74dodgedart
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
With 60G of hard drive space for less than $200 and 512 MB of RAM for about $100, who cares how large it is? I recently read they created a 180G hard drive. Storage is cheap! As long as a reasonably fast speed is still there (I don't recall ever waiting for the operating system software to perform any operation), who cares how big the code is?
18 posted on 12/17/2001 5:44:29 AM PST by SW6906
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
I used to hand code assembly language for the 6502 microprocessor (Apple II - 1 Mhz). That little thing was awesome, many instructions executing in a single clock or two, like an early RISC device. You could do amazing things. I graduated to Turbo Pascal on the early 4.77 Mhz IBM-PC. I used that with some inline assembly code, and that sh*t would fly!

Ahh, the good ole days..

19 posted on 12/17/2001 5:45:03 AM PST by Paradox
[ Post Reply | Private Reply | To 1 | View Replies]

To: SW6906
Let me correct that: I do wait about a minute or so for W2K to start up, but from then on, it's lightning fast (1.33G processor, 512MB RAM!).
20 posted on 12/17/2001 5:46:27 AM PST by SW6906
[ Post Reply | Private Reply | To 18 | View Replies]

To: SW6906
Bloatware could well be what is driving the rapid advance of today's hardware: more resource-hungry software calls for bigger, faster hardware, which in turn leaves the door open for developers to add more bells and whistles to their products, and on and on ad infitum.
Hey, who knows, maybe it's a good thing ;^)
21 posted on 12/17/2001 6:01:25 AM PST by damnlimey
[ Post Reply | Private Reply | To 18 | View Replies]

To: damnlimey
infitum = infinitum,wheres that take it back button when you need it?
22 posted on 12/17/2001 6:02:39 AM PST by damnlimey
[ Post Reply | Private Reply | To 21 | View Replies]

To: SW6906
With 60G of hard drive space for less than $200 and 512 MB of RAM for about $100, who cares how large it is?

I do. Smaller is better.

Every line of code has a finite chance of having a defect. Every defect has a finite chance of remaining undetected in the released product. Every released defect has a finite chance of causing unintended operation. But you and Mr. Gates apparently aren't too worried about that.
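The poster's argument can be sketched numerically. Assuming, purely for illustration, some fixed independent per-line defect probability (the value below is invented, not a measured figure), the chance of shipping at least one defect grows quickly with program size:

```python
def prob_at_least_one_defect(lines, p_per_line=0.0001):
    """Chance that at least one of `lines` independent lines is defective.

    p_per_line is an assumed illustrative value, not a real-world number.
    """
    return 1 - (1 - p_per_line) ** lines

# A small utility vs. an operating-system-sized codebase.
for n in (1_000, 100_000, 10_000_000):
    print(n, prob_at_least_one_defect(n))
```

Under these assumptions a 1,000-line program ships defect-free roughly 90% of the time, while a multimillion-line one is essentially certain to contain defects, which is exactly the "smaller is better" point.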

23 posted on 12/17/2001 6:06:22 AM PST by VoiceOfBruck
[ Post Reply | Private Reply | To 18 | View Replies]

To: stainlessbanner
I'll stay on the list, please.
24 posted on 12/17/2001 6:17:55 AM PST by rdb3
[ Post Reply | Private Reply | To 9 | View Replies]

To: damnlimey
Similar to driving an SUV versus a compact.

Everyone wants behemoth-sized and gadget-packed goods, rather than efficient and simple wares.

Seems like more of a status symbol than practicality. Get XP and brag in your hood today!


25 posted on 12/17/2001 6:24:55 AM PST by Rain-maker
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
Reason for bloat? Jamming programs with little used features and lazy programmers who would rather patch than fix.
26 posted on 12/17/2001 6:35:08 AM PST by Blood of Tyrants
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
My first computer was a Vic20 with 2 (that is two, like in one, two) K of memory, expandable to 16K

Storage was a tape recorder

It had many useful (at the time) programs, and using BASIC I was able to write my own, well within the limits of the machine.

However, I wanted more. Each new machine gave me more hard drive space. When I had my first computer with a 4 MB hard drive, I wondered how I would ever fill the space. Well, I did. My previous computer had a 9 GB hard drive, and when I got it I did not think it would be possible to ever fill the space. Well, I did.

My current computer has 80 GB, and I no longer wonder if I will ever fill the space, but how long it will be before I have to start deleting files.

I am looking forward to my first "80 Terabyte" hard drive.

The point is, just as any project expands to fill the time allotted to it, any hard drive, no matter how big, will get filled. As long as the price of hard drives falls as their size increases, I no longer worry about bloatware.

:-}

27 posted on 12/17/2001 6:42:37 AM PST by CIB-173RDABN
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
I tried it myself, and it's true in a way:

#include "iostream.h"

int main(int argc,char **argv) { cout << "Hello world"; }

The executable is 8800 bytes in Solaris 2.6. However, most of that is part of the necessary overhead to communicate with the OS via the linked-in library. If you wrote all the library code yourself, you'd be wasting a lot of time and introducing unnecessary bugs.

The whole point of OO is really to optimize the efficiency of writing code and the likelihood that it will work correctly. If you write clean C++, the executables might be somewhat larger, but they will run very fast.

28 posted on 12/17/2001 7:25:36 AM PST by proxy_user
[ Post Reply | Private Reply | To 1 | View Replies]

To: Rain-maker

29 posted on 12/17/2001 7:54:12 AM PST by Bush2000
[ Post Reply | Private Reply | To 25 | View Replies]

To: damnlimey
This article is tripe. The fact of the matter is that hardware prices are constantly declining as products are replaced with better technology, so the cost per megabyte/gigabyte of storage is far, far less than in previous years. And frankly, it's unrealistic to think that Moore's Law applies to hardware. As hardware capabilities increase so, too, does software. And, really, when you're talking about less than 50 cents per megabyte of RAM and $2.65 per gigabyte of hard disk space, you're left wondering why the author has anything to complain about.
30 posted on 12/17/2001 8:04:58 AM PST by Bush2000
[ Post Reply | Private Reply | To 1 | View Replies]

To: Bush2000
That should read:

"And frankly, it's unrealistic to think that Moore's Law applies solely to hardware."
31 posted on 12/17/2001 8:06:05 AM PST by Bush2000
[ Post Reply | Private Reply | To 30 | View Replies]

To: damnlimey
For all the anti-bloatware talk about svelte OSs and applications, and the invariable comments about "I did/do just as much on my Vic20 or 486 w/Win3.1", there aren't enough comments about what was downright impossible with lesser, smaller systems.

Streaming video.
Quake III.
Video conferencing.
MP3s.
DVDs.
CD burning.
Video editing.
Pause/rewind live TV.
...and plenty more, all doable within minutes of unboxing a new PC.

Oh sure, there's plenty of bloat. You don't need a large percentage of what disk space gets allocated for. Pounding out a quick memo doesn't need more than the first word processor run on an Apple II. But...compare the size proportions between commonly used applications and the data files they handle. The music you're listening to is likely a 5MB .MP3 (out of a collection spanning gigabytes); the LotR trailer you'll watch during lunch is about 50MB; the Unreal Tournament session you'll play this evening chews up 500MB in maps & related data files ... and none of these apps could reasonably, conveniently be run on the relatively svelte machines of the past.

Complaints of "bloat" have appeared with every upgrade. Sure it all gets bigger...yet there's no question that there's more functionality in the straining load in my 700MHz, 20GB laptop than the PS/2 collecting dust in my brother's basement - that's WHY the older machines are mostly collecting dust: the "bloatware" machines actually do more.

32 posted on 12/17/2001 8:25:33 AM PST by ctdonath2
[ Post Reply | Private Reply | To 1 | View Replies]

To: stainlessbanner
Thanks for the heads up.

I actually have one more thing to add to this discussion -- OO programming also can increase code size tremendously, in return for adding flexibility, stability and code reusability.

Tweaking a program for optimal speed always means doing serious damage to your OO architecture. Until about 2 or 3 years ago, good OO design was typically non-existent. The new languages, Java and C#, are forcing developers to begin to understand and use solid OO design, which will be slightly slower than pure optimized code. But the benefits are thru the roof!

So I would say that, in and of itself, code size isn't a very useful way to measure software quality.

Features, ease of use, stability, flexibility, scalability, solid componentized architecture . . . these are the 'measurements' of software quality.

If you just apply these to MS software, I think you find the *real* proof that their software is low quality.

But just being 'big' doesn't necessarily mean 'bad'. In good software, 'bigger' should mean more functional.

33 posted on 12/17/2001 8:28:45 AM PST by Dominic Harr
[ Post Reply | Private Reply | To 9 | View Replies]

To: ctdonath2
Pounding out a quick memo

Use vi...it works fine < /grin>

34 posted on 12/17/2001 8:32:07 AM PST by stainlessbanner
[ Post Reply | Private Reply | To 32 | View Replies]

To: damnlimey
>What's Behind Today's Bloated Code?

Many people might not remember this, but one of the main reasons argued for graphical interfaces was that by building all that code into the OS, programs would be able to share & re-use code so effectively that programs would GET SMALLER!

Hmmm. Didn't seem to happen.

Mark W.

35 posted on 12/17/2001 8:32:37 AM PST by MarkWar
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
"What Grove giveth, Gates taketh away."
36 posted on 12/17/2001 8:36:13 AM PST by dighton
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
bump
37 posted on 12/17/2001 8:45:46 AM PST by Centurion2000
[ Post Reply | Private Reply | To 1 | View Replies]

To: Bush2000
This article is tripe. The fact of the matter is that hardware prices are constantly declining as products are replaced with better technology, so the cost per megabyte/gigabyte of storage is far, far less than in previous years. And frankly, it's unrealistic to think that Moore's Law applies to hardware. As hardware capabilities increase so, too, does software. And, really, when you're talking about less than 50 cents per megabyte of RAM and $2.65 per gigabyte of hard disk space, you're left wondering why the author has anything to complain about.

Serious question for you. Do you sell or market Microsoft products ?

38 posted on 12/17/2001 8:52:24 AM PST by Centurion2000
[ Post Reply | Private Reply | To 30 | View Replies]

To: damnlimey
The interesting thing about Moore's Law is that it seems to have been suspended by bloatware.

I use the same word processor program I used ten years ago. The new machine is ten times faster and has ten times the memory and storage capacity of the old machine.

However, the new machine does not feel ten times faster, or ten times better. While I can certainly pick up 'obsolete' machines on the cheap, no one is making new machines that can do what my old one did for one-tenth the cost. Indeed, to do exactly what I do now with my current machine, I'd have to spend just as much -- though two or three Moore generations have passed since I bought my last machine.

Is this 'running twice as fast to stay in the same place' what Moore had in mind?

39 posted on 12/17/2001 9:03:17 AM PST by JoeSchem
[ Post Reply | Private Reply | To 1 | View Replies]

To: JoeSchem
You're not staying in place with the new machine. If you really want to buy a machine to run your old word processor, drop by the local used-computer store and pick up a complete system for $50. The new machine "for the same price" will do far more, like video editing and VCD burning - not possible on your 10-year-old box.

Just because you don't use new capabilities doesn't mean new machines aren't more capable.

40 posted on 12/17/2001 9:14:11 AM PST by ctdonath2
[ Post Reply | Private Reply | To 39 | View Replies]

To: damnlimey
not necessarily anti-MS. ALL code becomes bloated in time - creeping featuritis, bells and whistles, cross-platform and hardware issues. I'm bumping to check out the Assembly language links for later :-)
41 posted on 12/17/2001 9:15:53 AM PST by fnord
[ Post Reply | Private Reply | To 1 | View Replies]

To: JoeSchem
(Typed faster than I think...)

They're not selling new machines at 1/10th the price because people don't want them. It's been tried. People want to play Napstered .MP3s and watch the latest LotR trailers and play Final Fantasy X. There isn't an adequate market for cheap machines running at a ten-year-old performance level. More to the point, there is a critical mass for productivity (CPU, keyboard, monitor, ...), and raising that productivity level by 10x costs far less than 10x the critical-mass cost: 1/10th the performance costs roughly 1/2 the price to make, so why not pay just twice as much to get ten times the performance? The extra power is only a fraction of the total cost; the machine still costs the same to assemble, advertise & ship.
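The price/performance claim can be put into numbers (the 1/2-price and 1/10th-performance figures are the poster's own rough estimates, in arbitrary units):

```python
# Poster's rough figures: the low-end machine delivers 1/10th the
# performance at 1/2 the price of the high-end machine.
cheap_price, cheap_perf = 1.0, 1.0
full_price, full_perf = 2.0, 10.0

cheap_value = cheap_perf / cheap_price   # performance per dollar
full_value = full_perf / full_price

print(full_value / cheap_value)  # 5.0 -- the fast machine is 5x the value
```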

42 posted on 12/17/2001 9:23:56 AM PST by ctdonath2
[ Post Reply | Private Reply | To 39 | View Replies]

To: damnlimey
Bump
43 posted on 12/17/2001 9:24:59 AM PST by Fiddlstix
[ Post Reply | Private Reply | To 1 | View Replies]

To: stainlessbanner
Remember, you can't write evil without vi.
44 posted on 12/17/2001 9:27:37 AM PST by ctdonath2
[ Post Reply | Private Reply | To 34 | View Replies]

To: damnlimey
But the same "Hello, World" program written in Visual C++ takes fully 10,369 bytes--that's 25 times as much code!
So 25x of the code bloat is something that I can't even control?
45 posted on 12/17/2001 9:28:36 AM PST by lelio
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
I still use (extensively) a DOS program called T/Master which does database, spreadsheets, word processing, telecommunications (async telecom stuff that has become largely irrelevant in the face of the Internet), graphing and the like. In fact, I believe that I owe a lot of the success I've had as a mainframe programmer/analyst to the ability of this tool to enable me to do stuff (analysis of test results, summarization of data, etc.) on my PC that I'd have to write mainframe programs to do if I didn't have it. It's not sold or maintained any more, and it's got some minor Y2K flaws, but it's still darn helpful. I only use Microsoft Word and Excel when I absolutely have to.
46 posted on 12/17/2001 9:28:56 AM PST by CubicleGuy
[ Post Reply | Private Reply | To 1 | View Replies]

To: damnlimey
What do you think are the real reasons for "Bloatware"

Shhh. It's good for my Intel stock.

47 posted on 12/17/2001 9:28:57 AM PST by RogueIsland
[ Post Reply | Private Reply | To 1 | View Replies]

To: ctdonath2
As mentioned to me by someone experienced with "downsizing code to fit in space program computers": "Dad, programmers now are just lazy because they CAN be. Fast processors + tons of CHEAP memory = no motivation to write efficient code". Much different from when Dad did machine language for MC6800 chip....
48 posted on 12/17/2001 9:32:46 AM PST by Johnny Crab
[ Post Reply | Private Reply | To 44 | View Replies]

To: VoiceOfBruck
Smaller is better. Every line of code has a finite chance of having a defect. Every defect has a finite chance of remaining undetected in the released product. Every released defect has a finite chance of causing unintended operation.

Yep. Well said.

49 posted on 12/17/2001 9:37:13 AM PST by Jefferson Adams
[ Post Reply | Private Reply | To 23 | View Replies]

To: damnlimey
Blah blah blah.. I don't know what you cats are programming (or if you are programming at all) but today's software environment requires what we like to call Rapid Application Development (RAD). Which means we cannot sit around in a room tinkering in Assembly all day long working on one function. You want to know the reason why software is bloated? Because development in high-level languages is 1000 times faster than development in lower-level languages.

If you develop a fantastic piece of software in 2 years using assembly and I develop a competing piece of software in overbloated VB, but I release my working program in 2 months, guess who captures the market?

I don't know about the rest of you, but the software I am developing requires RAD. It is much cheaper to throw some hardware at the problem after the fact than to pay dozens more programmers to sit around and write the app in assembly (or in my case even C++).

50 posted on 12/17/2001 9:52:20 AM PST by Smogger
[ Post Reply | Private Reply | To 1 | View Replies]

