I believe the "why" of that is that programmers are getting lazy, and their management (and their customers) aren't calling them to account. Either that, or programmers are being denied the time to practice their art properly to produce a quality product.
Common excuses for this sorry state of affairs include:
Well, Random Access Memory (RAM) is relatively cheap, and it keeps getting cheaper all the time - another consequence of Moore's Law. At any given moment, however, RAM is very dear - you have only so much of it, and it must be managed carefully. If a program that you want to run requires more RAM than you have, well, you lose.
The real problem is that the people who write software (the programmers, or coders, or software engineers - whatever phrase you want to use) have a terrible productivity problem: writing software well is a slow, painstaking process. Anything you can do to speed it up can be a really big win. This is why such people often have the biggest (in capacity) and fastest computers at their disposal.
Unfortunately, this colors their view of the world: they come to believe that everyone has big, fast machines on their desks (or the whizziest, latest, most powerful portable computer). So they code their programs to run acceptably on the hardware they have, without optimizing them further to perform acceptably on smaller, slower machines. End result: software whose performance they consider acceptable (on their hardware) is unacceptable on the more commonly found hardware that everyone else uses.
Whenever a program increases in size, the question that must be asked is, "Is this increase in size proportional to an increase in function or speed?" Mostly, I'd argue, the answer is, "No."
A perfect case in point is Netscape Navigator, which started out life as a simple web browser but now includes a Netnews reader and an Internet E-mail client. While these are related functions, and grouping them in one program makes some sense, Netscape Navigator is now ten times its original size on the IBM PC and Macintosh! This is a perfect example of "software bloat", where the increase in size is far out of proportion to the increase in function, and one which Netscape's customers should not stand for.
Hey, if your program isn't fast enough, just wait a while - the aforementioned Moore's Law will take care of your performance problems for you!
Well, not really:
This takes time to happen. Years, usually.
If you can optimize performance for the hardware you have, instead of for the mythical hardware of the future, think of how much faster your program will be when that future hardware actually arrives!
The smaller a computer you can get your program to run acceptably on, the bigger your potential market, and the cheaper it will be to run the program.
There is an old saying in the computer industry:
"The hardware giveth, and the software taketh away."
As fast as the hardware guys give us faster computers, the software guys seem to be even faster at dreaming up ways to use all that new performance, and then some.
Some of this is legitimate (the computer graphics people, for example, will never be satisfied - they'll always want more cycles to render their graphics in ever greater detail & resolution), but most of it is sheer laziness on the part of the people writing software - not using efficient algorithms and data structures when they're known to exist (and implementations of them are readily available).
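To make that concrete, here is a minimal sketch in C (the table sizes and names are mine, purely for illustration) of the difference between the lazy approach and the efficient one. It looks up a batch of keys in a table two ways: a naive linear scan per lookup, and a sort followed by binary searches - using the qsort() and bsearch() routines that already ship in the standard C library, so the efficient version costs the programmer almost nothing.

#include <stdio.h>
#include <stdlib.h>

/* Three-way comparison for qsort()/bsearch(). */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* The lazy way: scan the whole table for every query - O(n) apiece. */
static int linear_member(const int *t, size_t n, int key)
{
    size_t i;
    for (i = 0; i < n; i++)
        if (t[i] == key)
            return 1;
    return 0;
}

int main(void)
{
    enum { N = 5000, M = 5000 };   /* hypothetical sizes, for illustration */
    static int table[N], query[M];
    long slow_hits = 0, fast_hits = 0;
    size_t i;

    for (i = 0; i < N; i++) table[i] = rand() % (4 * N);
    for (i = 0; i < M; i++) query[i] = rand() % (4 * N);

    /* Naive: O(n*m) comparisons - tolerable in a demo, ruinous at scale. */
    for (i = 0; i < M; i++)
        slow_hits += linear_member(table, N, query[i]);

    /* Efficient: sort once, then binary-search each query -
     * O((n+m) log n) total, with no new algorithm code to write. */
    qsort(table, N, sizeof table[0], cmp_int);
    for (i = 0; i < M; i++)
        if (bsearch(&query[i], table, N, sizeof table[0], cmp_int))
            fast_hits++;

    printf("hits: naive=%ld, bsearch=%ld (the counts should match)\n",
           slow_hits, fast_hits);
    return 0;
}

At these toy sizes both versions finish in the blink of an eye; push n and m into the millions and the naive loop becomes the difference between a responsive program and an unusable one. The efficient version was no harder to write - the library did the work.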
It should not be thus. Software should always be written to take maximum advantage of the hardware it runs on, and to use the most efficient algorithms and data structures for the task at hand.