As we outlined in the previous post, the memory size and computing power available to the average programmer have increased thousands of times since the early years of the craft, at least within the boundaries of a single machine (the network remains a common bottleneck).
This performance improvement has radically changed both the style in which we write software and the process by which we develop it.
The first notable change is the progressive introduction of higher-level programming languages. C stays close to the raw metal of the machine, but portable languages such as Java and Python have been built on top of it. They are still third-generation languages, yet they sacrifice performance for portability by running on a virtual machine and an interpreter, respectively.
This is a general trait of higher-level languages: trading machine time to save developer time, which, thanks to hardware improvements, is now the most expensive resource. At the time of its public release in 1995, Java applications were considered slow programs with an extensive memory footprint (with reason). Today, however, this is no longer significant for a vast set of applications, and the same is true for other high-level languages like Python and PHP. Premature optimization is now the evil, not the Java Virtual Machine.
Some people say that software bloats faster than Moore's law can help it: we went to the Moon in 1969 with 4 kilobytes of RAM, while now we need 100-200 MB just to run an operating system. But the features and power of our machines are now much greater (even after discounting the share consumed by software's bloated parts), and we can do things that were only dreams in the '60s.
Continuous integration of software projects and immediate feedback via tests are two practices that derive from the large amount of computing power now available. Donald Knuth is a magician of algorithms, but back in the 1960s he had to write a program by hand during the day and let the machine compile it at night. Now the whole process of building and testing a moderately sized program runs in minutes after every code check-in. Algorithms used to be proved on paper; now they are also tested on large datasets.
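To make that feedback loop concrete, here is a minimal sketch, using Python's standard unittest module, of the kind of test that modern hardware lets us run on every check-in; the function and test names are purely illustrative and not taken from any real project.

    # A tiny, fast unit test: the kind of check that used to cost an
    # overnight compile run and now costs milliseconds.
    import unittest


    def reverse_words(sentence):
        """Reverse the order of the words in a sentence."""
        return " ".join(reversed(sentence.split()))


    class ReverseWordsTest(unittest.TestCase):
        def test_reverses_word_order(self):
            self.assertEqual("world hello", reverse_words("hello world"))

        def test_single_word_is_unchanged(self):
            self.assertEqual("hello", reverse_words("hello"))


    if __name__ == "__main__":
        # A suite of thousands of tests like these runs in seconds on
        # today's hardware, cheap enough to execute at every commit.
        unittest.main()

A continuous integration server does nothing more exotic than running such suites automatically after each push and reporting the result within minutes.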
Object-oriented programming is fundamentally less performant than "classical" structured programming, because every object, even a wrapped integer, carries overhead such as a pointer to its virtual methods. But it lets you have a real domain model in an application, where different entities that would otherwise both be represented as plain integers cannot be mixed up. Few of us would start a serious enterprise application without this paradigm available: once again, hardware improvements have made it possible to simplify the programmer's life, even if some bloat is introduced along the way.
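As an illustration, here is a minimal sketch of such "wrapped integers" as small value objects; CustomerId, ProductId and ship_order are hypothetical names invented for this example, not part of any particular framework.

    # Wrapping bare integers in tiny objects costs some memory and
    # indirection, but keeps distinct domain concepts from being mixed up.
    from dataclasses import dataclass


    @dataclass(frozen=True)
    class CustomerId:
        value: int


    @dataclass(frozen=True)
    class ProductId:
        value: int


    def ship_order(product: ProductId, customer: CustomerId) -> str:
        return f"shipping product {product.value} to customer {customer.value}"


    print(ship_order(ProductId(42), CustomerId(7)))
    # ship_order(CustomerId(7), ProductId(42))
    # A static type checker flags the swapped arguments above; with two
    # bare integers the mistake would pass silently.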
And the list goes on: every best-practices post you can find today (mine included) is in part made possible by the continuous hardware improvements that have occurred, and that is a good thing, because it means we are leveraging the machines' power. Meaningful naming for entities? Try it with an 8-character limit on identifiers (a small before/after is sketched below). Iterative development and refactoring? Made possible by the insulation layers between components, which are themselves a form of "bloat". Distributed version control? Thank you, cheap space on hard disks.
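On the naming point, a hypothetical before/after: the first function squeezes its identifiers into an 8-character budget, the second spends a few extra bytes to say what it means.

    # Identifiers trimmed to fit an 8-character limit...
    def calcttl(itms, dsc):
        return sum(itms) * (1 - dsc)


    # ...versus names whose extra length costs nothing meaningful today.
    def calculate_discounted_total(item_prices, discount_rate):
        return sum(item_prices) * (1 - discount_rate)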
Moore's law won't save anyone who introduces bloat into software. But it makes new programming practices feasible, borrowing the resources they need directly from the continuous hardware improvements.