Technology doesn’t stop

I recently stumbled upon a paper I wrote 10 years ago on computer technology (server configurations), in which we highlighted how much 4GB of internal memory represented… at the time. 4GB was a big deal back in 2001.

Out of curiosity I went to Dell’s web site to see the current upper limit for a server: the figure has gone up to 2 terabytes (PowerEdge R910). That’s roughly a 500x increase in memory capacity in just 10 years. My 14-inch laptop has 8GB of memory, and I can carry it with one hand, while the 2001 server model took two people to lift.

Who knows what a computer will look like 10 years from now. Why would an application need 400GB of internal memory on a laptop? We don’t even know what will run locally and what will run remotely by then. Likewise, back in 2001 one would have wondered why a server would ever need 2 terabytes of internal memory. Of course this is Moore’s Law at work, but the point is that we keep finding applications that actually consume these exponentially growing resources.
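As a back-of-the-envelope check on that growth rate, here is a small Python sketch, using only the 4GB and 2TB figures mentioned above, that works out how often maximum server memory would have had to double over the decade:

    import math

    # Figures from the post: 4 GB in 2001, 2 TB (2048 GB) in 2011.
    start_gb, end_gb, years = 4, 2048, 10

    growth = end_gb / start_gb                    # 512x, the "roughly 500x"
    doublings = math.log2(growth)                 # 9 doublings
    months_per_doubling = years * 12 / doublings  # ~13.3 months

    print(f"{growth:.0f}x growth = {doublings:.0f} doublings, "
          f"one every {months_per_doubling:.1f} months")

Nine doublings in ten years means memory capacity doubled about every 13 months, noticeably faster than the 18 to 24 month doubling usually quoted for Moore’s Law.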

On a related note, Monty Widenius (the creator of MySQL) gave an interview in which he and other database experts are starting to scratch their heads over how to manage petabyte-scale databases. Time to re-invent the distributed database?
