I went to college between 1991 and 1995. I’ve got a Computer Science degree. That degree was new at my university, so a lot of what we were learning was cutting edge, because the curriculum had been set only 3-4 years before. At the time, the Web barely existed; it had been invented a couple of years earlier, but it only became popular around 1995. The building blocks of the Web were HTML and web servers (HTTP). I learned nothing about either in college.
Cloud computing didn’t exist when I went to college either. The standard practice for client-server applications (that’s what we called software in the ‘cloud’ back then) was to have one very big server, so there was no concept of scaling out across multiple servers, or of network architectures where servers take on specialized roles. Your one big server was everything.
Digital video encoding was impractical back then, so I learned nothing about it. Not that I need to know how to write my own video codec, but I use YouTube/Vimeo embeds and a Flash player, and all of that was enabled by codec and streaming technology that didn’t exist in 1995.
Mobile development didn’t exist either. You were lucky (or rich) if you had a cell phone back then, and those were very dumb devices. In fact, the whole concept of mobile applications only started to appear circa 2000, when Microsoft launched Windows CE 3.0 with several partners. As soon as I got one of those devices, I wrote an app for it (an English-Portuguese dictionary).
Databases were also very simple back then. Of course those databases offered ACID guarantees (atomicity, consistency, isolation, durability), but they were just tables and indexes; think of dBase IV, if you remember it. Now we have an immense range of storage technologies, in hardware as well as software: from RAID to external drives, from cloud-based relational databases to NoSQL.
Was it a waste of my time?
As I start a new project (EveryMove), I have to wonder how much I need to learn to keep up with new technologies, methodologies, tools and processes. It’s an immense effort, and unless you are in constant learning mode, it’s very easy to fall behind. If you took a four-year break to go build a restaurant and then decided to come back, you’d be shocked at how much changes in such a short period of time.
I’ve already written that Computer Science schools have a hard time keeping their curricula up to date with new technologies.
I think there are two things I learned while getting my Computer Science degree that have stayed with me ever since and have proved extremely valuable. The first is the Computer Science fundamentals: data structures, algorithms, compression, I/O concepts, networking principles, security, and so on. Those things are like laws of physics; they don’t change, and once “discovered” they become part of our shared knowledge. It turns out that most of them date back to the 1940s and 1950s, well before computers were built with microchips. A person going through a Computer Science program today will learn exactly the same things I did back in the ’90s, and a person going to college in 2030 will learn them too. But that’s not enough, which brings me to…
The second thing I took away from college was the need to be self-taught and to realize you have to learn dozens of new things every year. Unlike other degrees such as English, History, Law, Civil Engineering, Biology, Dentistry, Medicine and many others, in Computer Science what you learn is not necessarily augmenting your previous learning but replacing it. That means there is no point in your career at which you can consider yourself a big shot because you know more than everyone else. Just take a 15-day vacation in Europe and you’ll fall behind!