
Has computing technology advanced in the last ten years?

Asked by PhiNotPi (12686 points) June 9th, 2012

Every once in a while, a new technology allows computers to improve dramatically in speed, processing power, size (that is, to shrink), or memory. Examples include the invention of the vacuum tube, the transistor, RAM, and integrated circuits, among others.

In the last decade, I think there has been a lack of significant progress in computers, specifically in hardware. The increased power of the modern PC is not due to some revolutionary breakthrough; it is simply because more transistors have been added. The transistors themselves have merely been shrunk down, without really changing how they operate. Intel is still making new processors based on the 8086 instruction set, which dates back to 1978.

So, what sort of major innovations have occurred in the past decade? Do you believe that we have made as much progress between 1978 and 2012 as between 1944 and 1978 (equal time spans)?


10 Answers

jrpowell:

I have said for a long time that the hard drive was the big bottleneck. And the SSD seems to have proved me right.

Right now I am running iTunes, typing this, and have about 10 other apps open (including Photoshop and Cinema 4D), and my CPU is pretty much idling. For 95% of people, CPUs are fast enough.

Everyone would see a big boost from an SSD.

gorillapaws:

I think most of the recent advancements have been on the software side of things. Because CPUs are hitting hard physical limitations due to heat and the melting point of silicon, machines are becoming increasingly parallel. Languages and frameworks are now being designed so that programmers can write concurrent code much more easily and safely than before.
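
A minimal sketch of what that looks like in practice, using Python's standard concurrent.futures module as just one example of such a framework (the workload below is made up for illustration):

```python
from concurrent.futures import ProcessPoolExecutor

def heavy_work(n):
    """Stand-in for a CPU-bound task (hypothetical workload)."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [1_000_000] * 8
    # The executor spreads the tasks across CPU cores; the programmer never
    # touches threads, locks, or shared memory directly.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(heavy_work, inputs))
    print(results[:2])
```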

DeanV:

Absolutely. I'll put this into context with game consoles, not because I think they've advanced more in the past 10 years than anything else, but because I think I have a decent example relating to them.

I bought a PlayStation 2 about 8 years ago (2004) with a 32 MB memory card for about $250. Two months ago I bought a PlayStation 3 with a 320 GB hard drive for $250. That's about a 10,000-fold increase in storage capacity in less than 8 years, and the PS3 was actually released all the way back in 2006. If that's not an advancement I don't know what is, and it's not even something commonly considered a computer, even if the insides are about the same.
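
A quick sanity check on that figure, using decimal units and the rough numbers given above:

```python
# Rough check of the storage jump: a 32 MB memory card vs. a 320 GB hard drive
ps2_card_bytes = 32 * 10**6      # 32 MB (decimal megabytes)
ps3_hdd_bytes = 320 * 10**9      # 320 GB (decimal gigabytes)
print(ps3_hdd_bytes // ps2_card_bytes)   # -> 10000
```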

lillycoyote:

This is absolutely the most important piece of computer technology to be developed in the past 30 years, in my opinion. Forget processor speed, memory, and "smart" devices; forget touchscreen tablets, computer-aided prosthetic body parts, and AI of any kind; none of it matters anymore because of this:

Kinsight uses Kinect sensor to find lost keys and wallets

I simply cannot wait until this technology becomes commercially available. All of you can have the rest of it. I just want this.

Hours and hours and hours lost, possibly years cumulatively, spent looking for keys, wallets, glasses, purses, two shoes that match, etc. ... all of that will end when I can get Kinsight Kinect sensors and attach them to anything and everything that isn't nailed down.

Hallelujah and Praise the Lord! This is the technology I have been waiting for all my life!

:-)

LostInParadise:

I think you are right. We are still using the same von Neumann architecture. The same holds true for software. The last major change was object-oriented languages, and those have been around for decades. There have been some recent developments in functional programming, but the basic idea behind it, in LISP, is also quite old. Quantum computing, if and when it is realized, promises to be a major game changer.
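
As a tiny illustration of the functional style being referred to (written here in Python purely for concreteness; the underlying idea, as noted, goes back to LISP):

```python
from functools import reduce

numbers = [1, 2, 3, 4, 5]

# Functions are passed around as values and data is transformed rather than
# mutated in place, which is the core of the functional style.
squares = list(map(lambda n: n * n, numbers))         # [1, 4, 9, 16, 25]
evens = list(filter(lambda n: n % 2 == 0, numbers))   # [2, 4]
total = reduce(lambda a, b: a + b, squares)           # 55
print(squares, evens, total)
```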

Thammuz:

The change from 32-bit to 64-bit has changed things quite a lot. The introduction of multiple cores also makes a huge difference, because it allows true parallelism for the first time.
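
For a rough sense of scale on the 32-bit to 64-bit jump, the amount of directly addressable memory grows enormously (a back-of-the-envelope calculation):

```python
# Address space reachable with 32-bit vs. 64-bit pointers
bytes_32 = 2**32   # 4 GiB
bytes_64 = 2**64   # 16 EiB, over four billion times larger
print(bytes_32 / 2**30, "GiB vs.", bytes_64 / 2**60, "EiB")
```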

Kayak8:

@lillycoyote I trained my search dog to find lost wallets and keys—the technology is HERE!

poisonedantidote:

The last 10 years have probably seen more advances than any other decade, even if you limit it to just hardware.

In the '90s, processor speeds and storage capacities were making bigger comparative leaps from generation to generation; I remember going from a 75 MHz CPU to 166 MHz to 800 MHz in just one summer, and around the year 2000 we started to see fancy mobiles with video and other features, but none of it quite worked properly.

This decade has innovated as well as taken what was already there and made it good.

We have had 3D graphics in games for a long time, but there used to be clipping and the camera would not always align properly, and it is only really recently that we have cracked it. The same goes for the video cameras on mobiles, with people now walking around with iPhones and the like.

The bigger ones that come to mind for the last decade or so:

- Global internet speeds generally up to a good standard
- The rise of WiFi and wireless technology
- The birth of the "super website": Facebook, YouTube, Google, etc.
- Multiple CPU cores
- High-tech mobiles
- Cooling systems
- HD graphics
- Web 2.0

etc…

dabbler:

I'm going to answer broadly, not just with respect to CPU advances, which have been formidable.

Lower power! Connectivity!

All our gadgets consume a small fraction of the power they did ten years ago, starting with displays (LCDs now instead of CRTs), CPUs (power per MIPS is way lower), and graphics (we can play hi-res video on battery-operated gadgets for hours).

Don't discount packing more transistors into CPUs; that is no mean feat! Reduced geometries have facilitated the continuing speed increases in CPUs and in peripheral gadgets like USB connections. Smaller geometries also allow things to be processed with far less power. More transistors enable 64-bit designs and multiple cores, per @Thammuz's observation.

Batteries have come a long way, as lithium-polymer cells pack far more energy per unit of weight than batteries used to.

USB has changed things a LOT; we have self-describing gadgets that plug into our computers and often get set up with zero intervention. Sure beats digging up drivers for everything.

Networking and wireless communication are vastly better than they were ten years ago. Just plug a new printer in and chances are it will find your WiFi and be available on your LAN; voila!

These days you can watch a streaming video on your TV over a wire (HDMI) from your tablet that is getting the stream from Netflix over WiFi.

For $35 you can get a complete Linux system with a footprint the size of a credit card, with Ethernet and 1080p video output.

mattbrowne:

Progress between 1978 and 2012 vastly accelerated compared to the period between 1944 and 1978. I'll give you just one example:

“Giant magnetoresistance (GMR) is a quantum mechanical magnetoresistance effect observed in thin-film structures composed of alternating ferromagnetic and non-magnetic layers.”

Without the discovery of the GMR effect in 1988, we would not have palm-sized 1-terabyte hard disks for $80. The discovery earned a Nobel Prize in 2007.
