Social Question


Where is the weakest link in modern computing nowadays?

Asked by RedDeerGuy1 (24,986 points) January 21st, 2016

Where, other than the user, is the bottleneck?


37 Answers

XOIIO:

Well, it depends what aspect you are talking about: graphics, memory, processing. Honestly, the biggest overall bottleneck I would say is the internet, the speed at which we can transfer data, which is what affects things most overall.

If we had instantaneous data transfer, or 100 Gb/s upload/download everywhere, research would be able to take place at a much faster rate.
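
To put rough numbers on that, here is a back-of-the-envelope sketch in Python; the dataset size and link speeds are made-up illustrative figures, not measurements of any real network:

```python
# Back-of-the-envelope transfer times for a hypothetical 1 TB dataset
# at a few link speeds. All figures are illustrative, not measured.
DATASET_BYTES = 1e12  # 1 TB

link_speeds_bits_per_s = {
    "25 Mb/s (typical home broadband)": 25e6,
    "1 Gb/s (fast fiber)": 1e9,
    "100 Gb/s (the hypothetical everywhere-fast link)": 100e9,
}

for name, bps in link_speeds_bits_per_s.items():
    seconds = DATASET_BYTES * 8 / bps  # bytes -> bits, then divide by the rate
    print(f"{name}: {seconds / 3600:.2f} hours")
```

At 25 Mb/s that transfer takes days; at 100 Gb/s it takes a little over a minute, which is the kind of difference that changes how research gets done.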

Cruiser:

Windows.

CWOTUS:

Redacted

RedDeerGuy1:

@XOIIO Would it be beneficial if the old cable lines for TV were upgraded to fiber optics?

elbanditoroso:

Old copper. Integrated circuits on circuit boards that have been in place for a dozen years. Even copper and silicon eventually wear out.

CWOTUS:

This is probably too broad a question to answer at less than book length.

(First of all, “aside from the user” is an impossible exclusion. If the user cannot or will not program the computer competently, operate its controls adequately or frame the correct question to which he or she requires a response – or understand a perfectly valid response after correct programming, operation and framing of the question – well, there’s your answer. All of the time.) The user’s ability to learn and adapt to new technology is a real limitation that every user faces, to a greater or lesser degree.

But “the weakest link in computing” after the user is probably the software used to manage the computer itself, from the Operating System down to the keyboard and other user interfaces.

Another weak link is the (so far) serial nature of computing. The computers that most of us use only do one thing at a time, although they can do many things very quickly, so it sometimes seems like “all at once”. So those computers can’t do parallel computing, which is something that I think most humans can do innately.
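
That kind of parallelism has to be spelled out explicitly whenever we want it. As a minimal sketch of what that explicit fan-out looks like in Python (the `count_primes` toy task and its inputs are made up purely for illustration):

```python
# Minimal sketch of explicitly fanning a CPU-bound task out across cores.
# count_primes and its inputs are toy examples, made up for illustration.
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit: int) -> int:
    """Naive prime count up to limit (deliberately slow and CPU-bound)."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    limits = [50_000, 60_000, 70_000, 80_000]
    # Each chunk runs in its own process, so the work genuinely overlaps
    # instead of being time-sliced on a single core.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_primes, limits))
    print(dict(zip(limits, results)))
```

Nothing about that happens by default; the programmer has to carve the work into chunks and hand them out, which is roughly the gap between silicon and wetware being described here.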

“Fuzzy thinking” is a thing that has to be programmed into computers to perform “looks like”, “sounds like”, “reminds me of” and other inexact matching. (And let’s not even get into “smells like”. Although we can make sensors to detect tiny amounts of chemicals that humans can’t smell, I don’t think a computer will soon be able to “learn” much from scent. Anyway, that’s another input problem.)
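
For a sense of what a programmed “looks like” match can amount to, Python’s standard library already ships a rough version; the candidate word list and the misspelled query below are made up for illustration:

```python
# "Looks like" matching via the standard library's difflib.
# The candidate list and the misspelled query are made up for illustration.
from difflib import get_close_matches

candidates = ["receive", "believe", "retrieve", "deceive"]
query = "recieve"  # a common misspelling

# Returns up to n candidates whose similarity clears the cutoff, best first;
# "receive" should come back as the top match.
print(get_close_matches(query, candidates, n=3, cutoff=0.6))
```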

The hardware itself – and economic / budget constraints forced on nearly all users – mean that the computers we use will always be older, less efficient, less durable and less reliable than they might be if we had unlimited resources available.

Even forward and backward compatibility is a problem, depending on the data sets to be analyzed, the format of the data and where it currently resides.

But I’m not going to write the book, so I’ll stop there.

ragingloli:

Always has been, and always will be, the bag of flesh in front of the machine.

jerv:

Aside from wetware, I’d say it’s a combination of slow internet (though that’s more of a problem in the US than the rest of the first world) and red tape.

Look at how much further we’d be if not for patent trolls and those seeking to protect their intellectual property by underhanded means. For instance, if Apple didn’t have an overzealous legal department, they would’ve had their lunch eaten long ago.

jaytkay:

“Always has been, and always will be, the bag of flesh in front of the machine.”

In IT support that’s called a PEBCAK error.

Problem Exists Between Chair And Keyboard

ARE_you_kidding_me:

It’s power, as in watts. We have huge architectures in processing and support chips now. As transistor count, peripheral count and clock speed go up, so does power, much like I²R resistive losses. We have done stuff like lower the operating voltage and add fans with huge heat sinks, but we are hitting a wall. Notice that modern multi-core processors don’t boast the clock speeds that the older single-core processors did; they heat up and suck power fast past ~3 GHz, and keeping them cool is becoming a challenge. Unless we do other little hacks to keep things cool, there is not much else that can be done with copper and silicon.

Now, individual computers can have any number of bottlenecks.
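
For a back-of-the-envelope feel for the scaling described above: dynamic power grows roughly as C·V²·f, and raising the clock usually means raising the voltage too. The capacitance and voltage figures in this sketch are made-up illustrative values, not measurements of any real chip:

```python
# Dynamic CPU power scales roughly as P ~ C * V^2 * f.
# The effective capacitance and voltage values below are made-up
# illustrative numbers, not measurements of any real processor.
def dynamic_power_watts(c_farads: float, volts: float, freq_hz: float) -> float:
    return c_farads * volts ** 2 * freq_hz

C_EFF = 3e-8  # effective switched capacitance in farads (illustrative)

# Pushing the clock up usually also means pushing the voltage up,
# so power grows much faster than linearly with frequency.
for volts, ghz in [(1.0, 2.0), (1.1, 3.0), (1.3, 4.0)]:
    watts = dynamic_power_watts(C_EFF, volts, ghz * 1e9)
    print(f"{ghz:.1f} GHz at {volts:.1f} V -> ~{watts:.0f} W")
```

In that toy scaling, doubling the clock while nudging the voltage up more than triples the heat you have to get rid of, which is the wall being described.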

kritiper:

Security and protection from EMP. (Other than the nerds who put them together and program them with terms only nerds can understand…)

jerv:

@ARE_you_kidding_me Not really. My Haswell sucks fewer watts than my Clarkdale did, and my old Northwood drew almost as many watts as both of them put together while being far less computationally powerful. Hell, my phone is almost as powerful as my old Northwood but draws so few amps that it doesn’t even need a heatsink.
I’m thinking that even if we hit the limit around 10 nm, if we were willing to deal with the sort of power/thermal issues we had ten years ago, we could go forward by going a little backwards, if you catch my drift.

@kritiper I consider security to be a wetware issue. Now, I’ll see your EMP and raise you TEMPEST hardening.

ARE_you_kidding_me:

@jerv Again, those are creative workarounds for a physics problem. There are still a few rabbits in that hat on the architecture side.

SecondHandStoke:

The corporate hack/Microsoft mentality.

XOIIO:

I’d say the Apple mentality is much worse, charging ludicrous prices for new tech. Not that they are leading the tech industry; they are just good at marketing.

SecondHandStoke:

This Apple user does not buy any products or services due to marketing.

Logic dictates that my choices must be based on some other merit.

SecondHandStoke:

Oh boy.

Here we go…

ragingloli:

“Logic dictates that my choices must be based on some other merit.”
That is why I do not use any crapple products.

XOIIO:

There’s nothing to “go”; it’s obvious that, dollar for dollar, Apple charges quite a lot more for computers which perform the same as, or generally worse than, comparable machines. They are tailored more for design work, same as the operating system, not for sheer performance as others can be.

The issue with that is, as in all similar things, it causes the price of competitors to rise as well, but thankfully not to those exorbitant extremes.

If prices were at a more reasonable level across the board, people would be more inclined to adopt the newest generation, putting fresh money toward advancing technology even more, since they would not need to wring as much value out of a product as they do when they have paid a much higher price.

Someday, hopefully we get to the point where we can manufacture these sorts of things extremely efficiently, lowering the cost by several orders of magnitude, but I have a feeling we are quite a ways off from that.

Until then, I’ll happily stick with my upgradeable counterparts; that way I get a mix of more modern advances and better value per dollar, as well as generating less waste.

jerv:

@SecondHandStoke Well, for those that place a higher value on aesthetics, Apple is surely superior. For those that value “turn-key” technology that requires little/no setup or any sort of technical competence to use right out of the box…. debatable simply because Apple lost the monopoly on that not long after they lost their superiority in graphics design.

So the only merits I can think of aside from $500 Gucci socks being “better” than $2 Hanes socks boil down to being stuck 20+ years in the past.

ARE_you_kidding_me:

There is nothing special about anything Apple makes or does. What they do well is keep standards and quality in check. I don’t think it’s worth paying the 3x cost premium though.

jaytkay:

I haven’t been a Mac user for many years, but I see the advantages.

The biggest one in my eyes is that Macs have a much higher resale value than Windows PCs.

Another huge advantage for me, unlike with Windows, is that I can tell my friends with Macs to go to the Apple store for free help.

SecondHandStoke:

@XOIIO mentions performance, but clearly sees performance from a narrow view.

Muscle cars of old (and their retro-fabulous contemporary counterparts) might have thrilling straight-line acceleration but become worthless the moment the blacktop turns interesting. When ergonomics, design and build quality are taken into consideration, things look even more grim.

The same holds for the computing environment. My MacBook might not be the most powerful and expandable device, but there are other aspects of performance to consider, the UI especially.

Regarding cost:

Most US drivers of the Honda Civic might never know that its engine’s crankshaft is forged, increasing durability and decreasing mass. This is one of a host of reasons why it simply costs more to build a Civic as opposed to, say, a Toyota Corolla. There’s no getting around the fact that it influences the number on the window sticker.

These sorts of innovations affect the price of admission for computers as well.

jerv:

True. For a while, Apple was the only one (aside from Tandy and Amiga) that had a GUI right out of the box. But they managed to market it well enough for people to overlook the fact that they didn’t actually invent it; they simply took inspiration from Xerox.

@jaytkay Resale value only matters if you sell before it’s worthless. Those who treat computers as tools instead of fashion accessories consider resale value irrelevant, preferring to base value on how well it does its work. Also, the fact that PCs are modular enough to upgrade deprecated parts instead of buying a full system kind of makes it comparing apples to oranges (no pun intended). My PC is technically six years old, but the original power supply was replaced before I ever booted it up, and since then I’ve upgraded the hard drive, motherboard, CPU and video card. Aside from the case and DVD burner, there’s nothing original on it.

As for support, it’s out there; Apple is merely the only one that puts stores in malls for it.

jerv:

@SecondHandStoke The UI is actually why I prefer Android over iOS, and rate OS X lower than Win7 or many distros of Linux. Then again, Mac4Lin exists, so OS X isn’t even the only OS with the OS X UI.

kritiper:

@jerv What?? Can you plainspeak?

SecondHandStoke:

As a Mac person forced to use Windows in a commercial capacity (car sales), I can say with certainty that a main component of the bottleneck is employee/user morale.

XOIIO:

I think comparing Apple machines to other machines (regardless of the OS you put on them, since you have options) is like comparing a Civic with a semi truck, lol.

ARE_you_kidding_me:

@XOIIO More like comparing a Honda to a Toyota

ragingloli:

More like comparing a stock Honda dressed up to look like a Ferrari with the bonnet welded shut to a tuned-up Toyota.

jerv:

@kritiper Translation – Those who claim Apple has the best UI are full of shit. Others do practically the same thing, and Linux can do it so close that it actually looks exactly the same.

I would say that it’s like comparing a mid-‘80s Cadillac to a Toyota. Back then, Caddy was where GM put all of their experimental tech that usually turned out to be crap, like the V8-6-4 engine. They also had a habit of doing things half-assed, like the Cimarron and Catera. However, despite their wonkiness and unreliability, they still somehow cost 2–5 times what their competitors did. Aside from the fact that Apple actually has sales volume while Cadillac doesn’t, they’re pretty similar.

On the other hand, Toyota runs the range from no-frills Corollas to leather-clad Lexus, sports cars (Supra) to work trucks (pretty much every pickup in Central/South America or Africa). So you can get exactly what you want at whatever price you are willing to pay instead of being told what you want by someone who will squeeze you for every penny they can get.

But back to the original question, I think the big thing is that many people don’t want to feel like they are using a computer. Many still think that the only people who use computers are nerds with thick glasses and no social skills, and they actively (though subconsciously) resist learning anything so as to distance themselves from Urkel. Apple had that as a selling point for years; “For the rest of us” was basically saying that you could use a Mac even if you knew so little about computers that a PC would electrocute you when you drooled on it.

That is why modern computers are dumbed down, much like a car that has had the steering wheel and pedals replaced with a green button that says “Go!”. Apple goes one better and welds everything shut so that autopilot is the only way to move and you can’t even do basic maintenance/repairs without taking it to the shop. And look at how many people are now having a lot of problems, often self-inflicted ones. Imagine what the freeways would be like if there were no licensing requirements and the roads were jammed with four-year-olds who can’t even see over the wheel, and you have our current computing situation.

But that goes back to the user, at least indirectly. The need to dumb things down to be usable by those who refuse to learn causes software bloat that takes up a lot of RAM and CPU. That is why so many supercomputers run Linux, and usually from a command line. Put on a big, fluffy set of mittens that makes it impossible for you to poke yourself in the eye and you’ll be a lot slower at any task that requires manual dexterity. The same applies to computers; the things that stop people who don’t know any better from getting themselves hurt also slow things down for those that can be trusted with non-plastic silverware.

@SecondHandStoke There are those who hate computers and will never be happy that their job forces them to use one. Those people will be about equally miserable on all OSs, though OS X usually won’t even have the software to be able to do anything work-related anyway, making it more likely that they’ll use Windows or Linux at work. Then there are people who love computers but like to customize them for their own tastes and needs. OS X loses out there due to its “walled garden”, while Linux has almost the exact opposite problem of being so customizable that newbies suffer from option shock and run away screaming.

XOIIO:

@jerv Perfect analogy with the welding. I mean, now they have the RAM and hard drives on the MacBooks as part of the main board, and the batteries glued in solid.

Battery isn’t as big of a deal, but not being able to upgrade or replace the hard drive or ram? That’s just mental.

jerv:

@XOIIO The battery alone is proof that they want their laptops half-dead in 3 years and bricked by the time they’re as old as my Toshiba.

XOIIO:

Agreed, though with a heat gun and some caution, I’m sure you could figure out a way to replace it, depending on the type of glue they use.

jaytkay:

I like Lenovo laptops, specifically the T series business laptops with magnesium frames. A really nice 2012 ThinkPad goes for $250.

Compare to recent eBay sales of the 2012 MacBook Pro: $681.00, $553.33, $599.99, $600.00, $549.95.

Macs have some advantages. People like Macs and get a lot of utility from Macs.

You can stamp your little feet and hold your breath, but that does not change reality.

XOIIO:

Agreed. I personally think the ThinkPads were better quality when IBM was making them, or at least mostly, before they switched to using more plastic and less of the magnesium alloy for the case/frame; the newest one I had was a T410s. They have great value per dollar.

jerv:

@jaytkay So? Paintings by Monet and Van Gogh are worth more than the crayon scribblings of a grade-schooler. A ‘69 Charger is worth more than my ‘86 Corolla. Your point?

You can stamp your feet and hold your breath, but that does not change that many people (myself included) couldn’t give less of a fuck about resale value on their computers.

Now, if you’re the type of person who, given the option, would rather lease a computer than own one the same way you lease a BMW, then you’re in an entirely different demographic from me. Macs do not provide 3–5 times the utility, so I place their value far below what those with different desires consider “fair market value”.
