Today, of course, seven seconds on any modern computer is a veritable eternity. It is time enough to accomplish most anything. If your computer appears to be slow, it's because your computer now has a zillion things to do, and a zillion divided by seven is still long enough for you to go get a cup of coffee, or at least long enough to give you an excuse to go get a cuppa.
Earlier this year I noticed that Google's Chrome browser had gotten slower, which was kind of weird, seeing as Google is all about delivering more faster (can you use more as a noun? Maybe I should say faster more? Or make a new compound word like moraster. Hmm. Maybe not.)
Then I realized that one of the new tricks for speeding up computers is to explore future possibilities. At the low level this involves following both branches of a jump before you actually get there. Some low level operations take longer than others. While a simple load or store instruction might only take a nanosecond, dividing might take four or five.
If there is a conditional jump instruction right after the divide, then in a computer from the dark ages the part of the computer that loads the next instruction is going to be sitting there idle, and we can't have that; the devil and idle hands, you know. So the whiz kids came up with the idea of going ahead and loading the next instructions for both branches of the jump instruction. That way, when the logic/math part of the CPU finishes mucking about with the divide instruction, we'll have the next instruction already loaded, ready for execution.
When the CPU finally gets to the branch instruction and makes a decision on which way to go, all the stuff that got loaded for the wrong branch is discarded, and the look-ahead instruction prefetcher can concentrate on the right branch, at least until it comes to another one, and then it goes through the same contortions again.
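The both-branches trick can be sketched in a few lines of Python. This is strictly a toy model, not how any real CPU is wired: while the slow divide is still grinding away, the fetch unit speculatively grabs the instructions at both possible successors of the branch, then throws away the wrong one once the branch resolves. The little "program" and all the names here are invented for illustration.

```python
# Toy model of a prefetcher that follows both sides of a branch.
# (Illustrative only; real pipelines are vastly more complicated.)

program = {
    0: ("div", None),   # slow instruction; a branch comes right after it
    1: ("jz", 10),      # jump to address 10 if the divide result was zero
    2: ("add", None),   # fall-through path (branch not taken)
    10: ("sub", None),  # taken path
}

def prefetch_both(branch_addr):
    """While the divide is in flight, fetch both possible successors."""
    taken_target = program[branch_addr][1]
    fallthrough = branch_addr + 1
    return {fallthrough: program[fallthrough],
            taken_target: program[taken_target]}

def resolve(branch_addr, divide_result, prefetched):
    """Branch resolves: keep the right instruction, discard the other."""
    if divide_result == 0:
        next_addr = program[branch_addr][1]   # taken
    else:
        next_addr = branch_addr + 1           # not taken
    return next_addr, prefetched[next_addr]

buf = prefetch_both(1)              # both paths fetched speculatively
addr, instr = resolve(1, 7, buf)    # divide produced 7, so branch not taken
print(addr, instr[0])               # prints: 2 add
```

The point of the exercise: by the time `resolve` runs, both candidate instructions are already sitting in `buf`, so no time is lost waiting on a fetch, only the wrong-path work is wasted.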
Following two execution paths means you need more circuitry, and since that seems to be the one thing we can have in abundance these days, it's not a problem. At least if you don't count the man-years that have been invested in figuring out the gory little details of implementation. And you can bet they are gory.
Anyway, back to Google and Chrome. I suspect what Chrome is doing is something similar to the look-ahead instruction prefetcher in the CPU. They are guessing as to what you are going to do next and following that branch, and maybe several others, just in case, so they can deliver whatever you choose more quickly.
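If Chrome really is doing the browser-level version of this (a guess on my part), the shape of it would be something like the following sketch: guess which pages you're likely to want next, fetch them into a cache ahead of time, and serve from the cache if the guess pans out. All the names and the fake `fetch` function here are made up for illustration.

```python
# Toy sketch of speculative page prefetching: guess likely next pages,
# fetch them in the background, and serve from the cache on a hit.

cache = {}

def fetch(url):
    """Stand-in for a real network fetch (the slow part in practice)."""
    return f"contents of {url}"

def prefetch(likely_urls):
    """Speculatively fetch the pages we guess the user will want next."""
    for url in likely_urls:
        cache[url] = fetch(url)

def navigate(url):
    """Instant if we guessed right, a full fetch if we guessed wrong."""
    if url in cache:
        return cache.pop(url)   # cache hit: no waiting
    return fetch(url)           # cache miss: pay full price

prefetch(["a.html", "b.html"])  # follow several branches, just in case
page = navigate("a.html")       # guessed right, served from the cache
```

Just like the CPU's prefetcher, the cost of the scheme is the wasted work on the pages you never visit, which is presumably why it made my old machine feel slower.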
The reason I noticed that Chrome was slower is that I am using a veritable antique computer. I don't really know what's inside. It's an old Dell I picked up at a recycler's shop for a pittance, like $100 or so. Just for the record, let's check (Start / Control Panel / System).
It doesn't say anything about quad cores or even dual cores. Hmm. Let's see what a current computer looks like.
I suspect it's a quad-core CPU, though it doesn't actually say that. Quad-core means it has four CPUs in one processor chip. It certainly has more memory.
What brought this all up was the comment from Sarah Sukhoi about Engineers and Designers that I ran across yesterday.
The pharaohs of ancient Egypt built huge pyramids out of stone. They are big and impressive and they lasted for thousands of years, but nobody builds like that anymore. Look at a modern city and you have a zillion buildings of all shapes and sizes growing willy-nilly all over the landscape. Centralized authority and control has all but vanished.
Something similar has happened in the computer business. When computers were a big deal, the standards for software design were strenuous and only the most dedicated survived. Now computers are ubiquitous (the average smart phone is more powerful than the CRAY-1, the awesomest supercomputer ever), and the standards for software design have relaxed (or plummeted, depending on your point of view), and anyone who can spell QWERTY can get in on the game.
This is why I am no longer at Intel. I was trained in the old school of designing software to do a specific task. I can write code to do anything you want. The problem here is you have to know what you want, and nobody at Intel knew exactly what that was. What they wanted was the next killer app, a new software program that everyone would want. So we had a bunch of people like Sarah's Designers running around, full of enthusiasm, spouting buzzwords. My problem was I had no enthusiasm for any of the stuff (crap) that everyone was so hyped about. It didn't look serious or useful. Some of it, like video conferencing, did not even look possible.
During my last few years (it may have only been months, but it sure seemed like years), ProShare was the big deal. Andy Grove had come to Oregon and the honchos had demoed some apps for him. Some of them might have been useful, or even successful, but the one that caught his eye was video conferencing, and so a whole division was given the task of trying to force a gigabyte of data down a pipe that would hold, at best, a few hundred K. It was basically hopeless, but they did give it the old college try. Some of the video compression algorithms they developed may even have provided the foundation for our current video-over-the-internet experience.
Processing video does eat up the CPU cycles, and since that is what Andy was looking for (the killer app that demands a powerful CPU), it made a certain kind of sense. Too bad they had to wait ten years for broadband internet to catch up. I wonder, if we looked back, who was pushing for broadband internet. Everybody in the computer business wanted more speed, but wiring the entire country for broadband was going to take some capital investment. The cable TV companies already had most of the country wired, but all their equipment was geared to one-way distribution, and if you have a business that is making money, are you going to be inclined to sink a bunch of money in new equipment for an unproven market? I think not.
Which makes me wonder why Verizon even bothered with wiring the country with fiber-optic cable, or as much of it as they did. I have it, and I mostly like it, though Frontier (who took it over here) seems to be slightly pathetic. I went with Verizon and fiber-optic because 1) I already had a Verizon account and 2) I had heard nothing but bad things about Comcast.
Anyway, back to the main point, i.e. Sarah's comment about Designers versus Engineers. Good software design requires a certain mental discipline. I am a fairly smart guy (in certain restricted fields. Ask my wife.) and I found the course work required to get a computer science degree very difficult. Everything in there was alien. Oh, there was logic and math, but these guys had combined it in ways that made my head hurt. Still, I must have some affinity for it because I stuck with it. Weird.
People who are real computer gear-heads are few and far between, and since the demand for programmers is very high, all kinds of other people are getting in the game, and since enthusiasm counts for a whole lot in the employment business, we are getting all kinds of software that is cute and/or fancy but doesn't necessarily work very well.
If it is useful and/or popular, we can always spend money to go back and try and fix it. If it doesn't make that first cut, then we can forget about it.
Some people are naturally more social, some are enthusiastic. Good gear-heads are generally neither.
Pyramids on my mind - acoustic 12 string - Douglas from the Dirt Road Bluze Band.