Intel's Ronler Acres Plant

Pergelator

Silicon Forest

Tuesday, September 23, 2014

Pyramids of my Mind


When I got started messing with computers as a student at UT Austin in 1978, mainframes still ruled the earth. I remember we got seven seconds to run our programs before the system would boot us off. Seven seconds back in those days was just enough time for one of those dinosaurs to take a small bite of a problem, chew it once or twice and swallow it. If you failed to adhere to the rituals and protocols laid down by the high priests, your program would run out of time and off you would go with no more ceremony than a bum getting tossed out of the Hyatt.
    Today, of course, seven seconds on any modern computer is a veritable eternity. It is time enough to accomplish most anything. If your computer appears to be slow, it's because your computer now has a zillion things to do, and a zillion divided by seven is still long enough for you to go get a cup of coffee, or at least long enough to give you an excuse to go get a cuppa.
    Earlier this year I noticed that Google's Chrome Browser had gotten slower, which was kind of weird being as Google is all about delivering more faster (can you use more as a noun? Maybe I should say faster more? Or make a new compound word like moraster. Hmm. Maybe not.)
    Then I realized that one of the new tricks for speeding up computers is to explore future possibilities. At the low level this involves following both branches of a jump before you actually get there. Some low level operations take longer than others. While a simple load or store instruction might only take a nanosecond, dividing might take four or five.
     If there is a conditional jump instruction right after the divide, in a computer from the dark ages, the part of the computer that loads the next instruction is going to be sitting there idle, and we can't have that; the devil and idle hands, you know. So the whiz kids came up with the idea of going ahead and loading the next instructions for both branches of the jump instruction. That way, when the logic/math part of the CPU finishes mucking about with the divide instruction, we'll have the next instruction already loaded, ready for execution.
     When the CPU finally gets to the branch instruction and makes a decision on which way to go, all the stuff that got loaded for the wrong branch is discarded, and the look-ahead instruction pre-fetcher can concentrate on the right branch, at least until it comes to another one, and then it goes through the same contortions again.
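Here's a toy sketch of the idea in Python, just to make the mechanics concrete. The "program," the register names, and the function are all made up for illustration; real hardware does this in silicon with far more gruesome bookkeeping.

```python
# Toy model of speculative prefetch: while a slow instruction (the divide)
# is still grinding away, fetch the instructions on BOTH sides of the
# upcoming conditional jump, then throw away the path not taken.
def prefetch_both_paths(program, branch_taken):
    """Fetch instructions for both branch targets, keep only the right one."""
    # Speculative fetch: both paths sit in a buffer while the divide finishes.
    speculative_buffer = {
        "taken": program["if_taken"],
        "not_taken": program["fall_through"],
    }

    # The divide finally retires and the branch direction is known.
    chosen = "taken" if branch_taken else "not_taken"
    discarded = "not_taken" if branch_taken else "taken"
    print(f"discarding {len(speculative_buffer[discarded])} prefetched instructions")
    return speculative_buffer[chosen]

program = {
    "if_taken": ["load r1", "add r1, r2", "store r1"],
    "fall_through": ["load r3", "sub r3, r4"],
}

# Pretend the divide produced a non-zero result, so the jump is taken.
print("executing:", prefetch_both_paths(program, branch_taken=True))
```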
    Following two execution paths means you need more circuitry, and since that seems to be the one thing we can have in abundance these days, it's not a problem. At least if you don't count the man-years that have been invested in figuring out the gory little details of implementation. And you can bet they are gory.
    Anyway, back to Google and Chrome. I suspect what Chrome is doing is something similar to the look-ahead instruction prefetcher in the CPU. They are guessing what you are going to do next and following that branch, and maybe several others, just in case, so they can deliver what you choose more quickly.
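If I had to sketch that guess-ahead idea myself, it might look something like this. This is emphatically not Chrome's actual code, just the general shape of the trick: guess some likely next pages, fetch them in the background, and throw away the guesses that turn out wrong. The URLs and the worker count here are arbitrary.

```python
# Not Chrome's code, just the general idea: warm a cache with pages the
# user will probably ask for next, so they feel instant if the guess is right.
from concurrent.futures import ThreadPoolExecutor
import urllib.request

def fetch(url):
    with urllib.request.urlopen(url, timeout=5) as response:
        return url, response.read()

def prefetch_likely_links(candidate_urls):
    """Fetch a handful of guessed-at pages in the background."""
    cache = {}
    with ThreadPoolExecutor(max_workers=4) as pool:
        for url, body in pool.map(fetch, candidate_urls):
            cache[url] = body
    return cache

# Wrong guesses just get thrown away, same as the wrong branch in the CPU.
cache = prefetch_likely_links(["https://example.com/", "https://example.org/"])
print({url: len(body) for url, body in cache.items()})
```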
    The reason I noticed that Chrome was slower is that I am using a veritable antique computer. I don't really know what's inside. It's an old Dell I picked up at a recycler's shop for a pittance, like $100 or so. Just for the record, let's check (Start / Control Panel / System).


It doesn't say anything about quad cores or even dual cores. Hmm. Let's see what a current computer looks like.


I suspect it's a quad-core CPU, though it doesn't actually say that. Quad-core means it has four CPUs in one processor chip. It certainly has more memory.
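For what it's worth, you can get the same answer without digging through Control Panel; a couple of lines of Python will ask the operating system directly. (This reports logical processors, so with hyper-threading it can be double the physical core count.)

```python
# Ask the OS how many logical processors it sees.
import os
print("logical CPUs:", os.cpu_count())
```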

What brought this all up was the comment from Sarah Sukhoi about Engineers and Designers that I ran across yesterday.

The pharaohs of ancient Egypt built huge pyramids out of stone. They are big and impressive and they lasted for thousands of years, but nobody builds like that anymore. Look at a modern city and you have a zillion buildings of all shapes and sizes growing willy-nilly all over the landscape. Centralized authority and control has all but vanished.

Something similar has happened in the computer business. When computers were a big deal, the standards for software design were strenuous and only the most dedicated survived. Now computers are ubiquitous (the average smart phone is more powerful than the CRAY-1, the awesomest super computer ever), and the standards for software design have relaxed (or plummeted, depending on your point of view), and anyone who can spell QWERTY can get in on the game.

This is why I am no longer at Intel. I was trained in the old school of designing software to do a specific task. I can write code to do anything you want. The problem here is you have to know what you want, and nobody at Intel knew exactly what that was. What they wanted was the next killer app, a new software program that everyone would want. So we had a bunch of people like Sarah's Designers running around, full of enthusiasm, spouting buzzwords. My problem was I had no enthusiasm for any of the stuff (crap) that everyone was so hyped about. It didn't look serious or useful. Some of it, like video conferencing, did not even look possible.
    During my last few years (it may have only been months, but it sure seemed like years), ProShare was the big deal. Andy Grove had come to Oregon and the honchos had demoed some apps for him. Some of them might have been useful, or even successful, but the one that caught his eye was video conferencing, and so a whole division was given the task of trying to force a gigabyte of data down a pipe that would hold, at best, a few hundred K. It was basically hopeless, but they did give it the old college try. Some of the video compression algorithms they developed may even have provided the foundation for our current video-over-the-internet experience.
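A little back-of-the-envelope arithmetic shows just how lopsided that fight was. The frame size, frame rate, and line speed below are my assumptions for illustration, not Intel's actual numbers, but the shape of the problem is the same.

```python
# Rough arithmetic on the ProShare problem, using assumed numbers.
width, height = 320, 240          # pixels per frame
bytes_per_pixel = 3               # 24-bit color
frames_per_second = 15

raw_bits_per_second = width * height * bytes_per_pixel * 8 * frames_per_second
line_bits_per_second = 128_000    # a mid-90s ISDN line, roughly "a few hundred K"

print(f"raw video: {raw_bits_per_second / 1e6:.1f} Mbit/s")
print(f"compression needed: about {raw_bits_per_second / line_bits_per_second:.0f} to 1")
```

That works out to roughly 28 Mbit/s of raw video against a 128 kbit/s line, a compression ratio of better than 200 to 1.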
    Processing video does eat up the CPU cycles, and since that is what Andy was looking for (the killer app that demands a powerful CPU), it made a certain kind of sense. Too bad they had to wait ten years for broadband internet to catch up. I wonder, if we looked back, who was pushing for broadband internet. Everybody in the computer business wanted more speed, but wiring the entire country for broadband was going to take some capital investment. The cable TV companies already had most of the country wired, but all their equipment was geared to one-way distribution, and if you have a business that is making money, are you going to be inclined to sink a bunch of money into new equipment for an unproven market? I think not.
    Which makes me wonder why Verizon even bothered with wiring the country with fiber-optic cable, or as much of it as they did. I have it, and I mostly like it, though Frontier (who took it over here) seems to be slightly pathetic. I went with Verizon and fiber-optic because 1) I already had a Verizon account and 2) I had heard nothing but bad things about Comcast.
    Anyway, back to the main point, i.e. Sarah's comment about Designers versus Engineers. Good software design requires a certain mental discipline. I am a fairly smart guy (in certain restricted fields. Ask my wife.) and I found the course work required to get a computer science degree very difficult. Everything in there was alien. Oh, there was logic and math, but these guys had combined it in ways that made my head hurt. Still, I must have some affinity for it because I stuck with it. Weird.
   People who are real computer gear-heads are few and far between, and since the demand for programmers is very high, all kinds of other people are getting in the game, and since enthusiasm counts for a whole lot in the employment business, we are getting all kinds of software that is cute and/or fancy but doesn't necessarily work very well.
    If it is useful and/or popular, we can always spend money to go back and try and fix it. If it doesn't make that first cut, then we can forget about it.
    Some people are naturally more social, some are enthusiastic. Good gear-heads are generally neither.

Bonus

Pyramids on my mind - acoustic 12 string - Douglas from the Dirt Road Bluze Band.

Monday, September 22, 2014

Walken on the Moon


Jack sent me this pic, which reminded me of this tune. 


I always think it's by Bob Marley and the Wailers, but that is incorrect.

Googlicious

I'm checking out what I've said about pyramids and I get distracted looking at a satellite view of the Meroe Pyramids in Sudan. I zoom out and there isn't much to see except the Nile river and a couple of roads, or at least Google says there are a couple of roads there. Let's turn off the labels and see what the place really looks like. Wait, what? No way to turn off the labels anymore?
    Google has had a new version of their map program floating around for a while and I just got a notice that all my maps had gotten moved to the new map app. Well, that's the way of the world, change or die, and I suppose I can try to adapt. So how do you turn off the labels? Help takes me to some posts from numerous people asking the same thing. No answer, but in reading along I turned up this little gem:

Sarah Sukhoi

Mar 10

There are two groups of people at google.

Engineers and Designers.

The Engineers used to run the company, smart people who made stuff that worked.

The designers are the ones who huff glue and then shout buzzwords like "WEB 2.0!" "MOBILE FUNCTIONALITY" "SOCIAL MEDIA ENABLED" like they are suffering from a nerdy form of tourettes.

Until about 3 years ago, google was run by the Engineers.  Somehow the Designers have made their way into the decision making positions and we get horrible products like Wave, Buzz, youtube comments tied to google, the new gmail, the new docs, and the new google maps.

Send feedback to google and tell them to put the engineers back in charge.  So we can get products that just work.
I couldn't fail to disagree less.


Quantum-ness


Coding Game Mars Lander


I don't like the multi-verse theory. I like to think that this universe is the only one. Keeps things simpler. On the other hand, there are aspects of the behavior of sub-atomic whatsits (like photons) that don't really make sense. Then there was Plato (or Aristotle, one of those old Greek dudes), who speculated that the world we experience is only a projection from the real world, something like the way a shadow on the wall of a cave cast by firelight is only a projection of the hand that is casting the shadow.

So I'm thinking that our brains have evolved to impose a sort of order on the primordial soup of sub-atomic whatsits, and what we see and experience is what we collectively imagine. If all the people vanished, the universe wouldn't vanish, just our cosmology would. Animals still perceive our world, but (as far as we know) they aren't concerned about the heavens, or sub-atomic whatsits.

We haven't been able to come up with a way to travel to other stars, but given our wild imaginations, I expect that someday we will. And it will be because someone discovers (or imagines) a heretofore-undiscovered principle.

Right now Obayashi is predicting that we will build a space elevator by the middle of this century. At least they forecast the travel time to orbit will be about a week. First time I've heard a reasonable estimate for that. A week! Ain't nobody got time for that! Gimme a rocket!

Muthas

This joke:


Reminded me of this:

Russia-China Peace Mission August 2014, they might be in Mongolia.

Inspired by Roberta X.

Madmen of Benghazi

A masked Libyan gunman stands on a street in the eastern city of Benghazi, early on July 29, 2014, as violence flares (AFP Photo/Abdullah Doma)
All inflammatory, all anti-ISIS, all the time. We start with my book report.

The Madmen of Benghazi by Gerard De Villiers

I picked this book up from the bargain shelf at Powell's. It is not a very good book, in fact it was quite awful, but unlike many other bad books that I quit reading after a chapter or nine, I read this one all the way through. It helps that it wasn't a long book. In many ways it is similar to a series I read many years ago. This sentence from the back cover might help:
Originally published from 1960 until his death in 2013, his bestselling SAS series of 200 spy novels, starring Malko Linge, has long been considered France's answer to Ian Fleming with Malko as his James Bond.
The book does have several points in its favor. It's a spy thriller, and I like stories of espionage. It's set predominantly in Cairo, Egypt, and Benghazi, Libya, and I haven't seen many stories set in these places. It's semi-current: Qaddafi dies halfway through the story. The names of several known terrorists play outlying roles, and Qatar (the country) is the root of all evil. And then we have the recurring image of the super-model stuffed into a skin-tight blue dress. Her appearances in the story cover a multitude of sins, both literary and religious.
      On the downside the reasoning employed by the characters borders on incompetent. Then again, these are spies, and if LeCarre's The Looking Glass War is even remotely accurate then what we have here is a realistic portrayal. That's a scary thought. So it was kind of fun, it provided a little exercise in geography, and it gave a small glimpse of life in North Africa, but it's not what I would call a good book.

P.S. SAS might refer to the Special Air Service, except that was a British organization, and this book was originally written in French. On the other hand, there is no mention of any French Intelligence operations. Huh. Wonder how that happened.


A friend of mine put this up on Facebook:
I may have po'd some people out there.. That was my intent. I feel we all sit back than bitch! Nothing like going to work..unarmed and being cut down..what was more than likely ISIS. I do know in 1982- 86. I lived in Chicago. Met a Syrian here on a work visa. Opened minded at the time.. He scared me with his response to, HOW DO U LIKE AMERICA? He spit and said I hate ALL AMERICANS. I SPIT on YOUR COUNTRY!
It doesn't take many encounters like this to convince people that the only good mussleman is a dead one.


Dustbury put up a post about a Muslim Anti-ISIS demonstration in Oklahoma City, which included this line:
[T]he majority of signs held by the pro-peace crowd at Northwest Expressway and Pennsylvania Avenue by Penn Square Mall, were to drive the point home that terrorist group ISIS is not a representation of Islam, as some held the sign saying “ISIS DOES NOT REPRESENT ME!”
This is very nice, but it's not what we're really looking for. Being blood-thirsty American Imperialist running-dogs (to use our full third world title), we want to hear something more like "DEATH TO ISIS" or "KILL ALL THE JIHADISTS". Oh wait, that's kind of what being a Jihadist is all about, isn't it? How do you tell the good Jihadists from the bad Jihadists? Especially when the only good Jihadist is a dead one? So I can sort of see why they went with their milder slogan.

Hamburger Icon

I posted a question on the Google Help Forum last night, and this morning I got a reply, which I have paraphrased for your amusement.

Table of Contents for a spreadsheet can be seen by using the All Sheets button.
It is located on the bottom row on the left side.
It is the 'hamburger' icon (3 horizontal lines).
Clicking on it shows you a list of the individual sheets that are in this spreadsheet document.
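If you'd rather get that list of sheets programmatically than hunt for the hamburger icon, something like the gspread Python library can do it. This is just a sketch: "My spreadsheet" is a placeholder name, and it assumes you have already set up Google API credentials for gspread.

```python
# Sketch: list the sheets in a Google Sheets document via the gspread library.
# "My spreadsheet" is a placeholder; credential setup is not shown here.
import gspread

gc = gspread.service_account()          # uses a service-account JSON key
spreadsheet = gc.open("My spreadsheet")
for worksheet in spreadsheet.worksheets():
    print(worksheet.title)
```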