It’s all Tech… Hopes and dreams ratio, but never forget, this time it’s different


Karl Denninger: The Latest Bubble, And Why It WILL Pop

It’s “AI” of course.

There is no such thing.  There never has been and I argue there likely never will be either.  Certainly, there is no evidence we’re any closer to it in actuality than we have ever been in the age of computing, which runs back to roughly the 1960s.

Many will likely disagree with me on this, but you're arguing with someone who literally cut his programming teeth on punch cards. I once reverse-engineered a Burroughs machine-code print-out on green-bar paper, without an instruction set manual, mapping operators and operands by pure trial and error so I could change a city tax rate in bookkeeping software originally loaded from punch-tape. Burroughs, which of course had the source code, wanted an obscene amount of money to make a literal 10-second edit and send over a new tape.

If you want to know what that was, yes, this is the machine series. In fact it looked exactly like that, including the cabinet and the attached upper paper handler (which was detachable and had a double setup used for payroll and other jobs where both a ledger and a check were required). Storage was core memory, so it retained its program when turned off, but there was no persistent (e.g. disk) storage at all.

Yeah, that far back, and that adventure was the first "revenue"-producing computer-related thing I did.

Computer processing has never really changed: computers produce precise calculations at speeds humans cannot match. We built a computer for the Apollo command module, using the same sort of core memory that was in the Burroughs machine, because physics made it impossible to carry enough propellant for the moon-going astronauts to slow down into Earth orbit on their return. The issue is simple: every pound you wish to carry into space must first be lifted off the surface of our planet. We could engineer enough lift capacity for the crew capsule and supporting machinery, the burn into a lunar transfer orbit, the deceleration to be captured by the moon's gravity, and the acceleration to head back toward Earth. Adding the propellant needed to slow back down so Earth's gravity would capture you on the return was not possible; there was simply not enough lifting capacity at the start to carry that much propellant. So the spacecraft had to hit a narrow re-entry corridor directly, and no human could manage that with the required precision, even with precisely-aligned sights in the window. The odds were too high that a human attempting it would miss, and if you miss the corridor everyone on board dies, either by burning up or by skipping off the atmosphere into space.
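The propellant problem described above is the Tsiolkovsky rocket equation at work: the mass you must lift grows exponentially with the total delta-v, which is why carrying braking propellant for the return trip was out of reach. A minimal sketch, with illustrative numbers that are my assumptions, not Apollo figures:

```python
import math

def propellant_needed(dry_mass_kg, delta_v_ms, isp_s, g0=9.80665):
    """Tsiolkovsky rocket equation: m0/mf = exp(dv / (Isp * g0)).
    Returns the propellant mass required to give dry_mass_kg
    the requested change in velocity."""
    mass_ratio = math.exp(delta_v_ms / (isp_s * g0))
    return dry_mass_kg * (mass_ratio - 1.0)

# Assumed, illustrative values: a 15-tonne stack, a 3 km/s braking
# burn, and a 310 s specific-impulse engine.
prop = propellant_needed(15_000, 3_000, 310)
print(f"{prop:,.0f} kg of propellant")
```

Because the relationship is exponential, that braking propellant would itself have had to be lifted off Earth and pushed to the Moon first, compounding the mass at every earlier stage. The numbers here are only a sketch of that compounding, not a reconstruction of the actual mission budget.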

Therefore being able to rapidly and accurately calculate the required trajectory, and to execute the burns and thrust corrections, was essential. This got tested, nastily so, on Apollo 13, if you recall: after the fuel cells were lost, the primary issues became power for ship systems and oxygen for the crew. In fact there was concern that the corridor correction burn, which after the original incident had to be done by hand, was very slightly off, perhaps by enough to kill them all.


As technology has advanced, both processing speed and storage capacity (and storage speed) have wildly increased. But the fundamental character of how a computer works has not changed since the first calculating machines. Yes, before transistors and even tubes there were calculating machines, but they were all, even when mechanically based, deterministic devices. We have found evidence of such devices that, for example, calculated the precise date and time of solar eclipses. Because eclipses are deterministic, absolute facts, a calculating machine can give you that answer, and it will be correct.
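That determinism is easy to demonstrate: given a fixed periodic model, the same inputs always yield the same answer. A toy sketch using the saros cycle (roughly 6585.32 days), the eclipse-repeat period that ancient calculators such as the Antikythera mechanism encoded; the reference date is a real 2017 solar eclipse used here purely for illustration:

```python
from datetime import datetime, timedelta

SAROS_DAYS = 6585.3211  # approximate length of one saros cycle

def next_eclipse_after(known_eclipse, now):
    """Step forward from a known eclipse in whole saros cycles
    until we pass `now`. Purely deterministic: same inputs,
    same output, every time."""
    t = known_eclipse
    while t <= now:
        t += timedelta(days=SAROS_DAYS)
    return t

ref = datetime(2017, 8, 21)  # the 2017 total solar eclipse
print(next_eclipse_after(ref, datetime(2030, 1, 1)))
```

Run it a million times and the answer never varies, which is exactly the author's point: this is calculation, not inference.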


But “intelligence” isn’t that. It is not simply the manifest weight of how many times something is repeated, for example. You do not need to see a child walk in front of a car and get smushed to know that said child will be killed by the car; the outcome is intuitively obvious to humans. Yet while we can describe the acceleration or impact a living body can withstand without being damaged or destroyed, we have to teach a machine that this outcome is undesirable and thus to be avoided.

Worse, even after we do that it's not enough, because the machine cannot accurately infer from other cues in the environment that a child might be present where said kid cannot be seen (e.g. behind the bumper or hood of a vehicle) and might run out into the road. Yet humans both can and do make exactly that sort of inference every day, and we don't have to view millions of miles of driving video to do it either. We in fact draw that inference, correctly, before we have double-digit years on this planet and before we have ever been anything more than a casual passenger in a vehicle.

I could go through a hundred examples from today and the so-called “AI revolution” that show this conclusively and that in fact no meaningful change has occurred.  Adding more variables and faster processing doesn’t solve the problem because the problem is not deterministic and thus the computer is incapable of resolving it.

My cat is better at inferring where prey is hiding when he's hungry than the best of AIs, and said cat consumes a tiny fraction of that AI's power budget in BTUs.

