Wednesday, November 27, 2013

Holiday Video Game Classics

It's that time of year. The music starts playing in the stores. The decorations go up. You come home, have a look in the closet, and find your old video game systems there waiting for you.

I don't know why it is, but this time of year seems to turn our minds back to games past. Pac-Man, Super Mario, Donkey Kong. Just yesterday I had a friend over, and we ended up doing a tour of my old video game systems. Later my daughter came to me. She'd been going through her old Gameboy cartridges, finding old friends in a dresser drawer. (I play my Gameboy games with my GameCube adapter, which puts them up on my 55 inch TV. No squinting for me any more.)

The problem is, how do you hook up these older systems to the new TV you bought in the meanwhile?

Fortunately, I've got an entire web page on hooking up your classic video game or computer system to your TV. So you can not only hook up your old Atari or Nintendo, but also your Commodore 64, Apple II, Atari 800, Vic-20, or whatever.

If it's gathered some dust, or got put away with finger smudges on it, I've also got a page on caring for your older electronics.

I love the old stuff myself. I know I'm not alone. When we have guests over the holidays, I can't tell you how much fun we have playing simple old video games like Ms Pac Man, Pong, Shadow Squadron, Space Invaders, and so on. Even Katamari Damacy is on the old-time play list now, though the PS2 hooks up to our modern TV just fine--this year. In a few years I'll probably have to create a web page on that, or a page on building a video modification to let PS2 players relive favorite moments with their older game system.

We gotta keep bringing our old friends along into the future with us, right? ;)

Tuesday, November 26, 2013

Why Electronics Took Over the World

How did we end up in a world where computers are everywhere?

Originally, we had vacuum tubes as electronic components. Each of these had to be hand-made. When you consider that even the most basic computer, about the power of a programmable calculator, requires about 4000 electronic switches (including some basic control, memory, and interface circuits), you can see that needing 4000 hand-made parts is going to get expensive. And that's before you wire them together into a working computer. It's like having to hire a team of scribes every time you want a new book.

Each of those tubes is like a decorated capital drawn by a scribe.

Transistors were a big step forward. Transistors aren't made one at a time by hand. Packaging them involved some hand work back when they were new, but the guts of them were produced en masse. Making transistors was like printing a sheet covered in the letter "B" so that you could cut it up and stick a letter B wherever you needed one: transistors are made in a large group, which is then cut up into individual devices and packaged for use.

So why not print the equivalent of a small piece of often-used text, rather than cutting it up into individual letters? This is the basis of the integrated circuit. It was another step forward in reducing the cost of electronics manufacturing. The first circuits were like having commonly used words, in complexity. Over time, technology advanced to allow more and more sophisticated circuits.

Eventually the circuits got more and more complex, and more useful. Building a computer got to be about as complex as creating a book on a typewriter. That meant it took patience and skill, and it was still expensive, but not nearly as expensive as hiring a team of scribes.

Each integrated circuit had from a few to as many as a few hundred transistors on it at that point. Building a basic computer circuit could be accomplished with a couple of hundred ICs.

The next step was a big one. Integrating the entire guts of a computer onto a single die, then printing them not one at a time, but by the tens then the hundreds at a time.

In the mid 1970s enough transistors were printed together, in the right circuits, to make a basic computer. Added to some memory (another technology that had recently benefited from the improvements in integrated circuits) and a few ICs for control and for interfacing to the outside world, a complete computer could be built out of a handful of integrated circuits. Like my MAG-85 computer project, which uses about 10 ICs to build a basic 70's style computer.

But that wasn't enough. It was enough for calculators and very simple computers that require someone with a high level of skills to get the most out of them. If we'd stopped there, only very technical or very driven people would have computers. We had to increase their complexity to make them more capable, and easier to use.

Since then, we've improved our "printing processes" to allow us to produce integrated circuits that contain not just a few thousand "switches", but billions. Your computer, cell phone, or tablet contains the equivalent of billions of vacuum tubes. And yet, those billions of sub-microscopic electronic switches all together require less electrical power to operate than one single vacuum tube. They also generate less heat.

If we put the entire world population to work building electron tubes as fast as they could, we couldn't produce enough tubes in a year to reproduce the computing power of a single cell phone. In part because we couldn't build tubes that switch as quickly as the transistors in a cell phone.
The guts of a simple vacuum tube. Imagine building a few billion of these by hand. Image courtesy RJB1.
But the computer in the heart of that cell phone is one chip that was printed alongside hundreds of others just like it in a mass production process that's very similar to printing. Many of today's computer chips literally cost less to make than a printed magazine or book. Far less, usually.

This triumph of manufacturing, reducing electronics to a simple, inexpensive, high volume printing process, is why we have computers everywhere from our cell phones to our irons and dishwashers. They're cheaper to build than the parts they replace.

Have a look at a current computer chip sometime. Inside it are several billion man-made structures. You could look at them with a microscope if the top were removed, but you would only see patterns, not individual elements. The individual elements are too small to see in visible light now.
There are several billion man-made things in this image.

Tuesday, November 19, 2013

ZBrush for CNC Got Better

Yesterday, I learned that ZBrush (my 3D design program) now has an extension that lets it directly export files I can use with my Computer Aided Manufacturing (CAM) programs. I spent most of my work time today testing the output of various designs to see how they looked.

ZBrush has had this built in since version 4R6 came out (it was available as a plug-in before), but since it calls itself a 3D printing plug-in, I ignored it, assuming it was software to send object data to one or more of the commercial 3D printing services, like Shapeways. It turns out it's an exporter for standard 3D object file formats like .stl.

This is a huge improvement for my workflow of going from design to a finished part prototype in the real world. Before, I had to use a very complex conversion program. Its control panel makes the flight deck of a 747 look simple. And if I didn't get the settings just right, I could get some really nasty effects in the final machining. Using the same settings over again didn't work, either; I had to adjust things based on the size of the object, the scale of features on it, the size of the material it would be cut out of, the relative size of the tool, etc., etc.

Now that difficult & frightening step is gone. I do a couple of passes to simplify the 3D object design as much as possible without losing detail (which I was doing anyway, it speeds up everything later), set a couple of simple settings in the exporter, like the real-world size the final object will be, then export.
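For the curious, the .stl files these exporters write are simple under the hood. Here's a sketch (in Python, with made-up geometry) of the ASCII variant of the format: a solid is just a list of facets, each a surface normal plus three vertices.

```python
# A minimal sketch of the ASCII .stl format: one facet per triangle,
# each facet being a normal vector plus three vertices. The geometry
# below is a made-up example triangle, not real exporter output.

def write_ascii_stl(path, triangles, name="part"):
    """triangles: list of ((nx, ny, nz), [(x, y, z), (x, y, z), (x, y, z)])."""
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for normal, verts in triangles:
            f.write("  facet normal %g %g %g\n" % normal)
            f.write("    outer loop\n")
            for v in verts:
                f.write("      vertex %g %g %g\n" % v)
            f.write("    endloop\n")
            f.write("  endfacet\n")
        f.write(f"endsolid {name}\n")

# One triangle in the XY plane, normal pointing up (+Z).
tri = ((0.0, 0.0, 1.0),
       [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)])
write_ascii_stl("triangle.stl", [tri])
```

Real parts have thousands of these facets (and usually use the denser binary variant of .stl), but the structure is the same.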

The resulting files load just fine into the two different programs I use that create the list of instructions for my CNC machine to cut the 3D object out of a solid block of some material (usually a polyurethane plastic). I did a dry run to set up two test files tonight--doing everything short of actually making the parts. Tomorrow I plan to make an actual part from a new file as a final test. Probably something fun.

For those interested in trying this at home, I use both MeshCAM and Vectric's Cut3D for CAM. Cut3D is my usual preference, though I'm using an older version of MeshCAM (4). I prefer Cut3D's interface for setting tabs, and its included machining preview.

Both produce excellent GCode for my CNC (a MicroCarve A4 driven by EMC2 and a Gecko G540 controller.)
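To give a flavor of what CAM output looks like, here's a rough sketch of generating GCode for a simple zig-zag roughing pass over a rectangle. The feeds, depths, and sizes are illustration values I made up; this is not output from MeshCAM or Cut3D, and not settings for any real machine.

```python
# Sketch of CAM-style GCode generation: a zig-zag pass over a
# rectangular area at a fixed depth. All numbers are illustrative.

def roughing_pass(width, length, stepover, depth, feed=300.0):
    """Generate GCode lines for one zig-zag pass over a rectangle."""
    lines = ["G21 (units: mm)",
             "G90 (absolute coordinates)",
             "G0 Z5.000 (rapid to safe height)",
             "G0 X0.000 Y0.000",
             f"G1 Z{-depth:.3f} F{feed:.1f} (plunge)"]
    y, direction = 0.0, 1
    while y <= width:
        x = length if direction > 0 else 0.0   # alternate cut direction
        lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed:.1f}")
        y += stepover
        direction = -direction
    lines.append("G0 Z5.000 (retract)")
    lines.append("M2 (end program)")
    return lines

for line in roughing_pass(width=20.0, length=50.0, stepover=5.0, depth=1.0):
    print(line)
```

The real CAM programs handle tool diameter, multiple depth passes, tabs, and lead-ins on top of this, but at bottom a toolpath file is just a long list of moves like these.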

For doing image depth maps, I use EMC2's built in facility, though if you want to bypass the copious experimentation & two pages of notes I use to get it looking good, you might want to look into one of the dedicated commercial programs for this.

Thursday, November 14, 2013

Ted Nelson's Computer Lib 40th Anniversary to be Honored at Chapman University

I forgot to mention an additional item in my post on meeting Ted Nelson. Chapman University will be honoring the 40th anniversary of the publication of Computer Lib on April 24th-26th, 2014, presumably at Chapman's campus in Orange, California.

Here are images of the flyer (once again, apologies for the fold. I put it in my hip pocket since I wasn't toting anything else to carry things at the time.)

Wednesday, November 13, 2013

Lee Felsenstein at Homebrew Computer Club Reunion

Lee was the MC at the main part of the club meetings back in the day, and he reprised that role on the night of the HCC reunion. He was also the designer of the computer I most desired in those days, the Sol 20 Computer. I loved that system--the look, the keyboard, its operation.

Image by cellanr
There were just two things you wanted to know about the Sol to make life happier. First: build the fully expanded system right at the outset, because opening up the heart of the system to expand it later was a major PITA. The other? Use someone else's disk subsystem. Though with the information available today a Helios disk subsystem could probably be made to work.

I still have the sales brochures for the Sol 20. I pull them out every now and then to drool over them again. Part of it is nostalgia, but part of it is the great design itself. Actual Sol 20s sell for more than I can afford, but perhaps I'll build myself a look-alike system from sheet metal and walnut wood sometime, anyway, and print up a nice black name badge.

I still have an Osborne 1 computer. It's one I got relatively recently. It's pretty well maxed out on upgrades (disk upgrades, video upgrades, etc.) and is a pleasure to use. It's not as pretty as a Sol, but I enjoy showing it off in current-day computer classes. The kids love it--especially the floppy disk drives and the tiny screen. But...they get hooked on Zork.

Lee Felsenstein Today

In our conversation last Monday, Lee showed me a project he's working on today as an educational tool. It's a programmable logic simulator, targeted at middle school students. What Lee showed me was a pair of printed circuit boards that have captive fasteners to clamp them together around a plastic matrix. The matrix holds surface mount diodes, which the students can place into the matrix to program it. In essence, it's a 16 by 8 programmable logic array that is programmed through physically locating the diodes.

OK, I know that sounds totally abstruse to many of you, so let me tell you what makes this a great idea, and why your middle schooler ought to know about this stuff even if you've gotten through life so far without having to (assuming you don't already know it).

The cores of computers are built out of logic circuits. The memories feed the logic circuits with data (in current designs--it doesn't have to be that way, though it's presently the assumption); in essence, the programmable logic is the complement of the memory. This analogy of logic and memory as complementary components of a computer holds on many levels. It's possible to build logic out of memories--I've done it--but it's not efficient.
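To make the logic-from-memory point concrete, here's a tiny sketch of my own (not anything from Lee's project): a 2-input gate is just a 4-entry truth table, so a 4-cell memory addressed by the input bits behaves exactly like the gate. This is also the idea behind FPGA lookup tables.

```python
# "Logic built out of memory": each gate below is just a small memory
# (a list) whose address is formed from the input bits. Reading the
# memory at that address gives the gate's output.

AND_ROM  = [0, 0, 0, 1]   # contents for addresses 00, 01, 10, 11
OR_ROM   = [0, 1, 1, 1]
NAND_ROM = [1, 1, 1, 0]
XOR_ROM  = [0, 1, 1, 0]

def gate(rom, a, b):
    """Read the memory at the address formed from the two input bits."""
    return rom[(a << 1) | b]

print(gate(AND_ROM, 1, 1))   # 1
print(gate(NAND_ROM, 1, 1))  # 0
```

It works, but as the text says, it's not efficient: a real memory spends a whole storage cell per truth-table row, where dedicated logic gets the same function from a handful of transistors.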

Initial education in logic circuits can be accomplished with a simple breadboard and some logic chips. A few AND chips, OR chips, NAND chips, inverters, and so on. Add some resistors and LEDs and the kids are off and running. For a little while. Once they master this, and understand what's going on, they immediately start expanding their ideas.
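To show that building-block progression in code rather than on a breadboard, here's a sketch where each gate is a tiny function standing in for part of a real chip, then composed into half and full adders. The 74-series part numbers in the comments are just the classic examples of chips carrying these functions.

```python
# The breadboard building blocks as functions: on a real board each of
# these would be one section of a logic chip (e.g. 7408 AND, 7432 OR,
# 7486 XOR). Composing them gives bigger circuits, just like wiring.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two bits: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, cin):
    """Add two bits plus a carry-in: returns (sum, carry-out)."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, OR(c1, c2)

# 1 + 1 + carry-in 1 = 3, i.e. sum 1 with carry 1
print(full_adder(1, 1, 1))  # (1, 1)
```

Chain a few full adders and you have a multi-bit adder; that's exactly where the wiring on a breadboard starts to outrun a kid's patience, which is the problem the next paragraph describes.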

Then a problem hits. More chips and more wiring between them mean more complexity, and more difficulty in realizing their ideas.

At this point, it's possible to introduce them to programmable logic devices. Teach them that the logic functions they had in the ICs live inside the PLDs, and that they can program the devices rather than run wires. The problem is that this is a big, big jump up in abstraction level, especially for a kid in the middle school age bracket (which is the perfect age to introduce this stuff, as I'll go into later.)

Lee's invention, on the other hand, maintains a physical element. The programming is accomplished by manually placing diodes into a matrix, rather than typing characters on a screen then punching the 'program' button to dump it to a Flash PLD. This keeps it from getting too abstract, encourages experimentation, and maintains the hands-on element that's necessary for students in the 9-13 years age range.
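Based only on the description above, here's how such a diode matrix might be simulated in software. The 16-by-8 size comes from the text; the wired-OR behavior is my assumption about how the board works, not a spec of Lee's actual design.

```python
# A guess at the behavior of a diode-matrix programmable logic array:
# placing a diode at (row, col) connects input line `col` onto output
# line `row`, so each output row is the wired-OR of its diodes' inputs.
# This is an illustration of the general diode-matrix idea, not a
# model of Lee's specific board.

class DiodeMatrix:
    def __init__(self, rows=8, cols=16):
        self.rows, self.cols = rows, cols
        self.diodes = set()          # set of (row, col) placements

    def place(self, row, col):
        """'Program' the array by physically placing a diode."""
        self.diodes.add((row, col))

    def outputs(self, inputs):
        """inputs: list of 16 bits. Each output row goes high if any
        input column where that row has a diode is high."""
        return [int(any(inputs[c] for r, c in self.diodes if r == row))
                for row in range(self.rows)]

m = DiodeMatrix()
m.place(0, 0)    # row 0 listens to inputs 0 and 1: a 2-input OR gate
m.place(0, 1)
ins = [0] * 16
ins[1] = 1
print(m.outputs(ins)[0])   # 1: input 1 is high and row 0 has a diode there
```

The appeal of the physical version is exactly that the `place()` step isn't a line of code: it's a real diode the student drops into a real matrix, so the program is something you can see and touch.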

Building Blocks of Electronics

Electronic logic is building blocks. Your kids play with building blocks, right? They start with simple structures to learn how to build more complex structures. Before long, they can use every single piece they've got building large, complex structures. Once the individual blocks and a few simple ways of interconnecting them are understood, they can take off and make great big projects that reach to the ceiling.

It's the same with electronic logic. It's a collection of simple building blocks. The problem is, the complexity of assembly is a little greater. Enough that once you get past a certain level (I'd say 20-30 ICs), it gets progressively more difficult to implement your ideas. The ideas out-race the ability to construct.

This shouldn't be an obstacle. The ideas should be allowed to continue to grow, without removing the physical aspects that make the activity interesting.

The Lee Felsenstein Magic

Lee has hit a sweet spot here. With all the excitement about the Raspberry Pi (which I will save my criticisms of as an educational tool for a future article), Lee's project should have that sort of excitement going for it. This is about students building their own processor. This knowledge is important. This is what the people who caused the microprocessor revolution used to cause the revolution in our lives. This is the knowledge that put a CPU in your telephone, your oven, and your iron. This is what tunes your radio.

Assembling a processor from random logic is a huge project. Yes, people still do that (I've even built a very, very simple one from racks of relays, myself, under cover of testing those relay racks and their support wiring after installation.) Building your own processor with a PLD is a lot easier, once you understand the building blocks.

Lee explains himself well on his project page. Have a look. I will be following the progress of the project.

And I'm really glad I got a chance to meet up with Lee again after all these years. He was one of my mentors and inspirations in my youth, just as he describes those who mentored him. It seems to be a common thread that those of us getting older want to assist the younger generation just as we were assisted when getting started in technical pursuits (as hobbies--the jobs came later.)

And if you're raising a kid--don't just foist off software on them as something to play and "learn" with. Software isn't reality. I've designed any number of computers on paper and in software, and then gone on to build far fewer of them. Because software and paper aren't the real thing. The real thing has all sorts of little niggles and oddities that you'll never learn about in any way other than doing the real thing. Teach your kids to solder, use solderless breadboards, and use real components at all levels of complexity. Don't try to do too much at once; start with kits, then move your way toward recreating circuits on breadboards, then to soldering them on prototyping boards.

But do the real thing. Right alongside your other crafts projects. Because electronics is just as much a craft with some useful products as is crochet or embroidery (both of which I do) or quilt-making or sewing (which some of those close to me do). And most of all, have fun!

Tuesday, November 12, 2013

Ted Nelson & Computer Lib at Homebrew Computer Club Reunion

I had a great time at the Homebrew Computer Club Reunion last night, which, I learned, was made possible by a Kickstarter (thank you, KS backers!)

One of the great conversations I had there was with Ted Nelson, author of Computer Lib/Dream Machines and his wife, Marlene Mellicoat. My wife and I had a wonderful time speaking with them. Ted has published a new edition of Computer Lib. It's not a reprint from scans of the original, but a new printing from the original negatives. It's as clear and sharp as the original was back when, possibly even better. It's in the same large format, as well, not scaled down for the size of paper that happens to be cheap and convenient for most books.

Sorry about the fold; I only had my hip pocket as a place to put his flyer last night.
I was working so hard at being social last night it didn't even occur to me that I could probably have purchased a copy directly from him right then. I saw that he had a number of copies in his bag, too. It's little things like this that I always think of when people tell me how smart I am. Yeah, about some things, maybe, but about other things I'm not so much.

Nevertheless, I'm going to purchase it now, after the event. I read someone else's copy back when, having noticed it as a pillar on a bricks and boards bookshelf among a number of copies of The Fabulous Furry Freak Brothers (Fat Freddy's Cat was my favorite of the crew.) Now I'm looking forward to having a Computer Lib/Dream Machines book of my own.

If you're not familiar with Ted's work, I strongly recommend correcting that. The web could be so much more than it is, and require far less human "curation" than it does, if it hadn't turned into the mishmash mess of information without proper structure that it has become. I'd say more, but rather than reading my take on what he thinks, go to the source:

Hopefully I'll have a chance to post more about last night's event in future articles. There was so much packed into so little time that my head is still spinning from it. (They managed to recreate the atmosphere of the original meetings perfectly in that regard.)

It was great that my wife got to hear Ted speak during the formal presentation portion of the evening, too. I got to hear him speak a few times back when; he's a dynamic and engaging speaker. He makes you think about how things could be, possibilities that are better than reality. Now we have hearing Ted speak as a shared experience.

Tuesday, November 5, 2013

Hidden Ogres Spotted?

I was a backer in the Ogre Designer's Edition Kickstarter. Now that they're shipping out the rewards, I've been watching the updates closely. Today, they posted images that they claim had "stealth ogres" in them. At first I was skeptical.

Then I pulled out an old, experimental pair of multi-spectral image enhancement goggles that we were playing around with in the lab back in the late cold war boom of the 80s. You know, when we actually invented everything that Popular Science raves about being the "latest thing" now (except for the stuff that was invented in the 50s and 60s, of course).

There they were! At least two Ogres, possibly more. They have very sophisticated stealth capabilities. Have a look:

At least one Ogre (a Mark V?) concealed in a nearby pattern of shadows.

Two or more Ogres concealed in plain sight, possibly including the dreaded Ogre Mark VI!