It’s not often that a YouTube video on a technical topic gives one goosebumps. And it’s not often that someone unpacking a computer makes history.
Francois Rautenbach, a computer hardware and software engineer from Tshwane, achieved both with a series of videos he quietly posted on YouTube in 2016, which were then shared by Gadget.
The videos showed the “unboxing” of a batch of computer modules that had been found in a pile of scrap metal 40 years earlier and kept in storage ever since. Painstaking gathering of a wide range of evidence, from documents to archived films, had convinced Rautenbach he had tracked down the very first Guidance and Navigation Control computer, used on a test flight of the Saturn 1B rocket and the Apollo Command and Service Modules.
Apollo-Saturn 202, or Flight AS-202, as it was officially called, was the first to use an onboard computer – the same model that would eventually take Apollo 11 to the moon. Rautenbach argued that the computer on AS-202 was also the world’s first microcomputer. That title had been claimed for several computers made in later years, from the Datapoint 2200 built by CTC in 1970 to the Altair 8800 designed in 1974. The AS-202 flight computer goes back to the middle of the previous decade.
His video succinctly introduced the story: “On 25th August 1966, a very special computer was launched into space onboard Apollo flight AS-202. This was the first computer to use integrated circuits and the first release of the computer that took the astronauts to the moon. Until recently, the software for the Block 1 AGC (Apollo Guidance Computer) was thought to be lost…”
One can be forgiven for being sceptical, then, when he appeared on screen for the first time to say, “I’ve got here with me the software for the first microcomputer.”
Then he unwrapped the first package and said: “Guys, these modules contain the software for the first microcomputer that was ever built, that was ever used.”
The goosebumps moment came when he revealed the NASA serial number on a device called a Rope Memory Module, and declared: “These modules are the authentic flight AS-202 software modules. These were found on a rubbish dump, on a scrap metal heap, about 40 years ago … and we are going to extract the software from this module.”
In a series of three videos, he extracted the software, showed how the computer was constructed, and used a hospital X-ray machine to inspect its insides. The third video started with the kind of phrase that often sets off the hoax-detectors on social media: “Okay, so you guys won’t believe what I’ve been doing today.” But, in this case, it was almost unbelievable as Rautenbach took the viewer through a physical inspection of the first Apollo guidance computer.
How did an engineer from Tshwane stumble upon one of the great treasures of the computer age? He tended to avoid the limelight, and described himself as “a hardware/software engineer who loves working on high-velocity projects and leading small teams of motivated individuals”.
In an interview with Gadget, he said: “I am the perpetual hacker always looking for a new challenge or problem to solve. I have experience in designing digital hardware and writing everything from embedded firmware to high-level security systems. Much of the work I did over the last five years revolved around building new and creative payment solutions.”
The breadth of his work gave him the expertise to investigate, verify, and extract the magic contained in the AS-202 computer. A global network of contacts led him to the forgotten hardware, and that is when the quest began in earnest.
“I got interested in the Apollo Guidance Computer after reading a book by Frank O’Brien (The Apollo Guidance Computer: Architecture and Operation). Most of us grew up with the fallacy that the AGC was less powerful than a basic programmable calculator. I discovered that this was far from the truth and that the AGC was in fact a very powerful and capable computer.
“I started communicating with experts in the field and soon realised that there was a wealth of information available on the AGC and the Apollo space program in general.
“One day I received some photos of AGC Rope Memory modules from a friend in Houston marked ‘Flight 202’. After a little googling, I realised that these modules contained the software from Flight AS-202. As I learned more about AS-202, I discovered that this was the first time the AGC was used in an actual flight.”
Rautenbach eventually tracked down the source of the photos: a man who had picked up the entire computer, with memory modules, at an auction, as part of a three-ton lot of scrap metal.
“At one point he opened up to me and said he had other modules. He admitted he had a full Apollo guidance computer, and my theory was that it was used to develop the Apollo 11 guidance computer. He sent me more information, and I thought he had THE computer.
“He’s got all this junk in his backyard. He started selling stuff on eBay and one day got a visit from the FBI wanting to know where he got it. He was able to find the original invoice and showed it to them and they went away. But it scared him and he didn’t want to tell anyone else in the USA what he had. Not being from America was an advantage.”
Rautenbach flew to Houston last year, opened the sealed packages and filmed the process.
“This was the first microcomputer. I opened it and played with it. I realised this was the first computer that actually flew. I also found Rope Memory modules that said Flight 202, and he didn’t know what that was. I found it was from AS-202, and I said we can extract stuff from this.”
Rautenbach paid a deposit to borrow the units and have them sent to South Africa, so that he could extract and rebuild the software. He also made contact with Eldon Hall, leader of the team that developed the Apollo guidance computer and author of the 1996 book, Journey to the Moon: The History of the Apollo Guidance Computer.
The correspondence helped him verify the nature of the “scrap”. The Apollo command module from flight AS-202 was restored and is now on permanent display on the USS Hornet, the legendary aircraft carrier used to recover many Apollo command modules and now a museum. However, the computer parts were sold as scrap in 1976. And NASA never preserved a single copy of the software that had been used on its first guidance computer.
Fortunately, a sharp-eyed speculator realised the lot might contain something special. He sold off some of the scrap over the years, until that visit from the FBI. He still preferred to remain nameless.
In August 2016, on the 50th anniversary of the launch of AS-202, Rautenbach quietly began posting the evidence online. He also announced that the raw data he had extracted would be made available to anyone who wished to analyse it.
His videos on the unboxing of the AS-202 computer and the extraction of the software can be viewed on YouTube at http://bit.ly/as202, where he also planned to post instructions for accessing the software.
• Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter and Instagram on @art2gee
NASA’s description of flight AS-202 can be found at: http://nssdc.gsfc.nasa.gov/nmc/spacecraftDisplay.do?id=APST202
Technical specifications of the Apollo Guidance Computer can be found at: https://en.wikipedia.org/wiki/Apollo_Guidance_Computer
Apollo comes back to Pretoria
Francois Rautenbach pointed out that South Africa played a prominent role during the 93 minutes of flight AS-202: “Pretoria is mentioned no less than three times in the post-flight report. The AS-202 flight actually reached its highest point above South Africa. The telemetry data from the flight were recorded on computer tape at Hartebeesthoek and later shipped back to NASA.”
Data gives coaches new eyes in sports
Collecting and analysing data is entering a new era as it transforms both coaching and strategy across sports ranging from rugby to Formula 1, writes ARTHUR GOLDSTUCK
Coaches and managers have always been among the stars of any sport. They become household names as much as the sports heroes that populate their teams. Now, thanks to the power of data collection and analysis, they are about to raise their game to unprecedented levels.
The evolution of data for fine-tuning sports performance has already been experienced in Formula 1 racing, baseball and American football. All are known for the massive amounts of statistics they produce. Typically, however, these were jealously guarded by coaches trying to get an edge over their rivals. Thanks to the science of “big data”, that has changed dramatically.
“American baseball has the most sophisticated data science analytics of any sport in the world because baseball has this long history of stats,” said Ariel Kelman, vice president of worldwide marketing at Amazon Web Services (AWS), the cloud computing giant that is working closely with sports teams and leagues around the world. “It’s an incredibly opaque world. I’ve tried for many years to try and get the teams to talk about it, but it’s their secret sauce and some of these teams have eight, nine or ten data scientists.”
In an interview during the AWS Re:Invent conference in Las Vegas last week, Kelman said that this statistical advantage was not lost on other sports, where forward-thinking coaches fully understood the benefits. In particular, American football, through the National Football League there, was coming on board in a big way.
“The reason they were behind is they didn’t have the player tracking data until recently in the NFL. They only got the player tracking data three years ago. Now the teams are really investing in it. We did an announcement with the Seattle Seahawks earlier this week; they chose us as their machine learning, data science and cloud provider to do this kind of analysis to help figure out their game strategy.
“They are building models predicting the other teams and looking at players and also evaluating all their practices. They are setting up computer vision systems so that they can track the performance of the players during their practices and have that inform some of the game strategies. The teams then even talk about using it for player evaluation, for example trying to figure out how much should we pay this player.”
Illustrating the trend, during Re:Invent, Kelman hosted a panel discussion featuring Rob Smedley, a technical consultant to Formula 1, Cris Collinsworth, a former professional footballer in the NFL and now a renowned broadcaster, and Jason Healey, performance analytics manager at New Zealand Rugby.
Healey in particular represents the extent to which data analysis has crossed sporting codes. He has spent four years with the All Blacks, after 10 years with the New Zealand Olympic Committee, helping athletes prepare for the Olympic Games.
“The game of rugby is chaos,” he told the audience. “There’s a lot of things going on. There’s a lot of trauma and violence and it can be difficult to work out the load management of each player. So data collection is a big piece of the technical understanding of the game.
“A problem for us in rugby is the ability to recall what happened. We have to identify what’s situational and what’s systemic. The situational thing that happens, which is very unlikely to be replicated, gets a lot of attention in rugby. That’s the sensational big moment in the game that gets talked about. But it’s the systemic plays and the systemic actions of players that lie underneath the performance. That’s where the big data starts to really provide some powerful answers.
“Coaches have to move away from those sensational and situational moments. We’re trying to get them to learn what is happening at that systemic level, what is actually happening in the game. How do we adjust? How do we make our decisions? What technical and defensive strategies need to change according to the data?”
Healey said AWS was providing platforms for tracking players and analysing patterns, but the challenge was to bring people on this technology journey.
“We’re asking our coaching staff to change the way they have traditionally worked, by realising that this data does give insights into how they make their decisions.”
Kelman agreed this was an obstacle, not just in sport, but in all sectors.
“Across all of our customers, in all industries, one of the things that’s often underestimated the most is that getting the technology working is only the first step. You have to figure out how to integrate it with the processes that us humans, who dislike change, work with. The vast majority of it is about building knowledge. There’s ways to transfer that learning to performance.”
Of course, data analytics does not guarantee victory for any side, as the All Blacks discovered during the recent Rugby World Cup, when they were knocked out in the semi-finals, and South Africa went on to win. We asked Healey how the data-poor South Africans succeeded where the data-rich All Blacks couldn’t.
“You have to look at how analytics and insights and all these technologies are available to all the coaches these days,” he said. “The piece that often gets missed is the people piece. It’s the transformation of learning that goes into the player’s actual performance on the field. We’re providing them with a platform and the information, but the players have to make the decisions. We can’t say that this particular piece of technology played a role in winning or losing. It’s simply a tool.”
The same challenge faces motor racing, which generates massive amounts of data through numerous sensors and cameras mounted in vehicles. Rob Smedley, who spent 25 years working in engineering roles for Formula 1 teams, quipped that his sport had a “big data” problem before the phrase was invented.
“We’ve always been very obsessive about data. Take car telemetry, where we’ve got something like 200 to 300 sensors on the car itself. And that goes into something like two to three thousand data channels. So we’re talking about around 600 Gigabytes of data generated every single lap, per car.
“On top of that, we’ve also got all the time data and GPS data. The teams are using it for performance advantage. We’re into such marginal gains now because there are no bad teams in Formula 1 anymore. Data analytics provides those marginal gains.”
• Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter and Instagram on @art2gee
IoT faces 5-year gap
In five years, the world will have more than 40 billion connected devices. Locally, IoT specialist Eseye says that South African CIOs are recognising IoT (Internet of Things) and M2M (Machine to Machine) technologies as strategic imperatives, but the journey is still in its infancy.
“As legacy systems start to reach end of life, digital shifts will become inevitable. This, coupled with an increasing demand for improved bottom line results from existing and new markets, makes IoT a more viable option over the next five years. This is particularly prevalent in manufacturing, especially where time to market and product diversification has become necessary for business survival,” says Jeremy Potgieter, Regional Director – Africa, Eseye.
He says that within this sector one thing matters – output: “Fulfilling the product to market lifecycle is what makes a manufacturer successful. Addressing this functionality and production optimisation through technology is becoming more critical as they focus on increasing output and reducing downtime. By monitoring machinery and components in the production line, any concerns that arise, which impact both the manufacturer and consumers alike, will be more efficiently dealt with by using an IoT approach.”
Potgieter says that there is also the growing strategic approach to increase the bottom line through new markets. As manufacturers seek new revenue streams, Eseye is encouraging the use of rapid IoT-enabled device product development: “By addressing the connectivity aspects required at deployment, manufacturers are immediately diversifying their portfolios. Eseye, as an enabler, assists by providing market ready SIMs, which can be embedded into IoT connected devices at OEM level, connecting them to a plethora of services (as designed for) upon entry to market, anywhere in the world.”
In addition, Potgieter says that organisations are increasingly looking towards IoT connectivity managed services to capitalise on specialist expertise and ensure the devices are proactively monitored and managed for maximum uptime, while reducing data costs.
Impacting IoT adoption, though, is undoubtedly the network infrastructure required. Potgieter says that this varies significantly and will depend on criteria such as sensor types and corresponding measurements, the overall communication protocols, data volume, response time, and analytics required: “While the majority of IoT implementations can be enabled using cloud-based IoT platform solutions, the infrastructure required still remains important. A cloud platform will simplify infrastructure design and enable easy scaling capability, while also reducing security and data analytics implementation issues.”