As a retired computer programmer, manager, director, and analyst (I did a little bit of everything over my 35 years or so as a professional in the field), I have been following the saga of the Boeing 737 Max 8 crashes with special interest. I have been doing so because all evidence now points to software that Boeing designed to overcome the plane's tendency to nose up during takeoff. That tendency arose because the newer, larger engines had to be moved forward and up on the wings to accommodate them, so that Boeing would not have to redesign the landing-gear height and wing design of the existing 737.
Boeing apparently was allowed by the FAA to self-certify the safety of this design and of the software installed to help manage what they knew would be a problem. And from reports I have read, Boeing changed the specifications of the software after the initial specs were done, in a way that let the software effectively reset itself after each activation. When it sensed a problem, it entered a repeating cycle of trim commands forcing the nose down, which pilots had no time to react to or could not intervene to stop. This was exacerbated by the apparent lack of proper training for pilots in what to expect from this automated "safety" feature.
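The failure mode described above, a system that resets itself after each activation and keeps commanding nose-down trim on a bad sensor reading, can be sketched in a toy simulation. Everything here (function names, thresholds, trim values) is invented purely for illustration and bears no relation to the actual MCAS implementation:

```python
# A simplified, hypothetical illustration (NOT Boeing's actual logic) of why
# an automated trim system that re-arms after every activation is dangerous
# when it trusts a single faulty sensor. All numbers are made up.

def run_trim_system(aoa_readings, trim_per_activation=2.5, max_trim=10.0):
    """Simulate repeated nose-down trim commands driven by angle-of-attack
    (AoA) readings. Thresholds and units are invented for illustration."""
    total_trim = 0.0
    activations = 0
    for aoa in aoa_readings:
        if aoa > 15.0:  # sensor claims the nose is too high
            total_trim = min(total_trim + trim_per_activation, max_trim)
            activations += 1
            # The system "resets" here and will fire again on the very next
            # reading -- there is no cumulative limit across activations.
    return activations, total_trim

# One stuck sensor repeating an impossibly high AoA triggers the system on
# every cycle, steadily driving trim toward its mechanical limit.
faulty_sensor = [74.5] * 6  # same stuck reading, cycle after cycle
activations, trim = run_trim_system(faulty_sensor)
```

The point of the sketch is that each individual activation looks bounded and reasonable, but because the system re-arms indefinitely, a persistent sensor fault accumulates into a full nose-down condition faster than a human can counteract it.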
The following quote is from a long newsletter I receive by email from the SANS Institute, which runs a well-known and respected operation, the Internet Storm Center.
The SANS Institute is an organisation established in 1989 that specialises in information security and security management (see http://www.sans.org/aboutsans.php). SANS stands for SysAdmin, Audit, Network, Security. It is involved in research, training, certification, and the organisation of security industry events.
Boeing Self-Certified Safety of New Flight Software
(March 16 & 19, 2019)
Boeing’s Maneuvering Characteristics Augmentation System (MCAS) flight software, which is increasingly looking like a major factor in the crashes, five months apart, of two Boeing 737 Max aircraft, was certified by Boeing itself. The US Federal Aviation Administration (FAA) delegated that responsibility to the manufacturer because, in the words of an FAA safety official, “it would be detrimental to our competitiveness if foreign manufacturers are able to move improved products into the marketplace more quickly.” Boeing did not train pilots on the new software features and “regulators agreed that it was a derivative model and that it didn’t require additional simulator training.”
Editor's Note
[Pescatore]
I’m not even going to attempt to comment on the complex aircraft safety issues involved here, but a couple of quotes in this story leap out as far as “lessons learned” that can be applied to making arguments to management about cybersecurity needs: (1) Boeing is quoted as saying the FAA “…concluded that it met all certification and regulatory requirements” and the Boeing System Safety Analysis “…concluded that the system complied with all applicable FAA regulations.” Sounds very similar to the common post breach statements of “We were PCI compliant, even though 100M customer accounts were compromised.” Compliant is not safe or secure. (2) Software automation was assumed to provide benefits without requiring training of the human experts on how to handle the inevitable cases where the software wasn’t working right. In cybersecurity, when product/services claim zero false negatives but *never* mention false positives, training is required on how to deal with potential false positives before taking action that will cause business impact.
[Murray]
As with most catastrophic accidents, this one is likely to prove to involve a combination of contributing factors. The lesson for IT developers is to identify all possible failure modes for one’s system (including “other”), what evidence of the failure the operator or manager will see, and what corrective action they must be prepared to take. That said, all software developers should aspire to the record for quality of Boeing and Airbus.
[Honan]
For too long we have let manufacturers determine the reliability and security of their software and systems through self-certification. The "trust us; it works" approach to software engineering has to end. I am glad to see that Members of the European Parliament adopted the European Cybersecurity Act including “the first EU-wide cybersecurity certification framework to ensure a common cybersecurity certification approach in the European internal market and ultimately improve cybersecurity in a broad range of digital products (e.g. Internet of Things) and services." ec.europa.eu: The Cybersecurity Act strengthens Europe's cybersecurity
I am posting this because I think the SANS brief news item, followed by commentary from contributing editors, gets to the heart of the crisis we confront as we rush into an era of increasingly complex computerized systems, including the Internet of Things, automated cars, and increasingly automated systems flying our airplanes, systems of such complexity that it is almost impossible to adequately test every logic path before production. Complicating this is the fact that many of these systems have not been designed with good security baked in; in fact, too many have again and again been proven woefully lacking in security. But that is another story that needs to be dealt with.
Here is a link provided in the SANS NewsBites newsletter where you can view the entire newsletter online as a web page. It is always worth reading.
NewsBites: Annotated News Update from the Leader in Information Security Training, Certification and Research