Leave a review for our podcast & we'll send you a pack of infosec cards.
Critical systems once operated by humans now depend increasingly on code and the developers who write it. Automation brings many benefits, such as increased productivity, quality, and predictability.
But when websites crash, 911 systems go down, or radiation-therapy machines kill patients because of a software error, it's vital that we rethink our relationship with code, as well as the moral obligations of machines and humans.
Should developers who create software that impacts humans be required to complete "do no harm" ethics training? Should we begin measuring developers not only by the functionality they create, but also by the security and moral frameworks they're able to provide?
Other articles discussed:
- The Parable Of The Paperclip Maximizer
- The KRACK Attack
- The Trolley Problem
- Secret Database Hack
- Cryptographic failure
Tool of the week: Assemblyline. Files go in, and a handful of small helper applications automatically comb through each one in search of malicious clues.
Panelists: Kilian Englert, Kris Keyser, Mike Buckbee