Keep an eye out for the conjunction of AI and weapons.

The AI industry is largely fixated on the near-term commercial opportunities of AI development. That is fine as far as it goes, but it means turning a blind eye to a particular set of existential risks arising from the combination of AI and weapons. Weaponry is too emotive a topic for many in the AI sector to consider: some would rather pretend it does not exist, and certainly want nothing to do with it. Shades of the epic movie A Few Good Men. So, what do we all think of lethal autonomous weapons? What do we think of introducing AI into the decision-making around launching nuclear weapons? The easy answer is that we hate both ideas. But what if we develop neither capability and our enemies do? This is not a question the AI industry will be able to duck forever.

Link to report:

You may also like to read other AI papers: