The view that fiction belongs on modern military reading lists is becoming mainstream. One only need look at the titles on the reading lists put out by US Special Operations Command or the senior officers of the Navy and Marine Corps to see that it has a valued place in military professional development. And this is not limited to the Iliad and the Odyssey. Fiction, and specifically science fiction and future-war fiction, is going mainstream in Western militaries.
There are myriad ways to try to understand how robotics and autonomy will change warfare. The more creative, such as theater, the better. The short story remains one of my favorite ways to work through these kinds of questions, particularly because a carefully written narrative can help check our assumptions and biases about how we want things to unfold versus how they actually might. My latest military future fiction short story is Operation CANDLEMAKER. It follows two frontline characters present when US forces employ autonomous weapons in combat for the first time. My favorite feedback about the story so far is that “the laws of physics and Murphy prevail.” High praise indeed.
After seventeen years in the US Navy, Commander Wayne McCabe got seasick for the first time when a robot had the helm.
Technically, there was no actual metal humanoid at the controls because the 130-foot Sea Hunter-class trimaran warship was driving itself, six miles south of Jazireh-ye Larak in the Strait of Hormuz. McCabe ground his teeth as he fought the urge to throw up yet again and wondered what he was really doing aboard the USS Nantucket. McCabe adjusted the five-point harness on the captain’s chair by feel and looked at the spot on the console in front of him where the ship’s chief engineer had duct-taped a red “NO” plastic button from a party store. Just out of reach. Fitting.
If McCabe hadn’t been aboard, then it would have essentially been a ghost ship. The nine other Sea Hunter-class ships in his squadron were unmanned and were the only ships in the mine-laden waters, making him the sole American sailor in the entire strait. The ships ran as close to silent as possible, communicating just by laser burst. They kept watch using infrared search and tracking sensors that flew like parasails 1,000 feet above the ship. In the middle of this summer night, the Nantucket was all but invisible.
At least it was cool, if not cold, sitting in the “fridge,” as he had jokingly called the bridge because of the onboard air conditioning constantly battling to keep the floating computer within its optimum operating range. He wore a tan aviator’s flight suit and augmented-reality (AR) helmet, deepening his sense of irony over his lack of control. This deployment was going to be hard to explain to the kids; he was aboard the Nantucket, at the cutting edge of naval warfare, but he was no more than a passenger. He was technically in command of the entire squadron, yet practically, he was in charge of nothing. But you couldn’t court-martial an algorithm, so the Navy brass had to keep a human “in the loop” in case things went awry with the onboard autonomous combat system.
It was a packed house, just not the usual crowd for a think tank event.
But last week in London, an unusual evening of theater and discussion about artificial intelligence and the future of conflict brought together more than 200 people, including actors and art students, military and civilian government officials, tech and defense industry, among others.
The event, “Staging the Future: Artificial Intelligence and Conflict,” was put on by the Atlantic Council and the Royal United Services Institute, in partnership with Central St. Martins and the Platform Theatre. There are myriad efforts underway to better understand, and prepare for, a future in which computers and other machines can operate with human-like reasoned judgment and individual initiative, but many of these reports and conferences overlook the crucial questions of the human element. Because theater is an inherently analog, and live, activity, it focuses the audience’s attention on the actors on stage.
Among all the voices to consider in the debate over what role lethal autonomous capabilities should play in military and security systems, the very people who dream up and create science-fiction realities are the clearest in articulating the risks of robots run amok or even more devastating human-created technological disasters. The latest letter from 116 senior robotics and AI leaders cautions against the use of artificial intelligence in the defense domain, arguing humanity is at a point of no return. “We do not have long to act. Once this Pandora’s box is opened, it will be hard to close,” they wrote.
The problem, however, is that this is an era when civilian technology innovation outstrips what is conjured up in government labs. The global “AI” revolution is already underway, and its impact will certainly shape future conflict. Don’t expect a Terminator reboot. Pandora’s box may then be the last one to be opened, as Facebook, Google, Baidu, Alibaba, Uber, and scores of other companies have already lifted the lids on what is possible with machine learning software and robotics, because generational, society-changing economic potential is on the line. So much so that the US wants to block Chinese investment in related technologies in certain cases. As I told RealClear Defense …
August Cole, a senior fellow at the Atlantic Council and writer at the consulting firm Avascent, said the concerns raised by tech leaders on autonomous weapons are valid, but a ban is unrealistic. “Given the proliferation of civilian machine learning and autonomy advances in everything from cars to finance to social media, a prohibition won’t work,” he said.
Setting limits on technology ultimately would hurt the military, which depends on commercial innovations, said Cole. “What needs to develop is an international legal, moral and ethical framework. … But given the unrelenting speed of commercial breakthroughs in AI, robotics and machine learning, this may be a taller order than asking for an outright ban on autonomous weapons.”
As parents well know, the terrible twos are so named for a reason. For authors, publishing-date anniversaries, or book birthdays, can be equally tricky at the age when a book’s launch buzz is long forgotten and the weight of creative expectations has moved from crawling underfoot to fluid running.
There might be tantrums.
Or cake and candles.
With Ghost Fleet, the summer of 2017 marked the two-year anniversary since the novel’s launch on June 30, 2015. The book’s recent addition to the Commander of US Special Operations Command and the Chief of Staff of the US Army professional reading lists means the book’s “twos” aren’t terrible at all. It takes time to build an audience, particularly as readers are busier and busier. The hope is that the connection with characters, stories and concepts continues to spread from person to person, organization to organization with increasing urgency and enthusiasm. The book is already on myriad military reading lists, but seeing it still being endorsed as professionally relevant makes a parent proud.
On the Army’s list, Ghost Fleet joined a short list of esteemed fiction titles, including Gates of Fire by Steven Pressfield, Matterhorn by Karl Marlantes and Virgil’s The Aeneid. “Each of us faces busy schedules every day and finding time to read and think is a recurring challenge. But even as we train our units and physically condition our bodies, we must improve our minds through reading and critical thinking,” wrote Gen. Mark Milley, the 39th Chief of Staff of the Army, in his preface to the list.
On the concise SOCOM reading list, Ghost Fleet can be found under “Disruptive Technology” alongside 3D Printing Will Rock The World by John Hornick and The Red Web: The Struggle Between Russia’s Digital Dictators and the New Online Revolutionaries by Andrei Soldatov and Irina Borogan. It is the sole fiction title.
By next summer, work on the next book will be well underway and I hope there will be time and occasion, once again, to celebrate another Ghost Fleet birthday.
The audience of venture capitalists, engineers and other tech-sector denizens chuckled as they watched a video clip of an engineer using a hockey stick to shove a box away from the Atlas robot that was trying to pick it up. Each time the humanoid robot lumbered forward, its objective moved out of reach. From my vantage point at the back of the room, the audience’s reaction to the situation began to sound uneasy, as if the engineer’s actions and their invention’s response had crossed some imaginary line.
If these tech mavens aren’t sure how to respond to increasingly life-like robots and artificial intelligence systems, I wondered, what are we in the defense community missing?
Throughout my reading life, I’ve picked books for lots of reasons. I can remember going to Tower Books in Seattle when I was in elementary school and perusing the sci-fi section, discovering David Drake’s Hammer’s Slammers on the strength of its cover art alone. I did that a lot. Other times, I’ve had books recommended to me with heartfelt conviction, like when a former Navy SEAL first told me about Gates of Fire and the work of Steven Pressfield. More recently, the suggestion to read the novel Room came during a conversation on how I could get better at character development while speaking with Ken Liu, a writer whose prose and translation—and work ethic—are indomitable.
Today there is growing acceptance that fiction belongs on military reading lists, and it is leading to some outstanding suggestions. A great novel or short story (particularly sci-fi) pushes us to confront our assumptions, helps us understand other perspectives, and stokes our imagination in ways that nonfiction cannot. In particular, dystopian sci-fi stories have their place on these lists for their cautionary value in an era when technology’s downsides can sometimes only be revealed after calamity. Terrible times often produce the most memorable heroes.
These are among my favorite fiction titles that I’ve read (or re-read as in the case of The Profession) recently. As a package, these books complement each other for their exploration of everything from human migration and trafficking to political collapse to narco superpowers to private armies. Plus, they all have great covers.
Peruse the shelves at any bookstore and one section sure to be well stocked is books on writing. There are notable titles, from rule books such as The Elements of Style to the inspirational and craft-oriented Naming the World, that are well suited to equipping writers with the tools to put their words on the page as effectively as possible.
There are other types of writing books, though, that pull back the curtain on what it is like to write professionally. Stephen King’s On Writing: A Memoir of the Craft stands out, and Ray Bradbury’s Zen in the Art of Writing endures as both entertaining and useful.
Add to the list two more worthy reads. These recent autobiographical books hew to the “show, don’t tell” adage and reveal the stories behind their authors’ lives: Steven Pressfield’s The Knowledge and John le Carré’s The Pigeon Tunnel. While these books are rooted in the past, their value to future-minded writers is in understanding the importance of personal stories in creating inspiring characters and credible settings perched on the knife edge between fact and fiction. This is perhaps even more important in tales that can be easily overrun with technology and fantasy.
Executive orders from the White House so far appear to be the readiest arrow in the Trump administration’s quiver for hot-button, high-stakes political issues like Middle East refugees and cutting federal red tape. Fired recklessly, without counsel or apparent expert advice, they are sure to sow chaos and discord, likely by design. What will happen when the administration’s missives start to address the outstanding military and strategic questions about game-changing battlefield advances like AI and robotics? It is worth re-reading the Politico story from late December, “Killer Robots Await Trump’s Verdict,” which tackled this question before the tumult of Inauguration Day and the reshuffling of the National Security Council that favors politicking over military and intelligence acumen (the Chairman of the Joint Chiefs of Staff and the Director of National Intelligence are no longer deemed essential to meetings of top officials during a crisis, but top political advisors are). Read the Politico story.
As I said in the story, “We’re on the doorstep of what armed conflict looks like in the 21st century” and robotics and autonomy are going to play decisive roles in the air, on the ground, under the sea and in cyberspace. What that role is depends in large part on the initiatives of the Trump administration — or how they respond to other nations and groups who use these capabilities first.
During a recent podcast with Army Capt. Jake Miraldi of West Point’s Modern War Institute, we got into a range of future-conflict questions: who will lead innovation around AI and autonomy, what we will do with bad advice from machines, whether technology disruption will shock the US military, and whether an algorithm might one day be writing my novels for me (and doing a better job…). Listen to the MWI podcast “Autonomy on the Battlefield.”
As the Trump administration gets down to work making its mark on the first 100 days in office, its members would be wise to remember that the next national security risks are so new they’re almost impossible to comprehend, let alone see.
This is where science fiction comes in. The right novel or short story can bring existential threats down to Earth, and make them seem solvable with the right recipe of science, heroes, and villains.