General Discussion
The Pentagon Inches Toward Letting AI Control Weapons
Link to tweet
https://www.wired.com/story/pentagon-inches-toward-letting-ai-control-weapons
Last August, several dozen military drones and tanklike robots took to the skies and roads 40 miles south of Seattle. Their mission: Find terrorists suspected of hiding among several buildings.
So many robots were involved in the operation that no human operator could keep a close eye on all of them. So they were given instructions to find, and eliminate, enemy combatants when necessary.
The mission was just an exercise, organized by the Defense Advanced Research Projects Agency, a blue-sky research division of the Pentagon; the robots were armed with nothing more lethal than radio transmitters designed to simulate interactions with both friendly and enemy robots.
The drill was one of several conducted last summer to test how artificial intelligence could help expand the use of automation in military systems, including in scenarios that are too complex and fast-moving for humans to make every critical decision. The demonstrations also reflect a subtle shift in the Pentagon's thinking about autonomous weapons, as it becomes clearer that machines can outperform humans at parsing complex situations or operating at high speed.
*snip*
It's almost like these people have never seen The Terminator.
Spider Jerusalem
(21,786 posts)
What if all those captchas were training AI for military targeting?
sarisataka
(18,483 posts)
Star Trek or any of several other shows and movies...
PJMcK
(21,995 posts)
No kidding, many of the advances in science and technology have been prophesied by science fiction writing and films (which are also written, of course).
Since Artificial Intelligence is so profoundly opaque and not understood, these steps by the military need to be very carefully implemented.
After all, life is often stranger than fiction.
bluewater
(5,376 posts)
Manned fighter aircraft will be going the way of WWI biplanes; the vaunted F-35 will be obsolete within 7 years.
Those doglike robots are THIS close to becoming autonomous, machine-gun-and-bomb-toting death machines.
Things are changing that fast.
What a wonderful world, eh?
Happy Hoosier
(7,215 posts)
I can say that 7 years is a pipe dream.
Automation in aircraft is complex, even for some relatively simple tasks. We can (and have) automated some things. But full tactical automation is still a ways off.
The interim step is something we call the "loyal wingman" model, where a manned fighter has control of one to three unmanned air vehicles for surveillance, target designation, or SEAD missions.
bluewater
(5,376 posts)
I take your point, but technology is prone to abrupt and immense acceleration, especially military tech.
A drone swarm can be effective in combat because quantity has a quality all its own, to quote Joseph Stalin.
Programming a drone swarm would actually be easier than trying to program a single aircraft to replace a human pilot, because swarm tactics can be more basic.
But OK, call it 10 years away.
Joking. Mostly. Well, not really. The future is coming fast.
Thanks for the discussion.
And you are right, those wingman jet drones are exactly the intermediate step that will have immense impact very soon. Those types of supersonic jet drones, not the toy drones people are used to seeing, are exactly what I see replacing manned fighters. No need to train pilots, the ability to pull more g-forces, a probably smaller cross-section, and designs that are cheaper and disposable in battle make them inevitable. IMHO. lol
Happy Hoosier
(7,215 posts)
I do not disagree in general, but as usual, the technical press is missing nuance.
bluewater
(5,376 posts)
We are not actually disagreeing (much); I am just being more lighthearted about it.
Thanks for the discussion.
Happy Hoosier
(7,215 posts)
The trick is always judgment and intuition. Real pilots really do develop an "instinct." Even with machine learning, we find that hard to replicate, at least without creating a potentially dangerous situation. I suspect the "loyal wingman" phase will last a long time, although the next generation of fighter pilots may be more about managing loyal wingmen than yanking and banking themselves.
bluewater
(5,376 posts)
First, thanks for a good discussion.
But what do you define as a "long time"? Serious question. 10 years? 20? 50?
My personal belief is that AI technology will advance incredibly rapidly, making autonomous AI combat drones possible in 10 years.
I could be wrong, but remember that just prior to WWII people thought battleships would be around for a long time too.
Heck, it wasn't that long ago people thought a supercomputer would never beat a human grandmaster at chess or Go. Now AI engines on commercial PCs can do that.
Military technology advances incredibly fast once an arms race commences, and I believe we are entering an arms race with China.
Cheers.
Happy Hoosier
(7,215 posts)
It just takes so long for new technology to become operational in the DoD. One project I am working on has had the basic technology proven for 20 years. We are JUST NOW about to flight test a production-quality prototype.
So once the basic tech is sound and reliable, tack on at least 20 years.
Response to Nevilledog (Original post)
Chin music This message was self-deleted by its author.
Happy Hoosier
(7,215 posts)
We already have some automated weapons systems, like the CIWS, which protects ships from aircraft and missiles. It can acquire, track, and engage targets when it is authorized to do so. It's been around for 40 years.
bluewater
(5,376 posts)
The future is coming and we can't stop it.
Happy Hoosier
(7,215 posts)
... but the issues are not as cut and dried as the article suggests.
I work in this area (kinda... I work primarily in automated navigation, not tactical engagement) and although it's fairly easy to deploy a simplistic capability, they are generally very vulnerable.
roamer65
(36,744 posts)
BannonsLiver
(16,294 posts)
bluewater
(5,376 posts)
The future is what the future is.
Wounded Bear
(58,598 posts)people
(622 posts)
a drone blowing up an aid worker and his family, including several children, in Afghanistan in our country's last few days there -- mistaking him for a terrorist with a bomb? What about the drone several years ago that mistakenly targeted a wedding party in Afghanistan?
robbob
(3,522 posts)
Those are drones controlled by a human operator. Some of these drone operators have killed literally hundreds of people, and that job takes a heavy mental toll on them. I believe there was a documentary about this a couple of years ago.
marie999
(3,334 posts)
where AI believes that we are worse than our enemy, they blow us up instead.
Amishman
(5,554 posts)
A complex script is not artificial intelligence, and it lacks any abstract decision-making ability.
bluewater
(5,376 posts)
But let's not be dissing AI as just "complex scripts"; AI is more than that.
Hey, thinking about it, I am not sure most human decision making isn't just a "complex script" too.
Bettie
(16,069 posts)
So, so much could go wrong.
getagrip_already
(14,618 posts)
To get an idea of the complexity of a swarm system. Fascinating, but scary.
You could no more defend against a swarm of small drones than you could against a swarm of angry hornets.
They react individually, but follow the pattern of the collective.
Soon? Look how quickly more conventional drones found their way into combat. Or helicopters. Or tanks. Or missiles.
The adoption and deployment rate is fairly quick, especially when they are net new and don't have to replace or modify an existing weapon system.
Coming to a sky (hopefully not) near you.