DoD's Augmented Cognition plans – EPCOT meets PK Dick

The Department of Defense's "Augmented Cognition" video is supposed to represent a plausible scenario for a human-computer interface that uses EEG and other sensors to figure out what to feed operators, allowing teams to do fast analysis of giant amounts of data.

The video is a 1960s jetpack-future film rendered with 21st-century corporate video production techniques. Everything just works, in a way that's antithetical to how heterogeneous data sources actually behave: no weird Babelfish translations, no kooky spam results in the top ten listings. Star Wars and Blade Runner and Alien showed us technology that looked as though people actually used it: dented and rusted and sometimes badly fitting. The future depicted here is straight out of Epcot Center, seen from a few hundred yards away, far enough that you can't see the duct tape holding it all together. This system seems to be built out of monolithic pieces, tightly coupled; not small pieces, loosely joined.

The technology is kooky and interesting. A mind-reading tiara figures out how confused you are and takes stuff off your screen until you're less confused. If you go critical, it plays you soothing hypnagogic music. I don't know that this would actually work, but like most feedback mechanisms, I think it would inspire me to figure out how to fool the machine into keeping the maximum amount of info visible at all times. This would be an excellent neural training device for being overloaded without lighting up the "I'm overloaded" bits of my brain.
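To make that feedback loop concrete, here's a minimal Python sketch of the mitigation scheme as described above. Everything in it is an assumption on my part: read_workload(), the 0-to-1 workload scale, and the OVERLOAD/RECOVERED thresholds are hypothetical stand-ins, not anything from the video or a real AugCog system.

    # Hypothetical sketch of the closed-loop "mitigation" idea: measure
    # cognitive load, hide UI panels when the operator is overloaded,
    # restore them once the operator recovers. None of these names or
    # thresholds come from a real AugCog API.

    import random
    import time

    OVERLOAD = 0.8    # above this, start taking stuff off the screen
    RECOVERED = 0.5   # below this, start restoring hidden panels

    def read_workload() -> float:
        """Stand-in for an EEG-derived cognitive-load score in [0, 1]."""
        return random.random()

    def regulate(panels: list[str]) -> None:
        hidden: list[str] = []
        for _ in range(20):                  # one "tick" per second
            load = read_workload()
            if load > OVERLOAD and panels:
                hidden.append(panels.pop())  # take stuff off the screen
                print(f"load={load:.2f}  hiding {hidden[-1]}, cueing calming audio")
            elif load < RECOVERED and hidden:
                panels.append(hidden.pop())  # operator recovered: restore
                print(f"load={load:.2f}  restoring {panels[-1]}")
            time.sleep(1)

    regulate(["chat", "map overlay", "sensor feed", "tasking queue"])

The gaming problem falls right out of this loop: once you learn what pushes read_workload() over the threshold, you can train yourself to idle just under it while staying fully loaded up.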

Link

(via Beyond the Beyond)

Update: Noah sez, "I spent a *bunch* of time with AugCog researchers last fall. The result: this Wired News article, which came out last week. Mind-reading killer drone controllers, anyone?"