Holy hell yeah you did. How would you go about doing that in a single expression? A bunch of back references to figure out the country? What if that’s not included? Oy.
Softly. With their words.
My sense in reading the article was not that the author thinks artificial general intelligence is impossible, but that we’re a lot farther away from it than recent events might lead you to believe. The whole article is about the human tendency to conflate language ability and intelligence, and the author is making the argument both that natural language does not imply understanding of meaning and that those financially invested in current “AI” benefit from the popular assumption that it does. The appearance or perception of intelligence increases the market value of AIs, even if what they’re doing is more analogous to the actions of a very sophisticated parrot.
Edit: all of which is to say, I don’t think the article is asserting that true AI is impossible, just that there’s a lot more to it than smooth language usage. I don’t think she’d say never, but rather that there’s a lot more to figure out—a good deal more than some seem to think—before we get Skynet.
Oh man Garak is one of the best characters in Trek. And that’s a competitive list.