Technological Firestarters

Steven D Marlow
5 min read · Mar 22, 2022

This might be a career-ending post, such as it is.

I’m just done with it all. Not with what Artificial Intelligence is meant to represent, but with the people who claim to represent it. Yann “It’s the media that called my DL work AI 10 years ago, but now I’m in the media claiming to know the one true path in AI, hoping no one notices this major rebranding of my work because DL isn’t enough” LeCun is only the final straw. Elon Musk and his cult of personality basically command that anyone in tech writing about him not see anyone else in the room. Anyone who brings groundbreaking improvements to Tesla and its Full Self-Driving won’t even get a mention in the history books, because the headlines will just be “Elon solves AI challenge” or “Elon proves detractors wrong.”

Prior to LeCun’s new “vision for human-level AI” shtick, there was Ben Goertzel. As the ML/DL crowd began to pull all of the oxygen out of the room, Ben and others rebranded the core goal of AI as Artificial General Intelligence, which was mocked or dismissed by “AI” leaders who were not really doing AI research (just ML product development, or playing a game of SOTA one-upmanship). For all those on the fringe, not part of the big-tent events, AGI seemed like the only shelter. Then, for whatever reason, Ben created the company TrueAGI, offering “enterprise-level” services. How anyone working toward AGI didn’t take that as a huge betrayal is beyond me. He sold everyone out.

Artificial Intelligence, if we go back to the beginning, was meant to be a new multidisciplinary field of research that borrowed people and ideas from other areas, working on something that was going to require many different perspectives to solve: getting computing machines to think in a way that matched or mimicked human-level thinking.

That dream is dead.

It would be easy to blame corporate interests, with their desire to own the market and block competition, but so far the hundreds of billions of dollars that have gone into “research” haven’t resulted in anything fundamentally new. We can say that “AI products” often serve as data-collection portals that feed back into “research,” and that DATA is the real market.

No, the issue is with the people and the culture. Pride. Ego. Narcissism. Those at the top of the field refuse to give even an inch of ground, even as the hill they dominate fails to move them any higher. When it looks like others are starting to make progress climbing a larger hill, they use their power and influence to swoop down and claim that territory for themselves. Leaders, but only by decree. Only by what they can take from others.

Yes, I know the counter-argument. Put up or shut up (and that’s a direct quote from a Turing Award winner). Given that it costs millions of dollars just to repeat the “success” of others, and that “leaders in the field” have implicitly denied funding to efforts that don’t carry their seal of approval, I’d say that counts as a ladder-pull moment. I got my funding, now try and get your own (as I bad-mouth your position). Of course, the counter-argument to that is “DL had to struggle at first, and was maligned, so stop crying about it.”

I wouldn’t want to reduce everything down to a giant tabs-versus-spaces issue, because reductionism is one of the concepts that splits the community. Though, if asked to name all the little fault lines, all the diametric positions, I’m not sure I could. That seems to point to AI being pure philosophy, not actual science, even though people have been swinging at the piñata for over 60 years.

My personal position is that Machine Learning is only looking at the last mile. It doesn’t matter if you want to say it’s rooted in math, or neuroscience, or physics, because the problem space is wrong. When I write Every Good Boy Does Fine, most of you will recognize that as a mnemonic for those learning music notation. A large language model might tell you that “the mnemonics are words assigned to the letters E, G, B, D, and F, which are (of course) the lines of the treble clef staff” because that’s basically something I copy-pasted from a website just now. That’s the big game in AI: take input text and use it as an index to return an output text. Better results are just a larger database or better statistical models.
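To make that caricature concrete, here is a minimal Python sketch of “input text as an index into stored outputs.” The tiny corpus, the fuzzy-match threshold, and the function names are invented for illustration; this is the lookup-table picture I’m describing, not a claim about how any real system is implemented.

```python
# Toy illustration: a "model" that treats the prompt purely as a lookup key
# into a store of canned responses. All entries and thresholds are made up.
from difflib import get_close_matches

corpus = {
    "every good boy does fine": (
        "A mnemonic for the lines of the treble clef staff: E, G, B, D, F."
    ),
    "face": "A mnemonic for the spaces of the treble clef staff: F, A, C, E.",
}

def respond(prompt: str) -> str:
    """Return the stored text whose key most closely matches the prompt."""
    key = prompt.strip().lower()
    match = get_close_matches(key, corpus.keys(), n=1, cutoff=0.4)
    return corpus[match[0]] if match else "No close match in the 'database'."

print(respond("Every Good Boy Does Fine"))
```

Under this picture, “better results” just mean a bigger corpus or a smarter matching function, never any grasp of what music notation is for.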

History will show that the Deep Learning hill was just a landfill; the composting of human culture and social cohesion in a failed effort to understand what it even means to be human.

I don’t care if others are going in a direction that, from my perspective, doesn’t help them reach their stated goal. What does bother me is that only a few “researchers” are guiding the rest of the field, and when it looks like they are taking a nose-dive, my only option is to break away from the pack. The occasional arguments don’t matter. Incremental progress doesn’t matter. A trillion dollars down the Machine Learning hole hasn’t mattered. Posts like this won’t matter in the slightest.

This seems to be less about me going my own way and more about a formal announcement, a shout into the void, that says I’m comfortable doing something completely different, and can now embrace the hubris of thinking that I’m right where everyone else is wrong. And as I have no desire for my own framework to be suffocated out of existence, its introduction to the larger world will have to be met on the battlefield where all the giants of industry play.
