Let's Watch the Marcus / Bengio AI Debate

If something happens during the debate and we’d like to take some time to discuss it, should we just make a note to discuss later? Or pause the debate and have a quick discussion?

  • Pause debate and discuss
  • Never pause live debate



I am live now with the Pre Show debate.



I’ve watched this, and Bengio sounded like he’s running out of ideas. I know it’s not the most helpful comment, but even as a neuroscience noob I have to ask: what on earth is System 1 supposed to be here?

World changing ideas don’t pop out on a schedule like Star Wars movies. The fact that someone has them at all is utterly wonderful.


Sounded like sexier classical AI on top of DL.

More and more epicycles. Still, geocentrism was utterly wrong.

Marcus spent a lot of time attacking a cartoon version of older connectionist work, dwelling on his early research in comparison to the then-current (1986) PDP books. Living in the past much?

Stray thought - does every one of his classes have to sit through his epic battle with early connectionist theories from 30 years ago?

It warms my heart to see Bengio mention some concepts near and dear to me - Boltzmann machines, gating, global workspace, and large-scale organization of sub-networks. I wish he had spent more time on these concepts. He hinted at how connectionist techniques do some of the same things as symbolic AI but did not make the case in a clear way. Since his slide deck is the same one he has used in earlier presentations, it would be nice if he worked up an elevator pitch for this “way forward” concept and added it to the presentation.


So, the former are objects, and the latter are more like groups, systems, or concepts?
Do you see any low-level differences in the way inputs from these two types of tracts are processed? My guess is that objects are composed primarily through lateral interactions: some form of connectivity clustering, driven by gradients from lateral inhibition in grids. And relatively non-local concepts would be composed primarily through vertical or centroid clustering: more direct Hebbian learning?
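To make the first guess concrete, here is a purely toy sketch of competitive Hebbian learning - the function name, parameters, and the hard winner-take-all stand-in for lateral inhibition are all my own assumptions, not anything from the debate or from neuroscience data. Units compete for each input, and only the winner gets a Hebbian update, pulling its weights toward the cluster it keeps winning:

```python
import numpy as np

def hebbian_wta_step(W, x, lr=0.1):
    """One competitive-Hebbian update. Lateral inhibition is caricatured
    as hard winner-take-all: only the best-matching unit learns, and its
    weight vector moves toward the input."""
    activations = W @ x                    # feedforward drive per unit
    winner = int(np.argmax(activations))   # WTA stand-in for lateral inhibition
    W[winner] += lr * (x - W[winner])      # Hebbian-style move toward the input
    return winner

# Two input "objects" as toy clusters
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(2, 2))     # two units, two input lines
for _ in range(50):                        # alternate presentations
    hebbian_wta_step(W, a)
    hebbian_wta_step(W, b)
# With enough presentations the units tend to specialize, one per cluster.
```

The “vertical or centroid” regime would differ mainly in dropping the competition step and letting correlation alone drive the update.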


The dividing line between representations has been a subject of interest for me for years.
In the visual system you can see a progression from Gabor filters in V1 to some degree of abstraction in V2, on up to texture in V4.
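For concreteness, the “Gabor” stage mentioned above is just an oriented sinusoid under a Gaussian envelope, roughly the receptive-field shape classically reported for V1 simple cells. A minimal sketch - the parameter values here are arbitrary illustration choices, not measured receptive-field numbers:

```python
import numpy as np

def gabor_kernel(size=15, theta=0.0, wavelength=6.0, sigma=3.0):
    """Oriented Gabor filter: a cosine carrier at orientation `theta`
    multiplied by an isotropic Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate the coordinate frame by theta
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

k = gabor_kernel()   # 15x15 vertical-orientation filter, peak at the center
```

Convolving an image with a bank of these at several values of `theta` gives the kind of oriented-edge responses the V1 stage is usually summarized by.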

Mapping with receptive field properties only goes so far as every map seems to multi-task.
There seems to be the same sort of progression in the auditory and somato-sensory cortex.

On the motor side they have worked backwards about two layers.
So of the roughly 50 maps per side, I can account for perhaps 10 of them. The areas in between seem so abstract that I have no real frame of reference for describing what they are doing. We get some hints from the EC/HC areas, but other than some intriguing connections to perceived spatial relationships, nobody really has a handle on what is going on there either.

Trying to force an arbitrary grouping that matches up to sensory-based grouping feels wrong to me. It’s kind of hard to express, but I will try: in the real world we break things down with integer groupings and relationships. I think the distribution between map levels is more of a log function. Since this is different from how the world works, it is basically incomprehensible - we have no vocabulary or familiarity with how that kind of representation works, so the relationships don’t make any sense.

I have pointed to this paper before, but it is very relevant to this post: it describes the “where” of some of the processing and the general semantic contents, but not the detailed contents of those maps.


Yoshua Bengio apparently also sometimes allows his research meetings to be published. Here is a clip from a brainstorming session with some of his students at the Montreal Institute for Learning Algorithms. (I skipped the first, uninteresting minute.) It’s somewhat dated, though - it’s from about three years ago. Still, interesting.


I’d be so curious to hear their discussion of how all these learning challenges are addressed by HTM theory.
