Mental framing problems and ReLU as a switch


https://sciencelimelight.blogspot.com/2025/08/mental-framing-problems-viewing-relu-as.html

I think the phrase "missed serendipity" about some of the early ReLU papers is very apt. The history of neural networks would be very different if a particular point had fully come into mental focus and latched, so to speak. Perhaps also hatched, so to speak.

1 Like

I have also been rabbiting on about switching and weight pathway mutual reinforcement.

https://sciencelimelight.blogspot.com/2025/08/relu-networks-as-switch-based-linear.html

"Rabbiting on" is an informal British phrasal verb that, according to Merriam-Webster, means to talk at length, especially about something unimportant or uninteresting. It implies that the conversation is tedious and likely boring for the listener.

It's actually from the Cockney rhyming slang "rabbit and pork" = talk.

1 Like

I’ll explain another way.

A (ReLU) neuron has in-going weight connections and out-going weight connections (the latter being some of the weights in the next layer).

When the output of a ReLU neuron is zero, those out-going weights are disconnected from the system. They don't do anything.

When the ReLU(x) function is active (x > 0), then ReLU(x) = x. It's as if the ReLU function wasn't there (x = x), and the out-going weights of the neuron are connected directly to the weighted sum over the in-going weights.

You are dealing with a switched linear system.
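A minimal sketch of this point, using NumPy (the network shapes and weights are made up for illustration): for any fixed input, freeze each hidden neuron's on/off switch state into a 0/1 mask, and the whole ReLU network collapses into a single linear map.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-layer ReLU network: y = W2 @ relu(W1 @ x)
W1 = rng.standard_normal((4, 3))   # in-going weights of 4 hidden neurons
W2 = rng.standard_normal((2, 4))   # out-going weights (next layer)
x = rng.standard_normal(3)

relu = lambda z: np.maximum(z, 0.0)
y = W2 @ relu(W1 @ x)

# Record which hidden neurons are "on" for this particular x:
# one 0/1 switch per neuron.
mask = (W1 @ x > 0).astype(float)

# With the switches frozen, the network is just one linear map.
# An "off" neuron (mask = 0) disconnects its out-going column of W2.
W_eff = W2 @ (np.diag(mask) @ W1)
y_linear = W_eff @ x

assert np.allclose(y, y_linear)
```

Changing x can flip some switches, which selects a different effective matrix W_eff; within a region of constant switch pattern, the network is exactly linear.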

Then you are freed up to think of other ways of switching other blocks of weights in and out.
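As a hypothetical illustration of that idea (the gate condition and weight blocks here are invented, not from the posts above): nothing forces the switch to be a per-neuron ReLU; an explicit binary gate can switch a whole block of weights in or out, and the computation is still linear once the gate state is known.

```python
import numpy as np

rng = np.random.default_rng(1)

W_a = rng.standard_normal((2, 3))   # weight block A
W_b = rng.standard_normal((2, 3))   # weight block B
x = rng.standard_normal(3)

def gated(x, gate):
    # The gate could come from any condition; here it is passed in directly.
    # Whichever block is selected, the map applied to x is purely linear.
    return (W_a if gate else W_b) @ x

y_on = gated(x, True)
y_off = gated(x, False)
assert np.allclose(y_on, W_a @ x)
assert np.allclose(y_off, W_b @ x)
```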

The degree of Framing Lock-In around this topic is… well, I’ve never seen anything like it.

1 Like