Visualizing Properties of Encoders

Well, I started with a very loose idea of “cross product”. Basically, a relationship which combines elements to create new elements?

I’ve since found parallels in maths, like the broader thread of representation as tensors, quantum maths, and category theory. Bob Coecke might be the best example of that:

From quantum foundations via natural language meaning to a theory of everything

"In this paper we argue for a paradigmatic shift from ‘reductionism’ to ‘togetherness’.

(Edit: to link this to where I refer to it in earlier threads)

Coecke emphasizes the entangled, contradictory aspect, of language in particular. I like that. But I actually think Coecke misses a trick, because quantum maths is top down: it captures the entangled quality, but not so much an expansive quality (new particles?).

Linas Vepstas has also been doing work for years trying to find a grammar for language, and he too came to something of a category theory view:
Sheaves: A Topological Approach to Big Data

I actually think both those approaches are limited by starting with the maths. I’m reminded of Chaitin’s assessment of Gödel’s proof of mathematical incompleteness, that the proof relied on a superior power of computation over mathematical formalism:

“If you look at Gödel’s original paper you see what to me looks like LISP, it’s very close to LISP, the paper begs to be rewritten in LISP!”

A Century of Controversy Over the Foundations of Mathematics
G.J. Chaitin’s 2 March 2000 Carnegie Mellon University School of Computer Science Distinguished Lecture.
http://arxiv.org/html/nlin/0004007

But as a way of summarizing a contrast with the status quo in AI today, which I think you have summarized well as a kind of dot product, I think the maths of cross products, and more generally the way Coecke, Vepstas, and others relate it to tensor maths and category theory, at least goes part of the way towards a full power of expansion, one perhaps only summarizable by computation.
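As a loose illustration of the contrast I have in mind (just a sketch with made-up vectors, not a claim about any particular model’s internals): a dot product combines two vectors but collapses them to a single similarity score, while an outer (tensor) product combines the same elements and expands them into a new, higher-dimensional object whose structure was in neither input alone.

```python
import numpy as np

# Two small "meaning" vectors; the values are purely illustrative.
a = np.array([1.0, 0.5, -0.25])
b = np.array([0.2, -1.0, 0.75])

# Dot product: combines the elements but reduces them to one number.
# This is the "similarity" view: information is collapsed, not created.
similarity = np.dot(a, b)      # scalar

# Outer (tensor) product: combines the same elements but expands them
# into a new object with structure neither input had on its own.
combined = np.outer(a, b)      # matrix, shape (3, 3)

print(similarity)        # one number
print(combined.shape)    # (3, 3): a new space of element-by-element relations
```

Chaining further tensor products keeps growing the space, which is the expansive quality I’m gesturing at; a dot product only ever hands you back a number.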

You could make that distinction. By that definition “denotation” is limited to where you have a dictionary. I see that as the fundamental problem with current AI. We are attempting to encode the world as a large dictionary. The dictionary keeps getting bigger and bigger. At some point you have to decide to stop thinking of it as a dictionary, and just keep the building process going. You might call that a shift in emphasis to connotation.

I disagree. I think it is key that we are unable to keep track of the semantic meaning of every single bit. The ambiguity of elements, of bits, is the key power of the system we need. I’m sure that’s why words in language are ambiguous. It’s not nature being lazy. Ambiguity is a more powerful way to represent meaning: it means you can create more meanings. If the elements depend for their meaning on the whole, then the same elements can “mean” a near infinity of combinations.
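A toy way to see the counting (just a sketch, the words and senses are invented): if each element is ambiguous among a few readings, and a reading is only fixed by the whole it sits in, the number of distinct wholes grows exponentially with the number of elements.

```python
from itertools import product

# Hypothetical ambiguous elements, each with a couple of possible senses.
senses = {
    "bank":  ["riverbank", "money-bank"],
    "bat":   ["animal", "sports-bat"],
    "light": ["not-heavy", "illumination"],
}

# Each whole (one sense chosen per element) is a distinct combined meaning.
wholes = list(product(*senses.values()))

print(len(wholes))  # 2 * 2 * 2 = 8 wholes from just 3 ambiguous words
# With n elements of k senses each, that's k**n: the elements stay the same,
# but the space of wholes they can "mean" explodes combinatorially.
```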

This, I think, is a doomed endeavour.

Read Chaitin’s very entertaining talk about the attempts to find a logical basis for maths at the turn of the twentieth century. It failed, he says. But out of that failure came the invention of general computing… Even maths was demonstrated not to have a complete set of representational primitives which captured all meaning. But that hints at a greater power in computation.

There’s something in combinatorial power which we are only beginning to discover.
