Ask any blind person: are they any less intelligent just because they lack vision?
The lack of any one (or more) of the senses does not mean a system is incapable of intelligence. Vision in AGI is like a glittering object to a magpie: it looks nice, sells well and gets funding, but it is an easy win with no real intelligence in it, just an unnecessary input for AGI. Again, ask any blind person.
GPT-3 has a fixed window size (its effective recurrent depth), which limits the complexity of the input-output relationships it can capture. If GPT-3 could really do math, why is it unable to perform 4-and-upwards digit math with any reliability?
Math is a recurrent process that needs recursive temporal state. Just think of basic long division: it is a small "set" of patterns that we apply to a number of any size. GPT-x does not do math that way.
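That "way" is nothing exotic. Here is a minimal sketch (plain Python, the function name is mine) of schoolbook long division: bring down a digit, divide, carry the remainder forward, and repeat, once per digit, however long the number is.

```python
# Schoolbook long division as a simple recursive pattern:
# bring down a digit, divide, carry the remainder forward, repeat.
def long_division(dividend: str, divisor: int):
    quotient_digits = []
    remainder = 0
    for digit in dividend:                  # bring down the next digit
        remainder = remainder * 10 + int(digit)
        quotient_digits.append(str(remainder // divisor))
        remainder = remainder % divisor     # carry the remainder forward
    quotient = "".join(quotient_digits).lstrip("0") or "0"
    return quotient, remainder

# 18 digits are no harder than 3; compare with divmod(987654321987654321, 7)
print(long_division("987654321987654321", 7))
```

The loop body never changes; only the number of repetitions grows with the input, which is exactly the unbounded recursion the argument below says a fixed-window model cannot replay.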
Take a look at the C4 dataset (~1 TB) and you will see why GPT-3 appears to be able to perform math: the training data already covers 3-digit arithmetic (there are fewer than a million pairs of 3-digit numbers, so a web-scale corpus can simply contain them). The model lacks the ability to perform unlimited recursion, for long division for example. Basic recursive patterns.
Beware of a system that learns these types of patterns being mistaken for one that can do all math, when it is really just applying a pattern: https://www.cuemath.com/learn/math-tricks/
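One classic trick of the kind that page collects (I am assuming the specific example; the function name is mine) is squaring a number that ends in 5, using the identity (10a + 5)^2 = 100*a*(a + 1) + 25:

```python
# A memorised "math trick": squaring a number that ends in 5.
# It only ever applies to that narrow family of inputs.
def square_ending_in_5(n: int) -> int:
    assert n % 10 == 5, "the trick only covers numbers ending in 5"
    a = n // 10
    return a * (a + 1) * 100 + 25

print(square_ending_in_5(85))    # 7225, same as 85 * 85
print(square_ending_in_5(115))   # 13225, same as 115 * 115
```

Handy, but it is a pattern with a narrow domain, not arithmetic: knowing it says nothing about the ability to square 73 or to multiply two arbitrary numbers.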
Does the creation of custom hardware (NVIDIA BlueField-4 / Ampere Next Net) really mean "scalable"? Any problem can be scaled by hardware development.
Up to a point, and then they will need custom hardware (e.g. past experiments with implementations on FPGAs).
The upcoming hardware over the next decade eclipses what an AGI system actually needs; just look at the spec for BlueField-4 (due 2024), let alone the likes of Cerebras with 850k cores and 2.6 trillion transistors that are operational now.
Scalability is a totally invalid criterion/argument, because market economics will make anything scalable if it works.
The real question is which code works best at learning.
Yes, when the response to a request to compare an archipelago in the Antarctic is "No, I prefer Crete as it is much bigger, and more interesting, especially the Palace of Knossos." In context the response is nonsensical. If that was what an AGI came out with, I would be really, really scared. Very much like the Wizard of Oz, until you look behind the curtain.