I think we have to be quite cautious about claiming what isn't there. We're in the very early days of figuring out how LLMs and transformers encode higher-level concepts, especially as you nest them more and more deeply. "We looked a bit and didn't find it yet" is the state of most of our knowledge, not "we looked so extensively that we know it isn't there".