Wow, that’s actually quite impressive.
I’m sure eventually someone will make a bot called something like ai-explains-the-joke that does this automatically.
I wonder how much was scraped from knowyourmeme.com
I mean, it still parsed the specific text in the meme and formulated a coherent explanation of this specific meme, not just the meme format
Or it matched the text with an existing explanation that it had indexed.
That’s not how GPTs work
That’s literally how they work
Man, the models can’t store their training data verbatim; all that data gets turned into a model that is hundreds or thousands of times smaller than the original source. If it were capable of simply recovering everything it was trained on, that would be some magical compression algorithm, and that by itself would be extremely impressive.
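The size gap described above can be sanity-checked with back-of-envelope arithmetic. Everything in this sketch is an illustrative assumption (the parameter count, weight precision, and corpus size are not figures for any specific model or dataset):

```python
# Back-of-envelope arithmetic only; all numbers below are assumptions
# chosen for illustration, not measurements of any real model.
params = 70e9           # assumed: 70 billion parameters
bytes_per_param = 2     # assumed: 16-bit weights
model_bytes = params * bytes_per_param   # ~140 GB on disk

corpus_bytes = 15e12    # assumed: ~15 TB of training text

ratio = corpus_bytes / model_bytes
print(f"model is ~{ratio:.0f}x smaller than its training data")
```

Even with these conservative numbers the corpus is two orders of magnitude larger than the model, so byte-for-byte verbatim storage is arithmetically impossible.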
Congratulations on discovering compression
Oh ok, so you want to claim this compresses the entirety of the internet into a model that isn’t even 1 terabyte, and still be unimpressed? That alone is something.
But it isn’t compression. It is a mathematical fact that neural networks are universal function approximators; this is undisputed. Analytic functions are continuous, so to approximate them a network must be able to fill in the gaps between discrete data points by itself, which necessarily means spitting out data outside the input distribution, data it has not seen.
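The interpolation point above can be demonstrated concretely. This is a minimal sketch in pure Python (no framework): a tiny one-hidden-layer tanh network is trained on a handful of discrete samples of sin(x), then queried at an x-value that was never in its training set. The architecture, sizes, and learning rate are all arbitrary choices for illustration:

```python
import math
import random

random.seed(0)

# Tiny one-hidden-layer tanh network with randomly initialized weights.
H = 16
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [random.uniform(-1, 1) for _ in range(H)]
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    """Return the network output and hidden activations for input x."""
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

# Nine discrete training points sampled from sin(x) on [0, pi].
xs = [i * math.pi / 8 for i in range(9)]
ys = [math.sin(x) for x in xs]

lr = 0.05
for _ in range(5000):  # plain stochastic gradient descent on squared error
    for x, y in zip(xs, ys):
        pred, h = forward(x)
        err = pred - y
        b2 -= lr * err
        for j in range(H):
            dh = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * h[j]
            w1[j] -= lr * dh * x
            b1[j] -= lr * dh

# Query a point strictly between training samples: the network was never
# shown this x, yet its continuous fit produces a sensible value for it.
x_new = math.pi / 16
print(round(forward(x_new)[0], 3), round(math.sin(x_new), 3))
```

Because the learned function is continuous, the network necessarily produces outputs at inputs it never saw; that is the gap-filling behavior the post describes, not lookup of a stored answer.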
Lmao you think it found a specific explanation for this specific variation of this meme?
The majority of people right now are fairly out of touch with the actual capabilities of modern models.
There’s a combination of the tech learning curve on the human side and an amplification of stories about the 0.5% most extreme failure cases by a press corps desperate to showcase how shitty the technology they’re terrified will take their jobs is.
There’s some wild stuff most people just haven’t seen.
I can just as well say that the screenshot above is the top 0.5% pushed by people trying to sell the tech. I don’t really have an opinion either way tbh, I’m just being cynical. But my own experience with those tools hasn’t been impressive.