wonder Expert Licensed User Longtime User Jul 29, 2019 #1
So I've been fine-tuning a GPT-2 (345M) model on the Harry Potter books. It generated this. Conclusion: The deeper the network, the deeper the bullsh*t!
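For anyone curious how this kind of fine-tuning is usually done, here is a minimal sketch using the gpt-2-simple Python package. That library is my assumption (the post doesn't say which tool was used), and the file name and step count are placeholders.

```python
# Minimal sketch: fine-tune GPT-2 345M on a plain-text corpus,
# assuming the gpt-2-simple package (pip install gpt-2-simple).
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="345M")   # fetch the pretrained 345M checkpoint

sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="harry_potter.txt",  # hypothetical path to the training text
              model_name="345M",
              steps=1000)                  # number of training steps; adjust to taste

# sample from the fine-tuned model
print(gpt2.generate(sess, return_as_list=True)[0])
```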
aeric Expert Licensed User Longtime User Jul 30, 2019 #2
"Our model, called GPT-2 (a successor to GPT), was trained simply to predict the next word in 40GB of Internet text."
I guess the bullshit lies inside the 40GB of Internet text.
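The "predict the next word" objective in that quote is just language-model cross-entropy over shifted tokens. A rough illustration with the Hugging Face transformers package (assumed here purely for reference, not something from this thread):

```python
# Rough illustration of GPT-2's next-word-prediction objective,
# assuming the Hugging Face transformers package is installed.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

text = "The deeper the network, the deeper the"
inputs = tokenizer(text, return_tensors="pt")

# Passing the input ids as labels makes the model report the average
# cross-entropy of predicting each token from the tokens before it.
outputs = model(**inputs, labels=inputs["input_ids"])
print("next-word prediction loss:", outputs.loss.item())

# Greedy continuation: the model repeatedly picks its most likely next token.
generated = model.generate(inputs["input_ids"], max_length=20)
print(tokenizer.decode(generated[0]))
```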
emexes Expert Licensed User Jul 30, 2019 #3
I did that with... uh, erotic fiction, and the output was disturbingly indistinguishable from the input.
wonder Expert Licensed User Longtime User Jul 30, 2019 #4
Someone made a website out of it: https://talktotransformer.com/
wonder Expert Licensed User Longtime User Jul 30, 2019 #5
aeric said: "I guess the bullshit lies inside the 40GB of Internet text."
Unfortunately we don't have access to the larger models (yet). In my case, I'm using the 345M model.