Playing wiþ Chat-GPT
What might it be good for? Chat-GPT looks back at up to 4000 tokens (maybe 3000 words?) in the prompt(s), and then, probabilistically, it selects the next token to output. It was trained on half...
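That "probabilistically selects the next token" step can be illustrated with a toy sketch. This is not Chat-GPT's actual code, and the tokens and probabilities below are made up; it only shows the idea of sampling one continuation at random, weighted by an assumed probability distribution over candidate next tokens.

```python
import random

def sample_next_token(probs):
    """Pick a next token at random, weighted by probability.

    probs: dict mapping candidate token -> probability (summing to 1).
    A hypothetical stand-in for the model's real output distribution.
    """
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

# Made-up example: after this context, the model might assign
# these (invented) probabilities to three candidate tokens.
context = "The cat sat on the"
next_token_probs = {" mat": 0.6, " floor": 0.3, " moon": 0.1}
print(context + sample_next_token(next_token_probs))
```

Because the choice is weighted rather than always taking the most likely token, running it repeatedly gives different continuations, which is why the same prompt can yield different replies.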