the goal here was to put in the vector for (e.g.) "dog" & get back a line about dogs. but it's mostly learning the punctuation and the line lengths, so feeding in single words yields stuff like
abacus ➝ Beginabliny
allison ➝ It is is is is is is ine is ineay
cheese ➝ Great occhanting seaw
daring ➝ The left the lonious courtina
mastodon ➝ shorn born born borner
parrish ➝ the oh
purple ➝ Greath green green green
trousers ➝ To blenting my blank
whoops ➝ Aaann aaas! aaan aaas!
zoo ➝ T
(I know I'm literally just redoing karpathy experiments from "the unreasonable effectiveness of rnns" era but it really is unreasonable how effective this is. also my first nn-training experiment that didn't just immediately overfit and where actually adding more neurons to the hidden layers helped instead of hurt)
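(for anyone curious, the setup is roughly this: seed a character-level RNN's hidden state with the word vector, then decode one character at a time. a minimal sketch below, with made-up sizes and untrained random weights just to show the shapes and the loop — not my actual model)

```python
import numpy as np

rng = np.random.default_rng(0)

chars = list("abcdefghijklmnopqrstuvwxyz .!")
V = len(chars)   # character vocab size
H = 16           # hidden size == word-vector size (an assumption, for simplicity)

# untrained random weights; a real run would learn these by backprop
Wxh = rng.normal(0, 0.1, (H, V))   # input char -> hidden
Whh = rng.normal(0, 0.1, (H, H))   # hidden -> hidden
Why = rng.normal(0, 0.1, (V, H))   # hidden -> char logits

def generate(word_vec, max_len=20):
    h = np.tanh(word_vec)              # the word vector seeds the hidden state
    x = np.zeros(V); x[0] = 1.0        # one-hot start token (reusing slot 0 here)
    out = []
    for _ in range(max_len):
        h = np.tanh(Wxh @ x + Whh @ h) # vanilla RNN step
        logits = Why @ h
        p = np.exp(logits - logits.max()); p /= p.sum()  # softmax
        i = rng.choice(V, p=p)         # sample the next character
        out.append(chars[i])
        x = np.zeros(V); x[i] = 1.0
    return "".join(out)

line = generate(rng.normal(size=H))    # stand-in for the "dog" vector
print(line)
```

with random weights this spits out pure noise, but it's the same loop that, once trained on lines of text, starts imitating their punctuation and length — exactly the failure mode above.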