training a neural network to "caption" summary vectors of lines of poetry, then using the network to generate text for summary vectors of arbitrary sentences. this is after seven epochs of training on my laptop last night; source line is first, output is second
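(for the curious: the rough shape of the setup, sketched below — not the actual code. it assumes pytorch, a pre-computed summary vector per line from some sentence encoder, and a character-level GRU decoder whose initial hidden state is a projection of that vector; all the dimension names are placeholders)

```python
# rough sketch, not the actual model: a summary vector is projected into
# the initial hidden state of a character-level GRU, which is trained to
# emit the poetry line one character at a time. dims are placeholders.
import torch
import torch.nn as nn

class VectorCaptioner(nn.Module):
    def __init__(self, vec_dim, n_chars, emb_dim=64, hidden_dim=256):
        super().__init__()
        self.to_hidden = nn.Linear(vec_dim, hidden_dim)  # summary vector -> initial state
        self.embed = nn.Embedding(n_chars, emb_dim)      # character embeddings
        self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_chars)        # next-character logits

    def forward(self, vec, char_ids):
        # vec: (batch, vec_dim); char_ids: (batch, seq_len) of input characters
        h0 = torch.tanh(self.to_hidden(vec)).unsqueeze(0)
        hidden_states, _ = self.gru(self.embed(char_ids), h0)
        return self.out(hidden_states)                   # (batch, seq_len, n_chars)
```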

the goal here was to be able to put in the vector for (e.g.) "dog" & get back a line about dogs. but it's learning the punctuation and the length of the lines, so putting in single words yields stuff like

abacus ➝ Beginabliny
allison ➝ It is is is is is is ine is ineay
cheese ➝ Great occhanting seaw
daring ➝ The left the lonious courtina
mastodon ➝ shorn born born borner
parrish ➝ the oh
purple ➝ Greath green green green
trousers ➝ To blenting my blank
whoops ➝ Aaann aaas! aaan aaas!
zoo ➝ T
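(the sampling step that produces lines like the ones above is basically the loop below — a sketch, not the real code. the "<s>"/"</s>" markers and helper names are made up for illustration, and it assumes the VectorCaptioner class from the earlier sketch)

```python
# sketch of the generation step: seed the decoder's hidden state with a
# single word's vector, then sample one character at a time until an
# end-of-line marker (or a length cap) is reached.
import torch

def generate(model, vec, char_to_id, id_to_char, max_len=60):
    model.eval()
    h = torch.tanh(model.to_hidden(vec.unsqueeze(0))).unsqueeze(0)  # (1, 1, hidden)
    prev = torch.tensor([[char_to_id["<s>"]]])
    chars = []
    with torch.no_grad():
        for _ in range(max_len):
            step, h = model.gru(model.embed(prev), h)
            probs = torch.softmax(model.out(step[0, -1]), dim=-1)
            next_id = torch.multinomial(probs, 1).item()  # sample the next character
            if id_to_char[next_id] == "</s>":
                break
            chars.append(id_to_char[next_id])
            prev = torch.tensor([[next_id]])
    return "".join(chars)
```

at generation time you'd just look up the vector for a word like "abacus" and pass it in as vec.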


(I know I'm literally just redoing karpathy experiments from "the unreasonable effectiveness of rnns" era, but it really is unreasonable how effective this is. also, this is my first nn-training experiment that didn't just immediately overfit, and where adding more neurons to the hidden layers actually helped instead of hurt)


christmas, recurrent neural networks 

generated text for "Merry Christmas, friends!":

Love! haste! so love! my good soul!



@aparrish wow, that's wonderful
