training a neural network to "caption" summary vectors of lines of poetry, then using the network to generate text for summary vectors of arbitrary sentences. this is after seven epochs of training on my laptop last night; source line is first, output is second
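(rough sketch of the setup below, details guessed rather than exact: the summary vectors come from whatever sentence embedder you already have, the sizes here are arbitrary, and the decoder is just a GRU seeded with the vector that emits the line character by character)

# sketch only, not my actual code. assumes a ~300-dim summary vector per line
# from some external sentence embedder; VEC_DIM must match whatever you use.
import torch
import torch.nn as nn

CHARS = list(" abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ.,;:!?'-")
PAD, BOS, EOS = 0, 1, 2
STOI = {c: i + 3 for i, c in enumerate(CHARS)}
ITOS = {i: c for c, i in STOI.items()}
VOCAB = len(CHARS) + 3
VEC_DIM, HID = 300, 512   # arbitrary guesses

class VectorCaptioner(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(VEC_DIM, HID)   # summary vector -> initial hidden state
        self.emb = nn.Embedding(VOCAB, 64)
        self.gru = nn.GRU(64, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, vec, chars_in):
        # vec: (batch, VEC_DIM), chars_in: (batch, seq) of character ids
        h0 = torch.tanh(self.proj(vec)).unsqueeze(0)
        y, _ = self.gru(self.emb(chars_in), h0)
        return self.out(y)                    # next-character logits

    @torch.no_grad()
    def generate(self, vec, max_len=60):
        # vec: (1, VEC_DIM); greedily decode one character at a time
        h = torch.tanh(self.proj(vec)).unsqueeze(0)
        tok, text = torch.tensor([[BOS]]), []
        for _ in range(max_len):
            y, h = self.gru(self.emb(tok), h)
            tok = self.out(y)[:, -1].argmax(-1, keepdim=True)
            if tok.item() == EOS:
                break
            text.append(ITOS.get(tok.item(), ""))
        return "".join(text)

# training is just teacher-forced cross-entropy: predict each character of a line
# from that line's own summary vector plus the characters so far, e.g.
#   model = VectorCaptioner()
#   logits = model(batch_vectors, batch_chars[:, :-1])
#   loss = nn.functional.cross_entropy(
#       logits.reshape(-1, VOCAB), batch_chars[:, 1:].reshape(-1), ignore_index=PAD)
# then at generation time you hand generate() the vector for any word or sentence instead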

the goal here was to be able to put in the vector for (e.g.) "dog" & get back a line about dogs. but it's learning the punctuation and the length of the lines, so putting in single words yields stuff like

abacus ➝ Beginabliny
allison ➝ It is is is is is is ine is ineay
cheese ➝ Great occhanting seaw
daring ➝ The left the lonious courtina
mastodon ➝ shorn born born borner
parrish ➝ the oh
purple ➝ Greath green green green
trousers ➝ To blenting my blank
whoops ➝ Aaann aaas! aaan aaas!
zoo ➝ T

(I know I'm literally just redoing karpathy experiments from "the unreasonable effectiveness of rnns" era, but it really is unreasonable how effective this is. also, this is my first nn-training experiment that didn't just immediately overfit, and where adding more neurons to the hidden layers actually helped instead of hurting)
