cool, just read this in the twitch-stream chat:
"benanne: code will be released in a few days! takes some tinkering to get it to run though"
here's a little more info about the real-time rendered video, copied from the chat:
carmethene_tv: @317070 +1, thanks for doing this! any chance of making the software behind this available? it would be fantastic for parties
paradonym: carmethene_tv I'd guess the needed CPU/GPU power is WAY too high for standard desktop systems
benanne: @Carmethene_tv source will be made available in a few days, but it will mainly be useful for experimentation I think
benanne: and yeah, you need a decent GPU and lots of RAM
paradonym: @benanne - does this run on CAD or standard GPUs?
benanne: GTX 980 previously, now it's running on an older GTX 680
benanne: standard gamer GPUs
gwelengu: wow, 32 GB? that's crazy. I'll never be able to run this
ChaozCoder: @paradonym: or maybe it was some kind of bundled GPU i don't know exactly, but nothing like a supercomputer
benanne: you need 4GB of video RAM
ChaozCoder: @gwelengu: he said the machine has 32 GB you probably don't need that much
ChaozCoder: @benanne: ok
benanne: 32GB of CPU RAM but that may be fixed at some point. We suspect some sort of memory leak in the encoder
ChaozCoder: yeah i meant cpu ram not gpu ram
Dustinator2: my rig is a little lacking in the RAM department but my graphics card makes up for it until I can get more
ChaozCoder: memory leaks, that's why i love c# (for non performance critical stuff)
benanne: @Chaozcoder: not much we can do about it though, it's not in the code @317070 wrote but rather in the video encoder it's using. Maybe he'll try with a different one, I don't know
ChaozCoder: @benanne: oh
benanne: he may already have solved it, I don't know, it definitely seems to be more stable today
317070: @Chaozcoder @Benanne I think I solved the problem though. There was some fishy socket-handling by this bot which could have caused the issue.
ChaozCoder: so the bot was causing it, that is hilarious lol
foofoobarfoo: which software/language has been used in implementing this? python/theano..?
ChaozCoder: of all the things
benanne: okay. Still 27GB of memory in use right now
benanne: @Foofoobarfoo yeah, Python, Theano, Lasagne
foofoobarfoo: it's something like a diabolo network? (autoencoders)
benanne: My guess is the code could be made much more memory efficient but he doesn't need to bother because the machine has 32GB anyway
benanne: @Foofoobarfoo: no it's a feedforward neural net run backwards. Technical details here: http://317070.github.io/LSD/
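(side note: "run backwards" here means doing gradient ascent on the input image instead of on the weights — you pick an output unit and nudge the pixels to make it fire harder. a minimal NumPy sketch with a made-up one-layer linear "net" standing in for the real trained network; the actual code uses Theano/Lasagne and a much deeper model:)

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained net: class scores = W @ x
W = rng.standard_normal((10, 64))    # 10 classes, 64-pixel "image"
x = rng.standard_normal(64) * 0.01   # start from near-random noise

target = 3   # the class whose score we want the image to maximize
lr = 0.1

for step in range(200):
    # For this linear model, the gradient of the target score w.r.t.
    # the INPUT is just the target's weight row -- that's the
    # "running the net backwards" part.
    grad = W[target]
    x += lr * grad   # gradient ascent on the input image

scores = W @ x
```

after enough steps the target class dominates the scores; with a real deep net you'd get the dream-like imagery instead of a 64-pixel vector.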
paradonym: Next step: comparing different hardware's dream-images to the exact same thing
wiibrewer: You should train these on larger image sets so they become more defined
carmethene_tv: again, thanks for setting this up, it's a fantastic idea
benanne: @Wiibrewer: yeah, with more data it should work better. We did not train this net ourselves though, we just downloaded the parameters to save time
Dogeapi: was hoping for a link to some code at the github page, but good writeup!
carmethene_tv: so it only recognises what's in the list linked below?
MysteryXi: What is this running on? Isn't this super CPU intensive?
wiibrewer: @Benanne ahh, gotcha. Does it require monitoring to be trained?
wiibrewer: @Benanne meaning does someone have to be there to train it?
benanne: @Wiibrewer: it requires a lot of parameter tuning, but once you've found the parameters you just leave it running for a couple of weeks on a GPU
benanne: @Mysteryxi: the neural net is running on a GTX 680 GPU. The CPU only does the frame interpolation and video encoding (which is actually pretty intensive)
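(side note: the frame interpolation benanne mentions could be as simple as cross-fading between successive net outputs to smooth the video — a hedged NumPy sketch; the actual pipeline may well do something fancier, like optical-flow-based interpolation:)

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_between):
    """Linearly blend two frames into n_between intermediate frames."""
    frames = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)                  # blend weight in (0, 1)
        frames.append((1 - t) * frame_a + t * frame_b)
    return frames

# stand-in frames; real ones would be H x W x 3 images from the net
a = np.zeros((4, 4))
b = np.ones((4, 4))
mid = interpolate_frames(a, b, 3)
```

doing this per pixel for every output frame pair is exactly the kind of work that eats CPU at video resolutions.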
wiibrewer: @Benanne awesome, so in theory I could set up and automate this thing myself?
@benanne so this is mainly RAM-heavy work? Could you post the hardware utilization - how much load on GPU/CPU/RAM and so on?
benanne: if you have the hardware, sure
carmethene_tv: @Benanne where should I keep an eye out for the software release?
benanne: $ uptime 18:03:19 up 19:39, 2 users, load average: 5.44, 5.88, 5.92
benanne: hexacore CPU is pretty much maxed out
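(side note: that load-average reading checks out — on Linux, a 1-minute load average near the core count means the CPU is saturated. a quick sketch parsing the quoted uptime line, assuming the 6 cores benanne mentions:)

```python
# Pull the load averages out of the uptime line benanne pasted
line = "18:03:19 up 19:39, 2 users, load average: 5.44, 5.88, 5.92"
loads = [float(x) for x in line.split("load average:")[1].split(",")]

cores = 6  # hexacore, per the chat
utilization = loads[0] / cores   # ~0.91, i.e. close to fully loaded
```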
wiibrewer: @Benanne is 20 gigs of ram, gtx770, quad core cpu strong enough?
benanne: for the GPU we don't have utilization info because it's a gamer GPU. But it's at 73C right now which means it's not maxed out probably
benanne: wiibrewer: if you reduce the resolution a bit it should be. Or it might not even be necessary, I don't know
benanne: oh yeah, important detail: this is running on a Linux machine. Setting it up on Windows / Mac OS is not impossible but it'll be a lot more tedious
paradonym: this chat nearly convinces me purchasing 32 gb of RAM instead of a GTX 980 ti
MysteryXi: @benanne What distro?
benanne: @Paradonym: get both, the 980 Ti is sweet too
paradonym: to run this as CPU benchmark
benanne: @Mysteryxi Ubuntu 14.04 LTS
edit2: they're updating it right now and enabling 2-word combinations. I wonder what the tractor-sloth will look like...