danielslater.net
DANIEL SLATER'S BLOG: Using Net2Net to speed up network training
http://www.danielslater.net/2016/05/using-net2net-to-speed-up-network.html
Thursday, May 26, 2016. Using Net2Net to speed up network training. When training neural networks, two things combine to make life frustrating: neural networks can take an insane amount of time to train, and there are many hyperparameters to choose. If a network could be trained quickly, number 2 wouldn't really matter; we could just do a grid search (or even particle swarm optimization, or maybe Bayesian optimization). But the amount of time to train is counted in days, so you'd better hope your first guess was good. I have posted a numpy implementation...
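The core Net2Net trick the post refers to (Net2WiderNet) can be sketched in numpy: widen a hidden layer by copying randomly chosen units and splitting their outgoing weights, so the widened network computes exactly the same function and can resume training from there. This is a minimal sketch of the technique, not the post's actual code; the function name and shapes are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def net2wider(w1, b1, w2, new_width, rng=None):
    """Net2WiderNet sketch: widen the hidden layer to `new_width`
    while preserving the function the network computes. Each extra
    unit is a copy of a randomly chosen existing unit; the outgoing
    weights of each replicated unit are split evenly among its
    copies so the next layer sees an unchanged input."""
    rng = rng if rng is not None else np.random.default_rng(0)
    old_width = w1.shape[1]
    # mapping[k] = which original unit the k-th new column replicates
    mapping = np.concatenate([
        np.arange(old_width),
        rng.integers(0, old_width, new_width - old_width),
    ])
    # how many columns (original + copies) map to each original unit
    counts = np.bincount(mapping, minlength=old_width)
    new_w1 = w1[:, mapping]                           # copy incoming weights
    new_b1 = b1[mapping]                              # copy biases
    new_w2 = w2[mapping, :] / counts[mapping][:, None]  # split outgoing weights
    return new_w1, new_b1, new_w2

# Sanity check: the widened net computes the same function.
rng = np.random.default_rng(42)
w1, b1, w2 = rng.normal(size=(5, 8)), rng.normal(size=8), rng.normal(size=(8, 3))
x = rng.normal(size=(4, 5))
before = relu(x @ w1 + b1) @ w2
nw1, nb1, nw2 = net2wider(w1, b1, w2, 12)
after = relu(x @ nw1 + nb1) @ nw2
assert np.allclose(before, after)
```

Because copies of a ReLU unit produce identical activations, dividing each copy's outgoing weights by the replication count makes the next layer's pre-activations sum back to their original values.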
danielslater.net
DANIEL SLATER'S BLOG: June 2015
http://www.danielslater.net/2015_06_01_archive.html
Thursday, June 18, 2015. Why there is a big unsupervised-learning-shaped hole in the universe. Unfortunately, so far we are a long way from that, and the technique shown here seems trivial compared to that goal. But the goal is interesting enough that it is worth pursuing. The answer to the question "what am I asking an unsupervised network to do?" An example competitive learning neural net (also available in C#): from random import uniform; class UnsupervisedNN(object): def __init__(self, size_of_input_a...
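The competitive learning fragment quoted above can be fleshed out into a runnable sketch. The class name and constructor follow the fragment; the method names and the winner-take-all update rule are illustrative, not necessarily the post's actual API.

```python
from random import uniform

class UnsupervisedNN(object):
    """Winner-take-all competitive learning: the node whose weight
    vector is nearest the input wins, and only the winner learns."""
    def __init__(self, size_of_input, number_of_nodes, learning_rate=0.1):
        self.learning_rate = learning_rate
        self.weights = [[uniform(0.0, 1.0) for _ in range(size_of_input)]
                        for _ in range(number_of_nodes)]

    def winner(self, inputs):
        # index of the node with the smallest squared distance to the input
        distances = [sum((w - x) ** 2 for w, x in zip(node, inputs))
                     for node in self.weights]
        return distances.index(min(distances))

    def train(self, inputs):
        # move only the winning node's weights toward the input
        j = self.winner(inputs)
        self.weights[j] = [w + self.learning_rate * (x - w)
                           for w, x in zip(self.weights[j], inputs)]
        return j

# Deterministic demo: pin the weights so the behaviour is easy to check.
net = UnsupervisedNN(size_of_input=2, number_of_nodes=2)
net.weights = [[0.1, 0.1], [0.9, 0.9]]
assert net.winner([0.0, 0.0]) == 0
assert net.winner([1.0, 1.0]) == 1
net.train([0.0, 0.0])  # node 0 wins and moves 10% of the way to the origin
assert abs(net.weights[0][0] - 0.09) < 1e-9
```

With repeated training, each node's weight vector drifts toward the centroid of the inputs it wins, which is what makes the net cluster its input space without labels.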
danielslater.net
DANIEL SLATER'S BLOG: October 2015
http://www.danielslater.net/2015_10_01_archive.html
Tuesday, October 27, 2015. Even more quick summaries of research papers. Following on from part 1, here are even more quick summaries of research papers. Generative NeuroEvolution for Deep Learning: uses the MNIST hand-drawn digits dataset and trains both a normal deep network and a convolutional network. HyperNEAT on its own performs very badly. Using HyperNEAT to generate a number of layers and then backprop on the final layer achieves 58.4% for normal ANNs. ES-HyperNEAT uses the connection weights va...
danielslater.net
DANIEL SLATER'S BLOG: August 2015
http://www.danielslater.net/2015_08_01_archive.html
Saturday, August 29, 2015. Presenting WikiDataDotNet - Client API for WikiData. WikiData is one of those things that sets the mind boggling at the possibilities of the internet. It's a project, started by the WikiMedia foundation, to collect structured data on everything. If you are doing anything related to machine learning, it is the best source of data I have so far found. https://www.wikidata.org/w/api.php? This will return a JSON file with sections like the ones below; here we see the id of the item, in this case Q38.
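The kind of response the WikiData api.php endpoint returns can be parsed with a few lines of standard-library Python. The abridged response below is a hedged reconstruction of the API's JSON shape (entity id, per-language labels), not an actual capture, so treat the field layout as an assumption.

```python
import json

# Abridged, reconstructed example of a WikiData entity response.
sample = """
{
  "entities": {
    "Q38": {
      "id": "Q38",
      "labels": {"en": {"language": "en", "value": "Italy"}},
      "descriptions": {"en": {"language": "en",
                              "value": "country in Southern Europe"}}
    }
  },
  "success": 1
}
"""

def extract_labels(raw, lang="en"):
    """Pull (id, label) pairs out of an entity-style JSON response."""
    data = json.loads(raw)
    return [(entity["id"], entity["labels"][lang]["value"])
            for entity in data["entities"].values()]

print(extract_labels(sample))  # → [('Q38', 'Italy')]
```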
danielslater.net
DANIEL SLATER'S BLOG: Mini-Pong and Half-Pong
http://www.danielslater.net/2016/05/mini-pong-and-half-pong.html
Monday, May 2, 2016. I'm going to be giving a talk/tutorial at PyDataLondon 2016 on Friday the 6th of May; if you're in London that weekend I would recommend going, there are going to be lots of interesting talks, and if you do go please say hi. Possibly they are using other tricks not reported in the paper, or just lots of hyperparameter tuning, or there are still more bugs in my implementation (entirely possible; if anyone finds any, please submit them). Distance from building our future robot overlords: stil...
ouxinyu.github.io
Xin-Yu Ou(欧新宇)
http://ouxinyu.github.io/Link.html
A tutorial on unsupervised learning and deep learning, produced by Stanford's Andrew Ng and his team; well suited to anyone wanting to learn deep learning. An education platform founded by Stanford professor Andrew Ng, which partners with the world's top universities and institutions to offer free online courses anyone can take. Discover the current state of the art in object classification; includes MNIST, CIFAR-10, CIFAR-100, STL-10, SVHN, and ILSVRC2012 task 1. WikiCFP is a semantic wiki for Calls For Papers in science and technology fields. There are about 40,000 CFPs on WikiCFP, and over 100,000 researchers use WikiCFP each month. VALSE QQ group 364188996, VALSE-B QQ group 422075165. Caffe is a deep learning frame...
bcomposes.com
machine learning – Bcomposes
http://bcomposes.com/category/machine-learning
Computational linguistics, machine learning, programming, and random thoughts. Simple end-to-end TensorFlow examples: a walk-through with code for using TensorFlow on some simple simulated data sets. I've been reading papers about deep learning for several years now, but until recently hadn't dug in and implemented any models using deep learning techniques for myself. To remedy this, I started experimenting with Deeplearning4J a few weeks ago, but with limited success. I read more books. At the Universi...
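As a flavour of what such an end-to-end walk-through on simulated data does, here is a minimal sketch in plain Python (not the post's actual TensorFlow code): simulate data from a known linear model, then recover the parameters by gradient descent on the mean squared error.

```python
import random

random.seed(0)

# Simulate data from a known linear model y = 3x - 1 plus noise.
true_w, true_b = 3.0, -1.0
xs = [random.uniform(-1.0, 1.0) for _ in range(200)]
ys = [true_w * x + true_b + random.gauss(0.0, 0.05) for x in xs]

# Fit w and b by full-batch gradient descent on mean squared error.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # should land near 3.0 and -1.0
```

A TensorFlow version of the same exercise replaces the hand-written gradients with ops whose gradients the framework computes automatically; the shape of the walk-through is otherwise identical.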
srippa.wordpress.com
July | 2015 | Bits and pieces
https://srippa.wordpress.com/2015/07
A collection of items that interest me. July 24, 2015. May 6, 2016. Getting started with deep learning: a structured list of videos, tutorials, and courses on AI, cognitive computing, and deep learning. Another link to various AI-related resources. Tutorials and best data scientists to follow (2015). List of best blogs to follow (2015). Deep learning tutorial from Stanford. Deep learning in a nutshell. Nando de Freitas: YouTube video series. Deep learning summer school 2015. ML class 10-701 (2015). List of Py...