Articles and Publications

New artificial intelligence beats tactical experts in combat simulation – July 3, 2016
Artificial intelligence (AI) developed by a University of Cincinnati doctoral graduate was recently assessed by subject-matter expert and retired United States Air Force Colonel Gene Lee — who holds extensive aerial combat experience as an instructor and Air Battle Manager with considerable fighter aircraft expertise — in a high-fidelity air combat simulator. The artificial intelligence, dubbed ALPHA, was the victor in that simulated scenario, and according to Lee, is “the most aggressive, responsive, dynamic and credible AI I’ve seen to date.”

Face recognition app taking Russia by storm may bring end to public anonymity – The Guardian, May 17, 2016
Unlike other face recognition technology, their algorithm allows quick searches in big data sets. “Three million searches in a database of nearly 1bn photographs: that’s hundreds of trillions of comparisons, and all on four normal servers. With this algorithm, you can search through a billion photographs in less than a second from a normal computer,” said Kabakov, during an interview at the company’s modest central Moscow office.

Fast machine-learning online optimization of ultra-cold-atom experiments – Nature, May 16, 2016
We apply an online optimization process based on machine learning to the production of Bose-Einstein condensates (BEC). BEC is typically created with an exponential evaporation ramp that is optimal for ergodic dynamics with two-body s-wave interactions and no other loss rates, but likely sub-optimal for real experiments. Through repeated machine-controlled scientific experimentation and observations, our ‘learner’ discovers an optimal evaporation ramp for BEC production. (code available below)

‘Huge leap forward’: Computer that mimics human brain beats professional at game of Go – Science Mag, January 27, 2016
Eighteen years after a computer beat then-reigning world champion Garry Kasparov at chess, a machine has defeated a professional player at the ancient eastern board game Go. The new advance is much bigger, artificial intelligence (AI) researchers say, as Go is such a computationally demanding game that even a decade ago some researchers thought a computer would never defeat a human expert. What’s more, the machine won not by virtue of overwhelming computational power, but by employing “machine learning” tools that enable it to teach itself and to think more like humans do.

Inceptionism: Going Deeper into Neural Networks – Google, June 17, 2015
Artificial Neural Networks have spurred remarkable recent progress in image classification and speech recognition. But even though these are very useful tools based on well-known mathematical methods, we actually understand surprisingly little of why certain models work and others don’t. So let’s take a look at some simple techniques for peeking inside these networks … (code available below)

This app knows how you feel from the look on your face – TED, May 2015
Our emotions influence every aspect of our lives — how we learn, how we communicate, how we make decisions. Yet they’re absent from our digital lives; the devices and apps we interact with have no way of knowing how we feel. Scientist Rana el Kaliouby aims to change that. She demos a powerful new technology that reads your facial expressions and matches them to corresponding emotions. This “emotion engine” has big implications, she says, and could change not just how we interact with machines — but with each other.


Tutorials, Lectures, and Classes

Lists of Machine Learning and Statistics Repositories

Machine Learning (Coursera) by Andrew Ng, Stanford University, 2012
A MUST for anyone diving into Machine Learning. The entire series of lectures can be downloaded as a single archive here.

Michael Nielsen, Recurse Center, current
Scientist, writer, and programmer Michael Nielsen provides a wealth of knowledge and expertise in Machine Learning. He is the author of “Reinventing Discovery”, “Quantum Computation and Quantum Information”, and “Neural Networks and Deep Learning”, a free online book explaining the core ideas behind artificial neural networks and deep learning (includes free Python code).

Neural Networks Demystified (animated) by Stephen Welch – Nov 4, 2014 with the code for each video also available.

A Gentle Introduction To Machine Learning (animated) by Kyle Kastner, Southwest Research Institute – July 1, 2013

Machine Learning: The Basics (PPT lecture) by Ron Bekkerman – Mar 19, 2012

Artificial Intelligence – Thinking Allowed (interview) by Jeffrey Mishlove – November 3, 2011
An interview with John McCarthy (1927-2011), inventor of LISP; discusses the history of artificial intelligence and the future role which non-monotonic reasoning will play in enabling computers to simulate the human mind.

Automated Design Using Darwinian Evolution and Genetic Programming (lecture) – February 18, 2009
John Koza, founder of GP, describes an automated “What You Want Is What You Get” process for designing complex structures based on the principles of natural selection, sexual recombination, and developmental biology.

Neural Networks and Deep Learning (free e-book) – January 2016
Neural networks and deep learning currently provide the best solutions to many problems in image recognition, speech recognition, and natural language processing. This book will teach you many of the core concepts behind neural networks and deep learning:

  • Neural networks, a beautiful biologically-inspired programming paradigm which enables a computer to learn from observational data
  • Deep learning, a powerful set of techniques for learning in neural networks
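To make the first bullet concrete, here is a minimal sketch in plain Python of a single artificial neuron learning from observational data: a perceptron trained on the logical AND function. This is an illustrative toy, not code from the book:

```python
# A single artificial neuron (perceptron) learning the logical AND
# function from labelled observations.

def train_perceptron(samples, labels, lr=0.1, epochs=20):
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x0, x1), target in zip(samples, labels):
            # Step activation: fire when the weighted sum is positive.
            output = 1 if weights[0] * x0 + weights[1] * x1 + bias > 0 else 0
            error = target - output
            # Perceptron rule: nudge weights toward the desired output.
            weights[0] += lr * error * x0
            weights[1] += lr * error * x1
            bias += lr * error
    return weights, bias

def predict(weights, bias, x):
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]  # logical AND
weights, bias = train_perceptron(samples, labels)
print([predict(weights, bias, x) for x in samples])  # [0, 0, 0, 1]
```

Deep learning stacks many such units into layers and replaces the step rule with gradient-based training, but the learn-from-examples loop is the same idea.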



scikit-learn
The go-to library for Machine Learning in Python: a tried and tested, industry-standard collection of simple and efficient tools for data mining and data analysis, built on NumPy, SciPy, and matplotlib, and released under an open-source, commercially usable BSD license. scikit-learn offers code examples and tutorials for:

  • Classification
  • Regression
  • Clustering
  • Dimensionality reduction
  • Model selection
  • Preprocessing
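A minimal sketch of the classification workflow, using the library's bundled iris sample dataset (assuming scikit-learn is installed):

```python
# Minimal scikit-learn classification workflow: load a bundled dataset,
# hold out a test split, fit a model, and score it.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.25, random_state=0)

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)           # learn from the training examples
accuracy = clf.score(X_test, y_test)  # evaluate on held-out data
print("test accuracy: %.2f" % accuracy)
```

Every estimator in the library follows this same fit/predict/score pattern, which is what makes it so approachable.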

Deep Learning Libraries in Python
This list was generated by Adrian Rosebrock – thank you!

Caffe
A deep learning framework made with expression, speed, and modularity in mind, developed by the Berkeley Vision and Learning Center (BVLC) and by community contributors. Caffe's expressive architecture encourages application and innovation, its extensible code fosters active development, and its speed supports a community spanning academic research projects, startup prototypes, and even large-scale industrial applications in vision, speech, and multimedia.

Theano
A Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. Theano features:

  • Tight integration with NumPy – Use numpy.ndarray in Theano-compiled functions.
  • Transparent use of a GPU – Perform data-intensive calculations up to 140x faster than on a CPU (float32 only).
  • Efficient symbolic differentiation – Theano computes derivatives for functions with one or many inputs.
  • Speed and stability optimizations – Get the right answer for log(1+x) even when x is really tiny.
  • Dynamic C code generation – Evaluate expressions faster.
  • Extensive unit-testing and self-verification – Detect and diagnose many types of errors.
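The speed-and-stability bullet can be illustrated in plain Python: the naive formula for log(1+x) loses all precision for tiny x, while the `log1p` form (the kind of rewrite Theano performs automatically) does not. A conceptual illustration, not Theano code:

```python
import math

x = 1e-17
naive = math.log(1 + x)   # 1 + 1e-17 rounds to exactly 1.0 in float64
stable = math.log1p(x)    # computes log(1+x) without forming 1+x

print(naive)   # 0.0: all information about x was lost in the rounding
print(stable)  # ~1e-17, correct to within rounding
```

Theano applies this and many similar rewrites to the symbolic expression graph before compiling it, so the user can write the straightforward formula and still get the stable answer.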

TensorFlow
An open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API.
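The dataflow-graph idea can be sketched in a few lines of plain Python (a conceptual toy, not the TensorFlow API): nodes are operations, edges carry values between them, and evaluating a node pulls results through its inputs.

```python
# Toy dataflow graph: each node is an operation whose inputs are other
# nodes; evaluating a node recursively evaluates its inputs first.
class Node:
    def __init__(self, op, inputs=()):
        self.op = op          # function combining the input values
        self.inputs = inputs  # upstream nodes (the graph edges)

    def eval(self):
        return self.op(*(n.eval() for n in self.inputs))

def constant(value):
    return Node(lambda: value)

# Build the graph for (a * b) + c, then run it.
a, b, c = constant(3.0), constant(4.0), constant(5.0)
mul = Node(lambda x, y: x * y, (a, b))
add = Node(lambda x, y: x + y, (mul, c))
print(add.eval())  # 17.0
```

Because the graph is an explicit data structure rather than straight-line code, a real system like TensorFlow can schedule independent nodes in parallel and place them on different CPUs or GPUs.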

Lasagne
A lightweight library to build and train neural networks in Theano. Its main features are:

  • Supports feed-forward networks such as Convolutional Neural Networks (CNNs), recurrent networks including Long Short-Term Memory (LSTM), and any combination thereof
  • Allows architectures of multiple inputs and multiple outputs, including auxiliary classifiers
  • Many optimization methods including Nesterov momentum, RMSprop and ADAM
  • Freely definable cost function and no need to derive gradients due to Theano’s symbolic differentiation
  • Transparent support of CPUs and GPUs due to Theano’s expression compiler

MXNet
A lightweight, portable, flexible framework for distributed and mobile deep learning, with a dynamic, mutation-aware dataflow dependency scheduler, for Python, R, Julia, Scala, Go, JavaScript, and more.

sklearn-theano
Experiments with scikit-learn compatible estimators, transformers, and datasets for Theano.

nolearn
A number of wrappers and abstractions around existing neural network libraries, most notably Lasagne, along with a few machine learning utility modules. All code is compatible with scikit-learn.

Keras: Deep Learning library for Theano and TensorFlow
Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation. Being able to go from idea to result with the least possible delay is key to doing good research.

DIGITS by NVIDIA
If you want to start with zero know-how, NVIDIA DIGITS, an interface to the Caffe DNN environment (an industry standard), is the best option. As of now, DIGITS is designed to work with image datasets, but if you know Caffe you can easily tweak the parameters to work with other kinds of data (time series, text, etc.). A very useful and functional interface for professional work. The GitHub download is available here.

Inceptionism (Deep Dream) by Google
This repository contains an IPython Notebook with sample code, complementing the Google Research blog post about neural network art. See the original gallery for more examples. (full story above)

Magenta by Google
Magenta is a project from the Google Brain team that asks: Can we use machine learning to create compelling art and music? If so, how? If not, why not?

Machine-learning online optimization package
M-LOOP is a package written in Python for the online optimization of quantum experiments. It uses algorithms based on machine learning to efficiently find an optimal set of parameters for the experiment, all in real time. (full story above)
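The overall loop that M-LOOP automates (propose parameters, run the experiment, learn from the measured cost) can be sketched in plain Python. The names and the simple accept-if-better rule below are illustrative stand-ins, not M-LOOP's API or its actual learning algorithm:

```python
import random

# Toy stand-in for a quantum experiment: returns a measured cost for a
# chosen parameter setting; lower is better.
def run_experiment(param):
    return (param - 2.5) ** 2  # pretend the unknown optimum is 2.5

# Simple online optimizer: perturb the best-known setting, run one
# experiment, and keep the new setting only if the cost improved.
def online_optimize(n_runs=200, seed=1):
    rng = random.Random(seed)
    best_param = 0.0
    best_cost = run_experiment(best_param)
    for _ in range(n_runs):
        candidate = best_param + rng.gauss(0.0, 0.5)
        cost = run_experiment(candidate)  # one real-time experiment
        if cost < best_cost:
            best_param, best_cost = candidate, cost
    return best_param, best_cost

best_param, best_cost = online_optimize()
print(best_param, best_cost)
```

M-LOOP replaces the blind perturbation with a learned model of the cost surface, so each proposed setting is informed by every previous experimental run.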

OpenAI Gym Beta
A toolkit for developing and comparing reinforcement learning algorithms. It supports teaching agents everything from walking to playing games like Pong or Go.
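The observe-act-reward loop that Gym standardizes looks roughly like this; `CorridorEnv` and its three-value `step` return are illustrative stand-ins, not the actual Gym API:

```python
import random

# Toy environment in the observe/act/reward mold: an agent walks a
# corridor and is rewarded for reaching the far end.
class CorridorEnv:
    def __init__(self, length=5):
        self.length = length

    def reset(self):
        self.position = 0
        return self.position  # initial observation

    def step(self, action):  # action is +1 (right) or -1 (left)
        self.position = max(0, self.position + action)
        done = self.position >= self.length
        reward = 1.0 if done else 0.0
        return self.position, reward, done

random.seed(0)
env = CorridorEnv()
obs = env.reset()
total_reward, done = 0.0, False
while not done:
    action = random.choice([-1, 1])  # a (very naive) random policy
    obs, reward, done = env.step(action)
    total_reward += reward
print(total_reward)  # 1.0 once the agent stumbles to the end
```

A reinforcement learning algorithm replaces the random policy with one that improves from the reward signal; because every Gym environment exposes the same reset/step interface, the same agent code can be compared across tasks from Pong to Go.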

Tinker With a Neural Network
Tinker With a Neural Network Right Here in Your Browser. Don’t Worry, You Can’t Break It. We Promise.

Genetic Programming Implementations

Karoo GP by Kai Staats
Karoo GP is an evolutionary algorithm, a genetic programming application suite written in Python which provides both symbolic regression and classification analysis. Karoo GP is a scalable platform with multicore support, designed to readily work with real-world data. No programming is required. As a teaching tool, it enables instructors to share, step by step, how an evolutionary algorithm arrives at its solution. As a hands-on learning tool, Karoo GP supports rapid, repeatable experimentation.
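The core loop of tree-based genetic programming for symbolic regression can be sketched in plain Python. This is a conceptual toy illustrating the approach (evolving expression trees to fit the target y = x*x), not Karoo GP's implementation:

```python
import math
import random

# Minimal tree-based genetic programming for symbolic regression:
# evolve expression trees over {+, -, *}, the variable x, and a few
# constants to fit the target function y = x * x.
OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b, '*': lambda a, b: a * b}
TERMINALS = ['x', 0.0, 1.0, 2.0]
SAMPLES = [i / 2.0 for i in range(-4, 5)]

def target(x):
    return x * x

def random_tree(rng, depth=3):
    # A tree is either a terminal or a tuple (op, left, right).
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(TERMINALS)
    return (rng.choice(list(OPS)), random_tree(rng, depth - 1),
            random_tree(rng, depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree):
    # Sum of squared errors against the target; lower is better.
    err = sum((evaluate(tree, x) - target(x)) ** 2 for x in SAMPLES)
    return err if math.isfinite(err) else float('inf')

def mutate(tree, rng):
    # Replace the whole subtree, or recurse into one branch.
    if rng.random() < 0.2 or not isinstance(tree, tuple):
        return random_tree(rng, depth=2)
    op, left, right = tree
    if rng.random() < 0.5:
        return (op, mutate(left, rng), right)
    return (op, left, mutate(right, rng))

def evolve(generations=40, pop_size=60, seed=3):
    rng = random.Random(seed)
    pop = [random_tree(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 4]  # elitist truncation selection
        pop = survivors + [mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    best = min(pop, key=fitness)
    return best, fitness(best)

best, err = evolve()
print(best, err)
```

A full GP system such as Karoo GP adds sexual recombination (crossover of subtrees between parents), tournament selection, and bloat control, but the generate-evaluate-select-vary cycle above is the heart of the method.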



Machine Learning A Cappella – Overfitting Thriller! – a must-see :)

EC class/code libraries

Machine Learning Demonstrations

Tetris AI – Genetic Programming Vs Tetris Game

“Santa Fe Trail” problem – Genetic Programming

Call for Papers

Google’s ROACH for ML – May 2016