Best Julia library for neural networks

13

6

I have been using this library for basic neural network construction and analysis.

However, it does not support building multi-layered neural networks and the like.

So, I would like to know of any nice libraries for doing advanced neural networks and Deep Learning in Julia.

Dawny33

Posted 2015-11-19T06:04:53.053

Reputation: 7 606

1 https://github.com/dmlc/MXNet.jl – itdxer – 2015-11-19T09:40:42.067

1 @itdxer Thank you for the link. Could you post that as an answer and elaborate on it? – Dawny33 – 2015-11-19T10:17:56.380

Answers

7

Mocha.jl - Mocha is a Deep Learning framework for Julia, inspired by the C++ framework Caffe.

The project has good documentation and examples, and it can run on either a CPU or a GPU backend.
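For a flavor of the API, here is a rough sketch of a small multi-layer perceptron in Mocha's declarative, Caffe-style layer interface. Layer and parameter names follow Mocha's MNIST tutorial; the HDF5 source path is a placeholder.

using Mocha

# Data layer reading features and labels from an HDF5 list file (placeholder path).
data = HDF5DataLayer(name="train-data", source="data/train.txt",
                     batch_size=64, tops=[:data, :label])
# Two fully connected layers and a softmax loss, wired together via bottoms/tops.
ip1  = InnerProductLayer(name="ip1", output_dim=128, neuron=Neurons.ReLU(),
                         bottoms=[:data], tops=[:ip1])
ip2  = InnerProductLayer(name="ip2", output_dim=10, bottoms=[:ip1], tops=[:ip2])
loss = SoftmaxLossLayer(name="loss", bottoms=[:ip2, :label])

backend = CPUBackend()   # a GPU backend is also available
init(backend)

net = Net("mlp-train", backend, [data, ip1, ip2, loss])

Training then proceeds by handing the net to one of Mocha's solvers (e.g. SGD), as shown in its tutorials.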

Bartłomiej Twardowski

Posted 2015-11-19T06:04:53.053

Reputation: 409

1

I think they stopped developing Mocha and MXNet is the way to go forward. See malmaud's comment here: https://github.com/pluskid/Mocha.jl/issues/157

– niczky12 – 2016-08-01T13:31:17.993

I've used Mocha for a while; it has some issues and lacks a community, and I concur that MXNet is where active development is. There's also a Julia wrapper for TensorFlow: https://github.com/malmaud/TensorFlow.jl (disclaimer: I haven't used either MXNet or the TF Julia wrapper).

– davidparks21 – 2016-10-08T22:19:58.260

9

MXNet Julia Package - flexible and efficient deep learning in Julia

https://github.com/dmlc/MXNet.jl

Pros

  • Fast
  • Scales to multiple GPUs and distributed settings with automatic parallelism.
  • Lightweight, memory-efficient, and portable to smart devices.
  • Automatic Differentiation

Cons
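For a sense of how the package is used, here is a minimal sketch of its symbolic API, adapted from the MXNet.jl MNIST example; train_provider stands in for a real data provider and is assumed here:

using MXNet

# Symbolic definition of a small multi-layer perceptron, chained layer by layer.
mlp = @mx.chain mx.Variable(:data)                 =>
      mx.FullyConnected(name=:fc1, num_hidden=128) =>
      mx.Activation(name=:relu1, act_type=:relu)   =>
      mx.FullyConnected(name=:fc2, num_hidden=10)  =>
      mx.SoftmaxOutput(name=:softmax)

# train_provider is a placeholder for a data provider, e.g. one of the MNIST
# providers from the package's examples directory.
model     = mx.FeedForward(mlp, context=mx.cpu())  # mx.gpu() for a GPU context
optimizer = mx.SGD(lr=0.1, momentum=0.9)
mx.fit(model, optimizer, train_provider, n_epoch=20)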

itdxer

Posted 2015-11-19T06:04:53.053

Reputation: 219

5

Just to add a more recent (2019) answer: Flux.

Flux is an elegant approach to machine learning. It's a 100% pure-Julia stack,
and provides lightweight abstractions on top of Julia's native GPU and
AD support. Flux makes the easy things easy while remaining fully hackable.

For example:

using Flux  # provides Chain, Dense, LSTM, softmax, crossentropy, and train!

# A dense encoder feeding two LSTM layers and a 10-way softmax classifier.
model = Chain(
  Dense(768, 128, σ),
  LSTM(128, 256),
  LSTM(256, 128),
  Dense(128, 10),
  softmax)

loss(x, y) = crossentropy(model(x), y)

# `data` is an iterator of (x, y) batches; the optimiser arguments are elided.
Flux.train!(loss, data, ADAM(...))

Wayne

Posted 2015-11-19T06:04:53.053

Reputation: 226

3

As of October 2016, there's also a TensorFlow wrapper for Julia: https://github.com/malmaud/TensorFlow.jl.
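A minimal sketch along the lines of the TensorFlow.jl README, showing how a small graph is built and then evaluated through a session (the concrete values are only illustrative):

using TensorFlow

sess = TensorFlow.Session()

x = TensorFlow.constant(Float64[1, 2])
y = TensorFlow.Variable(Float64[3, 4])
z = TensorFlow.placeholder(Float64)

w = exp(x + z + -y)   # graph nodes; nothing is computed until `run`

run(sess, TensorFlow.global_variables_initializer())
res = run(sess, w, Dict(z => Float64[1, 2]))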

davidparks21

Posted 2015-11-19T06:04:53.053

Reputation: 363

1

A newer library worth looking at is Knet.jl. It handles things like GPU usage under the hood.
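As a rough sketch of Knet's style, modeled on the linear-regression example in its README (the sgd_step! helper is hypothetical), models and losses are written as plain Julia functions and differentiated with grad:

using Knet
using Statistics   # for mean

# Model and loss are ordinary Julia functions; Knet's grad returns a gradient function.
predict(w, x) = w[1] * x .+ w[2]
loss(w, x, y) = mean(abs2, y .- predict(w, x))
lossgradient = grad(loss)

# One plain gradient-descent step; to train on the GPU, store the weights and
# data in KnetArrays instead of ordinary Arrays.
function sgd_step!(w, x, y; lr = 0.01)
    dw = lossgradient(w, x, y)
    for i in 1:length(w)
        w[i] -= lr * dw[i]
    end
    return w
end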

Chris Rackauckas

Posted 2015-11-19T06:04:53.053

Reputation: 111