
MLJ

A Julia machine learning framework


MLJ News (covering MLJ and its satellite packages MLJBase, MLJModels and ScientificTypes) | MLJ Cheatsheet

join!(MLJ, YourModel)

Call for help. MLJ needs your help to ensure its success. This depends crucially on:

  • Existing and developing ML algorithms implementing the MLJ model interface

  • Improvements to existing but poorly maintained Julia ML algorithms

The MLJ model interface is now relatively stable and well-documented, and the core team is happy to respond to requests for assistance raised as GitHub issues. Please see the contributing guidelines for more details.

MLJ is presently supported by a small Alan Turing Institute grant and is looking for new funding sources to grow and maintain the project.


MLJ aims to be a flexible framework for combining and tuning machine learning models, written in Julia, a high-performance language designed for rapid development and scientific programming.

The MLJ project is partly inspired by MLR.

List of presently implemented models

Installation

At the Julia REPL prompt:

using Pkg
Pkg.add("MLJ")
Pkg.add("MLJModels")

To obtain a list of all registered models:

using MLJ
models()

To add a model-providing package (for example, DecisionTree) to your load path:

using Pkg
Pkg.add("DecisionTree")

To load all code needed to use a model (for example, DecisionTreeClassifier):

@load DecisionTreeClassifier

which also returns a default instance. Refer to the documentation for more on instantiating and running loaded models.
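As a hedged sketch of the basic workflow (`@load_iris`, `machine`, `fit!` and `predict` are part of the documented MLJ API, but details may differ between MLJ versions):

```julia
using MLJ

tree = @load DecisionTreeClassifier   # load code and get a default instance

X, y = @load_iris                     # a small built-in dataset
mach = machine(tree, X, y)            # bind the model to data
fit!(mach)                            # train
yhat = predict(mach, X)               # probabilistic predictions for a classifier
```

Here `machine`, `fit!` and `predict` are MLJ's core verbs; for probabilistic models `predict` returns distribution objects, with `predict_mode` giving point predictions.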

Package conflicts. If you encounter package conflicts during installation, and are not familiar with the Julia package manager, you can try installing in a fresh environment by first entering these commands:

using Pkg
Pkg.activate("my_mlj_env", shared=true)

In future REPL sessions, you can activate your (now populated) environment with the same command.
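Put together, a fresh-environment install might look like this (the environment name `my_mlj_env` is arbitrary):

```julia
using Pkg
Pkg.activate("my_mlj_env", shared=true)   # create or switch to a shared environment
Pkg.add(["MLJ", "MLJModels"])             # install into it, isolated from other projects
```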

A docker image with installation instructions is also available.

Features to include (✔ indicates already implemented):

  • Automated tuning of hyperparameters, including composite models with nested parameters. Tuning implemented as a wrapper, allowing composition with other meta-algorithms. ✔

  • Option to tune hyperparameters using gradient descent and automatic differentiation (for learning algorithms written in Julia).

  • Option to tune hyperparameters using Bayesian optimisation

  • Data agnostic: Train models on any data supported by the Tables.jl interface. ✔

  • Intuitive syntax for building arbitrarily complicated learning networks. ✔

  • Learning networks can be exported as self-contained composite models ✔, but common networks (e.g., linear pipelines ✔, stacks) come ready to plug-and-play.

  • Performant parallel implementation of large homogeneous ensembles of arbitrary models (e.g., random forests). ✔

  • Model registry and facility to match models to machine learning tasks. ✔

  • Benchmarking a battery of assorted models for a given task.

  • Automated estimates of CPU and memory requirements for a given task/model.

  • Friendly interface for handling probabilistic prediction. ✔
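To illustrate the tuning-as-a-wrapper design ticked above, here is a hedged sketch; exact keyword names (e.g., `ranges` vs `range`) have varied across MLJ releases:

```julia
using MLJ

tree = @load DecisionTreeClassifier

# A one-dimensional range over a hyperparameter of the atomic model:
r = range(tree, :max_depth, lower=1, upper=10)

# TunedModel wraps the atomic model; the wrapper is itself a model,
# so it composes with ensembling, pipelines and other meta-algorithms.
tuned_tree = TunedModel(model=tree,
                        tuning=Grid(),
                        resampling=CV(nfolds=3),
                        ranges=r,
                        measure=cross_entropy)

X, y = @load_iris
mach = machine(tuned_tree, X, y)
fit!(mach)    # runs the grid search and refits the best model on all data
```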

Frequently Asked Questions

See here.

Getting started

Get started here, or take the MLJ tour.

History

Antecedents of the current package are AnalyticalEngine.jl, Orchestra.jl, and Koala.jl. Development was also guided by a research study group at the University of Warwick, beginning with a review of the ML modules available in Julia at the time (in-depth, overview).


Further work culminated in the first MLJ proof-of-concept.

For administrators: Implementing requests to register new models.

First commit: 08/01/2018 (1014 commits to date)