This is the first post of a new course about Deep Learning and Torch.
In this post, you will see how to install Torch, how to configure the development environment, and get a brief overview of the Lua language.
1. About Deep Learning with Torch Tutorial / Course
I’ll be covering the basic theoretical aspects of Deep Learning and showing how to implement them with Torch.
2. About Torch
Torch is a scientific computing framework with wide support for machine learning algorithms that puts GPUs first. It is easy to use and efficient, thanks to an easy and fast scripting language, LuaJIT, and an underlying C/CUDA implementation.
A summary of core features:
- a powerful N-dimensional array
- lots of routines for indexing, slicing, transposing, …
- amazing interface to C, via LuaJIT
- linear algebra routines
- neural network and energy-based models
- numeric optimization routines
- fast and efficient GPU support
- embeddable, with ports to iOS, Android and FPGA backends
When working with Deep Learning, there are two essential packages that you will use with Torch: nn and optim. These two packages will be covered in the following posts. Also, the main component of Torch is the Tensor, which will be covered in the next post.
3. Installing Torch
Feel free to follow the installation instructions in the official page: http://torch.ch/docs/getting-started.html
Basically, you need to run the following commands in the terminal:
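At the time of writing, the official getting-started page lists commands along these lines (the ~/torch clone path is the guide's default):

```shell
# Clone the Torch distribution repository with its submodules
git clone https://github.com/torch/distro.git ~/torch --recursive
# Install system dependencies, then run the installer
cd ~/torch; bash install-deps
./install.sh
```

The install script builds LuaJIT, LuaRocks, and the core Torch packages; check the official page if these steps have changed.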
After the installation process, you should have some new commands available, such as th and luarocks.
If these commands are not available, make sure to append the ~/torch/bin folder to the system path.
4. Playing with Lua
If you are experienced with Lua, you can skip to the next section :)
Torch uses LuaJIT. To play with it, open a new terminal and type luajit; you should see something like:
LuaJIT 2.1.0-beta1 -- Copyright (C) 2005-2015 Mike Pall. http://luajit.org/
JIT: ON SSE2 SSE3 SSE4.1 fold cse dce fwd dse narrow loop abc sink fuse
th>
4.1 - Variables
In Lua, all variables are global by default; you need the local keyword to declare a local variable.
A variable name can be composed of letters, digits, and the underscore character, and Lua is case sensitive.
Lua is dynamically typed, so you don’t declare the type of a variable; it is determined automatically at runtime.
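A minimal sketch of variable declaration (the names and values here are just illustrative):

```lua
-- Without 'local', an assignment creates a global variable.
x = 10          -- global
local y = 20    -- local to the enclosing block or chunk

-- Dynamic typing: a variable can be reassigned to a value of another type.
local name = "Torch"  -- holds a string
name = 7              -- now holds a number
```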
Here is a list of some of the standard variable types:
- nil: the absence of a value
- boolean: accepts the values true and false
- number: real numbers (with double precision)
- string: an array of characters, which can be defined with single or double quotes
- function: a function written in Lua or C
- table: implements ordinary arrays, dictionaries, symbol tables, sets, records, graphs, trees, etc.
If you want to see the type of a variable, you can use the type function.
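A quick illustration using the standard type function, which returns the type of a value as a string:

```lua
print(type(nil))        -- "nil"
print(type(true))       -- "boolean"
print(type(3.14))       -- "number"
print(type("hello"))    -- "string"
print(type(print))      -- "function"
print(type({1, 2, 3}))  -- "table"
```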
4.2 - Operators
Lua provides the following operators:
- +: sum two numbers
- -: subtract two numbers
- *: multiply two numbers
- /: divide two numbers
- %: modulo (remainder of a division)
- ^: power
- ==: equal
- ~=: not equal
- >: greater than
- <: less than
- >=: greater than or equal to
- <=: less than or equal to
- ..: concatenates two strings
- #: returns the length of the object (example: #"Hi!" is 3)
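A few of these operators in action (the operand values are illustrative):

```lua
local a, b = 7, 2
print(a + b)              -- 9
print(a % b)              -- 1  (remainder of 7 / 2)
print(a ^ b)              -- 49 (7 to the power of 2)
print(a ~= b)             -- true
print("Hello " .. "Lua")  -- Hello Lua (string concatenation)
print(#"Hi!")             -- 3  (length of the string)
```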
4.3 - Loops
You can use the following loop constructs in Lua: while, repeat, and for.
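Lua's three loop constructs can be sketched as follows:

```lua
-- while: tests the condition before each iteration
local i = 1
while i <= 3 do
  print(i)
  i = i + 1
end

-- repeat: runs the body at least once, looping until the condition is true
local j = 0
repeat
  j = j + 1
until j >= 3

-- numeric for: start value, stop value, and an optional step (default 1)
local sum = 0
for k = 1, 10 do
  sum = sum + k
end
print(sum)  -- 55
```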
4.4 - If
You can control your logic with the if statement, which can be combined with else and elseif.
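A minimal sketch of both forms (the score threshold values are illustrative):

```lua
local score = 75

-- if and else
if score >= 50 then
  print("pass")
else
  print("fail")
end

-- if, elseif, and else
local grade
if score >= 90 then
  grade = "A"
elseif score >= 70 then
  grade = "B"
else
  grade = "C"
end
print(grade)  -- B
```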
4.6 - More about Lua
If you need more resources about Lua, follow the Lua Tutorial and search online for more material.
5. LuaRocks
LuaRocks is the package manager for Lua modules.
It allows you to create and install Lua modules as self-contained packages called rocks. You can download and install LuaRocks on Unix and Windows.
To install a new package, for example, the dp package, a pretty good deep learning library for streamlining research and development using the Torch7 distribution, you can use the command:
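Assuming the standard luarocks syntax, the command looks like this:

```shell
# Install the dp rock (run from a shell where luarocks is on the path)
luarocks install dp
```

luarocks resolves and installs the package's dependencies as well.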
This post covered the basics of the Torch7 installation and gave a brief overview of the Lua language. We’re going to get into more technical details in the next post, where you will learn about the Tensor, the core of Torch.
Thank you for reading, and I’ll see you in the next post.