docs: add a Getting Started Section #1577

Open · wants to merge 4 commits into base: main
186 changes: 186 additions & 0 deletions nx/guides/getting_started/installation.md
# Installation

The only prerequisite for installing Nx is Elixir itself. If you don't have Elixir installed
on your machine, you can visit the [installation page](https://elixir-lang.org/install.html).

There are several ways to install Nx (Numerical Elixir), depending on your project type and needs.

## Using Nx in a Standard Elixir Project

If you are working inside a Mix project, the recommended way to install Nx is by adding it to your `mix.exs` dependencies:

1. Open `mix.exs` and modify the `deps` function:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"} # Install the latest stable version
  ]
end
```

2. Fetch the dependencies by running this in your terminal:

```sh
mix deps.get
```
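
As a quick sanity check (a minimal sketch; the values are arbitrary), start `iex -S mix` and build a tensor. Depending on your Nx version, the default integer type may print as s32 or s64:

```elixir
# In `iex -S mix`: build a small tensor and reduce it
tensor = Nx.tensor([[1, 2], [3, 4]])
Nx.sum(tensor)
#=> #Nx.Tensor<
#     s32
#     10
#   >
```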

## Installing Nx from GitHub (Latest Development Version)

If you need the latest, unreleased features, install Nx directly from the GitHub repository.

1. Modify `mix.exs`:

```elixir
defp deps do
  [
    # The Nx repository is a monorepo; `sparse: "nx"` fetches only the nx app
    {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```
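
To confirm which version was fetched (a hedged check; the exact string depends on the commit you got), you can inspect the loaded application version from `iex -S mix`:

```elixir
# Returns the nx application version as a charlist, e.g. ~c"0.10.0-dev"
Application.spec(:nx, :vsn)
```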

## Installing Nx in a Standalone Script (Without a Mix Project)

If you don’t have a Mix project and just want to run a standalone script, use `Mix.install/1` to dynamically fetch and install Nx.

```elixir
Mix.install([:nx])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run the script with:

```sh
elixir my_script.exs
```

Best for: Quick experiments, small scripts, or one-off computations.
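
`Mix.install/1` accepts the same dependency tuples as `mix.exs`, so you can also pin a version to keep a script reproducible. A small sketch (the version shown is only an example):

```elixir
# my_script.exs: pin Nx so the script keeps working across releases
Mix.install([{:nx, "~> 0.5"}])

t = Nx.tensor([1.0, 2.0, 3.0])
IO.inspect(Nx.mean(t))
```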

## Installing the Latest Nx from GitHub in a Standalone Script

To use the latest development version in a script (without a Mix project):

```elixir
Mix.install([
  # `sparse: "nx"` fetches only the nx app from the elixir-nx/nx monorepo
  {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run:

```sh
elixir my_script.exs
```

Best for: Trying new features from Nx without creating a full project.

## Installing Nx with EXLA for GPU Acceleration

To enable GPU/TPU acceleration with Google’s XLA backend, install Nx along with EXLA:

1. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"},
    {:exla, "~> 0.5"} # EXLA (Google XLA backend)
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Set EXLA as the default backend at runtime (the backend is configured through Nx itself; there is no `EXLA.set_preferred_backend/1` function):

```elixir
# Route tensor operations through EXLA (optionally pick a client, e.g. client: :cuda)
Nx.default_backend(EXLA.Backend)
```
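
Alternatively, you can make EXLA the default backend for the whole application in your config files. A minimal sketch, assuming a standard Mix project:

```elixir
# config/config.exs
import Config

# All Nx operations will allocate tensors on the EXLA backend by default
config :nx, default_backend: EXLA.Backend
```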

Best for: Running Nx on GPUs or TPUs using Google’s XLA compiler.
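
EXLA can also serve as the compiler for numerical definitions (`defn`), JIT-compiling them for your CPU, GPU, or TPU. A minimal sketch, assuming EXLA compiled successfully on your machine:

```elixir
# Use EXLA to JIT-compile `defn` functions in the current process
Nx.Defn.default_options(compiler: EXLA)

defmodule MyMath do
  import Nx.Defn

  # softplus(x) = log(1 + e^x), applied element-wise
  defn softplus(x) do
    Nx.log(1 + Nx.exp(x))
  end
end

MyMath.softplus(Nx.tensor([1.0, 2.0]))
```

Note that `Nx.Defn.default_options/1` affects only the calling process; `Nx.Defn.global_default_options/1` applies the option globally.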

## Installing Nx with Torchx for PyTorch Acceleration

To run Nx operations on PyTorch’s backend (LibTorch):

1. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"},
    {:torchx, "~> 0.5"} # PyTorch backend (LibTorch)
  ]
end
```
Contributor comment: Let's not mention Torchx. We could maybe reference EMLX, but it's not even released yet, so let's leave this for later.

2. Fetch dependencies:

```sh
mix deps.get
```

3. Set Torchx as the default backend (as with EXLA, this goes through Nx; there is no `Torchx.set_preferred_backend/0` function):

```elixir
Nx.default_backend(Torchx.Backend)
```

Best for: Deep learning applications with PyTorch acceleration.

## Installing Nx with OpenBLAS for CPU Optimization

To optimize CPU performance with OpenBLAS:

1. Install OpenBLAS (libopenblas):
- Ubuntu/Debian:
```sh
sudo apt install libopenblas-dev
```
- MacOS (using Homebrew):
```sh
brew install openblas
```
2. Note that there is no separate `:openblas` Hex package to add to `mix.exs`: OpenBLAS is a
   system library. Once it is installed, native backends that link against BLAS can take
   advantage of it, while Nx's default pure-Elixir backend does not use it.

Best for: Optimizing CPU-based tensor computations.

82 changes: 82 additions & 0 deletions nx/guides/getting_started/introduction.md
# What is Nx?
Contributor comment: @josevalim This is the start of our revamped docs.

We're taking the current getting started guide and both splitting it and getting into more detail.
I'm thinking we should merge these onto a new-docs branch and only merge that onto main after the getting started is fully done.

WDYT?

Collaborator comment: I think it is fine to push to main directly, given the plan is for continuous work on it, right?

Contributor comment: We'll make it so that this PR is merged when it fully replaces the previous introduction to nx guide.

Nx is the numerical computing library of Elixir. Since Elixir's primary numerical datatypes and structures are not optimized for numerical programming, Nx is the fundamental package built to bridge this gap.

[Elixir Nx](https://github.com/elixir-nx/nx) smoothly integrates typed, multidimensional data called [tensors](introduction.html#what-are-tensors).
Nx has four primary capabilities:

- In Nx, tensors hold typed data in multiple, optionally named dimensions.
- Numerical definitions, known as `defn`, support custom code with
tensor-aware operators and functions.
- [Automatic differentiation](https://arxiv.org/abs/1502.05767), also known as
autograd or autodiff, supports common computational scenarios
such as machine learning, simulations, curve fitting, and probabilistic models.
- Broadcasting, which automatically expands tensors of different shapes so that
  element-by-element operations can be applied across them. Most Nx operations
  use automatic implicit broadcasting; a short example follows this list, and you
  can see more on broadcasting [here.](intro-to-nx.html#broadcasts)
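
For instance, broadcasting lets a lower-rank tensor combine with a higher-rank one without manual reshaping:

```elixir
a = Nx.tensor([[1, 2], [3, 4]])
b = Nx.tensor([10, 20])

# b is implicitly broadcast across each row of a
Nx.add(a, b)
#=> [[11, 22], [13, 24]]
```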

Nx tensors can hold unsigned integers (u2, u4, u8, u16, u32, u64),
signed integers (s2, s4, s8, s16, s32, s64),
floats (f8, f16, f32, f64), brain floats (bf16), and complex (c64, c128).
Tensors support backends implemented outside of Elixir, such as Google's
Accelerated Linear Algebra (XLA) and PyTorch.
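
For example, a type can be requested explicitly when building a tensor (the exact shorthand spellings vary slightly across Nx versions):

```elixir
# An unsigned 8-bit integer tensor and a 64-bit float tensor
Nx.tensor([1, 2, 3], type: :u8)
Nx.tensor([1.0, 2.0], type: :f64)
```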

Numerical definitions provide compiler support for just-in-time compilation
targeting specialized processors, including GPUs and TPUs, to speed up numeric computation.

## What are Tensors?

In Nx, we express multi-dimensional data using typed tensors. Simply put,
a tensor is a multi-dimensional array with a predetermined shape and
type. To interact with them, Nx relies on tensor-aware operators rather
than `Enum.map/2` and `Enum.reduce/3`.
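
As a small illustration, where list code reaches for `Enum.map/2`, tensor code applies one tensor-aware operation to every element at once:

```elixir
# List-based: element by element via Enum
Enum.map([1, 2, 3], fn x -> x * 2 end)
#=> [2, 4, 6]

# Tensor-based: one operation over the whole tensor
Nx.multiply(Nx.tensor([1, 2, 3]), 2)
```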

Tensors let us work with the central theme of numerical computing, systems of equations,
which are often expressed and solved with multidimensional arrays.

For example, this is a two-dimensional array:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix}
$$

As Elixir programmers, we can typically express a similar data structure using a list of lists,
like this:

```elixir
[
  [1, 2],
  [3, 4]
]
```

This data structure works fine within many functional programming
algorithms, but breaks down with deep nesting and random access.

On top of that, Elixir numeric types lack optimization for many numerical
applications. They work fine when programs
need hundreds or even thousands of calculations. However, they tend to break
down with traditional STEM applications when a typical problem
needs millions of calculations.

To solve this, we can use Nx tensors, for example:

```elixir
Nx.tensor([[1, 2], [3, 4]])
#=> #Nx.Tensor<
#     s32[2][2]
#     [
#       [1, 2],
#       [3, 4]
#     ]
#   >
```
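
Once you have a tensor, you can query its structure (the default integer type may differ across Nx versions):

```elixir
t = Nx.tensor([[1, 2], [3, 4]])

Nx.shape(t) #=> {2, 2}
Nx.rank(t)  #=> 2
Nx.type(t)  #=> {:s, 32}
```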

To learn Nx, we'll get to know tensors first. The following overview will touch
on the major features. The advanced section of the documentation will take a deep dive into working
with tensors in detail, autodiff, and backends.