Handling the Dimensions and Sizes of PyTorch Tensors
Definition
Let’s call $A$ a PyTorch tensor.
The tuple $(a_{0}, a_{1}, \dots, a_{n-1})$ is called the size of $A$.
$$ \text{A.size()} = \text{torch.Size}([a_{0}, a_{1}, \dots, a_{n-1}]) $$
Let’s refer to $n$, the number of entries in this tuple, as the dimension of $A$, and call $A$ an $n$-dimensional tensor. The product $\prod \limits_{i=0}^{n-1} a_{i} = a_{0} \times a_{1} \times \cdots \times a_{n-1}$ is the total number of elements of $A$.
Each $a_{i}$ is the size of the $i$th dimension, a non-negative integer. Since this is Python, note that counting starts from the $0$th dimension.
>>> import torch
>>> A = torch.ones(2,3,4)
>>> A
tensor([[[1., 1., 1., 1.],
         [1., 1., 1., 1.],
         [1., 1., 1., 1.]],

        [[1., 1., 1., 1.],
         [1., 1., 1., 1.],
         [1., 1., 1., 1.]]])
For example, the dimension of such a tensor $A$ is $3$, its size is $(2, 3, 4)$, and its total number of elements is $24 = 2 \cdot 3 \cdot 4$.
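For reference, PyTorch exposes this element count directly through the .numel() method (continuing with the same $A$):
>>> A.numel()
24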
.dim(), .ndim
Returns the dimension of the tensor.
>>> A.dim()
3
>>> A.ndim
3
.shape, .size()
Returns the size of the tensor. Indexing the result, as in A.shape[i], or passing a dimension index, as in A.size(i), gives the size of that single dimension.
>>> A.shape
torch.Size([2, 3, 4])
>>> A.shape[1]
3
>>> A.size()
torch.Size([2, 3, 4])
>>> A.size(2)
4
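Because torch.Size behaves like an ordinary Python tuple, negative indices also work, and .size() likewise accepts a negative dimension index (again with the same $A$):
>>> A.shape[-1]
4
>>> A.size(-1)
4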
.view(), .reshape()
Returns a tensor with the same data but a new size; the total number of elements must stay the same, while the dimension may change. If you use $-1$ as one of the arguments, that size is inferred automatically. For instance, as in the examples below, reshaping a tensor of size $(2,3,4)$ with .view(3,-1) changes its size to $(3,8)$.
>>> A.reshape(8,3)
tensor([[1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.]])
>>> A.view(3,-1)
tensor([[1., 1., 1., 1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1., 1., 1., 1.]])
>>> A.view(-1,4)
tensor([[1., 1., 1., 1.],
        [1., 1., 1., 1.],
        [1., 1., 1., 1.],
        [1., 1., 1., 1.],
        [1., 1., 1., 1.],
        [1., 1., 1., 1.]])
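One practical difference between the two methods, not shown above: .view() never copies data, so it only works when the requested size is compatible with how the tensor is laid out in memory, whereas .reshape() falls back to copying when it has to. A minimal sketch of the distinction (B is just a throwaway name for a non-contiguous tensor obtained by transposing $A$):
>>> B = A.transpose(0, 2)   # size (4, 3, 2); memory layout is no longer contiguous
>>> B.is_contiguous()
False
>>> B.reshape(-1).size()    # works: .reshape() copies the data when it must
torch.Size([24])
>>> B.view(-1)              # raises a RuntimeError, since .view() cannot copy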