In PyTorch, we use the torch.cat() method to concatenate tensors along a given dimension. This method accepts a sequence of tensors and the dimension (along which the concatenation is to be done) as input parameters, and concatenates the tensors along that dimension. All tensors must have the same shape except in the concatenating dimension, i.e., the sizes of the tensors must match in every dimension other than the one along which they are concatenated.
Fig: Concatenate two tensors along different dimensions
Prerequisites
- Python 3
- PyTorch
- Jupyter Notebook
Syntax
torch.cat((tensor1, tensor2, ...), dim)
Parameters:
tensor1, tensor2, ...: The input tensors. Note that the sizes of the tensors must be the same except in the concatenating dimension. For a better understanding, go through the examples below.
dim: The dimension along which the tensors tensor1, tensor2, ... will be concatenated.
Note: If the tensors are n-dimensional, then dim must be in the range [-n, n-1]. For example, if the tensors are two-dimensional, then dim can take values in the range [-2, 1].
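A negative dim counts from the end, so for 2-D tensors dim=-1 refers to the last dimension and is equivalent to dim=1. A quick self-contained illustration:
import torch
t1 = torch.randn(2, 3)
t2 = torch.randn(2, 3)
# dim=-1 refers to the last dimension; for 2-D tensors it is equivalent to dim=1
print(torch.equal(torch.cat((t1, t2), -1), torch.cat((t1, t2), 1)))  # True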
Examples of Concatenating Tensors
torch.cat((t1, t2), 0) # concatenate along first dim
torch.cat((t1, t2), 1) # concatenate along second dim
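Note that dim defaults to 0, so omitting it concatenates along the first dimension:
torch.cat((t1, t2))    # same as torch.cat((t1, t2), 0)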
Now let's look at some complete examples to understand this in depth.
Example 1: Concatenating Two Tensors of Same Sizes
In this example, we concatenate two 2-dimensional tensors of the same size along dimensions 0 and 1.
# import required module
import torch
# create two tensors
t1 = torch.randn(2,3)
t2 = torch.randn(2,3)
# display the tensors
print(t1)
print("------------------------------------")
print(t2)
print("------------------------------------")
# concatenate the above tensors
t = torch.cat((t1,t2), 0)
print(t)
print(t.shape)
print("------------------------------------")
t = torch.cat((t1,t2), 1)
print(t)
print(t.shape)
Output
tensor([[-0.8608,  0.3543,  0.7615],
        [-1.5630, -0.7468, -1.7119]])
------------------------------------
tensor([[ 0.1708, -1.4171, -0.3423],
        [-0.2184,  2.3268,  1.4466]])
------------------------------------
tensor([[-0.8608,  0.3543,  0.7615],
        [-1.5630, -0.7468, -1.7119],
        [ 0.1708, -1.4171, -0.3423],
        [-0.2184,  2.3268,  1.4466]])
torch.Size([4, 3])
------------------------------------
tensor([[-0.8608,  0.3543,  0.7615,  0.1708, -1.4171, -0.3423],
        [-1.5630, -0.7468, -1.7119, -0.2184,  2.3268,  1.4466]])
torch.Size([2, 6])
Example 2: Concatenating Two Tensors of Different Sizes
In this example, we concatenate two 2-dimensional tensors of different sizes along dimensions 0 and 1. Notice that we can concatenate along dimension 0, but not along dimension 1, because the sizes along dimension 0 differ: the first tensor has size 2 and the second has size 1.
# import required module
import torch
# create two 2D tensors
t1 = torch.randn(2,3)
t2 = torch.randn(1,3)
# display the tensors
print(t1)
print("------------------------------------")
print(t2)
print("------------------------------------")
# concatenate the above tensors
t = torch.cat((t1,t2), 0)
print(t)
print(t.shape)
print("------------------------------------")
t = torch.cat((t1,t2), 1)
print(t)
print(t.shape)
Output
Fig: Concatenate two tensors of different sizes
Notice that we get a RuntimeError when concatenating the tensors along dim 1. This is because the sizes of the tensors along dim 0 are not the same.
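If you want to avoid this error, you can verify the sizes before calling torch.cat(). Below is a minimal sketch; the helper can_cat is only illustrative and not part of PyTorch:
def can_cat(a, b, dim):
    # sizes must match in every dimension except the concatenating one
    # (assumes both tensors have the same number of dimensions and dim >= 0)
    return all(a.size(i) == b.size(i) for i in range(a.dim()) if i != dim)
print(can_cat(t1, t2, 0))  # True  -> concatenation along dim 0 works
print(can_cat(t1, t2, 1))  # False -> concatenation along dim 1 would raise a RuntimeError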
Example 3: Concatenating Two 3-D Tensors with Same Sizes
In the following program, we concatenate two 3-dimensional tensors along dims 0, 1, and 2. Note that we are able to concatenate along all of these dims because the sizes of the tensors are the same.
# import required module
import torch
# create two 3D tensors
t1 = torch.randn(2,3,4)
t2 = torch.randn(2,3,4)
# display the tensors
print(t1)
print("-------------------------------------------")
print(t2)
print("-------------------------------------------")
# concatenate the above tensors
t = torch.cat((t1,t2), 0)
print(t)
print(t.shape)
print("-------------------------------------------")
t = torch.cat((t1,t2), 1)
print(t)
print(t.shape)
print("-------------------------------------------")
t = torch.cat((t1,t2), 2)
print(t)
print(t.shape)
Output
tensor([[[-0.4088, -0.3405, -0.5122, -1.1153],
         [-0.5247, -0.2283,  0.0185, -1.2393],
         [ 0.5817, -0.8054,  1.0051,  0.1855]],

        [[ 0.4781,  0.6894,  1.5388, -0.1348],
         [ 1.7484, -1.8953, -0.4417,  0.9228],
         [ 2.7996, -0.3553, -1.4830, -0.8816]]])
-------------------------------------------
tensor([[[-0.1227,  0.3787,  0.4890, -0.3944],
         [ 0.1819,  0.2270, -0.1462, -0.2637],
         [-1.3739,  0.4905, -0.5923,  0.6304]],

        [[ 0.6627, -0.4194,  0.0393, -2.6827],
         [ 0.1396, -0.0193,  2.2819,  2.9780],
         [ 0.3608, -0.1011,  0.7639,  0.8245]]])
-------------------------------------------
tensor([[[-0.4088, -0.3405, -0.5122, -1.1153],
         [-0.5247, -0.2283,  0.0185, -1.2393],
         [ 0.5817, -0.8054,  1.0051,  0.1855]],

        [[ 0.4781,  0.6894,  1.5388, -0.1348],
         [ 1.7484, -1.8953, -0.4417,  0.9228],
         [ 2.7996, -0.3553, -1.4830, -0.8816]],

        [[-0.1227,  0.3787,  0.4890, -0.3944],
         [ 0.1819,  0.2270, -0.1462, -0.2637],
         [-1.3739,  0.4905, -0.5923,  0.6304]],

        [[ 0.6627, -0.4194,  0.0393, -2.6827],
         [ 0.1396, -0.0193,  2.2819,  2.9780],
         [ 0.3608, -0.1011,  0.7639,  0.8245]]])
torch.Size([4, 3, 4])
-------------------------------------------
tensor([[[-0.4088, -0.3405, -0.5122, -1.1153],
         [-0.5247, -0.2283,  0.0185, -1.2393],
         [ 0.5817, -0.8054,  1.0051,  0.1855],
         [-0.1227,  0.3787,  0.4890, -0.3944],
         [ 0.1819,  0.2270, -0.1462, -0.2637],
         [-1.3739,  0.4905, -0.5923,  0.6304]],

        [[ 0.4781,  0.6894,  1.5388, -0.1348],
         [ 1.7484, -1.8953, -0.4417,  0.9228],
         [ 2.7996, -0.3553, -1.4830, -0.8816],
         [ 0.6627, -0.4194,  0.0393, -2.6827],
         [ 0.1396, -0.0193,  2.2819,  2.9780],
         [ 0.3608, -0.1011,  0.7639,  0.8245]]])
torch.Size([2, 6, 4])
-------------------------------------------
tensor([[[-0.4088, -0.3405, -0.5122, -1.1153, -0.1227,  0.3787,  0.4890, -0.3944],
         [-0.5247, -0.2283,  0.0185, -1.2393,  0.1819,  0.2270, -0.1462, -0.2637],
         [ 0.5817, -0.8054,  1.0051,  0.1855, -1.3739,  0.4905, -0.5923,  0.6304]],

        [[ 0.4781,  0.6894,  1.5388, -0.1348,  0.6627, -0.4194,  0.0393, -2.6827],
         [ 1.7484, -1.8953, -0.4417,  0.9228,  0.1396, -0.0193,  2.2819,  2.9780],
         [ 2.7996, -0.3553, -1.4830, -0.8816,  0.3608, -0.1011,  0.7639,  0.8245]]])
torch.Size([2, 3, 8])
Example 4: Concatenating Two 3-D Tensors with Different Sizes
In this example, we concatenate two 3-D tensors of different sizes. The sizes of the tensors differ only in the first dimension, while the other dimensions are the same, so we can concatenate only along the first dimension.
# import required module
import torch
# create two 3D tensors
t1 = torch.randn(10,32,32)
t2 = torch.randn(4,32,32)
# display the tensors
# print(t1)
print(t1.shape)
print("-------------------------------------------")
# print(t2)
print(t2.shape)
print("-------------------------------------------")
# concatenate the above tensors
t = torch.cat((t1,t2), 0)
# print(t)
print(t.shape)
Output
torch.Size([10, 32, 32])
-------------------------------------------
torch.Size([4, 32, 32])
-------------------------------------------
torch.Size([14, 32, 32])
Notice the size of the final tensor after concatenation. As mentioned above, the sizes along the concatenating dimension (10 and 4) are added together, while the other dimension sizes (i.e., 32) remain the same.
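As a quick sanity check, the size along the concatenating dimension equals the sum of the input sizes, while the remaining dimensions are unchanged:
# 10 + 4 = 14 along dim 0; dims 1 and 2 stay at 32
assert t.shape[0] == t1.shape[0] + t2.shape[0]
assert t.shape[1:] == t1.shape[1:] == t2.shape[1:]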
Example 5: Concatenating Three Tensors
In the program below, we concatenate three 3-D tensors along the first dimension.
# import required module
import torch
# create three 3D tensors
t1 = torch.randn(10,32,32)
t2 = torch.randn(4,32,32)
t3 = torch.randn(7,32,32)
# display the tensors
print(t1.shape)
print("-------------------------------------------")
print(t2.shape)
print("-------------------------------------------")
print(t3.shape)
print("-------------------------------------------")
# concatenate the above tensors
t = torch.cat((t1,t2, t3), 0)
# print(t)
print(t.shape)
Output
torch.Size([10, 32, 32])
-------------------------------------------
torch.Size([4, 32, 32])
-------------------------------------------
torch.Size([7, 32, 32])
-------------------------------------------
torch.Size([21, 32, 32])
Notice that concatenating three or more tensors works the same way as concatenating two tensors.
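Also note that torch.cat() accepts any sequence of tensors, so you can pass a Python list, which is handy when the tensors are collected in a loop. A small illustrative sketch:
import torch
# collect tensors of different sizes along dim 0 and merge them into one tensor
batches = [torch.randn(n, 32, 32) for n in (10, 4, 7)]
merged = torch.cat(batches, 0)
print(merged.shape)  # torch.Size([21, 32, 32])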
In this post, we have discussed different examples of concatenating tensors along a given dimension using the torch.cat() method. The tensors to be concatenated must have the same size except in the dimension along which the concatenation is done.