# What's the difference between dim in Pytorch and axis in Tensorflow?

## Issue

I have two lines of code and I want to know whether they produce the same output.

In tensorflow: `tf.norm(my_tensor, ord=2, axis=1)`

In pytorch: `torch.norm(my_tensor, p=2, dim=1)`

Say the shape of my_tensor is `[100,2]`

Will the above two lines give the same result, or does the `axis` argument behave differently from `dim`?

## Solution

Yes, they are the same! Both arguments name the dimension along which the reduction is performed: PyTorch calls it `dim`, TensorFlow calls it `axis`. For a tensor of shape `[100, 2]`, `axis=1`/`dim=1` computes the L2 norm of each row, giving a result of shape `[100]`.

```python
import tensorflow as tf

tensor = [[1., 2.], [4., 5.], [3., 6.], [7., 8.], [5., 2.]]
tensor = tf.convert_to_tensor(tensor, dtype=tf.float32)
t_norm = tf.norm(tensor, ord=2, axis=1)
print(t_norm)
```

Output:

```
tf.Tensor([ 2.236068   6.4031243  6.708204  10.630146   5.3851647], shape=(5,), dtype=float32)
```
```python
import torch

tensor = [[1., 2.], [4., 5.], [3., 6.], [7., 8.], [5., 2.]]
tensor = torch.tensor(tensor, dtype=torch.float32)
t_norm = torch.norm(tensor, p=2, dim=1)
print(t_norm)
```

Output:

```
tensor([ 2.2361,  6.4031,  6.7082, 10.6301,  5.3852])
```