In mathematics, the **Euclidean distance** or **Euclidean metric** is the "ordinary" distance between two points — the distance one would measure with a ruler — and it can be derived by repeated application of the Pythagorean theorem. Using this formula as the distance, Euclidean space becomes a metric space (indeed, a Hilbert space).

## Definition

The **Euclidean distance** between two points *x* = (*x*_{1},...,*x*_{n}) and *y* = (*y*_{1},...,*y*_{n}) in Euclidean *n*-space is defined as

- $ d(x,y):=\sqrt{(x_1-y_1)^2 + (x_2-y_2)^2 + \cdots + (x_n-y_n)^2} = \sqrt{\sum_{i=1}^n (x_i-y_i)^2} $
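As a quick sketch, the definition translates directly into Python (the function name is illustrative):

```python
import math

def euclidean_distance(x, y):
    """Euclidean distance between two points given as equal-length sequences."""
    if len(x) != len(y):
        raise ValueError("points must have the same dimension")
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

# A 3-4-5 right triangle: the distance between (0, 0) and (3, 4) is 5.
print(euclidean_distance((0, 0), (3, 4)))  # 5.0
```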

## Two-dimensional distance

For two 2D points P=[*px*,*py*] and Q=[*qx*,*qy*], the distance is computed as

- $ \sqrt{(px-qx)^2 + (py-qy)^2} $

### 2D approximations for computer applications

A fast approximation of 2D distance based on an octagonal boundary can be computed as follows.
Let *dx* = |*px*-*qx*| (absolute value) and *dy* = |*py*-*qy*|. If *dy* ≥ *dx*, the approximate distance is 0.41*dx* + 0.941246*dy*; if *dy* < *dx*, swap *dx* and *dy* in this formula.
The error relative to the exact distance lies between -6% and +3%, and more than 85% of all possible inputs give an error between -3% and +3%.
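This approximation can be sketched in Python (the function name is illustrative), with a brute-force scan that checks the error stays inside the stated bounds:

```python
import math

def octagonal_distance(px, py, qx, qy):
    """Approximate 2D Euclidean distance via the octagonal formula."""
    dx, dy = abs(px - qx), abs(py - qy)
    if dx > dy:
        dx, dy = dy, dx  # ensure dy holds the larger component
    return 0.41 * dx + 0.941246 * dy

# Scan a range of slopes: the relative error should stay within -6% .. +3%.
worst_low = worst_high = 0.0
for i in range(1, 1000):
    exact = math.hypot(i, 999)
    err = octagonal_distance(0, 0, i, 999) / exact - 1
    worst_low = min(worst_low, err)
    worst_high = max(worst_high, err)
print(worst_low, worst_high)
```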

The following Maple code implements this approximation and produces a plot of a true circle in black together with the octagonal approximate boundary in red:

```
fasthypot := unapply(piecewise(abs(dx) > abs(dy),
                               abs(dx)*0.941246 + abs(dy)*0.41,
                               abs(dy)*0.941246 + abs(dx)*0.41), dx, dy):
hypot := unapply(sqrt(x^2 + y^2), x, y):
plots[display](
    plots[implicitplot](fasthypot(x, y) > 1, x = -1.1 .. 1.1, y = -1.1 .. 1.1, numpoints = 4000),
    plottools[circle]([0, 0], 1),
    scaling = constrained, thickness = 2
);
```

Other approximations exist as well. They generally avoid the square root, which is an expensive operation in terms of processing time, and offer various trade-offs between error and speed. Using the above notation, *dx* + *dy* - min(*dx*,*dy*)/2 yields an error between 0% and about 12%. (Attributed to Alan Paeth.)
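The Paeth formula in Python (function name illustrative), with the two extremes of its error range:

```python
import math

def paeth_distance(dx, dy):
    """Approximate sqrt(dx^2 + dy^2) as dx + dy - min(dx, dy)/2 (dx, dy >= 0)."""
    return dx + dy - min(dx, dy) / 2

# Exact when one component is zero; the error peaks at about +12%
# when the smaller component is half the larger one.
print(paeth_distance(3, 0))                          # 3.0 (exact)
print(paeth_distance(1, 0.5) / math.sqrt(1.25) - 1)  # about +0.118
```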

Also note that when comparing distances (to determine which is greatest, not the actual difference), it isn't necessary to take the square root at all. If distance $ A $ is greater than distance $ B $, then $ A^2 $ will also be greater than $ B^2 $. Likewise, checking whether distance $ A $ is greater than $ 2B $ is the same as comparing $ A^2 $ with $ (2B)^2 $, i.e. $ 4B^2 $, and so on. An example of the first case might be determining which nearest grid point an arbitrary point should "snap to" in a 2D CAD/CAM system. This isn't really an approximation, however, as the results are exact.
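A minimal Python sketch of the snap-to-grid case, comparing squared distances only (names are illustrative):

```python
def snap_to_grid(px, py, grid):
    """Return the grid point nearest to (px, py), comparing squared distances."""
    best, best_sq = None, float("inf")
    for gx, gy in grid:
        sq = (px - gx) ** 2 + (py - gy) ** 2  # no square root needed
        if sq < best_sq:
            best, best_sq = (gx, gy), sq
    return best

# Snap an arbitrary point onto a 5-by-5 integer grid.
grid = [(x, y) for x in range(5) for y in range(5)]
print(snap_to_grid(2.3, 3.8, grid))  # (2, 4)
```

Because squared distance is monotonic in distance, the comparisons — and thus the chosen grid point — are exactly the same as if the square roots had been taken.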

## Three-dimensional distance

For two 3D points P=[*px*,*py*,*pz*] and Q=[*qx*,*qy*,*qz*], the distance is computed as

- $ \sqrt{(px-qx)^2 + (py-qy)^2 + (pz-qz)^2} $

### 3D approximations for computer applications

As noted in the 2D approximation section, when comparing distances (to determine which is greatest, not the actual difference), it isn't necessary to take the square root at all: if distance $ A $ is greater than distance $ B $, then $ A^2 $ will also be greater than $ B^2 $. An example is searching for the minimum distance between two surfaces in 3D space, using a 3D CAD/CAM system. One way to start would be to build a point grid on each surface, and compare the squared distance of every grid point on the first surface with every grid point on the second surface. It isn't necessary to know the actual distances, but only which distance is the least. Once the closest two points are located, a much finer point grid can be created around those closest points on each surface, and the process repeated. After several iterations, the closest two points can then be fully evaluated, including the square root, to give an excellent approximation of the minimum distance between the two surfaces. Thus, the square root only needs to be taken once, instead of thousands (or even millions) of times.
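The iterative refinement described above can be sketched in Python; the parametric surfaces, grid size, and iteration count below are illustrative assumptions, not part of the original description:

```python
import math

def min_distance(s1, s2, n=8, iterations=4):
    """Approximate the minimum distance between two parametric surfaces.

    s1, s2: functions (u, v) -> (x, y, z) with u, v in [0, 1].
    Compares squared distances on an (n+1)-by-(n+1) grid, then shrinks the
    parameter windows around the closest pair and repeats; the square root
    is taken only once, at the end.
    """
    lo1 = lo2 = (0.0, 0.0)
    hi1 = hi2 = (1.0, 1.0)
    for _ in range(iterations):
        best_sq, best = float("inf"), None
        for i in range(n + 1):
            for j in range(n + 1):
                u1 = lo1[0] + (hi1[0] - lo1[0]) * i / n
                v1 = lo1[1] + (hi1[1] - lo1[1]) * j / n
                p = s1(u1, v1)
                for k in range(n + 1):
                    for m in range(n + 1):
                        u2 = lo2[0] + (hi2[0] - lo2[0]) * k / n
                        v2 = lo2[1] + (hi2[1] - lo2[1]) * m / n
                        q = s2(u2, v2)
                        sq = sum((a - b) ** 2 for a, b in zip(p, q))
                        if sq < best_sq:
                            best_sq, best = sq, (u1, v1, u2, v2)
        # Shrink each parameter window to one grid cell around the closest pair.
        u1, v1, u2, v2 = best
        w1 = ((hi1[0] - lo1[0]) / n, (hi1[1] - lo1[1]) / n)
        w2 = ((hi2[0] - lo2[0]) / n, (hi2[1] - lo2[1]) / n)
        lo1 = (max(0.0, u1 - w1[0]), max(0.0, v1 - w1[1]))
        hi1 = (min(1.0, u1 + w1[0]), min(1.0, v1 + w1[1]))
        lo2 = (max(0.0, u2 - w2[0]), max(0.0, v2 - w2[1]))
        hi2 = (min(1.0, u2 + w2[0]), min(1.0, v2 + w2[1]))
    return math.sqrt(best_sq)  # the only square root taken

# Two parallel planes z = 0 and z = 2: the minimum distance is 2.
plane0 = lambda u, v: (u, v, 0.0)
plane2 = lambda u, v: (u, v, 2.0)
print(min_distance(plane0, plane2))  # 2.0
```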