Given two points x and y placed at opposite corners of an axis-aligned rectangle, find the minimal Euclidean distance between another point z and the set of all points within this rectangle.
For example, the two points
x = [-1,-1];
y = [1,1];
define a square centered at the origin. The distance between the point
z = [4,5];
and this square is
d = 5;
(the closest point in the square is at [1,1])
The distance between the point z = [0,2] and this same square is d = 1 (closest point at [0,1])
The distance between the point z = [0,0] and this same square is d = 0 (inside the square)
Notes:
- You can always assume that x < y (element-wise).
- The function should work for points x, y, z in an arbitrary n-dimensional space (with n > 1).
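Because the box is axis-aligned and x <= y holds in every coordinate, the closest point of the box to z can be found by clamping each coordinate of z into the interval [x(i), y(i)]; the answer is then the Euclidean norm of the difference. A minimal sketch of this idea, written here in Python with illustrative names (a Cody submission would be the equivalent MATLAB):

```python
import math

def rect_point_distance(x, y, z):
    """Distance from point z to the axis-aligned box with opposite corners x, y.

    Assumes x[i] <= y[i] for every coordinate, as the problem guarantees.
    Clamping z into the box gives the nearest point of the box; the distance
    is the Euclidean norm of the remaining offset. Works for any dimension.
    """
    # Clamp each coordinate of z into [x[i], y[i]].
    closest = [min(max(zi, xi), yi) for xi, yi, zi in zip(x, y, z)]
    # Euclidean distance between z and its clamped image.
    return math.sqrt(sum((zi - ci) ** 2 for zi, ci in zip(z, closest)))

# The three worked examples from the problem statement:
print(rect_point_distance([-1, -1], [1, 1], [4, 5]))  # 5.0
print(rect_point_distance([-1, -1], [1, 1], [0, 2]))  # 1.0
print(rect_point_distance([-1, -1], [1, 1], [0, 0]))  # 0.0
```

A point inside the box clamps to itself, so the distance is 0 without any special-casing.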
Problem Comments
For the n-dimensional case it would be better to say that x and y lie on opposite vertices of an n-dimensional hyperrectangle whose edges are parallel to the coordinate axes.
Two points do not define a unique rectangle. This is especially true in 3D space. A correct answer to this question would be 0, or the distance to the nearest point of the circle defined by the two points on the diameter, depending on which rectangle you chose to make. (Every rectangle formed from two points at opposite corners inscribes in a circle, and in 3D, a sphere.) I genuinely do not know how you want me to handle the 3D cases.
I agree with Brandon's comment. In 2D space there are infinitely many rectangles with the given diagonal, and their corners trace out a circle. In 3D space, the infinitely many rectangles trace out a sphere.