
K Closest Points to Origin

Last Updated: March 29, 2026

Medium
4 min read

Understanding the Problem

We have a list of 2D points and need to find the k points that are closest to the origin (0, 0). The Euclidean distance from a point (x, y) to the origin is sqrt(x^2 + y^2). But here is a useful observation: since we only care about relative ordering of distances (not the actual distance values), we can compare x^2 + y^2 directly and skip the square root entirely. Squaring preserves the ordering because the square root function is monotonically increasing.
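To make the observation concrete, here is a small Python check (the article does not fix a language, so this and the later sketches use Python; the variable names are illustrative). Sorting by true distance and by squared distance produces the same order:

```python
import math

points = [[1, 3], [-2, 2], [5, 1], [3, -3]]

# sqrt is monotonically increasing on non-negative inputs, so sorting by
# the true distance and by the squared distance gives the same ordering.
by_true = sorted(points, key=lambda p: math.sqrt(p[0] ** 2 + p[1] ** 2))
by_squared = sorted(points, key=lambda p: p[0] ** 2 + p[1] ** 2)
assert by_true == by_squared
```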

The problem says the answer can be returned in any order, which is a strong hint. We do not need a fully sorted result, just any k elements that happen to be the smallest. This opens the door to approaches that are faster than a full sort.

Key Constraints:

  • 1 <= k <= points.length <= 10^4 -> With n up to 10,000, O(n log n) sorting is comfortable. But the "any order" hint suggests we can do better.
  • -10^4 <= xi, yi <= 10^4 -> Squared distance x^2 + y^2 is at most 2 * 10^8, which fits in a 32-bit integer. No overflow concerns.
  • k <= points.length -> k is always valid, no edge case handling needed.

Approach 1: Sort by Distance

Intuition

The most direct approach: compute the squared distance for every point, sort the entire array by that distance, and take the first k elements. Since we only need relative ordering, we can sort by x^2 + y^2 without computing the actual square root.

This is clean and easy to get right. Sorting gives us a fully ordered result, which is more than we need (we only need the k smallest, not all n in order), but it works.

Algorithm

  1. Sort the points array using a custom comparator that compares x^2 + y^2 for each point.
  2. Return the first k elements of the sorted array.

Example Walkthrough

Compute squared distances: [1,3] -> 10, [-2,2] -> 8, [5,1] -> 26, [3,-3] -> 18. Sorting by distance gives [-2,2] (8), [1,3] (10), [3,-3] (18), [5,1] (26); with k = 2, the answer is [[-2,2], [1,3]].

Code
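A minimal Python sketch of the sorting approach (the function name k_closest_sort is illustrative, not from a particular library):

```python
def k_closest_sort(points, k):
    # Sort by squared distance; the square root is unnecessary because
    # squaring preserves the relative ordering of non-negative distances.
    points.sort(key=lambda p: p[0] ** 2 + p[1] ** 2)
    return points[:k]
```

Note that this sorts the input in place; copy the list first if the caller needs the original order preserved.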

This approach works but sorts all n points when we only need the k smallest. What if we maintained just the top k closest points as we scanned the array?

Approach 2: Max-Heap of Size K

Intuition

Instead of sorting the entire array, we can maintain a max-heap (priority queue) of size k. The heap holds the k closest points we have seen so far, with the farthest of those k at the top.

As we scan through the points, we compare each new point's distance against the heap's maximum. If the new point is closer, we remove the farthest point and insert the new one. If the heap has fewer than k elements, we just add the point directly.

Algorithm

  1. Create a max-heap (priority queue ordered by descending distance).
  2. For each point in the array:
    • If the heap has fewer than k elements, add the point.
    • Otherwise, if the point's squared distance is less than the heap's maximum, remove the max and add this point.
  3. Extract all k points from the heap and return them.

Example Walkthrough

Scan the points with a max-heap of size k = 2. Squared distances: 10, 8, 26, 18. The first two points, [1,3] (10) and [-2,2] (8), fill the heap. [5,1] (26) and [3,-3] (18) both exceed the heap's maximum of 10, so they are skipped. The heap ends holding [[1,3], [-2,2]].

Code
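A Python sketch of the max-heap approach. Python's heapq module only provides a min-heap, so a common trick (used here) is to store negated distances to simulate a max-heap:

```python
import heapq

def k_closest_heap(points, k):
    heap = []  # entries are (-squared_distance, x, y); negation simulates a max-heap
    for x, y in points:
        dist = x * x + y * y
        if len(heap) < k:
            heapq.heappush(heap, (-dist, x, y))
        elif dist < -heap[0][0]:
            # The new point is closer than the farthest of the current k:
            # pop the farthest and push the new point in one operation.
            heapq.heapreplace(heap, (-dist, x, y))
    return [[x, y] for _, x, y in heap]
```

The returned points come out in heap order, which is fine since the problem accepts any order.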

The heap approach is better when k is small, but we still pay O(log k) per element. What if we could partition the array around the k-th smallest distance instead?

Approach 3: Quickselect

Intuition

The problem boils down to finding the k-th smallest distance and returning everything at or below that threshold. This is exactly what the quickselect algorithm does: it partially sorts an array so that the element at position k is in its final sorted position, with all smaller elements to its left and all larger elements to its right.

Quickselect works by choosing a pivot, partitioning the array around it (all elements with smaller distance go left, larger go right), and then checking where the pivot landed. If the pivot is at index k, we are done. If it is past k, recurse on the left half. If it is before k, recurse on the right half.

The efficiency comes from only recursing on one side. Unlike quicksort which recurses on both halves (total work: O(n log n)), quickselect recurses on one half, cutting the problem size roughly in half each time: n + n/2 + n/4 + ... = O(2n) = O(n) on average.

Algorithm

  1. Define a helper function quickSelect(points, left, right, k).
  2. Pick a random pivot index between left and right.
  3. Partition the array: move points with smaller distance to the left of the pivot, larger to the right.
  4. If the pivot index equals k, return (the first k elements are the answer).
  5. If the pivot index is less than k, recurse on the right portion.
  6. If the pivot index is greater than k, recurse on the left portion.
  7. Return the first k elements of the (partially sorted) array.

Example Walkthrough

Initial distances: [1,3] -> 10, [-2,2] -> 8, [5,1] -> 26, [3,-3] -> 18, with k = 2. Partitioning around random pivots rearranges the array until the two smallest distances (8 and 10) occupy the first two positions; the first k elements, [-2,2] and [1,3] in some order, are the answer.

Code
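A Python sketch of quickselect following the steps above, using a random pivot and the Lomuto partition scheme (one of several standard partitioning choices):

```python
import random

def k_closest_quickselect(points, k):
    def dist(i):
        x, y = points[i]
        return x * x + y * y

    def partition(left, right):
        # Lomuto partition around a randomly chosen pivot's squared distance.
        pivot_idx = random.randint(left, right)
        points[pivot_idx], points[right] = points[right], points[pivot_idx]
        pivot_dist = dist(right)
        store = left
        for i in range(left, right):
            if dist(i) < pivot_dist:
                points[store], points[i] = points[i], points[store]
                store += 1
        points[store], points[right] = points[right], points[store]
        return store

    left, right = 0, len(points) - 1
    while left < right:
        p = partition(left, right)
        if p == k:
            break          # first k positions now hold the k closest points
        elif p < k:
            left = p + 1   # k-th boundary is to the right of the pivot
        else:
            right = p - 1  # k-th boundary is to the left of the pivot
    return points[:k]
```

Random pivot selection makes the pathological O(n^2) case vanishingly unlikely in practice, giving the O(n) average described above.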