Greedy Algorithms LeetCode Pattern Guide — Interval Scheduling and Exchange Arguments
A practical greedy algorithms LeetCode pattern guide for recognizing interval scheduling, proving choices with exchange arguments, and avoiding the traps that make greedy solutions fail.
This greedy algorithms LeetCode pattern guide is for the moment when a problem looks like dynamic programming, but the right answer is one decisive local choice repeated with discipline. Greedy works when a locally optimal move can be proven to preserve a globally optimal solution. In interviews, that proof matters almost as much as the code. Interval scheduling and exchange arguments are the cleanest way to learn the pattern because they force you to say why sorting by an end time, start time, deadline, or ratio is safe instead of just hoping it is.
The fastest candidates do not memorize “greedy means sort.” They identify the decision being made, define what makes one choice dominate another, and explain the invariant that remains true after each pick. This guide gives you that playbook.
Where greedy appears in coding interviews
Greedy problems show up when the interviewer wants to test judgment under constraints, not just implementation. Common forms include:
| Problem shape | Typical greedy choice | Example interview phrasing |
|---|---|---|
| Interval scheduling | Pick the interval with the earliest finish time | “Maximum number of non-overlapping meetings” |
| Interval removal | Remove the interval with the later finish time when overlapping | “Minimum intervals to erase” |
| Deadline scheduling | Prioritize shortest, earliest, or most profitable job depending on objective | “Can all courses be completed before deadlines?” |
| Jump or reachability | Keep the farthest reachable boundary | “Minimum jumps to reach the end” |
| Resource allocation | Use a heap to release or reuse the earliest ending resource | “Meeting rooms needed” |
| Exchange / replacement | Replace a weaker selected item with a stronger one | “Maximum subsequence under a constraint” |
The phrase “optimal” is not enough. The key question is: after I make this choice, can every optimal solution be transformed into one that also makes my choice without becoming worse? If yes, greedy is plausible.
The core greedy framework
Use this five-step framework before you write code:
- Name the unit of choice. Are you choosing an interval, a jump boundary, a character, a task, or an edge?
- Name the objective. Maximize count, minimize removals, minimize cost, maximize profit, or prove feasibility.
- Find a dominance rule. One candidate is always at least as good as another for the future. For interval scheduling, earlier finishing intervals leave more room.
- State the invariant. After each step, your partial answer can be extended to an optimal answer.
- Test against a counterexample. Try sorting by start time, length, value, or local profit and see which rule fails.
That last step is not optional. Many greedy-looking problems are actually dynamic programming because a local choice changes future values in a way you cannot summarize with one invariant.
Interval scheduling: the canonical example
The classic problem asks for the maximum number of non-overlapping intervals. The winning strategy is:
- Sort intervals by end time ascending.
- Scan in sorted order and keep each interval whose start is at or after the end of the last chosen interval.
- Count or return the chosen intervals.
Why end time? Because choosing the interval that finishes earliest leaves the largest remaining timeline for all future intervals. Sorting by start time fails because an early-starting meeting might be huge and block five short meetings. Sorting by duration fails because the shortest meeting might sit in the middle and block better combinations on both sides.
A compact Python version:
```python
def max_non_overlapping(intervals):
    intervals.sort(key=lambda x: x[1])
    taken = 0
    last_end = float('-inf')
    for start, end in intervals:
        if start >= last_end:
            taken += 1
            last_end = end
    return taken
```
For “minimum intervals to remove,” the code is nearly the same. Count overlaps while keeping the interval with the earlier end. If two intervals overlap, removing the one that ends later is safe because it can only reduce room for future intervals.
```python
def erase_overlap_intervals(intervals):
    intervals.sort(key=lambda x: x[1])
    removed = 0
    last_end = float('-inf')
    for start, end in intervals:
        if start < last_end:
            removed += 1
        else:
            last_end = end
    return removed
```
Notice the boundary condition. If intervals that touch are allowed, use start >= last_end. If touching counts as overlap, use start > last_end. Interviewers often hide correctness in that single symbol.
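To see how much that single symbol matters, here is a minimal sketch of the same scan with the comparison made explicit. The function name `count_kept` and the `touching_ok` flag are illustrative, not from any library:

```python
def count_kept(intervals, touching_ok=True):
    """Standard earliest-end scan; touching_ok controls whether
    start == last_end counts as compatible."""
    kept, last_end = 0, float('-inf')
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        compatible = start >= last_end if touching_ok else start > last_end
        if compatible:
            kept += 1
            last_end = end
    return kept

print(count_kept([[1, 2], [2, 3]], touching_ok=True))   # 2: touching is allowed
print(count_kept([[1, 2], [2, 3]], touching_ok=False))  # 1: touching is overlap
```

In an interview, state which convention the problem uses before writing the comparison, so the symbol is a decision rather than an accident.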
How to explain exchange arguments
An exchange argument is a proof that your greedy choice can replace whatever the optimal solution chose first.
For interval scheduling, say it like this:
“Let g be the interval with the earliest end time. Consider any optimal solution. If that solution already includes g, we are done. If it starts with some other interval o, then g ends no later than o. Replacing o with g does not create an overlap with the remaining intervals, because g frees at least as much time as o. The solution still has the same number of intervals. Therefore there is an optimal solution that begins with the greedy choice. Then the same argument applies recursively to the remaining intervals.”
That paragraph is the difference between a guessed greedy solution and an interview-ready one. You do not need formal math notation, but you do need to show the local choice is safe.
Choosing the right sort key
Most greedy misses come from using the wrong sort key. Use this decision table:
| Objective | Usually sort by | Why |
|---|---|---|
| Max non-overlapping intervals | End ascending | Leaves maximum future space |
| Min removals for overlaps | End ascending | Keeps the interval least harmful to future choices |
| Merge intervals | Start ascending | You need to detect contiguous overlap regions |
| Meeting rooms | Start ascending plus min-heap of end times | You process arrivals and release finished rooms |
| Minimum arrows to burst balloons | End ascending | One arrow at the earliest end covers the tightest group |
| Assign cookies / boats / resources | Size ascending or two pointers | Match smallest sufficient resource or pair extremes |
| Kruskal-style MST | Weight ascending | Cheapest safe edge crossing components |
If you cannot say why the sort key dominates alternatives, pause. The problem may require DP, binary search, or a heap-based simulation instead.
Greedy plus heap: when one variable is not enough
Some greedy problems still need a data structure. In “Meeting Rooms II,” you sort meetings by start time, but the greedy choice is to reuse the room that ends earliest. A min-heap stores current room end times. When the next meeting starts after the earliest ending room, pop and reuse it; otherwise allocate a new room.
The invariant is: the heap contains exactly the rooms currently occupied, represented by their next available times. The heap minimum is the best room to reuse because it becomes free first. This is greedy, but not the simple interval-counting version.
```python
import heapq

def min_meeting_rooms(intervals):
    intervals.sort(key=lambda x: x[0])
    heap = []
    for start, end in intervals:
        if heap and heap[0] <= start:
            heapq.heappop(heap)
        heapq.heappush(heap, end)
    return len(heap)
```
Red flags that greedy may be wrong
Greedy is suspicious when:
- Taking a small local win can block a much larger later win.
- The problem asks for exact sum, number of ways, or all combinations.
- Future value depends on multiple dimensions you cannot collapse into one boundary.
- You need to revisit earlier decisions after seeing later items.
- A simple counterexample breaks your sort key.
For example, coin change with arbitrary denominations is not safely greedy. Picking the largest coin first works for U.S. coins but fails for denominations like [1, 3, 4] and amount 6, where greedy picks 4 + 1 + 1 but optimal is 3 + 3. That is a dynamic programming signal.
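The failure is easy to demonstrate in a few lines. This sketch compares a largest-coin-first greedy against the classic bottom-up DP; the function names `greedy_coins` and `dp_coins` are illustrative:

```python
def greedy_coins(coins, amount):
    # Largest-coin-first: take as many of each coin as fits, in
    # descending order. Works for canonical systems like US coins.
    count = 0
    for c in sorted(coins, reverse=True):
        take, amount = divmod(amount, c)
        count += take
    return count if amount == 0 else None

def dp_coins(coins, amount):
    # Bottom-up minimum-coin DP: best[a] = fewest coins summing to a.
    INF = float('inf')
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return best[amount] if best[amount] < INF else None

print(greedy_coins([1, 3, 4], 6))  # 3 coins: 4 + 1 + 1
print(dp_coins([1, 3, 4], 6))      # 2 coins: 3 + 3
```

One concrete mismatch like this is all you need in an interview to justify switching from greedy to DP.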
Practice roadmap
Work through patterns in this order:
- Non-overlapping intervals. Focus on end-time sorting and boundary conditions.
- Minimum arrows / erase overlap. Practice translating maximize-kept into minimize-removed.
- Meeting rooms. Add heap state while keeping the greedy invariant clear.
- Jump Game. Track farthest reach and current window instead of trying paths.
- Task scheduling. Learn when frequency counts and idle slots create a greedy formula.
- Kruskal’s algorithm. Connect greedy safety to cuts and components.
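For the Jump Game step in the roadmap above, the farthest-reach idea can be sketched as a window scan. This is a minimal illustrative version, assuming the end is always reachable:

```python
def min_jumps(nums):
    # Greedy window: [.., end] is the set of indices reachable with
    # the current jump count; farthest is the best reach seen so far.
    jumps, end, farthest = 0, 0, 0
    for i in range(len(nums) - 1):
        farthest = max(farthest, i + nums[i])
        if i == end:      # current window exhausted: commit one jump
            jumps += 1
            end = farthest
    return jumps

print(min_jumps([2, 3, 1, 1, 4]))  # 2: jump 0 -> 1 -> 4
```

The greedy choice is safe because only the farthest boundary matters for the future; no path reaching a shorter boundary can ever beat one that reaches farther.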
After each problem, write one sentence: “The greedy choice is safe because…” If that sentence sounds vague, redo the proof.
Common implementation traps
- Sorting the wrong field. Interval scheduling uses end time; merging intervals uses start time.
- Forgetting inclusive boundaries. Decide whether `[1,2]` and `[2,3]` overlap before coding.
- Over-updating state. In removal problems, do not update `last_end` when you discard the later-ending interval.
- Confusing count kept with count removed. Either return `n - kept` or increment `removed` on conflicts, not both.
- Using greedy when objective changes. “Maximum total value of non-overlapping intervals” is weighted interval scheduling, usually DP with binary search.
How to talk about greedy in interviews and resumes
In an interview, narrate the proof before optimizing code:
“I’m considering greedy because each interval affects the future only through its end time. If I choose the earliest-ending compatible interval, any optimal solution that chose a later-ending first interval can exchange it for mine without losing feasibility. That gives the same count and at least as much remaining space.”
On a resume, avoid “solved greedy problems.” Tie the concept to decisions:
- “Designed scheduling logic that maximized completed jobs by prioritizing earliest finish times under resource constraints.”
- “Reduced allocation conflicts by modeling bookings as interval scheduling and validating the greedy invariant with edge-case tests.”
- “Implemented heap-backed meeting-room assignment to minimize concurrent resource usage.”
Greedy algorithms reward precision. The code is often short, but the correctness argument is the product. If you can identify the dominance rule, defend it with an exchange argument, and watch boundary conditions, you will handle most LeetCode greedy interval problems with confidence.
A quick counterexample lab
Before you commit to a greedy rule, build one tiny counterexample against the alternatives. For interval scheduling, sorting by earliest start fails on [[1,10],[2,3],[3,4],[4,5]] because the first interval blocks three compatible ones. Sorting by shortest duration can fail when the shortest interval sits in the middle of the timeline and prevents one interval before it plus one after it. Sorting by latest start is useful for some reverse scheduling problems, but it does not maximize forward capacity by default.
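The lab fits in one helper. This sketch runs the standard compatible-interval scan under any sort key, so you can watch the tempting rules lose on the example above; the name `kept_after_sort` is illustrative:

```python
def kept_after_sort(intervals, key):
    # Standard scan: keep every interval compatible with the last kept one,
    # after sorting by the candidate rule under test.
    kept, last_end = 0, float('-inf')
    for start, end in sorted(intervals, key=key):
        if start >= last_end:
            kept += 1
            last_end = end
    return kept

data = [[1, 10], [2, 3], [3, 4], [4, 5]]
print(kept_after_sort(data, key=lambda iv: iv[0]))  # start rule: 1 kept
print(kept_after_sort(data, key=lambda iv: iv[1]))  # end rule: 3 kept
```

Keeping a helper like this in your practice notes turns “I think start-time sorting fails” into a ten-second demonstration.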
This habit makes your interview explanation stronger. You can say, “I tested the tempting start-time rule, but it fails because early start does not imply low opportunity cost. End time is the relevant future constraint.” That shows you selected the greedy strategy, not just recognized a familiar problem.
Related guides
- Dynamic Programming LeetCode Pattern Guide — Knapsack, LIS, and Interval DP Templates — A pattern-first DP guide for LeetCode interviews, with recognition rules, knapsack, LIS, interval DP templates, loop-order traps, and interview narration.
- Graph Algorithms LeetCode Pattern Guide — DFS, BFS, Dijkstra, and Union-Find — A practical graph algorithms LeetCode pattern guide covering DFS, BFS, Dijkstra, union-find, topological sort, grid graphs, and interview explanation patterns. Use it to identify the graph shape before choosing the algorithm.
- Backtracking LeetCode Pattern Guide — Permutations, Combinations, and Subsets — A practical backtracking LeetCode pattern guide with templates, decision rules, duplicate-handling tricks, pruning advice, and interview talk tracks for permutations, combinations, subsets, and constraint problems.
- BFS LeetCode Pattern Guide — Shortest-Path, Level-Order, and Multi-Source — A BFS LeetCode pattern guide for recognizing queue-based problems, implementing level-order traversal, solving unweighted shortest paths, using multi-source BFS, and avoiding common visited-state bugs.
- Binary Search LeetCode Pattern Guide — Bounds, Rotated Arrays, and Search-on-Answer — A binary search LeetCode pattern guide for exact search, lower/upper bounds, rotated arrays, and search-on-answer problems. Includes invariants, templates, traps, and interview-ready explanations.
