Dijkstra’s Algorithm: Finding the Cheapest Path
Dijkstra’s algorithm finds the shortest paths from a starting node to every other node in a graph. For pathfinding we only care about one destination, so we stop as soon as we reach the goal.
The key idea: always expand the node with the smallest known cost so far. As long as edge weights are non-negative (more on that later), this guarantees we find the optimal path.
How It Works (In Plain Language)
Here’s Dijkstra in simple terms:
- Start at the source with cost 0
- Keep track of best distances to each node we’ve seen
- Always expand the cheapest node we haven’t processed yet
- Update neighbors when we find a cheaper path
- Stop when we reach the goal (or process all nodes)
Think of it like water spreading from a source. It flows to the nearest unvisited areas first, always taking the path of least resistance.
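The “always expand the cheapest node” step is exactly what a priority queue (min-heap) gives us. Here is a minimal sketch in Python using the standard `heapq` module; the costs and node names are made up purely for illustration:

```python
import heapq

# A min-heap of (cost, node) pairs: heappop always returns the smallest cost first.
frontier = []
heapq.heappush(frontier, (4, "B"))
heapq.heappush(frontier, (1, "A"))
heapq.heappush(frontier, (2, "C"))

while frontier:
    cost, node = heapq.heappop(frontier)
    print(cost, node)  # prints 1 A, then 2 C, then 4 B
```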
The Algorithm Step by Step
Let’s break down what happens:
Initialization
- Set distance to start node = 0
- Set distance to all other nodes = infinity
- Create a priority queue (min-heap) with just the start node
- Create a “visited” set (initially empty)
- Create a “parents” map to reconstruct the path
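In Python, that setup might look roughly like this. It is a sketch under the assumption that `graph` is a dict mapping each node to a list of `(neighbor, weight)` pairs; the names are illustrative, not a fixed API:

```python
import heapq
import math

graph = {"A": [("B", 1)], "B": [("A", 1)]}  # hypothetical example graph
start = "A"

distances = {node: math.inf for node in graph}  # best known cost to each node
distances[start] = 0
parents = {}          # node -> previous node on the best known path
visited = set()       # nodes whose shortest distance is final
queue = [(0, start)]  # min-heap of (distance, node) entries
```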
Main Loop
While the priority queue is not empty:
- Pop the node with the smallest known distance from the queue
- If it was already visited, skip it; otherwise mark it as visited (its distance is now final)
- If it is the goal node, we’re done!
- Otherwise, for each neighbor:
  - Calculate new distance = current distance + edge weight
  - If new distance < the neighbor’s stored distance:
    - Update the stored distance
    - Set the neighbor’s parent to the current node
    - Push the neighbor onto the queue (or update its priority, if the queue supports that)
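Here is one iteration of that loop sketched in Python, starting from a hypothetical mid-search state (the structures mirror the initialization sketch above):

```python
import heapq

# Hypothetical state: "S" is finalized, and we are about to process "A" at distance 2.
distances = {"S": 0, "A": 2, "B": float("inf")}
parents = {"A": "S"}
visited = {"S"}
queue = [(2, "A")]
edges_from_A = [("B", 3)]  # one outgoing edge A -> B with weight 3

current_distance, current = heapq.heappop(queue)
visited.add(current)  # A's distance of 2 is now final

for neighbor, weight in edges_from_A:
    if neighbor in visited:
        continue
    new_distance = current_distance + weight
    if new_distance < distances[neighbor]:  # 5 < infinity, so relax the edge
        distances[neighbor] = new_distance
        parents[neighbor] = current
        heapq.heappush(queue, (new_distance, neighbor))

print(distances["B"], parents["B"])  # 5 A
```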
Path Reconstruction
Once we reach the goal:
- Start from goal node
- Follow parent pointers back to start
- Reverse the list to get start → goal path
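This is the simplest part of the algorithm. A short Python sketch, where the `parents` map is a made-up example of what the search might have recorded:

```python
def reconstruct_path(parents, goal):
    # Walk parent pointers from the goal back to the start, then reverse.
    path = []
    node = goal
    while node is not None:
        path.append(node)
        node = parents.get(node)  # the start node has no parent, so this returns None
    path.reverse()
    return path

# Hypothetical parents map recorded while searching from "S" to "G":
print(reconstruct_path({"A": "S", "G": "A"}, "G"))  # ['S', 'A', 'G']
```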
Pseudocode
```
function dijkstra(graph, start, goal):
    distances = map with all nodes → infinity
    distances[start] = 0
    parents = empty map
    visited = empty set
    queue = priority queue containing (0, start)

    while queue is not empty:
        current_distance, current = queue.pop()

        if current in visited:
            continue  // Skip if already processed
        visited.add(current)

        if current == goal:
            return reconstruct_path(parents, start, goal)

        for neighbor, edge_weight in graph.neighbors(current):
            if neighbor in visited:
                continue
            new_distance = current_distance + edge_weight
            if new_distance < distances[neighbor]:
                distances[neighbor] = new_distance
                parents[neighbor] = current
                queue.push((new_distance, neighbor))

    return null  // No path found

function reconstruct_path(parents, start, goal):
    path = []
    current = goal
    while current != null:
        path.append(current)
        current = parents.get(current)
    return reverse(path)
```
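The same algorithm as a runnable Python sketch, using `heapq` as the priority queue. The graph representation (a dict of node → list of `(neighbor, weight)` pairs) and the small test graph are assumptions for illustration, not the only way to set this up:

```python
import heapq
import math

def dijkstra(graph, start, goal):
    """Return the shortest start -> goal path as a list of nodes, or None if unreachable.

    `graph` maps each node to a list of (neighbor, edge_weight) pairs.
    """
    distances = {node: math.inf for node in graph}
    distances[start] = 0
    parents = {}
    visited = set()
    queue = [(0, start)]  # min-heap of (distance, node)

    while queue:
        current_distance, current = heapq.heappop(queue)
        if current in visited:
            continue  # stale queue entry; a cheaper path was already processed
        visited.add(current)

        if current == goal:
            return reconstruct_path(parents, goal)

        for neighbor, edge_weight in graph[current]:
            if neighbor in visited:
                continue
            new_distance = current_distance + edge_weight
            if new_distance < distances[neighbor]:
                distances[neighbor] = new_distance
                parents[neighbor] = current
                heapq.heappush(queue, (new_distance, neighbor))

    return None  # no path found

def reconstruct_path(parents, goal):
    path = []
    current = goal
    while current is not None:
        path.append(current)
        current = parents.get(current)
    return list(reversed(path))

# Hypothetical weighted graph for a quick test:
graph = {
    "S": [("A", 1), ("B", 4)],
    "A": [("B", 2), ("G", 6)],
    "B": [("G", 1)],
    "G": [],
}
print(dijkstra(graph, "S", "G"))  # ['S', 'A', 'B', 'G'] -- total cost 4
```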
Visual Example
Here’s a simple grid to visualize:
```
[S] [ ] [ ]
[ ] [X] [ ]
[ ] [ ] [G]
```
Dijkstra explores like this:
Step 1: Start at S (cost 0)
- Add neighbors: right (cost 1), down (cost 1)
Step 2: Process one of the cost-1 nodes (say, right)
- Add its neighbors: right (cost 2), down (cost 2, but blocked by wall)
Step 3: Process down from start (cost 1)
- Add its neighbors: down (cost 2), right (cost 2, but wall blocks)
Step 4: Process cost-2 nodes
- Continue exploring outward
Step 5: Eventually reach G
- Follow parents back to S
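Here is a hedged sketch of the same search running directly on that 3×3 grid. The grid encoding (`"#"` for the wall) and the helper names are mine, not part of the stepper below; every step costs 1:

```python
import heapq

# 3x3 grid: (0, 0) is S, (2, 2) is G, and (1, 1) is the wall X.
grid = [
    [".", ".", "."],
    [".", "#", "."],
    [".", ".", "."],
]
start, goal = (0, 0), (2, 2)

def neighbors(cell):
    row, col = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        r, c = row + dr, col + dc
        if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] != "#":
            yield (r, c)

def dijkstra_grid(start, goal):
    distances = {start: 0}
    parents = {}
    visited = set()
    queue = [(0, start)]
    while queue:
        dist, cell = heapq.heappop(queue)
        if cell in visited:
            continue
        visited.add(cell)
        if cell == goal:
            break
        for nxt in neighbors(cell):
            new_dist = dist + 1  # every grid step costs 1
            if new_dist < distances.get(nxt, float("inf")):
                distances[nxt] = new_dist
                parents[nxt] = cell
                heapq.heappush(queue, (new_dist, nxt))
    # Follow parent pointers back from the goal, then reverse.
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = parents.get(cell)
    return list(reversed(path))

print(dijkstra_grid(start, goal))
# With this tie-breaking: [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2)] -- 4 steps around the wall
```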
Interactive Algorithm Stepper
Watch Dijkstra run step by step on a grid. Use the controls to step through the algorithm:
Understanding the Visualization
The colors mean:
- Green: Start position
- Red: Goal position
- Gray: Visited nodes (explored by algorithm)
- Yellow: Final path (if goal reached)
- Black: Walls (obstacles)
- White: Unexplored nodes
Watch how Dijkstra expands outward in all directions, always processing the cheapest unvisited node first.
Time Complexity
Dijkstra’s complexity depends on the data structure:
With binary heap (priority queue):
- Time: O((V + E) log V)
- Where V = number of nodes, E = number of edges
On a grid:
- V = rows × cols (minus walls)
- E ≈ 4V (each node has ~4 neighbors)
- So roughly O(V log V)
For a 100×100 grid, that’s about 10,000 nodes, so roughly 10,000 × log(10,000) ≈ 133,000 operations.
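To check that arithmetic (taking the log to be base 2):

```python
import math

V = 100 * 100                   # nodes in a 100x100 grid, ignoring walls
print(round(V * math.log2(V)))  # 132877 -- roughly the 133,000 figure above
```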
Why Dijkstra Works
Dijkstra guarantees optimal paths because:
- Greedy choice: It always processes the cheapest unvisited node next. With non-negative weights, once a node is popped its distance can never improve, because any other route to it would have to pass through a node that already costs at least as much
- Optimal substructure: Every prefix of a shortest path is itself a shortest path, so the algorithm can build shortest paths one finalized node at a time
- No negative weights: Both arguments above rely on all edge weights being non-negative
If you have negative weights, Dijkstra can fail. Use Bellman-Ford instead.
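As a small illustration of that failure, here is a hypothetical graph with one negative edge. The visited-set version of Dijkstra shown above finalizes A as soon as it is popped, so the cheaper route S → B → A (cost 5 − 4 = 1) is never used:

```python
import heapq
import math

# Hypothetical graph with a negative edge B -> A.
graph = {"S": [("A", 2), ("B", 5)], "B": [("A", -4)], "A": []}

def dijkstra_distances(graph, start):
    dist = {node: math.inf for node in graph}
    dist[start] = 0
    visited = set()
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph[node]:
            if neighbor in visited:
                continue  # A is already "final" by the time B is processed
            if d + weight < dist[neighbor]:
                dist[neighbor] = d + weight
                heapq.heappush(queue, (d + weight, neighbor))
    return dist

print(dijkstra_distances(graph, "S")["A"])  # 2, but the true shortest distance is 1
```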
Key Takeaways
Before moving forward:
- Dijkstra explores outward - Processes nodes in order of increasing cost
- Uses a priority queue - Always picks the cheapest unvisited node
- Guarantees optimal paths - With non-negative weights
- Time complexity - O((V + E) log V) with binary heap
- Can be slow - Explores many nodes in every direction, including ones that lead away from the goal
What’s Next?
Dijkstra works, but it explores a lot of nodes. On the next page, we’ll learn about A* search, which uses a “heuristic” to focus the search toward the goal. This usually makes it faster.