The Bellman–Ford Algorithm solves the Single-Source Shortest Path (SSSP)
problem: it computes the shortest paths from one source vertex to all other vertices.
🔹 1. What is Bellman–Ford?
The Bellman–Ford Algorithm finds the shortest distance from a single source vertex to all
other vertices in a weighted graph.
Works with negative edge weights
Can detect negative weight cycles
Formal Problem Statement
Given a weighted directed graph G = (V, E) and a source vertex s, compute the shortest path from s to every other vertex v ∈ V.
Input / Output / Constraints
Input:
- A weighted directed graph G = (V, E)
- A source vertex s ∈ V
- Edge weights w(u, v), which may be negative
Output:
- dist[v]: shortest distance from s to every vertex v
- prev[v]: predecessor of v on the shortest path (for path reconstruction)
Constraints:
- If a negative-weight cycle is reachable from s, no shortest path is defined
- The graph may be disconnected; unreachable vertices retain dist = ∞
🔹 2. Core Idea
The algorithm repeatedly relaxes all edges.
👉 Relaxation: If a shorter path to a vertex is found, update its distance.
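Relaxation is compact enough to show directly. Below is a minimal Python sketch; the names `relax`, `dist`, and `prev` are ours, chosen to match the Output section above, not taken from any library:

```python
INF = float("inf")

def relax(u, v, w, dist, prev):
    """Try to improve the best-known distance to v using edge u -> v (weight w).

    Returns True when the estimate improved (a successful relaxation)."""
    if dist[u] != INF and dist[u] + w < dist[v]:
        dist[v] = dist[u] + w
        prev[v] = u
        return True
    return False

# Suppose we already know: A is the source, B is 4 away, C is 5 away.
dist = {"A": 0, "B": 4, "C": 5}
prev = {"A": None, "B": "A", "C": "A"}
relax("B", "C", -3, dist, prev)  # 4 + (-3) = 1 beats the old 5, so C improves
```

The `dist[u] != INF` guard matters: relaxing out of an unreachable vertex would otherwise do arithmetic on infinity.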
🔹 3. How Bellman-Ford Searches
Rather than exploring vertices one by one like DFS/BFS, Bellman-Ford searches by repeatedly relaxing edges.
Each relaxation pass is essentially asking:
"Is there a better path to this vertex than what I currently know?"
It doesn't visit vertices in a fixed order — it scans all edges, every iteration, which is what allows it
to handle negative weights that greedy algorithms like Dijkstra cannot.
🔹 4. Why Bellman-Ford Doesn't Require Sorting
Bellman-Ford doesn't pre-sort edges: sorting them (e.g. by weight) doesn't improve correctness, because all
edges must be checked every iteration regardless of order. In practice, though, an edge order that follows the
graph's structure can make distances converge in fewer passes, which lets early termination kick in sooner.
💡 Early termination is an optimization — if a full iteration passes with zero relaxations,
shortest paths are already settled and the algorithm can stop before completing all |V|−1 iterations.
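A sketch of that optimization (the function name is ours); on the four-vertex demo graph from section 9 it runs two passes instead of three, because the first pass already settles every distance and the second pass merely confirms it:

```python
def bellman_ford_early_exit(vertices, edges, source):
    """Bellman-Ford with early termination: stop as soon as a full
    pass over the edges makes no update."""
    INF = float("inf")
    dist = {v: INF for v in vertices}
    dist[source] = 0
    passes = 0
    for _ in range(len(vertices) - 1):
        passes += 1
        updated = False
        for u, v, w in edges:
            if dist[u] != INF and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                updated = True
        if not updated:  # a full pass with zero relaxations: we are done
            break
    return dist, passes

# The four-vertex demo graph used later in this article.
edges = [("A", "B", 4), ("A", "C", 5), ("B", "C", -3), ("C", "D", 2), ("D", "B", 1)]
dist, passes = bellman_ford_early_exit("ABCD", edges, "A")
```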
🔹 5. Algorithm Paradigm - Dynamic Programming
Dynamic Programming (DP) is a problem-solving paradigm that solves complex problems by:
- Breaking them into smaller subproblems
- Solving each subproblem once
- Storing and reusing results to avoid redundant work
It applies when a problem has two key properties:
- Optimal Substructure: the optimal solution to the whole problem contains optimal solutions to its subproblems
- Overlapping Subproblems: the same subproblems are solved repeatedly
🔹 6. How Does Bellman-Ford Fall Under DP?
Bellman-Ford is DP because it doesn't recompute — it progressively refines shortest path estimates
iteration by iteration, each one standing on the shoulders of the last. The dist[] array is its
memory, and the recurrence dist[v] = min(dist[v], dist[u] + w) is its engine.
🔸 Optimal Substructure
The shortest path from source S to vertex V through vertex U is:
shortest path to U + edge weight (U → V)
In other words — the best path to V builds on top of the best path to U. That's
optimal substructure.
🔸 Overlapping Subproblems
Every iteration reuses dist[] values computed in the previous iteration. You're not
recomputing from scratch — you're building on what you already know.
🔹 7. Why V − 1 Iterations?
For a graph with V vertices:
- A shortest path never repeats a vertex (repeating one would mean traversing a cycle), so it uses at most V − 1 edges
- After i full passes, every shortest path that uses at most i edges is settled
- Repeating relaxation V − 1 times therefore guarantees correctness
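How many of those V − 1 passes actually do work depends on edge order. A small experiment (the helper `passes_needed` is ours) on the chain A → B → C → D shows both extremes: listed against the chain's direction, every pass is needed; in the friendly order, one pass suffices:

```python
def passes_needed(vertices, edges, source):
    """Run the V-1 relaxation passes and report the last pass that changed anything."""
    INF = float("inf")
    dist = {v: INF for v in vertices}
    dist[source] = 0
    last_useful = 0
    for i in range(1, len(vertices)):  # passes 1 .. V-1
        updated = False
        for u, v, w in edges:
            if dist[u] != INF and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                updated = True
        if updated:
            last_useful = i
    return last_useful

# Chain A -> B -> C -> D, edges listed in the worst possible order:
# each pass can only push known distances one edge further along the chain.
worst = [("C", "D", 1), ("B", "C", 1), ("A", "B", 1)]
best = list(reversed(worst))  # A->B processed first: one pass settles everything

worst_passes = passes_needed("ABCD", worst, "A")
best_passes = passes_needed("ABCD", best, "A")
```

This is exactly why V − 1 is the guarantee and not merely a typical case: the bound must cover the worst ordering.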
🔹 8. Steps of the Algorithm
1. Initialize distances: source = 0, all others = ∞
2. Relax all edges: repeat for V − 1 iterations
3. Check for a negative cycle: perform one extra iteration; if any value updates, a negative cycle exists
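The three steps above translate directly into Python. This is a plain transcription of the table, not a tuned implementation:

```python
def bellman_ford(vertices, edges, source):
    """Textbook Bellman-Ford. Returns (dist, prev), or raises ValueError
    when a negative-weight cycle is reachable from the source."""
    INF = float("inf")
    # Step 1: initialize -- source gets 0, everything else infinity.
    dist = {v: INF for v in vertices}
    prev = {v: None for v in vertices}
    dist[source] = 0
    # Step 2: relax every edge, V - 1 times.
    for _ in range(len(vertices) - 1):
        for u, v, w in edges:
            if dist[u] != INF and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                prev[v] = u
    # Step 3: one extra pass -- any further improvement means a negative cycle.
    for u, v, w in edges:
        if dist[u] != INF and dist[u] + w < dist[v]:
            raise ValueError("graph contains a reachable negative-weight cycle")
    return dist, prev

dist, prev = bellman_ford("ABC", [("A", "B", 2), ("B", "C", -1)], "A")
```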
🔹 9. Demonstration (WITH Negative Edge)
✅ Example Graph
Vertices: A, B, C, D — Source: A
| Edge | Weight |
|------|--------|
| A → B | 4 |
| A → C | 5 |
| B → C | −3 (negative) |
| C → D | 2 |
| D → B | 1 |
Graph with negative edge B → C (weight −3)
🔸 Step 0: Initialization
| Vertex | Distance |
|--------|----------|
| A | 0 |
| B | ∞ |
| C | ∞ |
| D | ∞ |
🔸 Iteration 1
A → B: 0 + 4 = 4 → B = 4
A → C: 0 + 5 = 5 → C = 5
B → C: 4 + (−3) = 1 → C = 1 ✅ (negative edge improves path!)
C → D: 1 + 2 = 3 → D = 3
D → B: 3 + 1 = 4 → no change
After Iteration 1:
| Vertex | Distance |
|--------|----------|
| A | 0 |
| B | 4 |
| C | 1 |
| D | 3 |
🔸 Iteration 2
No changes occur — all distances already optimal.
🔸 Iteration 3
No changes → stable. Algorithm converges.
🔹 Final Distances
| Vertex | Distance |
|--------|----------|
| A | 0 |
| B | 4 |
| C | 1 |
| D | 3 |
Key Learning:
✔ Negative edges are handled correctly
✔ They can actually produce shorter paths
✔ Algorithm still converges normally
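Running this example in code also fills in prev[], the predecessor array promised in the Input/Output section; walking prev[] backwards from D recovers the path itself, not just its length:

```python
INF = float("inf")
edges = [("A", "B", 4), ("A", "C", 5), ("B", "C", -3), ("C", "D", 2), ("D", "B", 1)]
dist = {v: INF for v in "ABCD"}
prev = {v: None for v in "ABCD"}
dist["A"] = 0

for _ in range(3):  # V - 1 = 3 relaxation passes
    for u, v, w in edges:
        if dist[u] != INF and dist[u] + w < dist[v]:
            dist[v] = dist[u] + w
            prev[v] = u

# Walk the predecessor chain back from D, then reverse it.
path, node = [], "D"
while node is not None:
    path.append(node)
    node = prev[node]
path.reverse()
```

The shortest route to D goes through the negative edge: A → B → C → D with total weight 4 − 3 + 2 = 3.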
🔹 10. Demonstration (Negative Cycle Case)
❗ Example Graph with Negative Cycle
Vertices: A, B, C — Source: A
| Edge | Weight |
|------|--------|
| A → B | 1 |
| B → C | −1 |
| C → A | −1 |
👉 Cycle: A → B → C → A | Total weight = 1 + (−1) + (−1) = −1
Triangle graph with negative cycle (A → B → C → A)
🔸 Step 0: Initialization
| Vertex | Distance |
|--------|----------|
| A | 0 |
| B | ∞ |
| C | ∞ |
🔸 Iteration 1
A → B: 0 + 1 = 1 → B = 1
B → C: 1 + (−1) = 0 → C = 0
C → A: 0 + (−1) = −1 → A = −1
🔸 Iteration 2
A → B: −1 + 1 = 0 → B = 0
B → C: 0 + (−1) = −1 → C = −1
C → A: −1 + (−1) = −2 → A = −2
🔸 Observation: Distances keep decreasing!
| Iteration | A | B | C |
|-----------|---|---|---|
| Start | 0 | ∞ | ∞ |
| 1 | −1 | 1 | 0 |
| 2 | −2 | 0 | −1 |
👉 Values never stabilize — they keep going down infinitely.
🔸 Final Check (Extra Iteration)
If we relax again, distances still decrease.
✅ Therefore: Negative cycle exists. Shortest path is undefined.
Key Learning (Negative Cycle):
✔ Shortest path is not defined when a negative cycle exists
✔ Algorithm detects the cycle because values keep decreasing
✔ The extra iteration is critical for detection
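The detection logic fits in a few lines. This sketch replays the triangle graph and expresses the extra iteration as a boolean check:

```python
INF = float("inf")
edges = [("A", "B", 1), ("B", "C", -1), ("C", "A", -1)]
dist = {"A": 0, "B": INF, "C": INF}

for _ in range(2):  # V - 1 = 2 relaxation passes
    for u, v, w in edges:
        if dist[u] != INF and dist[u] + w < dist[v]:
            dist[v] = dist[u] + w

# Extra iteration: if any edge still relaxes, a negative cycle is reachable.
has_negative_cycle = any(
    dist[u] != INF and dist[u] + w < dist[v] for u, v, w in edges
)
```

After the V − 1 passes the distances match the table above (A = −2, B = 0, C = −1), and the extra check still finds improvable edges, so the cycle is reported.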
🔹 11. Important Observations
| Scenario | Behavior |
|----------|----------|
| Stable graph | Values stop changing |
| Negative edge | Still stable (converges normally) |
| Negative cycle | Values keep decreasing, never stable |
🔹 12. Time Complexity Analysis
Bellman-Ford runs in O(V × E) time in the worst case.
This comes from its three phases:
- Initialization: visits every vertex once → O(V)
- Relaxation: repeats V − 1 times over all edges → O(V × E)
- Negative cycle check: scans all edges once more → O(E)
🔸 Recurrence Relation & Complexity Derivation
The time complexity is derived from how Bellman-Ford resolves its core recurrence relation:
dist[v] = min(dist[v], dist[u] + w). It computes this iteratively using nested loops:
for i = 1 to |V| − 1:              → runs V − 1 times
    for each edge (u, v, w):       → runs E times per iteration
        if dist[u] + w < dist[v]:  → O(1) comparison
Multiplying these nested operations gives the dominant complexity: (V − 1) × E × O(1) = O(V × E).
The relaxation phase dominates, yielding the final complexity of O(V × E).
Why is this acceptable?
Dijkstra's algorithm is faster at O((V + E) log V), but it cannot handle negative edge weights.
Bellman-Ford sacrifices speed in exchange for correctness on any weighted graph — including those with
negative edges and cycle detection.
🔸 How Complexity Changes with Input
| Input Characteristic | Effect on Complexity |
|----------------------|----------------------|
| Sparse graph (E ≈ V) | O(V × E) becomes O(V²) |
| Dense graph (E ≈ V²) | O(V × E) becomes O(V³) |
| Early termination triggers | Drops toward O(E) best case |
| Negative cycle present | Still O(V × E); cycle check adds only O(E) |
| Disconnected graph | Same complexity; unreachable vertices stay at ∞ |
🔹 13. Problem Classification: P vs NP
Bellman-Ford solves the Single Source Shortest Path problem in O(V × E) — a fully
polynomial time solution. This places it firmly in the class P, not NP.
🔸 Why Bellman-Ford is in P
Bellman-Ford runs in O(V × E) — every step is bounded and predictable. There is no exhaustive
search, no backtracking, no exponential blowup. It always terminates in polynomial time regardless of input.
🔸 The Closest NP-Complete Relative
The Longest Path Problem — finding the longest simple path in a graph — is
NP-Complete. It looks almost identical to shortest path, but flipping min to max breaks
optimal substructure, meaning DP no longer works and no polynomial solution is known.
🔹 14. Limitations of Bellman–Ford
❌ High Time Complexity: takes O(V × E), slow for large graphs.
❌ Inefficient vs. Alternatives: slower than Dijkstra's for graphs without negative weights.
❌ Cannot Resolve Negative Cycles: if a negative cycle exists, the shortest path is undefined; the algorithm only detects the cycle, it does not resolve it.
❌ Repeated Edge Relaxation: even if the solution is found early, it still checks all edges multiple times (unless optimized with early termination).
❌ Not Suitable for Real-Time Large Systems: too slow for very large networks or time-critical computations.
🔹 15. Easy Way to Remember
“Keep relaxing all edges V − 1 times. If it still improves, a negative cycle exists.”
📖 How to Use
1. Generate a Graph
Click RANDOM to test standard distances, or NEGATIVE CYCLES to see how the algorithm detects invalid paths.
2. Choose Source Node
Pick a node from Select source node. The chosen node will be used as the starting vertex for Bellman-Ford.
3. Trace the Algorithm
▶ Run (All Steps): plays through the algorithm automatically.
Step ▶ / ◀ Prev: step through the edge relaxations one by one.
4. Restart/Clear
Use ↺ Reset to begin tracing again, or ✕ Clear to empty everything.
💡 Watch the 🧠 Step Explanation on the right as you go!
📝 Pseudocode
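In text form, the procedure reads as follows (pseudocode mirroring the three steps from section 8):

```text
BELLMAN-FORD(G = (V, E), source s):
    for each vertex v in V:
        dist[v] ← ∞
        prev[v] ← NIL
    dist[s] ← 0

    repeat |V| − 1 times:                      // Step 2: relax all edges
        for each edge (u, v) with weight w:
            if dist[u] + w < dist[v]:
                dist[v] ← dist[u] + w
                prev[v] ← u

    for each edge (u, v) with weight w:        // Step 3: negative-cycle check
        if dist[u] + w < dist[v]:
            report "negative cycle detected"

    return dist, prev
```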
⏱ Complexity
| Metric | Value |
|--------|-------|
| Time (best) | O(E) |
| Time (worst) | O(V × E) |
| Space | O(V) |
🔍 Algorithm Comparison
| Algorithm | Neg. Weights? | Time |
|-----------|---------------|------|
| Dijkstra | ✕ No | O((V + E) log V) |
| Bellman-Ford | ✓ Yes | O(V × E) |
| Floyd-Warshall | ✓ Yes | O(V³) |
💡 Use Cases
Network Routing (RIP): the Routing Information Protocol uses a distance-vector algorithm based on Bellman-Ford to find shortest paths in computer networks.