Discover how a 19th-century mathematical shortcut transformed the performance of a 21st-century gaming algorithm
As developers, we constantly seek ways to optimize our code. Sometimes, the most powerful optimization comes from seemingly simple ideas. Today, I’ll share a fascinating story about Carl Friedrich Gauss, a 19th-century mathematician, and show how his arithmetic shortcut can dramatically improve performance in Swift — both in theoretical and real-world scenarios.
Buckle up, because at the end of this article, you’ll understand the magic of Gauss’s formula.
Imagine you need to calculate the sum of all integers from 1 to n. It’s a classic problem with several ways to solve it. For demonstration, I wrote three Swift functions:
1 — For-In Loop
This is the most straightforward approach. We iterate through each number from 1 to n and add them up:
func sumFromOneForIn(upto n: Int) -> Int {
    var result = 0
    for i in 1...n {
        result += i
    }
    return result
}
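A quick sanity check (the input value 100 is just an example of mine). Note that the range 1...n traps at runtime when n is less than 1, so this helper, and the reduce version below, assumes n >= 1:

// Example usage; assumes n >= 1, since 1...n is an invalid range for n = 0.
print(sumFromOneForIn(upto: 100)) // 5050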
2 — Reduce Method
Swift’s functional programming capabilities let us use the reduce method to sum the numbers in a range:
func sumFromOneReduce(upto n: Int) -> Int {
    return (1...n).reduce(0, +)
}
3 — Gauss’s Formula
Now, let’s introduce the star of our show: Gauss’s formula. As a schoolboy, Gauss reportedly noticed that pairing the numbers from opposite ends (1 with n, 2 with n - 1, and so on) produces n/2 pairs that each sum to n + 1, so the sum of the numbers from 1 to n can be calculated with this simple equation:
sum = n * (n + 1) / 2
Translated to Swift, it looks like this:
func sumFromOneGaussFormula(upto n: Int) -> Int {
    return (n * (n + 1)) / 2
}
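One caveat worth noting: for very large n, the intermediate product n * (n + 1) can exceed Int.max, which traps at runtime in Swift. Here is a minimal overflow-aware sketch of my own (the name and optional return type are not part of the original code), using the standard multipliedReportingOverflow(by:) API:

// My own variant: returns nil instead of crashing when n * (n + 1) would overflow Int.
func sumFromOneGaussFormulaSafe(upto n: Int) -> Int? {
    guard n >= 1, n < Int.max else { return nil } // n + 1 must also fit in an Int
    let (product, didOverflow) = n.multipliedReportingOverflow(by: n + 1)
    return didOverflow ? nil : product / 2
}

For typical inputs like n = 1,000,000 the plain version is perfectly safe; overflow only becomes a concern when n approaches roughly 3 billion on 64-bit platforms.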
Testing Performance
To measure how these methods perform, I used Swift’s DispatchTime API to calculate execution times in nanoseconds. Here’s the measurement code for n = 1,000,000:
let n = 1_000_000
let timeForIn = SumCalculator.measureExecutionTime(of: SumCalculator.sumFromOneForIn, upto: n)
let timeReduce = SumCalculator.measureExecutionTime(of: SumCalculator.sumFromOneReduce, upto: n)
let timeGaussFormula = SumCalculator.measureExecutionTime(of: SumCalculator.sumFromOneGaussFormula, upto: n)
print("For-In: \(timeForIn) ns")
print("Reduce: \(timeReduce) ns")
print("Formula: \(timeGaussFormula) ns")
Who’s the Fastest?
Before we dive into the performance comparison, let’s briefly discuss Big-O notation — a fundamental concept in computer science that helps us evaluate an algorithm’s efficiency.
Big-O notation describes how the execution time (or space usage) of an algorithm grows relative to the size of the input. It’s expressed in terms of n, where n represents the size of the input data.
For example:
O(1): Constant time. The algorithm’s performance does not depend on n.
O(n): Linear time. The execution time grows proportionally with n.
O(n²): Quadratic time. The execution time grows with the square of n; doubling the input roughly quadruples the work.
Understanding Big-O is crucial because it allows us to predict how an algorithm will perform as the input size scales, making it easier to choose the most efficient solution for a given problem.
Now, back to our comparison. Here’s how the three methods stack up in terms of Big-O:
For-In Loop: This approach processes each number individually, resulting in a time complexity of O(n). For large n, the execution time grows linearly.
Reduce Method: While it uses functional programming, it also iterates through the range internally, giving it the same O(n) complexity as the For-In loop.
Gauss’s Formula: This is the clear winner. Thanks to its constant-time complexity O(1), it computes the sum directly with a couple of arithmetic operations, regardless of n.
For n = 1,000,000, this difference is striking. The For-In loop and the Reduce method each take well over 100 million nanoseconds (roughly 0.17 seconds), while Gauss’s formula completes in a handful of nanoseconds.
By leveraging a smarter algorithm, we achieve not just better performance but also scalability, making Gauss’s formula a go-to solution for summing sequences and similar problems. Here are the numbers from my machine, first a single run and then the averaged timings:
Execution time of ForIn: 172266625 nanoseconds
Execution time of Reduce: 195087334 nanoseconds
Execution time of GaussFormula: 209 nanoseconds
Average execution time of ForIn: 170162783 nanoseconds
Average execution time of Reduce: 173793275 nanoseconds
Average execution time of GaussFormula: 16 nanoseconds
The average execution times:
- Gauss’s formula: 16 nanoseconds (0.000000016 seconds)
- Reduce or For-In: roughly 170,000,000 nanoseconds (0.17 seconds)
With Gauss’s formula, the function ran roughly 10,625,000 times faster (170,000,000 ÷ 16 = 10,625,000).
Real-World Application: Leaderboard Scores
Let’s move from theory to practice. I am building a gaming app where players accumulate scores as they progress through levels. To display a leaderboard, I need to calculate the total score up to a specific level n.
Here’s how I can solve it using the same three approaches (déjà vu):
1 — Using a Loop (Naive Approach)
Iterates through the scores and sums them:
func cumulativeScoreLoop(scores: [Int], upto level: Int) -> Int {
    var total = 0
    for i in 0..<min(level, scores.count) {
        total += scores[i]
    }
    return total
}
2 — Using Reduce (Functional Approach)
Summing scores with Swift’s reduce:
func cumulativeScoreReduce(scores: [Int], upto level: Int) -> Int {
    return scores.prefix(level).reduce(0, +)
}
3 — Using Gauss’s Formula (Optimized)
If scores are predictable (like 1, 2, 3, …), we can compute the total directly:
func cumulativeScoreGaussFormula(upto level: Int) -> Int {
    return (level * (level + 1)) / 2
}
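A quick usage sketch with sample data of my own, where level i is worth exactly i points, so all three approaches should agree:

let scores = Array(1...1_000_000) // level i is worth i points
let level = 1_000_000

let fromLoop = cumulativeScoreLoop(scores: scores, upto: level)
let fromReduce = cumulativeScoreReduce(scores: scores, upto: level)
let fromFormula = cumulativeScoreGaussFormula(upto: level)

print(fromLoop == fromReduce && fromReduce == fromFormula) // true
print(fromFormula) // 500000500000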
Performance Comparison
With 1,000,000 levels, the Gauss formula again significantly outperforms the loop and reduce approaches, since it never needs to touch the scores array at all.
Bonus Application: Finding Missing Numbers
Here’s another practical use case: detecting a missing number in a sequence. Suppose you have a sequence of numbers from 1 to n, but one number is missing. By calculating the expected sum using Gauss’s formula and subtracting the actual sum, you can find the missing number instantly.
func findMissingNumber(_ nums: [Int], upto n: Int) -> Int {
    let expectedSum = (n * (n + 1)) / 2
    let actualSum = nums.reduce(0, +)
    return expectedSum - actualSum
}
Let's test:
let nums = [1, 2, 4, 5] // Missing 3
let missingNumber = findMissingNumber(nums, upto: 5)
print("Missing number: \(missingNumber)") // Output: 3
The Beauty of Gauss’s Formula
This journey from theory to real-world applications reveals how a centuries-old mathematical insight can optimize modern code. Gauss’s formula reduces time complexity from O(n) to O(1), making it ideal for high-performance apps and large datasets.
Chances are, a clever trick like this could save you both time and computational resources. Next time you face a computational challenge, ask yourself: Is there a smarter way?