LeetCode Problem Workspace
Perfect Squares
Practice Focus
Medium · State transition dynamic programming
Answer-first summary
Perfect Squares asks for the least number of perfect squares summing to a given integer n using dynamic programming or BFS.
To solve Perfect Squares, we need to determine the smallest number of perfect square numbers that sum to a target n. Dynamic programming builds the answer through state transitions, while BFS finds it as the shortest path in a level-by-level search; both compute the minimum efficiently.
Problem Statement
Given a positive integer n, return the least number of perfect square numbers that sum to n. A perfect square is the product of an integer with itself. For example, 1, 4, 9, and 16 are perfect squares. Your task is to find how many of these squares sum to n with the fewest terms.
For instance, if n = 12, the least number of perfect squares that sum to 12 is 3 (12 = 4 + 4 + 4). If n = 13, the least number of perfect squares is 2 (13 = 4 + 9).
Examples
Example 1
Input: n = 12
Output: 3
12 = 4 + 4 + 4.
Example 2
Input: n = 13
Output: 2
13 = 4 + 9.
Constraints
- 1 <= n <= 10^4
Solution Approach
State Transition Dynamic Programming
Dynamic programming solves this problem by progressively building the least number of squares needed to sum up to each integer from 0 to n. You maintain an array dp where dp[i] represents the least number of perfect squares summing to i. The state transitions are made by checking all smaller perfect squares and updating dp values accordingly.
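The transition above can be sketched with a 1-D dp array (a minimal sketch; `num_squares` is a hypothetical helper name, not part of the original solution):

```python
def num_squares(n: int) -> int:
    # dp[i] = least number of perfect squares summing to i; dp[0] = 0 is the base case
    dp = [0] + [float("inf")] * n
    for i in range(1, n + 1):
        j = 1
        while j * j <= i:
            # Transition: use one square j*j plus the best answer for the remainder
            dp[i] = min(dp[i], dp[i - j * j] + 1)
            j += 1
    return dp[n]

print(num_squares(12))  # 3 (4 + 4 + 4)
print(num_squares(13))  # 2 (4 + 9)
```

Each dp[i] is filled from smaller subproblems, so a single left-to-right pass suffices.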
Breadth-First Search (BFS)
In BFS, treat the problem as finding the shortest path in an unweighted graph where each node represents an integer and edges represent adding a perfect square. Starting from n, explore possible moves by subtracting perfect squares, and use BFS to find the shortest path to 0, minimizing the number of steps.
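The level-by-level search above might look like this (a minimal sketch; `num_squares_bfs` is a hypothetical name):

```python
from collections import deque

def num_squares_bfs(n: int) -> int:
    # Each BFS level corresponds to using one more perfect-square term
    squares = [i * i for i in range(1, int(n ** 0.5) + 1)]
    queue = deque([n])
    visited = {n}
    steps = 0
    while queue:
        steps += 1
        for _ in range(len(queue)):
            remainder = queue.popleft()
            for sq in squares:
                nxt = remainder - sq
                if nxt == 0:
                    return steps  # reached 0: steps squares were used
                if nxt > 0 and nxt not in visited:
                    visited.add(nxt)
                    queue.append(nxt)
    return steps
```

The `visited` set is what keeps the search from re-expanding the same remainder and blowing up the queue.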
Greedy Approximation with Pre-computed Squares
A greedy approach can involve subtracting the largest possible perfect square at each step. This is not always optimal but offers a fast approximation. Precomputing squares less than or equal to n can speed up this approach by reducing the number of possibilities to check in each iteration.
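A sketch of the greedy idea, which also shows where it breaks (`num_squares_greedy` is a hypothetical name):

```python
def num_squares_greedy(n: int) -> int:
    # Repeatedly subtract the largest perfect square <= remainder.
    # NOT always optimal: for n = 12 it picks 9 + 1 + 1 + 1 (4 terms),
    # while the true answer is 4 + 4 + 4 (3 terms).
    count = 0
    while n > 0:
        root = int(n ** 0.5)
        n -= root * root
        count += 1
    return count

print(num_squares_greedy(12))  # 4, but the optimal answer is 3
print(num_squares_greedy(13))  # 2, optimal here by coincidence
```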
Complexity Analysis
| Metric | Value |
|---|---|
| Time | O(n · √n) for both dynamic programming and BFS |
| Space | O(n) for the dp array (DP) or the queue and visited set (BFS) |
For dynamic programming, the time complexity is O(n * sqrt(n)): each of the n dp entries is updated by iterating over all perfect squares less than or equal to it, of which there are at most sqrt(n). The space complexity is O(n) for the dp array. BFS also runs in O(n * sqrt(n)) time in the worst case, since each integer from 0 to n is enqueued at most once and each dequeue tries up to sqrt(n) moves; it needs O(n) additional space for the queue and visited set.
What Interviewers Usually Probe
- Can the candidate identify the optimal approach between dynamic programming and BFS for solving the problem?
- Does the candidate handle state transitions efficiently in dynamic programming or BFS?
- How well does the candidate explain the trade-offs between time complexity and space complexity in different approaches?
Common Pitfalls or Variants
Common pitfalls
- Not considering the correct base case in dynamic programming, such as dp[0] = 0.
- For BFS, not properly managing the queue, leading to unnecessary computations or missing the shortest path.
- Assuming that a greedy approach will always yield the optimal solution, which is not guaranteed.
Follow-up variants
- What if the problem allows negative integers? This would add complexity in handling transitions.
- How can this problem be solved with memoization instead of an iterative approach?
- What happens if we only use odd perfect squares or squares of primes? This introduces constraints and alters the solution.
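The memoization variant mentioned above can be sketched top-down (a minimal sketch; `num_squares_memo` is a hypothetical name):

```python
from functools import cache

@cache
def num_squares_memo(n: int) -> int:
    # Top-down equivalent of the bottom-up dp: recurse on n - j*j,
    # caching each subproblem so it is solved only once.
    if n == 0:
        return 0  # base case: zero squares sum to 0
    best = float("inf")
    j = 1
    while j * j <= n:
        best = min(best, num_squares_memo(n - j * j) + 1)
        j += 1
    return best

print(num_squares_memo(12))  # 3
```

The cached recursion explores the same O(n · √n) state space as the iterative table, just in demand-driven order.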
FAQ
How can I optimize my solution for Perfect Squares?
To optimize Perfect Squares, you can use dynamic programming with a bottom-up approach or BFS for a level-wise search. Consider space and time trade-offs for each method.
What is the time complexity for dynamic programming in Perfect Squares?
The time complexity for dynamic programming is O(n * sqrt(n)) because for each value from 1 to n, you check all perfect squares up to that number.
Can BFS be used to solve the Perfect Squares problem?
Yes, BFS can be used to solve Perfect Squares by treating the problem as a shortest path problem, where each state represents an integer and the goal is to reach 0.
Why does the greedy approach not always work for Perfect Squares?
The greedy approach does not always work because subtracting the largest perfect square at each step may not lead to the minimum number of terms required to sum to n.
What is the base case in dynamic programming for Perfect Squares?
The base case in dynamic programming is dp[0] = 0, since no perfect squares are needed to sum to 0.
Solution
Solution 1
#### Python3
from math import inf, sqrt

class Solution:
    def numSquares(self, n: int) -> int:
        m = int(sqrt(n))
        # f[i][j]: least count of squares drawn from {1^2, ..., i^2} summing to j
        f = [[inf] * (n + 1) for _ in range(m + 1)]
        f[0][0] = 0
        for i in range(1, m + 1):
            for j in range(n + 1):
                f[i][j] = f[i - 1][j]  # skip the square i*i
                if j >= i * i:
                    # use one more copy of i*i
                    f[i][j] = min(f[i][j], f[i][j - i * i] + 1)
        return f[m][n]