LeetCode Problem Workspace
Convert 1D Array Into 2D Array
Convert a 1D integer array into a structured 2D array with specified rows and columns using all elements sequentially.
Practice Focus
Easy · Array plus Matrix
This problem requires creating a 2D array from a given 1D array with exact dimensions. If the total number of elements in original does not equal m * n, conversion is impossible. Otherwise, fill rows sequentially from the 1D array, forming a clear row-major matrix structure.
Problem Statement
You are given a 0-indexed integer array original and two integers m and n. Construct a 2D array with m rows and n columns by placing elements of original sequentially. If original cannot fill an m x n matrix exactly, return an empty array.
Elements from original are grouped into consecutive segments of length n to form each row. The first n elements form the first row, the next n elements form the second row, continuing until all elements are placed. Ensure the 2D array strictly uses all original elements without leftovers.
Examples
Example 1
Input: original = [1,2,3,4], m = 2, n = 2
Output: [[1,2],[3,4]]
The constructed 2D array should contain 2 rows and 2 columns. The first group of n=2 elements in original, [1,2], becomes the first row in the constructed 2D array. The second group of n=2 elements in original, [3,4], becomes the second row in the constructed 2D array.
Example 2
Input: original = [1,2,3], m = 1, n = 3
Output: [[1,2,3]]
The constructed 2D array should contain 1 row and 3 columns. Put all three elements in original into the first row of the constructed 2D array.
Example 3
Input: original = [1,2], m = 1, n = 1
Output: []
There are 2 elements in original. It is impossible to fit 2 elements in a 1x1 2D array, so return an empty 2D array.
Constraints
- 1 <= original.length <= 5 * 10^4
- 1 <= original[i] <= 10^5
- 1 <= m, n <= 4 * 10^4
Solution Approach
Check feasibility
First, verify if the product of m and n equals the length of original. If not, return an empty array immediately to handle the impossibility case.
Initialize 2D array
Create an empty list for the result. Prepare to append m rows, each of size n, as you iterate through original.
Fill rows sequentially
Iterate through original in steps of n, slicing n elements at a time, and append each slice as a new row to the result array. This preserves the original order and ensures correct row-major placement.
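The steps above can be sketched as a short standalone helper; the name `construct_2d` is an illustrative assumption, not part of the problem's required interface:

```python
from typing import List

def construct_2d(original: List[int], m: int, n: int) -> List[List[int]]:
    # Step 1: feasibility -- the element count must equal m * n exactly.
    if m * n != len(original):
        return []
    # Steps 2-3: slice n elements at a time; each slice becomes one row.
    return [original[i : i + n] for i in range(0, len(original), n)]
```

For instance, `construct_2d([1, 2, 3, 4], 2, 2)` yields `[[1, 2], [3, 4]]`, matching Example 1, while `construct_2d([1, 2], 1, 1)` returns `[]` as in Example 3.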
Complexity Analysis
| Metric | Value |
|---|---|
| Time | $O(m \times n)$ |
| Space | $O(1)$ |
Time complexity is $O(m \times n)$ because each element is visited exactly once. Additional space is $O(1)$ beyond the output: the slices that form the rows are the output matrix itself, so no auxiliary structures are needed.
What Interviewers Usually Probe
- Are you checking whether the total elements match m times n?
- Can you fill rows sequentially while maintaining the order from the original array?
- What will you return if conversion is impossible due to element count mismatch?
Common Pitfalls or Variants
Common pitfalls
- Not verifying that original.length equals m times n, leading to incorrect output or runtime errors.
- Incorrectly slicing elements, which can shift positions and break row formation.
- Assuming extra elements can be truncated instead of returning an empty array when exact fit is impossible.
Follow-up variants
- Convert 1D array into jagged 2D array where row lengths can vary, requiring dynamic slicing.
- Fill the 2D array column-major instead of row-major, changing the iteration pattern.
- Handle negative or zero elements in original; the row-filling logic is unchanged, since feasibility depends only on the array's length, not its values.
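For the column-major variant, a minimal sketch (the name `construct_col_major` and the index arithmetic are illustrative assumptions): element k of original lands at row k % m, column k // m, so columns fill before rows.

```python
from typing import List

def construct_col_major(original: List[int], m: int, n: int) -> List[List[int]]:
    # The same feasibility check applies: an exact fit is still required.
    if m * n != len(original):
        return []
    result = [[0] * n for _ in range(m)]
    for k, val in enumerate(original):
        # Fill down each column first: row k % m, column k // m.
        result[k % m][k // m] = val
    return result
```

With original = [1, 2, 3, 4], m = 2, n = 2, this yields [[1, 3], [2, 4]] instead of the row-major [[1, 2], [3, 4]].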
FAQ
When is it impossible to convert the 1D array into a 2D array?
Conversion is impossible if the length of original does not equal m multiplied by n. Always check this first.
How do I fill the 2D array from the 1D array?
Take slices of length n from original sequentially and append each slice as a row to form the 2D array.
Does this problem require extra space?
No extra space is required beyond the output 2D array itself; each slice taken from original becomes a row of the output, so no auxiliary structures are allocated.
Can I use the same logic for jagged 2D arrays?
No, this exact row-major pattern assumes all rows are equal length, so jagged arrays require a different approach.
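As a hedged sketch of that jagged case (the signature and the `row_lengths` parameter are assumptions for illustration), the slicing simply uses a varying length per row, and the feasibility check compares against the sum of requested lengths:

```python
from typing import List

def construct_jagged(original: List[int], row_lengths: List[int]) -> List[List[int]]:
    # Feasibility now checks against the sum of the requested row lengths.
    if sum(row_lengths) != len(original):
        return []
    rows, i = [], 0
    for length in row_lengths:
        # Slice a row of the requested length, then advance the cursor.
        rows.append(original[i : i + length])
        i += length
    return rows
```

For example, `construct_jagged([1, 2, 3, 4, 5], [2, 3])` yields `[[1, 2], [3, 4, 5]]`.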
What is the main pattern in Convert 1D Array Into 2D Array?
The main pattern is Array plus Matrix: sequentially map 1D array elements into a structured row-major 2D matrix.
Solution
Solution 1: Simulation
According to the problem description, we know that to construct an $m$-row and $n$-column two-dimensional array, it needs to satisfy that $m \times n$ equals the length of the original array. If it does not satisfy, return an empty array directly.
```python
from typing import List

class Solution:
    def construct2DArray(self, original: List[int], m: int, n: int) -> List[List[int]]:
        # Conversion is impossible unless the element count matches m * n exactly.
        if m * n != len(original):
            return []
        # Take slices of length n; each slice becomes one row.
        return [original[i : i + n] for i in range(0, m * n, n)]
```