## preface

Hiring season generally peaks in March–April ("golden three, silver four") and September–October ("golden nine, silver ten"). Recently I interviewed more than a dozen senior front-end candidates, and one of the small algorithm questions I used was: the backend returns a flat data structure — convert it into a tree.

Let's look at the question. The flat data looks like this:

```javascript
let arr = [
  { id: 1, name: 'Sector 1', pid: 0 },
  { id: 2, name: 'Sector 2', pid: 1 },
  { id: 3, name: 'Sector 3', pid: 1 },
  { id: 4, name: 'Sector 4', pid: 3 },
  { id: 5, name: 'Sector 5', pid: 4 },
];
```

Expected output:

```javascript
[
  {
    "id": 1,
    "name": "Sector 1",
    "pid": 0,
    "children": [
      { "id": 2, "name": "Sector 2", "pid": 1, "children": [] },
      {
        "id": 3,
        "name": "Sector 3",
        "pid": 1,
        "children": [
          // ...and so on
        ]
      }
    ]
  }
]
```

The requirement is very simple, and performance can be ignored at first. I implemented the function myself, then looked back over the interview results, and they surprised me:

- 10% had no idea at all and had never encountered this structure
- 60% said they had used recursion and had an idea, but when handed a notebook they just couldn't write it out
- 20% could write it with some guidance
- The remaining 10% could write it, but not with the best performance

It is really hard to find the right person, even in hiring season.

Next, let's implement this small algorithm in several ways.

## What makes an algorithm good or bad

An algorithm is generally judged by its execution time and the space it occupies: the shorter the execution time and the less memory it uses, the better the algorithm. Accordingly, we use time complexity to describe execution time and space complexity to describe memory usage.

### Time complexity

Time complexity does not measure the concrete running time of a program, but the number of statements the algorithm executes as a function of the input size n.

As n grows, a higher time complexity means the algorithm takes more time. Common time complexities, from best to worst, are:

- Constant order O(1)
- Logarithmic order O(log n)
- Linear order O(n)
- Linearithmic order O(n log n)
- Quadratic order O(n^2)
- Cubic order O(n^3)
- K-th order O(n^k)
- Exponential order O(2^n)
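As a quick illustration of logarithmic order (this example is ours, not part of the interview question): binary search halves the remaining range on every comparison, so searching a sorted array of n elements takes at most about log2 n steps.

```javascript
// Binary search over a sorted array: each iteration halves the
// remaining range [lo, hi], so the loop runs O(log n) times.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1; // middle index
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1; // not found
}

// binarySearch([1, 3, 5, 7, 9], 7) → 3
```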

#### Calculation method

- Keep only the term with the highest growth rate
- Drop that term's coefficient (reduce it to 1)
- If only a constant remains, write it as O(1)

For example, if f(n) = 3n^4 + 3n + 300, the time complexity is O(n^4).

Time complexity is usually computed for the worst case. A few points to note when calculating it:

- If the execution time does not grow with n — even if the algorithm contains thousands of statements — the execution time is just a large constant, and the time complexity is O(1). For example, the following loop body executes about 100 times, a constant, so the complexity is O(1).

```javascript
let x = 1;
while (x < 100) {
  x++;
}
```

- When loops are nested, the time complexity is determined by the innermost statement of the most deeply nested loop. In the following nested for loop, the inner loop runs n times for each of the n iterations of the outer loop, so the time complexity is O(n^2).

```javascript
for (let i = 0; i < n; i++) {
  for (let j = 0; j < n; j++) {
    // ...code
  }
}
```

- The number of iterations can depend not only on n but also on the loop condition. In the code below, if no arr[i] equals 1, the loop runs n times and the time complexity is O(n); in the best case, if arr[0] equals 1, the loop body never executes at all, i.e. O(1).

```javascript
for (let i = 0; i < n && arr[i] !== 1; i++) {
  // ...code
}
```

### Spatial complexity

Space complexity measures the amount of storage space an algorithm temporarily occupies while it runs.

#### Calculation method:

- Ignore constants; constant extra space is written as O(1)
- Space complexity of a recursive algorithm = (recursion depth n) × (auxiliary space needed per recursive call)

Some simple examples of calculating space complexity:

- Only a few individual variables are created, so the space complexity is constant: O(1).

```javascript
let a = 1;
let b = 2;
let c = 3;
console.log('output a, b, c', a, b, c);
```

- A recursive implementation: each call to the fun function creates a variable k, and fun is called n times, so the space complexity is O(n × 1) = O(n).

```javascript
function fun(n) {
  let k = 10;
  if (n === k) {
    return n;
  } else {
    return fun(++n);
  }
}
```

## Implementation without regard to performance: recursive search

The main idea is to provide a recursive getChildren method that finds the children of each node.

Even with performance out of the picture, most candidates only knew that recursion was needed — they just couldn't write it out...

```javascript
/**
 * Recursively find children
 */
const getChildren = (data, result, pid) => {
  for (const item of data) {
    if (item.pid === pid) {
      const newItem = { ...item, children: [] };
      result.push(newItem);
      getChildren(data, newItem.children, item.id);
    }
  }
};

/**
 * Conversion entry point
 */
const arrayToTree = (data, pid) => {
  const result = [];
  getChildren(data, result, pid);
  return result;
};
```
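To see it in action, here is a standalone snippet (the functions are repeated so it runs on its own) applied to the sample `arr` from the beginning:

```javascript
// Recursive version, repeated here so the snippet is self-contained.
const getChildren = (data, result, pid) => {
  for (const item of data) {
    if (item.pid === pid) {
      const newItem = { ...item, children: [] };
      result.push(newItem);
      getChildren(data, newItem.children, item.id);
    }
  }
};
const arrayToTree = (data, pid) => {
  const result = [];
  getChildren(data, result, pid);
  return result;
};

const arr = [
  { id: 1, name: 'Sector 1', pid: 0 },
  { id: 2, name: 'Sector 2', pid: 1 },
  { id: 3, name: 'Sector 3', pid: 1 },
  { id: 4, name: 'Sector 4', pid: 3 },
  { id: 5, name: 'Sector 5', pid: 4 },
];

const tree = arrayToTree(arr, 0);
// tree[0] is Sector 1; its children are Sectors 2 and 3:
// tree[0].children.map(c => c.id) → [2, 3]
```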

Analyzing the code above: for every node, getChildren scans the whole array again, so this implementation's time complexity is O(n^2).

## It can be done without recursion

The main idea is to first store the data in a map keyed by id, then traverse the array again and, relying on object references, attach each node directly to its parent looked up in the map.

```javascript
function arrayToTree(items) {
  const result = []; // the resulting tree
  const itemMap = {}; // map from id to node

  // First pass: store every item in the map
  for (const item of items) {
    itemMap[item.id] = { ...item, children: [] };
  }

  // Second pass: attach each node to its parent
  for (const item of items) {
    const id = item.id;
    const pid = item.pid;
    const treeItem = itemMap[id];
    if (pid === 0) {
      result.push(treeItem);
    } else {
      if (!itemMap[pid]) {
        itemMap[pid] = {
          children: [],
        };
      }
      itemMap[pid].children.push(treeItem);
    }
  }
  return result;
}
```

The code above runs two sequential loops, so the time complexity is O(2n) = O(n). A map is needed to store the data, so the space complexity is O(n).

## Optimal performance

The main idea is the same — store the data in a map and rely on object references — but here the map is built while traversing, and the parent–child relationships are established in that same single pass, which performs better.

```javascript
function arrayToTree(items) {
  const result = []; // the resulting tree
  const itemMap = {}; // map from id to node

  for (const item of items) {
    const id = item.id;
    const pid = item.pid;

    // Ensure an entry for this id exists (it may already have been
    // created as a placeholder by one of its children)
    if (!itemMap[id]) {
      itemMap[id] = {
        children: [],
      };
    }
    itemMap[id] = {
      ...item,
      children: itemMap[id]['children'],
    };

    const treeItem = itemMap[id];
    if (pid === 0) {
      result.push(treeItem);
    } else {
      // Parent not seen yet: create a placeholder for it
      if (!itemMap[pid]) {
        itemMap[pid] = {
          children: [],
        };
      }
      itemMap[pid].children.push(treeItem);
    }
  }
  return result;
}
```

Analyzing the code above, everything is done in a single pass, so the time complexity is O(n). A map is needed to store the data, so the space complexity is O(n).
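A nice property of the single-pass version (our observation, easy to verify): because a missing parent gets a placeholder entry in the map, the input does not need to be sorted — a child may appear before its parent. A standalone sketch, repeating the function above:

```javascript
// Single-pass version, repeated here so the snippet is self-contained.
function arrayToTree(items) {
  const result = [];
  const itemMap = {};
  for (const item of items) {
    const { id, pid } = item;
    if (!itemMap[id]) {
      itemMap[id] = { children: [] };
    }
    itemMap[id] = { ...item, children: itemMap[id].children };
    const treeItem = itemMap[id];
    if (pid === 0) {
      result.push(treeItem);
    } else {
      if (!itemMap[pid]) {
        itemMap[pid] = { children: [] }; // placeholder for unseen parent
      }
      itemMap[pid].children.push(treeItem);
    }
  }
  return result;
}

// The child (id 2) appears before its parent (id 1);
// the placeholder entry handles this correctly.
const shuffled = [
  { id: 2, name: 'Sector 2', pid: 1 },
  { id: 1, name: 'Sector 1', pid: 0 },
];
const tree = arrayToTree(shuffled);
// tree → [{ id: 1, ..., children: [{ id: 2, ..., children: [] }] }]
```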

## A quick benchmark

| Method | 1,000 items | 10,000 items | 20,000 items | 50,000 items |
|---|---|---|---|---|
| Recursive | 154.596ms | 1.678s | 7.152s | 75.412s |
| No recursion, two passes | 0.793ms | 16.499ms | 45.581ms | 97.373ms |
| No recursion, one pass | 0.639ms | 6.397ms | 25.436ms | 44.719ms |

The test results show that as the number of items grows, the recursive implementation becomes slower and slower, its running time growing far faster than linearly.
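Measurements of this kind can be reproduced with a small harness along these lines; `makeData` is a hypothetical generator of random flat lists (not part of the original article), and the exact timings will of course vary by machine:

```javascript
// Hypothetical generator: n nodes, each parented to some earlier node,
// so the result is always a single tree rooted at id 1.
function makeData(n) {
  const items = [{ id: 1, name: 'Sector 1', pid: 0 }];
  for (let i = 2; i <= n; i++) {
    items.push({
      id: i,
      name: `Sector ${i}`,
      pid: 1 + Math.floor(Math.random() * (i - 1)), // random earlier id
    });
  }
  return items;
}

// Implementation under test: the single-pass version from above.
function arrayToTree(items) {
  const result = [];
  const itemMap = {};
  for (const item of items) {
    const { id, pid } = item;
    if (!itemMap[id]) {
      itemMap[id] = { children: [] };
    }
    itemMap[id] = { ...item, children: itemMap[id].children };
    const treeItem = itemMap[id];
    if (pid === 0) {
      result.push(treeItem);
    } else {
      if (!itemMap[pid]) {
        itemMap[pid] = { children: [] };
      }
      itemMap[pid].children.push(treeItem);
    }
  }
  return result;
}

const data = makeData(50000);
console.time('arrayToTree, 50000 items');
const tree = arrayToTree(data);
console.timeEnd('arrayToTree, 50000 items');
```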

## Conclusion

Do you think a senior front-end developer should be able to write this out smoothly? Share your opinion in the comments. And if you have a better implementation than the ones above, leave it in the comments so we can learn together.