Review of Algorithm (HITSZ), incl. 2022 Exam Recollection

    • 1. Time Analysis
      • 1.1 Basic
      • 1.2 Master Method
      • 1.3 Recurrence Problems
    • 2. Sorting Algorithm
      • 2.1 Comparing Sort
        • 2.1.1 Insertion Sort
        • 2.1.2 Merge Sort
        • 2.1.3 Shell Sort
        • 2.1.4 Lower boundary of comparison sort
      • 2.2 Linear Sort
        • 2.2.1 Counting Sort (Stable)
        • 2.2.2 Radix Sort (Stable)
        • 2.2.3 Bucket Sort
    • 3. Hash Table
      • 3.1 Hash method
      • 3.2 Collision
    • 4. BST
      • 4.1 Normal BST
      • 4.2 Treap
    • 5. RB Tree
    • 6. B Tree
    • 7. Augmenting Data Structures
    • 8. Skip Lists
    • 9. Amortized Analysis
      • 9.1 Basic Concept
      • 9.2 Examples
        • 9.2.1 Dynamic tables
        • 9.2.2 Binary Counter
    • 10. Dynamic Programming
    • 11. Greedy Algorithm
      • 11.1 Basic
      • 11.2 Examples
    • 12. Linear Programming
    • 13. FFT
    • 14. NP
      • 14.1 P, NP, NPC, NPH
      • 14.2 Solution for NPC problems
    • 15. 2022 Exam Recollection
      • 15.1 Multiple Choice
      • 15.2 Free-Response Questions

1. Time Analysis

1.1 Basic

NOTE:

  1. $\log_a b=\frac{\log_c b}{\log_c a}$
  2. $\lg n=\log_2 n$

  • $p(n)=O(f(n))$

    $p(n)\le cf(n)$

  • $p(n)=\Omega(f(n))$

    $p(n)\ge cf(n)$

  • $p(n)=\Theta(f(n))$

    $c_1f(n)\le p(n)\le c_2f(n)$

  • $p(n)=o(f(n))$

    $p(n)<cf(n)$

  • $p(n)=\omega(f(n))$

    $p(n)>cf(n)$

Solution: calculate $\lim_{n\rightarrow\infty}\frac{p(n)}{f(n)}$.

  • If $\lim_{n\rightarrow\infty}\frac{p(n)}{f(n)}=a\ne0$, then $p(n)=\Theta(f(n))$.
  • If $\lim_{n\rightarrow\infty}\frac{p(n)}{f(n)}=t\le a$, then $p(n)=O(f(n))$, since $\frac{p(n)}{f(n)}<\delta+t\le\delta+a$.
  • If $\lim_{n\rightarrow\infty}\frac{p(n)}{f(n)}=t\ge a$, then $p(n)=\Omega(f(n))$, since $\frac{p(n)}{f(n)}>t-\delta\ge a-\delta$.
  • If $\lim_{n\rightarrow\infty}\frac{p(n)}{f(n)}=0$, then $p(n)=o(f(n))$, since $\frac{p(n)}{f(n)}<\delta$.
  • If $\lim_{n\rightarrow\infty}\frac{p(n)}{f(n)}=\infty$, then $p(n)=\omega(f(n))$, since $\frac{p(n)}{f(n)}>N_0$.
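The limit test above can be sanity-checked numerically. A minimal sketch (the helper name `ratio` is my own; evaluating at one large $n$ only suggests the limit, it does not prove it):

```python
import math

def ratio(p, f, n=10**6):
    """Evaluate p(n)/f(n) at a large n to guess the limit."""
    return p(n) / f(n)

# p(n) = 3n^2 + n vs f(n) = n^2: the ratio tends to 3 != 0, so p = Theta(f)
print(ratio(lambda n: 3*n*n + n, lambda n: n*n))
# p(n) = n*ln(n) vs f(n) = n^2: the ratio tends to 0, so p = o(f)
print(ratio(lambda n: n*math.log(n), lambda n: n*n))
```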

1.2 Master Method

Given a recurrence of the form $T(n)=aT(\frac{n}{b})+f(n)$:

  • If $f(n)=O(n^{\log_b a-\epsilon})$ for some $\epsilon>0$, then $T(n)=\Theta(n^{\log_b a})$
  • If $f(n)=\Theta(n^{\log_b a})$, then $T(n)=\Theta(n^{\log_b a}\lg n)$
  • If $f(n)=\Omega(n^{\log_b a+\epsilon})$ for some $\epsilon>0$, and $af(n/b)\le cf(n)$ for some $c<1$, then $T(n)=\Theta(f(n))$
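For a polynomial driving term $f(n)=n^k$ the three cases can be mechanized. A sketch under that assumption (the function and its output strings are mine; non-polynomial $f$ is out of scope, and for $f(n)=n^k$ the regularity condition of case 3 holds automatically):

```python
import math

def master(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) by the master method."""
    c = math.log(a, b)                      # critical exponent log_b(a)
    if abs(k - c) < 1e-9:
        return f"Theta(n^{c:g} * lg n)"     # case 2
    if k < c:
        return f"Theta(n^{c:g})"            # case 1
    return f"Theta(n^{k:g})"                # case 3

print(master(2, 2, 1))  # merge sort
print(master(4, 2, 1))
print(master(3, 2, 2))
```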

1.3 Recurrence Problems

  • Big Integer Multiplication

    Split $X=a\cdot2^{n/2}+b$ and $Y=c\cdot2^{n/2}+d$. Then
    $XY=ac\cdot2^n+(ad+bc)\cdot2^{n/2}+bd$
    Time complexity:
    $T(n)=4T(n/2)+O(n)=O(n^2)$

    Divide and conquer (Karatsuba): reuse $ac$ and $bd$, using $ad+bc=(a+b)(c+d)-ac-bd$:
    $XY=ac\cdot2^n+((a+b)(c+d)-ac-bd)\cdot2^{n/2}+bd$
    Time complexity:
    $T(n)=3T(n/2)+O(n)=O(n^{\log_2 3})$

  • FFT

  • Chess Board Cover

    Put an "L"-shaped tromino into the center, covering one square in each quadrant that lacks the special square, then recurse on the four quadrants.

  • Binary Search
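The Karatsuba scheme above (three recursive multiplications via $ad+bc=(a+b)(c+d)-ac-bd$) can be sketched in Python, splitting by decimal digits instead of bits for readability (nonnegative integers assumed):

```python
def karatsuba(x, y):
    """Multiply nonnegative x*y with 3 recursive multiplications instead of 4."""
    if x < 10 or y < 10:                # base case: single digit
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    base = 10 ** m
    a, b = divmod(x, base)              # x = a*base + b
    c, d = divmod(y, base)              # y = c*base + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    ad_bc = karatsuba(a + b, c + d) - ac - bd   # = ad + bc
    return ac * base * base + ad_bc * base + bd

print(karatsuba(1234, 5678))  # 7006652
```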

2. Sorting Algorithm

2.1 Comparing Sort

2.1.1 Insertion Sort

Insert each number into the already-sorted prefix; the larger elements must be shifted right to make room.

Time complexity:
$T(n)=\sum_{i=1}^{n}(i-1)=O(n^2)$
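The shifting insertion described above, as a minimal sketch:

```python
def insertion_sort(a):
    """Sort list a in place by inserting a[i] into the sorted prefix a[:i]."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:    # up to i-1 comparisons -> O(n^2) total
            a[j + 1] = a[j]             # shift larger element right
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```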

2.1.2 Merge Sort

Divide the array into two halves, sort them separately, and merge them.

Time complexity:
$T(n)=2T(\frac{n}{2})+O(n)$
By the master method: $a=2$, $b=2$, $f(n)=cn$, so $n^{\log_b a}=n$ and $f(n)=\Theta(n)$ (case 2). Thus
$T(n)=\Theta(n\lg n)$
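A minimal sketch of the divide, sort, merge scheme:

```python
def merge_sort(a):
    """Split in half, sort halves recursively, merge: T(n) = 2T(n/2) + O(n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):     # O(n) merge step
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```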

2.1.3 Shell Sort

See the Shell sort animation on bilibili (希尔排序动图_哔哩哔哩_bilibili).

Time complexity:
$\approx O(N^{1.25})$

2.1.4 Lower boundary of comparison sort

$\Theta(n\lg n)$

2.2 Linear Sort

2.2.1 Counting Sort (Stable)

  • Flow:

    • Input array $A$.
    • Initialize array $C$ to all zeros. ($O(k)$)
    • Count each element of $A$, recording the counts in $C$. ($O(n)$)
    • Accumulate array $C$ into prefix sums. ($O(k)$)
    • Place the elements, sorted, into array $B$. ($O(n)$)
  • Time complexity: $O(n+k)$, where $n$ is the number of inputs and $k$ is the maximal value.
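The five-step flow above, as a minimal sketch:

```python
def counting_sort(a, k):
    """Stable counting sort of integer values in 0..k, O(n + k)."""
    c = [0] * (k + 1)               # initialize C to zeros, O(k)
    for x in a:                     # count occurrences, O(n)
        c[x] += 1
    for v in range(1, k + 1):       # prefix sums, O(k)
        c[v] += c[v - 1]
    b = [0] * len(a)
    for x in reversed(a):           # place backwards to keep stability, O(n)
        c[x] -= 1
        b[c[x]] = x
    return b

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3], 5))  # [0, 0, 2, 2, 3, 3, 3, 5]
```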

2.2.2 Radix Sort (Stable)

  • Run counting sort group by group, from the least significant digit group to the most significant.
  • Time complexity: $O(\frac{b}{r}(n+2^r))=O(n)$, where $b$ is the number of bits of the compared numbers and $r$ is the number of bits per group, so there are $\frac{b}{r}$ groups. We do not want $r>\lg n$.

2.2.3 Bucket Sort

  • Divide the input array into several buckets, run insertion sort in each bucket, and concatenate the buckets together.
  • Expected time: $\Theta(n)$
  • Worst-case performance: $O(n^2)$
  • How to improve? Replace insertion sort with a more efficient sort, e.g. merge sort.

3. Hash Table

3.1 Hash method

  • Division method: $h(k)=k \bmod m$. Do not choose $m=2^r$, since that uses only the low-order bits of $k$.
  • Multiplication method: $h(k)=(Ak \bmod 2^w)\ \mathrm{rsh}\ (w-r)$, where $w$ is the word size, the table size is $m=2^r$, and $A$ is an odd integer in the range $2^{w-1}<A<2^w$.
  • With chaining, an unsuccessful search requires $1+\alpha$ expected time, where $\alpha=n/m$ is the load factor.

3.2 Collision

NOTE:

  1. $A \bmod B = A-(A\ \mathrm{div}\ B)\cdot B$
  2. $(A+B)\bmod N=(A\bmod N+B\bmod N)\bmod N$
  • Assume $h(k)=k\bmod m$.
  • Linear probing: $h'(k,i)=(k\bmod m+i)\bmod m$
  • Quadratic probing: $h'(k,i)=(k\bmod m+c_1i+c_2i^2)\bmod m$
  • Double hashing: $h'(k,i)=(k\bmod m+i\cdot h_2(k))\bmod m$
  • Theorem: in an open-addressed hash table with load factor $\alpha<1$, the expected number of probes in an unsuccessful search is at most $1/(1-\alpha)$. That is to say:
    • If the table is half full, the expected number of probes is $1/(1-0.5)=2$.
    • If the table is 90% full, the expected number of probes is $1/(1-0.9)=10$.
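A minimal open-addressing sketch with linear probing, $h'(k,i)=(k\bmod m+i)\bmod m$. The class name is mine; deletion is deliberately omitted, since it would need tombstones:

```python
class LinearProbingTable:
    """Open addressing with linear probing; insert and search only."""
    def __init__(self, m):
        self.m = m
        self.slots = [None] * m

    def insert(self, k):
        for i in range(self.m):
            j = (k % self.m + i) % self.m       # probe sequence h'(k, i)
            if self.slots[j] is None or self.slots[j] == k:
                self.slots[j] = k
                return j
        raise RuntimeError("table full")

    def search(self, k):
        for i in range(self.m):
            j = (k % self.m + i) % self.m
            if self.slots[j] is None:           # empty slot ends the search
                return None
            if self.slots[j] == k:
                return j
        return None

t = LinearProbingTable(7)
for k in (10, 17, 3):       # 10 and 17 both hash to slot 3
    t.insert(k)
print(t.search(17))         # 4: pushed one slot past 10
```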

4. BST

4.1 Normal BST

  • Keys in the left subtree are less than the root.

  • Keys in the right subtree are greater than the root.

  • Deletion:

    • No children: delete directly.
    • One child: delete the node and link its child to the node's parent.
    • Two children: replace the node with its successor, then apply case 1 or 2 to delete the successor.
  • A randomly built BST has $O(\lg n)$ expected search time.
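The three deletion cases can be sketched as follows (a minimal recursive BST, not a production implementation):

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def delete(root, key):
    """The three deletion cases from the notes."""
    if root is None:
        return None
    if key < root.key:
        root.left = delete(root.left, key)
    elif key > root.key:
        root.right = delete(root.right, key)
    else:
        if root.left is None:           # no child, or only a right child
            return root.right
        if root.right is None:          # only a left child
            return root.left
        succ = root.right               # two children: find the successor
        while succ.left:
            succ = succ.left
        root.key = succ.key             # copy successor up, then delete it
        root.right = delete(root.right, succ.key)
    return root

def inorder(root):
    return inorder(root.left) + [root.key] + inorder(root.right) if root else []

root = None
for k in (8, 3, 10, 1, 6, 14):
    root = insert(root, k)
root = delete(root, 3)                  # the two-children case
print(inorder(root))                    # [1, 6, 8, 10, 14]
```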

4.2 Treap

A treap is a BST whose nodes also carry a random priority maintained in heap order; this simulates a randomly built BST.

5. RB Tree

  • Every node is either red or black.
  • The root is always black.
  • Every leaf (NIL) is black.
  • A red node cannot have a red child.
  • Every path from a node down to its descendant leaves contains the same number of black nodes (the black-height).
  • Theorem: an RB tree with $n$ internal nodes has height $h\le2\lg(n+1)$, i.e. $h=O(\lg n)$.
    • In my opinion: the total number of nodes is $N\le n+n-1=2n-1$, so $h\le\lg(2n-1)=O(\lg n)$.

6. B Tree

  • Minimum degree $t$ (out degree): each node holds at least $t-1$ keys and at most $2t-1$ keys.

  • Split full nodes during insertion.

  • $h\le\log_t\frac{n+1}{2}$, where $n$ is the number of keys.


  • Deletion: make sure each node keeps at least $t-1$ keys, and merge nodes that can be merged, i.e., two siblings that both have exactly $t-1$ keys.

7. Augmenting Data Structures

  • Order-Statistic Tree (OS-Tree): each node records the number of elements in its subtree.
  • Interval Tree: each node represents an interval $[a,b]$ and maintains a $max$ field, which satisfies:

$max=\max\{node.high,\ node.left.max,\ node.right.max\}$

  • Search for interval $i$ in the Interval Tree:
    • Check whether the root overlaps $i$. If it does, return the root.
    • Otherwise, if $root.left.max\ge i.low$, go left and recurse.
    • Otherwise, go right and recurse.

8. Skip Lists

  • Each node is promoted to the next level with probability $1/2$.
  • Multiple levels of indexing.
  • Idea: a skip list with $\lg n$ linked lists behaves like a binary tree (cf. B+ tree).
  • With high probability, every search in an $n$-element skip list costs $O(\lg n)$.

9. Amortized Analysis

9.1 Basic Concept

  • Aggregated: the total cost of $n$ operations divided by $n$:
    $T=\frac{\sum_{i=1}^n c_i}{n}$

  • Accounting: charge $c$ for each normal-case operation, which actually costs $m$; deposit the difference $c-m$ in the bank for each normal-case operation. When the expensive operation comes, pay for it out of the bank. Make sure the bank balance is always nonnegative.
    $T=\frac{\sum_{i=1}^n c}{n}=c$

  • Potential: design a potential function $\phi(D_i)$, s.t. $c_i'=c_i+\phi(D_i)-\phi(D_{i-1})$. Make sure that $\phi(D_0)=0$ and $\phi(D_i)\ge0$.

9.2 Examples

9.2.1 Dynamic tables

  • Aggregated
    $c_i=\begin{cases}i, & i-1=2^n\\ 1, & \text{otherwise}\end{cases}$

    $\sum_{i=1}^n c_i\le n+\sum_{j=1}^{\lg(n-1)}2^j\le3n=\Theta(n)$

    Thus,
    $c_i'=\Theta(n)/n=\Theta(1)$

  • Accounting

    Charge $c_i'=3$ for each operation: $1$ pays for the insertion itself and $2$ is stored for the next extension. When the table is extended, $1$ pays for re-inserting an old item and $1$ pays for inserting a new item.


  • Potential

    Define
    $\phi(D_i)=2i-2^{\lceil\lg i\rceil}$
    So that
    $c_i'=c_i+\phi(D_i)-\phi(D_{i-1})$
    Thus, we have $c_i'=3$.

9.2.2 Binary Counter

  • Aggregated

    Sum the flip counts of all bits over $n$ increments: bit $j$ flips $\lfloor n/2^j\rfloor$ times, so the total is
    $n+\frac{n}{2}+\frac{n}{4}+\dots<2n$
    So $c_i'=2$.

  • Accounting

    The operation $0\rightarrow1$ stores $2$ and pays $1$; the operation $1\rightarrow0$ pays $1$ out of the stored credit and stores $0$. Thus, for $n$ operations, we charge at most $2n$ in total, so $c_i'=2$.

  • Potential

    Define
    $\phi(D_i)=\text{number of 1s in the counter}$
    Thus,
    $c_i'=c_i+\phi(D_i)-\phi(D_{i-1})$
    Let operation $i$ change $t_i$ bits from $1$ to $0$. At most one bit is changed from $0$ to $1$ (but be careful with the situation where the counter overflows). Thus
    $\phi(D_i)\le\phi(D_{i-1})-t_i+1$
    So that
    $c_i'\le(t_i+1)+1-t_i=2$
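The aggregate bound above can be checked empirically. A minimal simulation sketch (the function name is mine):

```python
def increment_flips(n_ops, bits=32):
    """Simulate n_ops increments of a binary counter; count total bit flips."""
    counter = [0] * bits
    flips = 0
    for _ in range(n_ops):
        i = 0
        while i < bits and counter[i] == 1:   # 1 -> 0 flips while carrying
            counter[i] = 0
            flips += 1
            i += 1
        if i < bits:                          # one 0 -> 1 flip
            counter[i] = 1
            flips += 1
    return flips

for n in (10, 100, 1000):
    print(n, increment_flips(n))   # total flips stays below 2n
```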

10. Dynamic Programming

  • Matrix-chain Multiplication

    The minimal number of scalar multiplications to compute $A_1 A_2\cdots A_n$, where $A_i$ has dimensions $p_{i-1}\times p_i$.

    Let $Min[i,j]$ denote the minimal number of multiplications for $A_i\cdots A_j$:
    $Min[i,j]=\begin{cases}0 & i=j\\ \min_{i\le k<j}\{Min[i,k]+Min[k+1,j]+p_{i-1}p_kp_j\} & i<j\end{cases}$

  • LCS problem

    The longest common subsequence of $X=x_1x_2...x_m$ and $Y=y_1y_2...y_n$.

    Let $LCS[i,j]$ denote the length of the longest common subsequence of $X_i=x_1...x_i$ and $Y_j=y_1...y_j$.

    Thus,
    $LCS[i,j]=\begin{cases}0 & i=0\ \text{or}\ j=0\\ LCS[i-1,j-1]+1 & x_i=y_j\\ \max\{LCS[i-1,j],LCS[i,j-1]\} & x_i\ne y_j\end{cases}$

  • Triangle Decomposition of Convex Polygon

    Find the triangle decomposition with the minimal total weight.

    Let $Min[i,j]$ denote the minimal-weight triangle decomposition of the polygon on vertices $i$ to $j$.

    Thus,
    $Min[i,j]=\begin{cases}0 & i=j\\ \min_{i<k<j}\{Min[i,k]+Min[k,j]+w(v_iv_jv_k)\} & i\ne j\end{cases}$

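The LCS recurrence can be filled in bottom-up. A minimal sketch:

```python
def lcs_length(X, Y):
    """Bottom-up DP for the LCS recurrence: L[i][j] is the LCS length of
    the prefixes X[:i] and Y[:j]."""
    m, n = len(X), len(Y)
    L = [[0] * (n + 1) for _ in range(m + 1)]   # row 0 / column 0 stay 0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:            # x_i = y_j
                L[i][j] = L[i - 1][j - 1] + 1
            else:                               # x_i != y_j
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    return L[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4, e.g. "BCBA"
```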

11. Greedy Algorithm

11.1 Basic

To prove that a greedy algorithm works, you should:

  • First, show the optimal substructure: given an optimal solution of the problem, prove that its sub-solution is also optimal for the subproblem, i.e., there is a recursive structure as in a DP problem.
  • Second, show that the greedy choice is contained in some optimal solution.

11.2 Examples

  • Activities arrangement
  • 0-1 Knapsack
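For activities arrangement, the classic greedy sorts by finish time and repeatedly takes the first compatible activity (the 2022 exam variant picks the latest-starting activity instead, which is the mirror image of the same argument). A minimal sketch of the finish-time version:

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, always take the
    first activity compatible with the last one chosen."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:        # compatible with the chosen set
            chosen.append((start, finish))
            last_finish = finish
    return chosen

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(acts))          # 3 activities: (1,4), (5,7), (8,11)
```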

12. Linear Programming

  • Simplex Algorithm
    • Build the slack form, where each variable is constrained to be $\ge0$.
    • Pivot: pick a variable to increase, and swap it with the slack variable of the tightest constraint, i.e., the one that first reaches $0$.
    • Repeat until the answer is found.

13. FFT

To solve an FFT problem, for example:

Compute the DFT of the input $(x(0),x(1),x(2),x(3),x(4),x(5),x(6),x(7))$ by using the FFT.

  • First, determine the leaves (bit-reversed order). In this example, the leaves are

    $(x(0),x(4))$, $(x(2),x(6))$, $(x(1),x(5))$, $(x(3),x(7))$

  • Second, draw the butterfly diagram.

    There are two things you should note:

    1. The twiddle factors match up across levels: $W_8^0=W_4^0$, $W_8^2=W_4^1$, and $1=W_2^0$.
    2. The sign is positive for the first (even) output half and negative for the second (odd) output half of each butterfly.


  • Third, compute from the leaves upward, for example
    $$\begin{aligned} X(0)&=A(0)+W_8^0B(0)\\ &=(C(0)+D(0))+W_8^0(E(0)+F(0))\\ &=(x(0)+x(4)+x(2)+x(6))+W_8^0(x(1)+x(5)+x(3)+x(7))\\ &=\sum_{i=0}^{7}x(i) \end{aligned}$$
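The whole butterfly computation is just the recursive even/odd split. A minimal radix-2 FFT sketch (input length must be a power of two):

```python
import cmath

def fft(x):
    """Recursive radix-2 FFT: split into even/odd indices; the leaf order
    above is exactly this recursive even/odd split."""
    n = len(x)
    if n == 1:
        return x
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0] * n
    for k in range(n // 2):
        w = cmath.exp(-2j * cmath.pi * k / n)   # twiddle factor W_n^k
        out[k] = even[k] + w * odd[k]           # '+' for the top half
        out[k + n // 2] = even[k] - w * odd[k]  # '-' for the bottom half
    return out

X = fft([0, 1, 2, 3, 4, 5, 6, 7])
print(round(X[0].real))   # 28: X(0) is the plain sum of the inputs
```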

14. NP

14.1 P, NP, NPC, NPH

  • P: can be solved and verified in polynomial time.

  • NP: can be verified in polynomial time.

  • NP-hard: every NP problem can be reduced to it.

  • NPC: ① it is in NP; ② every NP problem can be reduced to it.

  • The relations

    • If $NP\ne P$: $P$ and $NPC$ are disjoint subsets of $NP$, and $NPH$ reaches beyond $NP$.
    • If $NP=P$: every NP problem can be solved in polynomial time, so any NP problem reduces to any other non-trivial problem; then every NP problem is NP-hard, and hence NPC.


14.2 Solution for NPC problems

  • Exponential solution

    • Backtracking (DFS + prune)
    • Dynamic programming
  • Approximate algorithm

    • Approximation ratio

      Assume the best solution is $c^*$ and the approximate solution is $c$. The approximation ratio $\rho(n)$ satisfies:
      $\max\{\frac{c^*}{c},\frac{c}{c^*}\}\le\rho(n)$

    • Relative error $\epsilon(n)$
      $|\frac{c-c^*}{c^*}|\le\epsilon(n)$

15. 2022 Exam Recollection

Multiple choice + free-response questions

15.1 Multiple Choice

5 questions in total

  1. Time complexity of the FFT ($O(n\lg n)$)
  2. Time complexity of red-black tree insertion ($O(\lg n)$)
  3. For a B-tree of degree t, determine the number of keys and children
  4. Which formula is linear probing for hashing
  5. Forgotten

15.2 Free-Response Questions

  1. Treap insertion
  2. Computing a DFT using the FFT
  3. Recursion-tree analysis of $T(n)=3T(n/4)+cn^2$
  4. Linear programming: maximize $18x+12.5y$
  5. Prove the activity-selection greedy that picks the latest-starting activity
  6. BST and red-black tree: fill values into a given tree shape to make it a BST; the maximal and minimal numbers of internal nodes of an RB tree with black-height k; turn a BST into an RB tree with one rotation
  7. B-tree + augmenting data structures
  8. DP problem: merging stone piles

Overall it was not too hard, but the proof in question 7 was really hard to come up with; hoping for partial credit~
