Bubble sort (Bubble Sort) is a simple and intuitive sorting algorithm. It repeatedly walks through the sequence to be sorted, compares adjacent elements two at a time, and swaps them if they are in the wrong order. These passes over the sequence are repeated until no more swaps are needed, which means the sequence is sorted. The algorithm's name comes from the way smaller elements slowly "float" (bubble) up to the top of the list through repeated swaps.
1. Compare adjacent elements. If the first is larger than the second, swap them.
2. Do the same for every pair of adjacent elements, from the first pair at the start to the last pair at the end. After this pass, the last element is the largest.
3. Repeat the steps above for all elements except the last one.
4. Keep repeating over fewer and fewer elements each time, until there are no more pairs to compare.
def bubbleSort(arr):
    n = len(arr)
    # Traverse all array elements
    for i in range(n):
        # Last i elements are already in place
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    print(arr)

arr = [64, 34, 25, 12, 22, 11, 90]
bubbleSort(arr)
An easy optimization: set a flag variable to False at the start of each pass, and set it to True whenever two elements swap positions. At the end of a pass, check the flag: if it is still False, no swaps occurred, so break out of the loop and end the sort early, because the sequence is already in order.
def bubble_sort(items):
    for i in range(len(items) - 1):
        flag = False
        for j in range(len(items) - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                flag = True
        print(items)
        if not flag:
            break
    return items

items = [1, 2, 6, 4, 5]
print(bubble_sort(items))
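To see what the early-exit flag actually buys, a small instrumented sketch can count how many passes each variant performs on a nearly sorted list. The counting variable `passes` and the `early_exit` switch are added here purely for illustration; they are not part of the algorithm itself:

```python
def bubble_sort_counted(items, early_exit=True):
    """Bubble sort that also returns the number of passes performed."""
    items = list(items)  # work on a copy so the input is not mutated
    passes = 0
    for i in range(len(items) - 1):
        passes += 1
        swapped = False
        for j in range(len(items) - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        # With early exit, a pass with no swaps means we are done
        if early_exit and not swapped:
            break
    return items, passes

nearly_sorted = [1, 2, 4, 3, 5, 6, 7, 8]
print(bubble_sort_counted(nearly_sorted, early_exit=True))   # → ([1, 2, 3, 4, 5, 6, 7, 8], 2)
print(bubble_sort_counted(nearly_sorted, early_exit=False))  # → ([1, 2, 3, 4, 5, 6, 7, 8], 7)
```

On this almost-sorted input the flagged version stops after two passes (one to fix the out-of-place pair, one to confirm nothing moved), while the plain version always runs all n - 1 passes.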
Quicksort uses a divide-and-conquer (Divide and conquer) strategy: it splits a sequence (list) into a "smaller" and a "larger" subsequence, then recursively sorts the two subsequences.
The steps are:
1. Pick an element from the sequence as the pivot (reference value).
2. Partition: move every element smaller than the pivot before it and every element greater than or equal to it after it.
3. Recursively apply the steps above to the two resulting subsequences.
The recursion bottoms out when a subsequence has zero or one element, at which point it is obviously already sorted.
There are several concrete ways to choose the pivot, and this choice has a decisive influence on the sort's time performance.
def quick_sort(arr):
    if len(arr) < 2:
        return arr
    # Pick a pivot; any element works, the middle one is easy to follow
    mid = arr[len(arr) // 2]
    # Two lists for the elements on either side of the pivot
    left = []
    right = []
    # Remove the pivot from the original array
    arr.remove(mid)
    for item in arr:
        if item >= mid:
            # Greater than or equal to the pivot goes to the right
            right.append(item)
        else:
            # Less than the pivot goes to the left
            left.append(item)
    # Recursively sort both sides and join them around the pivot
    return quick_sort(left) + [mid] + quick_sort(right)

arr = [64, 34, 25, 12, 22, 11, 90]
print(quick_sort(arr))
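The same partition-into-left-and-right idea is often written more compactly with list comprehensions. This sketch (the name `quick_sort_comp` is mine, not from the text above) also avoids the `arr.remove(mid)` mutation by slicing the pivot out by index instead:

```python
def quick_sort_comp(arr):
    """Functional-style quicksort: returns a new sorted list, input untouched."""
    if len(arr) < 2:
        return arr
    pivot_index = len(arr) // 2
    pivot = arr[pivot_index]
    # Everything except the pivot element itself
    rest = arr[:pivot_index] + arr[pivot_index + 1:]
    left = [x for x in rest if x < pivot]    # less than the pivot goes left
    right = [x for x in rest if x >= pivot]  # greater or equal goes right
    return quick_sort_comp(left) + [pivot] + quick_sort_comp(right)

print(quick_sort_comp([64, 34, 25, 12, 22, 11, 90]))
# → [11, 12, 22, 25, 34, 64, 90]
```

Removing the pivot by index rather than by value keeps duplicate keys intact, and slicing first means the caller's list is never modified.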
An in-place variant uses two cursors (pointers), which avoids allocating extra lists.
The specific steps:
1. Define three variables: to_sort holds the value currently being placed, initialized to the first element of the array; low is the left cursor, starting at the same position as to_sort, the first index of the array; high is the right cursor, starting at the last index of the array.
2. The first element of the subarray is the value to be placed. Since it has already been saved in to_sort, its slot is effectively a "pit": a space that can safely be overwritten. Start with the high cursor and walk it left until it finds a value less than to_sort, then drop that value into the pit at low (the value at low is overwritten, but high's value has just been copied to low, so high's position becomes the new pit). Now it is low's turn: move low right until it finds a value greater than or equal to to_sort (equal values are uniformly sent to the right side), then drop that value into the pit at high. Then high continues, and the two cursors keep moving toward each other until they meet. The meeting position is the final pit (its old value has already been saved elsewhere), so drop to_sort into it, completing the first pass.
3. After the first pass, the elements on each side of the placed value form their own subarrays. Repeat step 2 for each subarray; when a subarray has only one value left, it is already sorted. This is a recursive process.
def quick_sort(arr, start, end):
    # Base case of the recursion
    if start >= end:
        return
    # Three variables: the value being placed (the first element, saved
    # in to_sort), the left cursor low, and the right cursor high
    to_sort = arr[start]
    # Left cursor
    low = start
    # Right cursor
    high = end
    # When low >= high, this element's final position has been found
    while low < high:
        # Move the right cursor: while its value is greater than or equal to
        # to_sort keep moving left; once that fails, drop the value into the
        # pit at the other cursor
        while low < high and arr[high] >= to_sort:
            high -= 1
        arr[low] = arr[high]
        # Move the left cursor: while its value is less than to_sort keep
        # moving right; once that fails, drop the value into the pit at the
        # other cursor
        while low < high and arr[low] < to_sort:
            low += 1
        arr[high] = arr[low]
    # One pit remains, at low; put to_sort into it
    arr[low] = to_sort
    # Recurse on the subarrays to the left and right of the placed value
    quick_sort(arr, start, low - 1)
    quick_sort(arr, low + 1, end)

arr1 = [20, 87, 20, 89, 20, 20, 10]
quick_sort(arr1, 0, len(arr1) - 1)
print(arr1)
The best case is that every value we place splits its interval into two equal subarrays.
Pass level 1: yields 2 subarrays, 1 element placed, cost n
Pass level 2: yields 4 subarrays, 2 + 1 elements placed, cost n
Pass level 3: yields 8 subarrays, 4 + 2 + 1 elements placed, cost n
And so on: sorting n elements takes about log2(n + 1) levels of passes, and the total cost per level is fixed at n, so the time complexity is O(n log n).
The worst case is that every value we place lands at the edge of its interval: all remaining elements fall on one side of to_sort and none on the other.
Pass 1: yields 1 subarray, 1 element placed, cost n
Pass 2: yields 1 subarray, 1 + 1 elements placed, cost n
Pass 3: yields 1 subarray, 1 + 1 + 1 elements placed, cost n
And so on: sorting n elements takes n passes, each costing up to n, so the time complexity is O(n²).
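The two cases can be observed directly by measuring recursion depth. This instrumented copy of the in-place version (the return value and the name `quick_sort_depth` are added here for illustration) always picks the first element as the pivot, so an already sorted array is exactly the worst case described above:

```python
import random

def quick_sort_depth(arr, start, end):
    """In-place quicksort (first element as pivot) that returns its recursion depth."""
    if start >= end:
        return 0
    to_sort = arr[start]
    low, high = start, end
    while low < high:
        while low < high and arr[high] >= to_sort:
            high -= 1
        arr[low] = arr[high]
        while low < high and arr[low] < to_sort:
            low += 1
        arr[high] = arr[low]
    arr[low] = to_sort
    # Depth is 1 plus the deeper of the two recursive calls
    return 1 + max(quick_sort_depth(arr, start, low - 1),
                   quick_sort_depth(arr, low + 1, end))

n = 63
already_sorted = list(range(n))
shuffled = list(range(n))
random.seed(0)          # fixed seed so the run is repeatable
random.shuffle(shuffled)

print(quick_sort_depth(already_sorted, 0, n - 1))  # → 62, i.e. n - 1: the worst case
print(quick_sort_depth(shuffled, 0, n - 1))        # far shallower, near the log2(n) best case
```

On the sorted input every pivot is the minimum, so each pass only peels off one element and the recursion is n - 1 levels deep; on a shuffled input the splits are roughly balanced, so the depth stays close to log2(n).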