Computer Science

Quick Sort

Quick Sort is a sorting algorithm that uses a divide-and-conquer approach to sort an array of elements. It works by selecting a pivot element and partitioning the array into two sub-arrays, one with elements smaller than the pivot and the other with elements larger than the pivot. The process is then repeated recursively on each sub-array until the entire array is sorted.

Written by Perlego with AI-assistance

12 Key excerpts on "Quick Sort"

  • The Complete Coding Interview Guide in Java
    An effective guide for aspiring Java developers to ace their programming interviews

    The time complexity is O(n log n) on average and O(n²) in the worst case. The space complexity is O(log n) or O(n).
    The Quick Sort algorithm debuts with an important choice. We have to choose one of the elements of the given array as the pivot . Next, we partition the given array so that all the elements that are less than the pivot come before all the elements that are greater than it. The partitioning operation takes place via a bunch of swaps. This is the divide step in divide and conquer .
    Next, the left and the right sub-arrays are again partitioned using the corresponding pivot. This is achieved by recursively passing the sub-arrays into the algorithm. This is the conquer step in divide and conquer .
    The worst-case scenario (O(n²)) takes place when all the elements of the given array are smaller than the chosen pivot or larger than the chosen pivot. Choosing the pivot element can be done in at least four ways, as follows:
    • Choose the first element as the pivot.
    • Choose the last element as the pivot.
    • Choose the median element as the pivot.
    • Choose a random element as the pivot.
    Consider the array 4, 2, 5, 1, 6, 7, 3. Here, we're going to set the pivot to the last element. The following diagram depicts how Quick Sort works:
    Figure 14.4 – Quick Sort
    Step 1 : We choose the last element as the pivot, so 3 is the pivot. Partitioning begins by locating two position markers – let's call them i and m . Initially, both point to the first element of the given array. Next, we compare the element at position i with the pivot, so we compare 4 with 3. Since 4 > 3, there is nothing to do, and i becomes 1 (i ++), while m remains 0.
    Step 2 : We compare the element at position i with the pivot, so we compare 2 with 3. Since 2 < 3, we swap the element at position m with the element at position i , so we swap 4 with 2. Both m and i are increased by 1, so m becomes 1 and i becomes 2.
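    A minimal Java sketch of the marker-based partition described above, assuming the last element is chosen as the pivot; the helper names partition and swap are illustrative, not taken from the book:

        // Partition a[lo..hi] around the last element; returns the pivot's final index.
        static int partition(int[] a, int lo, int hi) {
            int pivot = a[hi];       // the last element is the pivot
            int m = lo;              // m marks the boundary of the "less than pivot" region
            for (int i = lo; i < hi; i++) {
                if (a[i] < pivot) {  // element belongs to the left of the pivot
                    swap(a, m, i);
                    m++;
                }
            }
            swap(a, m, hi);          // drop the pivot between the two regions
            return m;                // the pivot's final, sorted position
        }

        static void swap(int[] a, int x, int y) {
            int t = a[x]; a[x] = a[y]; a[y] = t;
        }

    For the example array 4, 2, 5, 1, 6, 7, 3 this leaves the pivot 3 at index 2, with 2 and 1 to its left and the larger values to its right.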
  • Hands-On Data Structures and Algorithms with Python
    Quicksort is an efficient sorting algorithm. The quicksort algorithm is based on the divide-and-conquer class of algorithms, similar to the merge sort algorithm, where we break (divide) a problem into smaller chunks that are much simpler to solve, and further, the final results are obtained by combining the outputs of smaller problems (conquer).
    The concept behind quicksorting is partitioning a given list or array. To partition the list, we first select a data element from the given list, which is called a pivot element.
    We can choose any element as a pivot element in the list. However, for the sake of simplicity, we'll take the first element in the array as the pivot element. Next, all the elements in the list are compared with this pivot element. At the end of the first iteration, all the elements of the list are arranged in such a way that the elements that are less than the pivot element are placed to the left of the pivot, and the elements that are greater than the pivot element are placed to the right of the pivot.
    Now, let’s understand the working of the quicksort algorithm with an example.
    In this algorithm, firstly we partition the given list of unsorted data elements into two sublists in such a way that all the elements on the left side of that partition point (also called a pivot) should be smaller than the pivot, and all the elements on the right side of the pivot should be greater. This means that elements of the left sublist and the right sublist will be unsorted, but the pivot element will be at its correct position in the complete list. This is shown in Figure 11.16 .
    Therefore, after the first iteration of the quicksort algorithm, the chosen pivot point is placed in the list at its correct position, and after the first iteration, we obtain two unordered sublists and follow the same process again on these two sublists. Thus, the quicksort algorithm partitions the list into two parts and recursively applies the quicksort algorithm to these two sublists to sort the whole list:
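    A small Java sketch of this recursive structure (an illustration, not the book's Python code); it assumes a partition helper, such as the one sketched earlier, that returns the pivot's final index:

        // Sort a[lo..hi] in place by partitioning and recursing on both sublists.
        static void quicksort(int[] a, int lo, int hi) {
            if (lo >= hi) return;            // zero or one element: nothing to sort
            int p = partition(a, lo, hi);    // the pivot lands at its correct position p
            quicksort(a, lo, p - 1);         // sort the left, still-unsorted sublist
            quicksort(a, p + 1, hi);         // sort the right, still-unsorted sublist
        }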
  • AP Computer Science A Premium, 2024: 6 Practice Tests + Comprehensive Review + Online Practice
    1. The major disadvantage of merge sort is that it needs a temporary array that is as large as the original array to be sorted. This could be a problem if space is a factor.
    2. Merge sort is not affected by the initial ordering of the elements. Thus, best, worst, and average cases have similar run times.

    Quicksort

    For large n, quicksort is, on average, the fastest known sorting algorithm. Here is a recursive description of how quicksort works:
    If there are at least two elements in the array:
        Partition the array.
        Quicksort the left subarray.
        Quicksort the right subarray.
    The partition method splits the array into two subarrays as follows: a pivot element is chosen at random from the array (often just the first element) and placed so that all items to the left of the pivot are less than or equal to the pivot, whereas those to the right are greater than or equal to it.
    For example, if the array is 4, 1, 2, 7, 5, −1, 8, 0, 6, and a[0] = 4 is the pivot, the partition method produces
    Here's how the partitioning works: Let a[0] , 4 in this case, be the pivot. Markers up and down are initialized to index values 0 and n − 1, as shown. Move the up marker forward until a value greater than the pivot is found, or up equals down . Move the down marker backward until a value less than the pivot is found, or down equals up . Swap a[up] and a[down] . Continue the process until down equals up . This is the pivot position. Swap a[0] and a[pivotPosition] .
    Notice that the pivot element, 4, is in its final sorted position.
    Analysis of quicksort:
    1. For the fastest run time, the array should be partitioned into two parts of roughly the same size.
    2. If the pivot happens to be the smallest or largest element in the array, the split is not much of a split—one of the subarrays is empty! If this happens repeatedly, quicksort degenerates into a slow, recursive version of selection sort and is very inefficient.
    3. The worst case for quicksort occurs when the partitioning algorithm repeatedly divides the array into pieces of size 1 and n − 1 (a short recurrence sketch follows this list).
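    A quick back-of-the-envelope recurrence (not from the book) shows where the quadratic worst case comes from when every partition yields pieces of size 1 and n − 1:

        T(n) = T(n-1) + cn, \qquad T(1) = c
        \Rightarrow\ T(n) = c \sum_{k=1}^{n} k = \frac{c\,n(n+1)}{2} = O(n^2)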
  • Learning JavaScript Data Structures and Algorithms
    Next, we will add every remaining value from the left array ({9}) to the merged result array and will do the same for the remaining values from the right array. At the end, we will return a merged array.
    If we execute the mergeSort function, this is how it will be executed:
    Note that first the algorithm splits the original array until it has smaller arrays with a single element, and then it starts merging. While merging, it does the sorting as well until we have the original array completely back together and sorted.

    The Quick Sort

    The Quick Sort is probably the most used sorting algorithm. It has a complexity of O(n log n) , and it usually performs better than other O(n log n) sorting algorithms. Similarly to the merge sort, it also uses the divide-and-conquer approach, dividing the original array into smaller ones (but without splitting them as the merge sort does) to do the sorting.
    The Quick Sort algorithm is a little bit more complex than the other ones you have learned so far. Let's learn it step by step, as follows:
    1. First, we need to select a value from the array called the pivot , which will be the value at the middle of the array.
    2. We will create two pointers (references): the left-hand one will point to the first value of the array, and the right-hand one will point to the last value. We will move the left pointer until we find a value that is bigger than the pivot, and we will move the right pointer until we find a value that is less than the pivot, and then swap them. We will repeat this process until the left-hand pointer passes the right-hand pointer. This process places the values lower than the pivot before the pivot reference and the values greater than the pivot after it. This is called the partition operation.
  • Programming Interviews Exposed
    Coding Your Way Through the Interview

    • John Mongan, Noah Suojanen Kindler, Eric Giguère (Authors)
    • 2018 (Publication Date)
    • Wrox (Publisher)
    Insertion sort's average and worst-case running times are O(n²), however, so it's not the best algorithm to use for large amounts of randomly ordered data.
    As the preceding implementation shows, insertion sort is a stable, in-place sorting algorithm that is especially suitable for sorting small data sets and is often used as a building block for other, more complicated sorting algorithms.

    Quicksort

    Quicksort is a divide-and-conquer algorithm that involves choosing a pivot value from a data set and then splitting the set into two subsets: a set that contains all values less than the pivot and a set that contains all values greater than or equal to the pivot. The pivot/split process is recursively applied to each subset until there are no more subsets to split. The results are combined to form the final sorted set.
    A naïve implementation of this algorithm looks like:
    // Sort an array using a simple but inefficient quicksort.
    public static void quicksortSimple( int[] data ){
       if ( data.length < 2 ){
          return;
       }
       int pivotIndex = data.length / 2;
       int pivotValue = data[ pivotIndex ];
       int leftCount = 0;
       // Count how many are less than the pivot
       for ( int i = 0; i < data.length; ++i ){
          if ( data[ i ] < pivotValue ) ++leftCount;
       }
       // Allocate the arrays and create the subsets
       int[] left  = new int[ leftCount ];
       int[] right = new int[ data.length - leftCount - 1 ];
       int l = 0;
       int r = 0;
       for ( int i = 0; i < data.length; ++i ){
          if ( i == pivotIndex ) continue;
          int val = data[ i ];
          if ( val < pivotValue ){
             left[ l++ ] = val;
          } else {
             right[ r++ ] = val;
          }
       }
       // Sort the subsets
       quicksortSimple( left );
       quicksortSimple( right );
       // Combine the sorted arrays and the pivot back into the original array
       System.arraycopy( left, 0, data, 0, left.length );
       data[ left.length ] = pivotValue;
       System.arraycopy( right, 0, data, left.length + 1, right.length );
    }
    The preceding code illustrates the principles of quicksort, but it’s not a particularly efficient implementation due to scanning the starting array twice, allocating new arrays, and copying results from the new arrays to the original.
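    For reference, a hypothetical call (not shown in the book) looks like this; the array is sorted in place:

        int[] data = { 4, 1, 2, 7, 5, -1, 8, 0, 6 };
        quicksortSimple( data );
        // data is now { -1, 0, 1, 2, 4, 5, 6, 7, 8 }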
  • Data Structures using C
    A Practical Approach for Beginners

  • Quick Sort's CPU cache performance is superior to that of other sorting algorithms. This is because of its in-place characteristic. The CPU cache is a piece of hardware that reduces access time to data by keeping part of the frequently used main-memory data in itself; it is smaller and faster than the main memory. Therefore, Quick Sort is the fastest sorting algorithm.
  • Unlike merge sort or heap sort, no additional data structure is required.
  • Quick Sort is superior to merge sort as far as space is concerned. The space or memory required for Quick Sort is less than that of merge sort.
  • Quick Sort is an in-place sorting algorithm, whereas merge sort is not in-place. In-place sorting means it does not use additional storage space to perform the sorting. In merge sort, a temporary array is required to merge the sorted arrays, hence it is not in-place.
  • Quick Sort is better than the other sorting algorithms with the same time complexity O(n log₂ n), that is, merge sort and heap sort. Even though Quick Sort is O(n²) in the worst case, this can easily be avoided with high probability by selecting the correct pivot element.
  • 7.9.4.7 Disadvantages of Quick Sort
    1. The time efficiency of Quick Sort depends on the selection of the pivot element. A poorly chosen pivot element leads to the worst-case time complexity (a small randomized pivot-selection sketch follows this list).
    2. The partitioning algorithm in Quick Sort is tricky to implement.
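    Picking the pivot position at random is a common way to make the bad-pivot case unlikely. A small, hypothetical Java helper (the name and the move-to-the-end convention are assumptions, not the book's C code):

        import java.util.concurrent.ThreadLocalRandom;

        // Pick a random index in [lo, hi] and move that element to the end,
        // so that a last-element-pivot partition can use it as the pivot.
        static void chooseRandomPivot(int[] a, int lo, int hi) {
            int r = ThreadLocalRandom.current().nextInt(lo, hi + 1);
            int t = a[r]; a[r] = a[hi]; a[hi] = t;
        }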

    7.9.5 Merge Sort

    7.9.5.1 Introduction to Merge Sort
    Merge sort is based on the divide-and-conquer approach. In merge sort, we divide the main list into two sub-lists and keep dividing those sub-lists until they are small enough. We then compare and sort the elements of each sub-list, merge the two sub-lists, and sort the merged list. This process is repeated until we get one sorted list.
  • Data Structures and Algorithm Analysis in C++, Third Edition
    Quicksort’s worst case arises when the pivot does a poor job of splitting the array into equal size subarrays. If we are willing to do more work searching for a better pivot, the effects of a bad pivot can be decreased or even eliminated. One good choice is to use the “median of three” algorithm, which uses as a pivot the middle of three randomly selected values. Using a random number generator to choose the positions is relatively expensive, so a common compromise is to look at the first, middle, and last positions of the current subarray. However, our simple findpivot function that takes the middle value as its pivot has the virtue of making it highly unlikely to get a bad input by chance, and it is quite cheap to implement. This is in sharp contrast to selecting the first or last element as the pivot, which would yield bad performance for many permutations that are nearly sorted or nearly reverse sorted. A significant improvement can be gained by recognizing that Quicksort is relatively slow when n is small. This might not seem to be relevant if most of the time we sort large arrays, nor should it matter how long Quicksort takes in the rare instance when a small array is sorted because it will be fast anyway. But you should notice that Quicksort itself sorts many, many small arrays! This happens as a natural by-product of the divide and conquer approach. A simple improvement might then be to replace Quicksort with a faster sort for small numbers, say Insertion Sort or Selection Sort. However, there is an even better — and still simpler — optimization. When Quicksort partitions are below a certain size, do nothing! The values within that partition will be out of order. However, we do know that all values in the array to the left of the partition are smaller than all values in the partition. All values in the array to the right of the partition are greater than all values in the partition
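    A rough Java sketch of the two optimizations described above (median-of-three pivot selection, and leaving small partitions unsorted for a single final insertion-sort pass); the class name, the cutoff value, and the helper methods are illustrative assumptions, not the book's C++ code:

        public class TunedQuicksort {
            private static final int CUTOFF = 10;      // illustrative threshold for "small" partitions

            public static void sort(int[] a) {
                quicksort(a, 0, a.length - 1);
                insertionSort(a);                      // one final pass fixes the small, unsorted partitions
            }

            private static void quicksort(int[] a, int lo, int hi) {
                if (hi - lo < CUTOFF) return;          // do nothing: left for the final insertion sort
                int mid = lo + (hi - lo) / 2;
                int pivot = medianOfThree(a[lo], a[mid], a[hi]);
                // Partition around the pivot value, scanning from both ends.
                int i = lo, j = hi;
                while (i <= j) {
                    while (a[i] < pivot) i++;
                    while (a[j] > pivot) j--;
                    if (i <= j) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; j--; }
                }
                quicksort(a, lo, j);
                quicksort(a, i, hi);
            }

            private static int medianOfThree(int x, int y, int z) {
                // Return the middle of the three candidate pivot values.
                return Math.max(Math.min(x, y), Math.min(Math.max(x, y), z));
            }

            private static void insertionSort(int[] a) {
                for (int i = 1; i < a.length; i++) {
                    int key = a[i], j = i - 1;
                    while (j >= 0 && a[j] > key) { a[j + 1] = a[j]; j--; }
                    a[j + 1] = key;
                }
            }
        }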
  • Essential Algorithms
    A Practical Approach to Computer Algorithms Using Python and C#

    • Rod Stephens (Author)
    • 2019 (Publication Date)
    • Wiley (Publisher)
    The algorithm starts at index hi and searches the array backward until it finds an item that should be in the lower piece of the array. It moves that item into the hole left behind by the dividing item.
    Next the algorithm starts at index lo and searches the array forward until it finds an item that should be in the upper piece of the array. It moves that item into the hole left behind by the previously moved item.
    The algorithm continues searching backward and then forward through the array until the two pieces meet. At that point, it puts the dividing item between the two pieces and recursively calls itself to sort the pieces.
    Using Quicksort
    If you divide the items in place instead of by using stacks or queues, quicksort doesn't use any extra storage (beyond a few variables).
    Like heapsort, quicksort has O(n log n) expected performance, although quicksort can have O(n²) performance in the worst case. Heapsort has O(n log n) performance in all cases, so it is in some sense safer and more elegant. However, in practice, quicksort is usually faster than heapsort, so it is the algorithm of choice for many programmers.
    In addition to greater speed, quicksort has another advantage over heapsort: it is parallelizable. Suppose a computer has more than one processor, which is increasingly the case these days. Each time the algorithm splits a section of the array into two pieces, it can use different processors to sort the two pieces. Theoretically, a highly parallel computer could spread the pieces across many processors and sort the list in far less time. In practice, most computers have a fairly limited number of processors (for example, two or four), so the run time would be divided by the number of processors, plus some additional overhead to manage the different threads of execution. That won't change the Big O run time, but it should improve performance in practice.
    Because it has
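    A minimal Java fork/join sketch of this parallelization idea (an illustration, not the book's code); the class names, the sequential cutoff, and the partition helper are assumptions:

        import java.util.Arrays;
        import java.util.concurrent.ForkJoinPool;
        import java.util.concurrent.RecursiveAction;

        public class ParallelQuicksort {
            // Below this size, sort the slice directly instead of forking more tasks.
            private static final int SEQUENTIAL_CUTOFF = 10_000;

            public static void sort(int[] a) {
                ForkJoinPool.commonPool().invoke(new SortTask(a, 0, a.length - 1));
            }

            private static class SortTask extends RecursiveAction {
                private final int[] a;
                private final int lo, hi;

                SortTask(int[] a, int lo, int hi) { this.a = a; this.lo = lo; this.hi = hi; }

                @Override
                protected void compute() {
                    if (hi - lo < SEQUENTIAL_CUTOFF) {
                        Arrays.sort(a, lo, hi + 1);    // small piece: sort it sequentially
                        return;
                    }
                    int p = partition(a, lo, hi);
                    // The two pieces can be sorted on different processors.
                    invokeAll(new SortTask(a, lo, p), new SortTask(a, p + 1, hi));
                }
            }

            // Hoare-style partition using the middle element's value as the pivot.
            private static int partition(int[] a, int lo, int hi) {
                int pivot = a[lo + (hi - lo) / 2];
                int i = lo - 1, j = hi + 1;
                while (true) {
                    do { i++; } while (a[i] < pivot);
                    do { j--; } while (a[j] > pivot);
                    if (i >= j) return j;
                    int t = a[i]; a[i] = a[j]; a[j] = t;
                }
            }
        }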
  • Data Structure and Algorithms Using C++
    A Practical Implementation

    • Sachi Nandan Mohanty, Pabitra Kumar Tripathy (Authors)
    • 2021 (Publication Date)
    • Wiley-Scrivener (Publisher)
    Quick Sort uses the divide-and-conquer method. It is also known as partition exchange sort. To partition the list, we first choose some key from the list for which about half the keys will come before it and half after. This selected key is called the pivot. We next partition the entries so that all the keys that are less than the pivot come in one sublist and all the keys that are greater than the pivot come in another sublist. We repeat the same process until all elements of the list are at their proper positions in the list.
    Ex.
    20 55 46 37 9 89 82 32
    From the above list, choose the first number, 20, as the pivot; the list is then partitioned into two sublists (9) and (55 46 37 89 82 32).
    At this point, 20 is in its proper position in the array, x[1]; each element below that position (9) is less than or equal to 20, and each element above that position (55 46 37 89 82 32) is greater than or equal to 20.
    The problem is now broken into two subproblems: sorting the two subarrays. Since the first subarray contains only a single element, it is already sorted. To sort the second subarray, we choose its first element, 55, as the pivot and again get two subarrays (46 37 32) and (89 82).
    So the entire array can be represented as
    9 20 (46 37 32) 55 (89 82)
    Repeating the same process, we reach the final result through the following steps:
    20 55 46 37 9 89 82 32
    9 20 (46 37 32) 55 (89 82)
    9 20 (37 32) 46 55 (89 82)
    9 20 (32) 37 46 55 (89 82)
    9 20 32 37 46 55 (82) 89
    9 20 32 37 46 55 82 89
    The average run-time efficiency of Quick Sort is O(n log₂ n). In the worst case, when the array is already sorted, the efficiency of Quick Sort may drop to O(n²).
  • PHP 7 Data Structures and Algorithms
    If we pass the array by reference, then we do not have to return the array. The passed array will be modified inside the function. It comes down to how we choose to achieve the sorting.

    Complexity of insertion sort

    Insertion sort has a complexity similar to bubble sort. The basic difference from bubble sort is that the number of swaps is much lower. Here is the complexity of insertion sort:
    Best time complexity: Ω(n)
    Worst time complexity: O(n²)
    Average time complexity: Θ(n²)
    Space complexity (worst case): O(1)

    Understanding divide-and-conquer technique for sorting

    So far, we have explored the sorting option with a full list of numbers. As a result, we had a big list of numbers to compare every time. This can be solved if we can somehow make the list smaller. The divide-and-conquer method can be very helpful for us. With this method, we will divide a problem into two or more subproblems or sets, and then solve the smaller problems before combining all those results from the subproblems to get the final result. This is what is known as divide-and-conquer.
    The divide-and-conquer method can allow us to solve our sorting problems efficiently and reduce the complexity of our algorithm. Two of the most popular sorting algorithms are merge sort and Quick Sort, which apply the divide-and-conquer algorithm to sort a list of items, and hence, they are considered to be the best sorting algorithms. Now, we will explore these two algorithms in the next section.

    Understanding merge sort

    As we already know, merge sort applies the divide-and-conquer approach to solve the sorting problem, so we need two processes to address the issue. The first is to divide the problem set into problems small enough to solve easily, and the second is to combine those results. We will apply a recursive approach here for the divide-and-conquer part. The following image shows how to take the divide-and-conquer approach. We will now consider a smaller list of numbers: 20, 45, 93, 67, 97, 52, 88, 33
  • Java Basics Using ChatGPT/GPT-4
    There are several points to keep in mind regarding the quicksort. First, the case in which the two sublists are approximately the same length is the more efficient case. However, achieving this requires prior knowledge of the data distribution in the given list.
    Second, if the list contains values that are close to randomly distributed, the first value or the last value are common choices for the pivot item. Third, quicksort has its worst performance when the values in a list are already sorted. In this scenario, select the pivot item in one of the following ways:
    Select the middle item in the list.
    Select the median of the first, middle, and last items in the list.
    Quicksort Code Sample
    Listing 4.10 displays the contents of QuickSort.java, which illustrates how to perform a quicksort on an array of numbers.
    LISTING 4.10: QuickSort.java

        public class QuickSort {
           public static void sort(int[] values) {
             quicksort(values);
           }

           public static void quicksort(int[] arr) {
             if (arr.length == 0)
               return;
             else
               quicksort(arr, 0, arr.length - 1);
           }

           // Sort interval [lo, hi] in place recursively
           public static void quicksort(int[] arr, int lo, int hi) {
             if (lo < hi) {
               int splitPoint = partition(arr, lo, hi);
               quicksort(arr, lo, splitPoint);
               quicksort(arr, splitPoint + 1, hi);
             }
           }

           // Performs Hoare partition algorithm for quicksort
           public static int partition(int[] arr, int lo, int hi) {
             int pivot = arr[lo];
             int i = lo - 1;
             int j = hi + 1;
             while (true) {
               do {
                 i += 1;
               } while (arr[i] < pivot);
               do {
                 j -= 1;
               } while (arr[j] > pivot);
               // Scans have crossed: j is the split point
               if (i >= j)
                 return j;
               // Otherwise swap the out-of-place pair and keep scanning
               int tmp = arr[i];
               arr[i] = arr[j];
               arr[j] = tmp;
             }
           }
        }
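    A hypothetical driver (not part of the listing) would call the public entry point like this; the array is sorted in place:

        int[] numbers = { 12, 3, 7, 1, 9 };
        QuickSort.sort(numbers);
        // numbers is now { 1, 3, 7, 9, 12 }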
  • Data structures based on non-linear relations and data processing methods
    • Xingni Zhou, Zhiyuan Ren, Yanzhuo Ma, Kai Fan, Xiang Ji (Authors)
    • 2020 (Publication Date)
    • De Gruyter (Publisher)
    Quick Sort is essentially a recursive algorithm based on the divide-and-conquer idea. Under normal circumstances, Quick Sort is faster than most sorting algorithms, and is currently the publicly acknowledged fastest sorting method. However, when the sequence is already largely ordered, its performance deteriorates to that of bubble sort. In addition, Quick Sort is based on recursion and involves a large number of stack operations in memory. For machines with very limited memory, it would not be a good choice.

    3.7.2  Merge sort

    Merge sort first views each individual record as an ordered sequence, and then obtains new sequences via two-way merges. In this way, it sorts all the records. Merge sort is faster than heap sort, though normally it is not as fast as Quick Sort.

    3.7.3  Heap sort

    Heap sort is suitable for situations where the amount of data is huge, for example, millions of records. Heap sort does not require a lot of recursion or multidimensional temporary arrays, which makes it suitable for sequences with a huge amount of data. When there are millions of records or more, since Quick Sort and merge sort are essentially recursive, a stack overflow error might occur.
    Heap sort will construct a heap out of all the data, with the maximum (minimum) data at the top of the heap. Then it exchanges the top data with the last data in the sequence, then it rebuilds the heap and swaps data. In this manner, it sorts all the data.

    3.7.4  Shell sort

    Shell sort divides the data into different groups. It first sorts each group and then performs one insertion sort on all the elements, in order to reduce the swapping and moving of data. Shell sort is very efficient, but the quality of the grouping hugely impacts the algorithm's performance. Shell sort is faster than bubble sort and insertion sort, but it is slower than Quick Sort, merge sort, and heap sort. Shell sort is suitable for situations where the number of data items is below 5,000 and speed is not very important. It is very good for sorting arrays with a relatively small amount of data.
  • Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.