One of the fundamental problems of computer science is sorting a set of items. The solutions to this problem are known as sorting algorithms, and as Wolfram MathWorld notes, “the process of applying an algorithm to an input to obtain an output is called a computation” [http://mathworld.wolfram.com/Algorithm.html].
The quest to develop the fastest and most memory-efficient sorting algorithm has been one of the great mathematical challenges of the last half century, producing many tried and tested algorithms for anyone who needs to sort a list of data. In fact, new sorting algorithms are still being developed today; take, for example, Library sort, which was published in 2004.
Of all the popular sorting algorithms, I have chosen to research and explain in detail the algorithm known as ‘Quicksort’. Quicksort is a fast, general-purpose sorting algorithm and the sorting algorithm of choice for many mathematicians and computer scientists. Although the choice of algorithm ultimately comes down to which is best suited to the client’s needs and to the specific set of data to be sorted, Quicksort has proven to fulfill the required criteria on many occasions.
C.A.R. Hoare developed the Quicksort algorithm in the year 1960, while he was working for a small, English scientific computer manufacturer named Elliott Brothers (London) Ltd.
Sorting algorithms are designed to be fast and efficient: to sort a list of data as quickly as possible while using as little memory as possible. To measure or classify an algorithm by these two criteria, we examine its computational complexity. The computational complexity of a sorting algorithm describes its worst-case, average-case and best-case behavior. Sorting algorithms are generally classified by the computational complexity of their element comparisons as a function of the size of the list.
This is represented with what is known as ‘Big O notation’, where, for example, the ideal behavior is O(n), most good algorithms achieve O(n log n), and poor behavior is O(n²). O(n) behavior means that a sorting algorithm would take roughly ten times longer to sort a list of one thousand elements than to sort a list of one hundred elements. O(n²) behavior is quadratic behavior: such an algorithm would take roughly one hundred times longer to sort a list of one thousand elements than a list of one hundred elements.
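The scaling claims above can be checked with a little arithmetic. The sketch below (illustrative only; real running times also depend on constant factors that Big O notation deliberately ignores) compares how the work grows when the input grows by a factor of ten:

```python
# Growing the input from 100 to 1,000 elements (a factor of 10):
n1, n2 = 100, 1_000

linear_ratio = n2 / n1            # O(n): work grows by the same factor
quadratic_ratio = (n2 / n1) ** 2  # O(n^2): work grows by the square of the factor

print(linear_ratio)     # 10.0  -> roughly ten times longer
print(quadratic_ratio)  # 100.0 -> roughly one hundred times longer
```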
How to Cite this Page
"Algorithm Research - Quicksort." 123HelpMe.com. 25 Feb 2020
In Big O notation, the ‘O’ symbol denotes an asymptotic upper bound on the growth of the function, and the ‘n’ is the size of the list being sorted.
The popular sorting algorithms are divided into two classes: O(n²) algorithms, which are generally slower but simpler, and O(n log n) algorithms, which are faster but more complicated.
The Quicksort algorithm is in the O(n log n) class, and makes O(n log n) comparisons on average. In the worst case, Quicksort makes O(n²) comparisons.
In a very simplistic explanation, the Quicksort algorithm sorts a list of data in three steps:
1: Select a ‘Pivot’. The pivot is an element from the list.
2: Reorder the list by putting all elements with values less than the pivot before the pivot, and all elements with values greater than the pivot after the pivot. Elements with values equal to the pivot can go either way. After the list has been reordered in this way, the pivot is in its final position; this step is known as the ‘partition operation’.
3: Recursively sort the elements of lesser value than the pivot, and the elements of greater value than the pivot.
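The three steps above can be sketched directly in Python. This is a minimal, non-in-place rendering for illustration (the function name and test list are my own), not the memory-efficient variant discussed later:

```python
def quicksort(items):
    """Sort a list using the three Quicksort steps: pick a pivot,
    partition around it, then recursively sort the two sides.

    This simple version allocates new sublists at every level,
    so it is not the in-place, memory-efficient variant.
    """
    if len(items) <= 1:                  # a list of 0 or 1 elements is already sorted
        return items
    pivot = items[len(items) // 2]       # step 1: select a pivot from the list
    less = [x for x in items if x < pivot]        # step 2: partition into
    equal = [x for x in items if x == pivot]      #   less / equal / greater
    greater = [x for x in items if x > pivot]
    # step 3: recursively sort the lesser and greater sublists
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([33, 10, 55, 71, 10, 2]))  # [2, 10, 10, 33, 55, 71]
```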
The Quicksort algorithm is a comparison sort. Comparison sorts compare the values of elements in a set of data by a single abstract comparison operation, which is most often a ‘less than or equal to’ operation.
The Quicksort algorithm is not a stable sorting algorithm. In a stable sorting algorithm, if two elements have equal values, they will maintain their relative order when the list is sorted.
Quicksort is also massively recursive. “Recursion, in mathematics and computer science, is a method of defining functions in which the function being defined is applied within its own definition. The term is also used more generally to describe a process of repeating objects in a self-similar way” [http://en.wikipedia.org/wiki/Recursion].
Sorting algorithms can also be classified by the amount of memory a computer needs to run them.
The simplest form of the Quicksort algorithm, while fast, requires Ω(n) extra storage space, which is comparatively poor. There is, however, a more complicated version of Quicksort that uses an in-place partition algorithm. An in-place algorithm does not require extra space for sub-arrays and the like; instead it swaps the elements around inside the original array, and therefore needs only the space of the original array plus a small amount of working space for local variables, as all sorts do. The in-place version can sort a list using only O(log n) extra space on average, for the recursion stack.
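An in-place version can be sketched as follows. This illustration uses the Lomuto partition scheme with the last element as pivot (one common choice among several; the function name and sample data are my own):

```python
def quicksort_inplace(a, lo=0, hi=None):
    """In-place quicksort using the Lomuto partition scheme.

    Only the recursion stack is extra space, so the expected extra
    memory is O(log n) rather than the Omega(n) of the simple version.
    """
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:                        # 0 or 1 elements: nothing to do
        return
    pivot = a[hi]                       # choose the last element as pivot
    i = lo                              # boundary of the "less than pivot" region
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]     # swap a smaller element into place
            i += 1
    a[i], a[hi] = a[hi], a[i]           # put the pivot in its final position
    quicksort_inplace(a, lo, i - 1)     # recursively sort left of the pivot
    quicksort_inplace(a, i + 1, hi)     # recursively sort right of the pivot

data = [9, 4, 7, 1, 4]
quicksort_inplace(data)
print(data)  # [1, 4, 4, 7, 9]
```

Note how the swaps happen inside the original array: no sub-lists are allocated, only index arithmetic and the recursion stack.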
The Quicksort algorithm uses what is known as a ‘Divide and conquer’ strategy to sort lists of data. Using the ‘divide and conquer’ strategy, “we must divide the problem into two smaller sub-problems, solve each of them recursively, and then meld the two partial solutions into one solution to the full problem” [http://www2.toki.or.id/book/AlgDesignManual/BOOK/BOOK2/NODE53.HTML].
Another point of interest for the Quicksort sorting algorithm is that it can easily be parallelized. In computer science, “a parallel algorithm, as opposed to a traditional sequential algorithm, is one which can be executed a piece at a time on many different processing devices, and then put back together again at the end to get the correct result” [http://en.wikipedia.org/wiki/Parallel_algorithm]. This means that when running on a system with multiple processors, Quicksort can hand its sub-lists to individual processors and greatly reduce the time it takes to sort the set of elements.
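The structure of a parallel Quicksort can be sketched as below: partition once, then hand each side to a separate worker. This is only a structural sketch with names of my own choosing; note that in CPython, threads will not actually speed up pure-Python comparisons because of the global interpreter lock, so a real implementation would use processes or a compiled language.

```python
from concurrent.futures import ThreadPoolExecutor

def quicksort(items):
    # plain sequential quicksort, used to sort each side below
    if len(items) <= 1:
        return items
    pivot = items[0]
    less = [x for x in items[1:] if x < pivot]
    rest = [x for x in items[1:] if x >= pivot]
    return quicksort(less) + [pivot] + quicksort(rest)

def parallel_quicksort(items):
    """Partition once, then sort the two sides on separate workers.

    Structural sketch only: CPython threads give no speedup for
    pure-Python code; the point is how the sub-problems split apart
    and are merged back together at the end.
    """
    if len(items) <= 1:
        return items
    pivot = items[0]
    less = [x for x in items[1:] if x < pivot]
    rest = [x for x in items[1:] if x >= pivot]
    with ThreadPoolExecutor(max_workers=2) as pool:
        left = pool.submit(quicksort, less)    # one worker per sub-list
        right = pool.submit(quicksort, rest)
        # "put back together again at the end to get the correct result"
        return left.result() + [pivot] + right.result()

print(parallel_quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```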
In summary, Quicksort is a comparison sort that is massively recursive, not stable, and easily parallelized. Quicksort makes O(n log n) comparisons on average and O(n²) comparisons in the worst case. It is one of the fastest sorting algorithms and a very popular choice for sorting sets of data, generally losing out only to sorts with better worst-case behavior, e.g. Heapsort and Mergesort, or when a stable sort is needed, e.g. Mergesort.