Algorithm Research - Quicksort

An algorithm, according to the Random House Unabridged Dictionary, is a set of rules for solving a problem in a finite number of steps.

One of the fundamental problems of computer science is sorting a set of items. The solutions to this problem are known as sorting algorithms, and, as Wolfram MathWorld puts it, “the process of applying an algorithm to an input to obtain an output is called a computation” [http://mathworld.wolfram.com/Algorithm.html].

The quest to develop the fastest and most memory-efficient sorting algorithm has been one of the great mathematical challenges of the last half-century, resulting in many tried-and-tested algorithms available to anyone who needs to sort a list of data. In fact, new sorting algorithms are still being developed today; take, for example, Library sort, which was published in 2004.

Of all the popular sorting algorithms, I have chosen to research and explain in detail the algorithm known as ‘Quicksort’. Quicksort is a fast, general-purpose sorting algorithm and the sorting method of choice for many mathematicians and computer scientists. Although the choice of algorithm ultimately comes down to which one best suits the client’s needs and the specific set of data to be sorted, Quicksort has proven to fulfill the required criteria on many occasions.

C.A.R. Hoare developed the Quicksort algorithm in 1960, while he was working for a small English scientific computer manufacturer named Elliott Brothers (London) Ltd.
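To make the idea concrete, a minimal sketch of Quicksort in Python follows (an illustrative version, not Hoare’s original in-place implementation): choose a pivot element, partition the remaining elements into those smaller than and those larger than the pivot, and recursively sort each part.

def quicksort(items):
    # A list of zero or one elements is already sorted.
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]               # choose the middle element as the pivot
    smaller = [x for x in items if x < pivot]    # elements less than the pivot
    equal = [x for x in items if x == pivot]     # elements equal to the pivot
    larger = [x for x in items if x > pivot]     # elements greater than the pivot
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([33, 10, 55, 71, 29, 3, 18]))    # [3, 10, 18, 29, 33, 55, 71]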

Sorting algorithms are designed to be fast and efficient: to sort a list of data as quickly as possible while using as little memory as possible. To measure or classify an algorithm according to these two criteria, we consider the algorithm’s computational complexity, which describes its worst-case, average-case and best-case behavior. Sorting algorithms are generally classified by the computational complexity of the number of element comparisons they perform, relative to the size of the list.

This is expressed using what is known as ‘Big O notation’: the ideal behavior is O(n), the behavior of most good sorting algorithms is O(n log n), and bad behavior is O(n²). O(n) behavior means that a sorting algorithm would take roughly ten times longer to sort a list of one thousand elements than it would to sort a list of one hundred elements. O(n²) behavior is quadratic behavior, and an algorithm with this behavior would take roughly one hundred times longer to sort a list of one thousand elements than a list of one hundred elements.
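As a rough check of these growth rates, the short sketch below (the sizes of one hundred and one thousand elements are chosen purely for illustration) prints approximate step counts for each class; moving from one hundred to one thousand elements multiplies the work by ten for O(n), by roughly fifteen for O(n log n), and by one hundred for O(n²).

import math

for n in (100, 1000):
    linear = n                        # O(n): proportional to the list size
    linearithmic = n * math.log2(n)   # O(n log n): typical good sorting algorithm
    quadratic = n * n                 # O(n^2): quadratic behavior
    print(f"n = {n:>5}: O(n) ~ {linear:>9,}  O(n log n) ~ {round(linearithmic):>9,}  O(n^2) ~ {quadratic:>9,}")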
