Knot theory is the branch of topology that studies mathematical knots, which are defined as embeddings of a circle, S¹, in three-dimensional Euclidean space, R³. A mathematical knot is akin to a conventional knotted string whose ends have been joined together so that the knot cannot be undone. Two mathematical knots are considered equivalent if one can be transformed into the other by continuous deformations (known as ambient isotopies); these transformations correspond to manipulations of a knotted string that do not involve cutting the string or passing it through itself.
Knots can be described in various ways, most commonly by planar diagrams (known as knot projections or knot diagrams). In any given method of description, a knot will have many representations, e.g., many different diagrams. A fundamental problem in knot theory is determining when two descriptions represent the same knot. One way of distinguishing knots is by means of a knot invariant, a "quantity" that remains the same no matter which description of the knot is used.
Research in knot theory began with the systematic tabulation of knots in knot tables. While tabulation remains an important task, today's researchers have a wide variety of backgrounds and goals. Classical knot theory, as initiated by Max Dehn, J. W. Alexander, and others, is primarily concerned with the knot group and with invariants from homology theory such as the Alexander polynomial.
The discovery of the Jones polynomial by Vaughan Jones in 1984, together with subsequent contributions from Edward Witten, Maxim Kontsevich, and others, revealed deep connections between knot theory and mathematical methods of statistical mechanics and quantum field theory. A plethora of knot invariants has been invented since then, utilizing sophisticated tools such as quantum groups and Floer homology.
Quicksort (also known as partition-exchange sort) is an efficient sorting algorithm that works for items of any type for which a total order (i.e., "≤") relation is defined. This animation shows how the algorithm partitions the input array (here a random permutation of the numbers 1 through 33) into two smaller arrays around a selected pivot element (the bar marked in red, here always chosen to be the last element of the array under consideration). Elements are swapped between the two sub-arrays so that those in the first (on the left) all end up smaller than the pivot's value (horizontal blue line) and those in the second (on the right) all larger. The pivot element is then moved to a position between the two sub-arrays; at this point, the pivot element is in its final position and will never be moved again. The algorithm then recursively applies the same procedure to each of the smaller arrays, partitioning and rearranging the elements until no sub-array longer than one element remains to be processed. (As can be seen in the animation, the algorithm sorts all left-hand sub-arrays first, and then starts to process the right-hand sub-arrays.)

First developed by Tony Hoare in 1959, quicksort remains a commonly used sorting algorithm in computer applications. On average, it requires O(n log n) comparisons to sort n items, which compares favorably with other popular sorting methods, including merge sort and heapsort. Unfortunately, on rare occasions (including cases where the input is already sorted or contains items that are all equal) quicksort requires O(n²) comparisons in the worst case, whereas the other two methods remain O(n log n) even in their worst cases. Still, when implemented well, quicksort can be about two or three times faster than its main competitors.
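The procedure described above (last element as pivot, smaller items swapped to the left, pivot then placed between the two sub-arrays) can be sketched in Python as follows; this is an illustrative version using the Lomuto partition scheme, and the function names are chosen here for exposition:

```python
def quicksort(a, lo=0, hi=None):
    """Sort list a in place, partitioning around the last element."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)   # pivot lands in its final position
        quicksort(a, lo, p - 1)    # left sub-array: items <= pivot
        quicksort(a, p + 1, hi)    # right sub-array: items > pivot

def partition(a, lo, hi):
    pivot = a[hi]                  # last element chosen as the pivot
    i = lo                         # boundary of the "smaller" sub-array
    for j in range(lo, hi):
        if a[j] <= pivot:          # swap smaller items to the left side
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]      # move pivot between the sub-arrays
    return i                       # pivot's final index
```

Note that an already-sorted input makes every partition maximally unbalanced under this pivot choice, which is exactly the O(n²) worst case mentioned above.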
Unlike merge sort, the standard implementation of quicksort does not preserve the order of equal input items (it is not stable), although stable versions of the algorithm exist at the cost of O(n) additional storage space. Other variations are based on different ways of choosing the pivot element (for example, choosing a random element instead of always using the last one), using more than one pivot, switching to insertion sort once the sub-arrays have shrunk below a sufficiently small length, and using a three-way partitioning scheme that groups items into those smaller than, equal to, and larger than the pivot, a modification that can turn the worst-case scenario of all-equal input values into the best case. Because of the algorithm's "divide and conquer" approach, parts of it can be done in parallel (in particular, the left and right sub-arrays can be processed simultaneously). However, other sorting algorithms (including merge sort) gain much greater speed increases from parallel execution.