# Comparison-Based Internal Sorting Algorithms


Comparison-Based Internal Sorting Algorithms: There are many internal (in-memory) sorting algorithms in which the rule of the game is to compare the elements to be sorted and move them into the proper order (ascending or descending).

Selection Sort: To sort an array T[0..n–1] into ascending order, perform the following iterative process:

```
for k = 0 to n–2
    find the smallest element in T[k..n–1] and swap it with T[k]
```

It is easy to see that if we let f(n) denote the number of element comparisons involved in the algorithm, then

$$f(n) = \sum_{k=0}^{n-2} (n-1-k) = \sum_{j=1}^{n-1} j = \frac{n(n-1)}{2}.$$

Counting all the operations, the time complexity is O(n²).
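As an illustration (not part of the notes), the selection-sort loop above can be written in Python; the function name and the returned comparison count are additions made here to make the f(n) = n(n–1)/2 claim checkable:

```python
def selection_sort(T):
    """Sort list T in place, ascending: for each k, find the smallest
    element of T[k..n-1] and swap it into position k.
    Returns the number of element comparisons performed."""
    n = len(T)
    comparisons = 0
    for k in range(n - 1):            # k = 0 .. n-2
        smallest = k
        for i in range(k + 1, n):     # scan T[k+1..n-1] for the minimum
            comparisons += 1
            if T[i] < T[smallest]:
                smallest = i
        T[k], T[smallest] = T[smallest], T[k]
    return comparisons

data = [5, 2, 4, 6, 1, 3]
c = selection_sort(data)
# data is now [1, 2, 3, 4, 5, 6]; c == 6*5//2 == 15
```

Note that the comparison count is the same regardless of the input ordering, which is why f(n) has an exact closed form rather than just a worst-case bound.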

Notice that Selection Sort uses the divide-and-conquer technique, in that the problem (involving an array of size n) is divided by finding its smallest element, then solving a smaller problem involving an array of size n–1. A similar idea can be used to devise other sorting algorithms such as insertion sort and bubble sort; both have a time complexity of O(n²).

Merge Sort: This is also a sorting algorithm based on the divide-and-conquer strategy, in which an array is divided into two approximately equal halves, both halves are sorted recursively, and then the two sorted halves are merged into a combined sorted array. The implementation uses an efficient (linear-time) merge procedure, described as follows:
```
Merge(U[0..m], V[0..n], T[0..m+n–1])
    /* merge two sorted arrays U and V into array T,
       using U[m] and V[n] as sentinel markers */
    i = j = 0
    U[m] = V[n] = ∞          // a large value
    for k = 0 to m+n–1
        if U[i] < V[j]
            T[k] = U[i++]
        else
            T[k] = V[j++]
```

It is easy to see that the total number of element comparisons is m + n; the total time complexity (counting all operations) is O(m + n).
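A direct Python transcription of this sentinel-based merge might look as follows (a sketch: `math.inf` stands in for the "large value", and the input lists are copied so the sentinels do not leak back to the caller):

```python
import math

def merge(U, V, T):
    """Merge sorted lists U (m elements) and V (n elements) into T,
    which must already have length m+n. Appends an infinity sentinel
    to local copies of U and V, as in the pseudocode; performs exactly
    m+n element comparisons."""
    m, n = len(U), len(V)
    U = U + [math.inf]   # U[m] = sentinel
    V = V + [math.inf]   # V[n] = sentinel
    i = j = 0
    for k in range(m + n):
        if U[i] < V[j]:
            T[k] = U[i]
            i += 1
        else:
            T[k] = V[j]
            j += 1

T = [0] * 6
merge([1, 4, 6], [2, 3, 5], T)
# T == [1, 2, 3, 4, 5, 6]
```

The sentinels are what make the loop body a single comparison: neither index needs an explicit bounds check, because once one list is exhausted its sentinel loses every comparison against the remaining real elements.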

```
Merge-sort(T[0..n–1])
    if (n == 1) return
    else
        declare two auxiliary arrays U[0..⌈n/2⌉], V[0..⌊n/2⌋]
        copy T[0..⌈n/2⌉–1] to U[0..⌈n/2⌉–1]
        copy T[⌈n/2⌉..n–1] to V[0..⌊n/2⌋–1]
        Merge-sort(U[0..⌈n/2⌉–1])    // recursive call
        Merge-sort(V[0..⌊n/2⌋–1])    // recursive call
        Merge(U, V, T)
```

Let f(n) denote the number of element comparisons for sorting an array of size n using merge-sort. Thus, we have the following recurrence: f(1) = 0, and for n > 1,

$$f(n) = f(\lceil n/2 \rceil) + f(\lfloor n/2 \rfloor) + n,$$

where the term n is the number of comparisons in Merge.
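The whole algorithm can be sketched in Python as below (a minimal illustration, not the notes' implementation; the comparison count is returned so it can be checked against the recurrence, and the sentinel merge is inlined for self-containment):

```python
def merge_sort(T):
    """Sort list T in place: split into halves U and V, sort each
    recursively, then merge with infinity sentinels.
    Returns the number of element comparisons, matching
    f(1)=0, f(n)=f(ceil(n/2))+f(floor(n/2))+n."""
    n = len(T)
    if n == 1:
        return 0
    U = T[: (n + 1) // 2]          # first ceil(n/2) elements
    V = T[(n + 1) // 2 :]          # last floor(n/2) elements
    comps = merge_sort(U) + merge_sort(V)
    U.append(float('inf'))         # sentinel markers
    V.append(float('inf'))
    i = j = 0
    for k in range(n):             # sentinel merge: n comparisons
        if U[i] < V[j]:
            T[k] = U[i]
            i += 1
        else:
            T[k] = V[j]
            j += 1
    return comps + n

data = [5, 2, 4, 6, 1, 3]
c = merge_sort(data)
# data == [1, 2, 3, 4, 5, 6]; c == f(6) == f(3) + f(3) + 6 == 16
```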
It can be shown that f(n) = O(n lg n), but finding an exact formula for f(n) takes more effort (though it is possible). In practice, Merge-sort is not the fastest sorting algorithm, for two reasons: the overhead of recursion, and the use of auxiliary arrays together with the copy operations they require. In-place merging (i.e., merging without auxiliary arrays) is possible but considerably more complicated.
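One concrete way to see the O(n lg n) bound (an addition made here, not part of the notes) is to evaluate the recurrence directly; for n a power of two it solves exactly to n·lg n:

```python
from math import ceil, floor
from functools import lru_cache

@lru_cache(maxsize=None)
def f(n):
    """Comparison count from the recurrence:
    f(1) = 0,  f(n) = f(ceil(n/2)) + f(floor(n/2)) + n  for n > 1."""
    if n == 1:
        return 0
    return f(ceil(n / 2)) + f(floor(n / 2)) + n

# For n = 2^k, induction gives f(2^k) = k * 2^k, i.e. f(n) = n * lg(n):
print(f(8), f(1024))   # 24 and 10240, i.e. 8*3 and 1024*10
```

The memoization (`lru_cache`) matters only for convenience here; the point is that tabulated values track n·lg n, which is where the O(n lg n) bound comes from.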
