DAA and the Merge Sort Algorithm
The world is evolving quickly, and so are we. The pace of change is so high that we can hardly imagine it; with every passing minute, something new is being made. Let's take an example to understand how much we have grown and evolved over the past hundred years.
In earlier times, when someone wanted to move from one place to another, a single person rode a horse, and a family travelled by buffalo cart. Then times changed, and in 1885 we saw the setup of the first car company, Benz (known today as Mercedes-Benz). We moved further forward and saw high-tech cars with more power and more speed. Fast-forward to the present, and we even have car companies like Tesla whose cars do not require you to drive at all; they work on Artificial Intelligence principles. From this you can appreciate the extent of growth we have achieved, at a rate that is still increasing.
But to cope with such a world, you need to keep updating your systems, technology, and ways of working. Now, coming to the real topic: DAA (Design and Analysis of Algorithms), which is very important in such changing times. It plays an important role in this evolution because it provides better, smarter solutions to difficult problems. An algorithm is basically a set of steps that solves a certain problem. In the world of Computer Science and IT, DAA holds great importance because it is concerned with designing algorithms to solve different problems. DAA covers fundamental design concepts and strategies, analysis of algorithm complexity, graph theory, and sorting methods.
What factors decide whether an algorithm is a success or a failure?
i. Obviously, it must work, i.e. it must give you the desired result.
ii. The most important and defining criterion is time complexity: the time the algorithm takes to solve the problem. The faster it solves it, the better the algorithm.
Most of the time, algorithms are designed to work on inputs of arbitrary length. Analyzing an algorithm means determining the amount of time and the space resources required to execute it. The running time of an algorithm, expressed as a function relating the input length to the number of steps, is known as its time complexity.
When we design an algorithm to solve a certain problem, we should try to recognize patterns, so that similar kinds of problems can be solved by the same algorithm.
We know that for a single problem there are many different ways to arrive at the same outcome. For sorting a set of numbers, for example, there are several algorithms that produce the same output but differ in how they compare elements. The number of comparisons performed can vary between algorithms, so the time taken to sort the same set of numbers differs from one algorithm to another, and this is what determines whether an algorithm is good and efficient.
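As a small illustration of this point (the function names and counting scheme here are my own, not from any particular library), we can instrument two simple sorting algorithms to count how many comparisons each performs on the same input:

```python
def insertion_sort_comparisons(arr):
    """Sort a copy of arr with insertion sort; return the comparison count."""
    a = list(arr)
    count = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            count += 1              # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]     # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return count

def selection_sort_comparisons(arr):
    """Sort a copy of arr with selection sort; return the comparison count."""
    a = list(arr)
    count = 0
    for i in range(len(a)):
        m = i
        for j in range(i + 1, len(a)):
            count += 1              # compare a[j] with the current minimum
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]
    return count

# Selection sort always performs n*(n-1)/2 comparisons, while insertion
# sort performs fewer when the input is already partly ordered.
```

Both functions produce a sorted result, but running them on the same list shows different comparison counts, which is exactly the kind of difference the analysis of algorithms is concerned with.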
When solving problems we need to consider space complexity as well as time complexity, since some systems allot only a limited amount of memory to run a program. For instance, compare Bubble sort and Merge sort: Bubble sort needs no additional memory, whereas Merge sort requires an auxiliary array, even though Bubble sort's time complexity is higher than Merge sort's. So if the requirement is to run the program in limited memory, Bubble sort may be the feasible choice.
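To make the memory point concrete, here is a sketch of Bubble sort (a standard textbook version, not tied to any specific source): it swaps elements inside the input list itself and allocates no auxiliary array, so its extra space is O(1).

```python
def bubble_sort(a):
    """Sort list a in place by repeatedly swapping adjacent out-of-order elements."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        # After pass i, the largest i+1 elements are in their final positions.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:             # no swaps means the list is already sorted
            break
    return a
```

The trade-off described above is visible here: the nested loops make the worst-case time O(n^2), but no memory beyond a few index variables is ever allocated.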
One of the important topics of DAA is sorting, and the most valuable of the sorting algorithms is Merge sort.
Whenever we talk about Merge sort, "divide and conquer" is the phrase that comes to mind. Merge sort employs this recursive algorithmic paradigm: the problem is divided into several subproblems that are similar to the original, the subproblems are solved recursively, and the sub-solutions are finally combined to obtain the solution to the original problem. Since divide and conquer relies on recursion, each subproblem must be smaller than the original problem.
The heart of merge sort is the merge step, which merges two sorted adjacent subarrays, array[p..q] and array[q+1..r], into a single sorted subarray array[p..r].
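The merge step described above can be sketched as follows (inclusive indices p, q, r as in the text; the variable names are illustrative):

```python
def merge(array, p, q, r):
    """Merge the sorted runs array[p..q] and array[q+1..r] (inclusive) in place."""
    low_half = array[p:q + 1]        # copy of the left sorted run
    high_half = array[q + 1:r + 1]   # copy of the right sorted run
    i = j = 0
    k = p
    # Repeatedly take the smaller front element of the two copies.
    while i < len(low_half) and j < len(high_half):
        if low_half[i] <= high_half[j]:
            array[k] = low_half[i]
            i += 1
        else:
            array[k] = high_half[j]
            j += 1
        k += 1
    # One half is exhausted; copy the remainder of the other.
    while i < len(low_half):
        array[k] = low_half[i]; i += 1; k += 1
    while j < len(high_half):
        array[k] = high_half[j]; j += 1; k += 1
```

Each of the n elements in array[p..r] is copied out once and written back once, which is why this step runs in O(n) time.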
We know that merging n elements from two sorted subarrays takes O(n) time. But how do we show that merge sort as a whole runs in O(n log2 n) time? Let's start by thinking about its three parts: divide, conquer, and combine.
Let us assume that we are sorting a total of n elements.
i. The divide step takes constant time, no matter what the subarray size is, since it just finds the midpoint q of the indices p and r.
ii. The conquer step recursively sorts two subarrays of approximately n/2 elements each. That takes some amount of time, but we account for it when we consider the subproblems themselves.
iii. Finally, the combine step merges a total of n elements, taking O(n) time.
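The three steps above can be put together as one recursive routine. This is a sketch of the standard algorithm, not any particular textbook's code: merge_sort divides at the midpoint q in constant time, conquers the two halves recursively, and combines them with a linear-time merge. Since each level of recursion does O(n) total merging work and there are about log2 n levels, the whole algorithm runs in O(n log2 n) time.

```python
def merge_sort(array, p=0, r=None):
    """Sort array[p..r] (inclusive indices) by divide and conquer."""
    if r is None:
        r = len(array) - 1
    if p < r:                        # a subarray of one element is already sorted
        q = (p + r) // 2             # divide: constant-time midpoint
        merge_sort(array, p, q)      # conquer the left half
        merge_sort(array, q + 1, r)  # conquer the right half
        merge(array, p, q, r)        # combine: O(n) merge

def merge(array, p, q, r):
    """Merge the sorted runs array[p..q] and array[q+1..r]."""
    left, right = array[p:q + 1], array[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # Take from the left run unless it is exhausted or the right is smaller.
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            array[k] = left[i]; i += 1
        else:
            array[k] = right[j]; j += 1

data = [38, 27, 43, 3, 9, 82, 10]
merge_sort(data)
```

Equivalently, the running time satisfies the recurrence T(n) = 2T(n/2) + O(n), which solves to O(n log2 n).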
There is one other thing about merge sort worth noting. While merging, a copy of the subarray being merged is made, with one half in lowHalf and the other in highHalf. Because it copies more than a constant number of elements, we say that merge sort does not work in place. In contrast, both selection sort and insertion sort do work in place, since they never copy more than a constant number of array elements at any one time. Computer scientists often prefer algorithms that work in place, because on some systems space is limited and at a premium, so in-place algorithms are given the upper hand.