Complexity Analysis of Algorithms (Programming Complexities)


THE GOAL OF ANALYSIS:

The general goal of analysis is to compare the performance of different algorithms or solutions, mainly in terms of running time, but also in terms of other factors such as memory usage and developer effort. We have already noted that, when developing algorithms, it is important to consider how efficient they are, so that we can make informed choices about which are best suited to particular circumstances.

RUNTIME ANALYSIS:

Runtime analysis is the process of determining how the processing time of an algorithm increases as the size of the problem (the input size) increases. The input size is the number of elements in the input and, depending on the problem, it can be measured in different ways. The following are common measures of input size (a small measurement sketch follows the list):

< > Size of an array.

< > Polynomial degree.

< > The number of elements in a matrix.

< > The number of bits in the binary representation of the input.

< > Vertices and edges in a graph.
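
As a rough illustration of how processing time grows with input size, the sketch below times a simple summing function on lists of increasing length. The function name sum_array and the chosen sizes are only for illustration, not part of the original discussion.

    import time

    def sum_array(arr):
        # Touches every element once, so the work grows in step with the input size.
        total = 0
        for x in arr:
            total += x
        return total

    for n in (1_000, 10_000, 100_000, 1_000_000):
        data = list(range(n))
        start = time.perf_counter()
        sum_array(data)
        elapsed = time.perf_counter() - start
        print(f"n = {n:>9,}: {elapsed:.6f} seconds")

Running this shows the elapsed time increasing roughly in proportion to n, which is exactly the kind of growth that runtime analysis tries to characterise.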

COMPARISON OF ALGORITHMS:

To compare algorithms, we need objective measures. A few candidates:

>>>Execution times? : Not a good measure, as execution times are specific to a particular computer and vary from machine to machine.

>>>The number of statements executed? : Not a good measure either, since the number of statements varies with the programming language as well as with the coding style of the individual programmer.

>>>Ideal solution? : Express the running time of a given algorithm as a function of the input size n (i.e. f(n)) and compare these functions. This kind of comparison is independent of machine speed, programming style, etc. (a small operation-counting sketch follows).
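
To make this concrete, a machine-independent way to compare algorithms is to count basic operations as a function of n instead of timing anything. The sketch below (the function names and probe counting are illustrative) compares linear search, which examines up to n elements, with binary search on a sorted list, which needs only about log2(n) probes.

    def linear_search(arr, target):
        # Returns (index, comparisons); in the worst case every element is examined, so f(n) = n.
        comparisons = 0
        for i, x in enumerate(arr):
            comparisons += 1
            if x == target:
                return i, comparisons
        return -1, comparisons

    def binary_search(arr, target):
        # Returns (index, probes); each probe halves the search range, so probes are about log2(n).
        probes = 0
        lo, hi = 0, len(arr) - 1
        while lo <= hi:
            probes += 1
            mid = (lo + hi) // 2
            if arr[mid] == target:
                return mid, probes
            elif arr[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1, probes

    for n in (1_000, 1_000_000):
        arr = list(range(n))
        print(n, linear_search(arr, n - 1)[1], binary_search(arr, n - 1)[1])

The counts (roughly n versus log2(n)) are the same on any machine and in any language, which is why this style of comparison is preferred.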


PERFORMANCE OF A PROGRAM:

The performance of a program is the amount of computer memory and time required to run it.

We generally use two approaches to determine the performance of a program: one is analytical and the other is experimental. In performance analysis we use analytical methods, while in performance measurement (experiment) we conduct experiments.

>>>Time complexity:
The time needed by an algorithm to produce its output, expressed as a function of the size of the problem, is called the TIME COMPLEXITY of the algorithm. Thus, the time complexity of a program is the amount of computer time it needs to run to completion.
The limiting behavior of this complexity as the input size increases is called the asymptotic time complexity. It is the asymptotic complexity of an algorithm that ultimately determines the size of the problems the algorithm can solve.
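
As a small worked example (the operation counts are made up for illustration), suppose an algorithm performs T(n) = 3n^2 + 5n + 7 basic operations on an input of size n. For large n the 3n^2 term dominates the lower-order terms:

    T(n)  = 3n^2 + 5n + 7
    T(2n) = 12n^2 + 10n + 7  ≈  4 * T(n)   for large n

so the asymptotic time complexity is O(n^2): doubling the input size roughly quadruples the running time, and it is this growth rate that ultimately limits the problem sizes the algorithm can handle.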


>>>Space Complexity: The space complexity of a program is the amount of memory it needs to run to completion. Formally, the space needed by a program has the following components:

1. Instruction space: the space required to store the compiled version of the program's instructions.
2. Data space: the space required to store all constant and variable values. Data space has two components:
< > Space needed by constants and simple variables in the program.
< > Space needed by dynamically allocated objects such as arrays and class instances.
3. Environment stack space: the environment stack is used to save the information required to resume execution of partially completed functions.

The amount of instruction space needed depends on factors such as:
< > The compiler used to translate the program into machine code.
< > The compiler options in effect at the time of compilation.
< > The target computer.
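
To illustrate the data-space component, the sketch below contrasts two ways of reversing a list: one builds a new list, so its auxiliary data space grows with n, while the other reverses in place using only a constant amount of extra space. The function names are illustrative.

    def reversed_copy(arr):
        # Builds a brand new list of n elements: O(n) extra data space.
        return [arr[i] for i in range(len(arr) - 1, -1, -1)]

    def reverse_in_place(arr):
        # Swaps elements inside the existing list: O(1) extra data space.
        lo, hi = 0, len(arr) - 1
        while lo < hi:
            arr[lo], arr[hi] = arr[hi], arr[lo]
            lo += 1
            hi -= 1
        return arr

    print(reversed_copy([1, 2, 3, 4]))     # [4, 3, 2, 1]
    print(reverse_in_place([1, 2, 3, 4]))  # [4, 3, 2, 1]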

TIME V/S SPACE COMPLEXITY:

When building software for real applications, it is usually necessary to judge how quickly an algorithm or program can complete its task. For instance, if you are programming an online flight ticket booking system, it is not acceptable for the travel agent or the customer to wait half an hour for a transaction to complete. We have to ensure that the waiting time is reasonable for the size of the problem, and ordinarily faster execution is better. We talk about the time complexity of an algorithm as an indication of how its execution time depends on the size of the data structure.

Another major performance consideration is how much memory a given program will need for a specific task, though with modern computers this tends to be less of an issue than it used to be. Here we talk about space complexity: how the memory requirement depends on the size of the data structure.

For a given task, there are often algorithms that trade space for time and vice versa. For example, we will see that, as a data storage structure, hash tables offer very good time complexity at the expense of using more memory than other approaches need. It is usually up to the algorithm or program designer to decide how best to balance this trade-off for the application they are designing (a small sketch of such a trade-off follows).
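
As a rough sketch of this space-for-time trade-off, the example below checks membership in a plain list (little extra memory, but each lookup scans up to n elements) versus a set backed by a hash table (more memory, but lookups take roughly constant time on average). The sizes and values are only illustrative.

    import sys
    import time

    n = 1_000_000
    as_list = list(range(n))
    as_set = set(as_list)  # The hash-based set occupies noticeably more memory than the list.

    print("list bytes:", sys.getsizeof(as_list))
    print("set bytes: ", sys.getsizeof(as_set))

    target = n - 1  # Worst case for the list: the last element.

    start = time.perf_counter()
    _ = target in as_list          # Scans the list: time grows with n.
    print("list lookup:", time.perf_counter() - start)

    start = time.perf_counter()
    _ = target in as_set           # Hash lookup: roughly constant time on average.
    print("set lookup: ", time.perf_counter() - start)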

CLASSIFICATION OF ALGORITHMS:

Let ‘n’ be the number of data items to be processed, the degree of a polynomial, the size of the file to be sorted or searched, the number of nodes in a graph, and so on. Algorithms are commonly classified by how their running time grows as a function of n; for example, if every instruction of a program runs a number of times that does not depend on n, we say that its running time is constant.


The table below shows the common classification of algorithms by growth rate:

Complexity                 Definition
Constant: O(1)             The running time does not depend on n.
Logarithmic: O(log n)      The running time grows with the logarithm of n (e.g. binary search).
Linear: O(n)               The running time grows in direct proportion to n (e.g. a single scan of the input).
Linearithmic: O(n log n)   Typical of efficient sorting algorithms such as merge sort.
Quadratic: O(n^2)          The running time grows with the square of n (e.g. two nested loops).
Cubic: O(n^3)              The running time grows with the cube of n (e.g. three nested loops).
Exponential: O(2^n)        The running time roughly doubles with every extra input element.


ASYMPTOTIC ANALYSIS:

Asymptotic analysis describes the limiting behavior of an algorithm's running time as the input size grows, using the asymptotic notations O, Ω, Θ, and o (covered in detail in a separate post).
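
For quick reference, the standard definitions of these notations, where f(n) is the algorithm's running time and g(n) is a comparison function, are:

    f(n) = O(g(n))   if  f(n) <= c*g(n)               for some c > 0 and all n >= n0        (upper bound)
    f(n) = Ω(g(n))   if  f(n) >= c*g(n)               for some c > 0 and all n >= n0        (lower bound)
    f(n) = Θ(g(n))   if  c1*g(n) <= f(n) <= c2*g(n)   for some c1, c2 > 0 and all n >= n0   (tight bound)
    f(n) = o(g(n))   if  f(n)/g(n) -> 0 as n -> infinity                                    (strictly slower growth)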


[Diagram: relationships between different rates of growth.]
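
The sketch below prints the common growth rates side by side for a few values of n, showing the usual ordering 1 < log n < n < n log n < n^2 < 2^n; the specific values of n are only illustrative.

    import math

    print(f"{'n':>6} {'log n':>8} {'n log n':>12} {'n^2':>14} {'2^n':>26}")
    for n in (10, 20, 40, 80):
        print(f"{n:>6} {math.log2(n):>8.1f} {n * math.log2(n):>12.1f} "
              f"{n * n:>14} {2 ** n:>26}")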

