How to find the complexity of an algorithm?

Question added by Sohaib AlZyoud, Senior Software Engineer, Secured Services Systems (SSSIT)
Date Posted: 2013/06/17
by Mahmoud Fathi, Graduate Assistant, New Mexico Highlands University

Algorithmic complexity is concerned with how fast or slow a particular algorithm performs.
We define complexity as a numerical function T(n): running time as a function of the input size n.
We want to describe the time taken by an algorithm without depending on implementation details.
But you will agree that T(n) does depend on the implementation! A given algorithm will take different amounts of time on the same inputs depending on factors such as processor speed, instruction set, disk speed, and compiler.
The way around this is to estimate the efficiency of each algorithm asymptotically.
We will measure the time T(n) as the number of elementary "steps" (defined in any reasonable way), provided each such step takes constant time.
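For instance, here is a minimal sketch in Python of counting "steps" for a linear search (the function and the sample list are hypothetical, chosen only to illustrate step counting):

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    steps = 0  # number of elementary comparisons performed
    for i, value in enumerate(items):
        steps += 1  # one comparison per element examined
        if value == target:
            print(f"found after {steps} steps")
            return i
    print(f"not found after {steps} steps")
    return -1

# Worst case (target absent): the loop makes n comparisons,
# so T(n) = n steps, no matter how fast each comparison runs.
linear_search([4, 8, 15, 16, 23, 42], 99)
```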
Let us consider a classical example: the addition of two integers.
We will add two integers digit by digit (or bit by bit), and this will define a "step" in our computational model.
Therefore, we say that addition of two n-bit integers takes n steps.
Consequently, the total computational time is T(n) = c * n, where c is the time taken by the addition of two bits.
On different computers, the addition of two bits might take different amounts of time, say c1 and c2; thus the addition of two n-bit integers takes T(n) = c1 * n and T(n) = c2 * n respectively.
This shows that different machines result in different slopes, but the time T(n) grows linearly as the input size increases.
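A minimal sketch of this computational model in Python (representing each integer as a list of bits, least significant bit first, a representation chosen here only for illustration):

```python
def add_bits(a, b):
    """Add two integers given as lists of bits, least significant bit first."""
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))  # pad both numbers to n bits
    b = b + [0] * (n - len(b))
    result, carry = [], 0
    for i in range(n):  # exactly n constant-time "steps"
        total = a[i] + b[i] + carry
        result.append(total % 2)
        carry = total // 2
    if carry:
        result.append(carry)
    return result

# 6 (binary 110) + 3 (binary 011), least significant bit first:
print(add_bits([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1], i.e. 9
```

The loop body runs exactly n times and does a constant amount of work per iteration, which is precisely the T(n) = c * n behaviour described above.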
The process of abstracting away details and determining the rate of resource usage in terms of the input size is one of the fundamental ideas in computer science.
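One rough way to check such an estimate in practice, sketched below under the assumption that timing noise is small, is to double the input size and watch how the measured time scales (here using Python's built-in sum as a known linear-time algorithm; the measure harness is hypothetical):

```python
import time

def measure(func, n):
    """Time func on an input of size n."""
    data = list(range(n))
    start = time.perf_counter()
    func(data)
    return time.perf_counter() - start

for n in [100_000, 200_000, 400_000]:
    t = measure(sum, n)  # sum is linear: doubling n should roughly double t
    print(f"n = {n:>7}: {t:.6f} s")
```

The absolute times differ from machine to machine (different constants c), but the ratio between successive timings reveals the growth rate, which is exactly why we reason asymptotically.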
