
Time Complexity

Time complexity is a way to measure how fast an algorithm is. When we talk about the time complexity of an algorithm, we're trying to figure out how long it takes for the algorithm to finish running. We do this by counting the number of steps or operations the algorithm takes.

The way we measure time complexity is by using something called Big O notation. Big O notation is a way to write down how the number of steps an algorithm takes changes as the size of the input gets bigger. We write the letter "O" followed by an expression in parentheses, such as O(n) or O(1).

For example, let's say we have an algorithm that looks through a list of numbers and finds the biggest one. If the list has 5 numbers, the algorithm might take 5 steps to find the biggest number. If the list has 10 numbers, it might take 10 steps. In this case, the time complexity of the algorithm is O(n), where n is the number of items in the list. This means that as the number of items in the list gets bigger, the number of steps the algorithm takes also gets bigger.

def find_biggest(arr):
    biggest = arr[0]
    for num in arr:
        if num > biggest:
            biggest = num
    return biggest

arr = [5, 10, 15, 20, 25, 30]
biggest = find_biggest(arr)
print("The biggest number in the list is:", biggest)

As another example, let's say we have an algorithm that checks whether a number is prime.

def is_prime(n):
    if n <= 1:
        return False
    for i in range(2, int(n**(1/2)) + 1):
        if n % i == 0:
            return False
    return True

n = 11
result = is_prime(n)
if result:
    print(n, "is a prime number.")
else:
    print(n, "is not a prime number.")

In this example, we define a function called "is_prime" that takes an integer as input. The function first checks if the input number is less than or equal to 1 and returns False if it is. If the input number is greater than 1, the function uses a for loop to iterate through all the numbers from 2 up to and including the square root of the input number. For each iteration, the function checks if the input number is divisible by the current number, and if it is, it returns False. If the function completes the for loop without finding any divisors, it returns True, indicating that the input number is prime.

The time complexity of this algorithm is O(sqrt(n)): the for loop runs at most about sqrt(n) times, and each iteration performs a constant amount of work (a single divisibility check). The running time therefore still grows as the input gets bigger, but much more slowly than it would if we checked every number up to n.

It's important to note that this algorithm is not the most efficient way to check for prime numbers, and it becomes slow for very large numbers. But for small numbers it works well.

In summary, the above Python code is an algorithm that checks if a number is prime or not. The only part of the algorithm that depends on the size of the input is the loop from 2 up to the square root of n, so the time complexity is O(sqrt(n)).
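To see that growth in practice, here is a small sketch (an addition to this article, using a hypothetical count_prime_iterations helper) that counts how many loop iterations the same check performs for a few prime inputs; the counts grow roughly with the square root of n rather than with n:

def count_prime_iterations(n):
    # Same loop as is_prime above, but counting iterations instead of just returning.
    iterations = 0
    if n <= 1:
        return iterations
    for i in range(2, int(n**(1/2)) + 1):
        iterations += 1
        if n % i == 0:
            break
    return iterations

# For prime inputs the loop runs all the way up to sqrt(n),
# so the counts grow roughly like the square root of n.
for n in [101, 10007, 1000003]:
    print(n, "->", count_prime_iterations(n), "iterations")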

It's important to note that measuring time complexity is not always easy and straightforward. An algorithm may have multiple nested loops and perform different kinds of operations; in such cases we consider the worst-case scenario and calculate the time complexity accordingly.
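As a rough illustration of that point (a sketch added here, not taken from the original article), the function below uses two nested loops to look for a duplicate in a list; in the worst case every pair of elements is compared, which is about n*(n-1)/2 comparisons, i.e. O(n^2):

def has_duplicate(arr):
    # Two nested loops: in the worst case (no duplicate found) every pair is compared.
    n = len(arr)
    for i in range(n):
        for j in range(i + 1, n):
            if arr[i] == arr[j]:
                return True
    return False

print(has_duplicate([3, 1, 4, 1, 5]))  # True, may stop early
print(has_duplicate([3, 1, 4, 5, 9]))  # False, worst case: all pairs checked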

Here is a simple Python example that measures the execution time of an O(n) algorithm. Timing a run like this does not calculate the time complexity for you, but it can illustrate how the running time grows as the input gets bigger:

import time

def algorithm(n):
    start_time = time.time()
    # algorithm code here
    for i in range(n):
        print(i)
    end_time = time.time()
    print("Time complexity: O(n)")
    print("Execution time:", end_time - start_time)

algorithm(5)

Binary Search

Binary search is an efficient algorithm for searching through a sorted list of elements. The basic idea behind binary search is to divide the list in half, and repeatedly divide the remaining portion in half until the desired element is found. The time complexity of binary search is O(log n), where n is the number of elements in the list.

The reason why the time complexity of binary search is O(log n) is that, with each iteration, the algorithm eliminates half of the remaining elements from consideration. So, if the list has 8 elements, the search range shrinks to 4 elements after the first iteration, 2 after the second, and 1 after the third. This means that, in the worst case, the algorithm takes about log2(8) = 3 iterations to find the desired element.
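To make the halving concrete before looking at the full implementation, here is a tiny sketch (added here, not part of the original article) that simply counts how many times a list of a given size can be cut in half:

import math

def count_halvings(n):
    # Count how many times n can be halved before only one element is left.
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

for size in [8, 1024, 1048576]:
    print(size, "elements ->", count_halvings(size), "halvings, log2 =", int(math.log2(size)))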

Here is an example of binary search implemented in Python:

def binary_search(arr, x):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == x:
            return mid
        elif arr[mid] < x:
            low = mid + 1
        else:
            high = mid - 1
    return -1

arr = [1, 2, 3, 4, 5, 6, 7, 8]
x = 5
result = binary_search(arr, x)
if result != -1:
    print("Element is present at index", result)
else:
    print("Element is not present in array")

In this example, we first initialize the low and high pointers to the indices of the first and last elements of the list, respectively. We then repeatedly divide the remaining portion of the list in half until the desired element is found or the search range becomes empty.

As we can see, the number of iterations it takes to find the element is dependent on the log of the number of elements, which makes it an efficient algorithm in terms of time complexity.
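As a rough check (a sketch added here, using a hypothetical binary_search_counting variant rather than anything from the original article), we can count the iterations directly; going from 8 elements to a million elements only raises the iteration count to about 20:

def binary_search_counting(arr, x):
    # Same algorithm as binary_search above, but also returns the iteration count.
    low = 0
    high = len(arr) - 1
    iterations = 0
    while low <= high:
        iterations += 1
        mid = (low + high) // 2
        if arr[mid] == x:
            return mid, iterations
        elif arr[mid] < x:
            low = mid + 1
        else:
            high = mid - 1
    return -1, iterations

for size in [8, 1000, 1000000]:
    arr = list(range(size))
    _, iterations = binary_search_counting(arr, size - 1)  # search for the last element
    print(size, "elements ->", iterations, "iterations")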

It is also important to note that, for binary search to work, the input list must be sorted; if it is not, the algorithm will not return correct results.

In summary, binary search is an efficient algorithm for searching through a sorted list of elements. Its time complexity is O(log n), where n is the number of elements in the list, because each iteration eliminates half of the remaining elements from consideration. The input list must be sorted for binary search to work correctly.

Bubble Sort

One example of a sorting algorithm in Python is the bubble sort algorithm. Bubble sort is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements and swaps them if they are in the wrong order. The pass through the list is repeated until the list is sorted.

Here is an example of bubble sort implemented in Python:

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]
    return arr

arr = [64, 34, 25, 12, 22, 11, 90]
bubble_sort(arr)
print("Sorted array is:", arr)

In this example, we first define a function called "bubble_sort" that takes in an array as input. The function uses nested loops to iterate through the array, with the outer loop going through the entire array and the inner loop comparing adjacent elements and swapping them if they are in the wrong order. The outer loop makes n iterations, where n is the length of the array, and each iteration reduces the number of elements that need to be compared by 1, as the largest remaining element "bubbles up" to the end of the array.

The time complexity of bubble sort is O(n^2), which means that the number of steps it takes to sort an array of n elements is proportional to the square of n. This makes bubble sort inefficient for large arrays. However, bubble sort has the advantage of being easy to understand and implement, and it's a good choice for small arrays or for educational purposes.
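To see the quadratic growth directly, here is a small sketch (added here, with a hypothetical bubble_sort_counting variant, not part of the original article) that counts the comparisons bubble sort performs; doubling the size of the array roughly quadruples the count:

import random

def bubble_sort_counting(arr):
    # Same algorithm as bubble_sort above, but also counts the comparisons made.
    n = len(arr)
    comparisons = 0
    for i in range(n):
        for j in range(0, n - i - 1):
            comparisons += 1
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]
    return comparisons

for size in [100, 200, 400]:
    arr = [random.randint(0, 1000) for _ in range(size)]
    print(size, "elements ->", bubble_sort_counting(arr), "comparisons")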

In summary, bubble sort is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements and swaps them if they are in the wrong order. The pass through the list is repeated until the list is sorted. Its time complexity is O(n^2), which makes it inefficient for large arrays, but it is a good choice for small arrays or for educational purposes.


Original Link: https://dev.to/abbhiishek/time-complexity-41a1
