I'm looking for the basic steps to analyse the time complexity of an algorithm.
There is no single generalized answer; only a few tips can be suggested:
An algorithm that does not contain any loops (for example: write text to the console, get input from the user, write the result to the console, hash a fixed-size value, etc.) is O(1), no matter how many steps it has. The "time" it takes to execute the algorithm is constant (this is what O(1) means), as it does not depend on the size of the input.
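As a minimal sketch of the constant-time case (the function name is my own):

```python
def get_first(items):
    # A single index access: O(1), independent of len(items).
    return items[0]
```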
An algorithm that iterates through a list of items one by one has complexity O(n) (n being the number of items in the list). If it iterates through the list twice in consecutive loops, it is still O(n), since constant factors are dropped: the running time still depends only on the number of items.
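To illustrate the "two consecutive passes are still O(n)" point, a sketch (names are my own):

```python
def count_and_sum(items):
    # First pass: O(n).
    total = 0
    for x in items:
        total += x
    # Second, separate pass: another O(n).
    count = 0
    for _ in items:
        count += 1
    # O(n) + O(n) = O(2n) = O(n): constant factors are dropped.
    return total, count
```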
An algorithm with two nested loops, where the inner loop somehow depends on the outer loop, is in the O(n^2) class; more generally, k nested loops over the input give O(n^k).
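A typical example of nested loops where the inner range depends on the outer index (my own illustrative function):

```python
def has_duplicate(items):
    # The inner loop starts at i + 1, so the total number of
    # comparisons is roughly n*(n-1)/2, which is O(n^2).
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```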
A binary search on a sorted array is in the O(log(n)) class, as the number of candidate items is halved in every step.
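A sketch of binary search showing where the halving happens:

```python
def binary_search(sorted_items, target):
    # Each iteration halves the remaining search range, so the loop
    # runs at most about log2(n) times: O(log n).
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1   # discard the lower half
        else:
            hi = mid - 1   # discard the upper half
    return -1
```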
A divide-and-conquer approach, whether recursive or iterative, is typically O(n log(n)): for example merge sort, or quicksort in the average case (quicksort's worst case is O(n^2)).
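Merge sort is the cleanest illustration of the O(n log n) divide-and-conquer pattern: about log n levels of splitting, with O(n) merge work per level. A sketch:

```python
def merge_sort(items):
    # Divide: split the list in half until sublists have length <= 1.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Conquer: merge two sorted halves in O(n) per level.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```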
While reading about possible time complexities, I found that a time complexity of O(log log n) exists for some algorithms.
Can anyone explain an example that shows how a time complexity of O(log log n) is calculated?
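One way the O(log log n) bound arises is when each step takes the square root of the remaining problem size: squaring down n halves its *exponent*, so only about log2(log2(n)) steps are needed before n drops below 2. A toy sketch of this counting argument (the function name is my own; interpolation search on uniformly distributed data is a real algorithm with this average-case bound):

```python
import math

def steps_of_repeated_sqrt(n):
    # Replacing n with sqrt(n) halves the exponent of n each time
    # (2^k -> 2^(k/2)), so the loop runs O(log log n) times.
    steps = 0
    while n >= 2:
        n = math.sqrt(n)
        steps += 1
    return steps
```

For n = 2^16 this takes only a handful of steps, matching log2(log2(n)) = 4 up to a constant.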
Assume the priority queue in Dijkstra's algorithm is implemented using a sorted linked list, and the graph G(V, E) is represented using an adjacency matrix.
What is the time complexity of Dijkstra's algorithm? (Assume the graph is connected.)
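To make the setup in the question concrete, here is a sketch of Dijkstra's algorithm on an adjacency matrix, using a plain sorted Python list as a stand-in for the sorted linked list priority queue (all names are my own; this illustrates the data structures asked about, not the answer):

```python
def dijkstra_matrix(adj, source):
    # adj[u][v] is the edge weight, or None if there is no edge.
    n = len(adj)
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    visited = [False] * n
    # Priority queue as a sorted list of (distance, vertex):
    # extract-min pops the front; insertion keeps the list sorted.
    queue = [(0, source)]
    while queue:
        d, u = queue.pop(0)          # extract-min from the front
        if visited[u]:
            continue
        visited[u] = True
        for v in range(n):           # scan row u of the matrix
            w = adj[u][v]
            if w is not None and d + w < dist[v]:
                dist[v] = d + w
                queue.append((dist[v], v))
                queue.sort()         # keep the queue sorted
    return dist
```

Counting the cost of the matrix row scans and of keeping the queue sorted on each edge relaxation gives the complexity the question asks for.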
Say we only know the worst-case and best-case complexity of an algorithm (the algorithm itself is not known). Is it possible to derive the average-case complexity from these?