  1. Difference between O(log n) and O(n log n) - Stack Overflow

    Apr 27, 2019 · You still need to study a lot. O(..) describes the complexity of your algorithm. As a simple intuition, you can think of it as the time it takes your algorithm to finish for an input of size n: with O(n) it finishes in n seconds, with O(log n) in log n seconds, and with O(n log n) in n·log n seconds. O(1) means the cost of your algorithm is constant no matter how big n is.
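
    A minimal sketch of that intuition (hypothetical, not from the answer): tabulating the "seconds" each class would take under the analogy for a few input sizes n.

        import math

        # Under the answer's analogy: O(1) ~ 1 "second", O(log n) ~ log2(n),
        # O(n) ~ n, O(n log n) ~ n * log2(n). Only the growth rates matter.
        for n in (16, 1_024, 1_000_000):
            print(f"n={n:>9,}  O(1)={1:>3}  O(log n)={math.log2(n):>6.1f}  "
                  f"O(n)={n:>9,}  O(n log n)={n * math.log2(n):>13,.0f}")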

  2. Examples of Algorithms which has O(1), O(n log n) and O(log n ...

    Oct 20, 2009 · O(log n) - finding something in your telephone book. Think binary search. O(n) - reading a book, where n is the number of pages; it is the minimum amount of time it takes to read the whole book. O(n log n) - can't immediately think of something one might do every day that is n log n... unless you sort cards using merge sort or quicksort!
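
    A minimal binary-search sketch of the "telephone book" idea, assuming a list of names that is already sorted alphabetically (the entries and helper name are made up for illustration):

        import bisect

        def find_name(sorted_names, target):
            """Binary search: O(log n) comparisons on a sorted list."""
            i = bisect.bisect_left(sorted_names, target)
            return i if i < len(sorted_names) and sorted_names[i] == target else -1

        phone_book = sorted(["Ada", "Grace", "Linus", "Tim"])  # hypothetical entries
        print(find_name(phone_book, "Grace"))                  # index of the match, or -1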

  3. Why is O(n) better than O(n log(n))? - Stack Overflow

    Jul 9, 2020 · O(n) denotes linear time complexity. Operations with O(n) complexity grow linearly with the size of the input. O(n log n) signifies linearithmic time complexity. It grows in proportion to n·log n, where n is the size of the input. O(1) < O(log n) < O(n) < O(n log n) is correct.
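
    A concrete (hypothetical) illustration of that ordering using everyday Python operations, one per complexity class:

        import bisect, random

        data = list(range(1_000_000))            # sorted list
        table = {x: True for x in data}          # hash table over the same keys

        hit = table[123_456]                     # O(1): dict lookup
        pos = bisect.bisect_left(data, 123_456)  # O(log n): binary search on the sorted list
        found = 123_456 in data                  # O(n): linear scan of the list
        shuffled = random.sample(data, len(data))
        ordered = sorted(shuffled)               # O(n log n): comparison sort of random data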

  4. O(n log n) vs O(n) -- practical differences in time complexity

    It could be because lower-order terms are dominating, or because in the average case the O(n log n) algorithm is actually O(n), or because the actual number of steps is something like 5,000,000·n vs. 3·n·log n.
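
    A quick check of that last point (the constants are the snippet's own; the crossover arithmetic is mine): 5,000,000·n only drops below 3·n·log2 n once log2 n exceeds 5,000,000/3, i.e. for n around 2^1,666,667, so the "slower" O(n log n) algorithm wins at every realistic input size.

        import math

        def linear_steps(n):   # hypothetical O(n) algorithm with a huge constant
            return 5_000_000 * n

        def nlogn_steps(n):    # hypothetical O(n log n) algorithm with a small constant
            return 3 * n * math.log2(n)

        for n in (10**3, 10**6, 10**9):
            print(n, linear_steps(n), round(nlogn_steps(n)))
        # The O(n) count only becomes smaller when log2(n) > 5_000_000 / 3,
        # i.e. n > 2**1_666_666 -- far beyond anything you could ever store.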

  5. algorithm - What does O(log n) mean exactly? - Stack Overflow

    Feb 22, 2010 · O(n): Find all people whose phone numbers contain the digit "5". O(n): Given a phone number, find the person or business with that number. O(n log n): There was a mix-up at the printer's office, and our phone book had all its pages inserted in a random order. Fix the ordering so that it's correct by looking at the first name on each page and ...
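
    A minimal merge-sort sketch of that page-reordering task, assuming each page is represented by the first name printed on it (the names are made up):

        def merge_sort(pages):
            """O(n log n): about log n levels of splitting, O(n) merging work per level."""
            if len(pages) <= 1:
                return pages
            mid = len(pages) // 2
            left, right = merge_sort(pages[:mid]), merge_sort(pages[mid:])
            merged, i, j = [], 0, 0
            while i < len(left) and j < len(right):
                if left[i] <= right[j]:
                    merged.append(left[i]); i += 1
                else:
                    merged.append(right[j]); j += 1
            return merged + left[i:] + right[j:]

        shuffled_pages = ["Meyer", "Adams", "Zhou", "Baker"]  # hypothetical first names
        print(merge_sort(shuffled_pages))                     # ['Adams', 'Baker', 'Meyer', 'Zhou']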

  6. Why is this algorithm O(n log n)? - Stack Overflow

    Sep 2, 2016 · Perhaps the easiest way to convince yourself of the O(n·lg n) running time is to run the algorithm on a sheet of paper. Consider what happens when n is 64. Then the outer loop variable k would take the following values:
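
    The snippet is cut off before the loop itself appears, but a common shape that gives this bound (assumed here, not necessarily the question's exact code) is an outer loop whose variable doubles each time, wrapped around an O(n) inner loop:

        import math

        def count_steps(n):
            steps = 0
            k = 1
            while k <= n:           # k = 1, 2, 4, ..., n  ->  about log2(n) + 1 iterations
                for _ in range(n):  # O(n) work per outer iteration
                    steps += 1
                k *= 2
            return steps

        n = 64
        print(count_steps(n), n * (math.log2(n) + 1))  # 448 vs 448: steps ~ n * log n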

  7. Difference between O(n) and O(log(n)) - which is better and what ...

    Apr 29, 2012 · Case where O(log n) outperforms O(1): Let us assume hypothetically that the function show takes 1 ms to execute. Then for n=2, Code 1 will take 4 ms to execute whereas Code 2 will take just 1 ms. In this case, O(log n) outperformed O(1). Case where O(1) outperforms O(log n): As we increase the input size n, O(1) will outperform O(log n ...
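
    The snippet's Code 1 and Code 2 are not shown; a hypothetical pair consistent with its numbers (show() costs 1 ms, Code 1 always calls it 4 times, Code 2 calls it log2(n) times) would look like:

        import math

        def show():
            pass                    # stands in for the 1 ms operation in the snippet

        def code_1(n):              # O(1): always 4 calls, regardless of n
            for _ in range(4):
                show()

        def code_2(n):              # O(log n): about log2(n) calls
            for _ in range(int(math.log2(n))):
                show()

        # n = 2: code_1 does 4 "ms" of work, code_2 only 1 -- O(log n) wins.
        # n = 1_000_000: code_1 still does 4, code_2 about 20 -- O(1) wins.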

  8. algorithm - n log n is O(n)? - Stack Overflow

    Oct 20, 2011 · n^(lg 3) is not O(n). It outgrows O(n)... In fact, any exponent on n that is larger than 1 results in an asymptotically longer running time than O(n). Since lg 3 is about 1.58, as long as you subtract less than 0.58 from the exponent it is still asymptotically greater than O(n).
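
    A quick numeric check of that claim (the exponent 1.585 ≈ lg 3 is from the answer; the sample sizes are arbitrary): the ratio n^(lg 3) / n = n^0.585 keeps growing, so no constant c makes n^(lg 3) ≤ c·n.

        for n in (10, 10**3, 10**6, 10**9):
            print(n, round(n ** 1.585 / n, 1))  # ratio n^(lg 3) / n = n^0.585, unbounded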

  9. Intuitive explanation for why QuickSort is n log n?

    May 3, 2012 · Break the sorting algorithm into two parts: first the partitioning, and second the recursive calls. The complexity of partitioning is O(N), and the depth of the recursion in the ideal case is O(log N). For example, if you have 4 inputs then there will be 2 (= log 4) levels of recursive calls. Multiplying the two gives O(N log N). It is a very basic explanation.
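
    A minimal quicksort sketch matching that explanation: each level does O(N) partitioning work, and in the ideal (balanced) case there are about log N levels of recursion.

        def quicksort(items):
            if len(items) <= 1:
                return items
            pivot = items[len(items) // 2]
            # Partitioning: O(N) work at this level of the recursion.
            smaller = [x for x in items if x < pivot]
            equal   = [x for x in items if x == pivot]
            larger  = [x for x in items if x > pivot]
            # Ideal case: each side is ~N/2, so the recursion is ~log N levels deep.
            return quicksort(smaller) + equal + quicksort(larger)

        print(quicksort([7, 1, 4, 2]))  # [1, 2, 4, 7]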

  10. O(N log N) Complexity - Similar to linear? - Stack Overflow

    Nov 19, 2015 · I don't mean graphically similar (to a straight line), but similar in time complexity. O(n log n) time can easily be an order of magnitude bigger than O(n). If the graphs compared O(n log n) and O(n) algorithms you would see what I mean. :) As N gets bigger and bigger, the O(n log n) curve moves to the next logarithmic scale.
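
    A small check of that "order of magnitude" remark (the sizes are chosen arbitrarily): the gap between n·log2 n and n is exactly the factor log2 n, which already exceeds 10 around n ≈ 1,000 and reaches about 20 at n = 1,000,000.

        import math

        for n in (1_000, 1_000_000, 1_000_000_000):
            print(n, round(math.log2(n), 1))  # the factor separating n*log2(n) from n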
