en.wikipedia.org

Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. Big O is a member of a family of notations invented by Paul Bachmann,[1] Edmund Landau,[2] and others, collectively called Bachmann–Landau notation or asymptotic notation.

www.khanacademy.org

rob-bell.net

About Big O notation

a method for describing an upper bound on the number of computations an algorithm requires as its input size grows; it is most often used to express worst-case complexity
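The idea above can be made concrete by counting operations. A minimal sketch (function names are hypothetical, chosen for illustration): linear search does O(n) comparisons in the worst case, while binary search on sorted data does O(log n).

```python
def linear_search(items, target):
    """O(n): in the worst case every element is examined."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """O(log n): each step halves the remaining sorted range."""
    lo, hi, comparisons = 0, len(items) - 1, 0
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(1024))
# Worst case: the target is absent, forcing maximum work.
_, lin = linear_search(data, -1)   # checks all 1024 elements
_, log = binary_search(data, -1)   # about log2(1024) = 10 comparisons
print(lin, log)
```

Doubling the input size doubles the linear count but adds only one comparison to the binary count, which is exactly the growth behavior Big O captures.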