Big-O notation sounds like a scary concept. I think it’s the mathy word ‘notation’ that makes it seem so, but it’s actually not that difficult to wrap your mind around.
Defining Big-O Notation
Big-O notation is used to classify algorithms by how they respond (e.g., the time it takes to process) to changes in input size.
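One way to see "how an algorithm responds to input size" is to count basic operations directly. This is a minimal sketch (not from the original post): the function names and the operation-counting approach are illustrative assumptions, comparing a linear O(n) pass against a quadratic O(n²) pair-wise loop.

```python
def count_linear(items):
    # O(n): touches each item exactly once,
    # so the operation count equals the input size.
    ops = 0
    for _ in items:
        ops += 1
    return ops


def count_quadratic(items):
    # O(n^2): examines every pair of items,
    # so the operation count is the input size squared.
    ops = 0
    for _ in items:
        for _ in items:
            ops += 1
    return ops


for n in (10, 100):
    data = list(range(n))
    print(n, count_linear(data), count_quadratic(data))
# 10   ->    10 linear ops,    100 quadratic ops
# 100  ->   100 linear ops, 10,000 quadratic ops
```

Notice that growing the input 10x grows the linear count 10x but the quadratic count 100x; Big-O captures exactly that growth pattern, ignoring constant factors.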