In computer science, algorithms perform many functions that add value to a business, an organization, or a user. To decide which one to use, we have to understand how each algorithm performs in terms of space and time. By determining an algorithm's best- and worst-case performance, we can make an informed judgement about which one fits our needs.
Big O notation helps us describe the worst-case performance of a particular algorithm. It's a rough estimate of what to expect as the input grows. Formally, a function f(n) is O(g(n)) if there exist constants c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0 — in other words, beyond a certain input size, g(n) (scaled by a constant) always bounds our analyzed function from above.
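To see why the growth rate matters more than the exact step count, here is a minimal sketch (the function names and the comparison-counting approach are illustrative, not from the original text) comparing a linear search, whose worst case is O(n), against a binary search on sorted data, whose worst case is O(log n):

```python
def linear_search(items, target):
    """Return (index, comparisons). Worst case checks every element: O(n)."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """Return (index, comparisons) on a sorted list.

    The search interval halves each step, so the worst case is O(log n).
    """
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, comparisons
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

# Search for a value that is absent, forcing the worst case for both.
data = list(range(1000))
_, linear_steps = linear_search(data, -1)
_, binary_steps = binary_search(data, -1)
print(linear_steps, binary_steps)
```

On 1,000 elements the linear search needs 1,000 comparisons in the worst case, while the binary search needs about 10 — and doubling the input adds roughly one more step to the binary search but 1,000 more to the linear one. That difference in growth rate is exactly what Big O notation captures.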