Big-O Notation


Big-O notation is a mathematical notation that describes how an algorithm's running time or memory usage grows as the size of its input grows. Rather than measuring exact execution time, it gives an upper bound on the rate of growth, which makes it possible to compare the efficiency and scalability of algorithms independently of hardware or implementation details.
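
As a minimal sketch (Python is chosen here for illustration, and the function names are hypothetical, not taken from the source), the following shows how different loop structures lead to different growth rates:

    def constant_time(items):
        # O(1): the work does not depend on the size of the input list.
        return items[0] if items else None

    def linear_time(items, target):
        # O(n): in the worst case, every element is inspected once.
        for item in items:
            if item == target:
                return True
        return False

    def quadratic_time(items):
        # O(n^2): every element is paired with every other element,
        # so doubling the input roughly quadruples the work.
        pairs = []
        for a in items:
            for b in items:
                pairs.append((a, b))
        return pairs

Doubling the input size leaves constant_time unchanged, roughly doubles the work in linear_time, and roughly quadruples it in quadratic_time, which is exactly the kind of scaling behavior Big-O notation is meant to capture.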