Algorithm design

This subfield focuses on creating new algorithms that are efficient and effective for solving specific problems, such as sorting, searching, and graph traversal.

Time and space complexity: A fundamental concern in algorithm design that measures how much time and memory an algorithm needs to solve a problem, usually expressed as a function of the input size (for example, in big-O notation).
Sorting algorithms: Algorithms that arrange the elements of a collection into a specified order; each makes different trade-offs in running time, memory use, and stability. Commonly studied examples include Bubble sort, Selection sort, Insertion sort, Merge sort, and Quick sort.
Searching algorithms: Algorithms used to find a particular element within a data structure. Common examples include Linear search, Binary search, and Interpolation search.
Recursion: A technique in which a function solves a problem by calling itself on smaller instances of the same problem until a base case is reached.
Data structures: Data structures are used to store and organize data in a specific way, with the goal of making it easier and faster to access the data when needed. Some of the most commonly used data structures include arrays, linked lists, stacks, heaps, and queues.
Dynamic programming: A technique in which a problem is broken down into overlapping sub-problems; each sub-problem is solved only once and its solution is stored (memoized) for reuse when the same sub-problem recurs.
Graph algorithms: Algorithms that operate on graphs, for example finding shortest paths (Dijkstra's algorithm), traversing vertices with Breadth-First Search (BFS) or Depth-First Search (DFS), detecting cycles, and finding a minimum spanning tree.
Divide and conquer: A technique in which a problem is broken down into smaller sub-problems that are solved independently, usually recursively, and whose solutions are then combined to form the solution to the original problem.
Greedy algorithms: A technique in which the solution is built incrementally, making the locally optimal choice at each step in the hope of reaching a globally optimal solution.
Bit manipulation: A technique that operates on individual bits in the binary representation of data. Bit manipulation is often used in cryptography, network programming, and other low-level or performance-critical areas.
Backtracking: A technique for solving problems by incrementally building candidate solutions and abandoning ("backtracking" from) a partial candidate as soon as it cannot satisfy the problem's constraints.
Hashing: A technique for efficient data storage and retrieval in which a hash function maps keys to positions in a table.
Parallel algorithms: Algorithms designed to run on parallel processing systems.
Computational geometry: Algorithms designed to solve geometric problems, such as finding the intersection of two lines, computing the area of a polygon, or finding the convex hull of a set of points.
Randomized algorithms: Algorithms that incorporate randomness in their design, often to improve expected performance, simplify the algorithm, or obtain approximate solutions to problems that are hard to solve deterministically.
String algorithms: Algorithms designed to solve problems on strings and text, such as pattern matching, computing edit distance, and finding the longest common subsequence or substring of two strings.
Branch and bound: A technique that explores a tree of partial solutions and prunes branches that provably cannot lead to a better solution than the best one found so far.
Approximation algorithms: Algorithms that provide a provably close-to-optimal solution to a problem that is too difficult to solve optimally in reasonable time.
Heuristic algorithms: Algorithms that use heuristics, or rules of thumb, to find a solution that is reasonable but not necessarily optimal.
Linear programming algorithms: Algorithms that solve optimization problems in which the objective function and the constraints are linear.
Network flow algorithms: Algorithms that solve problems involving the flow of a commodity, such as water or electricity, through a network of interconnected nodes and edges.
Numerical algorithms: Algorithms for problems in numerical analysis, such as finding the roots of an equation or approximating an integral.
Short Python sketches illustrating several of these techniques (binary search, dynamic programming, divide and conquer, a greedy choice, backtracking, bit manipulation, randomized selection, and graph traversal) appear after this list.
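
The sketches below use Python for concreteness; all function names, parameter names, and sample data are illustrative choices rather than material drawn from the text above. First, a minimal binary search, the classic example of a searching algorithm over sorted data:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it is absent.

    Assumes sorted_items is sorted in ascending order; runs in O(log n) time.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # the target can only lie in the right half
        else:
            hi = mid - 1   # the target can only lie in the left half
    return -1

print(binary_search([2, 5, 7, 11, 13, 17], 11))  # 3
```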
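
A small dynamic-programming sketch: the Fibonacci numbers computed top-down with memoization, so each overlapping sub-problem is solved only once (functools.lru_cache is one convenient way to memoize in Python):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """n-th Fibonacci number, computed top-down with memoization."""
    if n < 2:
        return n            # base cases: fib(0) = 0, fib(1) = 1
    # Each sub-problem fib(k) is computed once and cached for later reuse.
    return fib(n - 1) + fib(n - 2)

print(fib(40))  # 102334155, in linear rather than exponential time
```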
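
A divide-and-conquer sketch: merge sort splits the input in half, sorts each half recursively, and combines the sorted halves. It also illustrates one of the sorting algorithms named above:

```python
def merge_sort(values):
    """Sort a list in O(n log n) time by divide and conquer."""
    if len(values) <= 1:
        return values                        # base case: already sorted
    mid = len(values) // 2
    left = merge_sort(values[:mid])          # divide: sort each half recursively
    right = merge_sort(values[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # combine: merge the sorted halves
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```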
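
A greedy sketch: the classic activity-selection problem, where repeatedly taking the interval that finishes earliest among the compatible ones yields an optimal answer (the (start, end) tuple format is an illustrative assumption):

```python
def select_activities(intervals):
    """Choose a maximum set of non-overlapping (start, end) intervals.

    Greedy rule: always take the compatible interval that finishes earliest.
    """
    chosen, last_end = [], float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):  # sort by finish time
        if start >= last_end:      # locally optimal choice: earliest compatible finish
            chosen.append((start, end))
            last_end = end
    return chosen

print(select_activities([(1, 3), (2, 5), (4, 7), (6, 8), (5, 9)]))  # [(1, 3), (4, 7)]
```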
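
A backtracking sketch: counting solutions to the n-queens puzzle by placing one queen per row and abandoning any partial placement that violates the constraints:

```python
def count_n_queens(n, placed=()):
    """Count ways to place n non-attacking queens, one per row, by backtracking."""
    row = len(placed)
    if row == n:
        return 1                                      # every row filled: one valid solution
    total = 0
    for col in range(n):
        # Extend the partial solution only if column col attacks no earlier queen.
        if all(col != c and abs(col - c) != row - r for r, c in enumerate(placed)):
            total += count_n_queens(n, placed + (col,))
        # Otherwise this candidate is abandoned and the next column is tried (backtracking).
    return total

print(count_n_queens(6))  # 4 solutions
```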
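
Two common bit-manipulation idioms, both relying on the fact that x & (x - 1) clears the lowest set bit of x:

```python
def is_power_of_two(x):
    """A positive power of two has exactly one set bit, so x & (x - 1) is zero."""
    return x > 0 and (x & (x - 1)) == 0

def popcount(x):
    """Count set bits with Kernighan's trick: each step clears the lowest set bit."""
    count = 0
    while x:
        x &= x - 1
        count += 1
    return count

print(is_power_of_two(64), is_power_of_two(48))  # True False
print(popcount(0b10110))                         # 3
```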
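
A randomized-algorithm sketch: quickselect with a random pivot finds the k-th smallest element in expected linear time regardless of how the input is ordered:

```python
import random

def quickselect(values, k):
    """Return the k-th smallest element (0-indexed) of a non-empty list."""
    pivot = random.choice(values)                     # random pivot: expected O(n) overall
    smaller = [v for v in values if v < pivot]
    equal = [v for v in values if v == pivot]
    larger = [v for v in values if v > pivot]
    if k < len(smaller):
        return quickselect(smaller, k)                # answer lies among the smaller values
    if k < len(smaller) + len(equal):
        return pivot                                  # answer equals the pivot
    return quickselect(larger, k - len(smaller) - len(equal))

print(quickselect([7, 2, 9, 4, 11, 4], 2))  # 4, the third-smallest value
```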
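
Finally, a graph-traversal sketch: Breadth-First Search computes the minimum number of edges from a start node to every reachable node; the adjacency-list dictionary format is an illustrative assumption:

```python
from collections import deque

def bfs_distances(graph, start):
    """Shortest edge counts from start to every reachable node of an unweighted graph.

    graph is an adjacency-list dictionary mapping each node to its neighbours.
    """
    distances = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, []):
            if neighbour not in distances:            # each node is visited at most once
                distances[neighbour] = distances[node] + 1
                queue.append(neighbour)
    return distances

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs_distances(graph, "A"))  # {'A': 0, 'B': 1, 'C': 1, 'D': 2}
```
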
"In mathematics and computer science, an algorithm ( ) is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation."
"Algorithms are used as specifications for performing calculations and data processing."
"More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), achieving automation eventually."
"Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as 'memory,' 'search,' and 'stimulus'."
"In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result."
"As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function."
"Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing 'output' and terminating at a final ending state."
"The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input."
"More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making)."
"More advanced algorithms [...] deduce valid inferences (referred to as automated reasoning)."
"Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing."
"Using terms such as 'memory,' 'search,' and 'stimulus'."
"In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results."
"Especially in problem domains where there is no well-defined correct or optimal result."
"As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function."
"Starting from an initial state and initial input (perhaps empty)..."
"...the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states..."
"...eventually producing 'output'..."
"...terminating at a final ending state."
"The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input."