Introduction to Data Structures and Algorithms

Resources

```mermaid
graph TD
A[01. Arrays] --> B[Two Pointers]
Z[Hashing] --> B
A --> C[Stack]

B --> D[Binary Search]
B --> E[Sliding Window]
B --> F[02. Linked Lists]

D & F --> G[Trees]

G --> H[Tries]
G --> I[Heap / Priority Queue]
G --> J[Backtracking]

I --> K[Intervals]
I --> L[Greedy]

J --> M[Graphs]
J --> N[1-D DP]

I & M --> O[Advanced Graphs]
M --> P[2-D DP]

N --> P
N --> Q[Bit Manipulation]

M & Q --> R[Math & Geometry]

class A internal-link;
class F internal-link;
```

Related Topics

  • [[01. Arrays | Arrays]]
  • [[01. Arrays#3. Stacks | Stack]]
  • [[02. Linked Lists | Linked List]]

1. Definition

  1. Data Structures: different ways of storing data in computer memory (RAM) so that it can be accessed and manipulated efficiently. Examples: stacks, queues, arrays, linked lists, etc.
  • The main idea behind using data structures is to minimize time and space complexity.
  • An efficient data structure takes minimal memory space and requires minimal time to process the data.
  2. Algorithms: sets of step-by-step instructions for performing operations on data structures.
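To make the distinction concrete, here is a minimal sketch: a stack (a data structure) built on a Python list, and a linear search (an algorithm) operating on an array. The function name `linear_search` is illustrative, not from the original notes.

```python
# Data structure: a stack, sketched on top of a Python list.
stack = []
stack.append(1)    # push
stack.append(2)    # push
top = stack.pop()  # pop -> 2 (last in, first out)

# Algorithm: step-by-step instructions over a data structure,
# here a linear search through an array (Python list).
def linear_search(arr, target):
    for i, value in enumerate(arr):
        if value == target:
            return i  # index of the first match
    return -1         # not found

linear_search([4, 8, 15, 16], 15)  # -> 2
```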

2. Complexity

The complexity of an algorithm is a measure of the amount of time and/or space required by an algorithm for an input of a given size n.

  • The term complexity almost always refers to the worst-case complexity → the highest/longest possible time/space taken by the algorithm to process an input.
  • Hence, we always design our program keeping the worst case in mind.
  1. Time complexity is the amount of time required to execute the code.
  2. Space complexity is the amount of memory required to successfully execute the functionalities of the code.
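The two measures can differ for the same task. As a small illustrative sketch (function names are my own), summing a list iteratively uses O(n) time but only O(1) extra space, while a recursive version uses O(n) time *and* O(n) space because each call adds a stack frame:

```python
def total_iterative(nums):
    """O(n) time, O(1) extra space: one pass, one accumulator."""
    total = 0
    for x in nums:
        total += x
    return total

def total_recursive(nums, i=0):
    """O(n) time, O(n) space: one call-stack frame per element."""
    if i == len(nums):
        return 0
    return nums[i] + total_recursive(nums, i + 1)
```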

Big-O Notation (O)

Big O notation is a mathematical way to express the upper bound (biggest, highest, worst-case) of an algorithm's time complexity (or sometimes space complexity).

  • Common Big O notations include:
    • O(1): Constant time (independent of input size, doesn’t grow when input size grows)
    • O(log n): Logarithmic time (increases slowly with input size)
    • O(n): Linear time (proportional to input size)
    • O(n log n): Log-linear time (balances logarithmic and linear growth)
    • O(n^2): Quadratic time (grows much faster with input size) - avoid for large datasets where possible
    • O(2^n): Exponential time (extremely rapid growth with input size) - generally undesirable
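A rough sketch of what some of these classes look like in code (the function names here are illustrative, not standard APIs):

```python
def get_first(arr):
    """O(1): one operation, regardless of len(arr)."""
    return arr[0]

def binary_search(arr, target):
    """O(log n): halves the (sorted) search space each step."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def contains(arr, target):
    """O(n): may scan every element once."""
    return any(x == target for x in arr)

def has_duplicate_pair(arr):
    """O(n^2): nested loops compare every pair of elements."""
    return any(arr[i] == arr[j]
               for i in range(len(arr))
               for j in range(i + 1, len(arr)))
```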

A common misconception is that O(1) always means fast. It does not: an algorithm could perform 1000 operations and still be O(1). What matters is growth: if the number of operations does not increase as the size of the input grows, the algorithm is O(1).
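A minimal sketch of that point (function names are my own): the first function below does up to 1000 additions no matter how long the list is, so it is O(1); the second does one addition per element, so it is O(n) even though it is cheaper on small inputs.

```python
def sum_first_1000(nums):
    # Up to 1000 additions regardless of len(nums):
    # many operations, but the count doesn't grow with n -> O(1).
    return sum(nums[:1000])

def sum_all(nums):
    # One addition per element: work grows linearly with n -> O(n).
    return sum(nums)
```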