Data Structures & Algorithms (PCC-CS301) question and answers:

  1. Data structure: A way to store and organize data to perform operations efficiently.

  2. Array vs. linked list: Arrays are fixed-size, contiguous memory structures; linked lists are dynamic, with nodes linked by pointers.

  3. Stack: A Last In, First Out (LIFO) structure with push and pop operations.
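
     A minimal sketch in Python, using a built-in list as the stack (an assumption for illustration; the course may implement it over a C array):

```python
# Stack via a Python list: append = push, pop = pop (both O(1) amortized).
stack = []
stack.append(10)   # push 10
stack.append(20)   # push 20
top = stack.pop()  # pop -> 20 (last in, first out)
print(top)         # 20
```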

  4. Queue: A First In, First Out (FIFO) structure, with enqueue to add and dequeue to remove elements.
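
     A sketch using Python's `collections.deque`, which gives O(1) operations at both ends (a plain list would make dequeue O(n)):

```python
from collections import deque

# Queue via deque: append = enqueue, popleft = dequeue (both O(1)).
q = deque()
q.append("a")        # enqueue "a"
q.append("b")        # enqueue "b"
front = q.popleft()  # dequeue -> "a" (first in, first out)
```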

  5. Types of linked lists: Singly linked list, doubly linked list, and circular linked list.
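
     A minimal singly linked list sketch (node class and helper names are illustrative, not from the source):

```python
class Node:
    """Node of a singly linked list."""
    def __init__(self, data):
        self.data = data
        self.next = None  # pointer to the next node


def push_front(head, data):
    """Insert at the head in O(1); returns the new head."""
    node = Node(data)
    node.next = head
    return node


def to_list(head):
    """Traverse in O(n), collecting each node's data."""
    out = []
    while head:
        out.append(head.data)
        head = head.next
    return out


head = None
for x in (3, 2, 1):
    head = push_front(head, x)
print(to_list(head))  # [1, 2, 3]
```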

  6. Binary tree: A tree with each node having up to two children, left and right.

  7. Binary search tree (BST): A binary tree in which, for every node, all keys in the left subtree are smaller than the node's key and all keys in the right subtree are larger; the property holds recursively in every subtree.
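
     A sketch of recursive insert and iterative search (duplicates are ignored here by convention; handling varies):

```python
class BSTNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None


def insert(root, key):
    """Recursive BST insert: smaller keys go left, larger go right."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root  # duplicates are silently ignored


def search(root, key):
    """Walk down the tree, choosing a side at each node; O(h) time."""
    while root and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None


root = None
for k in (8, 3, 10, 1, 6):
    root = insert(root, k)
```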

  8. DFS vs. BFS: Depth-First Search explores as far as possible along branches, while Breadth-First Search explores all neighbors before moving deeper.
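
     Both traversals sketched on a small adjacency-list graph (the graph itself is a made-up example):

```python
from collections import deque

graph = {            # adjacency-list representation
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": [],
}


def bfs(start):
    """Visit neighbors level by level using a queue."""
    seen, order, q = {start}, [], deque([start])
    while q:
        u = q.popleft()
        order.append(u)
        for v in graph[u]:
            if v not in seen:
                seen.add(v)
                q.append(v)
    return order


def dfs(start, seen=None):
    """Go as deep as possible along each branch via recursion."""
    if seen is None:
        seen = set()
    seen.add(start)
    order = [start]
    for v in graph[start]:
        if v not in seen:
            order += dfs(v, seen)
    return order
```

On this graph, `bfs("A")` yields `["A", "B", "C", "D"]` while `dfs("A")` yields `["A", "B", "D", "C"]`, which makes the level-order versus branch-first difference visible.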

  9. Heap: A specialized binary tree that satisfies the heap property, used in priority queues and heapsort.
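
     Python's standard `heapq` module maintains a min-heap inside a plain list, which is enough to sketch the priority-queue use:

```python
import heapq

h = []
for x in (5, 1, 4):
    heapq.heappush(h, x)     # O(log n) per push
smallest = heapq.heappop(h)  # O(log n); always removes the minimum
print(smallest)              # 1
```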

  10. Hash table: Stores key-value pairs for efficient lookup using a hashing function.
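
     A toy hash table with separate chaining, to show how the hash function maps a key to a bucket (illustration only; Python's built-in `dict` does this far better):

```python
class HashTable:
    """Minimal hash table using separate chaining for collisions."""

    def __init__(self, buckets=8):
        self.buckets = [[] for _ in range(buckets)]

    def _index(self, key):
        return hash(key) % len(self.buckets)  # hash -> bucket index

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # update an existing key
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # or chain a new pair

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)


t = HashTable()
t.put("apple", 3)
t.put("pear", 7)
```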

  11. Graph: A collection of nodes (vertices) connected by edges, represented by adjacency lists or matrices.

  12. Pointer: A variable holding the memory address of another variable, commonly used in dynamic structures.

  13. Types of sorting algorithms: Quick sort, merge sort, bubble sort, insertion sort, selection sort, heap sort, etc.

  14. Quicksort: A divide-and-conquer algorithm that partitions an array around a pivot; O(n log n) on average, degrading to O(n^2) in the worst case (e.g., poor pivot choices on already-sorted input).
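
     A short functional sketch (the classic version partitions the array in place; this copy-based form trades memory for clarity):

```python
def quicksort(a):
    """Partition around a pivot, then recursively sort each side."""
    if len(a) <= 1:
        return a                              # base case: already sorted
    pivot = a[len(a) // 2]
    left = [x for x in a if x < pivot]        # elements below the pivot
    mid = [x for x in a if x == pivot]        # the pivot (and duplicates)
    right = [x for x in a if x > pivot]       # elements above the pivot
    return quicksort(left) + mid + quicksort(right)
```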

  15. Merge sort: A stable sorting algorithm that recursively divides and merges arrays, with O(n log n) complexity.
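
     A sketch showing the split-then-merge structure; the `<=` in the merge is what makes it stable:

```python
def merge_sort(a):
    """Recursively split, then merge two sorted halves; O(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal keys in order (stability)
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]  # append whichever half remains
```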

  16. Bubble sort: Repeatedly swaps adjacent elements if out of order; time complexity is O(n^2).

  17. Insertion sort: Builds the sorted array one item at a time, best for small data or partially sorted arrays.

  18. Selection sort: Finds the minimum element and places it at the beginning, with O(n^2) complexity.

  19. Recursion: A function calling itself to solve a problem in smaller subproblems.
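
     The standard factorial example: a base case that stops the recursion and a call on a smaller subproblem:

```python
def factorial(n):
    """n! computed recursively."""
    if n <= 1:
        return 1                    # base case stops the recursion
    return n * factorial(n - 1)     # smaller subproblem
```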

  20. Dynamic programming: Optimizes by storing overlapping subproblem results to avoid recomputation.
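
     Fibonacci with memoization is the classic illustration: `functools.lru_cache` stores each subproblem's result so it is computed only once, turning exponential recursion into linear work:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """nth Fibonacci number; overlapping subproblems are cached."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```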

  21. Greedy algorithm: Makes the locally optimal choice at each step, aiming for a global optimum.
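
     Coin change with a canonical coin system is a simple illustration; note the hedge in the docstring, since the greedy choice is not optimal for arbitrary denominations:

```python
def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Always take the largest coin that fits (the locally optimal choice).
    Optimal for canonical systems like these denominations, but not for
    arbitrary coin sets."""
    used = []
    for c in coins:              # coins assumed sorted, largest first
        while amount >= c:
            amount -= c
            used.append(c)
    return used
```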

  22. Backtracking: Explores all possible solutions by building a solution incrementally and abandoning it if it fails.
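
     N-queens is the textbook example: place one queen per row, and undo (backtrack) any partial placement that causes a conflict:

```python
def n_queens(n):
    """Count valid N-queens placements by backtracking row by row."""
    cols, diag1, diag2 = set(), set(), set()

    def place(row):
        if row == n:
            return 1                          # a complete valid placement
        total = 0
        for col in range(n):
            if col in cols or row + col in diag1 or row - col in diag2:
                continue                      # conflict: abandon this branch
            cols.add(col); diag1.add(row + col); diag2.add(row - col)
            total += place(row + 1)           # extend the partial solution
            cols.remove(col); diag1.remove(row + col); diag2.remove(row - col)
        return total

    return place(0)
```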

  23. Binary search: Efficiently finds elements in a sorted array by repeatedly dividing the search interval in half.
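
     A standard iterative sketch; each comparison halves the remaining interval, giving O(log n) time:

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1      # target can only be in the right half
        else:
            hi = mid - 1      # target can only be in the left half
    return -1
```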

  24. Time complexities:

    • Array access: O(1), insertion/deletion: O(n)
    • Linked list access: O(n); insertion/deletion: O(1) given a reference to the node (finding the position first costs O(n))
    • Stack/Queue: O(1) for push/pop/enqueue/dequeue.

  25. Balanced binary tree: Maintains near-equal heights in subtrees, improving efficiency of operations.

  26. AVL tree: A self-balancing binary search tree where the height difference between subtrees is ≤1.

  27. Red-black tree: A self-balancing BST whose coloring rules guarantee that no root-to-leaf path is more than twice as long as any other, keeping operations O(log n).

  28. Graph traversal: Visiting nodes in a graph, typically through DFS or BFS.

  29. Adjacency matrix vs. list:

    • Matrix: 2D array, O(V^2) space, O(1) edge lookup.
    • List: Stores only the edges present, O(V + E) space, space-efficient for sparse graphs.

  30. Dijkstra's algorithm: Finds the shortest paths from a source to all other nodes in a weighted graph with non-negative edge weights.
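
     A sketch using a min-heap of (distance, node) pairs; the graph format `{u: [(v, weight), ...]}` and the example graph are assumptions for illustration:

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source; edge weights must be non-negative.
    graph: {u: [(v, weight), ...]} adjacency lists."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale entry; u already settled
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd              # found a shorter path to v
                heapq.heappush(heap, (nd, v))
    return dist


g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2)], "C": []}
```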