TL;DR Summary

  • Programming interview questions cover data structures, algorithms, system design, and problem-solving
  • Questions range from fundamental concepts to advanced architectural patterns
  • Each answer includes code examples, complexity analysis, and real-world applications
  • Technical interviewers use these questions to assess coding ability and thought process
  • Structured interview questions improve hiring consistency and candidate quality
  • Answers demonstrate both breadth of knowledge and depth of understanding
  • Real-world scenarios help evaluate practical application capabilities

20 Programming Interview Questions and Answers

1. What is the Difference Between Array and Linked List?

Answer:

Arrays and linked lists are fundamental data structures with distinct characteristics. Arrays store elements in contiguous memory locations, enabling O(1) random access through indexing. However, insertion and deletion operations require shifting elements, resulting in O(n) time complexity.

Linked lists allocate memory dynamically through pointers, with each node containing data and a reference to the next node. This approach offers O(n) access time but provides O(1) insertion and deletion once the position is located.

Key Differences:

  • Memory Allocation: Array uses contiguous memory; linked list uses scattered memory with pointers
  • Access Time: Array offers O(1); linked list requires O(n)
  • Insertion/Deletion: Array costs O(n); linked list costs O(1) after position location
  • Cache Performance: Arrays excel in cache locality; linked lists suffer from cache misses
  • Memory Overhead: Arrays minimize overhead; linked lists require pointer storage

Code Example:

# Array Access - O(1)
arr = [1, 2, 3, 4, 5]
element = arr[2]  # Direct access

# Linked List Access - O(n)
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def access_element(head, index):
    current = head
    for i in range(index):
        current = current.next
    return current.data

2. Explain Binary Search and When to Use It

Answer:

Binary search is an efficient searching algorithm that works on sorted data by repeatedly dividing the search interval in half. It achieves O(log n) time complexity, making it dramatically faster than linear search for large datasets.

The algorithm compares the target value with the middle element. If they match, the search succeeds. If the target is smaller, the search continues in the left half; if larger, in the right half.

Prerequisites:

  • Data must be sorted
  • Array or array-like accessible structure
  • Known size of collection

Real-World Applications:

  • Database indexing for rapid record retrieval
  • Dictionary lookups in natural language processing
  • Version control systems finding specific commits
  • Library inventory systems locating books
  • Search autocomplete functionality

Implementation:

def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1  # Target not found

# Time Complexity: O(log n)
# Space Complexity: O(1)
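For production code, Python's standard library offers the same algorithm via the `bisect` module; a minimal sketch (the wrapper name `binary_search_bisect` is illustrative):

```python
import bisect

def binary_search_bisect(arr, target):
    # bisect_left returns the leftmost insertion point for target
    i = bisect.bisect_left(arr, target)
    if i < len(arr) and arr[i] == target:
        return i
    return -1

print(binary_search_bisect([1, 3, 5, 7, 9, 11], 7))  # 3
print(binary_search_bisect([1, 3, 5, 7, 9, 11], 4))  # -1
```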

3. How Do You Reverse a String?

Answer:

String reversal produces a string with its characters in reverse order. This fundamental programming exercise tests string manipulation skills and understanding of iteration patterns.

Multiple Approaches:

Iterative Method - O(n) Time:

def reverse_string_iterative(s):
    return s[::-1]  # Python slice notation

# Alternative - explicit loop
def reverse_string_loop(s):
    result = ''
    for i in range(len(s) - 1, -1, -1):
        result += s[i]
    return result

Recursive Method - O(n) Time, O(n) Space:

def reverse_string_recursive(s):
    if len(s) == 0:
        return s
    return reverse_string_recursive(s[1:]) + s[0]

Two-Pointer Method - O(n) Time:

def reverse_string_pointer(s):
    arr = list(s)
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]
        left += 1
        right -= 1
    return ''.join(arr)

Performance Comparison:

  • Slice method: Fastest, O(n) time, O(n) space
  • Iterative: Clear, O(n) time, O(n) space
  • Recursive: Elegant but risky for long strings (stack overflow)
  • Two-pointer: Optimal for in-place reversal

4. Explain Recursion and Provide an Example

Answer:

Recursion occurs when a function calls itself to solve smaller instances of the same problem. This approach leverages the stack to maintain function call sequences and enables elegant solutions for problems with recursive structures.

Essential Components:

  1. Base Case: Condition terminating recursion
  2. Recursive Case: Function calling itself with modified parameters
  3. Progress Toward Base Case: Each call must move closer to termination

Classic Example - Factorial:

def factorial(n):
    # Base case
    if n == 0 or n == 1:
        return 1
    # Recursive case
    return n * factorial(n - 1)

# factorial(5) = 5 * factorial(4)
#              = 5 * 4 * factorial(3)
#              = 5 * 4 * 3 * factorial(2)
#              = 5 * 4 * 3 * 2 * factorial(1)
#              = 5 * 4 * 3 * 2 * 1 = 120

Fibonacci Sequence Example:

def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

# Time Complexity: O(2^n) - Exponential
# Space Complexity: O(n) - Call stack depth

Optimized Fibonacci with Memoization:

def fibonacci_memo(n, memo={}):
    if n in memo:
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fibonacci_memo(n - 1, memo) + fibonacci_memo(n - 2, memo)
    return memo[n]

# Time Complexity: O(n)
# Space Complexity: O(n)

Advantages:

  • Elegant for tree-like problems
  • Natural for divide-and-conquer algorithms
  • Easier to understand code logic

Disadvantages:

  • Stack overflow risk with deep recursion
  • Function call overhead slows execution
  • Harder to debug than iterative code

5. What is Time Complexity Analysis?

Answer:

Time complexity quantifies how algorithm runtime grows relative to input size. Expressed in Big O notation, it gives an upper bound on the growth of operations, enabling comparison between solutions independent of hardware or programming language.

Big O Notation Classifications:

  • O(1): Constant - Array access, hash table lookup
  • O(log n): Logarithmic - Binary search
  • O(n): Linear - Simple loop through all elements
  • O(n log n): Linearithmic - Efficient sorting (merge sort, quick sort)
  • O(n²): Quadratic - Nested loops (bubble sort, selection sort)
  • O(n³): Cubic - Triple nested loops (matrix multiplication)
  • O(2^n): Exponential - Recursive solutions without memoization
  • O(n!): Factorial - Permutation generation

Visual Complexity Comparison:

For n = 1000:

  • O(1): 1 operation
  • O(log n): ~10 operations
  • O(n): 1,000 operations
  • O(n log n): ~10,000 operations
  • O(n²): 1,000,000 operations
  • O(2^n): 2^1000 operations (computationally infeasible)
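The approximate operation counts above can be reproduced with a short sketch:

```python
import math

n = 1000
# Approximate operation counts for each growth class at n = 1000
growth = {
    'O(1)': 1,
    'O(log n)': math.log2(n),
    'O(n)': n,
    'O(n log n)': n * math.log2(n),
    'O(n^2)': n ** 2,
}
for name, ops in growth.items():
    print(f'{name}: ~{ops:,.0f} operations')
```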

Analysis Example:

# O(1) - Constant time
def get_first_element(arr):
    return arr[0]

# O(n) - Linear time
def find_maximum(arr):
    max_val = arr[0]
    for num in arr:
        if num > max_val:
            max_val = num
    return max_val

# O(n²) - Quadratic time
def bubble_sort(arr):
    for i in range(len(arr)):
        for j in range(len(arr) - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]

6. What is the Difference Between Stack and Queue?

Answer:

Stacks and queues are abstract data types with different insertion and removal patterns, making them suitable for different scenarios.

Stack (LIFO - Last In First Out):

The last element added is the first removed. Think of a stack of plates: you add and remove from the top.

Operations:

  • push(x): Add element to top - O(1)
  • pop(): Remove top element - O(1)
  • peek(): View top element - O(1)

Applications:

  • Function call stack
  • Undo/redo functionality
  • Expression evaluation (postfix notation)
  • Backtracking algorithms
Implementation:

class Stack:
    def __init__(self):
        self.items = []

    def push(self, item):
        self.items.append(item)

    def pop(self):
        return self.items.pop() if not self.is_empty() else None

    def peek(self):
        return self.items[-1] if not self.is_empty() else None

    def is_empty(self):
        return len(self.items) == 0

Queue (FIFO - First In First Out):

The first element added is the first removed. Like a checkout line, the first person in line is first served.

Operations:

  • enqueue(x): Add to rear - O(1)
  • dequeue(): Remove from front - O(1)
  • peek(): View front element - O(1)

Applications:

  • Printer job scheduling
  • Customer service queues
  • Breadth-first search (BFS)
  • Message processing systems
Implementation:

from collections import deque

class Queue:
    def __init__(self):
        self.items = deque()

    def enqueue(self, item):
        self.items.append(item)

    def dequeue(self):
        return self.items.popleft() if not self.is_empty() else None

    def peek(self):
        return self.items[0] if not self.is_empty() else None

    def is_empty(self):
        return len(self.items) == 0

7. Explain Merge Sort Algorithm

Answer:

Merge sort is a divide-and-conquer sorting algorithm that divides arrays recursively into halves until single elements remain, then merges them back in sorted order. It guarantees O(n log n) time complexity in all cases: best, average, and worst.

Algorithm Steps:

  1. Divide array into two halves
  2. Recursively sort left half
  3. Recursively sort right half
  4. Merge sorted halves

Implementation:

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

# Time Complexity: O(n log n) - All cases
# Space Complexity: O(n) - Additional array for merging

Advantages:

  • Guaranteed O(n log n) performance
  • Stable sort (preserves equal element order)
  • Predictable behavior for large datasets
  • Excellent for linked lists

Disadvantages:

  • Requires O(n) extra space
  • Slower than quick sort for average cases
  • More function call overhead

Real-World Use:

  • Database sorting operations
  • External sorting for files exceeding RAM
  • Distributed computing scenarios
  • Stability-critical applications

8. What is a Hash Table/Hash Map?

Answer:

Hash tables implement associative arrays: structures that map keys to values using hash functions. They provide average O(1) insertion, deletion, and lookup, making them essential for real-world applications.

Components:

  • Hash Function: Converts keys to array indices
  • Hash Table: Array storing key-value pairs
  • Collision Handling: Strategies for duplicate hash values (chaining, open addressing)

Hash Function Example:

def simple_hash(key, table_size):
    return len(key) % table_size  # Not recommended for production

def better_hash(key, table_size):
    hash_value = 0
    for char in key:
        hash_value = (hash_value * 31 + ord(char)) % table_size
    return hash_value

Collision Handling - Chaining:

class HashTableChaining:
    def __init__(self, size):
        self.size = size
        self.table = [[] for _ in range(size)]

    def hash_function(self, key):
        return hash(key) % self.size

    def insert(self, key, value):
        index = self.hash_function(key)
        for i, (k, v) in enumerate(self.table[index]):
            if k == key:
                self.table[index][i] = (key, value)
                return
        self.table[index].append((key, value))

    def search(self, key):
        index = self.hash_function(key)
        for k, v in self.table[index]:
            if k == key:
                return v
        return None

    def delete(self, key):
        index = self.hash_function(key)
        self.table[index] = [(k, v) for k, v in self.table[index] if k != key]

Real-World Applications:

  • Database indexing
  • Cache implementation
  • Symbol table in compilers
  • Duplicate element detection
  • Anagram grouping
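In everyday Python, the built-in dict is itself a hash table. As a sketch of the anagram-grouping application (the function name `group_anagrams` is illustrative):

```python
from collections import defaultdict

def group_anagrams(words):
    # Words that are anagrams share the same sorted-letter key,
    # so a hash map groups them in O(n * k log k) time
    groups = defaultdict(list)
    for word in words:
        key = ''.join(sorted(word))
        groups[key].append(word)
    return list(groups.values())

print(group_anagrams(['eat', 'tea', 'tan', 'ate', 'nat', 'bat']))
# [['eat', 'tea', 'ate'], ['tan', 'nat'], ['bat']]
```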

9. Explain Binary Tree and Its Traversals

Answer:

Binary trees are hierarchical data structures where each node has at most two children (left and right). They form the foundation for more complex structures like binary search trees and heaps.

Binary Tree Node Definition:

class TreeNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

Tree Traversal Methods:

1. In-Order (Left, Root, Right) - O(n) Time:

def inorder_traversal(node):
    if node is None:
        return []
    return (inorder_traversal(node.left) +
            [node.value] +
            inorder_traversal(node.right))

# For BST, produces sorted sequence

2. Pre-Order (Root, Left, Right) - O(n) Time:

def preorder_traversal(node):
    if node is None:
        return []
    return ([node.value] +
            preorder_traversal(node.left) +
            preorder_traversal(node.right))

# Useful for copying trees

3. Post-Order (Left, Right, Root) - O(n) Time:

def postorder_traversal(node):
    if node is None:
        return []
    return (postorder_traversal(node.left) +
            postorder_traversal(node.right) +
            [node.value])

# Useful for deletion

4. Level-Order (Breadth-First) - O(n) Time:

from collections import deque

def levelorder_traversal(root):
    if root is None:
        return []
    result = []
    queue = deque([root])
    while queue:
        node = queue.popleft()
        result.append(node.value)
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return result

Key Distinctions:

  • In-order: Useful for BST sorted output
  • Pre-order: Useful for tree cloning
  • Post-order: Useful for deletion, size calculation
  • Level-order: Useful for breadth-first analysis

10. What is Breadth-First Search (BFS)?

Answer:

BFS explores graph nodes level by level, visiting all neighbors before moving to next-level nodes. It uses a queue data structure and guarantees finding shortest paths in unweighted graphs.

Algorithm Steps:

  1. Start from source node
  2. Add source to queue
  3. While queue not empty:
    • Dequeue node
    • Process node
    • Enqueue unvisited neighbors

Implementation:

from collections import deque

def bfs(graph, start):
    visited = set()
    queue = deque([start])
    visited.add(start)
    result = []
    while queue:
        node = queue.popleft()
        result.append(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return result

# Time Complexity: O(V + E) - Vertices and Edges
# Space Complexity: O(V) - Queue and visited set

Graph Example:

graph = {
    'A': ['B', 'C'],
    'B': ['A', 'D', 'E'],
    'C': ['A', 'F'],
    'D': ['B'],
    'E': ['B', 'F'],
    'F': ['C', 'E']
}

print(bfs(graph, 'A'))  # Output: ['A', 'B', 'C', 'D', 'E', 'F']

Applications:

  • Shortest path in unweighted graphs
  • Social network friend suggestions
  • Web crawling and indexing
  • GPS navigation for nearby locations
  • Puzzle solving (maze solving)
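The shortest-path application can be sketched by tracking each node's predecessor during the traversal (the function name `bfs_shortest_path` is illustrative; the graph is the same example as above):

```python
from collections import deque

def bfs_shortest_path(graph, start, goal):
    # Track each node's predecessor so the path can be rebuilt
    parent = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]  # Reverse: start -> goal
        for neighbor in graph[node]:
            if neighbor not in parent:
                parent[neighbor] = node
                queue.append(neighbor)
    return None  # Goal unreachable

graph = {
    'A': ['B', 'C'], 'B': ['A', 'D', 'E'], 'C': ['A', 'F'],
    'D': ['B'], 'E': ['B', 'F'], 'F': ['C', 'E']
}
print(bfs_shortest_path(graph, 'A', 'F'))  # ['A', 'C', 'F']
```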

11. Explain Depth-First Search (DFS)

Answer:

DFS explores graph nodes by going deep along each branch before backtracking. It uses a stack (often the recursion call stack) and discovers all nodes reachable from the source, which makes it useful for topological sorting and cycle detection.

Recursive Implementation:

def dfs_recursive(graph, node, visited):
    visited.add(node)
    result = [node]
    for neighbor in graph[node]:
        if neighbor not in visited:
            result.extend(dfs_recursive(graph, neighbor, visited))
    return result

visited = set()
print(dfs_recursive(graph, 'A', visited))

Iterative Implementation (Stack-Based):

def dfs_iterative(graph, start):
    visited = set()
    stack = [start]
    result = []
    while stack:
        node = stack.pop()
        if node not in visited:
            visited.add(node)
            result.append(node)
            stack.extend(reversed(graph[node]))
    return result

# Time Complexity: O(V + E)
# Space Complexity: O(V) - Recursion or stack

Applications:

  • Cycle detection in graphs
  • Topological sorting
  • Strongly connected components
  • Maze solving
  • Puzzle solving
  • Backtracking algorithms
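As a sketch of the cycle-detection application, a DFS with the standard white/gray/black node coloring for directed graphs (the function name `has_cycle` is illustrative):

```python
def has_cycle(graph):
    # white = unvisited, gray = on current DFS path, black = finished
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {node: WHITE for node in graph}

    def visit(node):
        color[node] = GRAY
        for neighbor in graph[node]:
            if color[neighbor] == GRAY:   # Back edge: cycle found
                return True
            if color[neighbor] == WHITE and visit(neighbor):
                return True
        color[node] = BLACK
        return False

    return any(color[n] == WHITE and visit(n) for n in graph)

print(has_cycle({'A': ['B'], 'B': ['C'], 'C': ['A']}))  # True
print(has_cycle({'A': ['B'], 'B': ['C'], 'C': []}))     # False
```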

DFS vs BFS:

  • DFS: Uses stack, goes deep, finds all paths
  • BFS: Uses queue, explores level-wise, finds shortest path

12. What is Linear Search and When to Use It?

Answer:

Linear search examines each element sequentially until the target is found or the list is exhausted. Though slower than binary search on sorted data, it works on unsorted data and suits small datasets and linked lists.

Implementation:

def linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1

# Time Complexity: O(n)
# Space Complexity: O(1)

Characteristics:

  • Works on unsorted and sorted data
  • Simplest search algorithm
  • Best case: O(1) when element at start
  • Average case: ~n/2 comparisons, still O(n)
  • Worst case: O(n) when element absent or at end

When to Use Linear Search:

  • Small datasets: Under 100 elements, overhead insignificant
  • Unsorted data: No sorting preprocessing available
  • Linked lists: Random access unavailable
  • Searching for multiple elements: All occurrences needed
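The multiple-occurrences case from the list above is a natural fit for linear search, since every element must be examined anyway (the function name `linear_search_all` is illustrative):

```python
def linear_search_all(arr, target):
    # Collect every index where target appears - O(n)
    return [i for i, value in enumerate(arr) if value == target]

print(linear_search_all([4, 2, 7, 2, 9, 2], 2))  # [1, 3, 5]
print(linear_search_all([4, 2, 7], 8))           # []
```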

Optimization - Early Termination:

def linear_search_optimized(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
        elif arr[i] > target:  # Assumes sorted array
            break
    return -1

13. Explain the Concept of Polymorphism

Answer:

Polymorphism (Greek: 'many forms') enables objects to take multiple forms and methods to behave differently based on context. It's a cornerstone of object-oriented programming, promoting code reusability and flexibility.

Types of Polymorphism:

1. Compile-Time Polymorphism (Method Overloading):

Different methods share the same name but differ in parameters.

// Java Example
class Calculator {
    public int add(int a, int b) {
        return a + b;
    }

    public double add(double a, double b) {
        return a + b;
    }

    public int add(int a, int b, int c) {
        return a + b + c;
    }
}

Calculator calc = new Calculator();
System.out.println(calc.add(5, 10));      // Calls first method
System.out.println(calc.add(5.5, 10.2));  // Calls second method
System.out.println(calc.add(5, 10, 15));  // Calls third method

2. Runtime Polymorphism (Method Overriding):

Subclasses override parent class methods, with execution determined at runtime.

# Python Example
class Animal:
    def make_sound(self):
        return 'Generic animal sound'

class Dog(Animal):
    def make_sound(self):
        return 'Woof!'

class Cat(Animal):
    def make_sound(self):
        return 'Meow!'

def play_sound(animal):
    print(animal.make_sound())

# Runtime determination of which method executes
dog = Dog()
cat = Cat()
play_sound(dog)  # Outputs: Woof!
play_sound(cat)  # Outputs: Meow!

Benefits:

  • Code flexibility and extensibility
  • Reduced coupling between objects
  • Easier maintenance and updates
  • Promotes SOLID principles

14. What is Inheritance in OOP?

Answer:

Inheritance allows classes to acquire properties and methods from parent classes, promoting code reuse and establishing hierarchical relationships between objects.

Single Inheritance Example:

class Vehicle:
    def __init__(self, brand):
        self.brand = brand

    def display_info(self):
        return f'Brand: {self.brand}'

class Car(Vehicle):
    def __init__(self, brand, doors):
        super().__init__(brand)
        self.doors = doors

    def display_info(self):
        parent_info = super().display_info()
        return f'{parent_info}, Doors: {self.doors}'

car = Car('Toyota', 4)
print(car.display_info())  # Output: Brand: Toyota, Doors: 4

Multiple Inheritance (Python):

class Swimmer:
    def swim(self):
        return 'Swimming...'

class Flyer:
    def fly(self):
        return 'Flying...'

class Duck(Swimmer, Flyer):
    pass

duck = Duck()
print(duck.swim())  # Output: Swimming...
print(duck.fly())   # Output: Flying...

Inheritance Types:

  • Single Inheritance: One parent class
  • Multiple Inheritance: Multiple parent classes
  • Multilevel Inheritance: Grandparent → Parent → Child
  • Hierarchical Inheritance: Multiple children from one parent
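Multilevel inheritance, the grandparent-to-child chain from the list above, can be sketched like this (class names are illustrative):

```python
class Animal:                  # Grandparent
    def breathe(self):
        return 'Breathing...'

class Mammal(Animal):          # Parent inherits from Animal
    def feed_young(self):
        return 'Feeding young...'

class Dog(Mammal):             # Child inherits from Mammal (and Animal)
    def bark(self):
        return 'Woof!'

dog = Dog()
print(dog.breathe())     # Inherited from grandparent
print(dog.feed_young())  # Inherited from parent
print(dog.bark())        # Own method
```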

Advantages:

  • Code reusability
  • Reduced redundancy
  • Logical organization
  • Easier maintenance

15. Explain Encapsulation in OOP

Answer:

Encapsulation bundles data (attributes) and methods (behaviors) within a class while hiding internal implementation details from external access. It protects data integrity and provides controlled access through public interfaces.

Example:

class BankAccount:
    def __init__(self, account_number, balance):
        self.__account_number = account_number  # Private
        self.__balance = balance  # Private

    # Public method - controlled access
    def deposit(self, amount):
        if amount > 0:
            self.__balance += amount
            return f'Deposited ${amount}. New balance: ${self.__balance}'
        return 'Invalid amount'

    def withdraw(self, amount):
        if 0 < amount <= self.__balance:
            self.__balance -= amount
            return f'Withdrew ${amount}. Remaining: ${self.__balance}'
        return 'Insufficient funds'

    # Getter
    def get_balance(self):
        return self.__balance

# Usage
account = BankAccount('12345', 1000)
print(account.deposit(500))   # Allowed - controlled access
print(account.withdraw(200))  # Allowed - validation applied
print(account.get_balance())  # Allowed - read access

# This would fail
# print(account.__balance)  # Direct access blocked (AttributeError)

Access Modifiers:

  • Public: Accessible everywhere (no prefix)
  • Protected: Accessible within class and subclasses (_prefix)
  • Private: Accessible only within class (__prefix)
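Python also offers the built-in `@property` decorator as a more idiomatic route to controlled access; a minimal sketch (the class name `Account` is illustrative):

```python
class Account:
    def __init__(self, balance=0):
        self._balance = balance  # Protected by convention

    @property
    def balance(self):
        return self._balance

    @balance.setter
    def balance(self, value):
        # Validation runs transparently on every assignment
        if value < 0:
            raise ValueError('Balance cannot be negative')
        self._balance = value

account = Account(100)
account.balance = 250    # Looks like attribute access, runs the setter
print(account.balance)   # 250
```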

Benefits:

  • Data protection from unauthorized access
  • Input validation enforcement
  • Implementation hiding
  • Easier maintenance and changes

16. What is the Difference Between Stack Overflow and Stack Underflow?

Answer:

Stack overflow occurs when attempting to push data beyond stack capacity, while stack underflow happens when removing from an empty stack.

Stack Overflow Example:

def recursive_function(n):
    print(n)
    recursive_function(n + 1)  # Infinite recursion

# This will eventually cause stack overflow
# recursive_function(0)

# Better - with termination condition
def recursive_function_safe(n):
    if n >= 1000:  # Base case prevents overflow
        return
    print(n)
    recursive_function_safe(n + 1)

Stack Underflow Example:

class Stack:
    def __init__(self):
        self.items = []

    def pop(self):
        if len(self.items) == 0:
            raise Exception('Stack underflow')
        return self.items.pop()

stack = Stack()
stack.pop()  # Raises stack underflow exception

Prevention Strategies:

Stack Overflow Prevention:

  • Ensure recursive functions have proper base cases
  • Limit recursion depth
  • Use iterative approaches for deep recursion
  • Implement stack size monitoring
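The "use iterative approaches" strategy can be sketched with factorial: the loop version uses no call stack, so it cannot overflow (the function name `factorial_iterative` is illustrative):

```python
def factorial_iterative(n):
    # No call-stack growth, so safe even for very large n
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

print(factorial_iterative(5))  # 120
# A naive recursive factorial at n = 2000 would exceed Python's
# default recursion limit; the iterative version handles it fine
print(factorial_iterative(2000) > 0)
```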

Stack Underflow Prevention:

  • Check stack emptiness before pop operations
  • Implement isEmpty() method
  • Use try-catch blocks for error handling
  • Provide clear error messages

17. Explain Dynamic Programming

Answer:

Dynamic programming solves complex problems by breaking them into overlapping subproblems, solving each once, and storing solutions for reuse. It optimizes recursive solutions with memoization or tabulation.

Two Approaches:

1. Top-Down (Memoization):

def fibonacci_memo(n, memo={}):
    if n in memo:
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fibonacci_memo(n - 1, memo) + fibonacci_memo(n - 2, memo)
    return memo[n]

# Time: O(n), Space: O(n)

2. Bottom-Up (Tabulation):

def fibonacci_tabulation(n):
    if n <= 1:
        return n
    dp = [0] * (n + 1)
    dp[1] = 1
    for i in range(2, n + 1):
        dp[i] = dp[i - 1] + dp[i - 2]
    return dp[n]

# Time: O(n), Space: O(n)

Optimized Space Fibonacci:

def fibonacci_optimized(n):
    if n <= 1:
        return n
    prev2, prev1 = 0, 1
    for i in range(2, n + 1):
        current = prev1 + prev2
        prev2, prev1 = prev1, current
    return prev1

# Time: O(n), Space: O(1)

Classic DP Problems:

  • Fibonacci sequence
  • Knapsack problem
  • Longest common subsequence
  • Coin change problem
  • Matrix chain multiplication
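The coin change problem from the list above makes a good bottom-up example, since each amount builds on smaller amounts already solved (the function name `coin_change` is illustrative):

```python
def coin_change(coins, amount):
    # dp[a] = minimum number of coins needed to make amount a
    INF = float('inf')
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for coin in coins:
            if coin <= a and dp[a - coin] + 1 < dp[a]:
                dp[a] = dp[a - coin] + 1
    return dp[amount] if dp[amount] != INF else -1

print(coin_change([1, 5, 10, 25], 63))  # 6 (25+25+10+1+1+1)
print(coin_change([4, 7], 5))           # -1 (impossible)
```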

18. What is a Binary Search Tree (BST)?

Answer:

Binary Search Trees are hierarchical structures where each node's left subtree contains smaller values and right subtree contains larger values. This property enables efficient search, insertion, and deletion operations.

BST Properties:

  • Left child < Parent < Right child
  • No duplicate values (typically)
  • In-order traversal produces sorted sequence

BST Implementation:

class BSTNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

class BinarySearchTree:
    def __init__(self):
        self.root = None

    def insert(self, value):
        self.root = self._insert_recursive(self.root, value)

    def _insert_recursive(self, node, value):
        if node is None:
            return BSTNode(value)
        if value < node.value:
            node.left = self._insert_recursive(node.left, value)
        elif value > node.value:
            node.right = self._insert_recursive(node.right, value)
        return node

    def search(self, value):
        return self._search_recursive(self.root, value)

    def _search_recursive(self, node, value):
        if node is None or node.value == value:
            return node
        if value < node.value:
            return self._search_recursive(node.left, value)
        return self._search_recursive(node.right, value)

Time Complexity:

  • Search: O(h) where h = tree height
  • Insert: O(h)
  • Delete: O(h)
  • Balanced BST: O(log n)
  • Degenerate BST: O(n)

Applications:

  • Database indexing
  • File system organization
  • Auto-complete functionality
  • Network routing tables
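A quick, self-contained check of the sorted in-order property (helper names `bst_insert` and `inorder` are illustrative):

```python
class BSTNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def bst_insert(node, value):
    if node is None:
        return BSTNode(value)
    if value < node.value:
        node.left = bst_insert(node.left, value)
    elif value > node.value:
        node.right = bst_insert(node.right, value)
    return node

def inorder(node):
    if node is None:
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)

root = None
for v in [50, 30, 70, 20, 40, 60, 80]:
    root = bst_insert(root, v)

# In-order traversal yields the values in sorted order
print(inorder(root))  # [20, 30, 40, 50, 60, 70, 80]
```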

19. Explain Quick Sort Algorithm

Answer:

Quick sort is a divide-and-conquer algorithm that selects a pivot element and partitions the array around it, recursively sorting subarrays. It's often faster than merge sort in practice.

Algorithm Steps:

  1. Choose pivot element
  2. Partition array: elements < pivot go left, elements > pivot go right
  3. Recursively sort left and right partitions

Implementation:

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

# Time Complexity: O(n log n) average, O(n²) worst
# Space Complexity: O(log n) average, O(n) worst

In-Place Quick Sort:

def quick_sort_inplace(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        pivot_index = partition(arr, low, high)
        quick_sort_inplace(arr, low, pivot_index - 1)
        quick_sort_inplace(arr, pivot_index + 1, high)

def partition(arr, low, high):
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

Pivot Selection Strategies:

  • First element: Simple but poor for sorted arrays
  • Last element: Similar issues as first
  • Middle element: Better for partially sorted arrays
  • Random element: Reduces worst-case probability
  • Median-of-three: Best practical performance
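The median-of-three strategy from the list above can be sketched as a small helper that returns the pivot's index (the function name `median_of_three` is illustrative):

```python
def median_of_three(arr, low, high):
    # Pick the median of the first, middle, and last elements
    mid = (low + high) // 2
    candidates = [(arr[low], low), (arr[mid], mid), (arr[high], high)]
    candidates.sort()
    return candidates[1][1]  # Index of the median value

arr = [9, 1, 5, 3, 7]
# first=9, middle=5, last=7 -> median value 7 sits at index 4
print(median_of_three(arr, 0, len(arr) - 1))  # 4
```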

20. What is Object-Oriented Programming (OOP)?

Answer:

Object-Oriented Programming models software using objects containing data and behavior. It emphasizes modularity, reusability, and maintainability through four core principles.

Four OOP Pillars:

1. Encapsulation: Bundling data and methods, controlling access

class Temperature:
    def __init__(self):
        self.__celsius = 0

    def get_celsius(self):
        return self.__celsius

    def set_celsius(self, value):
        if -273.15 <= value <= 1000:
            self.__celsius = value
        else:
            raise ValueError('Invalid temperature')

2. Inheritance: Deriving classes from parent classes

class Animal:
    def speak(self):
        pass

class Dog(Animal):
    def speak(self):
        return 'Woof!'

3. Polymorphism: Objects taking multiple forms

def make_animal_speak(animal):
    print(animal.speak())  # Works for any Animal subclass

4. Abstraction: Hiding complex implementation details

from abc import ABC, abstractmethod

class Vehicle(ABC):
    @abstractmethod
    def start_engine(self):
        pass

class Car(Vehicle):
    def start_engine(self):
        return 'Car engine started'

# Users interact with the Vehicle interface, not the implementation

Benefits:

  • Modular code organization
  • Code reusability through inheritance
  • Easier maintenance and updates
  • Better problem modeling

Conclusion

Mastering these programming interview questions provides a solid foundation for technical interviews and real-world software development. Focus on understanding core concepts, analyzing time/space complexity, and practicing implementation. Remember that interviewers value clear communication, systematic problem-solving approaches, and the ability to optimize solutions.

For comprehensive talent management solutions including interview scheduling, candidate assessment, and performance tracking, explore Qandle's integrated HR platform.
