Harsha's Placement Sprint

April 16 – April 30, 2026  ·  15 Days  ·  DSA + Aptitude + Problem Solving + SyncSpace
15 Days · 120+ Problems · 5–6 Hrs/Day · 2 hrs SyncSpace daily · 6 Tracks
Tracks: DSA · Approach · Problems · Aptitude · Interview · SyncSpace
Week 1 — Foundation & Core Structures · Apr 16–22
Build the fundamentals rock-solid. Arrays → Strings → Linked Lists → Stacks/Queues → Hashing → Recursion → Mock. Every day: 2 hrs SyncSpace + 3 hrs study. Don't skip the approach sections — they're what separates you from every other candidate.
Day 01 — Arrays & Time Complexity · Wednesday, Apr 16
DSA · 1.5 hrs
Arrays + Big-O Fundamentals
  • Big-O hierarchy — memorize: O(1) → O(log n) → O(n) → O(n log n) → O(n²) → O(2ⁿ) → O(n!). Know what code structure produces each.
  • O(1): array index access, HashMap get/put, arithmetic. No loops, no recursion.
  • O(log n): binary search, balanced BST lookup, heap operations. Problem size halves each step.
  • O(n): single loop, linear scan, prefix sum build, HashMap-based solutions.
  • O(n log n): sorting (merge sort, quicksort avg), sort + scan pattern.
  • O(n²): nested loops, brute-force pair comparisons, bubble/insertion sort.
  • Prefix Sum: build prefix[i] = prefix[i-1] + arr[i]. Range sum (l,r) = prefix[r] - prefix[l-1]. Converts O(n) range queries → O(1).
  • Kadane's Algorithm: localMax = max(num, localMax+num), track globalMax. Finds max subarray sum in O(n), one pass.
  • Fixed Sliding Window: compute first window of size k, then slide: add arr[right], remove arr[left], advance both. O(n).
  • Variable Sliding Window: expand right always. Shrink left when constraint violated. Track the answer during valid windows. O(n) total.
  • Two Pointers (sorted array): left=0, right=n-1. If sum < target → left++. If sum > target → right--. O(n) for pair search.
  • Dutch National Flag (3-way partition): low=0, mid=0, high=n-1. Partition array into 3 regions in single pass. Used for sort-colors type problems.
Space complexity matters too: in-place algorithms use O(1) extra space. An auxiliary array is O(n) space — copying the input into a new array doubles memory use.
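A minimal Python sketch of two of the patterns above — prefix sum and Kadane's. The sample array is illustrative; the prefix array here has length n+1 so the l = 0 range query needs no special case.

```python
def build_prefix(arr):
    # prefix[i] = sum of arr[0..i-1]; range sum (l, r) = prefix[r+1] - prefix[l]
    prefix = [0] * (len(arr) + 1)
    for i, x in enumerate(arr):
        prefix[i + 1] = prefix[i] + x
    return prefix

def kadane(nums):
    # local_max = best subarray sum ending here; global_max = best seen anywhere
    local_max = global_max = nums[0]
    for x in nums[1:]:
        local_max = max(x, local_max + x)
        global_max = max(global_max, local_max)
    return global_max

arr = [4, -1, 2, -7, 3]
prefix = build_prefix(arr)
print(prefix[3] - prefix[1])   # sum of arr[1..2] = -1 + 2 = 1
print(kadane(arr))             # max subarray sum = 5 (subarray [4, -1, 2])
```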
Approach & Optimization · 30 min
How to Read a Problem + Brute Force → Optimize
  • Step 1 — Read constraints FIRST: n ≤ 10³ → O(n²) ok. n ≤ 10⁵ → need O(n log n) or O(n). n ≤ 10⁸ → O(n) is borderline; anything larger demands O(log n) or O(1). This determines your approach before you think about algorithms.
  • Step 2 — State the brute force aloud: "The naive approach uses nested loops giving O(n²). For each element I search the rest of the array…" Always start here.
  • Step 3 — Ask: can I trade space for time? HashMap converts O(n) inner loop → O(1) lookup. Prefix sum converts O(n) range query → O(1).
  • Step 4 — Look for loop reduction: two nested loops scanning the same data → think two-pointer or sliding window. If one loop searches for something → binary search or HashMap.
  • Step 5 — Verify with small example: trace your optimized approach on the sample input step by step. Confirm it produces the right answer.
  • Code quality: name variables what they ARE — maxSum not ms, windowStart not l. Your future self and your interviewer will thank you.
Pattern trigger: "Find something in a range/subarray?" → Prefix Sum or Sliding Window. "Max/min of all subarrays?" → Kadane's or Monotonic Deque. "Find a pair with property X?" → HashMap or Two Pointers.
Aptitude · 45 min
Number Systems — HCF, LCM, Divisibility
  • HCF (GCD): Euclidean method — gcd(a,b) = gcd(b, a%b) until remainder is 0. The last non-zero remainder is the GCD.
  • LCM: LCM(a,b) = (a × b) / HCF(a,b). For 3+ numbers: LCM(a,b,c) = LCM(LCM(a,b), c).
  • Divisibility rules: 2(even), 3(digit sum÷3), 4(last 2 digits÷4), 5(ends 0/5), 6(÷2 AND ÷3), 8(last 3 digits÷8), 9(digit sum÷9), 11(alternating sum diff = 0 or ÷11).
  • Unit digit cycles: 2→{2,4,8,6}, 3→{3,9,7,1}, 7→{7,9,3,1}, 8→{8,4,2,6} — all cycle of 4. Find power mod 4 to get unit digit.
  • Remainder theorem: Remainder of (a×b) ÷ n = ((a%n) × (b%n)) % n. Useful for large number problems.
  • Practice: IndiaBix → Aptitude → Numbers — attempt 25 questions, note every wrong one and learn the method.
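The Euclidean method and the LCM identity above, as a quick Python sketch (sample numbers are illustrative):

```python
def gcd(a, b):
    # Euclidean method: gcd(a, b) = gcd(b, a % b) until the remainder is 0
    while b:
        a, b = b, a % b
    return a

def lcm(a, b):
    # LCM(a, b) = (a * b) / HCF(a, b)
    return a * b // gcd(a, b)

print(gcd(48, 36))         # 12
print(lcm(4, 6))           # 12
print(lcm(lcm(4, 6), 10))  # LCM of 4, 6, 10 = 60
```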
Interview Corner
Questions you WILL be asked about arrays
  • "Explain the time complexity of your solution." → State both time AND space. "O(n) time because single pass, O(n) space for the HashMap."
  • "Can you do it without extra space?" → Think: can I sort in-place? Can I use the array itself as storage (negative marking)? Can I use XOR?
  • "What happens if the input is very large?" → Discuss: does it fit in memory? Need streaming? Need external sort? Mention cache locality.
  • "Find the duplicate in an array of n+1 integers where each integer is in [1, n]." → Floyd's cycle detection on array indices. O(n) time, O(1) space. Know this cold.
  • "What's the difference between O(n) and O(n log n)?" → At n=1M, O(n)=1M operations, O(n log n)≈20M operations. At n=1B, the gap becomes critical.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Spend 2 dedicated, focused hours on SyncSpace development. No distractions.
  • Treat this as a work sprint — this project is your biggest interview talking point.
Day 02 — Strings & Pattern Matching · Thursday, Apr 17
DSA · 1.5 hrs
String Manipulation & Patterns
  • Frequency map: {char: count} — detects anagrams in O(n). Use array[26] for lowercase letters (faster than HashMap).
  • Sliding window on strings: expand right (add char to window map), shrink left when window becomes invalid. Track count of valid chars, not the entire map comparison each time.
  • Palindrome check: two pointers from both ends, compare. O(n) time, O(1) space.
  • Palindrome expansion: for each index, expand outward (odd: center=i, even: center=i,i+1). O(n²) but elegant and no extra space.
  • String reversal patterns: in-place with two pointers (swap arr[left], arr[right]). Reverse words: reverse entire string, then reverse each word.
  • KMP basics: build failure/prefix function array. Pattern matching in O(n+m) instead of O(n×m). Know the concept even if you don't implement it.
  • StringBuilder pattern: never concatenate strings in a loop — use StringBuilder/list and join at the end. String concat in loop = O(n²).
  • Character math: char - 'a' gives 0-25 index. 'A' + 3 = 'D'. ASCII: 'A'=65, 'a'=97, '0'=48. These are used everywhere in string problems.
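The frequency-array and character-math ideas above combine into the classic anagram check — a short sketch assuming lowercase input:

```python
def are_anagrams(s, t):
    # Frequency count with a 26-slot array (lowercase a-z), O(n) time, O(1) space
    if len(s) != len(t):
        return False
    counts = [0] * 26
    for ch in s:
        counts[ord(ch) - ord('a')] += 1   # char - 'a' gives a 0-25 index
    for ch in t:
        counts[ord(ch) - ord('a')] -= 1
    return all(c == 0 for c in counts)

print(are_anagrams("listen", "silent"))  # True
print(are_anagrams("rat", "car"))        # False
```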
Approach · Pattern Recognition for Strings
What the problem says → What technique to use
  • "Does order matter?" → No? Sort and compare (anagram check). Yes? Use position-aware approach (sliding window, DP).
  • "Longest/shortest substring with condition" → Variable sliding window. What makes it invalid = your shrink condition.
  • "All substrings of length k with property" → Fixed sliding window of size k. Compute first window, slide.
  • "Substring" in problem statement → Almost always sliding window.
  • "Subsequence" in problem statement → Usually DP or two-pointer on sorted.
  • "Palindromic" anything → Expand from center (substring), or DP (subsequence).
  • Optimization over naive: instead of clearing & rebuilding freq map on each shrink, just decrement and check if count hits 0. Saves O(26) per operation.
Clean code tip: extract the "is valid window" check into a helper function. The main sliding window loop stays clean and readable.
Aptitude · 45 min
Percentages + Profit & Loss
  • % change: (new − old) / old × 100. Successive %: multiply — (1 + a/100)(1 + b/100), don't add.
  • x% of y = y% of x. 8% of 50 = 50% of 8 = 4. Use whichever is easier to compute.
  • Profit = SP − CP. Loss = CP − SP. Profit/Loss % is always on CP unless stated otherwise.
  • Marked price → discount → SP: SP = MP × (1 − d/100). Two successive discounts d₁ and d₂: effective discount % = [1 − (1−d₁/100)(1−d₂/100)] × 100.
  • If SP of x items = CP of y items: Profit% = ((y−x)/x) × 100.
  • Practice: IndiaBix → Percentage (20Q) + Profit & Loss (20Q).
Interview Corner
String interview questions
  • "Check if two strings are anagrams." → Frequency array comparison O(n). Or sort both O(n log n). Know both, recommend the faster.
  • "Find the first non-repeating character." → Two passes: first count frequencies, second find first with count=1. O(n) time, O(1) space (26 chars).
  • "Why is string concatenation in a loop O(n²)?" → Strings are immutable in Java/Python. Each concat creates a new string and copies everything. Use StringBuilder/list.join().
  • "Implement strStr() (substring search)." → Brute force O(n×m). KMP O(n+m). Rabin-Karp O(n+m) avg with rolling hash. Know at least brute force well.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • 2 focused hours on SyncSpace. Pick up where you left off. Commit progress at end of session.
Day 03 — Linked Lists · Friday, Apr 18
DSA · 1.5 hrs
Linked List Operations & Patterns
  • Reversal in-place (3 pointers): prev=null, curr=head. Each step: save next → point curr.next to prev → move prev=curr → move curr=next. O(n), O(1) space.
  • Floyd's cycle detection: slow moves 1 step, fast moves 2 steps. If they meet → cycle exists. To find cycle start: reset slow to head, move both by 1 → they meet at cycle start.
  • Dummy node pattern: create dummy node before head. Attach nodes to dummy.next. Eliminates ALL edge cases for head insertion/deletion. Return dummy.next.
  • Find middle: slow/fast pointer — when fast reaches end, slow is at middle. Crucial for: palindrome check, merge sort on LL.
  • Merge two sorted lists: compare heads, attach smaller to result, advance that pointer. Dummy node makes this clean.
  • Remove nth from end: two pointers with n gap. When fast reaches end, slow is at the node before the target. One pass O(n).
  • Intersection of two lists: compute lengths, advance the longer list by the difference, then advance both until they meet. Or pointer-switching: one pointer walks A then B, the other walks B then A — both travel lenA + lenB steps, so they meet at the intersection (or both hit null).
  • Doubly linked list: each node has prev AND next. Enables O(1) delete-by-reference (crucial for LRU cache). Insert: update 4 pointers.
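A compact sketch of the two patterns interviewers ask first — 3-pointer reversal and Floyd's cycle detection — with a minimal hypothetical Node class:

```python
class Node:
    def __init__(self, val, nxt=None):
        self.val, self.next = val, nxt

def reverse(head):
    # 3-pointer in-place reversal: save next BEFORE overwriting curr.next
    prev, curr = None, head
    while curr:
        nxt = curr.next       # save
        curr.next = prev      # rewire
        prev, curr = curr, nxt  # advance
    return prev

def has_cycle(head):
    # Floyd's: slow moves 1 step, fast moves 2; if they meet, a cycle exists
    slow = fast = head
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
        if slow is fast:
            return True
    return False

rev = reverse(Node(1, Node(2, Node(3))))
print(rev.val, rev.next.val, rev.next.next.val)  # 3 2 1
```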
Approach · Pointer Manipulation Logic
Drawing diagrams is not optional — it's required
  • ALWAYS draw the linked list on paper first. Boxes with arrows. Trace your algorithm step by step. 90% of LL bugs come from not visualizing.
  • Order of pointer updates matters. Save next BEFORE overwriting curr.next. Otherwise you lose the reference and can't traverse.
  • Null guard everything: always check curr != null and curr.next != null before accessing .next.next. This is the #1 runtime error.
  • Recursion vs iteration: reversal can be done both ways. Recursive uses O(n) call stack, iterative uses O(1). Know both — interviewers ask for both.
  • When to use dummy node: any time the head might change (insertion at front, deletion of head). Save yourself from writing special-case code.
Aptitude · 45 min
Ratios, Proportions & Mixtures
  • Ratio a:b: A's share = a/(a+b) of total. Always simplify the ratio first.
  • Proportions: if a/b = c/d then ad = bc (cross multiply). Direct proportion: more A → more B. Inverse: more A → less B.
  • Alligation rule: (expensive − mean) : (mean − cheap) = ratio of cheap : expensive in the mix. Draw the cross diagram.
  • Mixture replacement: remove R litres of mixture and refill with water — the fraction of the original liquid becomes X × (1 − R/V). Repeated n times: X × (1 − R/V)ⁿ. (Removing alone doesn't change the fraction; replacement does.)
  • Practice: IndiaBix → Ratio & Proportion + Alligation (20Q each).
Interview Corner
Linked list interview questions
  • "Reverse a linked list — explain your approach." → Draw 3-4 nodes. Show prev, curr, next pointers. Walk through step by step. Then code.
  • "How do you detect a cycle? Prove it works." → Floyd's: fast gains 1 step per iteration, so they MUST meet if cycle exists. Math: distance mod cycle length.
  • "Array vs Linked List — when would you choose each?" → Array: random access O(1), cache-friendly. LL: O(1) insert/delete at known position, no resize needed.
  • "Implement an LRU Cache." → HashMap + Doubly Linked List. HashMap for O(1) lookup, DLL for O(1) insert/remove to maintain order. This is a CLASSIC.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • 2 hrs on SyncSpace. Test what you built yesterday before adding new code.
Day 04 — Stacks & Queues · Saturday, Apr 19
DSA · 1.5 hrs
Stack + Queue + Monotonic Patterns
  • Stack = "remember the past": LIFO. Used for: bracket matching, undo/redo, next greater element, function call simulation, expression evaluation.
  • Monotonic stack: maintain elements in increasing or decreasing order. Pop when current element violates order. Gives next greater/smaller in O(n) total.
  • Queue = "process in order": FIFO. BFS always uses a queue. Level-order tree traversal. Task scheduling.
  • Deque (double-ended queue): add/remove from both ends O(1). Sliding window maximum: front = max of current window, remove expired from front, maintain decreasing order from back.
  • Stack for expression evaluation: infix → postfix (Shunting Yard). Evaluate postfix: operand → push, operator → pop two, compute, push result.
  • Two stacks → queue: push to stack1, pop: if stack2 empty, transfer all from stack1 → stack2, then pop stack2. Amortized O(1) per operation.
  • Min stack: maintain a parallel stack that tracks the minimum. On push: push min(val, currentMin). On pop: pop from both. getMin() = O(1).
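The monotonic-stack idea above, sketched on Next Greater Element (sample input illustrative):

```python
def next_greater(nums):
    # Monotonic decreasing stack of indices: pop while the current value is greater.
    # Each index is pushed and popped at most once -> O(n) amortized total.
    res = [-1] * len(nums)
    stack = []  # indices still waiting for their next-greater element
    for i, x in enumerate(nums):
        while stack and nums[stack[-1]] < x:
            res[stack.pop()] = x
        stack.append(i)
    return res

print(next_greater([2, 1, 5, 3]))  # [5, 5, -1, -1]
```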
Approach · Recognizing Stack/Queue Problems
Signal words that scream "use a stack"
  • "nearest", "previous", "next greater/smaller" → Monotonic stack.
  • "balanced", "matching", "nested" → Stack for bracket/tag matching.
  • "undo", "back", "history" → Stack naturally models this.
  • "level", "shortest path", "BFS" → Queue.
  • Amortized O(1) analysis: in monotonic stack, even though inner loop pops multiple times, each element is pushed and popped AT MOST once → O(n) total, not O(n²). Learn to explain this.
  • Brute force for "next greater": for each element, scan right O(n²). Stack: O(n) total. Explain why — interviewers love amortized complexity analysis.
Aptitude · 45 min
Time, Speed & Distance
  • D = S × T. Convert units first — always. km/h to m/s: × 5/18. m/s to km/h: × 18/5.
  • Relative speed: same direction = subtract speeds. Opposite direction = add speeds.
  • Trains: cross a pole = train length / speed. Cross each other = sum of lengths / relative speed. Cross a bridge = (train + bridge) / speed.
  • Average speed: for equal distances at speeds s₁ and s₂: avg speed = 2s₁s₂/(s₁+s₂). NOT (s₁+s₂)/2.
  • Boats & Streams: downstream speed = boat + stream. Upstream = boat − stream. Still water speed = (downstream + upstream)/2.
  • Practice: IndiaBix → Time and Distance + Boats and Streams (25Q).
Interview Corner
Stack/Queue interview classics
  • "Implement a queue using two stacks." → Lazy transfer. Push to in-stack, pop: if out-stack empty, transfer all, then pop. Amortized O(1).
  • "What is amortized O(1)? How is it different from O(1)?" → Individual operations may cost O(n), but averaged over n operations, each costs O(1). Like paying monthly vs one-time.
  • "Design a browser history with back/forward." → Two stacks: back-stack and forward-stack. Navigate: push current to back, pop forward. Back: push current to forward, pop back.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Saturday — push hard. Aim to close at least one feature end-to-end.
Day 05 — Hashing & Two Pointers · Sunday, Apr 20
DSA · 1.5 hrs
HashMap Mastery + Two-Pointer Techniques
  • HashMap fundamentals: O(1) avg get/put/delete, O(n) worst case (collision). Space O(n). Converts any O(n) search → O(1) lookup.
  • Two Sum pattern: for each element, check if complement (target - num) is in map. If yes → answer. If no → store current in map. Single pass O(n).
  • Two pointer on sorted array: left=0, right=n-1. Sum too small → left++. Sum too large → right--. Only works on SORTED data.
  • 3Sum: sort → fix element i → two-pointer on remaining. Skip duplicates: while(nums[i]==nums[i-1]) skip. O(n²) total.
  • Subarray sum = k (prefix sum + HashMap): prefixSum[j] - prefixSum[i] = k. Store prefix sums in map. Check if (currentSum - k) exists in map. O(n).
  • Minimum window substring: need = freq(target), have = 0. Expand right (update window freq, increment have if char fully satisfied). Shrink left when have == need (update answer, decrement counts).
  • HashSet for existence checks: O(1) contains. Used for: duplicate detection, valid sudoku, longest consecutive sequence.
  • Longest consecutive sequence: put all in HashSet. For each num, if num-1 NOT in set → it's a sequence start. Count forward. O(n) total.
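Two of the patterns above — the Two Sum complement map and prefix-sum + HashMap for subarray sum = k — as a minimal Python sketch:

```python
def two_sum(nums, target):
    # seen: value -> index where we saw it; check the complement before storing
    seen = {}
    for i, x in enumerate(nums):
        if target - x in seen:
            return seen[target - x], i
        seen[x] = i
    return None

def count_subarrays_sum_k(nums, k):
    # Count occurrences of (currentSum - k) among prefix sums seen so far
    count, curr = 0, 0
    freq = {0: 1}  # the empty prefix
    for x in nums:
        curr += x
        count += freq.get(curr - k, 0)
        freq[curr] = freq.get(curr, 0) + 1
    return count

print(two_sum([2, 7, 11, 15], 9))           # (0, 1)
print(count_subarrays_sum_k([1, 1, 1], 2))  # 2
```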
Approach · Decision Framework
When HashMap vs Two Pointer vs Sort
  • Is array sorted? → Two pointers (no extra space, O(n)).
  • Not sorted but can sort? → If positions don't matter: sort first, then two-pointer. O(n log n).
  • Need to preserve positions/indices? → Can't sort. Use HashMap. O(n) time, O(n) space.
  • "Can I do it in one pass?" — ask this before writing any loop. Most array problems can be single-pass with a HashMap.
  • Duplicate handling in nSum: after processing, skip duplicates with while(arr[i] == arr[i-1]) i++. Critical for correct output.
  • HashMap code quality: always comment what key and value represent: // key: complement needed, value: index where we saw it
Aptitude · 45 min
Time & Work + Pipes & Cisterns
  • If A does work in n days → rate = 1/n per day. Combined rate: 1/A + 1/B = 1/T → T = AB/(A+B).
  • A works for 3 days, then B joins: work done by A in 3 days = 3/A. Remaining = 1 - 3/A. Time for remainder at combined rate.
  • Pipes: fill pipe = positive rate, drain pipe = negative rate. Net rate = sum of all rates.
  • Efficiency method: pick total work = LCM of individual times. Much easier than fraction arithmetic.
  • Practice: IndiaBix → Time and Work + Pipes and Cisterns (20Q each).
Interview Corner
HashMap interview questions
  • "How does a HashMap work internally?" → Hash function maps key → bucket index. Collision handling: chaining (linked list per bucket) or open addressing (probe to next slot).
  • "What's the time complexity of HashMap in worst case?" → O(n) when all keys hash to same bucket. Java 8+: buckets degrade to balanced BST → O(log n) worst.
  • "HashMap vs HashSet vs TreeMap?" → HashMap: key-value O(1). HashSet: just keys O(1). TreeMap: sorted keys O(log n). Know when to use each.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Sunday deep focus. Build, test, commit. Every commit is interview-ready evidence.
Day 06 — Recursion & Backtracking · Monday, Apr 21
DSA · 1.5 hrs
Recursion Fundamentals + Backtracking Template
  • Recursion = reduce to smaller subproblem + base case. Two things: (1) what makes the problem smaller each call, (2) when to stop.
  • Backtracking template: for each choice → make choice → recurse → UNDO choice. The undo step is critical — without it you're not backtracking.
  • State space tree: draw it for every backtracking problem. Each level = one decision. Each leaf = one complete solution.
  • Subsets vs Permutations: subsets use a start index (no repeats, order doesn't matter). Permutations use a visited[] array (order matters).
  • Combination Sum: allow same element multiple times → recurse with same start index. Each element once → start = i+1.
  • Pruning: cut branches early when constraint is violated. Example: if currentSum > target → return immediately. Massively reduces search space from exponential.
  • Generate Parentheses: at each step: add '(' if open < n, add ')' if close < open. Natural backtracking with constraint pruning.
  • N-Queens: place one queen per row, check column/diagonal conflicts. Classic backtracking with constraint propagation.
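The choose → explore → un-choose template above, sketched on subsets (the start index prevents repeats):

```python
def subsets(nums):
    # Backtracking template: make choice -> recurse -> UNDO choice
    result, current = [], []

    def solve(start):
        result.append(current[:])     # record a copy of the current subset
        for i in range(start, len(nums)):
            current.append(nums[i])   # choose
            solve(i + 1)              # explore
            current.pop()             # un-choose (backtrack)

    solve(0)
    return result

print(subsets([1, 2, 3]))  # 2^3 = 8 subsets, from [] to [1, 2, 3]
```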
Approach · Recursion Thinking
How to convert a problem into recursive thinking
  • Backtracking is ALWAYS exponential worst case: O(2ⁿ) for subsets, O(n!) for permutations. That's expected — there's no polynomial solution for exhaustive enumeration.
  • When to add memoization: if the same subproblem (same arguments) appears multiple times in the recursion tree → memoize. This converts backtracking → DP.
  • How to spot overlapping subproblems: draw the recursion tree. If two branches produce identical function calls → overlapping. Add cache.
  • Code structure: result=[], define solve(start, current). solve() modifies current, adds to result when base case met, then undoes. Keep solve() clean — just the template.
  • Performance tip: don't copy the current path into a new list on every recursive call — that's O(n) per copy. Mutate one shared list with append/pop (backtrack) and copy only when recording a complete solution.
Aptitude · 45 min
Simple & Compound Interest + Permutations/Combinations
  • SI: P×N×R/100. CI: P×(1+R/100)ⁿ − P. CI − SI for 2 years = P×(R/100)².
  • Difference between CI and SI for 3 years: = P×R²(300+R)/100³. Appears in every placement test.
  • nPr = n!/(n-r)! — arrangement, order matters. nCr = n!/r!(n-r)! — selection, order doesn't matter.
  • Key insight: "how many ways to arrange" = P. "How many ways to choose/select" = C.
  • Practice: IndiaBix → SI + CI + Permutations and Combinations.
Interview Corner
Recursion interview questions
  • "What's the difference between recursion and iteration?" → Any recursion can be converted to iteration with an explicit stack. Recursion uses call stack (O(n) space), iteration can be O(1).
  • "What causes stack overflow?" → Too many recursive calls without hitting base case. Fix: ensure base case is reachable, or convert to iteration.
  • "What's tail recursion?" → Recursive call is the last operation. Some compilers optimize this to a loop (tail call optimization). Java/Python don't, but important to know.
  • "Explain backtracking vs brute force." → Backtracking prunes branches that can't lead to valid solutions. Brute force tries everything. Same worst case, but backtracking is faster in practice.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • 2 hrs on SyncSpace. Monday — start the work week strong.
Day 07 — Week 1 Review + Mock Interview · Tuesday, Apr 22 · Full Simulation
Revision · 30 min — Rapid Pattern Recall
Review ALL Week 1 patterns — 5 min per topic
  • Arrays: prefix sum template, Kadane's recurrence, sliding window expand/shrink, two-pointer convergence
  • Strings: frequency map + sliding window, palindrome expansion from center, substring vs subsequence
  • Linked Lists: 3-pointer reversal, Floyd's cycle, dummy node merge, slow/fast middle
  • Stacks: monotonic stack template, amortized O(n), bracket matching, min-stack
  • Hashing + 2P: complement map, sorted two-pointer, 3Sum dedup, subarray sum prefix+map
  • Recursion: backtracking template (choose→explore→unchoose), subsets vs permutations, pruning
Mock Interview · 90 min — No Hints, Timer ON
Simulate a Real Placement Interview
  • Protocol for EVERY problem: (1) Read fully (2) Clarify edge cases (3) State brute force + complexity (4) Optimize (5) Code (6) Trace example (7) State complexity
  • Timing: Easy = 15 min. Medium = 25 min. Hard = 40 min. ABANDON if stuck beyond limit.
  • Think aloud the entire time. Silence = red flag in interviews.
Aptitude Mock · 30 min timed + 30 min review
30-Question Mixed Quantitative Test
  • IndiaBix Mixed Quantitative Mock — strict 30 min timer. No calculator.
  • After: spend 30 min reviewing EVERY wrong answer. Write the correct method step by step.
  • Track your accuracy: target 70%+ by end of sprint.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Review day — use SyncSpace time to test and clean up what you built this week.
Week 2 — Trees, Graphs & Dynamic Programming · Apr 23–28
The heavy-hitters. Trees → BST/Heaps → Graphs → DP 1D → DP 2D → Greedy. Don't rush — understanding beats quantity. If you understand DP state definition, you can solve any DP problem. If you don't, you can't solve any.
Day 08 — Binary Trees — Traversals & DFS · Wednesday, Apr 23
DSA · 1.5 hrs
Binary Tree Traversals + DFS Patterns
  • Inorder (L→Root→R): gives sorted order for BST. Recursive: go left, visit, go right.
  • Preorder (Root→L→R): visit root first. Used to serialize/clone tree structure.
  • Postorder (L→R→Root): visit root last. Used for deletion, computing aggregates bottom-up.
  • Level-order (BFS): queue-based. Add root, process level by level using queue size. Used for: shortest path, level averages, rightmost view.
  • DFS recursive pattern: most tree problems = "compute something for left + right + combine with root". The key is: what do you RETURN from each call?
  • Max depth: 1 + max(depth(left), depth(right)). Base case: null → 0.
  • Diameter: max of (leftHeight + rightHeight) across ALL nodes. Use closure/global var to track max while returning height.
  • Path sum: target decreases at each node. At leaf: check if remaining target == 0.
  • Invert tree: swap left and right children at every node recursively. One line: [node.left, node.right] = [node.right, node.left].
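The diameter pattern above — a bottom-up DFS that returns height while a shared variable tracks the max — as a sketch with a minimal hypothetical TreeNode:

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def diameter(root):
    # Return height from each call; update shared max of leftH + rightH per node
    best = 0

    def height(node):
        nonlocal best
        if not node:
            return 0
        lh, rh = height(node.left), height(node.right)
        best = max(best, lh + rh)
        return 1 + max(lh, rh)

    height(root)
    return best

#        1
#       / \
#      2   3
#     / \
#    4   5
root = TreeNode(1, TreeNode(2, TreeNode(4), TreeNode(5)), TreeNode(3))
print(diameter(root))  # 3 (path 4 -> 2 -> 1 -> 3)
```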
Approach · Tree Problem Framework
Ask: top-down or bottom-up?
  • Top-down: pass information DOWN as parameters (path sum, valid range for BST). Good when parent constrains children.
  • Bottom-up: return information UP from children and combine (depth, diameter, subtree sum). Most problems are bottom-up.
  • BFS vs DFS: problem mentions "level"? → BFS. Mentions "path from root to leaf"? → DFS. Need shortest distance? → BFS.
  • Global var in recursion: for problems where the answer isn't the return value (diameter = max across all nodes), use outer variable updated during traversal.
  • Every tree DFS = O(n) — visit each node once. Space = O(h) stack — O(log n) balanced, O(n) skewed worst case.
Aptitude · 45 min
Averages + Problems on Ages
  • Average = Sum/Count. When element removed: new avg = (old sum − element) / (n−1). When element added: similar.
  • Weighted average: sum(value × weight) / sum(weights). NOT simple average of averages.
  • Ages: set up equations. "x years ago A was twice B" → (A−x) = 2(B−x). Two unknowns need two equations.
  • Key trick: the DIFFERENCE in ages never changes. If A is 5 years older than B now, A will always be 5 years older.
  • Practice: IndiaBix → Average + Problems on Ages (20Q each).
Interview Corner
Tree interview questions
  • "What's the difference between BFS and DFS?" → BFS: level-by-level, uses queue, good for shortest path. DFS: goes deep first, uses stack/recursion, good for path-finding.
  • "Find the diameter of a binary tree." → DFS returning height, update global max = leftH + rightH at each node. O(n).
  • "Serialize and deserialize a binary tree." → Preorder with null markers. Serialize: visit→left→right. Deserialize: read tokens, recursively build left then right.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Week 2 — push into new features or deepen existing ones.
Day 09 — BST + Heaps & Priority Queues · Thursday, Apr 24
DSA · 1.5 hrs
BST Properties + Heap Operations
  • BST property: left < root < right at EVERY node. Inorder traversal = sorted sequence.
  • Validate BST: pass valid range [min, max] down. Root: (−∞, +∞). Left child: (min, root.val). Right child: (root.val, max).
  • BST search: if target < node → go left. If target > node → go right. O(log n) balanced, O(n) skewed.
  • Heap = complete binary tree where parent ≤ children (min-heap). Stored as array: parent(i) = (i-1)/2, left(i) = 2i+1, right(i) = 2i+2.
  • Insert: add at end, bubble UP (swap with parent while smaller). O(log n).
  • ExtractMin: swap root with last → remove last → bubble DOWN (swap with smaller child while larger). O(log n).
  • Kth largest: maintain min-heap of size k. If heap.size > k → pop min. After scanning all, heap.top() = kth largest. O(n log k).
  • Top K frequent: frequency map → heap of (freq, element) → extract top k. O(n log k).
  • Median from stream: two heaps — maxHeap for lower half, minHeap for upper half. Balance sizes. Median = top of one or avg of both tops.
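The kth-largest pattern above, using Python's built-in `heapq` (a min-heap):

```python
import heapq

def kth_largest(nums, k):
    # Keep a min-heap of size k: its root is the kth largest seen so far. O(n log k).
    heap = []
    for x in nums:
        heapq.heappush(heap, x)
        if len(heap) > k:
            heapq.heappop(heap)  # drop the smallest of the k+1 elements
    return heap[0]

print(kth_largest([3, 2, 1, 5, 6, 4], 2))  # 5
```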
Approach · Heap vs Sort Decision
When to use which
  • Need all elements sorted: just sort. O(n log n), O(1) in-place.
  • Need only top-k and k ≪ n: heap of size k. O(n log k) ≪ O(n log n). Much faster.
  • "Process by priority" or "always get current min/max": heap/priority queue.
  • BST vs HashMap: BST gives sorted order + range queries. HashMap gives O(1) but no ordering. Use BST when you need sorted operations.
Aptitude · 45 min
Logical Reasoning — Number Series + Odd One Out
  • Check in order: (1) Arithmetic? Constant diff. (2) Geometric? Constant ratio. (3) Diff of diffs? (4) Squares/cubes? (5) Two interleaved series?
  • Odd one out: find which number breaks the pattern. Apply all 5 checks above.
  • Prime-based series: 2, 3, 5, 7, 11, 13… or gaps between primes.
  • Practice: IndiaBix → Logical → Number Series + Odd One Out (20Q).
Interview Corner
BST/Heap questions
  • "When would you use a heap over sorting?" → When you need top-k with k ≪ n, or when data arrives as a stream and you can't sort all at once.
  • "What happens if a BST becomes unbalanced?" → Degrades to O(n). Solution: self-balancing BSTs (AVL, Red-Black). Know concept, not implementation.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Build, test, commit. Stuck? Note the blocker, research, move on.
Day 10 — Graphs — BFS, DFS, Cycles · Friday, Apr 25
DSA · 1.5 hrs
Graph Representations + Traversals + Cycles + Topo Sort
  • Adjacency list: Map of node → [neighbors]. Space O(V+E). Default for most problems.
  • BFS: queue + visited set. Mark visited BEFORE adding to queue (not after — avoids duplicate processing). Gives shortest path in unweighted graphs.
  • DFS: recursion + visited set. Goes deep first. Use for: all paths, cycle detection, connected components, topological sort.
  • Multi-source BFS: add ALL starting nodes to queue before starting. Used for "rotting oranges" — all sources propagate simultaneously.
  • Grid as graph: cell (r,c) = node. Neighbors = 4 adjacent cells. visited = 2D boolean array. Standard BFS/DFS applies.
  • Cycle detection (directed): track visited + currently-in-path (recursion stack). If you reach a node in current path → cycle.
  • Cycle detection (undirected): if you reach a visited node that isn't the parent → cycle.
  • Topological sort: only for DAGs. DFS: push to result AFTER all neighbors visited. Reverse post-order. Or Kahn's: repeatedly remove 0-indegree nodes.
  • Connected components: run DFS/BFS from each unvisited node. Each run finds one component. Count = number of runs.
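Grid-as-graph plus component counting, sketched as BFS flood fill — note the visited mark happens on enqueue, as stressed above (sample grid is illustrative):

```python
from collections import deque

def num_islands(grid):
    # Count connected components of 1s; mark visited WHEN enqueuing, not later
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                count += 1                      # each BFS run = one island
                seen[r][c] = True
                q = deque([(r, c)])
                while q:
                    cr, cc = q.popleft()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and grid[nr][nc] == 1 and not seen[nr][nc]):
                            seen[nr][nc] = True   # mark on enqueue
                            q.append((nr, nc))
    return count

grid = [[1, 1, 0],
        [0, 0, 0],
        [0, 0, 1]]
print(num_islands(grid))  # 2
```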
Approach · Graph Problem Decomposition
BFS vs DFS — which to use
  • "Shortest path" or "minimum steps" → BFS. Always.
  • "All paths", "cycle detection", "topological order" → DFS.
  • "Connected components" → Either works. DFS is usually simpler to write.
  • "Count islands" → DFS/BFS flood fill. When you find a 1, explore all connected 1s and mark visited. Count how many times you start an exploration.
  • Critical BFS bug: mark visited WHEN you add to queue, not when you process. Otherwise you add the same node multiple times → wrong answer or TLE.
  • Complexity: BFS/DFS = O(V+E). For grid n×m: O(n×m). Say this precisely — don't just say O(n²).
Aptitude · 45 min
LR — Coding-Decoding + Blood Relations
  • Letter coding: find the shift (A→D = +3). Verify with 2-3 examples before answering.
  • Blood relations: DRAW the family tree. Label each node with gender. Never solve in head.
  • "A's mother's brother's daughter" → trace each relationship on the tree one step at a time.
  • Practice: IndiaBix → Logical → Coding-Decoding + Blood Relations (20Q).
Interview Corner
Graph interview questions
  • "Detect a cycle in a directed graph." → DFS with 3 states: unvisited, in-progress, done. If you reach an in-progress node → cycle.
  • "What's the difference between BFS and DFS on graphs?" → BFS: finds shortest path, level-by-level, O(V+E). DFS: goes deep, good for exhaustive search, O(V+E).
  • "How would you find the shortest path between two nodes?" → Unweighted: BFS. Weighted non-negative: Dijkstra O((V+E)logV). Negative weights: Bellman-Ford O(VE).
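The 3-state cycle-detection answer above can be sketched like this. The adjacency-dict input format and the name `has_cycle` are assumptions for illustration:

```python
def has_cycle(graph):
    """Directed-graph cycle detection. graph: {node: [neighbors]}."""
    UNVISITED, IN_PROGRESS, DONE = 0, 1, 2
    state = {node: UNVISITED for node in graph}

    def dfs(node):
        state[node] = IN_PROGRESS
        for nxt in graph.get(node, []):
            if state.get(nxt, UNVISITED) == IN_PROGRESS:
                return True        # back edge into the current path = cycle
            if state.get(nxt, UNVISITED) == UNVISITED and dfs(nxt):
                return True
        state[node] = DONE         # fully explored, safe to revisit
        return False

    return any(state[n] == UNVISITED and dfs(n) for n in graph)

print(has_cycle({'a': ['b'], 'b': ['c'], 'c': ['a']}))  # True
print(has_cycle({'a': ['b'], 'b': ['c'], 'c': []}))     # False
```

Note the undirected version is different: there a visited neighbor only signals a cycle if it is not the DFS parent.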
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Friday push — close at least one feature completely before the weekend.
11

Dynamic Programming — 1D

Saturday, Apr 26

DSA · Approach · Problems · Apti · Interview · Sync
DSA · 1.5 hrs
DP Part 1 — 1D Problems
  • DP = recursion + memoization. Two signals: (1) overlapping subproblems — same call repeated, (2) optimal substructure — optimal solution uses optimal sub-solutions.
  • Top-down (memoization): write recursion, add cache dictionary. Natural to think, easier to reason about correctness.
  • Bottom-up (tabulation): fill dp[] from base case forward. No recursion overhead. Better for large inputs (no stack overflow).
  • Take/skip pattern: dp[i] = max(dp[i-1], dp[i-2] + nums[i]). At each step: skip current (keep dp[i-1]), OR take current plus the best from two back (dp[i-2] + nums[i]). House Robber pattern.
  • Coin change: dp[amount] = min coins to make amount. dp[0]=0. dp[i] = 1 + min(dp[i-coin]) for each coin ≤ i.
  • Climbing stairs: dp[i] = dp[i-1] + dp[i-2]. Same as Fibonacci. Base: dp[0]=1, dp[1]=1.
  • LIS (Longest Increasing Subsequence): dp[i] = 1 + max(dp[j]) for all j < i where nums[j] < nums[i]. O(n²). Can be optimized to O(n log n) with binary search.
  • Word break: dp[i] = true if some j < i has dp[j] true AND s[j..i] is in the dictionary. Base: dp[0] = true (empty prefix).
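A bottom-up sketch of the coin-change recurrence listed above; the function name and the state comment style follow the notes, but this is one of several valid ways to write it:

```python
def coin_change(coins, amount):
    # dp[i] = minimum coins needed to make amount i (state in English!)
    INF = float('inf')
    dp = [0] + [INF] * amount          # dp[0] = 0: zero coins make amount 0
    for i in range(1, amount + 1):
        for coin in coins:
            if coin <= i and dp[i - coin] + 1 < dp[i]:
                dp[i] = dp[i - coin] + 1
    return dp[amount] if dp[amount] != INF else -1

print(coin_change([1, 2, 5], 11))  # 3  (5 + 5 + 1)
print(coin_change([2], 3))         # -1 (impossible)
```

Complexity: O(amount × len(coins)) time, O(amount) space.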
Approach · How to Define DP State
This is the MOST important step — get it wrong and everything fails
  • Step 1 — Define dp[i] in English. "dp[i] = minimum cost to reach stair i." One sentence that forces correct thinking.
  • Step 2 — Find the recurrence. How does dp[i] relate to smaller subproblems? Write the formula before coding.
  • Step 3 — Base cases. What are the trivially known answers? dp[0]=0 or dp[0]=1? Get this wrong and everything cascades.
  • Step 4 — Space optimization: most 1D DP only needs dp[i-1] and dp[i-2]. Use two variables instead of array. O(n) → O(1) space.
  • Debugging DP: print dp array after filling. Verify first 3-4 values match manual calculation. If wrong → your recurrence is wrong, not the code.
  • Code quality: add a comment above dp: // dp[i] = minimum coins to make amount i. Interviewers respect this clarity.
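Step 4's two-variable space optimization, shown on the House Robber recurrence from the DSA notes above (a minimal sketch; `rob` is the conventional name for this problem):

```python
def rob(nums):
    # dp[i] = max loot from houses 0..i
    # dp[i] = max(dp[i-1], dp[i-2] + nums[i]) -> only two values ever needed
    prev2, prev1 = 0, 0                # dp[i-2], dp[i-1]
    for num in nums:
        prev2, prev1 = prev1, max(prev1, prev2 + num)
    return prev1

print(rob([2, 7, 9, 3, 1]))  # 12  (rob houses 2 + 9 + 1)
```

Same recurrence, same answer as the full dp[] array, but O(1) space instead of O(n).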
Aptitude · 45 min
LR — Directions + Seating Arrangement
  • Directions: start facing North. Right = clockwise. Left = counterclockwise. Track x,y coordinates.
  • Seating: draw the line/circle FIRST. Fill the most constrained person first. Use elimination.
  • Circular: fix one person, then arrange the remaining n-1 in (n-1)! ways. Fixing one person eliminates rotational duplicates.
  • Practice: IndiaBix → Logical → Direction Sense + Seating (20Q).
Interview Corner
DP interview classics
  • "What is dynamic programming?" → Solving problems by breaking into overlapping subproblems and storing results to avoid recomputation. Not just "recursion with cache" — explain the WHY.
  • "Top-down vs bottom-up?" → Top-down: natural recursive thinking, memoize. Bottom-up: iterative, fill table. Both give same result. Bottom-up avoids stack overflow.
  • "How do you know a problem is DP?" → (1) "Find min/max/count of ways" (2) You can break it into subproblems (3) Subproblems overlap (same inputs recur).
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Saturday — use extra focus time for complex features needing deeper thought.
12

DP Part 2 — 2D Grid & String DP

Sunday, Apr 27

DSA · Approach · Problems · Apti · Sync
DSA · 1.5 hrs
2D DP — Grids + String Comparison
  • Unique Paths: dp[i][j] = dp[i-1][j] + dp[i][j-1]. Base: first row and column = 1.
  • Min Path Sum: dp[i][j] = grid[i][j] + min(dp[i-1][j], dp[i][j-1]).
  • LCS (Longest Common Subsequence): if s1[i]==s2[j] → dp[i][j] = dp[i-1][j-1]+1. Else → max(dp[i-1][j], dp[i][j-1]).
  • Edit Distance: if match → dp[i][j] = dp[i-1][j-1]. Else → 1 + min(dp[i-1][j-1], dp[i-1][j], dp[i][j-1]) (replace, delete, insert).
  • Space optimization: if dp[i][j] only depends on current + previous row → use two 1D arrays. O(m×n) → O(n) space.
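The LCS recurrence above as a full 2D table, with the state defined in a comment as the methodology recommends (a sketch; 1-based dp indices against 0-based strings are one common convention):

```python
def lcs_length(s1, s2):
    m, n = len(s1), len(s2)
    # dp[i][j] = LCS length of s1[:i] and s2[:j]; row/col 0 = empty string
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s1[i - 1] == s2[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1      # extend the match
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

print(lcs_length("abcde", "ace"))  # 3  ("ace")
```

O(m×n) time and space; the rolling-array trick below cuts space to O(n).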
Approach · 2D DP Methodology
  • Always draw the table. Label rows = string1 (+ empty), columns = string2 (+ empty). Fill left-to-right, top-to-bottom. Verify manually.
  • Initialize first row and column separately. Then fill the rest. Don't put base cases inside the main loop — clutters logic.
  • Space optimization with rolling array: use prev[] and curr[] arrays, swap after each row. Further: single array with careful left-to-right scan + one extra variable.
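The prev[]/curr[] rolling-array idea, applied to the same LCS recurrence (an illustrative sketch; any 2D DP that only reads the previous row optimizes the same way):

```python
def lcs_rolling(s1, s2):
    # dp[i][j] depends only on the current and previous row,
    # so two 1D arrays suffice: O(m*n) time, O(n) space.
    prev = [0] * (len(s2) + 1)
    for ch1 in s1:
        curr = [0] * (len(s2) + 1)
        for j, ch2 in enumerate(s2, start=1):
            if ch1 == ch2:
                curr[j] = prev[j - 1] + 1
            else:
                curr[j] = max(prev[j], curr[j - 1])
        prev = curr                    # roll: current row becomes previous
    return prev[-1]

print(lcs_rolling("abcde", "ace"))  # 3
```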
Aptitude · 45 min
Verbal — Reading Comprehension + Grammar
  • RC strategy: read QUESTIONS first → skim passage → locate evidence → answer based on passage text, not general knowledge.
  • Grammar focus: subject-verb agreement, tense consistency, prepositions (in/on/at), article usage (a/an/the).
  • Error spotting: check each part — subject, verb, object, preposition, pronoun agreement.
  • Practice: IndiaBix → Verbal Ability → RC + Error Spotting + Sentence Correction.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Make the project demo-worthy. Every feature you complete is a talking point.
13

Greedy Algorithms & Sorting

Monday, Apr 28

DSA · Approach · Problems · Apti · Interview · Sync
DSA · 1.5 hrs
Greedy + Sorting Algorithms
  • Greedy = locally optimal → globally optimal. Must prove correctness (exchange argument) or find counterexample.
  • Interval scheduling: sort by END time. Always pick the interval that ends earliest → leaves max room for future intervals.
  • Merge intervals: sort by start. If current.start ≤ prev.end → merge (extend prev.end). Else → add new.
  • Jump Game: track maxReach. At each index: maxReach = max(maxReach, i + nums[i]). If i > maxReach → return false.
  • Merge Sort: O(n log n) always, stable, O(n) space. Split → recursively sort halves → merge.
  • Quick Sort: O(n log n) avg, O(n²) worst. In-place O(log n) space. Random pivot fixes worst case. Fastest in practice.
  • Counting Sort: O(n+k) for integers in range [0, k]. Non-comparative. Very fast for limited range.
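Interval scheduling, the first greedy pattern above, in a short Python sketch: sort by end time, then greedily take every interval that fits. `max_non_overlapping` is an illustrative name:

```python
def max_non_overlapping(intervals):
    # Greedy: the interval that ENDS earliest leaves maximum room for
    # the rest (provable by the exchange argument).
    intervals = sorted(intervals, key=lambda iv: iv[1])
    count, last_end = 0, float('-inf')
    for start, end in intervals:
        if start >= last_end:          # fits after the last chosen one
            count += 1
            last_end = end
    return count

print(max_non_overlapping([(1, 3), (2, 4), (3, 5), (6, 7)]))  # 3
```

Sorting by start time instead is the classic wrong greedy here: a long early-starting interval can block several short ones.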
Approach · Greedy vs DP Decision
  • Greedy works when: optimal substructure AND greedy choice property (local optimal leads to global optimal).
  • DP works when: optimal substructure BUT no greedy choice property — need to try multiple options.
  • Greedy is a special case of DP where only one choice per subproblem matters. If greedy works, use it — it's faster.
  • Exchange argument: assume optimal solution doesn't use greedy choice. Show swapping to greedy choice doesn't make it worse. Therefore greedy is optimal.
Aptitude · 45 min
Data Interpretation + Probability
  • Tables/bar charts: read headers and units carefully. % growth = (current-prev)/prev × 100.
  • Probability: P(A) = favourable/total. P(A∪B) = P(A)+P(B)-P(A∩B). P(A') = 1-P(A).
  • Independent events: P(A∩B) = P(A)×P(B). Conditional: P(A|B) = P(A∩B)/P(B).
  • Practice: IndiaBix → Data Interpretation (2 full sets) + Probability (15Q).
Interview Corner
  • "Explain merge sort." → Divide array in half, recursively sort each, merge two sorted halves. O(n log n), stable, O(n) extra space. Draw the recursion tree.
  • "When does greedy work vs DP?" → Greedy: each local choice leads to global optimal (activity selection). DP: need to try all choices (knapsack). Give examples of both.
SyncSpace · 2 hrs mandatory
SyncSpace Project Work
  • Last full week day — polish and stabilize. Clean up code, handle edge cases.
Final Sprint — Apr 29–30 · Consolidate & Polish
No new topics. Consolidate, simulate, polish. Day 14 = full mock. Day 15 = confidence round + final revision. Go in warm, not exhausted.
14

Full Mock Interview Day

Tuesday, Apr 29 · Full Simulation

Mock · Full Mock · Sync
Full DSA Mock · 2 hrs — Interview Protocol
Complete Placement Simulation
  • Protocol for every problem: read → constraints → brute force → optimize → plan → code → trace → state complexity.
  • If stuck: think aloud. Say "I'm considering whether a sliding window applies here…" Never sit silent.
  • Timing strict: Easy 15m, Medium 25m, Hard 40m. Move on if stuck.
Company-Specific Patterns
TCS / Infosys / Wipro — What to Expect
  • TCS NQT: basic — number patterns, string reversal, factorial, Fibonacci. Focus on SPEED and accuracy, not difficulty.
  • Infosys InfyTQ: medium arrays/strings + CS fundamentals (OOP, SQL, OS). Know your language syntax cold.
  • Wipro NLTH: moderate DSA + heavy aptitude. Balance both sections — don't over-invest in one.
  • Common to all: MCQ aptitude (30-40Q in 40-60 min), coding (1-3 problems in 60-90 min), essay/email writing.
Full Aptitude Mock · 50Q · 60 min
Timed Full-Length Placement Simulation
  • Quant (20Q) + Logical Reasoning (20Q) + Verbal (10Q). Strict 60 min timer.
  • PrepInsta Full Mock OR IndiaBix Combined Mock — full timed simulation.
  • After: mark every wrong answer, write the correct method next to it. This review is MORE valuable than the test itself.
SyncSpace · 2 hrs mandatory
SyncSpace — Demo Prep
  • This is polish time. Make sure core user flow works flawlessly. Quality > quantity.
15

Final Day — Revision & Confidence

Wednesday, Apr 30 · Sprint Complete

Revision · Confidence · Warm-up · Final
All Patterns — Rapid Recall · 1.5 hrs
5 min per topic — recall template only, don't re-solve
  • Arrays: prefix sum, Kadane's, sliding window, two pointer, binary search
  • Strings: freq map, expand/shrink window, palindrome expansion, character math
  • LL: 3-pointer reversal, Floyd's cycle, dummy node, slow/fast middle
  • Stack: monotonic stack template, amortized analysis, bracket matching
  • Hash + 2P: complement map, sorted two-pointer, 3Sum, prefix sum + HashMap
  • Recursion: backtracking template, pruning, subsets vs permutations
  • Trees: bottom-up DFS return, BFS queue level-by-level, BST range validation
  • Graphs: BFS=shortest, DFS=all+cycle, mark visited before adding to queue
  • DP: state → recurrence → base case → space optimization. Always define dp[i] in English first.
  • Greedy: sort + exchange argument. If greedy works → faster than DP.
Code Quality — Final Checklist
Internalize these — they matter in every interview
  • Meaningful names: leftPointer, maxSoFar, visitedNodes — not l, m, v
  • Guard clauses early: handle null/empty at the top. Don't nest the happy path.
  • No magic numbers: const ALPHABET_SIZE = 26, not just 26 inline.
  • State complexity after every solution — unprompted. "O(n) time, O(1) space because…"
  • Edge cases to always mention: empty input, single element, all duplicates, negative numbers, integer overflow.
  • One function = one job. If it does two things, extract a helper.
Confidence Round — Fluency, Not Difficulty
Solve these with perfect form. Explain aloud as you code.
Final Aptitude Warm-up
15-Question Mixed Set — Stay Sharp
  • 5 Quant + 5 Logical + 5 Verbal. IndiaBix random mix. Don't overthink. Trust your practice.
SyncSpace · 2 hrs mandatory
SyncSpace Final Polish
  • NO new features. Fix, clean, document. Write a clear README. Know the project well enough to explain any part in an interview.
Problem Solving Framework — Your Complete Toolkit
Read this every morning. The difference between pass and fail is almost never "knowing the algorithm" — it's applying a systematic approach under pressure.

The 7-Step Approach to Any Coding Problem

Step 1 — Read & Understand
  • Read the problem completely. Twice if needed. Identify input, output, and constraints.
  • Read ALL examples — they reveal edge cases the problem description hides.
  • Underline key words: "minimum", "maximum", "unique", "consecutive", "sorted".
Step 2 — Check Constraints
  • n ≤ 10² → O(n³) fine. n ≤ 10³ → O(n²) fine. n ≤ 10⁵–10⁶ → O(n log n) or O(n). n ≈ 10⁸ → O(n) is borderline. n ≥ 10⁹ → O(log n) or O(1) (math).
  • State this aloud: "With n up to 10⁵, I need O(n log n) or better."
Step 3 — Brute Force First
  • Always verbalize the naive solution with its complexity. "The brute force uses nested loops, O(n²)."
  • This shows understanding and gives a baseline. Sometimes brute force IS the answer.
Step 4 — Optimize (ask in order)
  • Q1: Can I trade space for time? → HashMap, prefix sum, memoization.
  • Q2: Are there repeated subproblems? → DP (memoize the recursion tree).
  • Q3: Can I sort + two-pointer? → If positions don't matter.
  • Q4: Is there a monotonic property? → Sliding window, monotonic stack.
  • Q5: Can I use math? → Sum formulas, XOR tricks, modular arithmetic.
  • Q6: Is there a greedy choice? → If local optimal = global optimal.
  • Q7: Can I divide and conquer? → Split, solve halves, merge.
Step 5 — Plan Before Coding
  • State your algorithm in 3-5 sentences. Identify data structures. Trace through one example by hand.
Step 6 — Code with Discipline
  • Meaningful variable names. Guard clauses at top. No magic numbers. Comment non-obvious logic. One function = one job.
Step 7 — Verify + State Complexity
  • Trace through the sample input. Test edge cases (empty, single element, all same). State time AND space complexity with reasoning.

Pattern Recognition Cheat Sheet

Array & String Signals
  • "Max/min subarray of size k" → Fixed sliding window
  • "Longest substring with condition" → Variable sliding window
  • "Find pair summing to X" → HashMap (unsorted) or Two Pointer (sorted)
  • "Range sum queries" → Prefix sum
  • "Max contiguous subarray sum" → Kadane's
  • "Next greater element" → Monotonic stack
  • "Is anagram / permutation" → Frequency map comparison
  • "Palindromic substring" → Expand from center
  • "Search in sorted/rotated" → Modified binary search
Tree, Graph & DP Signals
  • "Level-by-level" or "shortest distance" → BFS (queue)
  • "Max depth / path sum / diameter" → DFS bottom-up
  • "Valid BST" → DFS with range constraints
  • "Shortest path unweighted" → BFS
  • "Connected components / cycle" → DFS with visited
  • "Topological order" → DFS post-order or Kahn's BFS
  • "Min/max ways to reach X" → 1D DP
  • "Take or skip each element" → DP take/skip recurrence
  • "LCS / edit distance" → 2D DP on two strings
  • "Interval scheduling / activity" → Greedy sort by end time

Aptitude — Complete Formula Arsenal

Quantitative Formulas
  • HCF/LCM: gcd(a,b) = gcd(b, a%b). LCM = a×b/HCF.
  • Percentages: x% of y = xy/100. % change = (new-old)/old × 100. Successive: multiply factors.
  • P&L: Profit% = Profit/CP × 100. SP = CP(1+P%/100).
  • SI & CI: SI = PNR/100. CI = P(1+R/100)ⁿ - P. CI-SI(2yr) = P(R/100)².
  • Speed: D = S×T. km/h→m/s: ×5/18 (m/s→km/h: ×18/5). Relative: same dir subtract, opp add. Avg speed = 2s₁s₂/(s₁+s₂).
  • Work: Rate = 1/days. Combined = 1/A + 1/B. Total time = AB/(A+B).
  • P&C: nPr = n!/(n-r)!. nCr = n!/(r!(n-r)!). Arrangement = P, Selection = C.
  • Probability: P = favourable/total. P(A∪B) = P(A)+P(B)-P(A∩B). P(A') = 1-P(A).
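A quick Python sanity check of two formula groups above (the HCF/LCM identity and the SI/CI two-year difference). The numbers are arbitrary examples, not from the source:

```python
from math import gcd

# HCF/LCM identity: LCM = a*b / HCF
a, b = 12, 18
hcf = gcd(a, b)
lcm = a * b // hcf
print(hcf, lcm)                      # 6 36

# SI vs CI on P=1000 at R=10% for N=2 years
P, R, N = 1000, 10, 2
si = P * N * R / 100                 # simple interest
ci = P * (1 + R / 100) ** N - P      # compound interest
print(si, round(ci, 2))              # 200.0 210.0
# CI - SI over 2 years = P*(R/100)^2 = 1000 * 0.01 = 10  ✓
```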
Logical Reasoning Methods
  • Syllogisms: Draw Venn diagrams. "All A are B" = A inside B. Check conclusion against ALL valid diagrams.
  • Seating: Draw template first. Start with most constrained clue. Use elimination.
  • Number Series: Check arithmetic → geometric → diff-of-diff → squares/cubes → interleaved.
  • Directions: Start North. Right = clockwise 90°. Track coordinates. Distance = Pythagorean.
  • Blood Relations: Always draw the family tree. One relationship per step. Track gender.
  • Coding-Decoding: Find the shift pattern. Verify with 2-3 examples before answering.

Big-O Cheat Sheet & Data Structure Operations

Complexity Classes
  • O(1): array[i], map.get(), push/pop. No loops.
  • O(log n): binary search, heap ops, balanced BST. Halving each step.
  • O(n): single loop, linear scan, prefix sum, HashMap scan.
  • O(n log n): sorting (merge/quick), sort+scan patterns.
  • O(n²): nested loops, brute-force pairs, bubble sort.
  • O(2ⁿ): all subsets, recursive fib without memo.
  • O(n!): all permutations, TSP brute force.
Data Structure Ops
  • Array: access O(1), search O(n), insert/del middle O(n), end O(1).
  • HashMap: get/put/del O(1) avg, O(n) worst. Space O(n).
  • Stack/Queue: push/pop/enqueue/dequeue O(1).
  • Heap: insert O(log n), extractMin O(log n), peek O(1), build O(n).
  • BST balanced: search/insert/delete O(log n). Unbalanced: O(n).
  • Graph BFS/DFS: O(V+E). Grid n×m: O(n×m).
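Python's `heapq` module illustrates the heap costs in the table above: heapify builds in O(n), push and pop cost O(log n), and peeking the min is O(1) because index 0 is always the smallest element of a min-heap:

```python
import heapq

nums = [5, 1, 8, 3, 2]
heapq.heapify(nums)            # build heap in O(n)
print(nums[0])                 # peek min in O(1) -> 1
heapq.heappush(nums, 0)        # insert in O(log n)
print(heapq.heappop(nums))     # extract-min in O(log n) -> 0
print(heapq.heappop(nums))     # -> 1
```

For a max-heap in Python, the usual trick is to push negated values and negate again on pop.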
Full 15-Day Overview
#  | Date       | DSA Topic                 | Aptitude               | SyncSpace
01 | Apr 16 Wed | Arrays + Big-O            | Number Systems         | 2 hrs
02 | Apr 17 Thu | Strings                   | Percentages + P&L      | 2 hrs
03 | Apr 18 Fri | Linked Lists              | Ratios + Mixtures      | 2 hrs
04 | Apr 19 Sat | Stacks + Queues           | TSD + Boats            | 2 hrs
05 | Apr 20 Sun | Hashing + Two Pointers    | Time & Work            | 2 hrs
06 | Apr 21 Mon | Recursion + Backtracking  | SI/CI + P&C            | 2 hrs
07 | Apr 22 Tue | Review + Mock Interview   | Full Quant Mock        | 2 hrs
Week 2
08 | Apr 23 Wed | Binary Trees              | Averages + Ages        | 2 hrs
09 | Apr 24 Thu | BST + Heaps               | LR — Number Series     | 2 hrs
10 | Apr 25 Fri | Graphs (BFS + DFS)        | LR — Coding + Blood    | 2 hrs
11 | Apr 26 Sat | DP Part 1 (1D)            | LR — Directions        | 2 hrs
12 | Apr 27 Sun | DP Part 2 (2D)            | Verbal — RC + Grammar  | 2 hrs
13 | Apr 28 Mon | Greedy + Sorting          | DI + Probability       | 2 hrs
Final Sprint
14 | Apr 29 Tue | Full Mock (2 hrs)         | 50-Q Full Mock         | Demo prep
15 | Apr 30 Wed | Rapid Revision            | 15-Q Warm-up           | Final polish
Resources + Practice Hubs

DSA Practice

  • LeetCode leetcode.com — daily problems, company tags, contests
  • NeetCode 150 neetcode.io — curated by pattern, with video explanations
  • HackerRank hackerrank.com/domains/algorithms — topic-wise practice
  • CodeChef codechef.com/practice — competitive, builds speed
  • GeeksforGeeks geeksforgeeks.org — explanations + company-specific problems

Aptitude

  • IndiaBix indiabix.com — best for placement patterns, use daily
  • PrepInsta prepinsta.com — company-specific mocks (TCS, Infosys, Wipro)
  • HackerEarth — assessment practice, AMCAT/eLitmus-style tests
  • Freshersworld — aptitude practice + company papers

Problem Solving Mindset

  • Read constraints first — determine target complexity before choosing approach
  • Brute force → optimize — always state the O(n²) naive before jumping to O(n)
  • Draw it out — linked lists, trees, graphs; never solve in your head
  • One-pass thinking — ask "can I do this in one loop?" before writing nested loops
  • Name complexity — state time + space after every solution, unprompted

Code Quality Rules

  • Meaningful variable names — maxSum not ms, leftPointer not l
  • Guard clauses at top — handle null/empty before main logic
  • One function = one job — extract helpers when a function does two things
  • No magic numbers — const ALPHABET_SIZE = 26, not just 26
  • Comment non-obvious logic — // key = complement, value = index

Logical Reasoning

  • Syllogisms → Venn diagrams — never solve in head, always draw
  • Seating → draw template first — most constrained person first
  • Number series → 5-pattern check — arithmetic, geometric, diff-of-diff, squares, interleaved
  • Directions → always start North — track coordinates, Pythagorean for distance
  • Blood relations → family tree — one relationship per step, track gender

Interview Communication

  • Think aloud always — silence = red flag; narrate your reasoning process
  • Clarify before coding — "Can input be negative? Duplicates allowed?"
  • State brute force first — shows understanding before optimization
  • Name complexity unprompted — "O(n log n) time, O(n) space because…"
  • Handle edge cases verbally — mention empty, single element, overflow
  • Never say "I don't know" — say "Let me think through this…" and apply the framework