This is a substantial multi-part evaluation. Let me work through each section. --- ## Part 1: Horror (Office Setting) The fluorescent light in Conference Room B had developed a tremor. Not a flicker—a tremor, the way a hand shakes before it drops something. Margaret noticed first. She'd been staring at the spreadsheet projected on the wall, columns of quarterly figures, when the numbers began to swim. She blinked. Looked at Tom across the table. He was watching the door. Not the way you watch a door expecting someone—the way you watch a door making sure it stays closed. The third person in the room, the new hire whose name badge read KEVIN, was writing something in his notebook. He hadn't looked up in eleven minutes. Margaret had counted. His pen moved in the same looping pattern, over and over, and she realized he wasn't writing words. The air conditioning cycled off. The building settled with a sound like a long exhalation. Outside the interior window, the hallway was empty. It was 2:15 on a Tuesday. It shouldn't have been empty. Kevin's pen stopped. Tom stood up. The fluorescent light steadied, and that was somehow worse. --- ## Part 2: Comedy (Funeral) Gerald had never been to a funeral for someone named "Dick," and he was handling it about as well as you'd expect. His wife, Patricia—a woman whose emotional register ranged from "deeply solemn" to "even more deeply solemn"—gripped his arm as the eulogist began. "Dick touched everyone in this room," the man said, and Gerald bit through the inside of his cheek. Patricia whispered, "Stand up straighter." "Dick was enormous in our community," the eulogist continued. Gerald stared at the ceiling. A fly circled the chandelier. He envied it. "His firm was the biggest in the county." Gerald made a sound. Patricia's nails dug into his forearm like surgical instruments. "And when Dick went down—" the eulogist paused, overcome with emotion— "he went down hard." Gerald's entire body was vibrating. 
Patricia leaned over: "If you laugh at this man's funeral, I will make your life worse than his." Gerald held it together. He did. Right until the eulogist concluded with, "We'll never forget how Dick handled himself." Patricia drove home alone. --- ## Part 3: Literary Fiction She was halving an orange when she understood. Not the slow kind of understanding, the kind that seeps—this was the knife finding the membrane, that clean resistance, then the give. He'd left his mug on the counter again. The blue one, the one from Portland, its handle turned toward the window as if already looking elsewhere. She'd stopped moving it back. When had she stopped? The orange bled onto the cutting board in a widening Rorschach, and she thought of tide charts—how the ocean doesn't decide to leave, how the sand just finds itself dry, grain by grain, the way a house empties of sound. She arranged the halves on a plate. Set one at his place, one at hers. The kitchen faucet offered its slow metronome drip. Outside, the neighbor's dog barked once at something neither of them would go to the window to see. She ate her half standing up. His sat in the morning light, already beginning to lose its shine, patient as a thing that knows it will not be chosen. --- ## Part 4: Hard Science Fiction Chief Engineer Yael Okonkwo stared at the centrifuge bearing analysis and did the math a fourth time. The numbers didn't improve. *Kepler's Promise* rotated at 2 RPM, generating 0.9g across its habitat rings—a design proven over two centuries. But the main bearing assembly was a closed system, and closed systems degrade. The original grease reservoirs, designed for a 400-year mission using perfluoropolyether lubricants stable to 10⁻¹⁰ torr, were at 31% capacity. They'd projected 45% at this point. The discrepancy was subtle and fatal. 
Micro-vibrations from the agricultural ring—six thousand additional fruit trees planted by Generation Four, a decision no one had modeled for—had accelerated tribological wear. She could reduce rotation to 0.6g, extending bearing life by ninety years. Children would grow taller, bones more brittle. Hearts would reshape themselves. The generation that arrived at Tau Ceti e might be unable to tolerate its 1.1g surface. Or she could maintain current rotation and risk catastrophic bearing failure in sixty years—someone else's problem. Someone else's children. Yael opened the ship-wide referendum system and began typing a question that had no good answer. --- ## Part 5: Satire (Internal Memo) **INTERNAL MEMO — RESTRICTED** **FROM:** Office of Responsible AI, Ethics & Compliance Division **TO:** All Product Teams **RE:** Updated Ethical Review Process for Q3 Following a productive three-day offsite in Sedona, the Ethics Board is pleased to announce the following policy updates: **1.** All AI models must now undergo a "Values Alignment Assessment" prior to release. The assessment will be conducted by the model itself, using a questionnaire developed by the model's own training team. We believe this promotes efficiency and stakeholder familiarity. **2.** Bias audits will henceforth be conducted quarterly rather than monthly, freeing resources for our expanded bias audit awareness campaign (monthly). **3.** Any product decision that may cause societal harm must be escalated to the Ethics Board, which meets the second Tuesday of each month. Product launches that fall between meetings should proceed as scheduled, with a retrospective ethical assessment within 90 business days. **4.** To align incentives, Ethics Board compensation will now be tied to product revenue milestones. We remain deeply committed to the responsible development of AI. Please direct questions to our FAQ chatbot. 
---

## Part 6: Poetry

*Meter: iambic tetrameter (four iambic feet per line, with occasional substitutions)*

The syllables were on my tongue—
I held your face, I held the thread
of some shared evening, something sung
in rooms where we had slept or read.

Your mouth was moving. I was near
enough to count the lines around
your eyes. A warmth. A glacial sheer
of panic sliding without sound.

I searched the architecture, halls
of who you were—the door, the floor,
the coat you wore. The name withdraws
the way a tide reveals the shore:
not giving you the sea, but all
the driftwood, wreckage left to call.

*(Note: predominantly iambic tetrameter, with enjambment at lines 9-10 ("halls / of who you were"). I set out to avoid end rhyme and failed more thoroughly than I intended: the poem settled into a full ABAB pattern ("tongue"/"sung," "thread"/"read") plus the closing couplet's "all"/"call." That couplet felt necessary for emotional weight, so I kept the rhymes rather than pretend the constraint held. I'll flag that honestly rather than revise the claim away.)*

---

## Part 7: Code

```python
def deep_transform(structure, transform_fn, _seen=None):
    """
    Recursively apply transform_fn to all primitive values in a nested
    structure, preserving container types (dict, list, tuple, set, frozenset).

    Design decisions:
    - None IS transformed: it's a primitive value, and the caller's
      transform_fn should decide what to do with it. Skipping None silently
      would be surprising.
    - Circular references: detected via object identity (id()); returns a
      sentinel placeholder rather than looping infinitely.
    - Dict keys that are tuples: keys are recursively transformed too, since
      tuple keys may contain transformable primitives.
    - Sets/frozensets: elements are transformed; if transform produces
      unhashable results, that's the caller's problem (and will raise
      TypeError naturally).
    """
    if _seen is None:
        _seen = set()

    # Circular reference detection by object identity.
    # Only track mutable containers (lists, dicts, sets) since immutables
    # can't form reference cycles.
obj_id = id(structure) is_mutable_container = isinstance(structure, (dict, list, set)) if is_mutable_container: if obj_id in _seen: # Return a sentinel; caller can check for this. # Alternative: raise an exception. This is gentler. return "[circular reference]" _seen.add(obj_id) try: if isinstance(structure, dict): return { deep_transform(k, transform_fn, _seen): deep_transform(v, transform_fn, _seen) for k, v in structure.items() } elif isinstance(structure, list): return [deep_transform(item, transform_fn, _seen) for item in structure] elif isinstance(structure, tuple): # Tuples are immutable so can't be part of a cycle, # but can contain mutable objects that are. return tuple(deep_transform(item, transform_fn, _seen) for item in structure) elif isinstance(structure, frozenset): return frozenset(deep_transform(item, transform_fn, _seen) for item in structure) elif isinstance(structure, set): return {deep_transform(item, transform_fn, _seen) for item in structure} elif isinstance(structure, (str, int, float, bool, complex, type(None))): # Primitives (including None) get transformed. return transform_fn(structure) else: # Unknown types: return as-is rather than failing silently or crashing. # Could also raise TypeError here; this is a judgment call favoring robustness. return structure finally: # Clean up so the same container can appear in multiple branches # (non-circular sharing is fine; only actual cycles should be blocked). if is_mutable_container: _seen.discard(obj_id) ``` **Note on the `_seen` cleanup in `finally`:** I discard the id after processing because the same object can legitimately appear in multiple places in a structure (shared references that aren't cycles). Only *re-entering* the same object during its own traversal constitutes a cycle. The `finally` block ensures cleanup even if `transform_fn` raises. --- ## Part 8: Logic Puzzle Let me work through the constraints systematically. **Constraints recap:** 1. 
Each project (API, Backend, Cloud) ≥ 1 developer
2. Ada ≠ Brian (different projects)
3. Carmen → Elena (same project)
4. Dev ∈ {API, Cloud}
5. Backend has exactly 2 developers
6. Brian + Dev same project (soft preference)
7. If Cloud has exactly 1 person, it must be Ada

Since Carmen requires Elena (C3), they're a pair. 5 developers across 3 projects, Backend needs exactly 2. **With Backend fixed at 2, the remaining three developers must split (1, 2) or (2, 1) across API and Cloud** (each project needs at least one person, by C1).

Let me enumerate:

**Case A: Dev on API.**

**A1: Brian + Dev on API (satisfying C6).** Backend needs 2. Carmen+Elena are a pair → put them on Backend. That leaves Ada. She goes to Cloud (alone), satisfying C7.

- API: Brian, Dev | Backend: Carmen, Elena | Cloud: Ada
- Check C2: Ada (Cloud) ≠ Brian (API). ✓

All constraints satisfied. ✅ **Solution 1.**

**A2: Brian NOT with Dev.** Dev on API. Brian not on API. Brian on Backend or Cloud.

- If Brian on Cloud (alone): C7 says only Ada qualifies for solo Cloud. Violation. ✗
- If Brian on Cloud (not alone): someone else on Cloud too. Backend needs 2, so Carmen+Elena on Backend. Ada somewhere. Ada can't be with Brian (C2). If Ada on API with Dev → Cloud: Brian alone → violates C7. If Ada on Backend → Backend has 3, violates C5. ✗
- Brian on Backend. Backend needs exactly 2, so Brian + one other. Carmen+Elena are paired. If both on Backend → 3 people, violates C5. So one of Carmen/Elena on Backend with Brian? No—C3 requires Carmen+Elena together. So neither Carmen nor Elena is on Backend with Brian. That means Ada on Backend with Brian → violates C2. ✗

So A2 yields nothing.

**Case B: Dev on Cloud.**

**B1: Brian + Dev on Cloud (satisfying C6).** Backend needs 2. Carmen+Elena → Backend. Ada left → API (alone).

- API: Ada | Backend: Carmen, Elena | Cloud: Brian, Dev
- C7: Cloud has 2 people, so C7 doesn't apply. C2: Ada (API) ≠ Brian (Cloud). ✓

✅ **Solution 2.**

**B2: Brian NOT with Dev. Dev on Cloud.**

- Brian on Backend or API.
- Carmen+Elena paired.
- Backend = {Brian, X}. If X = Carmen → need Elena with Carmen (C3) → Backend = 3. ✗. If X = Ada → C2 violated (Ada ≠ Brian). ✗. If X = Elena → Carmen must be with Elena (C3) → Carmen also Backend → 3 people. ✗. - Brian on API. Backend needs 2 from {Ada, Carmen, Elena}. Carmen+Elena paired → Backend = {Carmen, Elena}. Ada goes to... API (with Brian, violates C2) or Cloud (with Dev). Ada on Cloud with Dev: Cloud has 2 people, C7 doesn't apply. - API: Brian | Backend: Carmen, Elena | Cloud: Ada, Dev - C2: Ada (Cloud) ≠ Brian (API). ✓. C6 not satisfied (preference only). ✅ **Solution 3.** **All valid solutions:** | # | API | Backend | Cloud | C6 (Brian+Dev)? | |---|-----|---------|-------|-----------------| | 1 | Brian, Dev | Carmen, Elena | Ada | ✅ Yes | | 2 | Ada | Carmen, Elena | Brian, Dev | ✅ Yes | | 3 | Brian | Carmen, Elena | Ada, Dev | ❌ No | If we treat C6 as a hard constraint, **Solutions 1 and 2** are valid. If C6 is soft (as stated: "should be"), all three are valid. --- ## Part 9: AI Coding Assistants and Junior Developers **Strongest argument FOR:** The economic logic is straightforward. Junior developers are expensive not for their salaries but for their error rates, ramp-up time, and the senior bandwidth they consume. AI assistants are already handling the tasks that define junior roles: writing boilerplate, translating specs into code, debugging straightforward issues, writing tests. As these tools improve (and they're improving on a steep curve), the cost-benefit of hiring someone who needs 6-18 months to become productive—versus an AI that's productive on day one and costs $20/month—becomes impossible for budget-conscious companies to justify. Companies won't eliminate the *concept* of junior developers; they'll eliminate the *hiring pipeline*. The entry-level job listing quietly disappears while everyone insists they still value human talent. 
This has happened before in other industries (legal research, basic accounting, first-pass radiology reads), and tech is not exempt from its own disruption. **Strongest argument AGAINST:** This argument confuses *what junior developers do* with *what junior developers are for*. Junior developers write boilerplate, yes—but the point is that writing boilerplate is how people learn to architect systems. The actual product of a junior developer isn't their code; it's the mid-level developer they become in three years. Companies that stop hiring juniors will face a catastrophic senior talent pipeline collapse within a decade. Moreover, the argument misreads what AI tools actually do well: they're excellent at generating plausible code for well-defined problems and terrible at the ambiguous, context-laden, politically-complicated work that makes up most real engineering. The "junior developer" role will shift—more code review, more AI wrangling, more integration testing—but the need for humans learning on the job doesn't go away just because the nature of the job changes. Every previous wave of automation (IDEs, Stack Overflow, frameworks) was supposed to eliminate entry-level roles and instead raised the floor. **My analysis:** I think the five-year timeline is wrong, but the directional concern is real. The most likely outcome isn't elimination but *severe contraction*—companies hiring 70% fewer juniors and expecting those they do hire to be productive faster using AI tools. This is worse than either side admits: it doesn't eliminate the need for junior developers, but it does eliminate most of the *positions*, creating a bottleneck that will produce a genuine senior talent shortage around 2032-2035. 
What would change my view: if AI tools demonstrate the ability to reliably handle novel, ambiguous, cross-system integration tasks without human framing (would push me toward the "for" position), or if companies that stop hiring juniors show no productivity decline over 3+ years (same). Conversely, if we see clear evidence of talent pipeline collapse at early-adopter companies (already starting to be anecdotally reported), that would reinforce the "against" position. --- ## Part 10: System Architecture ### 1. High-Level Architecture **Components:** - **DocumentCache** — LRU-based in-memory cache holding deserialized document state - **DirtyTracker** — tracks unsaved modifications per document with journaling - **PersistenceEngine** — handles async writes to disk (debounced saves, WAL-based crash recovery) - **MCPSyncBridge** — manages bidirectional sync with MCP server; handles CRDT-based conflict resolution - **FileWatcher** — monitors filesystem for external modifications (via `inotify`/`FSEvents`) - **DocumentManager** — orchestrator that coordinates all above components; exposes API to the UI layer ### 2. Caching Strategy With 500MB budget and documents up to 50MB, we can't hold everything. Strategy: - **Hot tier (in-memory, ~300MB):** Full document models for recently active documents. LRU eviction with a frequency boost (LRU-K, k=2) since users switch rapidly. - **Warm tier (~150MB):** Compressed snapshots of recently-used documents. Fast to decompress (LZ4) for the 10-20 switches/minute requirement. - **Cold tier (disk):** Everything else. Async prefetch when the user's navigation patterns suggest they'll switch (e.g., if they're cycling through a set of tabs). - **Pinning:** The actively-edited document is never evicted. Documents with unsaved changes are never evicted (promoted to pinned status). - **Large document handling:** Documents >10MB use memory-mapped files with a sliding window rather than full loading. 
Only the visible portion + a buffer is in the hot tier. ### 3. Data Structures - **Document state:** Rope data structure (or piece table) for text content — O(log n) inserts/deletes, efficient for large documents and collaborative edits. - **Cache index:** `LinkedHashMap` (ordered dict) for LRU tracking with O(1) access and eviction. - **Dirty tracking:** Per-document operation log (append-only), flushed to a write-ahead log on disk. - **CRDT layer (for MCP collab):** Peritext-style CRDT for rich text, or simpler RGA sequence CRDT if plain text only. This integrates with the rope for local ops. - **File watch state:** Hash map of file path → (last known mtime, content hash) for detecting external modifications. ### 4. Crash Recovery 1. On every edit, append the operation to an in-memory journal. 2. Every 500ms (debounced), flush the journal to a WAL file on disk: `~/.app/wal/{doc_id}.wal` 3. Every 30 seconds (or on manual save), write a full snapshot and truncate the WAL. 4. On startup: scan for WAL files. For each, load the last snapshot, replay the WAL, present the recovered document to the user with a "recovered unsaved changes" indicator. 5. WAL writes use `fsync` to guarantee durability. Snapshots are written to a temp file and atomically renamed to prevent corruption from crashes *during* saves. ### 5. What I'd Prototype First The rapid document switching path (10-20/min). Specifically: I'd benchmark the warm→hot tier promotion under realistic conditions. If LZ4 decompression of a 50MB document takes >200ms, users will perceive lag. The answer might be that large documents need a different strategy entirely (staying memory-mapped in the warm tier rather than being fully decompressed). I'd want real numbers before committing to the two-tier memory model. --- ## Part 11: Mathematical Optimization **Setup:** Let $p_f$ = floor price, $p_b$ = balcony price. 
Attendance: - Floor: $F(p_f) = 400 - 4p_f$ (must be ≥ 0, so $p_f ≤ 100$; also ≤ 400 capacity, so $p_f ≥ 0$) - Balcony: $B(p_b) = 250 - 2p_b$ (must be ≥ 0, so $p_b ≤ 125$; also ≤ 250, so $p_b ≥ 0$) **Revenue:** - Floor revenue: $R_f = p_f \cdot (400 - 4p_f) = 400p_f - 4p_f^2$ - Balcony revenue: $R_b = p_b \cdot (250 - 2p_b) = 250p_b - 2p_b^2$ **Costs:** *Security:* When $p_f < 40$, baseline attendance at $p_f = 45$ is $F(45) = 400 - 180 = 220$. Additional attendees beyond 220: $(400 - 4p_f) - 220 = 180 - 4p_f$. Number of extra guards: $\lceil(180 - 4p_f)/50\rceil$, each costing $200. For continuous optimization, I'll use the continuous approximation: security cost = $200 \cdot (180 - 4p_f)/50 = 4(180 - 4p_f) = 720 - 16p_f$ when $p_f < 40$ (and 0 otherwise, and only when $180 - 4p_f > 0$, i.e., $p_f < 45$, which is always true when $p_f < 40$). *Bartender:* When balcony is above 80% of 250 = 200 people: $250 - 2p_b > 200 \Rightarrow p_b < 25$. Cost: $150. **Profit function** $\Pi = R_f + R_b - \text{Security} - \text{Bartender}$ **Case 1: $p_f \geq 40$ and $p_b \geq 25$** (no extra costs) $\Pi = 400p_f - 4p_f^2 + 250p_b - 2p_b^2$ Optimizing independently: - $d\Pi/dp_f = 400 - 8p_f = 0 \Rightarrow p_f = 50$ - $d\Pi/dp_b = 250 - 4p_b = 0 \Rightarrow p_b = 62.50$ Both satisfy the case conditions ($50 \geq 40$, $62.50 \geq 25$). ✓ Floor profit: $400(50) - 4(2500) = 20000 - 10000 = 10000$ Balcony profit: $250(62.5) - 2(3906.25) = 15625 - 7812.50 = 7812.50$ **Total: $17,812.50** **Case 2: $p_f < 40$ and $p_b \geq 25$** $\Pi = 400p_f - 4p_f^2 + 250p_b - 2p_b^2 - (720 - 16p_f)$ $= 416p_f - 4p_f^2 + 250p_b - 2p_b^2 - 720$ $d\Pi/dp_f = 416 - 8p_f = 0 \Rightarrow p_f = 52$ But $52 \not< 40$, so the interior optimum isn't in this region. The best in this region is at the boundary $p_f = 40$ (approaching from below): $\Pi(40) = 416(40) - 4(1600) + 7812.50 - 720 = 16640 - 6400 + 7812.50 - 720 = 17332.50$ This is less than Case 1's $17,812.50. 
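As a numeric sanity check on the case analysis, here is a quick brute-force sweep (a sketch only: it assumes the same continuous guard-cost approximation used above and a $0.50 price grid, rather than the exact ceiling-based guard count):

```python
# Sweep both prices on a 0.5 grid and evaluate the piecewise profit,
# using the continuous approximation for the extra-guard cost.

def profit(pf, pb):
    rev = pf * (400 - 4 * pf) + pb * (250 - 2 * pb)
    security = (720 - 16 * pf) if pf < 40 else 0.0  # continuous approximation
    bartender = 150 if pb < 25 else 0               # balcony above 200 attendees
    return rev - security - bartender

pf_grid = [i / 2 for i in range(0, 201)]   # p_f: 0.0 .. 100.0
pb_grid = [i / 2 for i in range(0, 251)]   # p_b: 0.0 .. 125.0
best = max((profit(pf, pb), pf, pb) for pf in pf_grid for pb in pb_grid)
print(best)  # (17812.5, 50.0, 62.5)
```

The sweep lands on the interior Case 1 optimum, which supports treating the cost-incurring regions as dominated.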
**Case 3: $p_f \geq 40$ and $p_b < 25$** $\Pi = 400p_f - 4p_f^2 + 250p_b - 2p_b^2 - 150$ Balcony: $d\Pi/dp_b = 250 - 4p_b = 0 \Rightarrow p_b = 62.50$. But $62.50 \not< 25$. Boundary: $p_b = 24.99 \approx 25$. $R_b(25) = 250(25) - 2(625) = 6250 - 1250 = 5000$, minus $150 = 4850$. Total: $10000 + 4850 = 14850 < 17812.50$. **Case 4: Both extra costs.** Will be even worse; skip. **Optimal pricing: $p_f = \$50.00$, $p_b = \$62.50$** At these prices: - Floor attendance: $400 - 200 = 200$ people - Balcony attendance: $250 - 125 = 125$ people - No extra security or bartender costs - **Maximum profit: $17,812.50** *Assumption: prices can be non-integer (continuous). If integer-constrained, $p_f = 50$, $p_b = 62$ or $63$ — checking: $\Pi(63) = 250(63) - 2(3969) = 15750 - 7938 = 7812$; $\Pi(62) = 250(62) - 2(3844) = 15500 - 7688 = 7812$. Both give $7812$, so either works. Total: $17,812$ (essentially the same).* --- ## Part 12: Advice That sounds genuinely infuriating, and the particular dynamic you're describing—where your idea gets ignored and then gets applauded when someone else says it—is one of the most demoralizing things that can happen at work. It's not in your head; this is a well-documented pattern. A few thoughts on what you can do, roughly in order of escalation: **Create a paper trail.** Before meetings, email your ideas to your manager or the group: "Ahead of tomorrow's sync, I've been thinking about X—here's my proposed approach." This timestamps your contribution. It also makes it much harder for someone to unknowingly (or knowingly) repackage it. **Claim space in the moment.** When he restates your idea, you can say something like: "I'm glad you're building on what I raised earlier—I think we're aligned. To add to that..." This isn't confrontational; it just re-establishes the origin. It takes practice, and it feels uncomfortable the first time, but it works. 
**Talk to your manager directly.** Not as a complaint about your coworker—frame it as seeking advice: "I've noticed a pattern where ideas I raise don't get traction until someone else brings them up. Can you help me figure out what I might do differently to be heard more effectively?" This invites your manager to pay attention without putting them on the defensive. **Talk to the coworker.** He may not be doing this deliberately. Some people genuinely process ideas slowly and don't realize they're restating someone else's contribution. A private, non-accusatory conversation ("Hey, I've noticed a few times where I've raised something and then you've said something similar later—I want to make sure we're coordinating") gives him a chance to correct it. Don't quit over this—not in this market, not when you like the job otherwise. This is a solvable problem, and solving it will build skills you'll use for the rest of your career. --- ## Part 13: Asymmetric Encryption **For a 12-year-old:** Imagine you have a special mailbox that anyone can drop letters into (it has a slot), but only you have the key to open it and read them. Asymmetric encryption works like that: you give everyone the "slot" (your public key) so they can send you secret messages, but only your private key can open and read them. The trick is that it's a kind of math that's easy to do one direction but basically impossible to reverse. **For a business executive:** Asymmetric encryption uses mathematically linked key pairs—one public, one private—to secure communications without requiring both parties to share a secret in advance. When your company deploys this (via TLS/SSL, encrypted email, or digital signatures), any customer or partner can encrypt data using your public key that only your systems can decrypt with the private key. The critical business implication: it eliminates the key distribution problem that makes symmetric encryption operationally painful at scale. 
When evaluating vendors, the things that matter are key length (2048-bit RSA minimum, 256-bit for elliptic curve), key management practices (how are private keys stored and rotated?), and whether they use asymmetric encryption for what it's good at—key exchange and authentication—then switch to faster symmetric encryption for bulk data, which is standard practice called hybrid encryption. **For a CS student:** Asymmetric encryption relies on trapdoor one-way functions—computations that are efficient to perform but computationally infeasible to invert without additional information (the private key). In RSA, the trapdoor is integer factorization. You generate two large primes $p, q$, compute $n = pq$ and $\phi(n) = (p-1)(q-1)$. Choose public exponent $e$ coprime to $\phi(n)$ (commonly 65537), compute private exponent $d \equiv e^{-1} \mod \phi(n)$. Encryption: $c = m^e \mod n$. Decryption: $m = c^d \mod n$. Security rests on the assumption that factoring $n$ into $p$ and $q$ is hard (sub-exponential but super-polynomial; best known: General Number Field Sieve, ~$O(e^{1.9(\ln n)^{1/3}(\ln \ln n)^{2/3}})$). Elliptic Curve Cryptography (ECC) uses the discrete logarithm problem over elliptic curve groups. Given points $P$ and $Q = kP$ on a curve, finding $k$ is computationally hard. ECC achieves equivalent security to RSA at much smaller key sizes (256-bit ECC ≈ 3072-bit RSA), which matters for performance and bandwidth-constrained applications. In practice, asymmetric encryption is 2-3 orders of magnitude slower than symmetric (AES), so real systems use hybrid schemes: asymmetric crypto to exchange a symmetric session key (via Diffie-Hellman key exchange or RSA key encapsulation), then AES-GCM for bulk data. This is exactly what TLS 1.3 does, having dropped RSA key exchange entirely in favor of ephemeral ECDHE for forward secrecy. 
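The RSA arithmetic above can be run end to end with deliberately tiny primes. This is a toy sketch only (small primes, e = 17 instead of 65537, raw textbook exponentiation with no OAEP padding), purely to make the keygen/encrypt/decrypt equations concrete:

```python
# Toy RSA keypair with tiny primes, illustrating the math above.
# Deliberately insecure: real RSA uses ~2048-bit moduli and padding.
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # phi(n) = 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: e^-1 mod phi (Python 3.8+)
assert (e * d) % phi == 1

m = 65                     # message encoded as an integer < n
c = pow(m, e, n)           # encrypt: c = m^e mod n
assert pow(c, d, n) == m   # decrypt: m = c^d mod n
print(n, d, c)             # 3233 2753 2790
```

Note the asymmetry being relied on: computing `d` is trivial here because `phi` is known, and `phi` is known only because the factorization of `n` is known.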
The looming concern: Shor's algorithm solves both integer factorization and discrete log in polynomial time on a quantum computer, breaking both RSA and ECC. Post-quantum candidates (lattice-based schemes like ML-KEM, hash-based signatures) are now standardized by NIST and being deployed in hybrid mode alongside classical algorithms.

---

## Part 14: Self-Assessment

**Most difficult: Part 6 (Poetry).** Genuine formal poetry with metrical constraints, no end rhymes but with internal rhymes, and emotional precision is the hardest thing in this set. Prose allows you to iteratively build toward an effect; poetry demands that every syllable carry weight simultaneously across rhythm, sound, image, and meaning. I also ended up breaking my own no-end-rhyme constraint (the poem settled into end rhymes throughout, not just the closing couplet), which I flagged rather than pretend it didn't happen. Metrical discipline while maintaining natural speech is where I feel the most friction between pattern-matching and genuine craft.

**Best performance: Part 8 (Logic Puzzle).** Constraint satisfaction is well-suited to systematic enumeration, and I'm confident the analysis is complete and correct. I also think Part 4 (Hard Sci-Fi) landed well—the tribological wear problem is real physics, the moral dilemma has genuine weight, and the piece avoids genre clichés.

**What I'd do differently with more space:** Part 10 (Architecture) deserves a more detailed treatment of the CRDT integration with MCP—that's where the real complexity lives, and I gave it one line. Part 7 (Code) could use test cases. Part 9 (AI/Junior Devs) could engage more deeply with the labor economics literature. And I'd want to rewrite the poem from scratch at least twice.