You know those moments when your brain short-circuits because two completely unrelated things suddenly make perfect sense together? Like realizing your ex’s commitment issues perfectly explain distributed computing? No? Just me then.
Well, I just had one of those moments yesterday while reading about AI’s Mixture of Experts (MoE) architecture. Turns out Dinner Club, my pandemic dating experiment, wasn’t just a desperate attempt to keep love alive during lockdown—it was accidentally pioneering AI design.
Who knew playing digital cupid would one day help me understand machine learning? Though, to be fair, both involve trial, error, and the occasional catastrophic failure.
The Accidental AI Architect
Picture this: It’s 2020. The world is in lockdown, dating apps are flooded with bored people who “want to see where things go” (nowhere, dude, the answer is nowhere), and I decide to launch Dinner Club—essentially speed dating for the apocalypse, minus the awkward silences, plus some actual human intelligence.
Unlike dating apps, where everyone swipes incessantly into a void, I set strict limits: three potential matches max, mandatory feedback forms (yes, homework), and a social credit system that rewarded kindness. Because apparently, adults need points to remember basic manners.
It’s basically how cutting-edge AI architectures work. I accidentally built a human version of a Mixture of Experts system, complete with routing algorithms (me, playing matchmaker) and specialised experts (highly illiquid people in the dating market who excel at specific types of connections).
The Anti-Tinder Manifesto
Traditional dating apps are like those massive language models everyone’s obsessed with—burning through resources like a tech bro burning through his Series A funding.
Every profile could potentially match with every other profile, which is about as efficient as your aunt’s attempts to set you up with “every nice boy from the community.” (Spoiler: They weren’t all nice, and some weren’t even from the same community.)
Dinner Club took a different approach. Like an MoE system’s router, I acted as the gatekeeper, but with more wine and less optimism. I directed each person to a limited number of matches based on both obvious and subtle compatibility patterns. Sometimes these patterns were unconventional—“both similarly weird” turned out to be a surprisingly successful criterion, though I suspect traditional matchmakers would cringe at this.
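For the fellow nerds: here’s what that router looks like if you sketch it in code. To be clear, this is a toy I wrote for this post, not DeepSeek’s gating layer and not my actual matchmaking spreadsheet; the trait vectors, the cosine-similarity scoring, and MAX_MATCHES are all made-up stand-ins.

```python
import numpy as np

MAX_MATCHES = 3  # the Dinner Club rule: three potential matches, no endless swiping

def route(seeker: np.ndarray, candidates: np.ndarray) -> list[tuple[int, float]]:
    """Score one seeker against every candidate, then keep only the top few.

    Same shape as an MoE router: compute a score per "expert",
    activate just the k best, and normalise their weights.
    """
    # Cosine similarity as a stand-in for "both similarly weird" compatibility.
    scores = candidates @ seeker
    scores = scores / (np.linalg.norm(candidates, axis=1) * np.linalg.norm(seeker) + 1e-9)

    top_k = np.argsort(scores)[::-1][:MAX_MATCHES]   # sparse: only 3 of N get a dinner
    weights = np.exp(scores[top_k])
    weights = weights / weights.sum()                # softmax over the chosen few
    return list(zip(top_k.tolist(), weights.tolist()))

# Toy usage: one seeker, five candidates, three made-up traits each
# (say: weirdness, spontaneity, dog enthusiasm).
seeker = np.array([0.9, 0.1, 0.7])
candidates = np.random.rand(5, 3)
print(route(seeker, candidates))  # e.g. [(2, 0.35), (0, 0.34), (4, 0.31)]
```

The dating-app approach is the dense version of this: score everyone against everyone and hope the compute bill (and everyone’s attention span) holds out. The top-k line is the whole trick; most of the network never has to wake up.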
When Feedback Forms Met Feelings
Those mandatory feedback forms weren’t just me enjoying some bureaucratic torture like waiters do in restaurants. Each date generated so much data: quantitative ratings on niceness and compatibility, plus qualitative feedback to refine future matches. It was basically A/B testing for hearts.
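If that sounds suspiciously like a training loop, it is, roughly. Here’s a minimal sketch of the idea; the names (FeedbackForm, Matchmaker, the 0.3 learning rate) are invented for illustration, not lifted from my actual notes. The only point is that each date’s ratings nudge how strongly a pairing pattern gets scored next time.

```python
from dataclasses import dataclass, field

@dataclass
class FeedbackForm:
    """One date's homework: two ratings plus free-text notes."""
    pair: tuple[str, str]
    niceness: float        # 0-10
    compatibility: float   # 0-10
    notes: str = ""

@dataclass
class Matchmaker:
    # Running score per pairing pattern, e.g. ("weird", "weird") -> affinity
    affinities: dict[tuple[str, str], float] = field(default_factory=dict)
    lr: float = 0.3        # how far one date is allowed to move the needle

    def update(self, pattern: tuple[str, str], form: FeedbackForm) -> None:
        """Blend this date's signal into the running score for the pattern."""
        signal = (form.niceness + form.compatibility) / 20.0  # squash to 0-1
        old = self.affinities.get(pattern, 0.5)               # start agnostic
        self.affinities[pattern] = (1 - self.lr) * old + self.lr * signal

mm = Matchmaker()
mm.update(("weird", "weird"),
          FeedbackForm(("A", "B"), niceness=9, compatibility=8, notes="second date booked"))
print(mm.affinities)  # {('weird', 'weird'): 0.605}
```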
But here’s where it gets interesting—and where AI developers might want to pay attention. Unlike current AI systems that optimise for whatever metrics make their VC overlords happy, my approach included something more nuanced: guided preference evolution.
Take the woman who insisted on dating men who “loved Eckhart Tolle and lived in the present”. After I matched her with exactly that—a wanderer who travelled the world with a satchel and no savings—her tune changed fast. Suddenly, “future-oriented” didn’t sound so bad. Funny how that works.
The Art of Being Wrong (Gracefully)
When participants clung to rigid preferences (looking at you, “must be a CEO of a funded startup” person), I didn’t just shrug and move on. Instead, I developed a three-tier approach, courtesy of my inner therapist:
Self-discovery exercises (because sometimes people need to realize they’re wrong on their own)
Pattern-based insights (12 years of matchmaking teaches you that “must love dogs” is rarely the real deal-breaker)
Experiential learning (sometimes you have to let people date the wrong person to appreciate the right one)
This is where AI systems could actually level up. Imagine an AI that doesn’t just nod along like a sycophant but subtly nudges users to expand their horizons—like a trusted adviser. It’s the difference between “I understand your preference for emotionally unavailable partners” and “Have you considered therapy?”
The Accidental Genius of Social Credit
The social credit system started as a way to gamify good behaviour, but it revealed something deeper: when we reward the right behaviours, we don’t just get better dates—we build a better ecosystem.
It’s like training a puppy, if the puppy had an MBA and unresolved issues. The genius wasn’t in the points themselves but in how they rewired behaviour. Kindness became its own currency, which is probably the most capitalist approach to decency ever attempted.
The Plot Twist Nobody Saw Coming
So here I am—a matchmaker who spent years engineering human connections, now using that experience to understand machine learning. The irony isn’t lost on me. But maybe this is exactly what we need right now: more cross-pollination between human systems and artificial intelligence.
Maybe the key isn’t pure optimisation but designing systems that remember people aren’t puzzles to solve—they’re stories mid-sentence. Whether we’re matching hearts or routing data, the goal is the same: creating connections that unlock potential we didn’t know existed.
Though, to be fair, teaching AI about human compatibility might be more complex than my “both similarly pretentious” algorithm. Then again, maybe not.
P.S. Hey DeepSeek, if you’re reading this—you’re welcome for the free consulting. If you want more insights from your friendly neighborhood marriage broker turned accidental AI architect, let’s talk.
Next week: Why my failed relationships perfectly explain blockchain technology. (Kidding… maybe.)