Many Selves, One World: Ubuntu, Digital Identity, and the Algorithmic Age
Chapter 1
Digital Identity: Beyond the Password
Jason Miller
Alright, let's kick things off. I wanna start with a question—how many of “you” exist out there, right now? Not the you walking around, but the versions of yourself inside databases, shopping histories, government accounts, social media… all these little fragments making you up, whether you notice or not. I mean, if your phone's in your pocket, that you is running around in app logs, location check-ins, loyalty cards. That's digital identity—it's way more than passwords. It's this tangled web of email addresses, locations, search habits, payment patterns—stuff that, honestly, even you might forget about. But the systems don't. They build this “picture” of you, silently, by putting together lots of tiny clues.
Philippe Funk
Yeah, and—and it's really invisible until you try to do something unusual and suddenly you get flagged. I remember, I think it was maybe four or five years ago, uh, getting some very... oddly specific job recommendations on LinkedIn after I changed one setting. It felt like—like, there was something deeper going on under the hood. Algorithms “knew” more about me than just my job title or my skills; they were connecting dots that I hadn’t even thought of. And it made me wonder, okay, if LinkedIn is drawing conclusions, what are banks, insurers, government agencies doing with all the other data I leave behind?
Jason Miller
Yeah, it can be a little unsettling, right? Like, you opt into one thing and suddenly, you’re on a whole new path. And the wild part? Most of us don't see the code or the policies shaping those paths. Invisible, like you said.
Chapter 2
The Algorithmic Gatekeepers
Philippe Funk
So, let's talk about that code—because these days, algorithms are what stand at the gates. Who gets the apartment, what news pops up in your feed, whose loan gets approved. Every time you swipe or click, you’re—well, you’re basically being sorted by these hidden models. And these systems are only as neutral as the data and the people who built them. In the UK, I've seen bank fraud checks, supposedly “objective,” actually reject people based on address or spending patterns tied to race or class. That's not just numbers, that's bias quietly running the show. You don’t see it—until you cannot open an account.
Jason Miller
Exactly. I mean, on the surface, algorithms sound—well, kind of elegant. Neat code, impartial, right? But in reality, they’re wired with, like, all the opinions and shortcuts and, sometimes, just plain old mistakes of the humans who train them. And then, at scale, they can amplify those mistakes big time. The “computer says no” moment—nobody really tells you why. It's just a wall.
Philippe Funk
There’s a phrase I use with my students: algorithms don’t just mirror society—they ossify it. If there’s a legacy bias in your system, the code can lock it in, quietly, for years. Sometimes you only spot it as a pattern, long after harm’s already been done.
Chapter 3
Is Code Neutral?
Jason Miller
So, let’s poke this idea—this myth—that code is somehow neutral, or that tech can stand outside our messy human values. Spoiler: it can't. I reported a piece in Beijing about social credit systems—the idea that tech can assign you an “objective” digital reputation. The government would insist it's just “math,” but the reality is, every decision is rooted in policies, culture, someone's values about what’s “good” or “bad” behavior. It’s not cold and fair—it’s full of choices, just invisible ones.
Philippe Funk
Yeah, I’ve seen this “policy via code” problem everywhere. The issue isn’t just, “Is the code fair?” but “Whose rules are embedded?” Users hit these walls, and there’s no real way to appeal or even understand what happened. It's what the British call “computer says no”—which sounds, you know, quaint, until a mortgage or a visa hangs in the balance.
Jason Miller
Right, and that lack of explainability creates its own kind of injustice. If you can’t see the logic, you can't challenge the outcome. You’re not treated as a participant, just a datapoint. That’s a problem.
Chapter 4
Ubuntu as a Lens for Technology
Philippe Funk
Now, if we step back and look at this through a completely different lens—Ubuntu—a lot of our assumptions about digital identity get flipped. Ubuntu is, at its core, “I am because we are.” It's about how who you are only makes sense in relation to the people, the community, the context around you. Contrast that with the Western approach—where identity’s all about the individual, pieced together by databases.
Jason Miller
Yeah, and that Ubuntu spirit actually shows up in open-source tech, even if people don’t call it that. Think Linux, Python, Apache—those big collaborative projects. Instead of one company or country owning the system, you get a community pushing it forward. The whole tech stack we rely on is, in a way, proof that the “we” beats the “me.”
Philippe Funk
There’s real power there. When you design tools for shared benefit, everyone—literally everyone—gets further. I’d argue, the biggest advances in tech haven’t come from isolated geniuses, but from collaborators who believed in something bigger than their own CVs or portfolios.
Jason Miller
Yeah, and open-source isn’t just about code-sharing. It’s an example of dignity by design, right? Everyone gets to see, challenge, and improve what’s there. That’s accountability. That’s trust.
Chapter 5
Living with Many Selves Online
Jason Miller
If we zoom back into real life, almost everyone listening has these “many selves” online. I’ve got a version of me for my bank, one for my work email, a wildly unflattering one for my old gaming forum, and, who knows how many are feeding AI models as we speak. These fragments are all out there, doing things I probably don’t even know about.
Philippe Funk
There's a good side and a bad side to that, right? On the one hand, splitting your digital identity brings more security. You don’t want your social logins tied to your financial ones, or your health info everywhere. On the other—systems keep carving you into little boxes, labeling, scoring, predicting. And if those boxes disagree… it gets messy fast. I had a student—a brilliant coder—who got flagged in one algorithm as “in need of financial support.” Meanwhile, the student loan board’s own scoring system denied her outright. She was the same person, but the code saw “two” of her—and both systems were wrong.
Jason Miller
Ugh, yeah, it really highlights the cost of being seen by code but not seen as a person. You end up trying to prove yourself to a process that doesn’t actually admit you could be more than one thing at once.
Chapter 6
Designing for Human Dignity
Philippe Funk
If we’re gonna build better digital systems, we need to ask new questions at the start. Not just “Is it efficient?” but “Does it respect people as they are?” “Does it let them see and control what’s happening?” That’s where the Ubuntu idea feels radical—you judge the tech by whether it upholds dignity, transparency, inclusion. We actually see good practice here and there: community-driven ID programs, ethical open-source projects that put user agency first. They're not flashy, but they work by building trust, not just ticking compliance boxes.
Jason Miller
Yeah, it’s easy to focus on the shiny new tech, but the real value is asking whether people affected by the systems have a say. Are we designing for the people on the margins—or just the ones who already fit? If the answer is always “the latter,” we’re just automating exclusion at scale.
Philippe Funk
Absolutely. If digital identity systems reinforce old patterns of privilege, they fail their biggest test. Ubuntu would ask: “Does everyone flourish together—or is it just convenient for some and costly for others?”
Chapter 7
Three Questions for the Day
Jason Miller
So, before we wrap, we wanted to leave you with a few things to chew on. First: Who really owns or controls the digital identities you depend on every day? And if it’s not you—do you know what’s being done in your name?
Philippe Funk
Second: In all your interactions online, do you feel like a participant in a relationship, or more like—uh—a line of code? Are you treated as a full person, or just the sum of your clicks and logins?
Jason Miller
And last: What would it look like to push for digital systems—and even laws—that protect not just your privacy, but the dignity of every human the system touches? I mean, if Ubuntu’s “I am because we are” is true, maybe the goal is to build tech that reflects that, not just on posters but in actual code.
Philippe Funk
We hope you’ll talk about this, argue, bring it up with your friends and colleagues—these aren’t just technical debates. They’re about policy, culture, ethics… you know, all the stuff we can’t leave to algorithms.
Jason Miller
That’s all for today. Philippe, always good talking with you.
Philippe Funk
Likewise, Jason. And thanks to everyone tuning in. We’ll see you next time for another episode of The Ubuntu Podcast Series.
Jason Miller
Take care, everyone. Stay human—and stay curious.
