David speaks with Sondre Rasch, Co-founder and CEO of SafetyWing, a global social safety net for remote workers and companies.
They talked about:
🌐 Choosing the right projects
💡 Independent thinking and developing taste
🤔 How ideas and individuals evolve over time
🚀 The traits of great founders
🎯 Audience capture
🌐 How algorithms influence beliefs and behaviour
This is the second part of a longer conversation. You can listen to the first part here:
Part 1: 🎙️ Building Enduring Digital Companies with Sondre Rasch (Episode 79)
🎙 Listen in your favourite podcast player
🎧 Listen on Spotify:
📹 Watch on Youtube:
👤 Connect with Sondre:
Twitter: @SRasch | https://twitter.com/SRasch
Website: SafetyWing | https://safetywing.com/
📄 Show notes:
0:00 | Intro
01:25 | Tech industry's business-project connectivity
05:03 | Independent thinking and taste development
09:38 | How ideas and individuals evolve over time
15:49 | Traits of great founders
18:24 | How audience capture shapes personal beliefs
22:10 | How online personalisation shapes individual perspectives
🗣 Mentioned in the show:
SafetyWing | https://safetywing.com/
Superside | https://www.superside.com/
Plumia | https://plumia.org/
Danielle Strachman | https://theknowledge.io/daniellestrachman/
Teal Foundation | https://www.thetealfoundation.org/
Paul Graham | https://en.wikipedia.org/wiki/Paul_Graham_(programmer)
Effective Altruism | https://www.effectivealtruism.org/
Robin Hanson | https://theknowledge.io/robinhanson-1/
Balaji Srinivasan | https://en.wikipedia.org/wiki/Balaji_Srinivasan
Sam Altman | https://en.wikipedia.org/wiki/Sam_Altman
Ned Flanders | https://en.wikipedia.org/wiki/Ned_Flanders
The Simpsons | https://en.wikipedia.org/wiki/The_Simpsons
Jordan Peterson | https://en.wikipedia.org/wiki/Jordan_Peterson
Garry Tan | https://en.wikipedia.org/wiki/Garry_Tan
Jeff Bezos | https://en.wikipedia.org/wiki/Jeff_Bezos
Moment of Zen | https://twitter.com/MOZ_Podcast
Lauren Razavi | https://theknowledge.io/laurenrazavi-2/
Eliezer Yudkowsky | https://en.wikipedia.org/wiki/Eliezer_Yudkowsky
👨🏾💻 About David Elikwu:
David Elikwu FRSA is a serial entrepreneur, strategist, and writer. David is the founder of The Knowledge, a platform helping people think deeper and work smarter.
🐣 Twitter: @Delikwu / @itstheknowledge
🌐 Website: https://www.davidelikwu.com
📽️ Youtube: https://www.youtube.com/davidelikwu
📸 Instagram: https://www.instagram.com/delikwu/
🕺 TikTok: https://www.tiktok.com/@delikwu
🎙️ Podcast: http://plnk.to/theknowledge
📖 EBook: https://delikwu.gumroad.com/l/manual
My Online Course
🖥️ Career Hyperdrive: https://maven.com/theknowledge/career-hyperdrive
Career Hyperdrive is a live, cohort-based course that helps people find their competitive advantage, gain clarity around their goals and build a future-proof set of mental frameworks so they can live an extraordinary life doing work they love.
The Knowledge
📩 Newsletter: https://newsletter.theknowledge.io/
The Knowledge is a weekly newsletter for people who want to get more out of life. It's full of insights from psychology, philosophy, productivity, and business, all designed to help you think deeper and work smarter.
My Favorite Tools
🎞️ Descript: https://bit.ly/descript-de
📨 Convertkit: https://bit.ly/convertkit-de
🔰 NordVPN: https://bit.ly/nordvpn-de
💹 Nutmeg: http://bit.ly/nutmegde
🎧 Audible: https://bit.ly/audiblede
📜Full transcript:
[00:00:00] Sondre: There are other things, though. When I just look at the greatest founders, they're quite clear thinkers, clear communicators, user oriented. That's such a practical thing. But being good with product is a thing, an actual skill. And it's very hard to get good with product; people will work in product for years and still suck at it. Because what it really is, is taste and judgment.
[00:00:25] David Elikwu: This week I'm sharing part of my conversation with Sondre Rasch, the co-founder and CEO of SafetyWing, which is building a global safety net for remote workers and companies.
[00:00:35] David Elikwu: So in this part, you're going to hear Sondre and I talking about the evolution of businesses, and how to develop independent thinking and taste. We also talk about how ideas and individuals evolve over time, the traits of great startup founders, and the idea of Flanderization and audience capture. And then finally, we talk about how online platforms can shape our beliefs.
[00:00:58] David Elikwu: So this was a really interesting conversation. You can get the full show notes, the transcript, and read my newsletter at theknowledge.io. And you can find Sondre on Twitter, @SRasch, and on his website, SafetyWing, at safetywing.com.
[00:01:11] David Elikwu: Now, if you love this episode, please do share it with a friend and don't forget to leave a review wherever you listen to podcasts, because it helps us tremendously to find other listeners just like you.
[00:01:25] David Elikwu: I think one thing I've noticed that's also really interesting is the way almost all of your businesses grow out of each other, one business sprouts out of the next. So, you know, you're building Superside, and that kind of leads into building SafetyWing, which is solving part of the problem there. And then that also leads into Plumia, which is just a project of SafetyWing but could also very well be its own company, just, you know, building a digital country.
[00:01:50] David Elikwu: I'd love to get more into a lot of that. But before that, I actually wanted to go back to something that was triggered by something you were mentioning, just this idea of taste that you were talking about.
[00:01:59] David Elikwu: I'm really interested to get your perspective on this idea. So I was talking with Danielle Strachman about this, who is a VC. She helped to found the Thiel Foundation and runs the 1517 Fund. But this idea of prepackaged beliefs, actually, okay, prepackaged is something I also talked about with Eric, but, you know, when you think of taste, part of the reason that people lack taste now, or taste can be hard to find and can seem to be rare, is the fact that, when you think of, for example, furnishing: before, people would have to go and buy each thing, you buy a chair, you buy each of the items in your house at different points in your life. You don't do that now. Right now, you go to Ikea or you go to one of these stores, and the kitchen is already there, it's already laid out. The living room is already there. You just say, I want this kitchen, come and build that in my house, just bring me that entire setup, that's exactly what it's going to be. You see the living room, the chairs all come in a set. You buy the entire set and you put that in your house.
[00:02:59] David Elikwu: So you don't actually have to go through the process of picking out each individual thing, and this same exact mindset proliferates into our daily lives. Politics is exactly the same. You can look at someone that says they're a Democrat or a Republican and you know they're not individually going and buying each of these beliefs. You buy them as a package. You just go and say, hey, you know, which political party has the nicest package? And you just pick it up the same way you buy a kitchen. You say, okay, I'm a Republican now, I'm just gonna get all of these ideas, this is my belief system now, that's exactly what I'm gonna do.
[00:03:30] David Elikwu: And so I think you see this in loads of areas of people's lives. What I'm interested in is your take on how this interfaces with Silicon Valley, and with YC in particular as an idea. Because of the professionalization of what it means to build a startup, YC almost becomes like the university for building a startup, along with similar programs like it. And you're seeing it with project manager jobs too. I think someone had a really good tweet the other day: back in the day at a great startup, what did the product team look like? It's like three different roles. And now you've got the scrum master, then you've got the product owner, then you've got the project manager, and you've got, you know, 17 different roles that all kind of do one small segment of these jobs. And that can all be very useful, and there's nothing wrong with those jobs, but it's just this idea that as things become more standardized and professionalized, taste congregates in a sense. And the more you have programs like YC, or some of these programs where there are gatekeepers for taste and for what good looks like, people have to converge around certain ideas in order to fit through that gate. Because of that, you don't get the wildcard, someone over there in some far part of the world who just built something randomly that's completely separate. You know, they don't have to be in Silicon Valley, they're all the way over here doing something very different, and that gets to be its own thing.
[00:04:54] David Elikwu: And obviously maybe you are building the solution to part of that, but I'm interested to know what you think of that and how some of these things can affect taste or do they?
[00:05:03] Sondre: Yeah. Gosh, again, I love these questions. I think you are at the frontier of thinking, by the way, so that's really appreciated. Yeah, so there are several parts to grab onto there. I mean, just to start with the kind of plain observation.
[00:05:19] Sondre: So I think when you have the pre-packaged ideas, a hypothesis of what's going on is that people are not actually developing their taste. Like, you're buying the package to save the time it takes to develop the taste. It's time saving, energy saving, and it makes sense, someone else has done the work.
[00:05:35] Sondre: I don't have to figure it out. I'll just opt into the generalized sort-by-most-popular. It's probably gonna be good, and it is probably gonna be good, until it isn't. But you're not developing your taste, because you're not trying to have a hypothesis of why is this good?
[00:05:50] Sondre: So I think that the underlying lack, when people don't do that, is just independent thinking. They're not exercising it, and then it becomes undeveloped, and then they almost become more and more dependent, because they haven't honed that muscle or developed that ability as much. And then you find this pseudo-taste, which is really more a discussion about the state of what's popular, which is such an uninteresting conversation compared to a discussion about what's good within any particular domain, like furniture or policy.
[00:06:20] Sondre: So that would roughly be my hypothesis about what's going on there. And then, how does this interact with Silicon Valley and Y Combinator?
[00:06:27] Sondre: Well, one little piece, which I'm hesitant to mention because of how it sounds, but anyway, I'm gonna use this example. So the founder of Y Combinator is Paul Graham. I think he is both pretty good at being an independent thinker and pretty good at recognizing independent thinking. So, I have an abysmal social presence, I'm not really active on anything, but I have a Twitter account, and at one point he asked, who are people who have high quality posts but no followers? And I was like, oh, I know that, and I posted and tagged four accounts. And since that day, which is years ago, he's been following me, and he also followed those accounts and others. And I think that exact thing he's asking there is the essence of what it is. Low follower count means not popular, right? You don't have the kind of social selection mechanism. High quality means you're able to tell what's good. And the only way you could stumble into that is because that's what you were engaged in to begin with. You weren't trying to figure out what's popular. You were looking at what people were saying and telling, you know, is this good or not?
[00:07:32] Sondre: So Silicon Valley, I do think, is stronger in this sense. A cynic here will say, well, there are still prepackaged beliefs and meme trends and opinion trends. And there are, right? Just because it's better than an average place doesn't mean you don't have it here. There are still these mimetic movements here. Right now, as you know, if you saw the OpenAI debacle, we have two, and I'm very familiar with both of these little groups. The very recent, almost joke-like phenomenon of the accelerationists, which I think nonetheless has a big idea history in Silicon Valley. And then of course you have EA, Effective Altruism, which I've been associated with for years. And within these trends, you do see the same thing emerging. So even though in the beginning both of these had a very high share of independent thinkers, when something goes mainstream, you sort of can't help it, right? You get people who just want to coast, you get people who don't contribute. And the only way to contribute is to think for yourself. And then you can get this phenomenon where these movements start to behave in a way where people seem to be copying too intensely, and then they start to become a bit ugly.
[00:08:40] Sondre: But I don't think there's anything about the essence of the ideas. I think it's just a phenomenon of something going mainstream, that just tends to happen. So, yeah, I guess the last point is just, there are examples in Silicon Valley of real independent thinking. I think it's great, I'm drawn to it. Like Paul Graham, and I would say Robin Hanson, who we talked about earlier. Even when I was very young, the first time I saw it I was like, this is that phenomenon: you can't predict what they're gonna say. Such a strong indicator of independent thought. But there are lots of other people here in Silicon Valley where I kind of had that same feeling, Balaji, Sam Altman, but also just random people I would meet, a friend of mine, Radu, a friend of mine, Flo. Same thing, I've talked to them, I can't guess what they're gonna say, even on hot button issues. They arrived at them through some other mechanism, you know, I guess not necessarily a genius mechanism.
[00:09:28] Sondre: But at least they're kind of developing their character. So I strive for that. I think it's good. I think it's especially good in business, but I think it's also good just in general. What do you think?
[00:09:38] David Elikwu: Yeah. No, I agree. I think that's a really, really good point actually. I need to spend some more time thinking about that. But this idea of, I mean, that's partly where contrarian thinking comes from. This is something that Balaji talks about as well. And yeah, the idea of arriving at things independently.
[00:09:52] David Elikwu: And I think you mentioned, you know, with Paul Graham, a good trait of his, one of the things that makes him great, is that he's willing to look outside of the box. And I see this from following him on Twitter: he's very often willing to be misunderstood, and he's willing to go against the grain and say things that a lot of people may not immediately agree with. Some of them may come around later, some of them might not, but he will just plant a flag in the sand. Or sometimes he's not planting a flag in the sand, but he's just willing to explore ideas, and he will just post an image of a chart and say, you know, what do people think about this? It's not that he's already gone and formed a super strong opinion one way or the other, but he's just willing to explore some of these ideas that might be taboo, some of these ideas that you get in trouble for.
[00:11:22] David Elikwu: So another thing that was really interesting, also triggered by something that you said, and it was a dot I hadn't necessarily connected before, but it does relate very strongly, is you were talking about, you know, the e/acc movement, is it effective accelerationism? I know it's effective altruism and, you know, the accelerationist movement.
[00:11:40] Sondre: Yeah. I think it's a play on words, I mean.
[00:11:43] David Elikwu: But I think what is really funny, just like you mentioned about seeing the trend of how these two camps have played out, is something that I've referenced before as Flanderization. And I don't know, did you ever watch The Simpsons?
[00:11:56] Sondre: Some episodes, but I haven't like watched all the seasons.
[00:11:59] David Elikwu: Okay, you might be familiar with Ned Flanders, who was like this,
[00:12:02] Sondre: Yeah. Yeah. Yeah.
[00:12:03] David Elikwu: Ned Flanders is one of the characters on The Simpsons. And what is so interesting, if you go back and watch that show from the beginning, is that Ned Flanders starts off as just a very mild-mannered Christian guy. He just has some mild Christian values and that's about it. But what happens, as soon as you show him on screen and as soon as the people in the audience start interacting with the idea of this character, is you slowly see him morph as the show goes on and become more and more of a caricature of the most polarizing versions of the ideas that he originally had. So he originally may have had some of these Christian values, he's this mild-mannered neighbor, but more and more, as the audience that's watching interacts with this character, the feedback loop begins, and suddenly he becomes more and more of a caricature.
[00:12:48] David Elikwu: And I think you see this happen with a lot of people. It's this idea of audience capture. You know, as people want to become more famous, whether as an individual or as an idea, in order to become more famous, at least this is my view, you kind of have to abstract the idea. It has to be visible from a distance, and abstracting it makes it easily digestible. If it's too complex, if it's too nuanced, it's not going to spread, because people actually have to think about it, and that's not how ideas spread. Ideas spread when they are so simple that they are instantly recognized, instantly absorbed. People just see it and they get it, and that's kind of what ends up happening with Ned Flanders. You see it happening maybe with people like Jordan Peterson, where if you go back and watch some of his original videos, they're very nuanced, he has loads of complex views on lots of different things, but now maybe it's one kind of view on a lot of different things.
[00:13:37] David Elikwu: And I think you see the exact same thing with the battle between the EA movement and the accelerationist movement, where not that long ago a lot of the people in those movements were nuanced thinkers, in my mind at least. You know, these were people that had loads of different beliefs about lots of different things, and each one of them might think slightly differently. And maybe that's still true, but at least on Twitter, from what I can see now, because it's been highlighted more, it's becoming more and more abstract. Because as you go into your tribes, you have to figure out what the tribes stand for. And the analogy that I use is old video games: you have these 8-bit characters and you can easily recognize them. I could show you a 4K picture of Mickey Mouse, or I could just show you eight squares that are colored in a particular way, and you instantly recognize that that's Mickey Mouse. And that is kind of what happens to a lot of ideas: in order to be recognized from a distance, it has to become those eight squares instead of this high fidelity thing where there's nuance, there's these curves here and these shapes here. It's just, okay, that's what it is. And so, you know, you see people going into their corners, going into these tribes.
[00:14:41] David Elikwu: Okay, you mentioned this with Eliezer as an example. Potentially, I haven't followed his work for a very long time, but it's another example of where, potentially, in the past he may have written some articles that were pro-AI in some respects. But as you become more well known, there's a feedback loop where people will only react to certain ideas that you have, and so you have to reinforce certain ideas that you have, and then suddenly you are all the way on one side, in this camp over here: this is the belief system that I have, and this is the only thing that I really talk about, in this particular way.
[00:15:13] David Elikwu: So connecting these two things, first of all, I'd love to know if you have any thoughts about that, but then also how that connects to what we talked about with people like Paul Graham. You mentioned Balaji, you mentioned some of these other people, where they're able to look for things where other people aren't looking and have contrarian thinking. I'm really interested to know, as someone that's been through YC twice, and that is supposed to be a trait that they look for, what have you seen as commonalities? You know, what makes really great founders? Is there anything else, apart from that example of a trait, that you look for in really great founders or really great builders or really great VCs?
[00:15:49] Sondre: Yeah, there's just a lot in that question. Let me say up front that I don't have an answer that's coming to me like, yes, I have the answer, I've thought about this.
[00:15:58] Sondre: So I'll just think about that for a second. I can't say anything new here. I believe the sort of common curriculum here, which is that, you know, relentlessly resourceful, independent thinking, possibly determination, are really good predictors, plus speed, bravery, but those are all traits that play into relentlessly resourceful.
[00:16:20] Sondre: But yeah, the independent thinking one is probably the hardest one, right? So as you've seen startups over the last decade become very mainstream, you see a ton of people who under normal circumstances would never have been founders kind of get drawn into it.
[00:16:34] Sondre: I talk to a lot of Norwegian founders, because I'm a Norwegian founder in San Francisco, so when they come to San Francisco they're like, might as well reach out, and I love to talk to them. And I've seen this pattern quite a bit. It is really a stressful thing, because in order to make a product that is new, you know, you have to think independently. I think it's very hard to arrive at that through the sort of copying mentality.
[00:16:59] Sondre: So if you have someone who hasn't really developed their kind of independence, and they're just going by that, one, they tend to go down the routes of fashion for the companies they're making, but they also tend to zigzag. They're the kind of people who will say, I've heard too much different advice. This is a concern, it's a tell, right? And I'm like, so what role are you playing? Are you just going with whoever said the last thing? And the answer is often, yeah, that is kind of the approach.
[00:17:29] Sondre: And then the problem is having too much advice. But see, the underlying problem is that when you're a founder, you've gotta think for yourself. So you have to listen and learn, but you can't just react.
[00:17:39] Sondre: A necessary trait is independent thinking, I think: both to have your own evolving hypothesis about what to do next as a company and, you know, in your role, but also to be able to choose a product that is new and enduring.
[00:17:52] Sondre: There are other things, though. When I just look at the greatest founders, they're quite clear thinkers, clear communicators, user oriented. That's such a practical thing. But being good with product is a thing, an actual skill. And it's very hard to get good with product; people will work in product for years and still suck at it. Because what it really is, is taste and judgment. Back to that thing we said earlier, it's taste in product and judgment. What will work? That's the real skill.
[00:18:19] Sondre: So that's one I think is maybe a little bit underrated and not mentioned as much.
[00:18:24] Sondre: So I've also heard this expression. I haven't really thought about this either; I have no idea why I'm insisting on saying anything at all. But I've seen it, I see how the incentives work. I think it is, though, a little bit exaggerated, because it's format dependent. So for example, if you're on Twitter, then this EA versus accelerationist thing becomes a thing, but part of it is because people are just playing games, and whatever is the most synthesized, low-resolution idea, as you described, will rise to the top of the retweet chain. And of course they will indeed get more followers. So there is an incentive there, I definitely recognize that.
[00:19:04] Sondre: But I often find that that doesn't automatically mean that everyone in that movement thinks like that. This is also a bit of an illusion of something going mainstream.
[00:19:15] Sondre: So in effective altruism, in both of these movements, I find some of the most thoughtful people ever, the least tribal people ever. I would say EA is particularly thoughtful, and the whole community is incredibly reasonable. Like, going to, not LessWrong, but, well, you can go into LessWrong, it's the same, it's an overlapping community. And you'll see some of the most thoughtful, mild-mannered, kind, reasonable people you've ever seen. They're a bit brainy, right? They're very high on thinking versus feeling, so they have some real blind spots, and they're also a bit tilted towards thinking, not acting, but such a nice gang. I don't know why they went a little bit crazy with the AI thing. I mean, it's understandable, and maybe they were even right, who knows? I don't think so. But, you know, I agree more with Robin Hanson here.
[00:20:01] Sondre: And then you have the accelerationists. They're the opposite, but they're kind of similar, they're very optimistic. There's also this whole collection of meme threads here. It broadly translates to what used to be transhumanism, which has just completely disappeared as a movement. But like techno-optimism, David Deutsch, Max More, extropianism. I mean, this guy, the founder of the accelerationists who was just doxxed, called his company Extropic AI, which has two meanings. One is the opposite of entropy, but it is also actually the name of this sort of transhumanist movement, which kind of merges transhumanism with that sort of decentralized thinking from the early nineties. So I don't think that's an accident.
[00:20:45] Sondre: So a lot of these people are in the Bay Area meme sphere, and I think they're ready to shine again, because they were so beaten down. When I came here in 2015, they were really prominent. There were a lot of these people, like the whole Singularity University, that's in that meme space. And I just remember I came here in 2012, stayed in some house in Palo Alto, an Airbnb, looked through the bookshelf, and it was all these kinds of books, this techno-optimist thing.
[00:21:13] Sondre: But then over the last five years here, you know, you first had the sort of social justice moment, then you had effective altruism really be more prominent, and now you have this more techno-optimistic kind of meme space back with a vengeance, and with a bit of an axe to grind, I think, just really annoyed.
[00:21:31] Sondre: So I think that's what you're really seeing, you know, and you see some people like Garry Tan. Because techno-optimism, if you take it out of the current context, is a very mild, not scary belief system for someone who isn't part of it, right? It's just, I wanna solve problems with technology. It's very non-confrontational.
[00:21:50] Sondre: We have this situation now where, unusually, they've been on the defensive for a while. So you see, you know, people like Garry Tan, the current president of Y Combinator, be very defiant in his sort of pro-technology, optimistic attitudes. Yeah, so that was a journey. I don't know where that ended or started, but I love talking about it nonetheless.
[00:22:10] David Elikwu: Yeah, no, it's interesting. Just one of the things that you said, and you're probably right, a lot of these people, I don't necessarily know them in real life, but you've at least interacted with a few of them, but I think it's a key point that you make, where in reality, I'm sure a lot of these people are far more nuanced than they present themselves as on Twitter.
[00:22:28] David Elikwu: And someone was making the joke the other day about, is it Beff Jezos, or Beth Jesus, or whatever his name is on Twitter, who's, you know, one of the leaders of the accelerationist movement, when he recently went on the Moment of Zen podcast. So he recently did a podcast interview showing his face for the first time, 'cause he's been an anonymous account. And hearing him speak, you know, he's a lot more nuanced than perhaps people were expecting, just a mild-mannered nice guy and all of these things, which are very much in contrast with how his public persona played out, where it was, you know, not his face, not his name, and it was almost just this abstracted version of his ideas. The things that you shout about on Twitter may just be a microcosm of the actual nuance that you embrace in reality.
[00:23:16] David Elikwu: But I think the funny thing is that that happens a lot on platforms like Twitter, and I don't wanna just talk about Twitter, but online in general. The interesting thing that I think happens, and I did write about this a little bit, is this idea of mini metaverses. And actually I think it is appropriate, I was trying to think of the extent to which maybe it's not an appropriate connection, but I think it is.
[00:23:35] David Elikwu: It's this idea that, okay, you go and watch Netflix and there are two layers of personalization that a lot of people encounter. The first layer is that, based on the things that you've watched before, Netflix will decide what to show you and suggest things that you should watch. And one thing that I learned quite recently, when I got locked out of my Netflix account and used my partner's account, is that I actually don't know what I wanna watch when I look at her account. I have no idea. I didn't come here intending to watch anything, and when I'm seeing things that are not related to my interests, I don't actually know what any of it is. But anyway, that's the first layer.
[00:24:08] David Elikwu: The second layer is that, based on a ton of variables, like how long you spend looking at different title tiles, et cetera, Netflix doesn't just change what it suggests to you, but how it suggests things to you, and so the images that it will use to show you. This is why, when people ask, why do I never see the actual title imagery, why don't I actually see the movie poster? They're not showing you the movie poster. If you like horror, they're gonna show you the horror bits of Stranger Things. If you like nostalgic eighties stuff, it's gonna show you the nostalgic eighties stuff of Stranger Things. They're gonna sell you the same stuff, but in very different ways.
[00:24:45] David Elikwu: And it's this idea that everywhere you go on the internet, it is tailoring itself to your most polarizing ideas. So when you are picking what movies to watch, the entire architecture of your movie suggestion algorithm is designed around what you came in with. So whatever you came in with, it's just gonna give you a lot of that back as feedback. And when you go on Twitter, it's a similar thing. It's just showing you, based on any clues that it's had, the most polarized version of that. And there's also the inherent feedback loop of the audience capture bit, where, okay, you are tempted, if you want to get more likes and more dopamine, more whatever, you have to feed back into that.
[00:25:25] David Elikwu: But it's just a very interesting idea that as the world gets more personalized, it can also then change who you are and what you believe reality is. And so in real life, just going back to what we were discussing, a lot of these people might think in a far more nuanced way, but there are probably a lot of people consuming these ideas for whom this is now what reality is, and they are just gonna get stuck in a self-reinforcing feedback loop of those ideas, the more online they are.
[00:25:52] David Elikwu: And so what happens, and why I connected it to this idea of mini metaverses, is that people are worried about the far future idea of the metaverse, where you are wearing these goggles and you're in this virtual world. A lot of people are already in the virtual world; they just haven't taken off the goggles. Every time you log onto Twitter, you're in the virtual world. That is not necessarily reality, that is just, you know, an abstract conceptualization that's designed entirely for you. And everywhere you go on the internet, Google search results, YouTube, everywhere you go, it's the exact same thing: it's not real, it's a reality that's been designed only for you. You go on someone else's YouTube, it's completely different.
[00:26:27] David Elikwu: So anyway, I know I've rambled a bit, but those were just some of my thoughts on what you were saying.
[00:26:31] Sondre: So I would tend to agree, and I update a little bit, to use an EA expression, update in your direction. That's a cute expression. Which is that, yeah, there is this clustering happening.
[00:26:46] Sondre: But one thing that also struck me, on the other side of this coin, is also happening at the same time. I'm just gonna make this argument; I don't know if I actually think this, I'm just playing devil's advocate here for a second. And that is that maybe we're not clustered enough. So if you were to go back a little bit in history, you had people who lived so isolated that they developed their own languages, right? Because travel was expensive and there was not much interaction.
[00:27:07] Sondre: And also, as recently as the nineties, go back to the nineties and subcultures were much more intense. Like if you were to go to some nerdy convention, either for a philosophy or a game or something, it was remarkably different from mainstream culture, and there was very little cross-pollination. So you had these genuine subcultures. There is a way in which that's good. So the harmonization that comes with everyone being connected to everyone else, and the subcultures kind of bursting onto the scene, you know, whether it's r/all or whatever, does have this downside to it, which is, to use a politicized term, a lack of diversity. Which is that, if you have fish in a pond, and then the pond dries out, and there's variation in the look and feel of the fish, one of them can flap onto the next pond. So that variation becomes like a survival mechanism against massive external impacts.
[00:28:04] Sondre: There's an argument to be made that, yeah, there's a little bit of clustering, but broadly the trend is that we're getting so much more similar it's ridiculous, right? We're all trending towards using English, right? There's this universal culture definitely emerging on the internet. And you have people who are in the most protected subcultures referencing the mainstream culture, like you will have rural Taliban fighters referencing, like, Bay Area social squabbles in their, you know, writings.
[00:28:37] Sondre: You know, there's this old Rammstein song, We're All Living in America, which used to be true. I think to some degree we're all living in the Bay Area a little bit now, via the internet. So maybe the problem isn't that there is some clustering, maybe the problem is that it's not clustered enough. Like, maybe we're all so alike that when something dramatic and external happens, and the mainstream approach isn't gonna work, then there is no weird subculture with a completely different approach that's gonna carry us through.
[00:29:07] Sondre: So that would be my potential counterpoint. I think it's a 'both sides are true' situation. But yeah, that's another perspective.
[00:29:14] David Elikwu: No, I love that perspective and I think that's a great counterpoint.
[00:29:17] David Elikwu: Thank you so much for tuning in. Please do stay tuned for more. Don't forget to rate, review and subscribe, it really helps the podcast. And follow me on Twitter, feel free to shoot me any thoughts. See you next time.