David speaks with Eric Jorgenson, the CEO of Scribe Media. He also has a rolling fund investing in pre-seed and seed-stage tech companies, and is the author of two books: The Almanack of Naval Ravikant: A Guide to Wealth and Happiness; and The Anthology of Balaji.
They talked about:
Impact of Naval and Balaji's books
How Balaji Srinivasan became a polymath
The power of technology and truth
Intersection of technology and media
The West's media and technology war with China
The challenges of funding innovative ventures
A techno-optimist future
Different types of truth
This is the second part of a longer conversation. You can listen to the earlier episode here:
Part 1: Upgrade your thinking with Eric Jorgenson (Episode 73)
Listen in your favourite podcast player
Watch on YouTube
Connect with Eric:
Twitter: @EricJorgenson | https://twitter.com/EricJorgenson
Website: Scribe Media | https://scribemedia.com/
Books: The Anthology of Balaji | https://amzn.to/45UjFqe
The Almanack of Naval Ravikant | https://amzn.to/47g7ncy
Show notes:
00:00 | Intro
02:17 | Impact of Naval and Balaji's books
07:56 | How Balaji Srinivasan became a polymath
12:14 | The power of technology and truth
17:16 | Intersection of technology and media
23:59 | The West's media and technology war with China
28:50 | The challenges of funding innovative ventures
38:02 | A techno-optimist future
44:50 | Different types of truth
Mentioned in the show:
Naval Ravikant | https://en.wikipedia.org/wiki/Naval_Ravikant
Balaji Srinivasan | https://en.wikipedia.org/wiki/Balaji_Srinivasan
Smart Friends | https://www.ejorgenson.com/podcast
Chutzpah | https://en.wikipedia.org/wiki/Chutzpah
Andreessen Horowitz | https://en.wikipedia.org/wiki/Andreessen_Horowitz
Nassim Nicholas Taleb | https://en.wikipedia.org/wiki/Nassim_Nicholas_Taleb
Richard Feynman | https://en.wikipedia.org/wiki/Richard_Feynman
J. Robert Oppenheimer | https://en.wikipedia.org/wiki/J._Robert_Oppenheimer
Claude Shannon | https://en.wikipedia.org/wiki/Claude_Shannon
TikTok | https://en.wikipedia.org/wiki/TikTok
Musical.ly | https://en.wikipedia.org/wiki/Musical.ly
Vine | https://en.wikipedia.org/wiki/Vine
Elon Musk | https://en.wikipedia.org/wiki/Elon_Musk
SpaceX | https://en.wikipedia.org/wiki/SpaceX
Steve Jobs | https://en.wikipedia.org/wiki/Steve_Jobs
Pixar | https://en.wikipedia.org/wiki/Pixar
David Friedberg | https://en.wikipedia.org/wiki/David_Friedberg
Cana | https://www.cana.com/
The Power Law | https://amzn.to/3sJTZ1Q
Sebastian Mallaby | https://en.wikipedia.org/wiki/Sebastian_Mallaby
Where Is My Flying Car? | https://amzn.to/3sKGPlf
Bill Clinton | https://en.wikipedia.org/wiki/Bill_Clinton
Mike Maples Jr | https://www.floodgate.com/team/mike-maples-jr
About David Elikwu:
David Elikwu FRSA is a serial entrepreneur, strategist, and writer. David is the founder of The Knowledge, a platform helping people think deeper and work smarter.
Twitter: @Delikwu / @itstheknowledge
Website: https://www.davidelikwu.com
YouTube: https://www.youtube.com/davidelikwu
Instagram: https://www.instagram.com/delikwu/
TikTok: https://www.tiktok.com/@delikwu
Podcast: http://plnk.to/theknowledge
EBook: https://delikwu.gumroad.com/l/manual
My Online Course
Career Hyperdrive: https://maven.com/theknowledge/career-hyperdrive
Career Hyperdrive is a live, cohort-based course that helps people find their competitive advantage, gain clarity around their goals and build a future-proof set of mental frameworks so they can live an extraordinary life doing work they love.
The Knowledge
Newsletter: https://newsletter.theknowledge.io/
The Knowledge is a weekly newsletter for people who want to get more out of life. It's full of insights from psychology, philosophy, productivity, and business, all designed to help you think deeper and work smarter.
My Favorite Tools
Descript: https://bit.ly/descript-de
Convertkit: https://bit.ly/convertkit-de
NordVPN: https://bit.ly/nordvpn-de
Nutmeg: http://bit.ly/nutmegde
Audible: https://bit.ly/audiblede
Full transcript:
[00:00:00] Eric Jorgenson: We are living in an info war. You can find every possible take on every possible issue. And it can be really maddening to try to figure out what is actually happening, what's true? What's a lie? Who's trying to manipulate me and for what reason? And did they even know or are they just repeating a lie that someone else told them? Did that person even know it was a lie, right? It becomes incredibly difficult to navigate the world in an effective fashion if your entire perception is built on lies.
[00:00:29] David Elikwu: This week we are back with Eric Jorgenson, who is one of my favourite recent guests. Eric is the CEO of Scribe Media, he runs a small rolling fund investing in early-stage tech companies, and he's also the author of two awesome books, The Almanack of Naval Ravikant and The Anthology of Balaji.
[00:00:44] David Elikwu: So in this episode you're going to hear us talking about that second book which he's just released quite recently. This episode is essentially the techno optimist manifesto, this is the call to arms for the builders and the visionaries and the people that want to go out into the world and build or invest in incredible things.
[00:00:59] David Elikwu: So you're going to hear us talking about a wide spectrum of things, but really a lot of Balaji Srinivasan's ideas about the future and about technology. First of all, who is Balaji? How did he become such a legendary polymath?
[00:01:12] David Elikwu: We talk about the power of technology and truth. We talk about the intersection between technology and media, and also the West's media and technology war with China. Then we talked about the challenges of funding innovative ventures and why it can often be so hard to correctly imagine what the future could look like. And really, we had wide-ranging discussions around different versions of what the future should look like and how we should be investing in it.
[00:01:36] David Elikwu: And then finally we touched on one more of Balaji's ideas, which is the different types of truth. And so you're going to hear us talking about what the different types of truth are and why they're relevant to you.
[00:01:45] David Elikwu: So this is a really great episode, like I said, for builders, investors, visionaries, anyone who cares about what the future could look like, what things we want to build, what technologies we want to fill our lives with, and how we could better shape the world that we live in.
[00:01:57] David Elikwu: So you can get the full show notes, the transcripts, and read my newsletter at theknowledge.io, and you can find Eric online on Twitter @EricJorgenson.
[00:02:05] David Elikwu: If you love this episode, please do share it with a friend and don't forget to leave a review wherever you listen to podcasts because it helps us tremendously to find other listeners just like you.
[00:02:17] David Elikwu: I've probably listened to 50, way more than 50 hours of Balaji. And that is not because I'm some deep researcher, that's just because each podcast is at least five hours and so you only have to listen to him talk maybe six or seven times and you've got like 50 hours.
[00:02:33] David Elikwu: And I think with both of your two books, I think they are important for different reasons. The Naval book I think is important because he doesn't say a lot, he's not always in front of the camera, he's not always coming on podcasts. He has said a lot in the past, but you know, he's almost like the wise monk that disappears into the mist and comes back with a revelation, and then disappears again for a number of years. And then Balaji, on the other hand, is always on podcasts. I mean, actually no, he does disappear in between. But when he comes out, it's almost like a visiting preacher. And it's so funny watching him go on some people's podcasts, because he comes with his own message. It's not really so much an interview. It's like he has the latest revelation and he's come to tell you about, you know, his latest ideas, and he just does the rounds. And I think that is probably the harder one. Well, I guess different reasons, right?
[00:03:20] David Elikwu: So the Naval book is: okay, he said a lot, but not very loudly, and it's all spread out and disparate. And the job is pulling all of that together in one place, and that's what makes it valuable. But then Balaji is actually about condensing, 'cause he said so much. There's hours and hours and hours of content, and the job is bringing all of that into one place.
[00:03:38] David Elikwu: So I'd love to know how you found the balance between both of those experiences and you know, what it was like actually trying to condense all of Balaji's speeches and essays and tweets, et cetera.
[00:03:49] Eric Jorgenson: Yeah, they are very different. You know, for all the, probably overlap that they have in audience and people who are interested in the similar things. Very different communicators, very different thinkers.
[00:03:59] Eric Jorgenson: In compiling Balaji, similar amount of material. I think both of them had, you know, well over a million words of source material, dozens, over a hundred, individual sources for both of these books that were the raw material that I pulled from and had to sort through and filter through. With Balaji, I was filtering a lot more of his contemporary stuff. He spends a lot of time on whatever the current event is, you know, so he had talked a ton about Covid and very little of that ended up passing my filter for being, like, evergreen, applicable, useful ideas for millions of different readers that'll be relevant, you know, five or 10, 20 years in the future. So it was a harder filter on some of that stuff.
[00:04:41] Eric Jorgenson: And Balaji also, to your point, he's just very different, he's not a distiller. He has zingers of lines, but he will explain every single aspect of his thinking to you and rationalize every point, and show you every data source and just go all the way to ground truth and as far back through history as he can find.
[00:05:00] Eric Jorgenson: And that is more of a challenge for a curator, right? In my job as an editor, it's much more difficult then to decide: what is the most useful supporting argument? Does this point need a supporting argument or is it self-evident? He has this habit of explaining something to explain something.
[00:05:17] Eric Jorgenson: And so he'll be like, let me tell you about the importance of technology. Have you heard of the French Revolution of 19, you know, 34? And you're like, no, I haven't heard of that. He's like, all right, well let me explain the French Revolution of 1934. I'm making that up. That's probably not a thing, although the French revolt constantly. So they probably did something in 1934.
[00:05:32] Eric Jorgenson: But understanding what's the most salient point, and figuring out what the best phrasing of it is, and how to thread those ideas together. And in Balaji's case, you know, Naval talks very intentionally about fundamental truth from the very beginning. He's constantly searching for the most concise, most useful individual thing to say, and it's beautiful. And it's partly why so many people love him. Balaji is almost the inverse. You know, he has these deep fundamental truths that he doesn't talk about very often. You don't very often hear him say how important technology is, or explain the fundamental moral good that technology is, which is really the driving force of why he's talking about Covid, or why he's talking about the need for different regulations, or why he's concerned about the collapse of the dollar, or any of the things that he's talking about, or why we want a network state. He spends so much time really on the leaves of the forest and not very much on the roots.
[00:06:34] Eric Jorgenson: And that's really what I wanted to accomplish with this book. I wanna show the deep roots, the soul, and the vision behind why Balaji's doing everything that he's doing. I hope it makes it more evident, you know, the mission behind it. I hope it brings more people into that mission.
[00:06:51] Eric Jorgenson: You know, I was talking to somebody yesterday on Twitter who read the book, and they tweeted me beforehand and they were like, is this gonna make me finally like Balaji? And I was like, I don't know, man. There's only one way to find out, right? Give it a try, give it a read. And he read the book and he came back and he said, you know, there's actually a lot in there I agree with. There's stuff I disagree with, but the fundamentals, the roots of technological optimism and accelerationism and the hope, like, hope for a better future, I agree with all of that.
[00:07:19] Eric Jorgenson: There's stuff I disagree with in the methods and the tactics of how we're getting there. And I have a different view of how it unfolds, but we want the same thing generally. And I was like, great, fucking great. That's so awesome. Like, let's recognize that we are 98% on the same team and concentrate on supporting each other on the 98%, not fighting about the 2% where we disagree in some nuance about how to make a better future, right?
[00:07:43] Eric Jorgenson: So I'm already seeing, this is anecdotal, but I'm already seeing this sort of happen a little more, and this broad acceleration movement is really, like, gaining steam. And I hope this is just, you know, fuel on that fire.
[00:07:56] David Elikwu: Yeah. Awesome. I think what is really interesting about Balaji for me, compared to lots of other people that you could mention within Silicon Valley or in other technology spaces, is that he almost equally embodies the practitioner, the philosopher, and the prophet, who has, you know, this grand idea, this methodology that he wants to share.
[00:08:15] David Elikwu: And what I find really interesting is that, first of all, okay, I think it would probably be useful for you to give us a proper bio on Balaji. Because what I think is really key, or what's really interesting for me, is the fact that his background spans so many domains. And even like you just mentioned, sometimes he's describing something and he'll refer to some obscure mathematical theory, or he'll refer to history, he will refer to geography. He has a background in genomics. He's been a CTO. So he pattern matches across all of these domains, but also simultaneously, in each of those domains he seems to have beliefs that other people don't hold, right? So within genomics, he might say something or believe something that other genomics practitioners don't. And within investing, he might see things that other people don't, you know, he talks about digitalization, et cetera.
[00:08:58] David Elikwu: So I'd love to know, maybe first of all, give us the bio, and then also, you know, despite his experience in all these domains, how come he continually comes to contrarian opinions?
[00:09:08] Eric Jorgenson: Yeah, I think you won't make it too far into the book before understanding why he's a contrarian. He's got, like, contrarian in his blood from a very young age. And there's a line in his bio, the very first chapter that I put together, which is his background, and he says in high school, or middle school even, there were standings in the front of the school for who led in honor roll, who led in grades, and who had the most detentions. And he was always at the top of both. So I think: high performing, brilliant, and constantly in trouble, right? That is who he's been since a very young age. And he became a fighter very early on. And I think we see that a lot, you know, that's probably why to some extent that guy was like, is this finally gonna make me like Balaji?
[00:09:50] Eric Jorgenson: Like Balaji is not trying to be liked. Balaji is trying to be correct and accomplish his goal, and does not back down really from any confrontation along that path. And I mean, I respect the chutzpah, you know, like he's just, and I'll give you the bio, but as you hear this bio, like think about all the fights that had to happen along the way and like what he had to go through and how hard he had to work to kind of like do all of these things in this time span, right?
[00:10:16] Eric Jorgenson: So he grew up in New York, like I said, detention and honor roll. I think it was clear that he was smart from a very young age. Went to Stanford, got multiple master's degrees at Stanford, then got a PhD at Stanford in computational genomics. Started a company right out of that, moved from academia to a startup because he thought it could increase his impact. That was very much built on his PhD work. It was a clinical genomics company that they ended up selling for a couple hundred million dollars. So that was a huge early win, and he has invested in hundreds of companies since then. He's taught at Stanford. He's co-founded a few more companies, one of which was an early crypto company. They got bought by Coinbase, and so he went and became the CTO of Coinbase for a while and helped them scale up relatively early on. I believe that was around 2014, 15. Then he became a general partner at Andreessen Horowitz, so he helped them do the crypto fund, helped them do a ton of what became, I think, quite high performing funds. So he moved from founder to investor to operator, back and forth, multiple times.
[00:11:22] Eric Jorgenson: And then he became, I don't know what you would call what his career is now, 'cause none of us really know, like somewhere between creator, author, investor, prophet. The last few years he's basically just been doing Balaji, right? That continues to mean investing. He published The Network State. He is now, like, starting a network state. He's got The Network State podcast. The way I think of it, he's really serving an ideology, which I think is a really interesting way to think about your career, right? He's like, I want a brighter future. I want a frontier, I want space and freedom for technology to evolve and grow and have a greater impact. And he spends his time now on whatever project he thinks is most likely to create that outcome for humanity. Which is a really fascinating thing, you know, as you think about your career. Like, think about what the ideology is that you serve.
[00:12:14] David Elikwu: Sure. Yeah, that makes a lot of sense. And you mentioned, you know, his ideology. I'd love it if you could expand on that, because obviously, like we mentioned, Balaji covers a ton of ground. I'd love to know, from your perspective, which of his ideas do you think are the most important for a regular person to grasp? And I frame it that way just because, you know, there's a lot of things that are really interesting, but perhaps just if you are working in genomics, or if you are working on something that is relevant to banking or to investing. Okay, this is something that you should be thinking about. But I think some of his ideas expand far further than that and touch the lives of regular people, who may have no idea that all of this is on the horizon or that all of these things may potentially be coming down the pipeline.
[00:12:59] David Elikwu: So what do you think are some of the most important ideas for people to understand?
[00:13:02] Eric Jorgenson: Yeah, I would think, I mean, this book basically boils it down to two. If you're just gonna walk away with an idea and keep living your life, but do it in a slightly more informed way, it comes down to two things, which are the first two sections of the book. One is technology and one is truth.
[00:13:17] Eric Jorgenson: So the whole takeaway of the technology piece is that all of the good in our lives, to an incredible, unbelievable degree that we've totally forgotten how to appreciate, comes from technology. Value creation comes from technology. The reason you are not, like, cold, starving, and probably already eaten by a lion is technology.
[00:13:38] Eric Jorgenson: And we still fear change so much that we fight it constantly. You know, so much has changed in our culture, in our regulatory regime, just in what we appreciate in daily life, that I think we've forgotten to marvel at the technology that we have and to be excited about the technology that is to come. Every technology brings new good and bad, but, and I can think of very, very few even close counterexamples, a new technology almost always does more good than bad.
[00:14:06] Eric Jorgenson: You know, humans are fearful creatures. It's difficult to appreciate that every day. But that whole first section is just about, like, please, please, please understand and respect all the good things that technology has done for us, and let's continue to appreciate it and invest in it.
[00:14:20] Eric Jorgenson: And moreover, for you as an individual, it's probably where incredible opportunities lie, no matter what you do. If you're a plumber, a day laborer, or an accountant, if you're a homeless dude, I don't know. There is a way for technology to improve your life. And no matter what position you're in, there's probably a next step, an improvement that's available to you because of an application of a technology that you may not yet appreciate.
[00:14:45] Eric Jorgenson: And so I hope there are a ton of ideas in there that can help people see those opportunities and seize them and apply them, however selfish you wanna be with that. Make your life better, and technology is the way to do it. Let's go. Use this roadmap.
[00:14:58] Eric Jorgenson: And the whole second section is really about the fundamentals of truth. The different types of truth, and, can I say, the variables that affect your perception of truth, most notably the media. I think we talked about this a little bit in the last episode, but it's easy to forget your media diet and take it for granted. You stop thinking about the quality of the inputs that you're putting in. And just learning to separate the things that you're hearing and seeing and reading, whether that's TV, newspaper, or social media, or your friends, from, you know, well, that doesn't fully reflect the truth of the situation.
[00:15:34] Eric Jorgenson: Let me remember that that gap is there. Let me learn to examine it. Let me learn to see the shades of gray that might exist. And you know, increasingly I feel like we are living in an info war. You know, like you can find every possible take on every possible issue. And it can be really maddening to try to figure out what the fuck is true. Like, what is actually happening, what's true? What's a lie? Who's trying to manipulate me and for what reason? And did they even know, or are they just repeating a lie that someone else told them? Did that person even know it was a lie? Right? It becomes incredibly difficult to navigate the world in an effective fashion if your entire perception is built on lies.
[00:16:12] Eric Jorgenson: So the art of understanding truth and the different types of truth, 'cause there are types of truth, and using that as your map, getting a good map and a good compass, and then using that to navigate, and not just operating on whatever whispers you hear and wondering why the hell the world isn't responding to, you know, the inputs that you're trying to give it. Because you're working in this cloud of mist that has no bearing on the reality. So of course you're not making progress, right?
[00:16:38] Eric Jorgenson: So technology and truth, and I say this in the introduction, are like, you know, your sword and your shield for your quest. And if you can't wield those effectively, you're not gonna be very effective. You're not gonna make much progress, you're not gonna reach your goals, you're not gonna create the life that you want. And the whole third section of the book is really about how to apply them, whether you're starting a project or starting a company or building something for others, starting a country, starting an open source movement, starting a podcast, whatever it is. But you don't have to start something to be more effective just by understanding the power of technology and the types of truth and the way to suss it out.
[00:17:16] David Elikwu: Yeah, I completely agree, and I definitely wanna come back around to talking a bit more about both technology and truth, but I really just wanted to jump on what you said about media. That was my Freudian slip of the direction that I was going in.
[00:17:29] David Elikwu: Yeah, I think genuinely people really underrate the role that media plays in our lives, and I find it really interesting. So the question that I want to get to is your thoughts on this intersection of technology and media.
[00:17:39] David Elikwu: Specifically because I think very often, within Silicon Valley conversations, there often seems to be a perception that media and technologists are diametrically opposed. It's almost like they're fighting on competing sides, right? The media is trying to get the scoop. They're trying to unravel things. It seems like, or at least there is a framing that, the journalists have a different agenda to the technologists. There are people that see journalists as non-practitioners and maybe kind of look down on or disparage them in that sense.
[00:18:12] David Elikwu: Nassim Taleb talks about skin in the game, right? And he disparages journalists and commentators as people that are not actively doing the work. They're not actively building things, right? And I actually just saw a tweet from Nikita Bier that I laughed at earlier today, where it was something like, you know, people that are writing newsletters instead of building products, right? So I think overall there's just this rhetoric about how these two things are separate and maybe not aligned.
[00:18:37] David Elikwu: But then simultaneously the media that we consume completely shapes the adoption of technology. There's some extent to which technology can make itself inevitable, but you just think about nuclear as an example, and you see how, okay, the way that the media and journalists frame a particular idea, a particular form of technology, can completely shape the way in which it's adopted simply because of legislation. We've seen the same thing with Bitcoin. You see the same thing with lots of different types of technology. You know, it seems important in a sense for media and technology to be aligned so that people can get behind the right ideas and we can drive progress forward in the right way.
[00:19:14] David Elikwu: So I'd love to know maybe some more of your thoughts, perhaps from what you've read from Balaji or from Naval about what should we make of this intersection between technology and media?
[00:19:23] Eric Jorgenson: Yeah, it's a really fascinating thing, and it has been for a long time, actually. It's easy to forget that the battle between journalists and technologists has been going on for a long time. You know, in the era of yellow journalism there were plenty of info wars around, like the Robber Baron era also.
[00:19:39] Eric Jorgenson: I think we have seen an incredible crescendo in it because, for maybe the first time, they're really directly opposed. The biggest threat to media business models over the last two, three decades has been software and internet businesses, social media, Google, attacking the classifieds. Of course traditional media is gonna attack technology. Every incentive they have is to get them regulated, pulled down, dragged, to lower public opinion of them. Just look at the incentives at play, right? It is so obvious that they would be enemies.
[00:20:13] Eric Jorgenson: And we have this, I don't know if it was ever true really, like I haven't seen a crazy compelling case for it, but we have, at least at some point, maybe closer to 50 years ago, this sense that the media was really noble. That journalists were these incredible white knights, like, investigating and pulling dirty truths out into the public. And some of them absolutely are, and I respect the hell out of them.
[00:20:37] Eric Jorgenson: And in the same way, you can't necessarily lump technologists together. Not all technologists are good. Some of them are evil, or some of them are scammy. Journalists, some of them are evil, some of them are just, you know, responding to the incentives, and some of them are truly heroes, like heroes of humanity.
[00:20:51] Eric Jorgenson: But when taken at an institutional level, I think the incentive system tends to win out over the moral system. So I'd encourage people to look at the incentives at play, right? And try to read between the lines of what's being said, and understand why it's being said, and by whom, and what their hope is. 'Cause you, as the recipient of the media, are the goal. You're the mark, you are the resource, like, people are competing for your opinion. Especially as an American, I think there's this interesting dynamic where the world powers are competing for the public opinion of America, because the public opinion of America to some extent drives American policy. And American policy to some extent determines what happens all over the world, like what conflicts they get involved in, on whose behalf, and to what degree, which is a drastic thing. And to think that that's being affected by, like, TikTok algorithms and hashtags, and that, relatively cheaply resource-wise, a small number of hackers can hack the attention and beliefs of the American public and push it in different directions, and that that tips the scale on a global conflict, is insane. But I think it's the reality. It's an important thing to recognize that your attention and your opinion are being actively manipulated, and you need to armor yourself against that and recognize it to some extent.
[00:22:09] Eric Jorgenson: The other piece that you said that was really interesting is that technology is inevitable. I think that's a really dangerous misconception, actually. I think technology is not inevitable. No specific advancement is a given. And we need to not take that for granted. We need to not believe that technology gets better on its own. Recognize that we, humanity, are fighting entropy every day of our lives to progress, not to recede.
[00:22:35] Eric Jorgenson: And I'm already working on this Elon book, which will be my third in this kind of series, in this format. And he's got a great point that as you study history, you see many civilizations lost technological advances that they had made. You know, the Romans could build aqueducts and then at some point forgot how. The Egyptians could read hieroglyphics and then at some point lost the ability to. It is not a given that we keep even the technology level that we have; maintaining it is not inevitable. You can imagine all kinds of disasters that would set us back a long way, or simply breakthroughs that don't happen. Where would we be without Einstein? Where would we be without, you know, Feynman and Oppenheimer and the people that created the bomb? Where would we be without Claude Shannon and the other fathers of computing? There are so many small pivot points that the history of technology, and therefore the history of humanity, hinges on, that to not reach and strive for every single one of those that we can possibly get is just so shortsighted, and such a shallow thought.
[00:23:36] Eric Jorgenson: And I hope that more people appreciate that. I talked through the self-driving car example in our previous episode, and I think that's the most pertinent one playing out today. It should be so obvious to so many people, and I can't fathom the people who are, you know, fighting so hard against it.
[00:23:53] Eric Jorgenson: But we're surrounded by them. And that's why we have to remember that it's not inevitable. Never take it for granted and keep fighting for progress.
[00:23:59] David Elikwu: Okay. You make a really good point. That makes a lot of sense. So there are two questions that I wanna ask. One is gonna come back to this idea of technology, and actually, yeah, it does align with the inevitability of technology and some of Balaji's thoughts on that. But I did want to ask, and this is a completely different question, but just from my own experience, which does intersect with this idea of media and technology,
[00:24:20] David Elikwu: I would love to know what you think about TikTok specifically, because this is another one of these examples where, I think I mentioned the last time we spoke, I worked in China. It was a long time ago, but, you know, I worked at a law firm in China and I spent some time there. And very often when I hear some of this dialogue, there's a sense in which it feels like I've kind of gone behind the Iron Curtain, because there's a proliferation of ideas, in various respects, about what life for the average Chinese person is like, or what China wants, or what Chinese people want.
[00:24:50] David Elikwu: So on one hand you have the perspective of, oh, Chinese people work like this, and, you know, American people work like that. And I think there was a video that came out a while ago. It was just this random joke video of a white girl, I think it was about getting married, and she was miserable and working and miserable and all of these things. And people were like, oh my gosh, this is a Chinese psyop on the American people. And the joke, at least to me, was that that girl is a Chinese influencer. She posted that on a Chinese social media platform, it went viral in China, and then someone reposted it, or maybe she reposted it, on TikTok. So TikTok was the second or third place this video ended up, and it went viral there too, because it's funny, or it's interesting to people, right? And there's this sudden perception that, oh my gosh, it's a psyop or whatever. So I think, okay, that's one aspect of it.
[00:25:38] David Elikwu: Then there is also the layer of, okay, you know, China is potentially our enemy. Does China want our data? What's happening there? So I think there are a lot of conflicting beliefs that people might have, because on one hand you could say maybe China is trying to harm our children, but on the other hand, in China, China has limited children's video game playing to however many hours. China decides that children should do however many hours of exercise as part of a school day. So there are things that China does to protect its children that America doesn't do, because, you know, you have freedom, right? You have freedom to decide how many hours you wanna spend playing video games and how many hours you want to go outside and play sports or exercise. And so there's this really interesting duality where I completely see both sides of it. I completely see the restrictive side and the side that people might be concerned about.
[00:26:28] David Elikwu: Then simultaneously, I do see that in some ways the perceptions that people already hold can blind them to potential solutions. Because we have this perspective of China doing this or China saying that, we don't necessarily take steps ourselves to, you know, protect our children or build technology that works for us. And sometimes the protectionism comes at the cost of actually building great things, right? TikTok, not that you could have seen it coming when they created their algorithm and their way of doing things, and it was Musical.ly before that. We already had Vine. Vine was owned by Twitter. In the West, at least, there had been competing products, and they got killed, or they got crushed for various reasons. And it feels like instead of trying to build better stuff, sometimes we just focus on, okay, you know who the political enemy is, and we are fighting the media battle instead of fighting the technology battle.
[00:27:16] Eric Jorgenson: Yeah. I think the world is a better place when we fight the technology battle every chance we can get. You know, the space race was a great thing for our civilization. It was probably a scary time to live through the Cold War, but it produced incredible things. And, you know, for the 50 years since then, we accomplished very little in space, until Elon Musk came along and was like, nope, I've had it. Here we go. We're doing this. Everybody get off your ass, right?
[00:27:39] Eric Jorgenson: So, there's some extent to which like a little friendly competitiveness is a good thing.
[00:27:43] Eric Jorgenson: I don't really have an opinion on TikTok as an individual thing. Exactly to your point, I have no idea what to believe about any of it. I've heard outrageous claims in every direction. All I know is that I get less done when it's on my phone, so I don't have it on my phone. But the same applies to Instagram, so, you know, they both fit in the same bucket to me.
[00:28:50] David Elikwu: Yeah, that makes sense. But actually, what you were saying leads to the second question that I was going to ask. Balaji, who you wrote about in your book, has a quote, and I think it's one of the section headings, which is, you know, technology is supposed to be building what money can't buy.
[00:29:04] David Elikwu: I think very often in reality we get waylaid from that mission. And some of that seems to be an incentive misalignment problem, where because of constraints on, you know, wanting things to be profitable, or having different time horizons as investors or as consumers, we don't really allow technology to proliferate in the way that we should. And so, exactly like you mentioned with space exploration, before SpaceX we weren't actually doing anything. Even with SpaceX, I think Elon had to put some of his own money into it to keep it going. It's not always easy to get the funding, particularly in the early stages. You know, Steve Jobs was exactly the same with Pixar. People might underrate today how big of an impact Pixar had in terms of animation, but, you know, we wouldn't have had this whole era of entertainment without the work that Pixar did.
[00:29:48] David Elikwu: And then simultaneously, a recent example that has been on my mind for a little while is Cana, the company that David Friedberg was building. And I think, you know, people have already said, oh, it's already failed. But to me, the concept of being able to turn drinks into software is, first of all, a really magical idea, but second, that seems like the kind of futuristic thing that should be pursued at almost whatever cost, right?
[00:30:15] Eric Jorgenson: Can you explain this company to me? I have not heard of this.
[00:30:17] David Elikwu: Ah, okay. So essentially, and I might give you the poor man's explanation, but what they were trying to do is reduce almost all drinks, not on a biological but on a chemical level, down to the variables and molecules that create lots of different drinks. I think there were five underlying base chemicals, and if you combine these chemicals in different ways, you can create all these different drinks. And so the aim was you could ship, you've seen like a soda water maker or a coffee machine at home, a machine that you put water into, and you could just buy a capsule. And the capsule is almost like software, but it includes some of the actual base stuff you need to create whichever types of drinks.
[00:31:05] David Elikwu: And the rest is just software injecting into your water the properties it needs, right? So wine has this level of texture, it has this type of flavor. How can you just put that into a drink?
[00:31:17] David Elikwu: And people have tried it. So it's not as though, on a physics level, this was impossible. People have actually tried it, and people say, okay, maybe the wine does not taste like the very best wine I've ever had, maybe the orange juice doesn't taste as orangey as it could, but these were some very early iterations, and at least they proved that this conceptually can work. You can get pretty close to having beer and soda all from just mixing a few basic chemical compounds. And so the idea that a company like that could fail, when the reality of what it could have been is all you need is water and then you just buy these tiny little capsules and you can have all types of drinks? That completely changes the beverage industry, at least in my mind.
[00:31:59] Eric Jorgenson: That's incredible.
[00:32:00] David Elikwu: So, yeah, coming back to the question that I was asking: we've seen lots of different examples of people building the future, building things that money can't buy, but those seem to be hard to invest in.
[00:32:11] David Elikwu: They don't seem to be the most investible companies at the early stages, because by virtue of what they are, they're always moonshots. So how do we get around this incentive problem where very often people just fund the tenth version of some super simple SaaS product for large enterprises?
[00:32:28] David Elikwu: You just get lots and lots of versions of the same thing, because people have a short time horizon, people wanna make money, et cetera.
[00:32:34] Eric Jorgenson: Yeah, look, venture capital needs to grow a pair, right? The roots of venture capital are investing in those crazy frontier things, taking on technical risk, not just market risk, and very carefully, intelligently de-risking things step by step on a technological basis. There's an amazing book called The Power Law by Sebastian Mallaby. It's a pretty good comprehensive view of the history of venture capital, how it got started, and some of the early technologies that it funded. And what you see when you compare the early days to now is that, quote unquote, venture capital is huge; so many people classify themselves as venture capital who would never meet that traditional individual or early-stage, high-technical-risk definition.
[00:33:18] Eric Jorgenson: And at some point a lot of software became venture capital funded. It has incredible business dynamics, you know, high margins, super low marginal cost of replication. And so a ton of people made returns in software, and now software's getting to be a relatively mature industry.
[00:33:35] Eric Jorgenson: And to me, most people that invest in software still call themselves venture capitalists. But most software investment now is not frontier technology investment. It could still be a great investment. It could still follow, to some extent, the venture portfolio math, where you have to invest in five or ten to get one that returns the others, but those companies have much lower capital intensity thanks to all of the tooling that exists in the software world now.
[00:34:01] Eric Jorgenson: And I hope for, and actively participate in, sort of a return to funding the crazy stuff. When you hear about it, it should blow your mind. It should seem borderline impossible. It should shake you up a little bit. It should make you go, whoa. And exactly to your point, the very first chapter of the book is titled Building What Money Can't Buy. That is the point of a high-tech startup. I feel passionate about it. These are the companies that I try to invest in. We have a small early-stage venture fund, and whenever possible we seek exactly that, right? We want to take on technical risk. We wanna find founders who are incredible visionaries, who are seeing 10, 20, 30 years into the future and building, even attempting to build, something that is an incredible step forward in human capabilities.
[00:34:45] Eric Jorgenson: And to your point on, like, you know, just because that company, Cana, is that what it was called?
[00:34:50] David Elikwu: Yeah.
[00:34:50] Eric Jorgenson: Friedberg's company? Cana. Okay. Just because that one didn't work does not mean that concept isn't a right thing to build, right? The path of a high-tech company coming all the way to fruition and getting completely deployed to as many people as possible is an extremely low-odds event. Any individual one is incredibly risky, and I hate it when I see and hear venture capitalists pass, like, oh, that hasn't worked, I've seen five of those and none of them have worked. It's like, maybe we as a civilization need to start 20 of them before one of them works, and it will still be worth it and good for all of us.
[00:35:27] Eric Jorgenson: So, as a founder, as an investor, don't get discouraged if you've seen something that doesn't work. If we can turn water into any fluid with a very simple, relatively cheap machine, and that's better for everybody on earth, hell yeah, we should keep trying that. Somebody go find out how you can license that patent, or try again, or invest in the next one. Keep trying, keep trying.
[00:35:50] Eric Jorgenson: We've been trying some of these things forever, but that company is an amazing one because I love that it's something most people would perceive as a tangible, physical miracle that takes place in your house. And aside from the internet, we haven't had a lot of those. There are some really cool experiences we've had, you know, like Hue light bulbs are pretty cool, and home automation and stuff like that, where if you showed it to somebody even from 20 years ago, they'd be like, what the hell is that? That is awesome.
[00:36:16] Eric Jorgenson: But the beverage example is great because it hints at what's possible with a much more mature nanotechnology, you know, if you read some of the science fiction, or the scientists who have studied it, especially nanotechnology, just 'cause it's the most physically mind-blowing.
[00:36:34] Eric Jorgenson: There's an amazing book on this called Where Is My Flying Car? And the author, J. Storrs Hall, has a PhD in nanotechnology. I did an interview with him and I read my favorite notes on my podcast, Smart Friends. So if you don't have time to read it, it's a little bit of a thick book, you can get a sense of it, and it'll blow your mind what the experts know to be possible.
[00:36:51] Eric Jorgenson: We should absolutely be having a Manhattan Project for nanotechnology right now. And we tried that in the nineties. I think Bill Clinton signed it into law, and then a bunch of academic grifters just redefined their projects to steal all the grant money into stuff that wasn't really true nanotechnology research, which pisses me off. But that's the nature of the beast. Back to the point: we have to keep trying. We should keep investing in it. We should keep doing research on it. The Foresight Institute is a great nonprofit that continues to push that forward and has collected a lot of that wisdom, and they throw events where you can go hang out with a bunch of other technologists and futurists who are trying to see what's possible, push it, and usher us a little closer to that.
[00:37:26] Eric Jorgenson: But yeah, it does not take long to dive down that YouTube rabbit hole of what's possible with mature nanotech and let that truly blow your mind. There's a huge piece of this next industrial revolution that we are destined for, and if we're lucky we'll live through it, and it'll be fucking insane and absolutely mind-blowing.
[00:37:42] Eric Jorgenson: There are forces against it, and I hope that we all reach for that a little sooner and get a little more optimistic about it. The world that we will have and the lives that we will live on the other side of some of those innovations will make our current lives look the way hunter-gatherer tribes look to us right now. And it doesn't have to be a hundred years away. It's crazy.
[00:38:02] David Elikwu: Yeah, the future is a really interesting subject. And just like what you were saying, I think one interesting aspect of it that I've spent time thinking about is the extent to which we can fear the future quite easily.
[00:38:13] David Elikwu: And I don't know what your current thoughts are on AI, but obviously that is one, and I think it's very important that people do think about the, you know, second-order consequences of developing AI, and okay, what happens if you reach artificial general intelligence, et cetera. But one of my favorite examples is Star Trek. If you look at some things they thought the future would look like, and then the things they overlooked, and go back to all of that old sci-fi, right? Nobody imagined that you'd have doors that open by themselves when you walk near them; every door, you have to go and push a button for the door to open. But meanwhile, they have transporters. People can teleport from place to place. Some of the basic stuff that we just figured out super easily, nobody ever imagined it, right? People are thinking you're gonna have
[00:38:55] Eric Jorgenson: Suitcases with wheels.
[00:38:57] David Elikwu: Before you have, exactly, yeah, just some of these basic things. When we think sci-fi, people skip over them, but then the reality is very different. You know, sometimes we're very wrong about the future. We have no idea which things will end up being super easy to do. Some of the things we perceive right now as roadblocks to creating the future that we want, for example relating to AI, there could be some other unlock that suddenly creates a very different world. And suddenly the things you were worried about are not worries at all, and you're living in a very different world.
[00:39:27] David Elikwu: And I know in the book Balaji talks a lot about maybe two types of futures, and you named the chapters after them: our digital future and our physical future. Within the digital future, I know Balaji talks about the blockchain, pseudonyms, digital natives, and self-organized digital countries, and in the physical future he talks a lot about transhumanism and abundance. Some of these ideas may seem a bit esoteric or far-fetched to people, but I'd love it if maybe you could explain more about them. Are there any roadblocks that we need to overcome to realize these futures, or is it merely conceptual, at least for now?
[00:40:00] Eric Jorgenson: Yeah, I think it depends on the individual technology, that kind of triage. One that I know he hits hard in our physical future is robotics. And the second- and third-order effects of those are crazy, right? Imagine how much we pay for something now. Let's just say a car: the raw materials are mined, then shipped, then parts are made all over the world, then shipped again, then assembled, then driven by a human to a place where it sits in a building run by a human, where it is eventually sold by a human to a human.
[00:40:30] Eric Jorgenson: And imagine that whole process with just robotics, right? Like the raw materials go into a factory, a series of robots perform all of their tasks, and a car rolls out the other side and it autonomously drives itself to you. Like you buy a car online and it is manufactured and delivered with zero human involvement from a much more elegant, efficient, kind of like entire supply chain.
[00:40:54] Eric Jorgenson: That car could be a lot cheaper. Maybe that's a $5,000 car that is just as good as what is today a $25,000 or $30,000 car. And I'm sure the first thing in a lot of people's minds is, what about all those jobs? Well, somebody's gotta be fixing the robots. Somebody's gotta be coding the things. Somebody's gotta be monitoring all this stuff. There are always more jobs, and there's always a new frontier. But imagine how cheap a fully robotic farm and chef and food delivery supply chain could be. Robotics are getting incredibly capable, incredibly quickly, and AI, to your point, is helping a lot with that. The self-reinforcing feedback loop between AI and robotics is incredibly important.
[00:41:36] Eric Jorgenson: Energy plays a really important role in that, because once everything's robotic, it's a huge CapEx, but the main input is just energy. And so the cost of energy generation has to go down, which is really just a matter of building more nuclear plants and solar. The reasons for not having that are almost entirely incentive-based, due to lobbying and oil and gas. We could have energy that is, the phrase we like to use is, too cheap to meter. You know, it's like bandwidth. We don't worry about how much bandwidth we use. We pay a flat rate and use as much as we could possibly dream of using. Stream Netflix all day. Who cares? Nobody. And that's possible with energy. Your parents will never yell at you to turn the lights off again, 'cause you pay a flat rate for electricity, 'cause electricity is truly abundant, right?
[00:42:19] Eric Jorgenson: And that's an input to compute, which is AI; that's an input to all the robotics. Things get drastically cheaper for everybody, and the quality of life goes up for everybody, and our ability to support a larger population, our ability to lift people out of poverty. And that's all before we get to the mature nanotech, where you can synthesize food out of a black box with a few pellets that can pull stuff out of the air and produce food. It's crazy. Back to your question on the laws of physics: there's no law of physics that says we can't do that. There's no law of physics that says we can't build these little self-replicating machines that are building twelve-mile-high towers of diamond to create space elevators. These are all within the realm of possibility, yet so fanciful that you kind of scoff when you hear them. But 150 years ago, we would've scoffed at the idea that we could fly, or build a cruise ship the size of a city, or walk on the moon. Are you fucking kidding me? We forget all the miracles that we've achieved when we think about the miracles to come.
[00:43:19] Eric Jorgenson: So let's reach for those miracles. Let's work on them. You know, what's the most incredible thing you can imagine? Go start building it. Go try it. Get to work. Put your shoulder against that problem.
[00:43:32] Eric Jorgenson: And counterintuitively, you're more likely to be successful the crazier the thing is that you pick. You know, I don't remember a meeting from a year ago with somebody doing a SaaS app for gym memberships. It just falls outta my brain as soon as it happens. I definitely remember the dude I met with a year ago who's like, I want pervasive, hospital-grade cryopreservation for humans in every hospital in America, and here's my twelve-step plan over the next ten years to get there: I'm starting with cryopreservation of pets, I'm starting in my garage, and I'm seeking funding for my company. Kai Micah Mills. Go follow him on Twitter. He's a brilliant dude. He started a bunch of companies even as a teenager. Crazy vision. He's impatient to get started on that problem, and he wants to solve death, and I salute him. What a crazy, awesome, bold, brave thing to do. And people are so excited to support him and be a part of that, because his vision is so extreme and it gives meaning to the lives of people who want to solve that problem. It's a beautiful thing.
[00:44:37] Eric Jorgenson: You could take that in a number of different directions with a number of different examples. We hold ourselves back, for a variety of basically psychological reasons, from taking the biggest swing that we can, and it's a shame. So I hope this removes some mental shackles for people.
[00:44:50] David Elikwu: Yeah, I think it's really important. And there are a few things with the future. First of all, going back to something you were saying, sometimes the order in which we unlock different things changes the way the future unfolds. So it can be hard to fully envision exactly what the future looks like if you haven't seen what those unlocks will be. But for the people that actively live on the frontier, I think it's Mike Maples Jr. who has a quote, something along the lines of: good startup founders build things for the present, but great startup founders don't just build things for the future, they build things in the future and then invite us to join them there, right?
[00:45:25] David Elikwu: So they are building things that don't really work for today. They're building things for the world of tomorrow. And then we have to cross the chasm. It's the customers that have to cross over to join them in this future vision of what the world could look like.
[00:45:38] David Elikwu: So I guess, maybe tying that to a question: I think Balaji has had a lot of wide-ranging ideas about the future. Are there any that you have found hard to wrap your head around, either because you are not yet fully convinced or because you hold a slightly different opinion?
[00:45:52] Eric Jorgenson: Oh, plenty. Yeah. I can't imagine anybody who's gonna read this book and be like, I agree with every word, yes, Balaji is completely correct, right? He's got so many takes of such variety in so many different directions that you'll absolutely find something to disagree with. I think everyone will, and I think he would, you know, welcome that.
[00:46:11] Eric Jorgenson: And it's an interesting foil: you can think two of them are absolutely crazy, you can think five of them are okay, and you'll think three of them are like, holy shit, he nailed that, I've never had that thought before, and I absolutely believe that's how the future's gonna be. And that's a pretty good ratio. That is a book worth reading, in my opinion. You will hear takes that you have never heard before, and some of them you'll love and some of them you'll hate, and that's okay.
[00:46:32] Eric Jorgenson: Everybody will take something different away, but I think everyone will take something away from this book.
[00:46:35] David Elikwu: Okay. Fair. And you mentioned before that the second pillar of this was truth, and I think there were five different kinds of truth: scientific truth, technical truth, political truth, economic truth, and cryptographic truth.
[00:46:47] David Elikwu: So what's the importance of this delineation between different types of truth? Why should people care about that?
[00:46:53] Eric Jorgenson: Yeah. It's important to recognize, because truth seems so singular. As soon as someone says there are multiple kinds of truth, it sounds like somebody's trying to bullshit you, right? Like, oh, well, there's your facts and there's my facts. But there's important nuance here. A scientific truth is something that we are constantly in pursuit of. A true scientist would never say we are a hundred percent confident in anything; we're only 99.9-something percent sure that the sun is gonna rise tomorrow. Everything is a hypothesis in the process of being upheld or disproven.
[00:47:27] Eric Jorgenson: Technical truth, or mathematical truth: one plus one equals two. We can be a hundred percent confident in that; it lives a little more in the abstract. Political truth is an entirely different thing, right? Political truth is something that is really determined by a consensus belief. When we all vote for a new president and choose to believe that this guy is president instead of that guy, we've updated our mental picture of who the president is. That's a political truth, a consensus social truth that humans have determined.
[00:48:01] Eric Jorgenson: Same thing with country borders. If an alien arrived on earth, they would agree with the scientists who say this tree is this age because it has this many rings. They would not be able to tell us where the borders of our countries are. They would not be able to determine which of these, you know, 400 million people was the president, 'cause it's not an objective fact. It's a politically, socially constructed truth. And that gets interesting when it becomes fractal, right? There is a truth that the left believes and a truth that the right believes. It's frustrating to see that people can create their own social consensus truth that may or may not reflect underlying reality, and that the scientific truths and the political truths come into conflict. Maybe this is a naive perception of the past, but it seems like we might be worse at reconciling political truth and scientific truth than we were, say, 50 years ago.
[00:48:52] Eric Jorgenson: Economic truth is a very interesting one, and I hope people read that chapter, 'cause it talks a little bit about the interplay between them. There's a version of an economy that runs on political truth, and there's a version of an economy that runs on scientific truth. I would say a free-market capitalist economy runs on something closer to scientific truth. If you have a hundred units in inventory and you report that you have a hundred units and that you can get them there by next week, the economy functions relatively well, and there are incentives for doing that. If you have a sort of communist economy that really wants you to have 200 units in stock, so you say you have 200 units in stock, and you say you can get them there next week even though you can't for two weeks, 'cause it's safer to say you can, then things start to fall apart before too long.
[00:49:35] Eric Jorgenson: It's really important to learn to distinguish all those things, and it's a source of opportunity, whether you wanna start a company or succeed politically. Understand how people determine truth and how the different types come about.
[00:49:52] Eric Jorgenson: Different people respond to different types of truth constantly and believe different things from different sources. And if you can learn to see the difference between those types, you'll see all kinds of gaps and opportunities that other people can't see; it's like they're wearing a blindfold their whole lives. You'll be the one man with sight in the land of the blind, if you can learn to take that blindfold off and see the underlying truth. It's an incredible source of opportunity. And you hear some of the great entrepreneurs talk like that: it was so obvious to me, the world refused to see it, but I was willing to act on this unspoken truth.
[00:50:28] Eric Jorgenson: There are so many fascinating things about that, and if you look at companies or people that are succeeding through that lens, I think you'll start to see a different set of patterns in the world.
[00:50:40] David Elikwu: Thank you so much for tuning in. Please do stay tuned for more. Don't forget to rate, review, and subscribe. It really helps the podcast. And follow me on Twitter and feel free to shoot me any thoughts. See you next time.