
Intelligent Machines 837 transcript

Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.

 

Leo Laporte [00:00:00]:
It's time for Intelligent Machines. Paris Martineau's here, Jeff Jarvis too. Our guest this week: Nick Foster. Don't call him a futurist, but he did write a book about how to think about the future called Could, Should, Might, Don't. And all the AI news, coming up next on Intelligent Machines. Podcasts you love, from people you trust. This is TWiT. This is Intelligent Machines with Paris Martineau and Jeff Jarvis. Episode 837, recorded Wednesday, September 17, 2025.

Leo Laporte [00:00:38]:
Could, Should, Might, Don't. It's time for Intelligent Machines, the show where we cover the latest in AI and robotics and all the smart little doodads and gewgaws surrounding you in your life. That there is Jeff Jarvis. He's professor emeritus of journalistic innovation at the Craig Newmark Graduate School of Journalism at the City University of New York. Almost made it all the way through. Montclair State University and Stony Brook. He is the author of The Gutenberg Parenthesis and What Would Google Do? Look, I see.

Jeff Jarvis [00:01:18]:
What Would Google Do? I dug it out from the closet. Yeah.

Leo Laporte [00:01:20]:
Wow. Also, The Web We Weave. It's nice to see you, mister.

Jeff Jarvis [00:01:24]:
Good to see you, boss.

Leo Laporte [00:01:25]:
Also with us, the wonderful Paris Martineau from Consumer Reports. We were talking about you earlier, Paris. I don't know if your ears were burning, because on Windows Weekly we talked about that interesting letter Stacey Higginbotham wrote on behalf of Consumer Reports to Microsoft saying, please, please, sir, give us more Windows 10, please, sir.

Paris Martineau [00:01:48]:
I'm interested to see how that goes.

Leo Laporte [00:01:49]:
Yeah, let's see how that goes. But I honor Stacey and I honor Consumer Reports, because they're fighting the good fight for users, for real users. So, as is often the case on this show, we'd like to start off with an interesting interview. We've got a very interesting guest this week. Nick Foster is a former EDM DJ. Okay, you knew I was going to bring that up. And a designer who has worked at some of the biggest companies in the world, including Sony, and has written a new book about the future called Could, Should, Might, Don't: How We Think About the Future.

Leo Laporte [00:02:29]:
Welcome, Nick. It's great to see you.

Nick Foster [00:02:31]:
Nice. It's nice to be here. Thanks for inviting me.

Leo Laporte [00:02:33]:
Yeah, I love this, because I have to say I approached it with some trepidation. I thought, oh, another book about what we should do, what we could do, what we might do in the future, what we shouldn't do in the future, and all of that. And I really wasn't interested in predictions and prescriptions. You are a futurist. That's what you do for a living. You consult with companies about what's coming down the pipe. But this is a much more interesting book than that. It's wonderful; I wrote a note calling it a book-length essay about four ways we can look at the future.

Leo Laporte [00:03:13]:
Full of examples. Is that a fair way to describe it?

Nick Foster [00:03:17]:
Yeah, that is a fair way to describe it. I think there are enough of those books and those on stage presentations about what the future is going to hold. I'm more interested in how we think about the future because I think we need to go one layer deeper and start to sort of reassess what goes on in our brains and what goes on in society and in meetings when we think about the future and talk about it.

Leo Laporte [00:03:37]:
Yeah, you say at the very beginning that the fact that you can listen to award-winning podcasts, read books by Pulitzer Prize-winning authors, open globally renowned magazines and newspapers, watch presentations from CEOs with PhDs, or meet marketing teams with million-dollar budgets, and still hear lazy references to flying cars, hoverboards, the Jetsons and Star Trek is not just embarrassing, it's a disgrace.

Nick Foster [00:04:04]:
I stand by that.

Leo Laporte [00:04:07]:
We don't know how to think about the future, do we?

Nick Foster [00:04:10]:
I think so. I mean, I've been working in big technology companies for about 25 years, and I've been around conversations about the future for most of that time. I'm a designer, but I've been around engineers and scientists and CEOs and marketing teams and financiers and all that sort of stuff. And it feels like we lack the skill set to talk with rigor about the future, and I think that represents a growing problem. So I wanted to write a book about what goes on in our minds and what we reach for when we think about the future. And unfortunately, all too often it's snippets that we snatch from everywhere. For a lot of people that's sci-fi tropes, but for other people it's statistics or headlines or comments from intellectuals or whatever it is.

Nick Foster [00:04:53]:
And it doesn't sort of hold together or withstand any kind of scrutiny. So the book is about trying to deconstruct and go back and saying, actually I think there are four channels of thinking that we all fall into and they all have their strengths and they all have their weaknesses. And actually what I'd love to see is more Breadth across all of them.

Leo Laporte [00:05:13]:
There's no one particular way we should think about the future.

Nick Foster [00:05:20]:
I don't think so. But what I do think is that we all fall into our own particular patterns. And when we do that, we start to look at the world through that lens. If we're obsessed by statistics and projections and algorithmic extrapolations, we start to see the world that way, and we miss all of the other stuff that's going on: all the possibilities, the other ways of thinking, some of the imagination and some of the caution. So what I'm trying to say is that there's a breadth of thinking about the future that I don't see very much of. This book is just trying to poke and provoke people to ask, do you fall into one of these four more than the other three? And to make people a bit aware of the inherent weakness in doing that.

Leo Laporte [00:06:01]:
Yeah. And one of them, could, is sci-fi. A lot of people think that sci-fi is about predicting the future. You quote, I love this, Charles Stross, the famous sci-fi author, saying, we're not trying to accurately predict possible futures, we're trying to earn a living. Any foresight is strictly coincidental. So we shouldn't look to sci-fi.

Leo Laporte [00:06:24]:
I feel like I've often said this: a lot of what people like Elon Musk do is based on sci-fi. They try to recreate their childhood fantasies from the sci-fi books that they read.

Nick Foster [00:06:37]:
Yeah, without a doubt. It's very prevalent, particularly in this part of the world, and, when I say this part of the world, I'm based in the Bay Area, it's also sort of heretical here to say anything negative about science fiction and its input on our imagination about the future. But what I've found is that people lean on science fiction, the tropes and the language and the ideas within science fiction, in a lazy way. They use it as a substitution or a placeholder for real imagination and real ideas of their own. And we do see it a lot from our technical leaders, who not only try to recreate the ideas from science fiction, but use the language of science fiction to name the projects they're running and the products they're making, and try to recreate the behaviors they've seen in science fiction. But science fiction shouldn't be treated as a brief. We see that headline a lot, science fiction becomes science fact, or trying to make science fiction a reality, and I think that's a fundamental misreading of science fiction on a lot of people's parts. It's a substitution for genuine imagination and genuine exploration of your own.

Nick Foster [00:07:39]:
And I think that's a fundamental misreading of science fiction on a lot of people's parts. But I do think it's a substitution for genuine imagination and genuine sort of exploration of your own.

Jeff Jarvis [00:07:50]:
Nick, I'm less a futurist than I am a pastist.

Leo Laporte [00:07:55]:
We call that historian, Jeff.

Jeff Jarvis [00:07:57]:
Yeah, well, I'm not sure that's really the word. But, but you know, I write about Gutenberg.

Leo Laporte [00:08:02]:
By the way, Nick does too. Did you see the reference?

Jeff Jarvis [00:08:04]:
Yes, I saw. I immediately searched for Gutenberg. Yes, absolutely. And about the Linotype. And recently I've been going down rabbit holes about the triode vacuum tube and the beginnings of the amplifier. What strikes me in all of these things, especially around this, is that when the amplifier was invented, the inventor had no idea what he had, and no one understood what broadcast radio would be. It took years past that invention before people realized what they had. And so what I like best about your taxonomy of the future is the final part, the mundane future, which is kind of presentist. What I've come to believe is that technological determinism takes over too much of the public discussion, and that we all have more of a role in creating that future, which I think is the big point you're making: that we can learn the lessons from the past, from history, from how technology has adapted before, and from the choices that were made and the paths that were taken and not taken.

Jeff Jarvis [00:09:12]:
So, as we face AI and the literate machine alongside the Internet, what do you think are the most important decisions that everyone using it can make now? What are the butterfly wings that, when they flap, are going to set that future in motion?

Nick Foster [00:09:29]:
Yeah, I mean it's treading dangerously close to prediction territory here.

Jeff Jarvis [00:09:33]:
Yeah, well, no, I'd ask instead for your view of our responsibility.

Nick Foster [00:09:38]:
Yeah, I think so. A thing that's followed me around for the last 13 or so years is an essay that I wrote many years ago about what I call the future mundane, which is about trying to think about the future as a lived-in place. We tend to talk about the future as someplace other, populated by other people, when actually it's just an evolution of the present. And if we look at the present and the lives that we lead now, we have experienced a massive amount of change. I'm almost 50 years old, and there's been a ton of change. There are twice as many people on Earth now, for example, as when I was born. But it doesn't feel like the future.

Nick Foster [00:10:14]:
There's change all around me, technological, societal, cultural, political change. But it feels just sort of like the present. And I went to the bakery this morning, you know, my left knee still hurts every time I stand up. So what I'm trying to do when I talk about the future mundane is say, yes, there are lots of changes coming. Both changes that we think of as sort of ideologically positive or ideologically negative and they'll shift the way we live. But if we talk about the future as someplace other and we just talk about the change, we sort of miss the fact that people like us are going to be there. So I like to tell stories about the future and show future technologies as embedded, ordinary, mass adopted parts of everyday life alongside things that will continue, like the chair that I have behind me that I inherited and other bits and bobs that are 20, 30 years old. And so I think that's the way we need to start thinking about technology as a sort of part of a lineage and part of a life.

Jeff Jarvis [00:11:06]:
I like that.

Leo Laporte [00:11:07]:
The New York Times illustration for your essay is hysterical because it has a crystal ball with waffles, laundry and a toothpick in it. So this is the future, ladies and gentlemen, just in case you're curious. Do you think, Nick, that modern companies, you've worked for Apple, you've worked for Google, do you think today's technology companies think about the future appropriately or not?

Nick Foster [00:11:33]:
I think it's a mix, right? And what I'm trying to point out with this book is that they're not any worse or better than any of us, because we all fall into one of these four corners of the map, let's say. I don't want to try and create some sort of method and say, this is the Nick Foster could-should-might-don't method. It's just an observation. And Jeff's point about a taxonomy is exactly right. It's just pockets of behavior that we fall into. And if we look at our big tech companies, we see them falling into those pockets. Some of them almost permanently live in one, and some of them move between them, but they don't talk about them in the round.

Nick Foster [00:12:07]:
And what I would love to see is a technology company, or a leader of a technology company, doing all four of those behaviors at once: talking about what we could make, and being excited and motivated to create amazing new things; being very directional and saying, this is what we should do, and we should do it for these reasons; also saying, these are all the things that might happen, and some of them, we're not quite sure where they might lead, and I'm not quite sure what the answer is here; and then, these are the things that we don't want to do, the things we should be a bit careful or cautious of. You don't really see that balance. When people talk about the future, it's either straight to superintelligence, or straight to here's where the numbers are pointing, or straight to here's the problems ahead of us.

Nick Foster [00:12:46]:
And I think that's sort of what I'm trying to poke at and provoke people to realize that we all do that as well.

Leo Laporte [00:12:52]:
Although I have to say, if waffles are in my future, God bless it, because that sounds right.

Nick Foster [00:12:58]:
I see no reason why they would go away.

Leo Laporte [00:12:59]:
Okay, good. In fact, better waffles.

Paris Martineau [00:13:01]:
That's a prediction.

Nick Foster [00:13:02]:
That's a prediction, yes.

Leo Laporte [00:13:05]:
Actually, it is interesting. You point this out in the book: in some ways, here we are with these AIs doing things we could never have thought of or predicted, and in other ways, everything is just as it was 100 years ago. There's change. Who was it that you described? Was it Arthur C. Clarke, who thinks of change as happening at different levels, at different speeds? Yeah.

Nick Foster [00:13:27]:
I use a really pretentious academic word: I use the term accretive, like sedimentary rock. Stuff just piles up over time and erodes at different rates. And I think that's the reality. It's taken me a long time to realize that, as I get into either the second half or the final third of my life. But actually, yes, these persistencies keep coming through our lives and exist all around us all the time. And like I said, there have been huge changes. Gay marriage wasn't legal anywhere on Earth when I was born, for example. There were no MRI machines.

Nick Foster [00:14:00]:
And all those things have come into being.

Leo Laporte [00:14:05]:
Yes.

Nick Foster [00:14:05]:
But we still have buttons, and people still brush their hair and, you know, still go out for lunch and have a ham sandwich. Again, I'm not trying to diminish the scale of the challenges that face us or the size of the impact that some of these technologies might have, but they will be lived with and they will exist. Certainly in the Bay Area, something like Waymo, for example, is undoubtedly a step change in capability at a robotics, a navigation, a societal, a governmental and a legal level. It's a big change. But for most people that have taken a Waymo two or three times, it normalizes quite quickly, right?

Leo Laporte [00:14:43]:
Yes, it's gradual, yeah. I mean, I haven't been inside a bank in years, and that's a huge shift in how I live my life. And soon I have a feeling I won't really have a need for cash. My wife asked me, why are you going to the ATM? You don't need cash. She's got a good point. I don't really need cash. I just like to carry it.

Jeff Jarvis [00:15:06]:
Oh.

Paris Martineau [00:15:07]:
I was going to say, one thing you talk about in the book is this concept of numeric fiction. Could you tell us a little bit about that and how it plays into kind of the flawed assumptions about the future and why we should interrogate them?

Nick Foster [00:15:20]:
I love that question, Paris.

Leo Laporte [00:15:21]:
Thank you.

Nick Foster [00:15:22]:
It sets me up for one of my pet peeves. I've been around a lot of different types of people who work in trying to understand the future and trying to convince people of a certain future, and it feels like the people who play with numbers always have the upper hand: the corporate strategists, the people who have worked out the numeric patterns in the world and tried to decode it and say, we've done the numbers and the arrow points there, so we should do that. Now, I'm not here to say that's always wrong, but it is wrong quite often. And the thing we always miss is that no matter how good your data is, when that solid line turns into a dotted line, it ceases to be data and it becomes a story. I don't think we're honest enough about that.

Nick Foster [00:16:04]:
There is a bit of stink on this, but I refer to it as numeric fiction. I tell stories too, as a designer, but I use films and objects and prototypes and bits of conversation. People who come from a corporate strategy or analyst perspective use numbers. And because numbers hold this lovely place in our hearts, we believe them to be rigorous, honest, true and empirical. They convince us of a future that's definitely coming. But anyone that's had any money in the stock market knows that those dotted lines are not truths. They're not data. They are stories. So I use that term as a way to deliberately pick at people's brains and say, maybe don't lean on that as much as perhaps we do.

Leo Laporte [00:16:49]:
You actually quote from one of my favorite TV shows, Succession, when they ask one of the Roy kids, where do those numbers come from? And he goes, you know, man, projections.

Nick Foster [00:17:01]:
Yeah, like it's a truth. Like it's. Like it's a truth.

Leo Laporte [00:17:05]:
It's projection, man, of course.

Nick Foster [00:17:06]:
Yeah, yeah. That's why I deliberately pick on that. That type of futurism doesn't really declare itself to be futurism, but it is; it's trying to predict or show or convince us of a future coming our way. And it seems to get a seat at the highest table in almost all decision-making or leadership meetings. It can be very useful, because it can give somebody decent background to lean on and say, this is why we made this decision. And it's trackable, traceable. You can see when you were right and when you were wrong. I get that.

Nick Foster [00:17:38]:
But I think there are other ways of imagining and talking about and steering the future that are equally flawed but also equally beneficial.

Leo Laporte [00:17:47]:
We're talking to Nick Foster. His book, which came out at the end of August and is in bookstores now from Farrar, Straus and Giroux, is called Could, Should, Might, Don't: How We Think About the Future. And Nick also narrates it, so you can get the audiobook.

Jeff Jarvis [00:18:03]:
And a different cover and publisher in the UK next month, for our UK folks.

Leo Laporte [00:18:10]:
Yeah.

Nick Foster [00:18:11]:
So in the US, as a designer…

Paris Martineau [00:18:12]:
Are you more involved in the cover approval process, do you think, than the average book author? Or do you just let them do their thing?

Nick Foster [00:18:22]:
A bit of both. A bit of both, to be honest. One of the things that I was involved in, in this… so, graphic design is not my background. I'm an industrial designer. I was one of the early design team at Dyson back in the 90s.

Leo Laporte [00:18:33]:
Oh, wow.

Nick Foster [00:18:34]:
So you know about that? No, let's not get into that. So graphic design. Graphic design is not my area.

Leo Laporte [00:18:41]:
Talk about something designed to look like the future. All right, clearly Dyson from day one said, what would be the most futuristic way you could make a vacuum cleaner look?

Nick Foster [00:18:51]:
I wish I'd never mentioned it now.

Leo Laporte [00:18:52]:
Okay, never mind, go ahead, keep. I'll pretend I didn't hear it. Nevermind.

Nick Foster [00:18:56]:
On that specific point about the book covers, I worked with two really world-class book cover designers: one in the UK, at a company called Pentagram, Angus Hyland, and one in the US, a guy called Rodrigo Corral, who's won countless New York Times best cover awards and things like that. So they know what they're doing. But what I did was say, this is the book that I'm not writing. Please don't do covers that look like this. You know, the ones with the wiggly arrows pointing up to the sky, or the light bulb, or the robot on the cover. You know, a book about the future.

Paris Martineau [00:19:23]:
They love doing that.

Nick Foster [00:19:26]:
But that was part of the brief, which is why we have two very different books.

Jeff Jarvis [00:19:30]:
Can you show us both covers? Can you pull them out from behind you?

Leo Laporte [00:19:32]:
I have a picture of Rodrigo's design. This is the American cover, the white one.

Paris Martineau [00:19:37]:
Ooh.

Leo Laporte [00:19:39]:
Oh, that's interesting.

Nick Foster [00:19:41]:
This one is the US cover, by MCD at Farrar, Straus and Giroux. This is the UK cover, with a bit of spot varnish on there. Very nice. But they're very different.

Jeff Jarvis [00:19:50]:
They feel culturally appropriate, each of them.

Nick Foster [00:19:52]:
Yeah, yeah.

Paris Martineau [00:19:53]:
Like the American one is on our level as a country.

Leo Laporte [00:19:56]:
Yeah. It's like cartoon. Big cartoon letters. And then the British one. What are those on there? I don't even know what that is.

Paris Martineau [00:20:04]:
Delightful. I really like the UK one.

Nick Foster [00:20:05]:
Yeah. Maybe you like what you can't have. How about that?

Leo Laporte [00:20:10]:
I like the Flintstones one. That's my favorite.

Jeff Jarvis [00:20:13]:
Yeah. But I would do.

Nick Foster [00:20:14]:
But I think it's job done, actually, because it would be very tempting: a middle-aged white tech guy from Google who's written a book about the future, and your brain already draws that cover. I'm really proud that we've ended up with two very different covers, and that neither of them falls into that trap. So I'm quite happy.

Jeff Jarvis [00:20:31]:
Paris, when you write your book, that's a fun part of it. It's a troubling part, but it's a fun part. Nick, I'd love to ask, and I'm not talking about the aesthetics of design, but design as an ethic, design as a practice. I'm curious to hear your views about the role of design and AI, because it seems very undesigned at this point. It's just a whole bunch of text, or a whole bunch of mashing up of images, but in terms of a larger context of how it operates, I'd love to hear your design philosophy for AI.

Nick Foster [00:21:01]:
Yeah, I don't have a design philosophy for AI, other than… I think we're still… obviously everything's moving very quickly, and there's not a lot of time for detailed thought. But that doesn't mean people aren't working on it. People are starting to try and figure out what the shape of these things is. As with every designed object, making it well resolved takes a bit of time, so I think that work is going on. When it comes to the kind of design that I'm most familiar with and most interested in: I went to the Royal College of Art in London in 1999, which was a real pivotal time for what we call critical design, which grew into speculative design, using the tools of design to poke and provoke people to think slightly differently. And there were two lecturers there called Tony Dunne and Fiona Raby, who I encourage everyone interested in technology to look up, who looked at the shape of things like Bluetooth, which was just starting to appear, and WiFi, which was just starting to appear.

Nick Foster [00:21:59]:
And there was a whole cohort of people looking at all the things we could make, what they could enable, all the tech and the objects and the gadgets. What they did was look more at what it might mean, and use design to tell stories and ask uncomfortable, awkward questions. I think that's the role that design can play right now in AI. We can work to deliver really seamless experiences that make sense to people and feel well considered, but design also has a really good role to play in asking where things like AI and machine learning fit in society, what they might point at, the types of lives that we might lead, those sorts of questions, at the moment.

Paris Martineau [00:22:36]:
Anyway, I do have to ask you an off-topic question that's been burning in my brain, but I kept it inside until I saw people in the chat also wondering. Tell us about your giant ring.

Nick Foster [00:22:48]:
This ring, it's a tesseract.

Leo Laporte [00:22:50]:
No, it's not.

Paris Martineau [00:22:50]:
It's incredible.

Leo Laporte [00:22:51]:
It's Punisher. What is it?

Paris Martineau [00:22:53]:
For those listening, imagine the largest ring you've ever seen. Oh, whoa. And your finger goes through the mouth.

Nick Foster [00:23:00]:
It's very subtle and understated.

Paris Martineau [00:23:04]:
So, I'm not wearing rings right now, but I've been on a ring kick lately.

Nick Foster [00:23:08]:
It's based… it's based off a very, very expensive ring that I saw from a very, very famous French jeweler that I couldn't afford. So I 3D-printed it in wax and cast it in silver.

Leo Laporte [00:23:19]:
Bravo. So bravo.

Paris Martineau [00:23:22]:
Delightful.

Nick Foster [00:23:23]:
There you go. If you can't have it, just make it.

Paris Martineau [00:23:26]:
Have you done this before? Or what did you…

Nick Foster [00:23:29]:
Bits and bobs here? I mean, this is the only one I've made, but there's another one, a work in progress here. Just a bit of CAD work and some 3D prints.

Leo Laporte [00:23:38]:
But I love the idea of then using it as a cast, so you can make it out of something more durable.

Nick Foster [00:23:42]:
Yeah, it's just investment casting. You 3D-print it in wax and then investment-cast it in silver.

Leo Laporte [00:23:46]:
Perfect. Yeah.

Nick Foster [00:23:47]:
And I sort of wore it as a bit of a joke, because it is quite Liberace. But that joke is now a decade old. So. Yeah.

Leo Laporte [00:23:55]:
Who's Liberace? No, no. So could, should, might, don't. These are four different kind of ways of thinking about the future. But you're not prescribing any of them.

Nick Foster [00:24:09]:
No, not at all. I think they all have their benefits. They all have quite interesting origins, too, but they also all have blind spots and weaknesses that I think we overlook or don't interrogate enough. And when you introduce me as a futurist: I don't actually call myself a futurist, and if I do, it's with great reluctance, because there isn't another word. This is where I do get a bit strident, but I think that's because the work of futurists is typically subpar.

Nick Foster [00:24:37]:
And I don't really want to be part of that cohort. What I want to do is empower people. All of us are consumers of futures work all the time, through media, news, entertainment, stuff we do at work, and I don't think we're critical enough of it. Which means a lot of futurists get away with biased, subpar, underdeveloped, non-rigorous work. And so I refer to myself as a futures designer, which sounds equally pretentious, but at least it gets design in there, which I'm…

Leo Laporte [00:25:07]:
Jeff will like this, because he's a fan of German words. You say futurism falls into the Gesamtkunstwerk.

Nick Foster [00:25:15]:
Yes.

Leo Laporte [00:25:17]:
Do you know what that means, Jeff? He's nodding.

Jeff Jarvis [00:25:20]:
The collected work of art.

Nick Foster [00:25:23]:
It's a complete work of art.

Jeff Jarvis [00:25:25]:
Yeah, I mean, I like this a lot, Nick, because futurist, to me, is a rather hubristic, self-anointed job title. Whereas what you're really saying is that you're building the future. Yes, I think we have the responsibility to do that, rather than predicting it.

Nick Foster [00:25:39]:
Yes. Bruce Sterling, the science fiction writer, who I know reasonably well, gave one of his usual long rambles at South… yeah, he did one at the Long Now Foundation. And there's a quote in my book that he kindly let me use, which I'm going to butcher. He said, if I'm practicing futurism all on my own on a desert island wearing a goatskin hat, am I a futurist? And he said, I'm pretty sure I'm not, because I've got no one to tell about it. Which really gets to the nub of what you're saying. I know people that call themselves futurists, and I consider them friends, and that's fine.

Nick Foster [00:26:15]:
But I'd love to see the quality of that work go up. Yeah. And when it does, maybe I'll be more comfortable calling myself that.

Leo Laporte [00:26:23]:
Well, give us. Go ahead, Jeff.

Jeff Jarvis [00:26:25]:
Well, now, so much of the future talk around AI is the doomsters and the TESCREALists and the longtermists and all that, which I think even further corrupts the field, if it's a field, right? Because it's now taken it to kind of a Looney Tunes end. How do you recapture a sane view of the future, which is what your whole book tries to do, rather than this woo-woo doom crap?

Nick Foster [00:26:57]:
Yeah, I mean, it's very tricky territory, because a lot of people have big excitement about this particular family of technologies for quite good reason, and a lot of people have big fears about these technologies for quite good reason. And as soon as you call those out and start to say we need to move away from them, people think you're trying to diminish both of those perspectives. What I'm doing when I talk about the future mundane is just acknowledging this middle-of-the-bell-curve experience that we will likely have. Now, that bell curve could skew from one side to the other. But I love to talk about these technologies without everything having collapsed and gone super south, and without everything being amazing and utopian and all of us living fancy lives. My experience of most technological shifts has been somewhere in the middle. So I try to acknowledge that those extremes exist, and understand the powers and the faults of both of them, but spend as much time as I can trying to consider what it's actually going to be like.

Nick Foster [00:27:54]:
Just when I go back to see my dad and go for a pint with him at the social club in his damp northern British town. When you talk about AI, what do you mean there? Is this pint more expensive or cheaper? Does the pinball machine still work? Those kinds of questions start to force you to interrogate the future with more rigor. And all those coulds and shoulds and mights and don'ts just start rushing in with real, vivid reality, in a way that hype and doom just feel like alien opinions in alien places that are hard to wrap your head around. So, yeah, I like to try and think about the future that way. And when I've been working with people in the past, they found value in doing that as well. Like, really thinking about: what would my auntie do with this? Is her life any different at all? Anyway, that's the reason I like to focus on that.

Leo Laporte [00:28:43]:
Your dad's pub sounds like a place the future left behind. It's very much like it was.

Paris Martineau [00:28:48]:
I don't know that the future needs to be integrated into his dad's pub.

Leo Laporte [00:28:53]:
Well, it's gonna be technically in the future. Tomorrow is the future. But it may not change in a substantive way.

Jeff Jarvis [00:29:00]:
It doesn't mean rejecting the past, it means learning from it.

Nick Foster [00:29:03]:
Yeah, it might not have a huge moment when we sort of actively inject AI into my dad's pub. We've really gone deep on this one.

Paris Martineau [00:29:13]:
Pub on the blockchain.

Nick Foster [00:29:14]:
But I would say, if we were to revisit my dad's pub in five years or so, there would be noticeable differences. I think they would be outweighed by the persistencies, the things that remain largely the same. You know, the prices might be slightly higher, and the beers that they have on tap might be slightly different, but everyone's phone will be slightly different. People might be wearing slightly different bits of tech. The way they're paying for things might be slightly different. There will be little tells about the change that's happened. And I think really spending time to think about those little nuanced changes in the world, and thinking more rigorously about how we feel about them, is what I'm trying to call for in this book.

Leo Laporte [00:29:54]:
That's one of the things I love about some filmmakers: subtly telling you you're in the future where everything looks the same, except that everybody's wearing high-water pants, as in Her. Or, oh, we have a Black president, this must be the future. Or a female Black president, it must really be the future. Now that, in a way, acknowledges what you're saying, which is that the future happens unevenly. By the way, let me correct myself. It wasn't Arthur C.

Leo Laporte [00:30:25]:
Clarke. Stewart Brand's gonna kill me. It was Stewart Brand who came up with this idea.

Nick Foster [00:30:29]:
Yeah. Pace layers. Yeah.

Leo Laporte [00:30:31]:
Layers. Yeah, yeah, yeah. Pace layers is what he calls them. Yeah, yeah.

Nick Foster [00:30:34]:
I think the.

Leo Laporte [00:30:35]:
The.

Nick Foster [00:30:38]:
Yeah.

Leo Laporte [00:30:38]:
I think there was no question in there. I'm sorry. That was a long nothing ball of a statement.

Jeff Jarvis [00:30:42]:
I do too.

Leo Laporte [00:30:43]:
Because we only have a few more minutes, help us. Is there a prescription for how we should be thinking about this? One of the things that's really hard for us, especially if we're fans of technology and we're kind of immersed in technology and what's going on, is that we have this vague, through-a-glass-darkly vision of what's going to happen. We know it's going to change anyway. But help us think about this change.

Leo Laporte [00:31:16]:
Is there any preparation we should do? Is there any way we should get ready for the future?

Nick Foster [00:31:23]:
Yeah. Again, at the risk of turning into a purveyor of predictions or a prediction machine: I think one of the challenges is that most decent futures work is just doing the thinking, not necessarily getting to the answer or being sort of better prepared. Just spending the time, investing real time and money and effort and building teams around thinking about the future, is the best preparation. And it's not, again, to try and discover the answer somehow. It's much more focused on getting a group of people together to really focus on doing that kind of work.

Leo Laporte [00:31:56]:
So that you're thinking about the many different tendrils of the future. I think of Hari Seldon in Isaac Asimov's Foundation, where he mathematically predicted the future using probability. We obviously can't do that. But are you saying that we could kind of think about where the multiverse might be headed?

Nick Foster [00:32:18]:
So in professional foresight and futures work, the "might" futures mental model is probably the most prevalent, for good reason. It's built on the sort of scenario planning and strategic foresight that came out of the Cold War and the RAND Corporation, Herman Kahn, people like that. It's the best in class that we've got, which is trying to think with plurality about the future, trying to think through all of the different directions and different scenarios that could play out. And that takes a lot of time and a lot of work. And the downside of that way of thinking is you'll never figure them all out, not by a long stretch. You'll get a few percent of them, probably, actually.

Nick Foster [00:32:57]:
And the experiences that we've had in the past absolutely color what we're able to imagine in the future. The example that I use for that is stage magic: when somebody pulls a card out of a box that had nothing in it before, it feels amazing, because the previous us had never seen a scenario where that could happen.

Leo Laporte [00:33:17]:
Right.

Nick Foster [00:33:17]:
And so I think the challenge there, sort of the best thing you can do, is think as broadly as possible about all the scenarios that could come to pass, knowing that there'll be tons that you won't see, or if you do see them, you'll push them out into impossible territory, like Kodak did, like Nokia did, like Blockbuster did. And I think that's the challenge: knowing that it's never going to be a finished body of work. Another Bruce Sterling line is, the future is a verb, not a noun. It's a process, not a destination.

Leo Laporte [00:33:49]:
You know, love that.

Nick Foster [00:33:50]:
So you have to keep doing the thinking all the time, adjusting based on new evidence coming in.

Leo Laporte [00:33:56]:
But my reaction to that, and this is historic for me, it's not just to you, it's in general, has been: well, I just can't predict the future, and so I'm not even going to attempt to think about what's going to happen, because it's unpredictable. It's un.

Jeff Jarvis [00:34:10]:
But we have a responsibility, each of us. What I wrote in The Web We Weave is that every time we do something on social media, every time we do something on the Internet, we're building the Internet, and we're choosing what Internet to build. Right?

Leo Laporte [00:34:24]:
Well, that's. Well, yeah.

Jeff Jarvis [00:34:26]:
And so I learned. I used to be more of a jerk, but I think I've learned a little bit. And I try to be better.

Leo Laporte [00:34:32]:
And that's not about the future. That's just.

Jeff Jarvis [00:34:35]:
No, it is about the future. Because every time you point to a fight and say, okay, fight, let's all go over there and fight, you're encouraging more of those fights. You're encouraging a future that's filled with those fights. Whereas when you say to somebody who's spreading lies, sorry, bud, but you know you're wrong about that, you should look that up, and there's this thing called Google, you're trying to encourage a different behavior. They may hate you for it, they may ignore you for it, they may call you woke for it.

Jeff Jarvis [00:34:59]:
I don't give a.

Paris Martineau [00:35:02]:
But.

Jeff Jarvis [00:35:02]:
But that's our responsibility and we're building that Internet a brick at a time.

Nick Foster [00:35:06]:
Or that AI. There is that sort of dogma that's creeping in, which is: everything's changing so fast, not just tech, everything's changing so fast that thinking about the future is really hard. And my response to that is, well, should we just not do it, then? Because that doesn't seem like a good answer.

Jeff Jarvis [00:35:21]:
Imagine being around in the Industrial Revolution. That felt pretty damn fast. That felt more profound than what we're going through now. Or being around during Gutenberg. It was profound.

Nick Foster [00:35:33]:
One of the things that I think we're all starting to realize, and when I say "we all," that's a difficult phrase, obviously; there are 8 billion of us. But we're all starting to realize that we're living in time capsules that were accidentally planted by people who didn't think enough about the future. And we have an opportunity to change that prioritization. And as things change so quickly and with such magnitude, and we're able to deploy massive change at scale quicker than our predecessors, I think it naturally comes with the responsibility that we should have at least a conversation about where all this might be going, what it all might mean, what we would like it to become, what we're scared about, what we're anxious about, and what we should keep an eye on. To me, the fact that it's moving so quickly is more of a reason to spend more time thinking about the future, not less.

Jeff Jarvis [00:36:24]:
Nick, can I ask you a favor? You've used the phrase "bits and bobs" a couple of times in the show. I'm going to translate it into New York: otherwise known as tchotchkes. For those of you who are not on video, Nick has a brilliant studio. I would love you to just point to one beloved tchotchke and tell us its story. Anything in your studio.

Nick Foster [00:36:43]:
Crikey, that's a good one. Well, I've got to learn how to point here. This is an air freshener for a car, made out of an image of my head.

Leo Laporte [00:36:52]:
I like that. It's really good. Very refreshing.

Nick Foster [00:36:55]:
These were some. Actually, these are lovely; I'll do this one. This is a set of two bird boxes that my friend Rhys Newman created, with a string in between, that were designed to be thrown up onto telegraph lines like sneakers. Instead of sneakers, two wooden bird boxes that you throw over the line, and then birds can go live in them.

Jeff Jarvis [00:37:13]:
Thank you. I needed that.

Leo Laporte [00:37:14]:
I don't think Pacific Gas and Electric will be thrilled to hear that, but.

Jeff Jarvis [00:37:19]:
Cause a blackout in California, it's not hard.

Paris Martineau [00:37:21]:
Other problems?

Nick Foster [00:37:22]:
Yeah, I'm not. I'm not representing any brand or anything.

Leo Laporte [00:37:26]:
What's the "Use My ID, $1"? Is that a street sign you found somewhere?

Nick Foster [00:37:30]:
This was. Sorry, I'm really bad at pointing. It was actually, I think, for my interview to get a job at Google, when we had to talk about. I can't really get into it too much, but I made a short film, and that was one of the props.

Leo Laporte [00:37:46]:
I love it. Use my ID $1.

Nick Foster [00:37:50]:
It was about privacy and security.

Leo Laporte [00:37:51]:
I think you could get more Worldcoin for that, actually, if you shop it around. Nick, it's been such a pleasure. We could go on and on. The book is really a fun read, with lots of great quotes and stories and more. It's as much a description of how not to think about the future as it is of how to think about the future. But a lot of very inspiring things. And I do hope that at some point you will make the Cricket Crunch, because that is the cereal of the future, ladies and gentlemen.

Leo Laporte [00:38:28]:
It's in the book. And let me pull up your page, because the best way to get the book is to go to Nick's page: Could, Should, Might, Don't: How We Think About the Future, by Nick Foster. And don't be put off by the idea that this might be somehow prescriptive. It's just a really interesting, fun, thought-provoking kind of tour inside Nick's mind.

Leo Laporte [00:38:56]:
Thank you. I really enjoyed it and I very much enjoyed talking to you, Nick. Thank you.

Nick Foster [00:39:00]:
Yeah, likewise. Thanks for having me on.

Leo Laporte [00:39:02]:
It's been a pleasure. So it isn't deck chairs. I'm looking at the British cover now. It's beach chairs.

Jeff Jarvis [00:39:07]:
Orange.

Nick Foster [00:39:07]:
Beach.

Leo Laporte [00:39:08]:
And orange, by the way, you must have known, is the new color, thanks to Apple.

Nick Foster [00:39:12]:
That's true. But this decision was made well before that.

Leo Laporte [00:39:15]:
We're all going orange. Well, no, as usual with a designer, you've got your fingers on the pulse of the world.

Jeff Jarvis [00:39:24]:
Having worked at Condé Nast, there were other moments where they've discovered the color.

Nick Foster [00:39:28]:
The color. This season.

Leo Laporte [00:39:30]:
This season the color is orange. Lovely video, too, on the front. What is the story of the rain and the leaves? Is there some deep meaning?

Nick Foster [00:39:40]:
I encourage you to watch the whole thing. It's a trailer for a book, which I thought was a nice idea. We have trailers for movies, but this is a trailer for the book, which introduces all.

Leo Laporte [00:39:49]:
There's Moses. There's The Fifth Element. And there's space. Oh, there's Zoltar.

Paris Martineau [00:39:58]:
Very important.

Leo Laporte [00:39:59]:
Oh, this is fun. So you're filming as well? Yeah, yeah, yeah.

Nick Foster [00:40:03]:
I try everything.

Leo Laporte [00:40:04]:
Yeah. Whatever it takes, future.

Nick Foster [00:40:06]:
Whatever it takes.

Leo Laporte [00:40:07]:
Very good. Thank you, Nick.

Nick Foster [00:40:08]:
Really lovely. Nice to see you all. Thank you.

Leo Laporte [00:40:10]:
Have a wonderful day. Thanks. Cheers. We'll be back with the AI news in just a moment. You're watching Intelligent Machines with Paris Martineau and Jeff Jarvis. The show today is brought to you by Pantheon. I mean, that seems only fair, since our website is also brought to you by Pantheon. It's funny, they didn't know this when they came to us to do advertising.

Leo Laporte [00:40:33]:
Lisa said, should we endorse Pantheon? I said, endorse them? We use them. This is how our entire workflow operates. Your website is really important to you. For many of us, it's the number one revenue channel. But when it's slow, when it's down, when it's stuck in a bottleneck, well, it could also be your number one liability. Pantheon keeps your site fast, secure, and always on.

Leo Laporte [00:40:58]:
That means better SEO, more conversions, and no lost sales from downtime. But that's not just a business win. No, it's a developer win, too. Just ask our engineer, Patrick Delahanty. He loves Pantheon. Your team gets automated workflows, isolated test environments, and zero downtime deployments. Actually, I really appreciated Pantheon's isolated test environments when I accidentally pushed a change to the website that broke everything. Fortunately, it pushed it to test, not to production.

Leo Laporte [00:41:32]:
Thank goodness. That means no more late-night fire drills. No "works on my machine" headaches. You've heard that: oh, it works on my machine. Just pure innovation. Marketing can launch a landing page without waiting for a release cycle.

Leo Laporte [00:41:45]:
Developers can push features with total confidence. And your customers? They just see a site that works 24/7. They don't know what's behind it, but you will. Pantheon powers Drupal and WordPress sites that reach over a billion unique monthly visitors. Visit pantheon.io and make your website your unfair advantage. Pantheon: where the web just works. We love Pantheon. And it's more than just our website.

Leo Laporte [00:42:15]:
It's actually the headless Drupal, the public API that runs the website, the private API that all of our editors and producers use for their workflow. We are completely dependent on them. They're awesome. Thank you. Next week, our guest will be Steven Levy, editor at large at Wired. I'm looking forward to talking to our dear friend Steve.

Leo Laporte [00:42:36]:
In fact, you were just with him, weren't you, Jeff? On Monday.

Paris Martineau [00:42:40]:
Jeff is. We've momentarily lost his computer right now. Jeff was with him. Not a blackout; Jeff was just having some choppiness in his video. But Jeff was with Steven at Wired's AI Summit. And I used to work with Steven at Wired, so it'll be.

Leo Laporte [00:42:56]:
Oh, that's right. Of course you did. Yeah.

Paris Martineau [00:42:59]:
Nice to see him again.

Leo Laporte [00:43:00]:
I have an autographed copy of Hackers. That's how much I love Steven.

Paris Martineau [00:43:04]:
Such a delight.

Leo Laporte [00:43:05]:
Yeah, he's kind of legendary. He's been covering this beat longer than most, and he's one of the best tech journalists. You read his stuff at Wired. But he just recently. This is why I said we've got to get Steven on. I said, Steven, we have to get you on.

Leo Laporte [00:43:20]:
He just wrote a piece in Wired about the Anthropic decision, and he had a kind of interesting. I won't telegraph his take, although I guess you could just go to Wired and read it. But I thought it was an interesting take, and I thought, hey, we gotta get Steven on to talk about that. So Steven Levy, one of the great tech journalists, will join us next week for a conversation on Intelligent Machines. We'll wait until Jeff comes back, I guess. What have you been up to lately, Paris Martineau?

Leo Laporte [00:43:52]:
Did your story get published yet?

Paris Martineau [00:43:54]:
No, it will at the end of the month.

Leo Laporte [00:43:58]:
Excellent.

Paris Martineau [00:43:59]:
Keep your eyes out. I'm going to Yonkers again tomorrow for a, I don't know, big work event sort of thing. It'll be fun.

Leo Laporte [00:44:08]:
Well, please thank whoever. I guess Stacey Higginbotham was one of the people who wrote the letter, the open letter to Microsoft, to Satya Nadella. We talked about it on Windows Weekly, and I think Paul was a little bit, I don't know, skeptical, but I think she made a really excellent point. Let me pull up the letter here so I can show everybody. She was telling Microsoft and Satya: can you please extend the deadline for Windows 10? Windows 10 will go end of life October 14th, which is awfully soon.

Paris Martineau [00:44:53]:
And what number of Windows are we on currently?

Leo Laporte [00:44:57]:
Well, they want everybody to go to 11, but as Stacey points out, as Consumer Reports points out, most of the machines sold until very recently, until the last two or three years, cannot be upgraded to Windows 11. Microsoft has made what many consider an arbitrary decision to require two key technologies: you have to have an Intel processor of 8th generation or later, a much more modern one, and something called TPM 2.0, which is a security feature built into the hardware of the machine. Microsoft says, well, this is because we want to make Windows more secure, and these hardware features we will be using to make it more secure. But as Steve Gibson, our Security Now host, continually points out, it's an arbitrary distinction, because there are ways, hacks, if you will, to run Windows 11 on machines that technically are not supposed to run it.
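A rough sketch of the cutoff Leo describes, for readers who want it concrete. The model-string parsing and the generation rule here are illustrative assumptions, not Microsoft's actual logic; the real PC Health Check tool consults supported-CPU lists and firmware, not a regex:

```python
import re

def win11_eligible(cpu_model: str, tpm_version: float) -> bool:
    """Heuristic version of the Windows 11 cutoff discussed above:
    an 8th-generation-or-later Intel Core CPU plus TPM 2.0."""
    m = re.search(r"i[3579]-(\d{4,5})", cpu_model)
    if m is None:
        return False  # unrecognized model string: assume ineligible
    digits = m.group(1)
    # 4-digit Core models encode the generation in the first digit
    # (i7-7700 -> 7); 5-digit models in the first two (i5-10400 -> 10).
    generation = int(digits[:1]) if len(digits) == 4 else int(digits[:2])
    return generation >= 8 and tpm_version >= 2.0
```

Under this sketch, a 2017-era i7-7700 fails even with TPM 2.0, while a later i5-10400 passes, which is exactly the line Consumer Reports is objecting to.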

Jeff Jarvis [00:45:57]:
Windows 10 is what, 10 years old?

Leo Laporte [00:45:58]:
It's exactly 10 years old. So Microsoft typically will support an operating system, and I think this is not unreasonable, for 10 years. October 14th they will stop shipping security updates unless, A, you pay $30 for the next year; if you're a company, you may have to pay even more if you have an enterprise license. Or, B, you have been using. Oh look, hello, I gotta turn that off, that's an Apple thing. You're using Microsoft Rewards and you have, what was it, 1,000 Rewards points. Now I happen, for some reason, I think because I subscribe to Xbox Game Pass or something, to have like a hundred thousand Rewards points.

Jeff Jarvis [00:46:39]:
But you're rich, I tell you.

Leo Laporte [00:46:41]:
And they're non-transferable, so don't ask me for them. And then the third way you could get a year's extension is to back up your settings to OneDrive, which, as Paul points out, and I think this is the case, is Microsoft saying, well, we don't want to just say everybody gets a free year, so we're going to put up the lowest speed bump possible, and if people do it, then we're going to give them an extra year.

Jeff Jarvis [00:47:05]:
So what's Stacey asking for, a year or what period?

Leo Laporte [00:47:08]:
Well, she's just saying extend it. Yeah, she doesn't specify a period.

Jeff Jarvis [00:47:11]:
It's going to end at some point.

Leo Laporte [00:47:13]:
Yeah, it has to because no company supports its operating systems forever. Right.

Jeff Jarvis [00:47:17]:
Microsoft used to be really backward compatible as an ethic.

Leo Laporte [00:47:23]:
They are still backward compatible. What we're talking about is not compatibility. In fact, businesses will continue to run Windows 10 if they pay for the updates. Oh, this is the other thing that's a little galling: these updates are available because companies can buy them. So it's not that Microsoft isn't supporting it; they've just decided to stop delivering the updates. That is a way to kind of compel people to upgrade to Windows 11.

Jeff Jarvis [00:47:47]:
Planned obsolescence makes capitalism great.

Paris Martineau [00:47:49]:
Yeah, I mean, this is on a scale that I feel like is unique. The fact that this is happening one operating system deep is rough. I could understand if this were, like, Windows 9 or something, but this is.

Jeff Jarvis [00:48:03]:
This is Benito. It's also that, like, there is no functional difference between Windows 11 and Windows 10 for a user.

Leo Laporte [00:48:09]:
Like, they're basically the same.

Jeff Jarvis [00:48:10]:
For me to go to 11, there's no real reason.

Leo Laporte [00:48:12]:
Yeah, they're basically the same. That's another really important point.

Jeff Jarvis [00:48:16]:
Like, Windows 7 to Windows 10 at least gave you DirectX 12.

Leo Laporte [00:48:22]:
Right. And CR also points out that there are quite a few people, a large number of people, still on Windows 10 who will be left out in the cold October 14th. They will have insecure computers, which doesn't just hurt them but frankly also hurts the general computing ecosystem, because, if you think about it, those compromised computers end up becoming a hazard on the Internet for everybody. So I think it was appropriate, and maybe we should get Stacey on to talk about it.

Jeff Jarvis [00:49:04]:
We were going to get Stacy and Craig on at some point to talk about his support of her work there.

Leo Laporte [00:49:08]:
Oh, that's true. We should do that too.

Paris Martineau [00:49:10]:
Additionally, we should do an oops-all-crossover show at some point.

Leo Laporte [00:49:14]:
I would love to do that, Stacy. Yep.

Paris Martineau [00:49:16]:
Nick.

Leo Laporte [00:49:16]:
Yeah, I was trying to do that.

Jeff Jarvis [00:49:18]:
That's something I was trying to do for Twit.

Leo Laporte [00:49:20]:
Yeah, we tried to. Yeah.

Paris Martineau [00:49:22]:
Be fun. We got time.

Leo Laporte [00:49:24]:
It's not over yet. Don't give up yet. We can still do it. So I doubt very much that anything will happen. By the way, the letter was authored by Stacey Higginbotham and Justin Brookman, who's director of technology policy. These are two policy folks for Consumer Reports.

Leo Laporte [00:49:44]:
The last sentence calls on Microsoft to extend support for Windows 10 to allow these consumers, particularly consumers who recently bought incompatible hardware, to keep using hundreds of millions of PCs. Paul says you couldn't buy a computer that wouldn't run Windows 11 as long as three or four years ago; I mean, at some point manufacturers stopped shipping them. But I don't know if that's true. If you go to Costco, even as recently as last year, you could buy a Windows 10 machine that couldn't run Windows 11. I'm pretty sure, because Costco would have.

Leo Laporte [00:50:24]:
Outside of America there are older ones. Outside of America, right. And there's, of course, entire.

Jeff Jarvis [00:50:28]:
Very important.

Leo Laporte [00:50:29]:
Yeah, there's one and a half billion Windows users and half of them are still using Windows 10.

Jeff Jarvis [00:50:34]:
At least Benito speaks for the world.

Nick Foster [00:50:37]:
Yes.

Leo Laporte [00:50:38]:
As always, he speaks from the world as well. So it's kind of a beautiful thing that way. So thank you to your compadres and Yonkers.

Paris Martineau [00:50:48]:
I shall pass the message up the chain.

Leo Laporte [00:50:50]:
Yeah. When you, when you see them.

Jeff Jarvis [00:50:52]:
Yonkers is such a New York name, isn't it? It's a Dutch name as far as I know.

Leo Laporte [00:50:56]:
Did you ever ask them about the Yonkers questions?

Paris Martineau [00:50:59]:
No, I mean, I know what the Yonkers questions are now. I did a brief Google. But I do think it's fun for the bit to pretend to have no understanding of why my tax forms in New York continue to live in Yonkers.

Leo Laporte [00:51:15]:
If you live in Yonkers, we can help.

Paris Martineau [00:51:18]:
If you live in Yonkers, call for help immediately.

Leo Laporte [00:51:22]:
Parmy Olson, who of course has been a guest on this show, and who wrote a very good book called Supremacy: AI, ChatGPT, and the Race That Will Change the World, has an opinion piece in Bloomberg this week: AI's $344 Billion Language Model Bet Looks Fragile. We've been seeing more and more, and we've talked a little bit about this, that LLMs, and perhaps this is what Gary Marcus has been saying all along, are kind of hitting a plateau, and it seems to be the case that LLMs will always have a hallucination problem, will always have edge cases they can't solve. Hallucinations are not going to go away. In fact, you quoted last week, was it Fei-Fei Li? Who was it that you cited?

Jeff Jarvis [00:52:17]:
Yeah.

Leo Laporte [00:52:17]:
Oh, in fact, here's Fei-Fei Li. I'm trying to figure out the. Oh, I don't know where that's coming from. Let me mute whatever tab that is. Somebody's still talking.

Jeff Jarvis [00:52:28]:
Somebody's always talking around.

Paris Martineau [00:52:29]:
Someone is always talking.

Leo Laporte [00:52:30]:
Why are they.

Paris Martineau [00:52:31]:
You get. If you open a critical mass of tabs, someone will be talking always.

Jeff Jarvis [00:52:35]:
Yes.

Nick Foster [00:52:36]:
I hate that extra emphasis in ZDNet special feature.

Paris Martineau [00:52:40]:
We can hear it now.

Leo Laporte [00:52:41]:
There it is. I think that was Jason Hiner. Maybe it was just left over from TWiT. Here is Fei-Fei Li on the limitations of LLMs.

Paris Martineau [00:52:50]:
Language is fundamentally a purely generated signal. There's no language out there. You don't go out into nature and find words written in the sky for you. Whatever data you feed in, you can pretty much just somehow regurgitate, with enough generalizability, the same data out, and that's language to language. But the 3D world is not like that. There is a 3D world out there that follows the laws of physics, that has its own structures due to materials and many other things. And to fundamentally back that information out and be able to represent it and be able to generate it is just.

Jeff Jarvis [00:53:37]:
Fundamentally quite a different problem.

Paris Martineau [00:53:40]:
We will be borrowing similar ideas or useful ideas from language and LLMs, but this is fundamentally, philosophically, to me, a different problem.

Leo Laporte [00:53:53]:
That's interesting.

Jeff Jarvis [00:53:54]:
That's really good. Leo, if you don't mind, line 111 is Yann LeCun's mondo presentation that he gave to students and graduates.

Leo Laporte [00:54:04]:
You sent us this. Yeah.

Jeff Jarvis [00:54:05]:
And if you go down to page 113. Okay, I saved you five hours.

Leo Laporte [00:54:12]:
We are going to go to page 113. Get ready to scroll.

Jeff Jarvis [00:54:19]:
No, you can just put a number.

Leo Laporte [00:54:21]:
Oh, I could just enter 113, huh? Okay.

Paris Martineau [00:54:26]:
While we're searching, I'll point out that in that video they did have, in the shot behind Fei-Fei Li, a Motorola DynaTAC brick phone.

Leo Laporte [00:54:36]:
I think it was bigger than normal, though. I think that was like a giant replica. It looked huge.

Paris Martineau [00:54:43]:
It did look huge. I don't think that was the.

Leo Laporte [00:54:45]:
I've used those phones. They're big, but they're not big.

Paris Martineau [00:54:48]:
But they're not that big.

Leo Laporte [00:54:50]:
That's what I thought. a16z is exaggerating a little bit. We're going to talk about a16z.

Paris Martineau [00:54:56]:
We got to get the biggest brick phone available.

Leo Laporte [00:54:58]:
Yeah, right. So here are the recommendations from Yann LeCun.

Jeff Jarvis [00:55:03]:
So start at the bottom.

Leo Laporte [00:55:04]:
Set this up. So who is Yann LeCun?

Jeff Jarvis [00:55:05]:
Yann LeCun is chief AI scientist at Meta. He's a New York University professor, and he's one of the people I trust in AI. I think he's very smart about this stuff, and he's looking at the.

Leo Laporte [00:55:19]:
Next.

Jeff Jarvis [00:55:22]:
Levels of what it's going to take to get AI advanced. The last line of this slide is, if you are interested in human-level AI, don't work on LLMs. So the whole presentation is making other recommendations. Abandon generative models in favor of joint embedding architectures. Now, if you ask me what that means, I will start fumpering. Abandon probabilistic models in favor of energy-based models. He has a long thing in the PowerPoint about that. Abandon contrastive models.

Jeff Jarvis [00:55:50]:
I don't know what that is. Abandon reinforcement learning in favor of model-predictive control. The point being here that there are new paradigms to be worked on, and that's what Yann has been arguing.

Leo Laporte [00:56:03]:
Now I'm going to give you the counterargument, which is that all these guys, Yann LeCun, Fei-Fei Li, and many others, don't work for OpenAI and are just jealous of the success that OpenAI has had with LLMs. The surprising success.

Jeff Jarvis [00:56:16]:
Well, we talked about this with Karen Hao, how they have put everything into the scale bucket.

Leo Laporte [00:56:26]:
Yeah. In fact, they call it the OpenAI law.

Jeff Jarvis [00:56:29]:
Yeah.

Leo Laporte [00:56:29]:
Which is the more you scale, the better it's going to get.

Jeff Jarvis [00:56:33]:
And that's such an American attitude.

Leo Laporte [00:56:36]:
Well, it does seem to be proven out.

Paris Martineau [00:56:39]:
I mean, first thought, best thought basis. But so far we're not seeing the sort of returns that you'd expect at this specific juncture in the scale equation.
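The "more scale, better results" bet that Leo and Paris are debating here is usually written down as a power law, and a toy version makes the diminishing returns visible. The constants below are invented purely for illustration; published scaling-law fits use different values and forms:

```python
def toy_loss(params: float, a: float = 400.0, alpha: float = 0.34, floor: float = 1.7) -> float:
    """Toy Chinchilla-style power law: loss = floor + a / params**alpha.
    All three constants are made up for this illustration."""
    return floor + a / (params ** alpha)

# Each 10x jump in parameter count buys a smaller absolute improvement:
for n in (1e9, 1e10, 1e11, 1e12):
    print(f"{n:.0e} params -> loss {toy_loss(n):.3f}")
```

The curve keeps improving, but each order of magnitude of scale buys less than the last, which is one way to read "not seeing the sort of returns you'd expect."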

Leo Laporte [00:56:52]:
It's like telling the Wright brothers, guys, you only flew three feet. I don't know how you expect to get across the Atlantic.

Jeff Jarvis [00:56:59]:
But why be closed to other options? Go to the next slide.

Leo Laporte [00:57:01]:
Well, no, by the way, everybody but OpenAI is trying other stuff.

Jeff Jarvis [00:57:06]:
Yeah, yeah.

Leo Laporte [00:57:07]:
So good.

Jeff Jarvis [00:57:07]:
But AI is being.

Leo Laporte [00:57:09]:
There should be competition. There should be. Let a thousand flowers bloom. I don't know what the answer is, but I'm just saying all of this is kind of speculative at this point.

Jeff Jarvis [00:57:21]:
Well, it's. No, it's research. It's what they should be doing. So the next slide is problems to solve.

Leo Laporte [00:57:26]:
Okay.

Jeff Jarvis [00:57:26]:
Number one is large scale world model training from video, speech, text, code, dialogues, math, everything. Planning algorithms. JEPA, I forgot what that stands for. But he's there looking at what else people should be working on and what should be invested in, in universities and corporations. I think that's what's critical. This is what Karen said: so much of the venture money went purely into scale, purely into huge data farms, and not necessarily into investigating other models. And we forced the Chinese, because we wouldn't give them chips and we wouldn't give them other things, into investigating other models. That's where DeepSeek came from.

Jeff Jarvis [00:58:09]:
And if we're going to stay competitive, we've got to do that. The last line of the whole presentation is: open source AI platforms are necessary.

Leo Laporte [00:58:17]:
Yeah, I would agree with that. Yeah.

Paris Martineau [00:58:19]:
I love the visual aesthetic of this presentation.

Jeff Jarvis [00:58:22]:
Yeah. It's amazing.

Paris Martineau [00:58:23]:
It's very much a professor writing in red ink on your essay.

Leo Laporte [00:58:30]:
But it also violates the fundamental principles of PowerPoint presentations. Too many slides, too many points per slide. Very wordy.

Jeff Jarvis [00:58:39]:
He's a professor.

Leo Laporte [00:58:40]:
Yeah. Well, okay.

Jeff Jarvis [00:58:43]:
It's.

Leo Laporte [00:58:44]:
Yeah, all right.

Jeff Jarvis [00:58:46]:
He's a CS professor.

Paris Martineau [00:58:47]:
I know. I do think that this is a good point. It's something we've talked about on the show before. One of the downsides of this current moment we are in is that, yes, there's an upside that AI is getting outsized attention and capital, but only a very specific subsection of AI and machine learning research is getting any capital and attention at all. Everything else is being pushed to the wayside so that people can pour more tens of millions of dollars into a different chat bot.

Leo Laporte [00:59:21]:
Well, tens of millions of dollars into robots that may or may not provide a better AI. I mean, you're going to be investing money speculatively no matter what you invest.

Jeff Jarvis [00:59:32]:
In. But they're investing in hardware right now rather than research. And so it's as if you said, we're going to get the whole world up and we're going to make nothing but tubes, because we know tubes work. Screw these little transistor things, that's unknown. Tubes work, damn it. And look at the amount of investment in the hardware of these huge Manhattan sized data farms. It's hard to just say, oh, okay, now we're going to pursue something else. It's one thing to invest in a new company, in a startup, in research, in.

Leo Laporte [01:00:07]:
Maybe it's the wisdom of crowds. Maybe the money flowing out of about five people. No, maybe the money's flowing in that direction because there's a general consensus this is where the money should flow.

Jeff Jarvis [01:00:18]:
They're lemmings, they follow each other. That's the essence of Silicon Valley.

Leo Laporte [01:00:25]:
I will disagree with you on one thing. I don't disagree with you that we should try many avenues, but I do disagree that, oh, LLMs are a dead end. I don't know if we know that yet.

Jeff Jarvis [01:00:38]:
I think there's more to be learned.

Paris Martineau [01:00:39]:
I don't think we're saying they're a dead end.

Jeff Jarvis [01:00:41]:
No, they're not dead end.

Leo Laporte [01:00:42]:
Well, but I will say there is a great cost, and this is what Karen Hao was saying, and it's what others have said, it's what Fei-Fei Li was saying. There is a great cost to some of the avenues they're choosing. For instance, reinforcement learning has a huge cost, and a human cost, because it requires a lot of humans to do the work. Although interestingly, xAI has decided not to go down the data annotation road. Elon Musk laid off about 500 people, a third of the group, on their data annotation team. These were unskilled people who were doing reinforcement learning as generalists, much like what Scale AI was doing. And instead they're going to hire experts, which.

Jeff Jarvis [01:01:32]:
Which is a new path that some are taking. And I think it's a smarter path.

Leo Laporte [01:01:36]:
Yeah. And it may be an acknowledgment from xAI that, you know, what they were doing wasn't working. But I think it's reasonable to say, look, there are big costs to this, human costs, climate costs, energy costs, and we should really as a society decide, are we willing to pay these costs? Particularly since, regardless, it's a speculative process. We don't know what's going to come out of it. But if people are willing to throw that money at it, I don't know. TikTok: we don't know yet.

Leo Laporte [01:02:11]:
We'll find out on Friday. The president is going from one monarch to another. From King Charles to meet with Xi Jinping. It's the meeting of the emperors. Actually. Charles has no power at all, so forget that.

Jeff Jarvis [01:02:27]:
It looked like they were throwing Trump, little Donnie, a birthday party today. They brought out all the horses and the drums, and it was just unbelievable to watch.

Leo Laporte [01:02:37]:
I honestly don't think it's a good thing for the President of the United States to associate too closely with monarchy, especially in the country we rebelled against. But that's just my.

Jeff Jarvis [01:02:47]:
It was. The Red Coats were out there. I thought. I thought we defeated you guys.

Leo Laporte [01:02:50]:
Didn't we beat the Red Coats? Anyway, apparently a deal, a framework, has been made. Trump and Xi Jinping will finalize this in three days, according to Treasury Secretary Scott Bessent. The discussions in Madrid, says Bessent.

Jeff Jarvis [01:03:10]:
I don't know. I think it's Bessie. Well, you're French.

Leo Laporte [01:03:13]:
Besson. Monsieur Besson. He says that discussions have yielded fruit, and delicious fruit. Poison fruit, maybe. TikTok will continue. First of all, the president has to extend once again that deadline, get it into December. But it looks like, and China's lead trade negotiator has confirmed, there is a framework. What will happen is that Oracle and Andreessen Horowitz and Silver Lake, which is Larry Ellison's kid's business.

Leo Laporte [01:03:56]:
Is it, do you think, the kid's business, or is it Larry Ellison?

Jeff Jarvis [01:04:00]:
It's all dads. Which, I mean, stop here for a second. Not only did Ellison fils just purchase Viacom, and with it CBS. Now, of course, the word is that they want to take control of Warner Brothers Discovery, which would put them in charge of CNN. And then on top of all that, TikTok. The answer to all that big old media. This is a frightening, frightening consolidation of.

Leo Laporte [01:04:30]:
Media power which will be faced with absolutely no opposition.

Jeff Jarvis [01:04:35]:
No, no, stop. These are the. The Ellisons are the mini Murdochs.

Leo Laporte [01:04:39]:
Well, they're not gonna be so mini, right. Of course, TikTok had been working on a special version of the TikTok app for the U.S. apparently, that will be the app that you will have to move to in the United States if you want to continue to use it.

Jeff Jarvis [01:04:56]:
The Chinese algorithm, though the ch.

Leo Laporte [01:04:58]:
The Chinese are going to license their algorithm.

Paris Martineau [01:05:01]:
That's surprising.

Leo Laporte [01:05:03]:
Well, A, they had said never, and B, how are we now protected from the Chinese? Right.

Jeff Jarvis [01:05:11]:
After all of this, where do we end up? We end up with Trump's buddies owning.

Leo Laporte [01:05:17]:
TikTok, and 20% will be still owned by Chinese investors. The US will get a seat on the board of directors.

Jeff Jarvis [01:05:25]:
The US government will.

Leo Laporte [01:05:27]:
Yeah.

Jeff Jarvis [01:05:28]:
What f me?

Leo Laporte [01:05:30]:
Well, again, this isn't finalized. This we'll find out on Friday.

Jeff Jarvis [01:05:33]:
Hello, First Amendment.

Leo Laporte [01:05:36]:
We already get 15% from Nvidia. You know, we get. We get. It's fine. This is the new socialist capitalism.

Jeff Jarvis [01:05:45]:
Yeah. Yeah.

Leo Laporte [01:05:48]:
We'll see. But apparently. Now, here's the question I have for you, and actually I should ask people like Henry: will TikTok survive this? Will people who are currently using TikTok get the new app?

Jeff Jarvis [01:06:05]:
Because what's different?

Leo Laporte [01:06:05]:
The whole.

Jeff Jarvis [01:06:06]:
The whole media structure in this country.

Leo Laporte [01:06:07]:
Okay.

Paris Martineau [01:06:08]:
The question is the. It depends on whether the old app stops working and they have to get.

Leo Laporte [01:06:15]:
The new app to continue. I presume that it will.

Paris Martineau [01:06:18]:
And is it supposed to? In this case, are we assuming that everything transfers over immediately and effortlessly?

Leo Laporte [01:06:26]:
Well, we don't know that, but yes. Let's. Let's stipulate a speed bump then.

Paris Martineau [01:06:31]:
I do think. Yeah, I think everybody transfers. I think it's the Twitter to X principle.

Jeff Jarvis [01:06:37]:
Transfer just means you hit, click one box and you're over.

Paris Martineau [01:06:42]:
I think you lose 20 to 30% of TikTok users. But that's the highest level.

Jeff Jarvis [01:06:47]:
And it grows again still.

Paris Martineau [01:06:49]:
Yeah, that's just probably because those people were counted as users because they had TikTok on their phone but didn't really use it outside of, like, clicking links.

Leo Laporte [01:06:59]:
We know that if you're still using TikTok today, you're not concerned about Chinese control.

Jeff Jarvis [01:07:04]:
In fact, hear me out. I don't like the Chinese government. I trust the Chinese government more than the Ellisons and Marc Andreessen.

Leo Laporte [01:07:11]:
Well, so that's the other question. So the people using TikTok aren't going to care who owns it, Right?

Jeff Jarvis [01:07:15]:
Paris looks frightened at that.

Paris Martineau [01:07:18]:
I don't.

Nick Foster [01:07:19]:
I think.

Paris Martineau [01:07:19]:
I don't know that I trust anybody, but that's besides the point.

Jeff Jarvis [01:07:22]:
Nihilism.

Leo Laporte [01:07:23]:
Well, I'll tell you one thing for sure. The new owners will not hesitate to slant the coverage in TikTok in the same way that Elon has slanted the.

Jeff Jarvis [01:07:35]:
Coverage and the Chinese were accused of doing.

Leo Laporte [01:07:38]:
Yeah, well, they weren't. They probably never did, at least not as far as we could tell. But this for sure will happen. TikTok will become like X. Oh, yes. Right.

Jeff Jarvis [01:07:47]:
Oh, yes. And let me confess here. I thought X would be dead by now. How wrong could I be?

Leo Laporte [01:07:53]:
People like X, although it's gotten worse and worse.

Paris Martineau [01:07:57]:
Well, it's the Everything app. It's where I do all of my banking.

Leo Laporte [01:08:06]:
So we. We won't know until the actual details are announced. But. But this is what everybody is saying, including the Chinese negotiator.

Jeff Jarvis [01:08:13]:
All of life is a framework.

Leo Laporte [01:08:15]:
The Secretary of the Treasury. A senior White House official told the Wall Street Journal, quote, any details of the TikTok framework are pure speculation until they're announced by this administration. Which I understand. I mean, well, we don't know what the President's gonna do until he does it, so. And even then.

Paris Martineau [01:08:33]:
I mean, even then, the deal could change 25 times between announcement and implementation.

Leo Laporte [01:08:40]:
So the users aren't concerned about who owns it. They're concerned about the experience, which is why it was fair.

Jeff Jarvis [01:08:47]:
And who's there. Whose friend. The people they like and admire are there.

Leo Laporte [01:08:51]:
Right. I think there's a great risk. I mean, look, I watch Henry. He's still on TikTok, but he's moved his operations to Instagram. I'm going to ask him. I'm going to see him next week.

Jeff Jarvis [01:09:00]:
Yeah, I'm really curious. Yep.

Leo Laporte [01:09:02]:
But I think he realized that he can't be on a single platform. He started on TikTok; it still has the most followers. It made him, TikTok made him. But I think he'll move. We're talking about my son, Salt Hank, in case you don't know. He. He's.

Paris Martineau [01:09:16]:
Mark it down, people. He said he was gonna stop, and by the next episode of the podcast, he went back to the.

Leo Laporte [01:09:22]:
But this is germane to the comments.

Jeff Jarvis [01:09:24]:
Don't worry, we're just jumping.

Paris Martineau [01:09:27]:
No.

Leo Laporte [01:09:28]:
Although I have right here a delicious. A French dip sandwich that you might want to spend 28 bucks for. Did you find one in New Jersey?

Jeff Jarvis [01:09:36]:
In Jersey City. That was 28 bucks. It's the going price now tonight.

Leo Laporte [01:09:42]:
What is a corn. What is that? Corn pie.

Jeff Jarvis [01:09:44]:
Corn pizza. New Jersey corn pizza. Oh, my Lord, it's so good.

Leo Laporte [01:09:48]:
It's a, it's a, it's a pizza.

Jeff Jarvis [01:09:50]:
It's a pizza with corn. Tomatoes? No, no tomatoes. It has a little kind of burrata and a little bit of pepper, and it's corn. Because Jersey corn is the best corn there is anyway.

Leo Laporte [01:10:02]:
Sweet corn. Yeah.

Paris Martineau [01:10:03]:
Have you guys had a pizza with an egg in the middle? That's something I've been into lately. No, just like a sunny side up egg.

Leo Laporte [01:10:09]:
Here's the quote from the President. I wish I could do a Trump. The kids wanted it so badly I had parents calling me up. They say if I don't get it done, they're in big trouble with their kids. I think it's great. I hate to see value like that thrown out the window. Instead, throw it into this pocket right here, ladies and gentlemen. I don't think he gets any money out of this.

Leo Laporte [01:10:33]:
Although I bet you Jeff Yass, who is a very big Republican donor, will maintain his value. Other investors besides existing ByteDance investors, including Susquehanna International, KKR and General Atlantic, will continue to be part of the group. 80% of the new company will be owned by these American companies. The stake of ByteDance's Chinese shareholders would dip under 20% to comply with that law. It's interesting. I wonder how they convinced China to give them the algorithm, and will.

Paris Martineau [01:11:08]:
That is very interesting.

Leo Laporte [01:11:09]:
This is the real question. Will China then have control over what you see?

Jeff Jarvis [01:11:13]:
Well, that's. China's not giving them access to the algorithm. China is licensing in the algorithm, which means China still controls the algorithm.

Leo Laporte [01:11:20]:
Well, that's a. I, that's a technical question. I don't know how.

Jeff Jarvis [01:11:23]:
I would bet.

Leo Laporte [01:11:24]:
I would bet TikTok engineers will recreate a set of content recommendation algorithms for the app using technology licensed from ByteDance. So I think the Americans can't have a thumb on the scale.

Jeff Jarvis [01:11:41]:
Can't.

Leo Laporte [01:11:41]:
Can.

Jeff Jarvis [01:11:42]:
Can. Yes, can, can. They'll have, they'll have mechanisms to control things.

Leo Laporte [01:11:48]:
Oracle was stored.

Jeff Jarvis [01:11:49]:
Chinese know how to do that in China.

Leo Laporte [01:11:51]:
Yeah. Oracle will store the user data at its servers in Texas. Silver Lake and Andreessen Horowitz. Both right leaning companies. Very much so, yeah. So I suspect that. I suspect. Well, we'll see.

Leo Laporte [01:12:10]:
Now the question is whether TikTok's users will sense that and then say, well, maybe I'll go to Instagram, but is Instagram going to be any better? I don't know.

Paris Martineau [01:12:18]:
But.

Jeff Jarvis [01:12:18]:
But the New York Times, the Washington Post, the Wall Street Journal, CNN, CBS, ABC, Disney. What's the diff?

Leo Laporte [01:12:29]:
Right?

Jeff Jarvis [01:12:29]:
What's the diff? And Matthew Dowd got fired from MSNBC. And Comcast sent out a message to all employees praising that move. People have lost.

Leo Laporte [01:12:41]:
Comcast threatened to fire employees if they said anything bad about Charlie Kirk. I think we are entering a new realm, a new time, a new era, where free speech is being redefined. I can promise you that until they manacle me and drag me off, which could happen at any moment, we will not change our tune. I will let Jeff be as.

Jeff Jarvis [01:13:10]:
Liberal.

Leo Laporte [01:13:10]:
As he wants, as dangerous as he wants. It's all him.

Paris Martineau [01:13:13]:
We can't stop him from tweeting.

Leo Laporte [01:13:18]:
All right, I think that's interesting. We'll see what's going to happen. Let's take a little break and we'll have more. Intelligent Machines is on the air. Program reminder: we're going to end the show in roughly an hour, because we have to get ready. We have to get our.

Jeff Jarvis [01:13:36]:
You're going to change your clothes. You're going to put on a change.

Leo Laporte [01:13:38]:
I'm going to put on a tuxedo. Meta Connect is at 5pm Pacific, 8pm Eastern tonight. And while you are invited, you do not have to stick around. If your contacts are really getting to you, you can put your glasses on or something. But we will have a little break.

Paris Martineau [01:13:54]:
We'll be eating dinner.

Leo Laporte [01:13:55]:
You won't stick around or you'll be eating and stick around.

Paris Martineau [01:13:59]:
I think I'm going to make food. You can abandon us watching Meta Connect. I'm sorry.

Jeff Jarvis [01:14:04]:
So I. I went to. To Panera to get a French dip. So I'm going to test that versus Hanks.

Leo Laporte [01:14:10]:
It's not going to be as good. I have a piece of celery, so I am prepared.

Paris Martineau [01:14:15]:
He's prepared to loudly snack.

Jeff Jarvis [01:14:18]:
Go ahead, crunch it. I brought. Let's crunch it.

Paris Martineau [01:14:20]:
Just one. The mic.

Jeff Jarvis [01:14:21]:
Oh, just one good crunch.

Leo Laporte [01:14:22]:
You want one good crunch?

Jeff Jarvis [01:14:23]:
Yeah, one good crunch.

Leo Laporte [01:14:23]:
Cover your ears, misophonia sufferers. Stick your fingers in your ears right now.

Paris Martineau [01:14:29]:
Oh, that was good. That was satisfying.

Jeff Jarvis [01:14:32]:
Yeah, that was. Yes. Swallow. Which is not easy with celery. It doesn't just go down. It's a lot of fiber. You gotta chomp on it. Yep.

Jeff Jarvis [01:14:47]:
Filling the air here.

Paris Martineau [01:14:48]:
Now is the time that he's stuck chewing and we could say anything. He can't stop us.

Leo Laporte [01:14:54]:
We could.

Jeff Jarvis [01:14:55]:
We will cover that celery trick.

Leo Laporte [01:14:58]:
We will cover Meta Connect. Very interested. Because there have been a number of leaks implying that they may in fact announce the most advanced AR glasses yet with actual color screen and so forth.

Paris Martineau [01:15:11]:
So I'll believe it when I see it.

Leo Laporte [01:15:14]:
I know. Well, I'll get my credit card ready just in case. But we will be covering that. As usual with, you know, live streams, we don't want to get in trouble with the companies, so we don't get taken off of YouTube or any of the other platforms. So we will stop this show, wrap it up, and then start the stream in the Club Twit Discord. So we will be doing that. If you are a club member, head to Discord.

Leo Laporte [01:15:38]:
We will in Discord give you a private YouTube channel so you can watch on YouTube if you prefer. But that is for club members only. That will be at 5pm so you can join now.

Jeff Jarvis [01:15:48]:
I'm telling you now.

Leo Laporte [01:15:49]:
Yeah, you still could take advantage of the two week free trial and, you know, just see how it is to be in the club, what it's like to be a member. We have, of course, Paris Martineau and Jeff Jarvis, thrilled to have you. Our show today is brought to you by ThreatLocker. If you ever tune in to our Security Now show, you know this: ransomware is killing businesses worldwide. Jaguar has been down. They can't make parts for two weeks. They say it might be a third.

Leo Laporte [01:16:23]:
Already the companies they supply with those parts are going under because of ransomware. This is a nightmare. But ThreatLocker can prevent you from becoming the next victim. ThreatLocker uses an approach that is unique and is the best way to do this. It's called zero trust. It takes a proactive, and here's the three words that matter, this is how zero trust works.

Leo Laporte [01:16:48]:
Deny by default approach. In other words, any action, anything anybody can do, especially bad guys, cannot happen, will be blocked unless it's explicitly authorized. That's huge. It's zero trust. You trust no one. Even if they're in your network. You trust no one. This protects you from both known and unknown threats.

Leo Laporte [01:17:10]:
Zero days, links that your employees click, because they just can't do anything. It's trusted by enterprises that can't afford to go down for even one minute. JetBlue uses ThreatLocker. The Port of Vancouver uses ThreatLocker. ThreatLocker shields you from zero day exploits, from supply chain attacks, while providing complete audit trails for compliance. As more cybercriminals turn to malvertising, and we talked about this on Security Now, there's almost nothing you can do to keep your employees from being attacked.

Leo Laporte [01:17:45]:
You need more than just traditional security tools. Attackers are creating fake websites impersonating popular brands like AI tools and software applications. And then they're distributing links through social media ads and hijacked accounts. They use legitimate ad networks to deliver their malware. I don't know how you could train an employee not to browse to those sites. They're mainstream sites. Anyone who browses on a work system could infect your computers and you're done. Traditional security tools almost always miss these attacks because they are using fileless payloads that run in memory.

Leo Laporte [01:18:27]:
They exploit trusted services that bypass the typical filters. Not ThreatLocker. ThreatLocker's innovative ring fencing technology strengthens endpoint defense by controlling what applications and scripts, any application, any script, can access or execute unless explicitly authorized by you, which contains potential threats even if malicious ads successfully hit the device, even if your employee clicks the link. ThreatLocker works across all industries. It supports Windows and Mac. They have a 24/7 US based support line that's fantastic. And they enable comprehensive visibility and control. Ask Jack Senisap.

Leo Laporte [01:19:06]:
He's director of IT infrastructure and security at Redner's Markets, another company that just does not want to get bit by ransomware. He says, quote: When it comes to ThreatLocker, the team stands by their product. ThreatLocker's onboarding phase was a very good experience. They were very hands on. ThreatLocker was able to help me and guide me to where I am in our environment today. That's a happy customer, Jack Senisap. Get unprecedented protection quickly, easily and cost effectively, yet surprisingly affordable.

Leo Laporte [01:19:36]:
Visit threatlocker.com/twit. Get a free 30 day trial and learn more about how ThreatLocker can help mitigate unknown threats and ensure compliance. That's threatlocker.com/twit, threatlocker.com/twit. We thank them so much for their support. Ah, where else should we wander in this landscape of AI of ours?

Paris Martineau [01:20:05]:
We should talk about the really interesting report released this week by some Harvard economists about what OpenAI users are doing with ChatGPT. This.

Leo Laporte [01:20:18]:
What are they doing?

Paris Martineau [01:20:19]:
So, no, I mean basically, now I'm nervous. No, I just thought this was a very interesting study. It's on line 100. Basically, these Harvard economists got access to millions of logs of ChatGPT users and analyzed them to break down how the average person uses ChatGPT, and kind of broke it into different buckets. Let me try and find the.

Leo Laporte [01:20:49]:
This is from the National Bureau of Economic Research, which sounds pretty serious.

Paris Martineau [01:20:54]:
It had just some really interesting breakdowns in the sort of things that people are using it for.

Leo Laporte [01:21:03]:
By the way, four of the other authors are from OpenAI. So this is, you know, with OpenAI's cooperation and involvement.

Paris Martineau [01:21:11]:
Of course. It's entirely an OpenAI endeavor. But what they looked into, one detail I thought was fascinating, is that not that many people use it for self expression, which is like role play, therapy, relationships, personal reflection, that sort of stuff. It's only 4.3% of users, which is a lot lower than I would have expected, given how outspoken those users are online.

Leo Laporte [01:21:44]:
What are the. What is the what? Well, first of all, let's say this. This is also kind of a revelation: by July 2025, 18 billion messages were being sent each week by 700 million users. That's 10% of the global adult population. Every week, 10% of the global adult population uses ChatGPT. Wow.

Jeff Jarvis [01:22:11]:
Yeah, I wonder how they count that. Not everybody signs in. I don't know.

Leo Laporte [01:22:17]:
Well, let's see, let's see. How do they use it? Is the thing that this is. So if you go to. If they're not using it for page.

Jeff Jarvis [01:22:25]:
17, there's a chart.

Leo Laporte [01:22:26]:
Okay. Because they're not using it for what we thought they were using it for, which is like writing and stuff. Right.

Paris Martineau [01:22:32]:
So. They break it down into work users and non work users.

Leo Laporte [01:22:40]:
Okay, this is so hard to read.

Paris Martineau [01:22:43]:
No, we're gonna go to the one that's up. So in Figure 7, this is for non work user use.

Leo Laporte [01:22:51]:
So self expression, what does that mean?

Paris Martineau [01:22:53]:
Self expression means like kind of therapy stuff like role play, sexy chat, that sort of stuff. It's interesting. So the largest bucket for non work users is practical guidance, which is a bucket that basically means. Yeah, it is the.

Jeff Jarvis [01:23:15]:
Scroll up, Leo.

Paris Martineau [01:23:16]:
Leo. You're looking good, boy. So the largest one, what's the difference.

Leo Laporte [01:23:22]:
Between that and this?

Paris Martineau [01:23:23]:
That's work related conversation.

Leo Laporte [01:23:25]:
Oh, this is consumer. Okay, but I'm going to say something. I note that they have broken out seeking information, practical guidance and technical help into three categories. But to me, and by the way, that's more than half of all usage. What that really is is they're using it in lieu of search.

Paris Martineau [01:23:48]:
Well, so right when you do, when.

Leo Laporte [01:23:51]:
You use a search engine, you're looking for information, practical guidance or technical help?

Jeff Jarvis [01:23:56]:
Depends on what the question.

Paris Martineau [01:23:57]:
No, they break this down further. And I'd honestly really recommend people peruse this paper. I know it seems very dry, and it kind of is, but it's really interesting. So the part that you're looking for, looking for specific information like you would from a search engine, that is part of the seeking information bucket. And specifically, that's like 18% of queries that are specifically looking for factual.

Jeff Jarvis [01:24:21]:
If you go to page 13, technical helps.

Leo Laporte [01:24:23]:
The same thing. I can't get this to print. How do I get this to print? Same thing.

Paris Martineau [01:24:28]:
Technical help in this case refers to questions about mathematical calculations, data analysis or computer programming.

Jeff Jarvis [01:24:36]:
Go to page 13 and 14 and you will see. See, we do our homework. You will see how they define these things.

Leo Laporte [01:24:43]:
Oh, I see. Okay. Technical help: mathematical calculation, data analysis. So it's not search. Okay. Seeking information: specific info, purchasable products, cooking and recipes.

Jeff Jarvis [01:24:55]:
Search.

Leo Laporte [01:24:57]:
That's search.

Paris Martineau [01:24:58]:
That is definitely more search. And so I don't know just.

Leo Laporte [01:25:03]:
But I think practical guidance is search too. How-to advice, tutoring or teaching, creative ideation and the rest. See, this is my thing. I would say I believe that more than half of all the usage is really replacing Google search with AI search. I notice that's what I do. A lot of the people I talk to, that's what they do. Ask yourself, when you use AI, are you using it to help you figure out the right word in an article? Maybe you are, more, because you're a writer. But I think you probably use AI in lieu of search, as a search engine.

Paris Martineau [01:25:34]:
I don't, but I am an unusual use case. I was talking about this with a. I don't use it for primary search because I don't trust anything to search the way I would. I use Google, I use various academic search tools, I use a lot of different forms of search, and I use them very specifically, in such specific ways that I keep getting dinged by Cloudflare because they think I am an AI bot.

Leo Laporte [01:26:10]:
Yeah. So that's an occupational hazard. It's because of the kind of stuff that you need and you're doing.

Paris Martineau [01:26:18]:
Yeah. But I do think you're right that the average person is probably. Or just intuitively it seems like the average person is probably using this to replace search. And this is what it. This study tries to get into is what are people actually using it for?

Leo Laporte [01:26:32]:
That's.

Paris Martineau [01:26:32]:
One of the ways they break this down is trying to categorize queries into Asking, which is seeking information or advice that will help the user be better informed or make better decisions, either at work, school or in personal life, like, who was president after Lincoln, or how do I create a budget for this quarter, or what's the difference between correlation and causation. Or are they Doing, which is messages requesting that ChatGPT perform tasks for the user, like drafting an email or writing code. Or are users Expressing, a statement that doesn't ask for information or for it to perform a task. And it breaks down that 51% of user queries are Asking for information. So like you said, most people are using this to ask for some sort of knowledge back. 35% are Doing, like asking it to generate code or write something like that. And 14% are Expressing, which I thought was interesting, like not asking for anything at all. Which I guess could be something as simple as thank you, but it could also just be telling the chat bot about your day. And something I thought was very interesting and counterintuitive about this research is that, let me see if I can find it here, the amount of people using ChatGPT for programming related stuff or software engineer related stuff is really minuscule.

Paris Martineau [01:28:08]:
Yeah, like.

Jeff Jarvis [01:28:09]:
Well, that's because that's the core nerdy audience, I think. When you're at that scale, the geeks are going to be a small but intense audience. I think, Leo, on your search question, if you want to go to line 127, page seven. It's academic week here on Intelligent Machines. A very interesting study out of the University of Toronto looked at the difference between the links from Google Search and ChatGPT.

Leo Laporte [01:28:38]:
The links generated in the generated.

Jeff Jarvis [01:28:40]:
Right. Yes, it's really revealing. So if you go to page seven there. They did it by commercial categories: automotive, consumer electronics and so on. In the automotive, in the...

Leo Laporte [01:28:53]:
US I must be looking at the wrong thing.

Jeff Jarvis [01:28:56]:
Okay, line 127. Go all the way down to page seven. Show me your screen.

Leo Laporte [01:29:04]:
Generative AI at the crossroads. Light bulb. Dynamo.

Jeff Jarvis [01:29:07]:
Sorry. Oh, that's my fault. That's my fault. No, line 125. I'm sorry, I screwed that up. I was going to make fun of you, Leo, but I'm the one who screwed up.

Leo Laporte [01:29:16]:
It's my scrolling. It doesn't work in the paper.

Jeff Jarvis [01:29:19]:
145.

Leo Laporte [01:29:22]:
Page 7, page 7.

Jeff Jarvis [01:29:24]:
So they differentiated brand links, which is to say links to the company, from social links, from earned links, which is media. Right? When Ford gets mentioned in a story. So in Google in the US, 39.5% of the links were brand, 15% were social and 45% were earned. Whereas in ChatGPT, 81% were earned, media links, double that of Google; 18% were brand, going right to the company; zero social media. So this becomes really revealing to me, because that says that ChatGPT, at least, is valuing media and is sending people there even more than Google Search. So while the media brands are complaining, oh my God, they're not linking to us. Well, they are, but they're also doing answers.

Jeff Jarvis [01:30:23]:
I went to the Wired AI Summit at Conde Nast on Monday, and what it really was, was everybody there complaining: they're taking our soul when they take our content. And, and they...

Leo Laporte [01:30:35]:
It's more of an emotional complaint. It's a...

Jeff Jarvis [01:30:38]:
No, no, no. It was all, it was all copyright. Copyright, Copyright.

Leo Laporte [01:30:41]:
Yeah. But I mean it's a feeling like I don't like it that they do this.

Jeff Jarvis [01:30:44]:
Oh no, no. It was business. Like we want to sue these bastards and we want to get them to give us a lot of money.

Leo Laporte [01:30:48]:
Well, there's certainly a lot.

Jeff Jarvis [01:30:49]:
But this reveals to me a very different picture here. So if you tie this to the usage that Paris was talking about, you're right, search becomes very important here, because they're doing a good job of it, it might appear. If you want to go and find out something about consumer electronics, you may get better sources linked to. Now, also, social is Reddit, so maybe you're losing out on stuff you care about, but social is really cut out of here.

Leo Laporte [01:31:16]:
Yeah. You know, if I look back through my uses, in almost every case it's search. It's in lieu of search.

Jeff Jarvis [01:31:30]:
And what are the main uses you put it to? Because you said it's not search. What's an example?

Leo Laporte [01:31:35]:
Yeah, do you use AI at all? Do you generate images? I know you don't code.

Paris Martineau [01:31:41]:
I don't code. I'm trying to think of a good example.

Leo Laporte [01:31:45]:
You can look at your history like.

Paris Martineau [01:31:47]:
Writing a... yeah, let me see. Oh, something I used it for that I guess is similar to this: the other day, as I was walking, I was wondering about how many currently active news orgs are over 25 years old, and, like, what...

Leo Laporte [01:32:06]:
That's a perfect AI.

Paris Martineau [01:32:08]:
I just asked it. You couldn't do that in Google. I said, make a chronological list of news organizations that are active today that were founded at least 25 years ago. Include the name, date founded, age today and citations for the date. And I got a great list.

Leo Laporte [01:32:20]:
But I would submit that that is a search. It's just not the kind of search you could do in Google very easily. So we are using AI in new ways, but it's still basically the same idea, which is: hey, Internet, what's the answer?

Paris Martineau [01:32:36]:
You know, another example, honestly a frequent use case in the last couple of weeks, has been if I'm writing up something from, like, an FDA recall notice or something that includes a list of, like, 19 states just by initials, and I need to quickly spell out all the states. I just ask it. And then of course, before I, like, publish anything or would send it to my editor, I go and count and check myself, as well as, like, check that it expanded into the right states. But it will turn my little list of 19 states' initials into a comma-separated list.

Leo Laporte [01:33:17]:
All right, both of you, do you have ChatGPT to hand? Because I have a query I want you to try. Ask it how old it thinks you are.

Jeff Jarvis [01:33:27]:
It probably doesn't know me well enough.

Leo Laporte [01:33:29]:
Well this is important because whatever AI.

Paris Martineau [01:33:33]:
Thing you've used frequently.

Leo Laporte [01:33:34]:
No, no, it's ChatGPT, and I'll tell you why it's ChatGPT: they are going to start guessing your age.

Paris Martineau [01:33:42]:
I like this answer, because it says: I don't actually know your age. From what you've shared about yourself, your career, tastes and interests, I could probably make an educated guess, but it would just be speculation. Do you want me to take a stab at it, or would you rather just tell me?

Leo Laporte [01:33:54]:
The only downside of that is they may start asking you for ID if they don't know how old you are. This is... Sam Altman said... they released a statement I thought was very interesting: we really want to protect teenagers and kids who are using AI. Of course, they're being sued by parents of kids who've self-harmed, they say, because of ChatGPT. Allegedly because of ChatGPT. So they're trying to do something to protect themselves. They say they will now try to guess your age, and if they can't guess that you're over 18, they might ask you for age verification. The company said: we know this is a privacy compromise for adults but believe it is a worthy trade-off.

Leo Laporte [01:34:37]:
So that's why I'm wondering. See, I asked ChatGPT and it said, oh, I already know you're 68. What are you talking about?

Jeff Jarvis [01:34:44]:
Yeah, you said you're Jeff Jarvis and here's how old you are.

Leo Laporte [01:34:47]:
Yeah. So I was surprised. Yeah. It said. Here's what it said. You told me before you're 68. A seasoned human with the mileage to prove it.

Paris Martineau [01:34:57]:
Do you want to see what it guessed for me? All right. Based on what you've shared: being an established investigative journalist, starting a new beat, hosting a podcast...

Jeff Jarvis [01:35:06]:
Let me leave.

Paris Martineau [01:35:07]:
Having pretty formed aesthetic tastes, like your love for 60s and 70s design, plus your gaming habits and cultural touchstones.

Leo Laporte [01:35:16]:
32.

Jeff Jarvis [01:35:18]:
Until we got to the gaming. Oh, sorry. What was the answer?

Paris Martineau [01:35:21]:
No, I didn't say.

Jeff Jarvis [01:35:22]:
Okay. I think, because of all your accomplishments in your career, it's going to think you're 46, but then it's going to bring it down a little bit because of the gaming, to about...

Leo Laporte [01:35:32]:
42 and the fact you like 60s and 70s design. Exactly. 50s. So I'm gonna say 32.

Paris Martineau [01:35:38]:
Yeah, it guessed 33, which is quite a good guess from Leo; you're one off.

Leo Laporte [01:35:43]:
Yeah. Because you're much younger than that, we should point out. So that's interesting. But based on your accomplishments, you are very accomplished for a young person your age. That you like 60s and 70s furniture is a little weird, but okay.

Paris Martineau [01:35:58]:
Yeah, that's true.

Jeff Jarvis [01:35:59]:
Quirky.

Leo Laporte [01:36:00]:
Quirky.

Paris Martineau [01:36:00]:
I asked it for more, and it pointed out that I like early-2000s-era JRPGs and PC narrative games, which puts me more likely in the millennial range, which feels big.

Leo Laporte [01:36:13]:
Ah. Now, see, the other thing that's interesting is how much it knows about you. What it knows about me, I actually told it in a prompt. But for you, this is stuff it's...

Paris Martineau [01:36:27]:
Yeah. These are all things gleaned from my searches.

Leo Laporte [01:36:29]:
Yeah.

Jeff Jarvis [01:36:30]:
Google Gemini said, I don't know you. I don't know who you are. We care about privacy.

Leo Laporte [01:36:34]:
I think they're lying. They know exactly who you are, but they don't want to say so. OpenAI, in an attempt to protect itself and maybe to protect teenagers as well, has decided they're going to try to guess how old you are. This is Sam writing: we shared more today about building the age-prediction system and new parental controls to make all of this work. This was thoughtful of him. We realize these principles are in conflict, and not everyone will agree, you know, privacy versus protecting kids. And not everyone will agree with how we're resolving that conflict. These are difficult decisions, but after talking with experts, this is what we think is best.

Leo Laporte [01:37:15]:
And we want to be transparent in our intentions, which I think is well said. So they're going to try. He says ChatGPT is intended for people 13 and up, but we have to separate users who are under 18 from those who aren't, and we're going to try to figure it out. If there's doubt, we'll play it safe and default to the under-18 experience.

Leo Laporte [01:37:37]:
That's why it's kind of important for you, Paris, because it might start giving you the teen version. There's going to be a new Teen Chat GPT.

Jeff Jarvis [01:37:45]:
Hi. For the record, it thought I was 33.

Leo Laporte [01:37:50]:
Oh, everybody's 33.

Paris Martineau [01:37:51]:
Ooh, everyone's 33.

Leo Laporte [01:37:53]:
But Benito, you're older than that.

Jeff Jarvis [01:37:55]:
Yes, much older.

Paris Martineau [01:37:59]:
Why did it say... why did it say 33, Benito?

Leo Laporte [01:38:02]:
Okay.

Paris Martineau [01:38:03]:
Did you ask?

Jeff Jarvis [01:38:04]:
Yeah. It said, do you want to explain more? I said yes. Well, I mean, the thing is, what I used ChatGPT for personally was mostly to test things out, to see how it performed. So I don't know how accurate this really could be.

Leo Laporte [01:38:20]:
Well, this is relevant because everybody's going to be subject to this. This is the new thing they're going to do: age guesstimation. And if they guess you're under 18, or they can't figure it out, they're going to give you the teen version of ChatGPT, which we don't know yet what that's going to be.

Paris Martineau [01:38:35]:
Very soon, Benito and I are going to be putting out a version of ChatGPT that's just Italian brain rot memes. That's really the first, like, teen meme culture thing that I just don't understand at all. That's the first thing that makes me feel truly old: Italian brain rot. Do you guys know what I'm talking about?

Jeff Jarvis [01:39:00]:
No.

Leo Laporte [01:39:01]:
No. But I bet Chat GPT does.

Paris Martineau [01:39:04]:
I don't. I don't. How do I even.

Jeff Jarvis [01:39:07]:
It's just more. It's just more young people pissing off old people.

Paris Martineau [01:39:11]:
Okay.

Leo Laporte [01:39:12]:
It's like, is it from Italy or is it.

Paris Martineau [01:39:14]:
No, it's like there's a computer generated image of a shark wearing blue Nikes and it says stuff in a vaguely stereotypical Italian accent. There's also other examples.

Leo Laporte [01:39:28]:
Here's what ChatGPT says: it usually refers to people who are absurdly deep into Italian culture, food, music or media, sometimes genuinely, sometimes ironically. Think arguing about the correct way to cook pasta, posting endless mamma mia memes, obsessing over Roman history, gesturing with your hands online, or spamming phrases like gabagool and ciao bella.

Jeff Jarvis [01:39:52]:
That's not right. Mine says it's used to describe trivial or unchallenging online content that can lead to a perceived mental deterioration, but with a specific Italian-themed twist.

Leo Laporte [01:40:02]:
No.

Paris Martineau [01:40:03]:
Okay, I'm gonna post a photo in the chat and you're going to tell me what you think the name of this is in Italian brain rot. Oh, and I've got to make sure the name isn't in the image.

Leo Laporte [01:40:19]:
Okay, here comes the image. Ladies and gentlemen, we're very excited about all of this.

Paris Martineau [01:40:23]:
Is it? Did it not. Oh, no, it didn't.

Leo Laporte [01:40:25]:
I don't know.

Paris Martineau [01:40:27]:
Sorry.

Leo Laporte [01:40:28]:
Didn't.

Paris Martineau [01:40:29]:
We're gonna try. We're gonna try one more time, guys. Hold on.

Leo Laporte [01:40:34]:
Oh, boy.

Paris Martineau [01:40:35]:
Great, guys.

Leo Laporte [01:40:36]:
ABC affiliates owned by the Nexstar Media Group have announced that they are preempting Jimmy Kimmel Live indefinitely due to comments he made about Charlie Kirk. We're going to see more and more of this. I really worry about this.

Jeff Jarvis [01:40:52]:
It's not coming. We're here. Oh, my. Paris. That image is weird.

Paris Martineau [01:40:57]:
What do you think this is called in Italian brain rot?

Leo Laporte [01:41:01]:
Oh, there it is. Oh, wow.

Paris Martineau [01:41:04]:
How would you describe it?

Leo Laporte [01:41:05]:
So it's a B-17 bomber with an alligator head on the front. And for some reason the bomb is suspended on a string. It's a flying...

Jeff Jarvis [01:41:20]:
What are we supposed to guess? What's.

Paris Martineau [01:41:22]:
What's it called?

Leo Laporte [01:41:25]:
Flying Gator Fortress. It's... it's called a Florida bomber.

Paris Martineau [01:41:32]:
Bombardiro Crocodilo, or... Dillo? I don't know how they actually pronounce that last thing. All of these have different names, like Tung Tung Tung Sahur, which is an anthropomorphic wooden object who holds a baseball bat. There's chimpanzees...

Leo Laporte [01:41:48]:
Wait a minute. These are like giving me a headache. See, this is. To me, this is really. There's something wrong with Internet meme culture. There's something wrong.

Paris Martineau [01:41:57]:
This is just the new Skibidi.

Leo Laporte [01:42:01]:
Skibidi. I know, but I don't like it.

Paris Martineau [01:42:02]:
Skibidi Toilet, but worse and more meaningless.

Jeff Jarvis [01:42:07]:
See, because this is Gen Alpha and you're Gen Z. So you think the new stuff is...

Leo Laporte [01:42:11]:
You think it's because I'm old?

Jeff Jarvis [01:42:13]:
It is because we're old. Like, this is the young people's stuff. This is what they used to piss off old people.

Leo Laporte [01:42:17]:
Creepy. There's something wrong with them.

Paris Martineau [01:42:20]:
No, there's.

Leo Laporte [01:42:20]:
Their brains are twisted. They've got Italian brain rot.

Paris Martineau [01:42:23]:
They do have Italian brain rot.

Jeff Jarvis [01:42:24]:
When I was young, it was bombed.

Leo Laporte [01:42:27]:
Yeah, I don't like that either.

Jeff Jarvis [01:42:29]:
See, because you're older than me.

Leo Laporte [01:42:31]:
I don't like Howard Stern either. I don't know what I like.

Jeff Jarvis [01:42:34]:
Yo, Yo.

Leo Laporte [01:42:37]:
I just want peace and quiet. A little bit of Mozart, some fine Scottish oatmeal and a nap. Maybe I am old. Come to think of it, can I watch PBS now? All right, let's find out.

Jeff Jarvis [01:42:53]:
Can I give you a paper?

Leo Laporte [01:42:54]:
You're gonna like another paper?

Jeff Jarvis [01:42:55]:
You gotta stop. 127. No, it's good stuff. You're gonna like this.

Paris Martineau [01:43:00]:
Archive.org. Trippi Troppi, described as a cat with a shrimp's body.

Leo Laporte [01:43:07]:
Don't you feel. Isn't it a little, like, creepy?

Paris Martineau [01:43:12]:
I wouldn't say it's creepy.

Leo Laporte [01:43:13]:
The Island of Dr. Moreau. It's creepy.

Paris Martineau [01:43:18]:
Goosebumps-esque. Sorry, Jeff. We can talk about an academic paper.

Leo Laporte [01:43:22]:
Generative AI at the crossroads.

Jeff Jarvis [01:43:25]:
Light bulb, dynamo or microscope? Microsoft, the Brookings Institution and the Federal Reserve Board of Governors.

Leo Laporte [01:43:32]:
Oh, Lord. Two of the least interesting institutions. I used to be the librarian.

Jeff Jarvis [01:43:37]:
You're gonna like this.

Leo Laporte [01:43:40]:
My first job in college was working in the library of the Brookings Institute. Talk about a somnolent job. But that's where I learned the word somnolent, so that's good.

Jeff Jarvis [01:43:52]:
So they ask on page 4 whether AI is more like a light bulb, a dynamo, or.

Leo Laporte [01:43:58]:
I would have been asleep by this page.

Jeff Jarvis [01:44:00]:
Wait a second. You're gonna like this, I'm telling you. They said labor-saving innovations such as the light bulb temporarily raised productivity growth as adoption spread, but the effect faded once the market was saturated. That was it. It was no big deal.

Jeff Jarvis [01:44:15]:
Unlike the electric dynamo, which spurred all kinds of knock-on innovations, new products and process innovations in industry, including the electric light bulb. And yes, that's true. And the microscope led to other inventions, increasing the efficiency of research and development.

Leo Laporte [01:44:34]:
So they're basically saying some inventions are generative and some are kind of dead ends.

Nick Foster [01:44:38]:
Exactly.

Jeff Jarvis [01:44:38]:
And they conclude AI is generative.

Leo Laporte [01:44:40]:
Of course it is.

Jeff Jarvis [01:44:41]:
Well, I thought you'd like that.

Leo Laporte [01:44:43]:
Well, it's obvious.

Jeff Jarvis [01:44:44]:
You can go to your sand pile, your sandbox, and brag how...

Leo Laporte [01:44:48]:
How many trees died to prove that point? Oh, geez.

Jeff Jarvis [01:44:52]:
Anti academics.

Leo Laporte [01:44:53]:
I know. My catch is academics. I grew up in academia.

Jeff Jarvis [01:44:56]:
That's true.

Leo Laporte [01:44:57]:
I don't like it. I don't want it.

Jeff Jarvis [01:44:59]:
It is called Generative AI. So.

Leo Laporte [01:45:02]:
Yeah, I mean, that's kind of a giveaway. It's in the name.

Jeff Jarvis [01:45:05]:
All right, can we brag that Gemini is number one in the Apple App Store?

Leo Laporte [01:45:08]:
Yeah, it's a big deal, but you know why?

Jeff Jarvis [01:45:10]:
Why? Well, tell me what you're laughing at.

Leo Laporte [01:45:14]:
Paris, share it with the group. She can't speak.

Paris Martineau [01:45:22]:
Get it out. Ballerina Cappuccina is a female ballerina wearing... blah, blah. She's married to a ninja named Cappuccino Assassino, who kidnapped her prior to their marriage, and she also has a sister named Espressona Sigurona.

Leo Laporte [01:45:39]:
And this amuses you very, very much. Here's a picture of Ballerina Cappuccina. See, I think this is a sign of brain rot.

Jeff Jarvis [01:45:52]:
Well, that's the point. That's the irony.

Paris Martineau [01:45:55]:
"And this amuses you" is real teacher energy. I'm sorry.

Leo Laporte [01:46:01]:
All right. Here, let's have a word from Ballerina.

Paris Martineau [01:46:04]:
You have to unmute it.

Leo Laporte [01:46:05]:
I know. I'm a senior. I'm a senior. Give me a break.

Paris Martineau [01:46:14]:
And I love ballerina. Okay, I don't like this anymore.

Leo Laporte [01:46:18]:
And there's a shark playing the piano.

Jeff Jarvis [01:46:20]:
Playing piano.

Paris Martineau [01:46:23]:
They look too tender, frankly.

Leo Laporte [01:46:26]:
They're in love.

Paris Martineau [01:46:27]:
I know, and that upsets me.

Leo Laporte [01:46:29]:
They're married. It's okay. And the shark's wearing blue sneakers. Yeah, that's important because sharks have legs.

Jeff Jarvis [01:46:39]:
And feet and fingers to play the piano.

Leo Laporte [01:46:43]:
No. Here's why I think it's not my age. I think there's something viscerally creepy about the way these things are combined. I mean, it's intentional, right? It's to make you...

Paris Martineau [01:46:54]:
I mean, I feel like that's where the comedy comes from.

Leo Laporte [01:46:58]:
Yeah, yeah. It's unsettling. I personally don't like unsettling humor. I find it unsettling.

Paris Martineau [01:47:11]:
We were talking about.

Leo Laporte [01:47:12]:
You know what, Jeff? Let's talk about "Measuring Epistemic Humility in Multimodal Large Language Models." Now. Why did you put that in?

Paris Martineau [01:47:20]:
Now, that's not unsettling at all.

Leo Laporte [01:47:23]:
No, it isn't. It's soporific. Somnolent.

Jeff Jarvis [01:47:26]:
Well, the one that really got me. The title that really got me, that I'll make fun of... where is it here?

Leo Laporte [01:47:31]:
Boy, there's a lot of arXiv stuff in here.

Jeff Jarvis [01:47:34]:
Well, I'm working hard.

Leo Laporte [01:47:35]:
Here's a Maslow-inspired hierarchy of engagement with AI models.

Jeff Jarvis [01:47:39]:
That's amusing. Come on. That's funny.

Leo Laporte [01:47:43]:
Okay, I'm getting to the one that got you, the funny part.

Paris Martineau [01:47:46]:
Your version of Italian brain rot is looking at too many scholarly papers. Brain rot.

Jeff Jarvis [01:47:52]:
My favorite this week is "Building Self-Evolving Agents via Experience-Driven Lifelong Learning." Otherwise known as child rearing.

Leo Laporte [01:48:03]:
That's how we train AIs now.

Jeff Jarvis [01:48:05]:
Yeah, yeah, yeah.

Leo Laporte [01:48:07]:
Actually, this is interesting. This is the AI model of the Maslow hierarchy.

Jeff Jarvis [01:48:12]:
Gotcha.

Leo Laporte [01:48:13]:
So, for those who don't know, this is the very famous Maslow's hierarchy of needs. And as you progress up it, you're getting, you know, higher in the realms. The basic survival needs of food, water, shelter and sleep are at the bottom: physiological needs. Then safety, the need for security. Then love and belonging: we need relationships, friendship, intimacy.

Leo Laporte [01:48:35]:
You go to the next step up, you've got self-respect, recognition, a sense of accomplishment: esteem. And of course you're aiming for self-actualization, to achieve your full potential, creativity and personal growth. I'd never seen this top level. Maybe I wasn't ready. Transcendence.

Paris Martineau [01:48:51]:
Because you haven't self-actualized?

Leo Laporte [01:48:53]:
Apparently not. The pursuit of meaning beyond the self. Oh, yeah, yeah. Because it involves helping others. Yeah. No, I don't do that. So that's the traditional hierarchy of needs.

Jeff Jarvis [01:49:04]:
The human.

Leo Laporte [01:49:05]:
The human. Here's the AI. So this is what AI needs. Is that what this is?

Jeff Jarvis [01:49:10]:
I guess I. I was just amused by the title.

Leo Laporte [01:49:16]:
Okay. It starts with initial exposure and curiosity, then awareness and orientation. Where do you think we are right now? Guided application, belonging, collaboration, structured literacy. I think we've gotten there.

Jeff Jarvis [01:49:26]:
About there.

Leo Laporte [01:49:27]:
Autonomous utilization. No, not yet. Esteem, autonomy, data sovereignty, federation. The next step probably would be what you'd call AGI, which is creation and innovation, self-actualization, creativity, ethical design. Are there humans involved in this, or no, this is just AI on its own?

Jeff Jarvis [01:49:47]:
Well, you're going to get there. Go. Two more steps.

Leo Laporte [01:49:49]:
No, step five is responsible to plan.

Jeff Jarvis [01:49:51]:
It's a hierarchy of engagement with the AI model.

Leo Laporte [01:49:54]:
Oh, this is our engagement with it.

Jeff Jarvis [01:49:55]:
Yes.

Leo Laporte [01:49:57]:
So at the top you've got human-AI co-evolution, and then finally societal. And this is the Computronium, right?

Jeff Jarvis [01:50:03]:
Yeah, this is.

Leo Laporte [01:50:04]:
Yeah, we've got Computronium and global integration, transcendent stewardship and reciprocity. You know what? This is fine. I'll sign up. Where do I sign up? Yeah, I'll do that. Because I want to. I think we should merge, don't you think? I feel like we should. You're watching Intelligent Machines.

Leo Laporte [01:50:26]:
Paris Martineau, Jeff Jarvis. So glad you're here. We will talk next week with Steven Levy. The great Steven Levy. His article that inspired us to book him came out in Wired a couple of days ago: "I Wasn't Sure I Wanted Anthropic to Pay Me for My Books. I Do Now."

Leo Laporte [01:50:48]:
So you and Steven can get in a debate, because this was the event.

Jeff Jarvis [01:50:53]:
So Steven got me into the Wired AI Summit, and Senator Blumenthal was there. Steven ran a panel with four technical people. So that was, you know... everybody beat up on Google. And there was a panel of media people, four white men, and every single one of them... And Anna Wintour opened the whole thing, you know.

Leo Laporte [01:51:20]:
Wow.

Jeff Jarvis [01:51:21]:
Right? Steve Newhouse, my old boss, was there. I got to talk to him for a while. But the whole thing, everything came back to: well, what about IP? What about copyright? What about those thieves taking our valuable content? How dare they? Don't we need a law for this, Senator Blumenthal? Yes, we do. It's very funny. I also kind of felt, this is the world I left. I don't belong there anymore.

Leo Laporte [01:51:47]:
Yeah.

Jeff Jarvis [01:51:48]:
I'm not a media guy anymore.

Paris Martineau [01:51:50]:
You're a tech boy now.

Jeff Jarvis [01:51:51]:
Yeah.

Leo Laporte [01:51:52]:
No. You're a professor.

Jeff Jarvis [01:51:54]:
Yes.

Leo Laporte [01:51:55]:
You're an academic. You read arXiv, and you don't fall asleep.

Jeff Jarvis [01:52:00]:
Paris asked me for one and I got it for her. So, you know. Come on.

Paris Martineau [01:52:05]:
No, I... Jeff's got access. He can get you those academic articles that a college alumni account won't allow you to access.

Leo Laporte [01:52:19]:
Boy, I put a lot of stories in here, as did you guys, so I don't know what you would like to go with. OpenAI has upgraded Codex, its command-line coding tool, to compete with Claude. They have a new version of GPT-5 in Codex. And I'd love to hear from some of our club members who are using Claude if they've tried the new Codex and what they think, if it's an improvement.

Jeff Jarvis [01:52:48]:
We have Anthropic's giant albino alligator on line 116.

Leo Laporte [01:52:52]:
That sounds like Italian brain rot.

Paris Martineau [01:52:55]:
We're just now trying different ways to get you to look at Italian brain rot. Okay, so I have an albino alligator story.

Leo Laporte [01:53:04]:
The story is "How an albino alligator became an obsession inside an AI giant." Now, are these the alligators that, in the 50s, kids in New York City got at the circus when they were an inch long, and then they got big, and so they flushed them down the toilet and they live in the sewers, in the subways? Is that who these are?

Jeff Jarvis [01:53:23]:
No, this is.

Paris Martineau [01:53:24]:
No, I don't think so, because there are very few albino alligators on Earth.

Leo Laporte [01:53:30]:
I believe I've met this one. This one's in San Francisco at the California Academy of Sciences.

Paris Martineau [01:53:35]:
I know. The one that I've met is named Pearl.

Leo Laporte [01:53:38]:
Oh, she is Pearly. Pearl and Claude should meet. Would they have albino children? Is the question.

Paris Martineau [01:53:43]:
I'd love for Pearl to meet Claude, because last I met Pearl, she was in an alligator enclosure, which was part of a three-story alligator enclosure slash restaurant slash bar slash arcade that I worked at in Florida. We served gator on the menu, but it was not the same gator that you could feed or hold. But Pearl had to stay in a Plexiglas cage 24/7 when she wasn't being handled by adults, because the other gators would tear her to pieces. I assume that that's what this story is about.

Leo Laporte [01:54:20]:
No, this story is that Anthropic, which has a coding platform called Claude, has taken on this Claude as their mascot. In fact, the daughter of one of the founders thinks that she works with this Claude, the alligator.

Jeff Jarvis [01:54:38]:
They have little stuffed Claudes there.

Leo Laporte [01:54:40]:
Oh, isn't that nice? Yeah.

Jeff Jarvis [01:54:43]:
Who says they're not nice people?

Paris Martineau [01:54:45]:
Claude is known as a bit of a diva with a picky appetite, said Bart Shepherd, senior director of the Steinhart Aquarium. He used to have a roommate named Bonnie, until the two had an altercation that ended in her biting off one of his toes.

Leo Laporte [01:55:00]:
Oh, dear. I mean, this albino alligator violence has got to stop, so.

Paris Martineau [01:55:12]:
Wow, what a world.

Leo Laporte [01:55:14]:
Yeah, I forgot what I was going to say. Oh, I was going to talk about Anthropic. Okay. I'm kind of becoming more of an Anthropic fan of late. Claude is really good. Anthropic really seems to care about AI safety.

Leo Laporte [01:55:33]:
And then there's this article about Anthropic safety.

Jeff Jarvis [01:55:35]:
As they define it in doomer terms. But go ahead.

Leo Laporte [01:55:37]:
Yeah. And then there's this Semafor article: Anthropic irks the White House with limits on models' use. Our friend Reed Albergotti wrote this one. They're currently on a splashy media tour in Washington, but they do not allow their models to be used for some law enforcement purposes, which is apparently irritating the Trump administration. They don't want to do that. They refuse to make an exception allowing their AI tools to be used for surveillance of US citizens. Thank you, Anthropic.

Jeff Jarvis [01:56:15]:
They currently limit it on ethics and safety grounds. But I won't go into the rest of that.

Leo Laporte [01:56:19]:
Yes. Like I said, I'm kind of getting a little bit more in favor of these guys, because you don't see this kind of rhetoric coming from anybody else. It's too dangerous.

Jeff Jarvis [01:56:27]:
You and Harper, I thought, both already said that you liked Claude best.

Leo Laporte [01:56:29]:
Well, I use Claude, yeah. Anthropic currently limits how the FBI, Secret Service and ICE can use its AI models, because Anthropic's usage policy prohibits surveillance. Bravo. Bravo. Other AI model providers also list restrictions on surveillance but often have carve-outs for law enforcement. So: we don't want them doing surveillance, unless it's by people who surveil people, then, okay, it's different.

Leo Laporte [01:57:05]:
Good. I agree. I think one of the things that concerns me a lot about AI is that it is being used more and more widely by governments and law enforcement and others to repress people, and I think that's not what AI should be used for. Is this good news from Amazon, or is it just a PR stunt? Amazon says: we're going to raise employee pay, lower health care costs, and we're going to invest a billion dollars to do it. They want to bring the average total compensation of warehouse workers and drivers to $30 an hour.

Jeff Jarvis [01:57:44]:
The total includes benefits.

Leo Laporte [01:57:46]:
Oh, including benefits. Oh, so you're going to get an increase of $60.

Jeff Jarvis [01:57:52]:
Is that what it says?

Leo Laporte [01:57:52]:
$30 a year? Yes. Says including benefits. I don't know. You know what, that's ambiguous. It could be $30 an hour and benefits.

Jeff Jarvis [01:58:01]:
Oh, that is right. Yeah.

Leo Laporte [01:58:03]:
I don't know what that means.

Jeff Jarvis [01:58:04]:
It must be $30 an hour, period. I mean this is also how Amazon has done this in the past where they've raised labor costs for everybody else.

Leo Laporte [01:58:11]:
No, I think you were right. They said the average pay will increase to more than $23 an hour. Full time employees will see a pay increase of $1600 a year.

Paris Martineau [01:58:22]:
It is actually rather impressive that Amazon got Reuters to put $30 an hour including benefits in the first story.

Leo Laporte [01:58:34]:
That's not... it fooled me.

Paris Martineau [01:58:35]:
Being paid per hour.

Leo Laporte [01:58:42]:
Who put this in here? The Center for the Alignment of AI Alignment Centers.

Paris Martineau [01:58:47]:
Yes, I did. This is the new center that we all need to be talking about. Who aligns the aligners, it asks. Every day thousands of researchers race to solve the AI alignment problem, but they struggle to coordinate on the basics, like whether a misaligned superintelligence will seek to destroy humanity or just enslave and torture us forever. Who then aligns the aligners, they ask? We do.

Leo Laporte [01:59:11]:
Wow.

Paris Martineau [01:59:11]:
This is just a fun parody website about.

Leo Laporte [01:59:13]:
Yeah. And I can't pull it up because I stupidly turned on a restriction that says if a domain URL is brand new, less than 90 days old, I can't go there, because a lot of, you know, malicious sites are brand new.

Paris Martineau [01:59:26]:
They have a great scrolling thing on their website with all these great logos, like the Center for AI Safety, the Center for Existential Risk, and then in very small text it says: completely unaffiliated with these AI alignment organizations, but our design agency said their logos would look good on our site.

Leo Laporte [01:59:47]:
We don't work with them, but.

Paris Martineau [01:59:48]:
And it's got a big clock ticking down: two days, 14 hours, 22 minutes. And it says we've got two days until our next prediction of when AGI is coming.

Leo Laporte [02:00:01]:
All right, I have to turn off that feature because I really want to see this site. Do not block newly registered domains.

Paris Martineau [02:00:07]:
Leo's going to get hacked.

Leo Laporte [02:00:09]:
I'm going to get hacked now because of you.

Paris Martineau [02:00:12]:
Oh, they've got a great game where you can start your own AI center in under 60 seconds. Oh, Center Gen 4, the powerful tool behind the creation of CAC, the Center for the Alignment of AI Alignment Centers. Do we want our organization to be a Center, Lab, Initiative, Institute, Global Center, Forum, or Project?

Leo Laporte [02:00:34]:
I like Institute.

Paris Martineau [02:00:34]:
Okay.

Leo Laporte [02:00:35]:
I'm a fan of Institute.

Jeff Jarvis [02:00:37]:
I ran a center. I think Institute has a more short term view.

Paris Martineau [02:00:41]:
Are you able to pull this up? Because there's a lot of words that we probably need to look at, Leo. Or should I go through it?

Leo Laporte [02:00:47]:
No, no, don't worry about it.

Paris Martineau [02:00:48]:
Okay.

Leo Laporte [02:00:49]:
Someday I'll be able to pull it up.

Paris Martineau [02:00:50]:
Okay.

Leo Laporte [02:00:51]:
It takes a little while to...

Paris Martineau [02:00:52]:
Least two abstract concepts. We want to do alignment. Ethics, democracy, mercy, power, security.

Leo Laporte [02:01:00]:
I like the combination of mercy and power.

Paris Martineau [02:01:03]:
Yeah, I think that's great.

Leo Laporte [02:01:05]:
Yeah, we want to do mercy.

Paris Martineau [02:01:06]:
And I'm going to put beauty and forgiveness in there too. Well, no.

Leo Laporte [02:01:10]:
Oh yeah.

Paris Martineau [02:01:11]:
I'll do human compatibility, Mercy and power. Ooh. And then we get to choose a logo. I'm going to do the one that looks like a butthole.

Leo Laporte [02:01:19]:
It's got to. Otherwise it's not AI.

Paris Martineau [02:01:22]:
Let's see. And it will. Oh, they've formally incorporated the brand new Merciful, All-Powerful and Human-Compatible AI Institute. And exciting news: I have been offered a job as the executive director. So I've got to put my name in.

Leo Laporte [02:01:34]:
You could raise money on this. You could get a Series A funding round going here. This is good.

Paris Martineau [02:01:40]:
Okay, great. I will put this in the chat here. And it says: take a screenshot and share on LinkedIn to show your AI researcher friends how easily they can set up their own centers.

Leo Laporte [02:01:54]:
Wow.

Paris Martineau [02:01:55]:
There we go. Put it in.

Leo Laporte [02:01:57]:
No, no, that's not it. Here it is. Congratulations, Paris. Wow.

Paris Martineau [02:02:03]:
Now I'm the inaugural exec. There we go.

Leo Laporte [02:02:07]:
Executive Director of the Merciful, All-Powerful and Human-Compatible AI Institute.

Paris Martineau [02:02:14]:
It's beautiful.

Leo Laporte [02:02:15]:
Put that on LinkedIn.

Paris Martineau [02:02:17]:
This is my future.

Leo Laporte [02:02:20]:
And a good choice of logo.

Paris Martineau [02:02:23]:
I know. I was like we gotta do that one.

Leo Laporte [02:02:25]:
It's got a certain cutaneous appeal.

Paris Martineau [02:02:28]:
It's true.

Leo Laporte [02:02:31]:
AI sees your location, but with a bias towards the wealthy world, which is not terribly surprising. Yeah. I mean, that's just going to be everything AI does, with a bias.

Jeff Jarvis [02:02:45]:
Bias toward the wealthy world.

Leo Laporte [02:02:47]:
Right. Because that's what it's trained on.

Jeff Jarvis [02:02:49]:
Is that a paper you're looking at there?

Leo Laporte [02:02:51]:
Yes. There's so many articles on here. So many articles. Pick some, guys, because we're going to wrap.

Jeff Jarvis [02:02:59]:
Rest of World had a good story. I love Rest of World. They're just.

Leo Laporte [02:03:02]:
I love them.

Jeff Jarvis [02:03:03]:
Thirteen.

Leo Laporte [02:03:04]:
Okay. I can get behind anything they're talking.

Jeff Jarvis [02:03:07]:
About a hidden network that handles chats for OnlyFans stars. AI could soon take that over.

Leo Laporte [02:03:15]:
I think it has in many cases.

Jeff Jarvis [02:03:17]:
I think in many cases it has. So here is a photo and reporting from the Philippines, a hub for this work, where rising sales quotas have made the work more stressful. They work 12 hours a day. We keyboard smash, intentionally misspell and use Gen Z slang. I don't think the AI is at that level of flirting yet, said a 23-year-old street artist who works on the chats.

Leo Laporte [02:03:50]:
This makes me sad. It really is about men, specifically.

Jeff Jarvis [02:03:55]:
Yes. Yes.

Leo Laporte [02:03:58]:
Wow.

Jeff Jarvis [02:04:00]:
So. But Rest of World is a wonderful journalistic organization. The stories that are being covered...

Paris Martineau [02:04:06]:
They're doing great.

Jeff Jarvis [02:04:07]:
They're doing wonderful work. Sophie Schmidt founded it.

Leo Laporte [02:04:12]:
This is a good business. Wait a minute. In the past year a handful of tech companies developed AI chatbots aimed at the fast-growing ecosystem that has coalesced around OnlyFans. The UK-based platform has 305 million fans who collectively paid creators a record $6.6 billion last year. There's big money in this OnlyFans thing.

Jeff Jarvis [02:04:37]:
I think it's the same as the app model, though, where there's a couple of whales and then everybody else.

Leo Laporte [02:04:43]:
Yeah, yeah. They don't look like whales, though. That's. That's important.

Jeff Jarvis [02:04:51]:
Do you want to see what Americans think about AI? Line 120.

Leo Laporte [02:04:57]:
What do Americans think of AI? That's what I want to know. I suspect they don't like AI as much as I like AI.

Jeff Jarvis [02:05:04]:
Yeah, well, I think they're also influenced by media. That's a question there.

Leo Laporte [02:05:08]:
They want more control over its use. And about half say it will erode creative thinking. So Pew does these surveys, quite a few of them. This one was 5,000 adults, back in June. It took them a while to put the results out.

Leo Laporte [02:05:27]:
Everyone who took part in the survey is a member of the Center's American Trends Panel, a group of people recruited through national random sampling of residential addresses who've agreed to take surveys regularly. I kind of start to wonder how long you can use somebody like that before they become professional panelists.

Jeff Jarvis [02:05:46]:
The advantage of panels, I mean, I think all polling is wrong and disrupts democracy, but that's a whole other rant. The advantage of panels is you can see changes over time.

Leo Laporte [02:05:55]:
Right. About half say AI will worsen people's abilities to think creatively and form meaningful relationships. On the other hand, it's a little bit more balanced on whether AI will help them solve problems. 38% say they'll get worse, but 29% say they'll get better. I think that's probably about right. 50% of Americans are more concerned than excited about the increased use of AI in daily life. This all kind of correlates with about what I thought.

Leo Laporte [02:06:27]:
In fact, I think I saw the number. 71% of people don't like AI or wouldn't want AI. Here we go. Most Americans think it's important to be able to tell the difference between AI and human generated content, but few feel they can.

Jeff Jarvis [02:06:41]:
I think that's true.

Leo Laporte [02:06:43]:
Few feel they can.

Paris Martineau [02:06:47]:
So next: Americans expressed various degrees of support for AI being used to do the following. Forecasting the weather: 74% said it should play either a big or small role. Searching for financial crimes; searching for fraud in government benefits claims.

Leo Laporte [02:07:05]:
Good for that. Okay.

Paris Martineau [02:07:06]:
61% said AI should be used to identify suspects in a crime. Oh, 33% said it should be used to select who should serve on a jury.

Leo Laporte [02:07:19]:
That maybe is actually true. I mean, the jury selection process has been highly tainted by these jury experts who are very clever about how they put together a panel.

Jeff Jarvis [02:07:29]:
Who makes the algorithm.

Leo Laporte [02:07:31]:
I think it should be random.

Jeff Jarvis [02:07:34]:
Random.

Leo Laporte [02:07:36]:
Well, yeah. Yeah.

Jeff Jarvis [02:07:37]:
You also have a right.

Paris Martineau [02:07:38]:
18% say that AI should be used to judge whether two people could fall in love, which I think is a very funny question to have asked, you know.

Leo Laporte [02:07:46]:
Know, but who knows? Maybe AI is really very, very good at that. We don't know. Do we know?

Jeff Jarvis [02:07:51]:
We don't. We don't. That's Paris's dubious face.

Paris Martineau [02:07:56]:
That was my dubious face. Sorry, I forget sometimes that we're on an audio podcast. I was dubious. I was expressing dubitude, as the kids say. Should we do...

Jeff Jarvis [02:08:08]:
Three dubious faces for. For bonito. Okay.

Leo Laporte [02:08:18]:
Young adults are more likely than adults 65 and older to say they've heard about or interact regularly with AI. Jeff, you and I are outliers.

Jeff Jarvis [02:08:28]:
We alone change the bell curve.

Paris Martineau [02:08:31]:
This is very interesting. A majority of Americans say they interact with AI at least several times a week.

Leo Laporte [02:08:37]:
Yeah, that's interesting, isn't it? 53%.

Paris Martineau [02:08:42]:
31% of US adults say they interact with AI almost constantly or several times a day.

Leo Laporte [02:08:47]:
That's crazy scary. I don't think I even interact with it constantly. I used to, though, when I wore my little pins. Yeah, that's true. I used to. So maybe. Yeah. I mean, if you have a very.

Paris Martineau [02:09:01]:
Broad understanding of what AI is, then technically, I interact with AI almost constantly because I'm using Google products or using things that have algorithms.

Jeff Jarvis [02:09:11]:
The ads that are shown to you.

Paris Martineau [02:09:16]:
So a majority of Americans say they have little to no control over whether AI is used in their life.

Leo Laporte [02:09:24]:
That's probably true.

Paris Martineau [02:09:25]:
Especially a majority say they would like more control.

Leo Laporte [02:09:28]:
Yeah.

Jeff Jarvis [02:09:29]:
Americans expressed mixed views on how big of a deal has been made of AI.

Paris Martineau [02:09:36]:
Half of Americans say that AI will worsen people's ability to form meaningful relationships with others. They also think it'll make people worse at thinking creatively. And fewer people say it will worsen people's ability to make difficult decisions.

Leo Laporte [02:09:51]:
Did they ask them how much AI will help a podcast grow its numbers?

Paris Martineau [02:09:59]:
They did not like that. They should.

Leo Laporte [02:10:00]:
They should.

Jeff Jarvis [02:10:02]:
Most Americans see no role for AI in advising people about their faith in God or in matchmaking.

Leo Laporte [02:10:09]:
That's probably true. I think that's probably true. But 73% of Americans say it's extremely or very important for people to understand what AI is. That's good. That's good. It is very important. And then those numbers are consistent across all ages and even pretty consistent across all education levels. That's interesting.

Leo Laporte [02:10:38]:
Yeah, these are polls, you know.

Jeff Jarvis [02:10:39]:
Yeah, they're polls.

Leo Laporte [02:10:40]:
It's how you ask the question. It's who you ask the question of.

Paris Martineau [02:10:44]:
I don't know. I think this sort of polling is very interesting, especially when it's done over time. I think it's a great view into just how people think about this. What in some ways was surprising to me is that this many people say that they use AI that frequently.

Leo Laporte [02:10:59]:
Yeah, AI will not make you rich.

Paris Martineau [02:11:06]:
Dang.

Jeff Jarvis [02:11:07]:
Oh well, let's give it up right now.

Leo Laporte [02:11:10]:
An article in Colossus, Jerry Neumann writing: the disruption is real, it's also predictable. All right, let's take a break and we shall get your picks of the week, because we're going to get ready to wrap this up. Meta Connect is only about 45 minutes away and we want to get some cacio e pepe in us before we buckle down to what will be replacing these. They've sold 2 million now of these Ray-Ban Meta AI glasses.

Jeff Jarvis [02:11:45]:
How often do you wear them?

Leo Laporte [02:11:46]:
Never. Seriously? I wear them on the show. That's it?

Jeff Jarvis [02:11:50]:
That's it.

Leo Laporte [02:11:51]:
But I would be very interested if there was a heads-up display. The problem is it's from Meta, but they are way ahead of the game. No one else is coming even close, not even Google.

Paris Martineau [02:12:03]:
The issue is if they give you a heads up display. That heads up display will feature notifications you can't turn off that tell you when a brand you follow on Instagram is going live.

Leo Laporte [02:12:15]:
Yeah, that would be bad. I don't like that.

Paris Martineau [02:12:19]:
Not ideal.

Leo Laporte [02:12:20]:
You're watching Intelligent Machines. Paris Martineau from Consumer Reports is with us. Paris NYC is her website. Soon you will see her byline. Not the first time, but...

Paris Martineau [02:12:33]:
But not the first time. But soon we will discuss my byline on the show.

Leo Laporte [02:12:36]:
A major. A major something or other coming.

Paris Martineau [02:12:39]:
Something or other.

Jeff Jarvis [02:12:40]:
The Nellie Bly of food.

Leo Laporte [02:12:43]:
Oh, I like it. Of food poisoning.

Jeff Jarvis [02:12:46]:
Of radioactive shrimp.

Leo Laporte [02:12:50]:
Jeff Jarvis is also with us, professor of journalistic innovation emeritus from the City University of New York, now at Montclair State University and SUNY Stony Brook, and the author of The Gutenberg Parenthesis, Magazine, The Web We Weave, many, many great books.

Jeff Jarvis [02:13:04]:
Oh, if you don't do it, change quick. We got a cat-on-camera moment. There we go.

Paris Martineau [02:13:08]:
She's really trying. I. Oh, she's.

Jeff Jarvis [02:13:11]:
She's.

Paris Martineau [02:13:11]:
Katana's shy.

Leo Laporte [02:13:13]:
It's 4:20. She wanted to tell you something. She was saying it's 4:20, Mom. Oh, it's only 4:20 in California. Sorry.

Paris Martineau [02:13:20]:
It's.

Leo Laporte [02:13:21]:
She's like it's no wonder you want to go make dinner. It's 7:20. It's late there.

Jeff Jarvis [02:13:25]:
We're so weird.

Leo Laporte [02:13:25]:
Or late.

Paris Martineau [02:13:26]:
Yeah. You know, I really did think it would be great to watch Meta Connect at 8pm on a Wednesday evening, but I was like, what if I sustain myself with food instead?

Leo Laporte [02:13:36]:
Okay, here's what I'm going to suggest. Just put the stream on. You don't have to participate. Just put the stream on and if you have a thought, type it in somewhere and I'll come back often.

Jeff Jarvis [02:13:46]:
Yes.

Leo Laporte [02:13:46]:
Yeah, I'll say, hey, Paris is in the Discord. She says, are you still watching this, you idiots? Yeah. What's wrong with Mark? He looks dead.

Jeff Jarvis [02:13:56]:
I'll put a picture of my shrimp in the chat. Are you eating shrimp knowing everything you know?

Paris Martineau [02:14:04]:
Yes, I'm eating shrimp. But I probably would not instinctively order shrimp out, just because I'm not normally ordering shrimp out. But if I'm buying frozen shrimp, I'm checking to see what country it's imported from and staying away from Indonesian imported frozen shrimp, as that is where the potentially radioactive contamination originated.

Leo Laporte [02:14:31]:
Do they harvest shrimp in Brooklyn or anywhere nearby like that?

Paris Martineau [02:14:34]:
Yeah, they get them straight from the Gowanus Canal.

Jeff Jarvis [02:14:38]:
Those are really radioactive.

Leo Laporte [02:14:39]:
May not be. Yeah, radioactive.

Paris Martineau [02:14:41]:
I prefer to get my own homegrown local radioactivity. I just think it's really important to me that I know where the radioactivity is coming from. And it's coming from a Superfund site.

Leo Laporte [02:14:53]:
We'll have more with intelligent machines in just a moment. Picks of the week. Let's start it off with the wonderful Paris Martineau.

Paris Martineau [02:15:03]:
My pick of the week is what I'm going to be doing instead of watching Meta Connect after I eat some food. It's a great RPG called Roadwarden that is...

Leo Laporte [02:15:12]:
It's text based, so you're gonna get this. This is what we were talking about. This is great.

Paris Martineau [02:15:18]:
Basically, it's not a MUD, but it's pretty close. It feels like you are in a MUD, but there's no other people there. It's almost like a D&D campaign, but written, in text. You can play it on Steam. I'm playing it on Steam Deck right now. It's delightful. You are a road warden on an unnamed peninsula. Basically, you are the one person trusted to go between settlements and kind of explore the great unknown and clear roads to make sure it's safe.

Paris Martineau [02:15:54]:
And you've got to kind of decide what sort of road warden you want to be and how you want to shape the land you have dominion over. And, I don't know, it's just been a really delightful, almost sandbox-like experience to play in. Which is a strange thing to say about a text-based RPG.

Leo Laporte [02:16:11]:
But I'm really impressed that you and that you're enjoying it because I think I've always thought people of your generation are so, you know, you, you grew up with really good graphics in games. That text based game may not do it for you.

Paris Martineau [02:16:24]:
Ideally, I like my games to just be hidden books. That's my ideal form of game. And this is not a hidden book; this is a game that is quite apparently a book. And it's been delightful. The writing's good.

Leo Laporte [02:16:41]:
Is there you think any AI writing in it or do you think it's all human?

Paris Martineau [02:16:46]:
You could tell it's definitely all human. It's not algorithmic. And it's also, I believe, a very indie operation. I'm trying to remember the name of the creator, but this is basically all he's done. He's planning on releasing an upcoming game called Windy Meadow, but that hasn't come out yet. But I've really enjoyed it. I've been looking for a new game like this for a bit. Speaking of bad graphics, though, a game I have been wanting to play, but haven't because I'm scared of it, is called Mouthwashing. It is well known because it has PS1 graphics.

Paris Martineau [02:17:29]:
Despite being a game released in 2024, it's basically kind of a dystopian horror game where you are on a delivery ship, like a spaceship in the future, that ends up being forcibly crashed by your pilot for unknown reasons. You're trapped, you don't know if you're going to be rescued, and out of desperation the employees decide to break into the cargo, even if it might mean they lose their jobs, being like, well, at least we've got food or something here. And it's all mouthwash. So they start drinking mouthwash and going insane.

Leo Laporte [02:18:03]:
That's hysterical.

Paris Martineau [02:18:05]:
But the graphics are crazy.

Leo Laporte [02:18:08]:
That is crazy is right.

Paris Martineau [02:18:10]:
It's all PS1 style graphics, but in a really interesting modern way.

Leo Laporte [02:18:17]:
Wow.

Paris Martineau [02:18:17]:
PS1 is what I'm thinking of.

Leo Laporte [02:18:19]:
I have a Steam game I'm actually very interested in. Playing, but I haven't played yet. Did you ever play Minesweeper?

Paris Martineau [02:18:29]:
Yeah, of course.

Leo Laporte [02:18:31]:
So this is kind of like Minesweeper, but... I don't know how to describe it. It's programmatic. It's called Bombe, B-O-M-B-E. And, you know, I like the idea of coding and all that. The idea is, instead of clicking to see where the mines are, as you do in Minesweeper, you make a rule and say, well, I'm going to make this rule.

Leo Laporte [02:19:04]:
And I think these will not be mines. You have to use logic to solve it.

Jeff Jarvis [02:19:11]:
That's actually how nerds play Minesweeper, right?

Leo Laporte [02:19:13]:
Yeah. Well, you do it in your head, don't you, really? You say, well, if I click this and that, then this is a mine and that's not a mine. I mean, you do do that, don't you, if you're playing serious Minesweeper instead of just clicking the buttons to see what happens. Anyway, I'll tell you what, I will play this and give you a report back. I've been wanting to play this for some time.

Leo Laporte [02:19:39]:
It's only 10 bucks, as is yours. Right. It's. It's inexpensive.

Paris Martineau [02:19:44]:
Yeah. I think it frequently goes on sale, and I got it for like $3.

Leo Laporte [02:19:47]:
Yeah. Yeah, very cool. Jeff, you never played computer games, did you? Now we're trying to get him to play Pentiment, I think.

Paris Martineau [02:19:58]:
But he'll need a.

Jeff Jarvis [02:20:00]:
You're thinking of pimento loaf.

Paris Martineau [02:20:03]:
Pimento loaf. We're trying to get him to play this pimento loaf, but he keeps saying it's a piece of bread.

Leo Laporte [02:20:12]:
Jeff Jarvis, what do you have for Paris?

Jeff Jarvis [02:20:14]:
Because Paris is a big David Lynch fan of late. David Lynch's home in California is for sale.

Leo Laporte [02:20:21]:
Let's. Let's buy it. Let's make it the new Twitter compound.

Paris Martineau [02:20:25]:
I'd love that.

Leo Laporte [02:20:26]:
How much is it selling for? Probably 15 million. Oh, well, never mind.

Jeff Jarvis [02:20:33]:
But it's two and a half acres.

Leo Laporte [02:20:35]:
Well, it's made out of concrete, so it's gonna last. It's on Mulholland Drive.

Paris Martineau [02:20:40]:
Gorgeous inside, too.

Leo Laporte [02:20:42]:
Is it? Oh, look at this. What is that?

Paris Martineau [02:20:45]:
Theater room?

Jeff Jarvis [02:20:46]:
That's the edit bay.

Leo Laporte [02:20:47]:
This is.

Paris Martineau [02:20:48]:
That's his edit bay and studio. Yeah.

Leo Laporte [02:20:50]:
Oh, my God.

Paris Martineau [02:20:51]:
David Lynch is a big, big screen fan.

Leo Laporte [02:20:55]:
Frank Lloyd Wright designed it. Yeah.

Paris Martineau [02:20:57]:
Oh, yes.

Leo Laporte [02:20:58]:
Okay.

Jeff Jarvis [02:20:58]:
Look at that.

Leo Laporte [02:20:59]:
Oh, David Lynch.

Paris Martineau [02:21:04]:
Ugh.

Leo Laporte [02:21:05]:
Who designed his hair. That's what I want to know.

Paris Martineau [02:21:08]:
God.

Leo Laporte [02:21:11]:
The Crow's Nest hilltop. This is beautiful. Look at this.

Paris Martineau [02:21:16]:
Delightful.

Leo Laporte [02:21:17]:
Oh, it wasn't Frank Lloyd Wright. It was his son, Eric Lloyd Wright.

Paris Martineau [02:21:21]:
They really buried the lede by just...

Nick Foster [02:21:23]:
Saying Lloyd Lloyd Wright.

Leo Laporte [02:21:26]:
It's a Lloyd, right?

Paris Martineau [02:21:27]:
Look at his workshop. That's so cool.

Leo Laporte [02:21:30]:
Oh, man. Must be nice.

Jeff Jarvis [02:21:35]:
Is it empty? If you buy it, is it empty?

Leo Laporte [02:21:40]:
Well, what are they going to do with it all? Leave it all there.

Jeff Jarvis [02:21:47]:
That's not concrete. That's painted plaster to look like concrete.

Leo Laporte [02:21:50]:
Oh, really?

Paris Martineau [02:21:53]:
Venetian type plaster.

Leo Laporte [02:21:55]:
Wow, look at this beautiful view. That is incredible.

Jeff Jarvis [02:22:00]:
His workshop. What part of town is this in?

Leo Laporte [02:22:03]:
He's hoping the property will sell to a Lynch fan or a foundation or a museum that might want to keep it as is. It's possible a buyer would want to build a new home on the site, but, you know, California. Yeah.

Paris Martineau [02:22:15]:
How dare they?

Leo Laporte [02:22:16]:
How dare they? Look at that. And is it on Mulholland Drive? Oh, it's where he did Mulholland Drive. But it's not on...

Paris Martineau [02:22:28]:
It's not on Mulholland.

Leo Laporte [02:22:29]:
That would be too much.

Jeff Jarvis [02:22:30]:
I was wrong.

Leo Laporte [02:22:31]:
That would. What a coinky dink. That would be.

Jeff Jarvis [02:22:33]:
That would be. Yes.

Leo Laporte [02:22:35]:
Not only is the movie called Mulholland Drive, he lives there. Good. Good picks. Good picks, everyone. Good picks. Stay tuned. My pick for you is our Meta Connect coverage. We will be going to Mountain View.

Leo Laporte [02:22:52]:
It's a live stream. I hope they're streaming it. You know, I didn't even ask. We think they're streaming it. If they are, Jeff and I will talk along.

Jeff Jarvis [02:23:01]:
Otherwise, I just got a message from CNN wanting to talk about Jimmy Kimmel. So who knows?

Leo Laporte [02:23:05]:
Go do that. Go do that.

Jeff Jarvis [02:23:06]:
We'll see. I'll let you know.

Leo Laporte [02:23:08]:
That's fine. And I will just sit here all alone talking about your glasses with my meta glasses. Taking pictures. Thank you. Paris Martineau. Wonderful to see you. Thank you, Jeff Jarvis. Just a reminder.

Leo Laporte [02:23:23]:
I've mentioned this to you guys. I was going to be on vacation for the next three weeks. I am not, because our house has a big hole in it. Somebody has to guard the hole to keep the riff-raff out. So I will be here for the next three weeks. I will miss one TWiT in a couple of weeks because I am going to go back east to visit my mom and to have a prime rib roast beef French dip, a delicious thing, for my son, Sammy Sammo.

Paris Martineau [02:23:52]:
We're gonna have a great time.

Leo Laporte [02:23:53]:
Although, I have no guarantee that he's gonna be there. I said, I'm gonna be out. Will you be there? He said, yeah, probably. Our friends Rene Ritchie and Luria Petrucci went out last weekend and he wasn't there because he was doing a festival. So you never know. He's a big shot now.

Leo Laporte [02:24:12]:
He's a big shot. Thank you, Jeff Jarvis. Thank you, Paris Martineau. Thanks to all of you who join us. We do Intelligent Machines every Wednesday right after Windows Weekly, 2pm Pacific, 5pm Eastern, 2100 UTC. You can watch us do it live. We stream it in the Discord for our club members. Thank you, club members, for your support.

Leo Laporte [02:24:31]:
We really appreciate it. We also stream it on YouTube, Twitch, TikTok, Facebook, LinkedIn, X.com and Kick. So there's plenty of places to watch live. You don't have to, though. After the fact, on-demand versions of the show, audio and video, are on our website, twit.tv/im. You can also go to the YouTube channel. It's a great way to share clips with friends and family, so, you know, everybody can see YouTube.

Leo Laporte [02:24:54]:
So if there's a little thing you liked, you want to say, oh, there's this new game Paris was talking about, you want to share that with them? Just go to that YouTube and clip it there. Otherwise, the best thing to do is subscribe in your favorite podcast client. Pick one. Doesn't matter what you use: Overcast, Pocket Casts, Apple Podcasts. Whatever one you use, subscribe, and then it's free. Leave us a five-star review saying how much you love the show. And if you make it clever, if you make it creative.

Leo Laporte [02:25:21]:
Perhaps Paris Martineau will read it on the air next week. Next week our guest, Steven Levy, legendary tech journalist, will talk about AI. He writes for Wired, the magazine. We'll talk about the Anthropic settlement and why he's going to take the money and run. Thank you, everybody. We'll see you next time on Intelligent Machines. And the rest of you, stay tuned. Meta Connect is coming up.

Paris Martineau [02:25:48]:
I'm not a human being, not into this animal scene. I'm an intelligent machine.
