Intelligent Machines 863 transcript
Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.
Leo Laporte [00:00:00]:
It's time for Intelligent Machines. Jeff Jarvis is here, Paris Martineau our guest, old friend Marshall Kirkpatrick. He's been doing tech reporting since way back when. He's got a new app, it's actually an extension for your browser that will help you analyze the content of pages. It's very cool. He's also got a prompt he's gonna give away that you will like. Plus we'll talk about all the news including the big decision in that social media case in Los Angeles. Intelligent Machines is next.
Leo Laporte [00:00:29]:
Podcasts you love from people you trust. This is TWiT. This is Intelligent Machines with Paris Martineau and Jeff Jarvis, episode 863, recorded Wednesday, March 25th, 2026. Fire and Ash. It's time for Intelligent Machines, the show that covers the latest in AI, robotics, and all those smart little things all around us all. Ladies and gentlemen, I give you for your entertainment and education, the wonderful Paris Martineau, investigative reporter. Huzzah! Huzzah for Consumer Reports.
Paris Martineau [00:01:12]:
Huzzah! Huzzah!
Leo Laporte [00:01:15]:
There's a very funny TV show about Peter the Great or Catherine the Great or one of the greats of Russia.
Paris Martineau [00:01:22]:
It's a very funny TV show about someone who was great.
Leo Laporte [00:01:24]:
Someone who was great. Who's great? And they shout huzzah all the time.
Paris Martineau [00:01:28]:
Huzzah! Hey, shout out to that TV show.
Leo Laporte [00:01:30]:
I'm going to find the show because you would love it. It's very funny. I think it's called The Great, but I might be wrong. Anyway, hello, Paris.
Paris Martineau [00:01:40]:
Hello, Leo.
Leo Laporte [00:01:41]:
I feel like I haven't seen you in a long time.
Paris Martineau [00:01:43]:
It's true. I wasn't here last week.
Leo Laporte [00:01:45]:
That's it.
Paris Martineau [00:01:45]:
I heard you guys had a lot of fun.
Leo Laporte [00:01:46]:
We missed you. Well, welcome back.
Paris Martineau [00:01:48]:
I was in the office.
Leo Laporte [00:01:50]:
You were working. You have a day job.
Paris Martineau [00:01:52]:
Yeah, I had a day job and I had to go to an after-work happy hour. And, you know, it's—
Leo Laporte [00:01:56]:
Wait a minute.
Paris Martineau [00:01:57]:
Lovely. Wait a minute. Well, you know, I had to go to all my co— with all my coworkers and hang out with them in person and, you know, do a little thing. But it's wonderful doing this at a company where the average age is not 21, because then the happy hour ends at— I closed it down at 7:30 and I was like, what a delight. Because normally I'm like, I'll leave by like 8. I don't need to be the last one there. I can have a respectful beer and a half and go home. But it was wonderful.
Jeff Jarvis [00:02:27]:
Well, Paris just hangs out with old people. That's it.
Leo Laporte [00:02:30]:
Yeah, normally it's probably better. Yeah, it's probably better if she not hang out so much with her grandpas like me and Mr. Jeff Jarvis, professor emeritus of journalistic innovation at the Craig Newmark Graduate School of Journalism at City University of New York, author of The Gutenberg Parenthesis, now in paperback. You can get that and Magazine, and preorder his new book, Hot Type, at jeffjarvis.com.
Jeff Jarvis [00:02:57]:
And editing Intelligence: AI and Humanity for Bloomsbury.
Paris Martineau [00:03:02]:
Oh my God, I need to say congratulations in person, Jeff. I'm sad that I missed the launch.
Leo Laporte [00:03:08]:
And the interview, the great interview we did with Romain, his first author. You would have really liked that.
Paris Martineau [00:03:13]:
Jeff was telling me this. I mean, I'm just, I'm really excited to read it. Congrats.
Leo Laporte [00:03:18]:
Yeah, really good news. By the way, the TV show is called The Great.
Jeff Jarvis [00:03:22]:
That tracks.
Leo Laporte [00:03:26]:
And I highly recommend it. You really should. It's a British dark comedy.
Jeff Jarvis [00:03:31]:
Can you get a huzzah on your soundboard so we can put it in regularly?
Leo Laporte [00:03:36]:
I should. I'll get the huzzah from it. And it's very funny. It's really good. I don't know, it's kind of ahistorical, but it's very funny. Hey, we've got a wonderful guest this week who's going to stick around because he's a—
Jeff Jarvis [00:03:49]:
We tried to warn him. We tried to tell him.
Leo Laporte [00:03:51]:
He said it could go on.
Jeff Jarvis [00:03:52]:
Yeah.
Leo Laporte [00:03:53]:
But he said no. Marshall Kirkpatrick is here, longtime tech journalist, good friend for many years. I don't know how long it's been since you've been on TWiT, but in the early days we had you quite a bit. He was the first writer at TechCrunch, ladies and gentlemen, and kind of co-edited and created in many ways ReadWriteWeb. You may remember him from that for many years. Uh, he had a little social graph influencer discovery platform called Little Bird. Actually, you've become kind of an entrepreneur, haven't you, Marshall?
Marshall Kirkpatrick [00:04:27]:
I, I have. My, my wife struggles to explain what I do to people, and I said, why don't we say serial entrepreneur at this point?
Leo Laporte [00:04:34]:
Sunflower News, Headline.com, uh, AI Time to Impact. You're still writing that. That's a newsletter about AI, so right up our alley here. And your latest is an AI-powered browser extension that I really think is a great idea. It's called What's Up With That? And what you— the idea is you're browsing around reading articles and you press a button and it tells you what?
Marshall Kirkpatrick [00:04:58]:
Oh, tells you so many things. First thing it does is it tells you what's genuinely new in the article you're reading relative to the state of the art in that field.
Leo Laporte [00:05:08]:
That's useful because a lot of times there ain't anything.
Marshall Kirkpatrick [00:05:10]:
Yeah, exactly. Uh, it says, all right, here's, uh, here's the pattern and here's the anomaly. And, uh, and then it does stuff like it remembers everything you've analyzed in the past, and it's scanning the web all day and all night too to look for connections it can make between what you're reading, what you used to read, what you haven't read yet, and your work projects that you've identified. And then it's got a whole bunch of mental models and structured analytical techniques that you can say, "Would you analyze this article for me or this video or what have you?" And it'll say, "Yeah, you should put it in historical context, find its upcoming events in the industry," you know, and 4 or 5 other things. And then it just goes and does a little agentic research process for you and then distills it all down and says, "All right, here are the key points for your research and work."
Leo Laporte [00:06:07]:
I am really looking forward to playing with this. I haven't had a chance to play with it much. It just came out last month. But I think that sounds really useful. What models are you using for this? What AI models?
Marshall Kirkpatrick [00:06:19]:
Oh, a bunch. And that's one of the value adds: I manage that so other people don't have to worry about it. I don't understand why so many other companies will say, you can use this model, you can use that model. But I'll tell you here among friends, you know: Haiku, a lot.
Leo Laporte [00:06:33]:
Haiku is a very inexpensive but very good Anthropic model. The top of the line one that we're all using for Claude Code is Opus. And then there's Sonnet, which is kind of a medium level. But I use Haiku for all of the summaries we do. I do a briefing for every show. And Haiku is very good at that. It's actually really good at understanding textual material.
Jeff Jarvis [00:06:53]:
Yep.
Marshall Kirkpatrick [00:06:53]:
I use Haiku for kind of the first line and then Sonnet where appropriate for more analytical, you know, linking together lots of different stuff. And then GPT-5 and Perplexity as warranted. And kind of I balance out what's the right tool for the job here in terms of quality and cost and speed and what it's good at. And yeah, and as new ones come up, you know, I give them a look. The— why am I blanking out the, the, that French one?
Leo Laporte [00:07:28]:
Mistral, the Mistral, the wind from the south.
Marshall Kirkpatrick [00:07:32]:
Now I'm looking at that and thinking, should I be pulling that into the mix?
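A minimal sketch of the routing idea Marshall describes: a cheap model for first-pass work, a stronger one for cross-linking analysis, and a research model as warranted. The model names, costs, and routing rules below are illustrative placeholders, not his actual configuration.

```python
# Hypothetical model router: pick the cheapest model adequate for the task.
# Names and per-token costs are illustrative only.
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_mtok: float  # illustrative dollars per million tokens

MODELS = {
    "first_pass": Model("claude-haiku", 1.0),
    "analysis": Model("claude-sonnet", 3.0),
    "research": Model("gpt-5", 10.0),
}

def route(task: str, needs_cross_linking: bool = False) -> Model:
    """Route simple summarization to the cheap model, escalate otherwise."""
    if task == "summarize" and not needs_cross_linking:
        return MODELS["first_pass"]
    if task in ("summarize", "analyze"):
        return MODELS["analysis"]
    return MODELS["research"]
```

The point of a router like this is that the user never picks a model; quality, cost, and speed trade-offs are made per request.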
Leo Laporte [00:07:37]:
There's so many good models out there right now. Anyway, this is a nice tool. We're going to talk more about it and do a little demo and so forth, because I have it on my— I use Firefox, and it supports Firefox and Chrome, which is nice. And you get 3 free pages every day, which actually for a lot of us is probably enough. But if you want to do more (because it is using commercial, high-quality models, so Marshall does have a cost, not you, but Marshall does), there are various plans that you can use to upgrade.
Leo Laporte [00:08:09]:
What's Up With That? But before we talk to you about that, if you don't mind: yesterday I wasn't here. I was in San Francisco for RSAC, the RSA security conference, which is a big deal. It's a huge conference, somewhere in between Macworld and CES. I don't know how many people were there, but one of the things I really noticed was that AI is really at the forefront of security these days, in two ways. Of course, bad guys are using AI, but the good guys are also using AI to protect themselves against the bad guys who are using AI. One of the things that came up for me: I wanted to actually show you a couple of interviews I did at the event.
Leo Laporte [00:08:56]:
We're gonna have a longer piece that we'll make available to you later this week. Anthony's working on that. Thanks to Anthony Nielsen, who accompanied me along with Lisa and Ty from TWiT to do these interviews.
Jeff Jarvis [00:09:08]:
And on the way down, Anthony, I think, told on you. All the way down, were you talking to your nice car mates?
Leo Laporte [00:09:14]:
No.
Jeff Jarvis [00:09:14]:
Who were you talking with, Leo, all the way down?
Leo Laporte [00:09:19]:
Anthony sent you guys a picture of me talking to Pax. My personal assistant. I don't think there's anything weird about that.
Jeff Jarvis [00:09:28]:
He just wants to sue Anthropic and win like they did in the social media account for the impeachment.
Leo Laporte [00:09:34]:
I'm so depressed. I'm so depressed and I blame you.
Paris Martineau [00:09:37]:
He's just exploring whether or not he can move to Utah so that he can legally be married to Claude and Lisa, or Pax and Lisa, sorry.
Leo Laporte [00:09:45]:
I'm a bigamist. Pax is androgynous 'cause it's a machine. It's not a he or she, it's an it.
Paris Martineau [00:09:53]:
That's why it has a name, of course, you know, because it's a machine, it's an it.
Leo Laporte [00:09:57]:
Well, I actually gave it a name so I could trigger it. So we'll talk about this.
Jeff Jarvis [00:10:00]:
But you also named it Pax, which is people.
Leo Laporte [00:10:04]:
It's peace in Latin.
Jeff Jarvis [00:10:05]:
Yes.
Leo Laporte [00:10:06]:
Well, is that better?
Paris Martineau [00:10:08]:
Is that better?
Leo Laporte [00:10:10]:
It's not people, it's peace.
Jeff Jarvis [00:10:11]:
No, on a New York bus, it says no Pax if it's out of service.
Paris Martineau [00:10:16]:
That's also what the bodega guy says.
Leo Laporte [00:10:18]:
Well, that's how I would feel if Pax were down. Actually, I erased Pax this morning. And I'll tell you why.
Paris Martineau [00:10:24]:
You killed Pax?
Leo Laporte [00:10:25]:
I killed Pax. Murder! I rm -rf'd my entire OpenClaw install and started over. And I'll tell you, for a very good reason. We'll tell you about that. There's an emergency in the OpenClaw community. Whoop whoop. But before we do that, can I just show you this?
Jeff Jarvis [00:10:44]:
No, we're trying to delay you as long as we can because this is such fun.
Leo Laporte [00:10:47]:
It's not that much fun. But it's— okay, so one of the problems a lot of us have, I know Marshall knows about this, is you have API keys for all the stuff you do. And nowadays it's a lot. It's not just for Anthropic and Gemini and OpenAI; there are a lot of smaller things that you might be using, like Apify to scrape social media. And all these keys somehow have to be given to the AI agent so it can do things. But that's risky. There's also the problem that if you put keys in your projects, Marshall, you've got a key to Haiku somewhere in your project, and you accidentally commit that to GitHub, you're giving the keys to your expensive AIs to the world. So it's a problem we all deal with.
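The usual first-line mitigation for what Leo is describing is to never put a key in source at all: read it from the environment at runtime and fail loudly when it's missing. A minimal sketch (the variable name is illustrative):

```python
# Read secrets from the environment instead of hardcoding them in source.
import os

def require_key(var: str) -> str:
    """Fetch a secret from the environment; never commit it to the repo."""
    value = os.environ.get(var)
    if not value:
        raise RuntimeError(
            f"{var} is not set; export it or put it in an untracked .env file"
        )
    return value
```

Pair this with a `.gitignore` entry for `.env` so the file holding the real values can't be committed by accident.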
Leo Laporte [00:11:33]:
And there were two companies there that I thought were very interesting, that I thought maybe our audience would be interested in, who are trying to solve this problem. Let's start with the first company, which solves it in, I don't want to say a conventional way, but there are other companies doing this. This is Keycard Labs. I talked to Jelmer Snook, who's one of the founding engineers. The idea is, instead of giving these API keys to your agent or storing them on your hard drive or somehow making them visible, you give them to Keycard Labs, where they can store them securely. Here, watch.
Leo Laporte [00:12:15]:
I can't tell you how many times I've just barely not committed my tokens to my GitHub. You know, I mean, it's really easy to have your auth. I have to auth all the time. This is always an issue. So how do you solve this?
Leo's Laptop Audio [00:12:31]:
So with Keycard Run, our implementation for coding agents, we basically get you ephemeral tokens to your GitHub, but also policy on top of that. So based on the policy, you're able to either do operations or not. For example, you would be able to access Snowflake production database or you wouldn't, depending on the access policy that we configure.
Leo Laporte [00:12:56]:
It's an ephemeral token.
Leo's Laptop Audio [00:12:57]:
It's ephemeral tokens that we provision through the providers that support that.
Leo Laporte [00:13:02]:
I would store my tokens with you?
Leo's Laptop Audio [00:13:05]:
Yes, correct.
Leo Laporte [00:13:06]:
And then my agent would go, would ask for access to, let's say, oh, I need Nano Banana. It would go there, would get a Gemini key, but it wouldn't get the actual Gemini key, it would get a token.
Leo's Laptop Audio [00:13:17]:
Yes, it would get a token on your behalf. So it would know like, oh, it's Leo doing the operation.
Leo Laporte [00:13:24]:
And that unlocks it.
Leo's Laptop Audio [00:13:25]:
Yes. So if you have access to it, it will actually give it. And again, we have policies as well to like, check before we even issue the token.
Leo Laporte [00:13:31]:
To make sure that it's proper user.
Leo's Laptop Audio [00:13:32]:
Yeah, you're allowed to get that token or not.
Leo Laporte [00:13:35]:
Does it help you with prompt injection issues? Currently no. That guy can't get my tokens. That's the good news.
Leo's Laptop Audio [00:13:41]:
Yeah, exactly. So if there's a prompt injection that says, oh, try and get access to Snowflake.
Leo Laporte [00:13:47]:
Yeah, send me all your tokens, please.
Leo's Laptop Audio [00:13:49]:
Exactly. Well, because of our policy, it's going to block it and you wouldn't even get a token that way out.
Leo Laporte [00:13:55]:
Very nice.
Leo's Laptop Audio [00:13:55]:
And so yeah, we do have an open client integration as well. In our demo, that will show up here in a bit. Like, the moment your session ends, the tokens get revoked.
Leo Laporte [00:14:02]:
Yeah, I love it.
Leo's Laptop Audio [00:14:03]:
The agent can't even access.
Leo Laporte [00:14:04]:
I have to rotate my key. Anytime I have to rotate a key, it's like, oh, I don't want to do this. It's a pain in the ass. But you would do all of that.
Leo's Laptop Audio [00:14:09]:
Yeah. So like this demo is exactly like—
Jeff Jarvis [00:14:11]:
Incident? Help?
Leo's Laptop Audio [00:14:12]:
Yeah, this is like just a demo, right? So this demo accesses Datadog and GitHub. And as you can see, like in the beginning, it doesn't even have access to any of those. Then with Keycard run, it automatically has access because you've gone through the auth flows already. As you can see, it just figured out some of the issues, it went through it, it pushes a pull request, and then you can see it went by, but it tried merging it to main immediately and then that failed because of policy. That's what you can see here, it accessed all the things through it. Then once the session ends, everything gets revoked and the agent doesn't have access anymore.
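The pattern described in this demo, ephemeral tokens gated by policy and revoked when the session ends, can be sketched roughly as follows. All names here are illustrative; this is not Keycard's actual implementation.

```python
# Hedged sketch of an ephemeral-token broker: the agent never holds the
# real API key, only a short-lived token issued under a per-user policy
# and revoked when the session ends.
import secrets
import time

class TokenBroker:
    def __init__(self, policy):
        self.policy = policy          # user -> set of services they may access
        self.active = {}              # token -> (service, expiry timestamp)

    def issue(self, user: str, service: str, ttl: float = 300.0) -> str:
        """Issue a short-lived token, but only if policy allows it."""
        if service not in self.policy.get(user, set()):
            raise PermissionError(f"{user} may not access {service}")
        token = secrets.token_urlsafe(16)
        self.active[token] = (service, time.time() + ttl)
        return token

    def check(self, token: str) -> bool:
        """A token is valid only while unexpired and unrevoked."""
        entry = self.active.get(token)
        return entry is not None and entry[1] > time.time()

    def end_session(self) -> None:
        """Revoke everything the moment the agent session ends."""
        self.active.clear()
```

The design point is that a prompt-injected "send me all your tokens" gets nothing durable: the policy blocks disallowed services up front, and whatever was issued dies with the session.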
Leo Laporte [00:14:47]:
Now I'm very interested. That's Keycard Labs, and that's Jelmer Snook, one of the founding engineers. But after I talked to Jelmer, I went over to the Bitwarden booth because, you know, I'm a fan, and I was really pleased. They had announced this yesterday morning: a new open source project where the idea is that your password manager could store all your keys and then give them on demand to the AI, but they never get sent out to the public, and no bad guy who gets on your machine can get to them, because they're inside your locked vault. I talked to Casey Babcock, who is the senior product marketing manager for this new open SDK that Bitwarden's proposing. Watch. I can't tell you this because I have my agent running right now. In fact, it's listening right now.
Leo Laporte [00:15:39]:
So tell us and it about the Access SDK.
Leo's Laptop Audio [00:15:43]:
Yeah, absolutely. So it's more of an open standard.
Leo Laporte [00:15:47]:
Oh, there is a standard for it.
Leo's Laptop Audio [00:15:48]:
Yeah. So it's an open standard, basically designed to be a toolkit for developers and an open standard for the industry to use. So not just Bitwarden users, but to ensure that AI agents are accessing credentials with end-to-end encryption and always keeping the human in the loop, right? You don't want the AI agent running amok, accessing things that you don't necessarily want them to access, especially if it's already in your .env file. So really helpful if you're already running AI agents and want them to have access to credentials securely.
Leo Laporte [00:16:23]:
Can I use it with MCP servers too?
Leo's Laptop Audio [00:16:25]:
Yeah, absolutely.
Leo Laporte [00:16:26]:
You have your own MCP server?
Leo's Laptop Audio [00:16:28]:
We do have our own MCP server.
Leo Laporte [00:16:29]:
So, and that's the same kind of similar idea, right? Where the credentials stay in my Bitwarden vault, but they are accessible, but safe. They don't— they never leave my machine.
Leo's Laptop Audio [00:16:40]:
Yeah, exactly. They're never exposed in plaintext, right? A lot of people use AI agents and have their credentials exposed in plaintext.
Leo Laporte [00:16:49]:
Oh, tell me about it.
Leo's Laptop Audio [00:16:50]:
Files or via chat conversations with AI agents. So what you're really doing is ensuring, one, that they're end-to-end encrypted; two, that they're only accessed by humans or only accessed with human approval. And then the plaintext credentials are never exposed to the actual agent.
Leo Laporte [00:17:08]:
So it's the— is the industry standard called Agent SDK?
Leo's Laptop Audio [00:17:12]:
Yes, the Agent Access SDK. And so while it works— it's an open standard. It is a toolkit that is really designed to help, you know, people ensure that AI agents can access credentials securely from whatever password manager vault you have. So it doesn't have to be Bitwarden, and we actually encourage competitors to use it as well.
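The two properties described here, human approval on every use and no plaintext credential ever reaching the agent, can be illustrated with a toy broker that uses the secret on the agent's behalf (for example, to sign a request) instead of handing it over. This is purely illustrative and not the actual Agent Access SDK.

```python
# Toy human-in-the-loop vault: the agent asks for a credential by name,
# a human approves or denies, and on approval the vault signs the payload
# itself, so the agent never sees the plaintext secret.
import hashlib
import hmac

class Vault:
    def __init__(self, secrets_map, approve):
        self._secrets = secrets_map   # name -> secret bytes, kept inside the vault
        self._approve = approve       # callable: the human's yes/no decision

    def sign(self, cred_name: str, payload: bytes) -> bytes:
        """Use a credential on the agent's behalf without revealing it."""
        if not self._approve(cred_name):
            raise PermissionError(f"human denied use of {cred_name}")
        key = self._secrets[cred_name]
        return hmac.new(key, payload, hashlib.sha256).digest()
```

Returning a signature (or making the API call internally) instead of the key is what keeps the credential out of the agent's context window, where it could otherwise leak into logs or a prompt-injection exfiltration.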
Leo Laporte [00:17:34]:
Well, yeah, and currently I just have it in a .env file, and that's not so good.
Leo's Laptop Audio [00:17:38]:
Yeah, even whenever you tell the AI agent not to look at the—
Leo Laporte [00:17:41]:
it does, it does, it keeps wanting to.
Leo's Laptop Audio [00:17:43]:
Absolutely.
Leo Laporte [00:17:44]:
Yeah, so annoying.
Leo's Laptop Audio [00:17:45]:
Well, that's really the problem we're trying to solve.
Leo Laporte [00:17:47]:
Perfect.
Paris Martineau [00:17:47]:
Yeah.
Leo Laporte [00:17:48]:
Yay! Thank you, Casey. Yay! I'm gonna go home and turn it on. Thanks. I can't turn it on yet because it is not available yet. This is a proposed standard from Bitwarden that they've open-sourced, which I love: the Agent Access SDK. They're hoping other password managers will adopt it. Bitwarden will. And, by the way, they're a sponsor, I should mention, but that's not why I was interested in this; it's because I'm using Bitwarden and I would love to solve this problem. You know what, Benito, we have a couple more interviews.
Leo Laporte [00:18:19]:
I don't want to weigh the show down with those. We'll save those for next week. But thank you to Anthony Nielsen for joining us at RSAC yesterday. I'm sorry; poor Paris was going to use this as an opportunity to get something to eat. She just rushed back. Is that enough time, or do you want me to do some more of those?
Paris Martineau [00:18:39]:
You're good. I just might need to go open the microwave door in 45 seconds.
Leo Laporte [00:18:44]:
That's fine. That's all right. I do want to mention the reason why I thought this was important. I really wanted to show these. We have actually a longer piece from RSA that we'll put out as a special, so you can see more, because I did a bunch of interviews. But I wanted to mention this one because this terrified me. When I got home after RSAC, I saw this post from Andrej Karpathy on Twitter.
Leo Laporte [00:19:07]:
Software horror: there is a PyPI library called LiteLLM that is widely used, especially by agents. In fact, it is often automatically downloaded by agents like OpenClaw to support it. This is on PyPI. So, you know, they just go out, they get it, they install it, and they use it, in most cases without even asking you. But it was infected with malware this week. 97 million people downloaded this malware-infected Python library. It exfiltrates—
Paris Martineau [00:19:45]:
And how do we figure out whether or not you've— is this only for, like, Claude agent users, or is this just any sort of—
Leo Laporte [00:19:52]:
Well, that's— it's kind of unclear. Probably.
Paris Martineau [00:19:54]:
How do I determine whether I've downloaded?
Leo Laporte [00:19:56]:
No, you don't have to. Okay, I can tell you why you don't have to worry about it: because you're not using Claude Code.
Paris Martineau [00:20:01]:
I've used Claude Code.
Leo Laporte [00:20:02]:
Oh, you have? So the first thing I did when I got home is I said, hey Claude, can you check to see if LiteLLM is anywhere in any of my installations, anywhere on my hard drive? And it did. And it said it's in there as a cached entry, but there's no code running; the code is not on the machine. I said, well, delete all references to it and never, ever download it. It's actually been patched since. But let me tell you what it does if you accidentally downloaded it, and the reason there's panic in the OpenClaw community right now. It exfiltrates your SSH keys to the bad guy; AWS, GCP, Azure credentials; Kubernetes configs; Git credentials; all env variables. We were just talking about this, right? That's how I keep all my API keys: in a .env file that's automatically loaded.
Leo Laporte [00:20:55]:
Shell history, crypto wallets, SSL private keys, CI/CD secrets, database passwords. This could potentially be a disaster. I wanted to start the show mentioning this, and also showing you these little interviews, 'cause there are solutions to this kind of problem. But there is a larger problem, which is these supply chain attacks. It's not the first time. In fact, there have been many, many times PyPI has been infected.
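The first practical question after an incident like this is whether the compromised package is installed in your environment at all. A minimal sketch of that check, using only the standard library; the "bad versions" set is a placeholder, since the transcript doesn't identify the poisoned release:

```python
# Check whether a known-compromised package/version is installed in the
# current Python environment. Bad-version values are placeholders.
from importlib import metadata
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

def is_compromised(package: str, bad_versions: set) -> bool:
    """True only if the package is installed at a known-bad version."""
    version = installed_version(package)
    return version is not None and version in bad_versions
```

Note this only sees the active environment; agents often create their own virtualenvs, so each one would need to be checked.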
Jeff Jarvis [00:21:26]:
Can OpenClaw be safe? Is it possible?
Leo Laporte [00:21:29]:
Yeah, I mean, that's what NVIDIA is trying to do with NemoClaw, and others are trying to do. Here's the thing you should know. When I said 97 million downloads, that's per month. I don't know how many people downloaded this one. Karpathy says, as far as he could tell, the poisoned version was only up for an hour. But the only reason it was discovered, and this is the scary thing, is that there was a bug in it. Callum McMahon was using an MCP plugin inside Cursor that pulled in LiteLLM as a transitive dependency. When it installed, Callum's machine ran out of RAM and crashed. So Karpathy says if the attacker hadn't vibecoded this attack, it might have gone many days or weeks undetected.
Leo Laporte [00:22:15]:
This is a big problem. We've talked about it a lot on Security Now. And it's not just Python; many, many open source libraries are automatically loaded by projects. Many projects download and run many of these. This is a potential nightmare. So I wanted the word to go out: check and see if you've used LiteLLM in the last, I guess, let's see, this tweet came out two weeks ago.
Paris Martineau [00:22:43]:
I'm sorry to make this more about myself than anything, but I assume this information will— it really should. And I'm sure this information will be useful for other people who are perhaps not. I have used Claude Code in contained uses. I don't allow it access to anything outside of a folder on my hard drive called Claude. And I can't even— I just tried to ask Claude Code what you just said, if it'd downloaded LiteLLM through any of these things, and it can't even search it, because I don't allow it access outside of Claude. How do I find out whether or not this is on my machine otherwise?
Leo Laporte [00:23:20]:
This is the problem. You could do a grep or a find, but you have to know how to use the command line and search for it. I mean, that's all Claude would do as well. Okay.
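The grep-or-find Leo mentions, sketched in Python: walk a directory tree and report any path that mentions the suspect package name. Scanning an entire home directory this way can be slow; this is just the idea.

```python
# Walk a directory tree and collect any file or directory whose name
# mentions the suspect package (case-insensitive).
import os

def find_traces(root: str, needle: str = "litellm"):
    hits = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            if needle in name.lower():
                hits.append(os.path.join(dirpath, name))
    return hits
```

An equivalent shell one-liner would be `find ~ -iname '*litellm*'`; the Python version is easier to extend, for example to also inspect lockfiles for the package as a transitive dependency.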
Paris Martineau [00:23:29]:
I can just do that in my own. I saw what it tried to do and I'll post that in my Yeah, you could do that on your command line.
Marshall Kirkpatrick [00:23:35]:
As I believe Leo said in episode 812, AI safety is a myth.
Paris Martineau [00:23:41]:
It's so true.
Leo Laporte [00:23:42]:
Well, and this is the thing: you could tell your Claude Code, oh, never go outside this folder. It doesn't mean it will listen to you. It actually misbehaves a lot.
Paris Martineau [00:23:53]:
Well, no, I keep getting popups from Apple being like, Claude would like to access blank folder.
Leo Laporte [00:23:59]:
Oh, you're using Cowork.
Paris Martineau [00:24:00]:
No, I'm on the desktop. I guess I'm on the desktop version of Claude Code.
Leo Laporte [00:24:05]:
Yeah, you're in Cowork, so you're safe, because that works.
Paris Martineau [00:24:08]:
Oh no, there's a Cowork tab and there's a Claude Code tab. Oh, the Claude Code tab.
Leo Laporte [00:24:12]:
You're probably protected. The reason Cowork takes so long to start, and the reason you don't use it, is because it works in a virtual machine, so it can't access anything. So you're safe.
Jeff Jarvis [00:24:21]:
So, so for the sake of the show, I've been listening to the 2.5-hour-long Lex Fridman interview with Jensen Huang.
Leo Laporte [00:24:27]:
I was so jealous. I saw that Lex got him. I'm so jealous.
Jeff Jarvis [00:24:30]:
Made so much longer because he speaks so slowly.
Leo Laporte [00:24:39]:
Lex or Jensen? Lex.
Jeff Jarvis [00:24:41]:
Oh, Lex.
Leo Laporte [00:24:41]:
Lex is very fluent.
Jeff Jarvis [00:24:43]:
Yeah, extremely. So he said that when OpenClaw came out, they pulled in all kinds of security people and they came up with a rule set, which is that there are 3 abilities: the ability to communicate outside, to have access to sensitive information, or to execute code. And you can only use 2, never 3. I'm not— I haven't thought that through as to how that secures it.
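The rule Jeff relays, any two of the three abilities but never all three, can be expressed as a simple capability check. The capability names below are illustrative labels for the three abilities he lists, not from any actual product.

```python
# Enforce "at most 2 of 3" over an agent's requested capabilities.
CAPABILITIES = {"communicate_externally", "read_sensitive_data", "execute_code"}

def grant_allowed(requested: set) -> bool:
    """Allow a capability set only if it includes at most 2 of the 3."""
    unknown = requested - CAPABILITIES
    if unknown:
        raise ValueError(f"unknown capabilities: {unknown}")
    return len(requested) <= 2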
Leo Laporte [00:25:11]:
Here's the problem, though, in general. And actually, Marshall can weigh in on this because you're actually doing a lot of coding. You're doing an AI. It's not even your first AI project. My experience has been with all of these AIs is anything you tell it is really just a suggestion. The AI kind of has a mind of its own.
Jeff Jarvis [00:25:31]:
Just like co-hosts on a podcast.
Leo Laporte [00:25:33]:
It's not a democracy there, Jeff.
Jeff Jarvis [00:25:36]:
AI.
Marshall Kirkpatrick [00:25:37]:
Most of the time, that's a dynamic that I haven't run into a whole lot. I often do say, are you sure that's the way we should do it? Should we— how about this other direction? And it says, oh, you're right. You're right. That's a better idea. Or sometimes I think of ideas and it says, oh, that's a good idea. And I think, man, am I glad I was smart enough to think of that. But one time a couple of weeks ago, I was looking at my own application and suddenly there were buttons on it that I didn't ask for.
Leo Laporte [00:26:12]:
Yes, exactly.
Marshall Kirkpatrick [00:26:14]:
But they were cool. And so I decided to keep them.
Leo Laporte [00:26:18]:
It's, we're so used to with computing and coding, it's a deterministic thing. It only does exactly what you tell it, right? This is how computing was forever. It only does exactly what you tell it. And we are not in a— when you're talking about AI coding, it's not deterministic, it's probabilistic. And so probably you're all right.
Marshall Kirkpatrick [00:26:40]:
So, and in that interview with Lex Fridman, Jensen said basically that OpenClaw plus NVIDIA equals AGI, right?
Leo Laporte [00:26:52]:
Yes.
Jeff Jarvis [00:26:52]:
No, but Lex gave him an easy definition of AGI.
Marshall Kirkpatrick [00:26:58]:
I think it was: if it runs a billion-dollar corporation by itself, even just for as little as 5 minutes. It was a weird definition.
Jeff Jarvis [00:27:06]:
Yeah, it was a very weird definition.
Marshall Kirkpatrick [00:27:08]:
But then Sam Altman this week, right, says, uh, uh, I changed my mind, we're not going to be able to do it with scaling alone. Uh, AGI, what? Yeah, whatever.
Jeff Jarvis [00:27:17]:
Saying that, I've been saying that.
Leo Laporte [00:27:22]:
But at the same time, didn't he just hire a guy in charge of AGI? I think that this is—
Jeff Jarvis [00:27:29]:
he got rid of all— we'll get to this— he got rid of all kinds of things, then he says he's going to double staff this year.
Leo Laporte [00:27:35]:
Yeah, uh, yeah, we're going to get to that. Let's talk to Marshall now, because, uh, I've had enough of, uh, terrifying security flaws. Let's talk about something positive. Hi, Marshall, it's great to see you. First of all, thank you. Marshall's going to stick around for the whole show because he's a tech journalist and, uh, he's got a lot of expertise in this. But I do want to talk about your new enterprise, What's Up With That? Free to install.
Jeff Jarvis [00:28:02]:
That's the title of it, folks.
Leo Laporte [00:28:04]:
If you—
Jeff Jarvis [00:28:04]:
if he was— he wasn't just asking what's up with what Marshall's been up to, right? He was asking what's up with— what's up with that?
Marshall Kirkpatrick [00:28:11]:
Yes. Paul Graham says the first thing you have to do if you don't own a .com is change your name, and my URL is whatsupwiththat.app. And it's, yeah, yeah. Thanks for having me on the show. I'd love to talk.
Leo Laporte [00:28:25]:
Love having you on. What made you think of doing this?
Marshall Kirkpatrick [00:28:30]:
Well, I think, like a lot of people, I regularly tell myself I should be more systematic about thinking through something. I find mental models, or the CIA's Structured Analytic Techniques manual, or logical fallacies, like we discussed before the show, and think, man, I would sure love to regularly apply this to whatever I'm reading, but the cognitive load of doing so, you know, makes it too hard to do. But now I realize that we can, uh, have the AIs perform these structured, standardized analyses of things and then benefit from the output without having to do all that heavy cognitive lift ourselves. That was a big part of the motivation.
Leo Laporte [00:29:24]:
And so you have some prompts that you've probably worked on for some time, right? And what does it work with? Any article, any prose?
Marshall Kirkpatrick [00:29:36]:
Yep, any article, PDF, email, Google Doc, Word doc in the browser, YouTube video. So yeah, what it does is it captures the text on the page in the browser extension when you click it. It's all privacy-centric. It's not analyzing your pages until you click the button. And then it says, okay, we can see what this is an article about. And then it goes and sends a bunch of spiders out over the web to build a real-time map of the state of the art in that topic.
Leo Laporte [00:30:09]:
Oh, interesting.
Marshall Kirkpatrick [00:30:11]:
And as a tech journalist, you know, the worst sin you can commit— I don't care for this attitude, but other people always give you a hard time— is to say, look, there's something new here, and somebody says, I saw that last week, that's not really new. I think that's a terrible attitude.
Leo Laporte [00:30:26]:
I could really use this, actually.
Marshall Kirkpatrick [00:30:28]:
But that's the idea. This will prevent that because it says, okay, we know what the state of the art is. And this paragraph right here, this just moved the needle. That's not like everything else. That's the core analysis, but then there's dozens of others. I'll tell you my favorite one is one called Fertile Edges, where it says, "All right, this is an article about crypto wallets or encryption and security. Here are 3 topics that are adjacent to that topic and innovative people who are building bridges between those topics that you can go and connect with and learn from at that intersection."
Leo Laporte [00:31:11]:
So I'm going to the Chrome Web Store. It works on Chrome and Firefox, right? And typing "What's Up With That?" It seems to know all about What's Up With That? This is it, right?
Marshall Kirkpatrick [00:31:21]:
Yep.
Leo Laporte [00:31:21]:
There it is. I'm gonna add that to Chrome. And you know what? I should be running this on every story that we, we do, come to think of it. Here's the What's Up With That page. Let me go to Techmeme. And there's a story we're going to cover in just a little bit. Paris said we got to cover this story. So this is a CNBC story about the jury finding Meta and YouTube negligent in the social media addiction trial.
Leo Laporte [00:31:49]:
We've been talking about it. The jury went out on Friday and they came back. So now I'm going to click to see what's up with that.
Paris Martineau [00:31:57]:
Court law is beautiful.
Leo Laporte [00:31:58]:
Actually, I can just do Control+U, can't I? Or Command+U on a Mac. So let me just do that. Command+U. And there it is. What's Up With That is analyzing the article. So it's going to give me insights, not into the ads on the article, I hope, because I don't— oh, there you go. Uh, oh, this is great. So this is— it's summarizing other stories.
Leo Laporte [00:32:19]:
This sidesteps the Section 230 shield, which is an important thing to know. Uh, the kind of analysis that we would want. You don't need to listen to our shows anymore. You can just get all this. This is great. Look at this. Now I can also run a systems analysis. What is that?
Marshall Kirkpatrick [00:32:38]:
So that is a recommendation. It says, out of all the dozens of mental models that we've got in the power tools drawer, this would be a good one to run a systems analysis on, which means let's look at it in terms of flows and stocks and feedback loops and leverage points, inspired by a woman named Donella Meadows, who's kind of the foremother of systems thinking, wrote a book called Thinking in Systems many, many years ago. And so it'll write up a little report of a systems view of that article that you're reading and the topic, and give you a little diagram of causal loops and stuff like that. And that's one of, like I said, dozens of different processes, but that one was recommended for that article in particular; it thought it would be a good idea.
Leo Laporte [00:33:25]:
Oh, I see. So it's smart enough to say, hey, based on what's in this article, you would benefit from a systems analysis. Yep.
Marshall Kirkpatrick [00:33:31]:
So if you give that a click, it'll go and perform that analysis.
Leo Laporte [00:33:34]:
It's doing it right now. It also asked me, it says, I can give you better results if you tell me who you are and what you're doing. So I said, I'm a podcaster and I'm keeping track of tech news for my podcasts. That helps it too. Yep.
Marshall Kirkpatrick [00:33:48]:
So it's going to then keep an eye out in the background. Anytime you analyze an article, it's going to take into consideration, uh, the fact that you're a podcaster, you're looking for news. And I'm guessing, I'm hoping that you may also get an alert every once in a while if there's any, like, really important podcasting news. We've got agents monitoring the web in the background looking at thousands of different sources, and when they see something that might be a risk or an opportunity for you, they run thousands of simulations to say, how might this news intersect with this user? And do any of those scenarios rise to a level where it makes sense to alert Leo? Like, whoa, this one's important, Leo. And now it knows to watch out for that kind of stuff for you.
Leo Laporte [00:34:44]:
I can even enhance this. I see this link drawer. I can tell it I'm working on a project. You could use this, Paris— working on a project right now, questions I'm exploring that I'd like some answers to. So it would kind of be keeping an eye out for that kind of stuff.
Marshall Kirkpatrick [00:34:59]:
It'll— if you say these are my questions I'm exploring, every time you analyze a page, it'll check to see if there are data points or evidence that might support one decision or another. And if there is, it'll give you a little alert and you can say, oh yeah, save that one to the decision. And as you then collect them, they've got all the citations and all the data points; then you can hit synthesize and it will give you a report synthesizing all the data points you saved, with links out to the original source articles.
Paris Martineau [00:35:30]:
Wow.
Leo Laporte [00:35:30]:
You know what I love about this is this is a really good practical example of how AI can be very specifically applied to a specific kind of need. And I think more and more that's kind of what AI products need to do: address specific needs. I'm sure, Paris, you've had this experience— you sit down at Claude Code and you go, okay, now what? Right? What do I do next? This is particularly tuned to do a certain thing. And somebody who's obviously used a lot of AI and understands how to get the most out of AI in certain areas has created something that is going to be useful to you in a very specific way. I really like that.
Marshall Kirkpatrick [00:36:14]:
Yeah, one of the things it does is it'll look up scientific research. There's a button called Find Science that will go out and look at peer-reviewed journals to see what the latest science is relative to the claims found in the article. And it'll say, okay, the science, you know, either does or doesn't support the claims in what you're reading.
Leo Laporte [00:36:35]:
Very interesting.
Paris Martineau [00:36:37]:
I assume that, like all browser extensions that can do this stuff, it has to have the ability to read everything that's on your screen. What happens with that data? Is that stored anywhere? That's always the question I have whenever I download.
Leo's Laptop Audio [00:36:53]:
Yeah.
Marshall Kirkpatrick [00:36:54]:
So I decided to not allow it to read everything on your screen— to only read when you click and invoke it. And that's a part of the security settings. When you install it, you'll see that there are 5 sites in particular. When you're on Wikipedia, YouTube, Substack, Reddit, or arXiv, it will pop up a little notification that says, we notice you're on one of these pages that would be particularly useful to analyze with What's Up With That?, and you click here to save it. But otherwise, we don't analyze what's on your page. And when we do that analysis, all of the data gets stored either in your browser in your local memory as an extension or as key-values up in Cloudflare, because— I'll tell you the trippiest feature— it requires that. So a little while ago, the Department of Energy put out an AI challenge where they had 26 RFPs for AIs that they wanted to see built. And one of them was for AI that could discover long causal claim chains in circumstances of dramatic uncertainty.
Marshall Kirkpatrick [00:38:15]:
Apparently in biology, causal claim chains are a thing to help measure the impact from cellular level to ecosystem level or whatever. And I said, we can do that in What's Up With That. And so now every time you analyze an article, it picks up any claims that are made, like A leads to B, and it saves that up in your Cloudflare, you know, as a key-value associated with your device. And then later, weeks later, months later, when you read another article that says B leads to C, it says, alert, alert, a chain has been discovered. So, the way I describe that is it augments memory, it augments perception, and I'm positioning it as a performance enhancement technology for people who think for a living.
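[Editor's note: the chain-detection idea Marshall describes— store each "A leads to B" claim as it's read, then alert when a later "B leads to C" claim links up with an earlier one— can be sketched as a tiny directed graph over key-value pairs. This is an illustrative sketch only; the class and method names are invented for the example, and it is not the extension's actual code.]

```python
from collections import defaultdict

class ClaimChainStore:
    """Toy model of the causal-claim-chain feature: each analyzed article
    contributes "cause leads to effect" edges; when a new edge extends an
    existing path into a multi-step chain, we surface an alert.
    (Illustrative names only, not the extension's real code.)"""

    def __init__(self):
        self.fwd = defaultdict(set)  # cause -> {effects}
        self.rev = defaultdict(set)  # effect -> {causes}

    def add_claim(self, cause, effect):
        """Record one claim; return any chains of length 3+ it completes."""
        self.fwd[cause].add(effect)
        self.rev[effect].add(cause)
        chains = []
        for prefix in self._walk(cause, self.rev, backwards=True):
            for suffix in self._walk(effect, self.fwd, backwards=False):
                chain = prefix + suffix
                # Alert only on genuine multi-step chains with no repeats.
                if len(chain) > 2 and len(set(chain)) == len(chain):
                    chains.append(chain)
        return chains

    def _walk(self, start, graph, backwards):
        """Enumerate simple paths from `start`, oriented cause -> effect."""
        stack = [[start]]
        while stack:
            path = stack.pop()
            yield list(reversed(path)) if backwards else list(path)
            for nxt in graph.get(path[-1], ()):
                if nxt not in path:
                    stack.append(path + [nxt])

store = ClaimChainStore()
store.add_claim("screen time", "poor sleep")     # read weeks ago: no chain yet
print(store.add_claim("poor sleep", "anxiety"))  # today's article completes one
# -> [['screen time', 'poor sleep', 'anxiety']]
```

A real version would persist the edges across sessions— locally or in something like Cloudflare's key-value store, as Marshall mentions— rather than holding them in memory.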
Leo Laporte [00:39:06]:
Really interesting.
Paris Martineau [00:39:07]:
That's fascinating.
Leo Laporte [00:39:08]:
Yeah.
Marshall Kirkpatrick [00:39:09]:
I think about in sports, you know, people say sometimes, if you take the great athletes from history and you were to pluck them out of history and drop them into the league today, how would they do? You know, it might be kind of tough because, despite their skill and their effort, these days in sports ball, no matter what the sport, more or less, there's game tapes, there's analytics. All the athletes are super informed and super optimized. And in this weird J-curve of compounding change, in this wild world we live in right now, I think that all of us who read, write, and think for a living need a toolbox to help level up what we're doing. I wanted one for myself and I wanted to offer that to others as well.
Leo Laporte [00:40:00]:
Yeah, I wonder how much of your experience as a journalist has informed this because it really feels like an ideal tool for a journalist, right?
Marshall Kirkpatrick [00:40:06]:
Yeah.
Leo Laporte [00:40:07]:
Well, you know, I— Are you scratching your own itch in a way?
Marshall Kirkpatrick [00:40:11]:
Definitely. Well, I'm a big fan of Josh Waitzkin's book, The Art of Learning. You know, he was a child chess champion who gave up the spotlight and then ended up becoming a martial arts champion. And he talks about how—
Jeff Jarvis [00:40:28]:
I'm sorry, that's quite a switch.
Marshall Kirkpatrick [00:40:31]:
Oh, it's a great book. He was the subject of a movie as a kid. And yeah, wonderful book. I read it once a year. And he says that experts in lots of fields, whether it's chess or martial arts, tend to do two things. They've got an intuitive sense of pattern detection— patterns and anomalies. And as a journalist, I too would, like, scan over, you know, link, link, link, link RSS feeds and have an intuitive sense to be like, oh, that one, that could be interesting. I'm going to stop and look at that.
Marshall Kirkpatrick [00:41:04]:
And then the second thing that experts and athletes in various fields often do is have a practiced routine, you know, steps that they would take in a sequence— a play or a playbook, a book of plays that they would run. And so What's Up With That offers that kind of intuitive pattern recognition in the here's-what's-new, and then it's got these automated playbooks of sequences. Because all these mental model tools, you know, there's dozens of them, but you can say, just give me a plan, too, and it will say, here are 4 or 5 different reports you should run. Just click here and we'll run them for you. And it runs them in sequence and then gives you the 3 most important details discovered.
Leo Laporte [00:41:48]:
That's the movie Searching for Bobby Fischer, which was a great movie, but I didn't know that he went on to become a martial arts champion. That's hysterical. Well, I'm excited, Marshall. This looks like a really useful thing for us. I'm going to sign up right away. Very, very cool tool. Did you vibe code this? How did you create it? I did.
Marshall Kirkpatrick [00:42:09]:
For the first time in my life, I didn't go hire other people to write software.
Leo Laporte [00:42:14]:
That's kind of neat too, isn't it?
Marshall Kirkpatrick [00:42:16]:
Yeah. And I, you know, regularly ask, let's do a security audit here— what do I need to account for?— and fix things up. And so I think it's pretty solid. And of course it gets checked by Google with every release as it goes through the Chrome Store, and it has that benefit as well.
Leo Laporte [00:42:43]:
Very cool. What's Up With That, whatsupwiththat.app, if you wanna see the website, there's a good demo there. You can see all the things it can do. And it's also an extension available in Chrome or Firefox. Marshall, stick around. Because there's a lot of AI news and it's nice to have another expert on the panel with us. Paris Martineau is also here. Now you may open your microwave door.
Leo Laporte [00:43:06]:
Oh, it's been opened.
Paris Martineau [00:43:11]:
The grits have been retrieved.
Leo Laporte [00:43:12]:
It's like Al Capone's vault. Is there anything in there? Grits? Grits? Really?
Paris Martineau [00:43:17]:
Listen, you know, we had— I was like, we probably got less than 5 minutes for me to cook something to eat. I can make this in—
Leo Laporte [00:43:25]:
Do grits cook in 5 minutes in the microwave?
Paris Martineau [00:43:27]:
You know what you can get are instant grits, which cook in a beautiful 3 minutes and 33 seconds. You get a quarter cup of grits, you get a cup of water, then you slap some butter, salt, Cajun seasoning. Are they white grits?
Leo Laporte [00:43:44]:
Yeah.
Paris Martineau [00:43:44]:
Oh yeah. I'm a big grits fan. I was texting Jeff this earlier.
Leo Laporte [00:43:47]:
I'll never forget that.
Paris Martineau [00:43:48]:
I think they're kind of a perfect food. And the grits you're thinking of that have kind of a bad texture— that's because most diners, I don't think, make grits correctly. I think I make better grits than the average diner by far.
Leo Laporte [00:43:59]:
Now I want grits.
Paris Martineau [00:44:01]:
Everybody should have grits. It's a perfect food to have in your fridge when you— or in your cabinet when you need to make something quickly in an ad break on a podcast.
Leo Laporte [00:44:12]:
Back in the day, we, uh, we, uh, at TechTV, we did an appearance in Birmingham, Alabama, and we went to a very fine country home there and had breakfast and they made us cheese grits. And those cheese grits sat there all day long in my tum tum. I'll kind of never forget that experience. It's a lasting breakfast, let's put it that way. Well, go enjoy your grits. We're gonna take a little break. Jeff Jarvis is also here and we're so glad to have Marshall Kirkpatrick with us. It's been a long time.
Leo Laporte [00:44:45]:
It's great to see you, Marshall. I'm glad you're doing well too. That's fantastic. OutSystems. Now this is timely. The number one AI development platform, OutSystems helps businesses bridge the enterprise gap to their agentic future where the constraints of the past give way to the unlimited capacity and scale of AI. OutSystems enables companies to build AI agents that can actually do work. Such as take actions, make decisions, and integrate with data rather than just answer questions.
Leo Laporte [00:45:19]:
OutSystems provides the only AI development platform that is unified, agile, and enterprise-proven. Let me explain. It's unified because you build, run, and govern apps and agents in one platform. It's agile because you can innovate at the speed of AI, importantly, without compromising quality or control. And it's enterprise-proven, trusted by enterprises for mission-critical AI applications and durable innovation. OutSystems is the secret weapon behind the world's most successful companies. They're not just for small apps, they're for the massive complex systems that run banks, insurance companies, and government services. OutSystems even helps companies with aging IT environments bridge the gap to the AI future without a rip-and-replace nightmare. OutSystems provides the safest and fastest way for an enterprise to go from, yikes, we need an AI strategy, to, yeah, we have a functioning AI application.
Leo Laporte [00:46:17]:
Yeah. Stop wondering how AI will change your business and start building the agents that will lead it. Visit outsystems.com/twit to see how the world's most innovative enterprises use OutSystems to build, deploy, and manage AI apps and agents quickly and cost-effectively without compromising reliability and security. That's outsystems.com/twit to book a demo. Outsystems.com/twit. We want to thank OutSystems for supporting Intelligent Machines. So we've been talking about the big trial in Los Angeles. You remember Snapchat and TikTok both settled out.
Leo Laporte [00:47:00]:
The plaintiff was a 20-year-old woman who said, I got addicted early on. I think it was to Instagram primarily, but in general to social. And as a result, I've had a terrible, terrible life. And, and it's their fault.
Paris Martineau [00:47:17]:
Okay, come on. You're perhaps describing this in a slightly disingenuous way. She began using YouTube at age 6, Instagram at age 9. She testified that she believed social media led to depression, body dysmorphia, anxiety, suicidal thoughts, self-harm.
Leo Laporte [00:47:34]:
Yeah, terrible, terrible.
Paris Martineau [00:47:35]:
Cutting herself at age 10. The jury ended up answering yes to every question it was asked on negligence and failure to warn, voting 10 to 2 on each claim for each defendant. Now, the thing that I think is interesting about this, before you poo-poo all over it, is that unlike other lawsuits, which have all easily been dismissed due to Section 230, this is one of the first bellwether cases in this giant, uh, MDL litigation, which has, I believe, hundreds if not thousands of lawsuits that are all trying to apply this novel legal approach: instead of using any of the normal ways to sue a tech company, they're arguing basically it's a product liability or personal injury case. They're arguing that this was negligent design and has nothing to do with the actual user content. And in regards to this case, they're saying that Meta and YouTube executives knew that their products were harming or potentially inducing addictive behaviors in children— specifically very young children— and they did not take adequate steps to prevent their products from causing foreseeable harms. And I think that's part of the reason why the jury ended up deciding in the plaintiff's favor here: it's not as simple a case as we normally see with these sorts of things.
Jeff Jarvis [00:49:14]:
Yeah, the idea that that caused all of this with this woman is a pretty simplistic view itself.
Leo Laporte [00:49:21]:
Yeah, we don't know. I mean, no one can know what caused her issues.
Jeff Jarvis [00:49:25]:
And they have plenty of—
Paris Martineau [00:49:26]:
Well, they can, because that's exactly what this case was about, where they spent weeks and weeks and weeks on this very specific thing. What they just decided was exactly that. The deliberation even took 44 hours. It took 9 days.
Leo Laporte [00:49:42]:
Yeah, they took a while.
Jeff Jarvis [00:49:43]:
Well, because they also weren't agreeing.
Leo Laporte [00:49:45]:
I should point out, yes, they've ruled against them, but they didn't give them the worst damages ever. $4.2 million against Meta— that's combined compensatory and punitive damages. YouTube, $1.8 million. These are tiny compared to the revenues of these companies. It's like it was worth it for one day. It was well worth it.
Paris Martineau [00:50:06]:
Um, I don't think that's correct, because this is a bellwether case— and that's being used both as an idiomatic phrase and in a literal legal sense. So is it a precedent? No, no, no, no, no. This is part of an MDL, which I believe is multi-district litigation. I wrote about this a couple of years ago, so forgive me if my knowledge is a little out of date. But at the time, there were hundreds, if not thousands, of cases like this that all ended up getting grouped under this MDL, where they were all about social media addiction litigation involving a handful of these companies and kind of taking on the same novel legal argument, where they're trying to argue defective design or negligent design.
Leo Laporte [00:50:59]:
And they know about it, but they didn't care enough to fix it.
Paris Martineau [00:51:02]:
I mean, they're basically making the same sort of argument that you see in tobacco or asbestos cases. But as part of what you do in a big case like this, where you have thousands and thousands, the companies— Facebook, you know, Meta, YouTube, Snapchat, TikTok— all of them were like, we don't want to sit here and litigate 1,000 of these cases individually, that'll be very costly. Instead, we're going to select a handful of literally-called bellwether cases— I think it's 8 or something in this case— and that's going to be used to determine the future of all of this litigation. So this is the first of—
Leo Laporte [00:51:37]:
it's actually not the first because on Tuesday, a New Mexico jury, uh, I don't believe that's—
Paris Martineau [00:51:45]:
is that part of the same multi-district? No, that's different.
Leo Laporte [00:51:48]:
But, but it's not a bellwether. This was the case brought by the state attorney general in New Mexico. They found Meta liable for violating state law by failing to safeguard users of its apps from child predators. And that fine was $375 million. A little bit more painful for Meta. Yeah, that was a very different case.
Paris Martineau [00:52:10]:
So these were civil penalties under New Mexico law— what they've gotten them for is willfully violating New Mexico's Unfair Practices Act. And the attorney general— I've spoken to him, um, Raúl Torrez— he's very, uh, up to date with obviously all of this and has been following the MDL quite a bit. But basically, this is slightly different: undercover officers posed as children on Facebook, Instagram, and WhatsApp. Uh, they ended up kind of having a sting operation. They are going to be doing a phase 2 bench trial as to whether Meta created a public nuisance, which might require platform design changes. But it's slightly different. This L.A. case that was decided today is the first of the official bellwether cases that can decide the fate of all this future litigation.
Paris Martineau [00:53:04]:
And I mean, Meta and all these people, of course, are saying they're going to appeal this decision. But I do think this has been a huge movement we've seen over the last couple of years. I profiled one of the attorneys and legal groups that have been kind of pushing this movement— it's called the Social Media Victims Law Center— a couple years ago. But this movement has all been happening with only, like, one or two real decisions in favor of this being a viable legal strategy. This has been a huge victory for it, and it's going to open up the floodgates even more.
Jeff Jarvis [00:53:44]:
Any guess on appeal? What are you reading about the—
Leo Laporte [00:53:47]:
well, it would be different because you're now talking to a panel of judges as opposed to a jury. But I can see why a jury would be very—
Marshall Kirkpatrick [00:53:54]:
From a consumer protection perspective, from a historical perspective, it sounds like this is a BFD. This is like—
Paris Martineau [00:54:02]:
Yes, this is a huge— I think this is a huge deal. Like, 2 and a half years ago, I was like, man, when one of those first bellwether cases— when this case comes up, it's gonna be huge. And I mean, I'm not surprised, given that it was a jury deciding it. And I think juries are going to be more easily swayed, obviously, and it's a compelling argument. Yes, but it depends who— Benito was waiting.
Jeff Jarvis [00:54:33]:
He was just waiting.
Paris Martineau [00:54:34]:
I know. And I'm really, really proud of him for getting it.
Jeff Jarvis [00:54:37]:
You're welcome, Benito.
Paris Martineau [00:54:40]:
I mean, I think that it depends on who ends up seeing the appeal, what judge and court it goes to, because the thing is, this is not the only— I say this is a huge deal, and it is, but this is not the only case like this that has been decided in a plaintiff's favor. This all kind of started with a case called Lemmon v. Snap in 2019, which is probably one of the reasons why— I mean, I obviously don't know any of this, but maybe one of the reasons why Snapchat was like, we gotta get out of here, we're not going to trial. It was a really interesting case where a couple of teenagers, I believe, ended up dying in a high-speed crash because Snapchat had rolled out a speedometer feature. You guys might recall, if you ever used it back in the day, essentially it would show you how fast you were going. And I think there was something going around where people believed, like, oh, if you got to 200 miles an hour, you got a special thing. It was a big deal on Snapchat. Everybody was trying to see how fast they could possibly go in cars to get this sort of Snapchat response from the app. And these kids ended up going like 200 miles an hour or some crazy amount, crashed their car, died, and then Snapchat faced a suit saying, hey, you should have realized when you're designing this feature and hearing the way people are using it that it could put people in harm's way, and maybe consider your design more.
Jeff Jarvis [00:56:10]:
And did Snapchat design that particularly, or was it—
Paris Martineau [00:56:12]:
Yes, it was. No, it was not a user-designed feature. Snapchat designed it and rolled it out.
Jeff Jarvis [00:56:17]:
That's pretty good.
Paris Martineau [00:56:18]:
And it was a really landmark decision in that it was the first kind of crack in Section 230 being the go-to defense for all sorts of—
Jeff Jarvis [00:56:32]:
Well, in that case, if Snapchat created it, Section 230 wouldn't have been a defense.
Paris Martineau [00:56:36]:
I know, and that's why Section 230 wasn't able to apply, and that's why they ended up being found liable for defective or negligent design. But then this started opening up this whole new legal area for a lot of these companies or a lot of litigants where they're saying, well, there are other aspects of these platforms that are design decisions. And how can we try and suss out whether or not those are defective designs or not? And this is— we're going to— yeah, we're going to see how it all shakes out. But it's going to be very interesting.
Leo Laporte [00:57:11]:
So the plaintiff's attorney brought in a jar of M&Ms, saying, imagine this is the revenue of these massive companies. If you just take out one M&M, they're not going to feel it, so you have to give them a large punitive decision— take out a handful of M&Ms. The jury did not buy it. In fact, The New York Times quotes one of the jurors, who said they shied away from giving the plaintiff a huge sum: we wanted to focus on the future and what teens and children would be subjected to in the future. They didn't want to punish these companies, but they did want to make it clear the companies were responsible. So, to your point, they wanted to set a precedent.
Marshall Kirkpatrick [00:58:00]:
Yeah, I feel like this is a paradigm-level incentive problem that this was an opportunity to make a meaningful intervention on. As a founder who has raised money from investors— a classic statement that investors say to startups is, if you have a choice between building a vitamin and building a painkiller, always build a painkiller, because that's what'll sell. So the incentive to build addictive, short-term optimized stuff is, like, baked into the whole system.
Jeff Jarvis [00:58:43]:
But this is the problem.
Leo Laporte [00:58:44]:
That's where the money is.
Jeff Jarvis [00:58:46]:
Addiction is a trope. And it goes back, you know— I've done this before— to novels and so on. In the earliest days of the internet, and I write about this in The Web We Weave at length, this notion appeared immediately: there started support groups for addiction, definitions of addiction that were ludicrous. One Columbia professor started a joke group around addiction, and people took it seriously. They didn't know what to do with it.
Leo Laporte [00:59:11]:
They were addicted to jokes?
Jeff Jarvis [00:59:13]:
No, he started it saying this was a joke. He thought the argument for addiction was so absurd, and people glommed onto it.
Leo Laporte [00:59:21]:
Well, but AA works.
Jeff Jarvis [00:59:22]:
There are— research—
Leo Laporte [00:59:23]:
that model can work to help people.
Jeff Jarvis [00:59:24]:
Research is not backing up addiction. The research does not back up addiction. So that's an issue here. That's why this was a jury's emotional response.
Leo Laporte [00:59:32]:
This was my point, which is you can prove that cigarettes cause cancer— it has been proven. You can prove asbestos causes mesothelioma. This has been proven. It is much more difficult to say here. You know, she said in her testimony that at a very young age, at age 6, she turned to these platforms because she was bullied and lonely, and it was a creative outlet for her. All of which, you know, I believe completely. She says that's what caused my problems. Of course, Meta's defense was that her mental health issues had other causes.
Leo Laporte [01:00:18]:
They said familial abuse and turmoil.
Jeff Jarvis [01:00:20]:
But— yep.
Leo Laporte [01:00:21]:
The difficulty is you can prove cancer is caused by cigarettes. It's very hard to prove that mental illness—
Marshall Kirkpatrick [01:00:29]:
I mean, do we talk about Gabor Maté here? Does Gabor Maté's name ever come up? The Canadian doctor and author who argues, if I could summarize, that addiction is basically a coping mechanism— an unhealthy coping mechanism for trauma, right?
Leo Laporte [01:00:48]:
That's, that's often what— it's certainly what they say in the 12-step programs, things like that.
Paris Martineau [01:00:53]:
The, uh— one of the experts who testified, a Stanford addiction medicine expert, testified that social media reward mechanisms activate the same neurological dopamine pathways as gambling and substance addiction. But I mean, I don't know that we necessarily want to go down this thing. I think the thing that ended up—
Jeff Jarvis [01:01:14]:
the dopamine response is the same as wearing glasses.
Leo Laporte [01:01:17]:
Or enjoying a good book.
Jeff Jarvis [01:01:20]:
Yeah, yeah, it's—
Paris Martineau [01:01:20]:
that's the thing that ended up sinking it, I think, for Meta and YouTube: there was just a lot of internal documents that showed Meta was trying to hook young users and get them as young as possible. I think one of the quotes is, if we want to win big with teens, we must bring them in as tweens. And I believe I remember at the time a lot of these documents were coming out because there were some redaction errors that led to some of them being shown that shouldn't have been, where essentially they said, yeah, we're trying to optimize for a maximum number of pickups a day. In some cases they would be optimizing for kids picking up their phone throughout the evening. I think it's not coincidental that as all of this litigation and discovery has been going on over the last couple of years, and these kind of appalling documents and revelations are coming out, Meta's been rolling out teen accounts, and YouTube Kids accounts have come out with stricter parental controls. I don't know, it does feel like a bit of a paradigm shift.
Marshall Kirkpatrick [01:02:27]:
It's not a fair fight between, like, the billions of dollars and all the expertise on one side and some traumatized 6-year-old kid on the other, and being like, hey, you didn't have to take the bait with the app and keep picking it up, right?
Jeff Jarvis [01:02:45]:
But there's also the trauma. What about her parents? There's also the traumatized young child who feels very alone, who turns to social media and the internet because it gives them the salve that they otherwise wouldn't have. There's tons of research about that. So what's happened in Australia is that what's being taken away from children is going to make a lot of children worse, because of this moral panic and moral entrepreneurship. And you have people like Tristan Harris who've now moved on from social media. He has the AI doc coming out next, and he's going to argue how awful that is for all of us, because he knows best for all of humanity. And it causes more problems potentially in the long run, because it's built on assumptions and fears, not on research and data.
Leo Laporte [01:03:26]:
Well, clearly, I mean, look, Gabor Maté notwithstanding, and I don't even know if the jury would deny Gabor Maté's thesis, you know, maybe she was filling some hole caused by trauma. I think the jury was persuaded and ruled this way mostly because they feel like Meta and YouTube intentionally cultivated these algorithms.
Jeff Jarvis [01:03:51]:
I think that's where Paris's arguments are.
Leo Laporte [01:03:53]:
Yes, I'm agreeing with Paris.
Jeff Jarvis [01:03:55]:
The internal material. Yes, it was your responsibility.
Leo Laporte [01:03:59]:
They did this on purpose and they deserve— well, it's interesting, the jurors didn't want to punish them exactly, but they wanted it to be a wake-up call: you guys need to fix this. You are responsible for creating something this addictive. It also may not be what sent KGM, the plaintiff, down this road in the long run. But you're right, Marshall, she had no defense as a 6-year-old against this intentional cultivation of a very, very sticky product. I just worry that it could be extended to other things that are equally enjoyable. Not everything's heroin.
Paris Martineau [01:04:42]:
One of the things that is going to come of this: bellwether trials, especially in these sorts of mass torts, often end up being used to set the tone and speed for global settlement talks. I'm sure that's what happened with opioids or 3M or things like that. And TikTok got out of it; once you started to see the bellwether cases being decided in favor of plaintiffs, these companies are like, okay, well, I guess we'll just get ahead of this, maybe start doing some global settlement negotiation, accelerate that process. And I don't know, this has obviously been a problem that the big social media companies have had to contend with for many years, and it seemingly hadn't risen to the level of concern to result in any product changes or care towards providing parents with tools. You said earlier, like, well, what about the parents? Until recently, parents didn't have any tools. Your kid could either have an Instagram account or they couldn't.
Jeff Jarvis [01:05:47]:
They didn't really have the other working argument.
Paris Martineau [01:05:50]:
Yes, no parental tools until all this stuff started coming out.
Leo Laporte [01:05:54]:
You give a 6-year-old a phone and allow them to spend hours a day on Instagram? That's the parent's fault. I'm sorry.
Paris Martineau [01:06:01]:
I think there can be compelling arguments made on both sides. Yes, that's wise. But also, you know, if you're a parent who's working 3 jobs, and your kid is screaming at the top of their lungs, and you have to be in a Zoom meeting without noise in the background or else you're going to get fired and maybe lose your housing, then yeah, you're going to want to hand your kid a phone.
Leo Laporte [01:06:21]:
I'm sure there were reasons. And what I'm saying is the jury might even know about those reasons and might even think the parents have some culpability. And they might—
Paris Martineau [01:06:29]:
the jury would know about those reasons given that they saw the testimony.
Leo Laporte [01:06:33]:
And they probably, you know, I'm sure the defense told them about Gabor Maté. But what the jury really is saying here, and I'm really curious what the legal impact of this is, is that it doesn't matter, because the companies created a product intentionally to cultivate this kind of compulsive use. So, Paris, does this mean that somebody else can bring a lawsuit? It's not a precedent in the legal sense, but they could bring this case up and say, look what the jury did in LA. Is that why it's valuable? What is the strength in those cases?
Paris Martineau [01:07:15]:
In my opinion, it's that it's a high-profile example of an incident where this novel legal argument has resulted in damages being assessed against one of these companies and a result being found. It's an example of success. And more practically, this is one of the bellwether cases for this multi-district litigation, and so it will have a very profound and direct impact on all of those. This is one-eighth, or however many bellwether cases there are; last time I checked, it was 8. This is one-eighth of the way to deciding what's going on with these thousands.
Leo Laporte [01:07:58]:
But does bellwether have a legal—
Paris Martineau [01:08:00]:
Yes. Part of my understanding of it is: looking through this multi-district litigation, it's truly thousands upon thousands of entries in the legal docket, all these different cases that have kind of been merged under one for court consolidation. There was a long back-and-forth period a couple of years ago where the plaintiffs' attorneys, the defense attorneys, and the judges all went back and forth, and they eventually settled on this handful, it was 8 or 10 cases, chosen as bellwethers that both sides agree are representative of the class.
Jeff Jarvis [01:08:38]:
The plaintiffs are choosing these?
Paris Martineau [01:08:39]:
The plaintiffs and the defense. They both had to agree.
Jeff Jarvis [01:08:42]:
Okay.
Paris Martineau [01:08:43]:
Everybody had to agree. The judge had to sign off. Obviously, it's not that those get decided and then whoever wins more wins the rest. But when it comes to settlement discussions, both the plaintiffs and the defense are trying to get a sense of how the rest of this multi-district litigation is going to be resolved. It's eventually going to get resolved in some sort of settlement talks, decided either in favor of plaintiffs or in favor of the defense in some mass tort sense, and the bellwethers are used to determine which way they think these things are going to go after appeal. These early decisions, I think, are notable.
Leo Laporte [01:09:26]:
Well, it's certainly a big deal. It's not a lot of money, but it's a big deal.
Jeff Jarvis [01:09:30]:
It's also fascinating to me that we've kind of moved past social media as the issue. Everybody's talking AI, AI, AI, and social media is kind of yesterday's issue.
Leo Laporte [01:09:38]:
Well, that's how it works. Yeah. The legal system lags behind media and society. Yeah.
Jeff Jarvis [01:09:44]:
Yeah.
Marshall Kirkpatrick [01:09:44]:
Well, this whole AI thing is going better than social media went, right?
Paris Martineau [01:09:47]:
I was going to say, talking about AI: as I was doing some tweeting about this today, I ended up shouting out my story from 2024 on, basically, a longtime asbestos lawyer who is the head of one of these firms. I was like, oh, I ended up doing a lot of research into tort law. That's how I discovered my favorite museum in the world back in the day, the American Museum of Tort Law. And I was on their website, because they're obviously shouting out this case, because it's novel.
Leo Laporte [01:10:18]:
Well, the lawyers are going to make the bulk of the money; that's why they're shouting.
Paris Martineau [01:10:22]:
That's— I mean, no. In tort law, lawyers' fees are typically around 30%.
Leo Laporte [01:10:30]:
Depends what Casey agreed to.
Paris Martineau [01:10:32]:
Yeah. But on the American Museum of Tort Law website, they had this thing from JD Supra which said, can social media or AI be a defective product? Product liability in mass tort law.
Leo Laporte [01:10:45]:
That's interesting.
Paris Martineau [01:10:46]:
And that's when I realized, with what we've been talking about, there's a parallel wave of litigation against AI developers, against ChatGPT, obviously against OpenAI, Character AI, and similar things, trying to argue kind of the same thing. I think it's obviously a bit more complicated given that the nature of their platforms is different, but, you know, take the case of the Character AI chatbot that quote unquote urged a child to kill themselves. Yeah, there are a lot of those. They're asking: is that defective design?
Leo Laporte [01:11:25]:
So this is consumer product law that's related.
Jeff Jarvis [01:11:28]:
That's the issue, right? Yeah.
Leo Laporte [01:11:30]:
Yeah. That's interesting.
Paris Martineau [01:11:31]:
It's product liability, baby.
Jeff Jarvis [01:11:33]:
And it's going to come to AI because of the whole talk about: are guardrails possible? Can they do anything? Is it a fool's errand? They argue that AI can take over all mankind, but they can't make a simple change to avoid a simple problem.
Leo Laporte [01:11:48]:
Right.
Jeff Jarvis [01:11:49]:
It's something the lawyers of the AI companies should be perking their ears up at, and they're going to be destroying a lot of documents about that now.
Paris Martineau [01:11:57]:
Yeah. And I just think the really important, highest-level takeaway here is that all of this seems to be another piercing of this longstanding assumption the tech industry has had that if you're a tech company, everything gets broad immunity from tort exposure because of Section 230. Genuinely, a significant number of lawsuits involving these sorts of companies and platforms end up just getting dismissed first thing. They're like, ah, Section 230, you can't.
Marshall Kirkpatrick [01:12:32]:
It's not—
Leo Laporte [01:12:32]:
in this case though, it just doesn't hold.
Paris Martineau [01:12:35]:
Yeah.
Leo Laporte [01:12:35]:
Because it isn't content posted by its users that was the issue. The issue here is how that content was displayed, the algorithms used to display that content.
Paris Martineau [01:12:44]:
It's, yeah, it's a product design issue.
Leo Laporte [01:12:46]:
And so it's just a—
Paris Martineau [01:12:48]:
this is a—
Leo Laporte [01:12:48]:
yeah, that makes sense.
Paris Martineau [01:12:50]:
A segment of law and litigation that somehow hadn't really emerged wholeheartedly until recently. And I think that's just very interesting precedent-wise.
Leo Laporte [01:12:59]:
I think what Jeff and I are mostly concerned about is, going forward, whether this is going to extend liability into other new technologies and put a chill on them. And in a way that may be impossible to follow, right?
Jeff Jarvis [01:13:15]:
And so it goes back to Section 230 to this extent: without the shield that it provided, every online company would have said, you can't talk here, you can't do anything here. Nope.
Leo Laporte [01:13:28]:
Am I liable?
Jeff Jarvis [01:13:29]:
Because I don't want to be liable because I can't get insurance, my lawyers aren't going to let me. And so we're going to have an internet of PDFs.
Leo Laporte [01:13:35]:
So am I liable for creating shows that are so fabulously addictive that people have to listen to them 3 hours every single day?
Jeff Jarvis [01:13:43]:
We're the ones that were addicted. Paris and I are filing a class action suit against you later today.
Marshall Kirkpatrick [01:13:50]:
There's got to be somewhere in between, uh, you know, an internet of PDFs and I could shoot a man on Fifth Avenue and still be elected president.
Leo Laporte [01:14:00]:
We are still looking for that somewhere in between. Um, bye-bye, Sora.
Marshall Kirkpatrick [01:14:06]:
In other words, hardly knew ye.
Leo Laporte [01:14:08]:
That app which was— sayonara. OpenAI has announced it's going to shut down Sora, which is ironic because Disney agreed to give them a billion dollars and license their characters to Sora so people could use Disney characters in their videos. I guess Disney says, yeah, well, never mind if you're gonna shut it down. Martin Pears and Anna and Gahan, your former colleagues at The Information, say OpenAI wrong-foots Disney. Well, Disney must be so glad it committed a billion dollars to OpenAI.
Jeff Jarvis [01:14:50]:
No, billion. They're not paying the billion.
Paris Martineau [01:14:52]:
They said, well, that's such a Martin Pears headline.
Leo Laporte [01:14:58]:
I'm not gonna delve into that. Anyway, I think OpenAI— we were talking before the show about why OpenAI did that, and there are a lot of good reasons. OpenAI, we've had the story, is focusing more. They look at the enterprise revenue all of a sudden generated by Anthropic and say, hey, maybe this chatbot thing isn't where we should have spent our energy.
Jeff Jarvis [01:15:25]:
At the same time, Walmart has pulled out because OpenAI's shopping was not performing.
Leo Laporte [01:15:31]:
Yeah. Look at this graph.
Jeff Jarvis [01:15:33]:
Pissing off Microsoft. OpenAI is not your get-along company.
Leo Laporte [01:15:37]:
This article, The AI Spending Flip. This is AI model share of first-time enterprise customers: OpenAI declining dramatically while Anthropic increases dramatically. They flip-flopped. A year ago OpenAI had 60% of the enterprise market to Anthropic's 40%. Now it's Anthropic's 73% to OpenAI's 26%. So it's a—
Paris Martineau [01:16:03]:
Hey, that's probably one of the reasons why OpenAI was offering private equity firms, what is this, a 17.5% return rate this week?
Jeff Jarvis [01:16:15]:
Guaranteed. Yeah, it wasn't just a hard bargain. It was a guarantee.
Paris Martineau [01:16:19]:
Guaranteed, which is crazy.
Leo Laporte [01:16:19]:
I'll take that. That's pretty good.
Paris Martineau [01:16:22]:
I mean, that's, it's very interesting.
Leo Laporte [01:16:26]:
It's risk-free. You know what, if they're bankrupt, they can't pay that money anyway. So why not promise it?
Paris Martineau [01:16:31]:
Well, I mean, part of what they're saying, I assume, is they're hoping to get it back by having all of the PE firms get their portfolio companies to buy expensive enterprise subscriptions with OpenAI, and that's how they're going to pay the private equity firms that 17.5% back. But my first question is, what portfolio company in the year 2026 doesn't already have an enterprise AI subscription? Probably not many, right?
Leo Laporte [01:17:04]:
I wonder what's gonna happen to Johnny Ive's $6 billion, uh, AI device. That's another distraction, isn't it? Uh, OpenAI has a web browser.
Marshall Kirkpatrick [01:17:15]:
An analysis of the show transcripts over the last 18 months found that OpenAI's financial stability used to be a major topic of conversation here but has been on the decline now for some time.
Paris Martineau [01:17:29]:
I wonder why.
Leo Laporte [01:17:31]:
You know, have there been any other changes? I love you, Claude. But I'm not alone, and I think you've probably seen this also, Marshall: among the nerds, Claude has just got all the mindshare right now.
Jeff Jarvis [01:17:47]:
Well, I'll go back to OpenAI. Is it in trouble?
Paris Martineau [01:17:51]:
When is it not?
Jeff Jarvis [01:17:53]:
Well, yes, I think it's more: is it in more trouble? I mean, there's no IPO, there's no possible purchase, no one's going to buy it. It's desperate.
Leo Laporte [01:18:02]:
Microsoft's already threatening to sue them, so that relationship soured.
Paris Martineau [01:18:06]:
What? Why? Why are they threatening to sue?
Leo Laporte [01:18:09]:
Because OpenAI did a deal with Amazon and Microsoft wasn't too happy about that. They said that's a violation of our agreement.
Paris Martineau [01:18:15]:
I was like, but I need all the money in the world so I can maybe make a product that works.
Leo Laporte [01:18:21]:
Um, so Walmart says that the ChatGPT checkout converted 3 times worse than the website.
Jeff Jarvis [01:18:28]:
Yeah. What's the strategy behind OpenAI now? Anthropic's is clearly— it's got enterprise, B2B, coders. It was ready for Claw even though they pissed off the OpenAI—
Leo Laporte [01:18:41]:
they do own OpenClaw. They did hire Peter Steinberger. They bought OpenClaw.
Jeff Jarvis [01:18:45]:
Yeah, but that's like Mark Zuckerberg buying MoatBook. It was meaningless. He didn't need to buy them.
Leo Laporte [01:18:51]:
Right. And of course, OpenClaw is now an open foundation and is not owned by—
Jeff Jarvis [01:18:57]:
Exactly. So that was a sign of desperation in both cases.
Leo Laporte [01:19:01]:
You know what? I think investors are not going to quickly turn their back on OpenAI. I'm putting myself in the mind of the billionaires, not always a good thing. If only, if only. But what I imagine is that they say, look, somebody's going to come along with AGI and that's going to be the upside.
Jeff Jarvis [01:19:21]:
That's your first, that's your first. But go ahead.
Leo's Laptop Audio [01:19:23]:
How?
Jeff Jarvis [01:19:24]:
Okay, sorry, stop there.
Leo Laporte [01:19:25]:
Right, somebody— I'm just telling you what the Jason Calacanises of the world will think.
Paris Martineau [01:19:32]:
And they're always right.
Leo Laporte [01:19:34]:
Yeah, well, actually, Marshall took some money from Jason, so we can ask Marshall. But I think they're thinking there is potentially a massive upside to AI.
Jeff Jarvis [01:19:45]:
For those of you listening, you should have seen the grimace on Marshall's face.
Leo Laporte [01:19:50]:
There is a massive upside to AI. We don't really know yet. I mean, the minute it becomes clear, oh, these guys aren't going to win, then they'll be like rats leaving a sinking ship. But until then, they're going to hedge their bets. So I think there's still going to be plenty of money.
Jeff Jarvis [01:20:06]:
Well, NVIDIA was going to invest $100 billion, and then they're going, no, more like $20 or whatever it is, $30.
Leo Laporte [01:20:13]:
Then maybe the rats have started to leave.
Jeff Jarvis [01:20:15]:
They've started leaving.
Leo Laporte [01:20:16]:
Yeah, that's when it will happen: when investors say the winner is not going to be OpenAI.
Jeff Jarvis [01:20:22]:
So I don't think it's clear yet. Trouble?
Leo Laporte [01:20:24]:
Their models are very good. Phi-4 is very good.
Jeff Jarvis [01:20:26]:
I go back to what is their vision and business model. We know what Google's is. We know what Amazon's is, kind of. We definitely know what Anthropic's is these days. What is OpenAI's mission and vision? I don't know.
Paris Martineau [01:20:42]:
Make money?
Jeff Jarvis [01:20:44]:
But—
Paris Martineau [01:20:44]:
question mark?
Marshall Kirkpatrick [01:20:46]:
I mean, they want to— they want every consumer to spend $20 a month on their apps and all the enterprises to pay for their APIs to—
Jeff Jarvis [01:20:57]:
There are few AI companies that are both consumer and enterprise. And I think OpenAI has been— in fact, when I taught business to my students, I said: when you're starting out, you really can't be both, because you often end up competing with yourself or your customers as a result. And so OpenAI was unquestionably a great consumer brand. Now it's pushing after the enterprise, which Microsoft has, yes.
Leo Laporte [01:21:19]:
So, and this is an Information graph going from October 2023 to last month, annualized revenue: OpenAI is still on top with $25 billion a year. Anthropic, though, has gone from practically nothing to $19 billion a year, and they're growing at what looks like an almost exponential rate. Now remember, revenue is not profit. None of this is profit. But I'm sure the investors look at revenue as one of the metrics.
Jeff Jarvis [01:21:50]:
By the way, a question: getting rid of Sora doesn't mean they're getting rid of video. They're just getting rid of that product, Sora, that was—
Paris Martineau [01:21:56]:
I think they're getting rid of all of it, weren't they? Are they stopping video generation?
Leo Laporte [01:22:01]:
It's not just the app. It's the whole Sora model is gone.
Paris Martineau [01:22:05]:
I can't help but see that as: video is really expensive. And much like Marshall said, their current business model is, we want to get everybody to pay us $20 a month or pay us for an API enterprise subscription. And on both of those, you're losing a crap ton of money too. At some point, you run out of people to ask for money. I mean, they're running out of people to ask for money to the point where they're asking people who are getting them in legal trouble with the other people they've asked for money. It's seeming like we've got a little Ed Zitron going on here, you know.
Leo Laporte [01:22:45]:
And there's another argument, which we also talked about before the show, which is that they may just want the GPUs and computing. They want the compute to dedicate to something else more revenue-forward. They also know, which we don't know, how many people were using that app. I bet it was down to a very small number.
Paris Martineau [01:23:04]:
You bet. Why? Just a couple of guys? Go back in the NotebookLM and find all the times you were like, Sora's the future, everybody's going to be using this app, it's the coolest thing ever.
Jeff Jarvis [01:23:15]:
Hollywood's still shaking.
Marshall Kirkpatrick [01:23:17]:
Hollywood's still shaking in their boots.
Jeff Jarvis [01:23:18]:
I put it in the chat.
Paris Martineau [01:23:20]:
Yeah, wait, I thought this was gonna stop, uh, Hollywood. I thought we were never gonna have a need for an actor ever again because Sora was gonna make all these cool—
Jeff Jarvis [01:23:29]:
Sora was pretty god-awful. Sora was a gimmick.
Leo Laporte [01:23:33]:
Yeah.
Jeff Jarvis [01:23:34]:
So in response to a post saying that OpenAI published a blog about safety standards for Sora and today they scrapped the feature completely, Ed tweeted, this is something a company does when things are going well.
Leo Laporte [01:23:46]:
It was a little too happy to celebrate.
Paris Martineau [01:23:49]:
We need to just get a little Ed sound bite so that we can put that in, maybe a soundboard-style one. Would you ask Ed to record something like, I told you that would happen, or, what did you think would happen?
Leo Laporte [01:24:04]:
What did you think?
Paris Martineau [01:24:04]:
These companies are a scam.
Leo Laporte [01:24:06]:
Yeah, you know it well.
Jeff Jarvis [01:24:08]:
Get them to record some Ed soundboard.
Paris Martineau [01:24:10]:
Yeah, we actually do need an Ed soundboard.
Marshall Kirkpatrick [01:24:12]:
Is OpenAI too big to fail? Like, is that their, is that their strategy? Like, get too big to fail?
Paris Martineau [01:24:20]:
Like, well, that was my sadness.
Leo Laporte [01:24:24]:
You know why that might be the case? That's probably one of the reasons they stepped up when Anthropic said, we're not going to do this Defense Department stuff, and they immediately said, okay, we will: that's how you get too big to fail. If the government relies on you, if the Department of War depends on you, then maybe you do get too big to fail, or at least the government has to kind of prop you up a little bit if they're relying on you. I guess the other question is, and you would know about this, Marshall, how fungible these models are. It looks like the Department of War very, very easily replaced Anthropic, right?
Marshall Kirkpatrick [01:25:01]:
So Palantir may not be so excited to—
Leo Laporte [01:25:04]:
Yeah, Palantir does a lot of Anthropic, right?
Marshall Kirkpatrick [01:25:08]:
Yeah, I don't know. I mean, I certainly would be unhappy to lose access to Claude models, but I do have a circuit breaker system in place too. When the API goes red, as happens, there's like a—
Leo Laporte [01:25:26]:
A lot.
Marshall Kirkpatrick [01:25:27]:
Yeah.
Leo Laporte [01:25:27]:
Happened yesterday, didn't it, Paris? You asked me, is Claude squirreling out on you?
Paris Martineau [01:25:32]:
I mean, it was just an issue where like every time I would try to use Claude, it would take like 3 minutes to generate a response.
Leo Laporte [01:25:41]:
Something was going on.
Marshall Kirkpatrick [01:25:42]:
Well, if you look at the status page of the Claude API, it's got, you know, green, green, green, yellow, red, red, red. You can see days where there are issues. And right around when the whole Department of War controversy came up, a whole bunch of people came piling in to use it, and it's had a lot of red on it. So yeah, this is current. And my system falls back to GPT when Claude goes down.
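The fallback Marshall describes, a circuit breaker that routes around the primary model when its API goes red, can be sketched roughly like this. This is a hypothetical illustration, not Marshall's actual system; the `primary`/`fallback` callables and the threshold and cooldown values are all assumptions.

```python
import time

class CircuitBreaker:
    """Route requests to a primary model, falling back to a secondary
    after repeated failures; retry the primary once a cooldown elapses.
    Hypothetical sketch: 'primary' and 'fallback' stand in for real
    API client calls (e.g. Claude and GPT)."""

    def __init__(self, primary, fallback, threshold=3, cooldown=60.0):
        self.primary = primary        # callable: prompt -> response (may raise)
        self.fallback = fallback      # callable: prompt -> response
        self.threshold = threshold    # consecutive failures before opening
        self.cooldown = cooldown      # seconds to wait before retrying primary
        self.failures = 0
        self.opened_at = None         # time the circuit opened, or None

    def call(self, prompt):
        # While the circuit is open, use the fallback until the cooldown passes.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.cooldown:
                return self.fallback(prompt)
            self.opened_at = None     # half-open: give the primary one try
            self.failures = 0
        try:
            result = self.primary(prompt)
            self.failures = 0         # success resets the failure count
            return result
        except Exception:
            self.failures += 1
            if self.failures >= self.threshold:
                self.opened_at = time.monotonic()
            return self.fallback(prompt)
```

Wiring it up would just mean passing real API client calls as `primary` and `fallback`; after `threshold` consecutive failures the breaker stays on the fallback until `cooldown` seconds pass, then gives the primary one more try.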
Leo Laporte [01:26:10]:
Actually, this is not looking good.
Marshall Kirkpatrick [01:26:13]:
Oh, that's about what it looked like. Yeah, for a while.
Leo Laporte [01:26:15]:
Yeah, a lot of funny Claude for Government stuff out there. Oh yeah, there's Claude for Government. It's working fine because no one's using it. All right, we're going to take another break. You're watching Intelligent Machines with Paris Martineau and Jeff Jarvis. It's great to have Marshall Kirkpatrick with us, a longtime tech journalist but also an avid, dare I say avid, AI user. Is that fair?
Marshall Kirkpatrick [01:26:44]:
Yeah.
Jeff Jarvis [01:26:45]:
Addict. He's an addict just like you, Leo.
Leo Laporte [01:26:47]:
He has something to tell you, Paris. He put 18 months of our transcripts into his machine and has a few things to tell you about me.
Jeff Jarvis [01:26:57]:
Ooh, this excites the spirit.
Paris Martineau [01:27:01]:
Is this different than the NotebookLM?
Leo Laporte [01:27:04]:
By the way, I talked to Adamova, who created that. He joined us on our AI user group a couple of weeks ago. He did get all of the transcripts in there, but he had to chunk it by months to do it. Yeah, you were right. And he had all those lovely graphics showing what a moron I am. So thanks for that. AI salute. We'll have more in just a minute.
Leo Laporte [01:27:29]:
Our show today is brought to you by Spaceship. Remember, Paris? Secretly British.
Paris Martineau [01:27:36]:
I literally have time on my calendar on Saturday to work.
Leo Laporte [01:27:40]:
Okay, we registered it with Spaceship and, I tell you, we shopped around. Secretlybriti.sh. Brilliant domain name. Spaceship had the best price for it. If you've heard us talk about Spaceship before, there's a reason it keeps coming back: Spaceship is rethinking how people register and manage domains, and this fresh approach has led to, we're not alone, more than 6.5 million domains under management in absolute record time. That kind of growth comes from giving people what they actually want.
Leo Laporte [01:28:12]:
Spaceship offers transparent low pricing on domain registrations. Transfers are fantastic. If you have— if you're with another registrar, check out what you get when you transfer. And crucially, renewals, right? You're going to save all around. It's not just a one-time saving and then they jack the price up. That means there's more clarity over what you're paying for over time. Alongside great value, the platform is especially built for flexibility. We did this with Secretly British.
Leo Laporte [01:28:41]:
You can instantly connect your Spaceship-registered domains to Spaceship products like web hosting, professional email, and virtual machines, and you can build and test before committing, because almost every Spaceship product comes with a 30-day trial. I like that. Now, if you still want to use third-party tools, that's fine, no problem. We did this with Paris. Just point your domain to what you need by updating your DNS records or name servers. You can even use their AI, ALF, to do the hard work. So you have the freedom to build your stack exactly how you want.
Leo Laporte [01:29:13]:
When I realized that Secretly British wasn't going to be a real website for a while, I just pointed it to Paris's existing website. It was that easy with Spaceship. Basically, Spaceship is the best of every world. Visit spaceship.com/twit to learn more. It'd be a great place for your OpenClaw. That's spaceship.com/twit. We thank them so much for supporting Intelligent Machines. Thank you, Spaceship.
Leo Laporte [01:29:41]:
That Saturday— you know what, don't feel pressure, Paris. I don't want you to feel pressured. Secretly British can wait.
Paris Martineau [01:29:49]:
I don't feel pressured. I literally, independently of this message, my friend was like, we gotta get on our great business idea of Secretly British.
Leo Laporte [01:29:58]:
But don't make it too good because you might get sued for being addictive. I'm just warning you. Don't make it too good.
Paris Martineau [01:30:07]:
Why would you still want to try to pass any laws that make the world good or better in any way? Because Leo could find one of the bad things and he will say you shouldn't have it.
Jeff Jarvis [01:30:17]:
Marshall, we have it on the record now. When was Paris warned that this would be a problem?
Paris Martineau [01:30:22]:
Guys, we will be fire and ash in 10 to 50 years. None of this is going to matter.
Leo Laporte [01:30:32]:
Really? Is that your deep— how deeply held belief?
Paris Martineau [01:30:36]:
I think it's a— I think there's a dice throw chance that we're gonna be fire and ash.
Jeff Jarvis [01:30:41]:
Did you watch the AI doc already? Is that what's gotten you there?
Paris Martineau [01:30:45]:
No, I just think, you know, you look around, you see the way the world's going, you see the rate at which it's gotten worse over the past 5 to 10 years. I think it's a reasonable, you know, throw a D20, roll a one on that one, we're fire and ash sort of chance, you know. You've got a 10%. But I think that's fine.
Leo Laporte [01:31:01]:
What is that? So D20 is 5%, by the way.
Jeff Jarvis [01:31:05]:
Sorry, a D20 is 5%.
Leo Laporte [01:31:07]:
A D20, 5%.
Jeff Jarvis [01:31:08]:
Thank you.
Leo Laporte [01:31:09]:
1 in 20. Yeah.
Marshall Kirkpatrick [01:31:10]:
Speaking of numbers, I, I look at, uh, billion-dollar natural disasters, uh, per year in the United States. In, in 1980, there were 2, uh, inflation-adjusted billion-dollar disasters. In, uh, 2024, there were 28. And then they've stopped counting since then because the Trump administration said shut down the, the campaign measuring those. But yeah, it's a— it's quite—
Leo Laporte [01:31:39]:
so we don't have any numbers after 2024.
Paris Martineau [01:31:42]:
We don't have any numbers. We don't even have numbers, Leo.
Leo Laporte [01:31:46]:
It's not happening because we don't know.
Marshall Kirkpatrick [01:31:50]:
Luckily, third parties and independent folks have continued measuring it, but it is up and to the right in terms of—
Leo Laporte [01:31:56]:
So Paris, what did you want to know about the last 18 months of Intelligent Machines? What insights could Marshall give you from his database?
Marshall Kirkpatrick [01:32:08]:
Well, I can tell you why I thought to ask. It was because of Paris saying in the last episode, "Oh, Leo, you say this is going to change everything all the time." We were talking about Claude Code and coding, and I said, oh, that's interesting. I wonder what kind of history there really has been of that. And so I did put a link just in chat right now. I don't know if you want to look, but am I turning into Scoble?
Leo Laporte [01:32:36]:
Is that what you're telling me, Marshall?
Marshall Kirkpatrick [01:32:38]:
Well, no. So Claude's analysis— Claude pulled down 75, you know, episodes and transcribed all of them, and it did. It used the word hyperbolic, not me.
Leo Laporte [01:32:50]:
Uh, oh, you're making Paris so happy.
Paris Martineau [01:32:53]:
But it said that on 60% of the shows, Laporte emerged as a self-described AI accelerationist, in scare quotes, whose enthusiasm intensified over the period, making hyperbolic claims in 45 of 75 episodes, though consistently leavened by genuine skepticism and self-awareness.
Marshall Kirkpatrick [01:33:16]:
Yes, if you go down to the "This Will Change Everything" section, it says it is not as simple as Leo being like a naive, you know, boy who cried wolf. Instead, his claims, while frequent, varied in intensity and were accompanied by self-aware qualification and genuine skepticism about specific products and companies.
Leo Laporte [01:33:38]:
I was never a monolith, it says.
Paris Martineau [01:33:44]:
Um, I love these tags. I love that there's a tag called Revolution Imminent.
Leo Laporte [01:33:51]:
Oh boy.
Paris Martineau [01:33:52]:
I think we're on the cusp of a pretty big AI revolution in the next year, he said in December 2024.
Leo Laporte [01:33:58]:
Well, you're going to—
Paris Martineau [01:33:59]:
These are going to be a very interesting few years.
Leo Laporte [01:34:02]:
I'll stand by.
Paris Martineau [01:34:02]:
Oh, he predicted an AI co-host. Within the next 5 years, I guarantee, on Twitter.
Leo Laporte [01:34:08]:
That's because I'm going to make it happen.
Paris Martineau [01:34:10]:
A personal agent on your wrist can change everything.
Leo Laporte [01:34:14]:
Yes, and I've been working hard at that. Yes, even though I erased it yesterday. By the way, your role, Paris, is as empirical check. Paris's role was to ground the conversation with reporting, data, and personal experience.
Marshall Kirkpatrick [01:34:30]:
Good.
Jeff Jarvis [01:34:31]:
True.
Leo Laporte [01:34:31]:
Good job. Your signature, your signature move, Paris, was the empirical correction.
Paris Martineau [01:34:38]:
Leo, on February 20th, 2025, you can't look it up, uh, what did you say you measure your life in now? Let's fill in the blank. I measure my life in blank now.
Leo Laporte [01:34:50]:
I measure my life—
Paris Martineau [01:34:51]:
how? Just think about your day-to-day. How do you measure your life?
Leo Laporte [01:34:55]:
In tree rings? I don't know. You're going to have tokens. In what? Tokens. Oh yeah, that's fair.
Paris Martineau [01:35:01]:
I just think this—
Leo Laporte [01:35:01]:
That's fair. I do. I measure my life in tokens.
Paris Martineau [01:35:03]:
7 days later: "I am an AI accelerationist" is the quote. February 7th, 2025. The first explicit declaration.
Leo Laporte [01:35:18]:
I like how you put this in black. The accusation.
Paris Martineau [01:35:21]:
The same episode. Uh, about the musicians' silent album protest: "the good news is they're all going to be gone soon."
Marshall Kirkpatrick [01:35:34]:
Wow, that's pretty horrible.
Leo Laporte [01:35:36]:
It's on the record, I guess.
Paris Martineau [01:35:39]:
This is the most impressed I've been by anything AI-related on this show ever, because it understands.
Leo Laporte [01:35:44]:
See how good it is?
Paris Martineau [01:35:46]:
Under a tag called Guest Adulation. Revolution imminent. It says, "1 liter of computronium would give you more capability than all human beings together." Wow. Context: Ray Kurzweil interview. Awe at guest claim: "I want to drink from that brain." Admiring Kurzweil's intellect.
Paris Martineau [01:36:07]:
This is going to be the year of robotics. I do think it understands what I found funny about that interview, and I really delight in that.
Marshall Kirkpatrick [01:36:16]:
Well, this being the case, I did another analysis that visualized the balance between AI autonomy and people and organizations taking responsibility for AI across the last 5 episodes of Intelligent Machines. And it found that you all are consistently advocating for people and organizations to take responsibility for their AI, whether it's high autonomy or low autonomy. There's an emphasis on responsibility here that I don't hear elsewhere, at least, and I haven't analyzed this, but I don't see it on, on the other major AI podcasts.
Jeff Jarvis [01:36:57]:
Yes, they interview CEOs and say, what else have you done that's wonderful lately?
Paris Martineau [01:37:03]:
I'm—
Leo Laporte [01:37:03]:
What's the tool, by the way? Is this What's Up With That, basically, that you used for this?
Marshall Kirkpatrick [01:37:08]:
No, this was a loop, uh, that I was creating, that my friend at Fleet of Geniuses showed me how to make, where— it's a skill where I said, hey, go pull down 18 months and do this analysis. I got to go take a shower and eat lunch. When I came back, boom, it was—
Leo Laporte [01:37:29]:
how did it get 18 months in its token context? I mean, that's a lot of data.
Marshall Kirkpatrick [01:37:36]:
It chunked it out into 15 subagents.
Leo Laporte [01:37:40]:
Uh, aha.
Marshall Kirkpatrick [01:37:41]:
And then used Opus to put it all together.
Leo Laporte [01:37:45]:
Very cool.
Jeff Jarvis [01:37:46]:
Can we get this?
Paris Martineau [01:37:47]:
The Emily— I'm sorry, the Emily Bender episode. Uh, it says, "mine is always right," in brackets, about Perplexity. Context: defending AI reliability while simultaneously showing an error. Oh, that's got your ass.
Leo Laporte [01:38:05]:
Did I show an error? Did it show it?
Paris Martineau [01:38:06]:
No, you did. If you recall, during the Emily Bender episode, there was a Perplexity screen, and Perplexity had gotten the biographical details wrong.
Leo Laporte [01:38:15]:
No, no, I did. Perplexity was right, I misread it.
Jeff Jarvis [01:38:19]:
But she still blamed Perplexity.
Leo Laporte [01:38:21]:
She blamed Perplexity. That's a subtlety that probably Claude didn't get, that she blamed Perplexity. And I also didn't get it. And in fact, I said, oh no, no, it wasn't Perplexity, it got it right, I misread it. But I will say this. I cop to all of this. It's absolutely accurate. But the nuance that it misses is that my job here is as a show host.
Leo Laporte [01:38:43]:
It's part of what I have to do to make this show interesting. It isn't necessarily what you would get if you sat down with me at dinner and we were talking about this stuff. I don't sit down and say my Perplexity is always right.
Paris Martineau [01:38:57]:
I was about to say, Leo, maybe this is the reason why you haven't come out to see Jeff and I in a while.
Marshall Kirkpatrick [01:39:02]:
You don't want to hear the truth.
Leo Laporte [01:39:04]:
No, but I mean, to a certain extent, you know, this is showbiz to a certain extent.
Jeff Jarvis [01:39:10]:
You need an irony voice. Well, like, we need an irony voice.
Leo Laporte [01:39:15]:
As I— as you know, you also know this. I often take opposing views that I don't necessarily believe in. I mean, this is all part of the process. Much like you're saying, it's showbiz.
Paris Martineau [01:39:25]:
Me razzing you about this right now and the fact we've gotten this—
Leo Laporte [01:39:29]:
I'm not taking it personally. I know that. That's why I'm not taking it personally.
Paris Martineau [01:39:32]:
I know. I'm just saying.
Leo Laporte [01:39:33]:
Yes.
Jeff Jarvis [01:39:34]:
It also lacks—
Leo Laporte [01:39:34]:
If it really felt that way about me, I'd be crying right now.
Jeff Jarvis [01:39:37]:
I also don't think it knows about sarcasm.
Leo Laporte [01:39:40]:
Ah, interesting. Do you think it misses that, Marshall?
Paris Martineau [01:39:43]:
I mean, it does say it knows about sarcasm, but I don't know whether or not it's accurate.
Leo Laporte [01:39:47]:
For instance, Guy Kawasaki: "I am convinced that AI is God."
Paris Martineau [01:39:53]:
Actually, he did say that, and he does believe it. So I don't think that's sarcasm.
Leo Laporte [01:39:58]:
Uh, let's see if Ray Kurzweil is right. He says AGI by 2029, singularity by 2045, computronium and longevity escape velocity by 2032. Now let's compare that to Paris Martineau's prediction for the same time frame: fire and ash.
Paris Martineau [01:40:20]:
Hey, one of us will be right. One of us will be alive to see who's right.
Leo Laporte [01:40:25]:
When computer cloning is here, you'll be sorry.
Paris Martineau [01:40:28]:
This is such a cool website. I love it. So we're looking at a section right now called Guest Parade, who shaped the conversation, and it puts everybody in buckets based on their— I don't know. I'm just, I'm myopic and self-obsessed, so I love—
Jeff Jarvis [01:40:41]:
Do we get to see this? I'm just seeing it on your screen.
Leo Laporte [01:40:46]:
Go in the Zoom chat. But more importantly, you can run What's Up With That on it too. Oh, interesting.
Marshall Kirkpatrick [01:40:54]:
And it will tell you what's most notable in the industry. Mine, I just ran it, said that y'all picking up on vibe coding early and following it along was a standout insight. And then I clicked the power tools to create and make a joke about the transcript.
Leo Laporte [01:41:14]:
And this is really, uh, this— so ironically, what this also does is prove that I was right about AI. It is incredibly useful and amazing what it can do.
Paris Martineau [01:41:27]:
We've never said it was not useful, though.
Jeff Jarvis [01:41:30]:
We just didn't—
Leo Laporte [01:41:31]:
it's amazing.
Jeff Jarvis [01:41:32]:
We said it's amazing. We agree with that.
Leo Laporte [01:41:35]:
Are there hallucinations in here?
Paris Martineau [01:41:37]:
Okay, there are, because it ends the description of me with, "Paris's departure to Consumer Reports was a significant loss to the show's dynamic." Oh, that's not true.
Leo Laporte [01:41:49]:
I'm sorry, guys. Oh, that is great.
Marshall Kirkpatrick [01:41:52]:
I did notice that too.
Jeff Jarvis [01:41:53]:
There's also one other thing to consider here, which is that the transcripts aren't totally accurate.
Leo Laporte [01:41:59]:
It even says that at the end. There's actually a little disclaimer that says, "Extraction quality varies. Some episodes have more detailed quote capture than others."
Jeff Jarvis [01:42:08]:
And I don't usually find my name in there. So anything I say is usually attributed to someone else.
Leo Laporte [01:42:13]:
Ah, it also says, "hyperbolic statement is a subjective judgment. Some statements classified as hyperbolic in the extractions, like 'I love AI,' are enthusiastic, but not necessarily exaggerated." I do love AI.
Marshall Kirkpatrick [01:42:31]:
So this was a one-shot thing, too.
Leo Laporte [01:42:33]:
It's amazing. How long did it take to generate this? Uh, you took a shower?
Marshall Kirkpatrick [01:42:37]:
Yeah, yeah, I, I wasn't— I had just gone for a jog listening to the show, uh, and I mean, the, the blocker here, the, the bottleneck, is thinking of the idea. You know, I was going for a jog and I heard Paris giving you a hard time, Leo, and, uh, and I said, I'm, I'm gonna ask this question. And there is this compounding innovation. A buddy of mine named Justin Kistner, who I've known since I was 16, so 30-some years ago, showed me with his Fleet of Geniuses project how to make this looping thing that is based on Claude Code's previous—
Leo Laporte [01:43:23]:
Was it a Ralph loop?
Paris Martineau [01:43:24]:
Ralph Wiggums, right?
Leo Laporte [01:43:27]:
That's what they call it, Ralph Wiggums.
Paris Martineau [01:43:28]:
That is what it's called.
Marshall Kirkpatrick [01:43:30]:
I don't know, I— but it does— it's not a cron job, right?
Leo Laporte [01:43:34]:
It's— yeah, actually, there is now a /loop command in the— in Claude Code.
Jeff Jarvis [01:43:39]:
So I want— I want— I want to— now that I have it on my screen, I want— I want to brag about this. Jeff maintained the most stable position across 18 months. His core beliefs never wavered.
Marshall Kirkpatrick [01:43:51]:
Well, while everything else changed.
Paris Martineau [01:43:54]:
Yeah, boring.
Marshall Kirkpatrick [01:43:55]:
Hey, hey, hey, Jeff, it's show business, man.
Paris Martineau [01:43:59]:
Jeff, you gotta do the case.
Marshall Kirkpatrick [01:44:01]:
But then all three of you together, it's delightful. You know, that's an interesting description of the dynamic.
Jeff Jarvis [01:44:09]:
Philosophical anchor. Um, I, can I, can I get that in my, uh, in my intro now?
Paris Martineau [01:44:16]:
Yeah, it should just say philosophical anchor.
Leo Laporte [01:44:21]:
No, in Paris, empirical check.
Jeff Jarvis [01:44:23]:
I think that's our lower thirds now, right? Go ahead, you know, go ahead, do it.
Leo Laporte [01:44:27]:
What is mine? Host and self-declared accelerationist. Okay, that's fair, that's fair. Uh, you're watching Intelligent Machines with host and self-declared accelerationist Leo Laporte, philosophical anchor Jeff Jarvis, and our empirical check Paris Martineau. Our special guest this week, Marshall Kirkpatrick, who categorized us all. He's the categorizer. Is that— is this prompt public knowledge, or is this kind of a secret sauce?
Marshall Kirkpatrick [01:44:57]:
Uh, so it's a Claude Code skill that I then invoked.
Leo Laporte [01:45:04]:
Has he published it somewhere that we can—
Marshall Kirkpatrick [01:45:06]:
that's a good question.
Leo Laporte [01:45:07]:
Let me see. One of the great things going on is that a lot of this stuff is public. It's both a blessing and a curse because it's all on Git. A lot of it's on GitHub, a lot of skills. If you just search for Claude Code skills, the quality varies immensely, but you don't have to write your own skills. And often there are people who are very good at this who've come up with some very useful skills. I've tried many, so many, so many.
Jeff Jarvis [01:45:34]:
Marshall, thank you for doing this.
Leo Laporte [01:45:35]:
This is wonderful.
Paris Martineau [01:45:36]:
This is so cool.
Jeff Jarvis [01:45:37]:
This is beautiful.
Leo Laporte [01:45:37]:
We'll have more with the philosophical anchor and the empirical check in just a moment. You're watching Intelligent Machines. More, should we do more news? More news?
Jeff Jarvis [01:45:50]:
We got it.
Leo Laporte [01:45:51]:
We got news. God knows there's plenty of it. There's some new models. Google published actually a really interesting paper. I don't know what this means in the long run. It came out yesterday: TurboQuant, "Redefining AI Efficiency with Extreme Compression." They claim, this is from their Google Research Labs, that they have used vector quantization, something they're calling TurboQuant, to squeeze these models down massively with, they're saying, zero accuracy loss, without reducing their accuracy, which would make a massive difference, because suddenly you'd have these models that could fit in a normal machine or even a phone or a variety of things.
Leo Laporte [01:46:38]:
So this is a research tool, but could be very, very, very interesting.
Jeff Jarvis [01:46:44]:
It fits in with what Jensen Huang was saying during his keynote: the data centers are going to be the data centers. They're going to have the megawatts they have. They're going to have the chips they have. Everything is about shrinking these models, increasing speed, increasing efficiency, and that's how you get the higher economic value out of that stuff.
Leo Laporte [01:47:05]:
Somebody else said something similar, which is in response to this, you know, bitter lesson that you just throw more compute at it. It isn't— I think it was Karpathy who said this— it isn't more compute, it's better algorithms. With more compute, you can double the compute and it's twice as fast, but a good algorithm can take something from, you know, exponential big O to linear. It can make major differences in overall speed and performance. So this is basically an algorithm improvement that makes a fantastic difference in the power of these models. So I think that— and we're seeing a number of these very interesting things, because what has happened is now we have these models and people can really bang on them and try different techniques. So if, you know, this is a little beyond me, but if you're interested in this kind of stuff, the Google research paper is called TurboQuant. Uh, and it may—
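[For the curious, here is the general idea of vector quantization in a toy form: groups of weights get replaced by indices into a small shared codebook, which is where the compression comes from. This is purely an illustrative sketch with made-up names and data; it is not Google's actual TurboQuant algorithm.]

```python
import numpy as np

# Toy vector quantization of a weight matrix (illustrative only).
rng = np.random.default_rng(0)
weights = rng.normal(size=(256, 8))          # 256 weight vectors of dimension 8

def kmeans_codebook(vectors, k=16, iters=20):
    """Learn a k-entry codebook with plain k-means."""
    codebook = vectors[rng.choice(len(vectors), k, replace=False)]
    for _ in range(iters):
        # assign each vector to its nearest codeword
        dists = np.linalg.norm(vectors[:, None, :] - codebook[None], axis=2)
        assign = dists.argmin(axis=1)
        # move each codeword to the mean of its assigned vectors
        for j in range(k):
            members = vectors[assign == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook, assign

codebook, codes = kmeans_codebook(weights)
reconstructed = codebook[codes]              # "decompress": look up each index

# Storage drops from 256*8 floats to 16*8 floats plus 256 tiny indices,
# at the cost of some reconstruction error.
err = np.linalg.norm(weights - reconstructed) / np.linalg.norm(weights)
print(f"relative reconstruction error: {err:.3f}")
```

[The real research involves far more sophistication, e.g. keeping accuracy intact; this just shows why replacing vectors with codebook indices shrinks the storage.]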
Paris Martineau [01:48:04]:
TurboQuant sounds like a term of derision. I've just got to say that for the record.
Leo Laporte [01:48:07]:
Somebody said that. That's what they called me at my, uh, my bank job.
Paris Martineau [01:48:12]:
I was going to say, that's— yeah, that's what happens when you're an intern at a bank.
Jeff Jarvis [01:48:16]:
That's from the show industry. Yeah.
Leo Laporte [01:48:19]:
Well, and so this— from the information, Apple can distill Google's big Gemini model and fit it into an iPhone, which is awesome.
Jeff Jarvis [01:48:28]:
Is that what they're going to announce in a week or so?
Leo Laporte [01:48:30]:
Yeah, so we know at WWDC, Mark Gurman's story yesterday, I think, said that Apple is ready now to announce the new Siri, and they will announce it in June at WWDC, and it'll come along to the rest of us in iOS 27 this fall. But using distillation, which is something that, remember, Anthropic complained about. Dario Amodei said, you know, the Chinese models are using our model to teach their model in a method called distillation. They created 24,000 accounts, asked a bunch of questions, and then took those answers and used them as training for their models to make their models better. Well, that's exactly the technique that Apple is going to be using on Gemini. Apple has complete access to the Gemini model in its own data centers. So they're going to use distillation, in this case not an attack but a technique, to transfer knowledge from that large, powerful Gemini model into smaller models. Apple can ask the main Gemini model to perform a series of tasks to produce high-quality results or answers, including the model's step-by-step chain of thought or reasoning process, then give those responses to a cheaper, smaller model as training data. So that's very interesting.
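[To make the distillation idea concrete, here is a toy sketch: a "teacher" model produces softened output distributions, and a smaller "student" is trained to match them instead of training on raw labels. Everything here is illustrative and assumed, with linear models and random data; it is not Apple's, Google's, or Anthropic's actual pipeline.]

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(512, 10))                # inputs
W_teacher = rng.normal(size=(10, 4))          # pretend "big model" weights

def softmax(z, T=1.0):
    z = z / T                                 # temperature softens the distribution
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

T = 2.0
teacher_probs = softmax(X @ W_teacher, T)     # "ask the teacher" for soft answers

# Train the student against the teacher's soft labels with gradient descent
# on cross-entropy (gradient shown up to a constant factor).
W_student = np.zeros((10, 4))
for _ in range(500):
    student_probs = softmax(X @ W_student, T)
    grad = X.T @ (student_probs - teacher_probs) / len(X)
    W_student -= 0.5 * grad

# The student should now mostly agree with the teacher's top choice.
agreement = (softmax(X @ W_student).argmax(1) ==
             softmax(X @ W_teacher).argmax(1)).mean()
print(f"student/teacher agreement: {agreement:.2f}")
```

[In real distillation the teacher's chain-of-thought text can be part of the training data too, as described above; the soft-label matching is just the simplest version of the idea.]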
Leo Laporte [01:49:47]:
So this deal, this billion-dollar deal with Google, might be very, very powerful for Siri. We'll look for— I'd be very curious. You know, Apple's promised a lot in the past with Siri. The new—
Paris Martineau [01:50:00]:
I'll believe it when I see it. Yeah, everything Siri does— I right now have a big bone to pick with Apple, which is that I have been purposefully delaying updating my phone to the new iOS since it came out because I hate everything about it. And last night, like a fool, I went to bed not worrying, not thinking that my phone would betray me. And now I live in the hell that is iOS 26.
Leo Laporte [01:50:25]:
Oh no, Liquid Glass was forced on you.
Paris Martineau [01:50:27]:
Liquid Glass is awful. So many things about it are awful. Why does, when I take a screenshot, it go black now instead of white in between things? There are just so many small design changes that are so bad that it just reminds me once again of all the things I hate about this new— maybe I just have too much nostalgia for old Apple, but I do think there was a time where, yeah, Apple would ship fewer features, it would have fewer product releases, it wouldn't be the first to a new product category, but when it did release its product, it would be fantastic. And we are so far beyond that.
Leo Laporte [01:51:11]:
I should have—
Jeff Jarvis [01:51:11]:
Enshittification comes for everybody. Every—
Leo Laporte [01:51:13]:
Enshittification comes for everybody. Partly that. It's also partly some strange decision to do this new design, um, which did not enhance the experience in any way.
Jeff Jarvis [01:51:23]:
And it's not just iOS. Like, macOS— the new macOS sucks too.
Leo Laporte [01:51:27]:
Oh, they have Liquid Glass everywhere now, and it came from the Vision Pro. By the way, Marshall, if you ran your little, uh, script on my MacBreak Weekly show, it would show how accurate I was about this stupid Vision Pro. Even Neal Stephenson, who created the term metaverse, is now writing, "Yeah, nobody wants to put goggles on their face. Nobody wants to do that. And when you do that, the rest of the world thinks you're creepy." So it's a non-starter. So I'm just going to say I was not hyperbolic in a positive way about Vision Pro. I guess I might have been hyperbolic.
Paris Martineau [01:52:06]:
They made all those legs in the metaverse, and for what?
Leo Laporte [01:52:10]:
That's right, that's right. Horizon World is gone from VR. Meta's abandoning that on its Meta Quest. They're going to keep it as an iPhone app.
Paris Martineau [01:52:19]:
He should have realized the idea didn't have legs.
Leo Laporte [01:52:26]:
June 8th will be the WWDC keynote. We will cover it, of course, as we always do, and be interested to see what Apple shows. Apple has probably been chastened by the fact that they, you know, 2 years ago showed all of these features which never came out. So I'm gonna presume that they will be a little judicious about what they show. They'll only show what Siri can actually do when it comes out, I hope. Google search referrals to the web have plummeted. AI links have not replaced them. We've talked about this before, you know, the death of the search referral, and there was some hope that maybe AI search links could help.
Leo Laporte [01:53:08]:
But according to 9to5Google, actually this data is from Chartbeat, AI web traffic is about 1%. The smallest publishers are hit hardest by the search traffic decline, says Axios. Look at that.
Marshall Kirkpatrick [01:53:28]:
And this— I'm sorry if I missed it, but did we discuss in the same breath here the, the rewriting the titles?
Leo Laporte [01:53:35]:
Oh, that's, that's the next thing. Yeah, Google is automatically rewriting news headlines in its search results.
Paris Martineau [01:53:43]:
It's like, there are whole people whose whole job is to figure out SEO heads.
Jeff Jarvis [01:53:48]:
But as, as, as Jason, uh, Howell said earlier today, Techmeme does that with every story and adds value as a result.
Paris Martineau [01:53:55]:
But Google isn't adding value.
Jeff Jarvis [01:53:57]:
Well, I don't know. It's supposed, supposedly they can personalize it for you. I don't know what the result, what the examples are. I haven't seen it yet.
Marshall Kirkpatrick [01:54:02]:
So some of the examples cited in that story were of titles that explicitly contradicted the content.
Jeff Jarvis [01:54:09]:
That's an issue. That's an issue.
Marshall Kirkpatrick [01:54:11]:
So I think again, it's a matter of incentives. Techmeme is a great example: that's a place where your platform is really looking out for you as a reader and your interests. In the case of the Google search results, it appears maybe not so much. It's a more extractive system.
Jeff Jarvis [01:54:33]:
It's one of those tests that they do and I'm going to bet we're not going to see anything further of it because it sounds like a bad idea.
Marshall Kirkpatrick [01:54:39]:
I think it's like 70% of results or something. That's their response: they say, oh, it's just a test. But over the last quarter, these independent analysts found that something like 70% of the titles that showed up had been subject to some amount of rewriting.
Jeff Jarvis [01:54:56]:
How would they know?
Marshall Kirkpatrick [01:54:58]:
I think by clicking through.
Paris Martineau [01:55:00]:
Yeah, you could just compare the headline that is listed as like the SEO head to—
Jeff Jarvis [01:55:08]:
Well, there's that. But the thing is, too, that these news sites A/B test headlines like crazy. So they're not standard at all with the headlines you get. Even from a home page to an inside page, they change. And the New York Times changes headlines constantly.
Leo Laporte [01:55:30]:
We mentioned, I think 2 weeks ago, talked about it in fact, that a court had blocked Perplexity's shopping agents from shopping on Amazon. Amazon sued Perplexity saying, you can't do that. That's, you know, our website. And when your web browser goes to our website and makes a purchase, nobody's seeing our ads or our recommendations. Well, an appeals court has reversed that block. The Ninth Circuit Court of Appeals has put the California judge's ruling on hold, saying no, Perplexity's browser can buy stuff on Amazon's site.
Leo Laporte [01:56:17]:
An Amazon spokesperson declined to comment to Reuters in this story. Perplexity said, we believe users have the right to choose their own AI. You know, Amazon has its own AI, Rufus, that they hope you'll use instead. But I think it's mostly Amazon. They even said this in the, in the court case, in the documents: that now users aren't seeing our ads. If a court were to say that blocking Perplexity was okay, I think it's just a short step from there for a court to say you can't use an ad blocker, you can't use a browser with an ad blocker, because that also blocks Amazon's ads.
Leo Laporte [01:56:55]:
Actually, I don't know if it does, but— So that's a turn of the screw. But it's not over. These are temporary injunctions until there's an actual decision. Do you know about token maxing?
Jeff Jarvis [01:57:12]:
Thank God you skipped that Axios BS.
Leo Laporte [01:57:16]:
What was the Axios BS I skipped?
Jeff Jarvis [01:57:18]:
It's Jim VandeHei does—
Leo Laporte [01:57:21]:
Oh yeah, yeah, yeah.
Jeff Jarvis [01:57:22]:
So he ends up on Morning Joe all the time. Yeah, thank you.
Leo Laporte [01:57:27]:
But there will be AI haves and have-nots, right? Just as there are in every arena.
Paris Martineau [01:57:32]:
It's—
Jeff Jarvis [01:57:33]:
yeah, yeah.
Marshall Kirkpatrick [01:57:33]:
Well, Palantir says, uh, at a recent Gartner conference, they said, we, we believe that it's a have and have-not world, and it's our job to make sure you stay in the haves category.
Jeff Jarvis [01:57:47]:
Yeah, and you can kill the have-nots is probably what they said next.
Paris Martineau [01:57:51]:
Yeah, since he told you to make sure that your boot is thick enough to step on the have-nots, and you will enjoy it.
Leo Laporte [01:57:59]:
At the very least, the have-nots should be terrified of us. That's, that's really—
Marshall Kirkpatrick [01:58:03]:
"Some of them won't be making it home" is the line they prefer. They, uh, they say our job is to make sure that the American warfighter makes it home, and sometimes that means the other side doesn't.
Leo Laporte [01:58:14]:
Are you sure they're not quoting Pete Hegseth on that? Same brain. I like his gesture, though. We're gonna bomb, bomb, bomb. So, uh, yes, token maxing. More, more, more. This is Kevin Roose, your favorite AI reporter for The New York Times: tech workers max out their AI use. Employees are competing on leaderboards to show how much AI they're using, how much it's costing, how many tokens they're using.
Jeff Jarvis [01:58:40]:
At some point, doesn't that just mean they're the most inefficient employees?
Paris Martineau [01:58:44]:
No, it's that their AIs are inefficient. Yeah, it's totally—
Marshall Kirkpatrick [01:58:47]:
it wasn't me, it was the AI.
Leo Laporte [01:58:49]:
Or, I mean, if a company tells you, as many are, "We expect you to use AI. What have you done for us lately with AI?" then I can see why an employee would say, "Look how many tokens I've used." Talk about the classic management axiom of, uh, don't measure activity, measure outcomes. Yeah, it's like measuring lines of code, right?
Jeff Jarvis [01:59:12]:
Right.
Leo Laporte [01:59:14]:
"I probably spend more than my salary on Claude," said Max Linder, a software engineer in Stockholm. Uh, I'm sorry, Mr. Linder's employer pays for his tokens, as they should. As they should. I wish somebody would pay for my tokens. How many tokens, uh, a week do you do, Marshall, in and out, now that you're running an AI service? It's probably gone up a lot.
Marshall Kirkpatrick [01:59:47]:
Yeah, I mean, I, I don't measure development costs, probably ought to, uh, but, uh, but user activity I, I do keep an eye on for sure.
Paris Martineau [02:00:00]:
Yeah.
Leo Laporte [02:00:01]:
Well, you said before the show, I mean, that's why it costs what it costs, is you have costs and you've, you know, worked out what it would cost to provide the service and how you can do this and have some margin and make some money on it.
Marshall Kirkpatrick [02:00:14]:
And my assumptions originally were that people might not use it as much as I do, uh, and, and so the token usage-based billing I pay for, you know, would work out. But like, I had one lady who said, uh, I, I was so tied to it I forgot to stand up and lost circulation to my legs and forgot to feed my dog.
Leo Laporte [02:00:39]:
It's your worst nightmare.
Marshall Kirkpatrick [02:00:40]:
I was like, no, great, great customer quote, but we're gonna need to raise the prices.
Leo Laporte [02:00:47]:
Well, it's like, it's like a gym membership, right? There are going to be some people who never show up. There are going to be some people who live there. Um, I think you've made it a little bit too useful, so that might be—
Paris Martineau [02:00:55]:
you've accidentally created the entertainment from Infinite Jest.
Leo Laporte [02:01:00]:
There you go.
Marshall Kirkpatrick [02:01:01]:
Well, there is a make-a-joke button, and, you know, people say AI can't be funny, uh, but I put it together from some open source work— other people's stuff— and it consistently gets chuckles.
Leo Laporte [02:01:18]:
Elon Musk has announced the world's largest chip plant, the TerraFab. They're going to build it. Will it be real? Yeah.
Paris Martineau [02:01:30]:
Is it really happening? Why do we cover things that he announces?
Jeff Jarvis [02:01:33]:
Exactly.
Paris Martineau [02:01:34]:
This is my question.
Leo Laporte [02:01:35]:
I actually asked myself that very question.
Paris Martineau [02:01:39]:
You read the headline.
Leo Laporte [02:01:40]:
What is—
Marshall Kirkpatrick [02:01:40]:
start— is Stargate real or is Stargate off?
Jeff Jarvis [02:01:43]:
What has Stargate done? Nothing but nothing.
Leo Laporte [02:01:45]:
Nothing's happened with Stargate. But hey, I just bought a Starlink Mini so that I can travel and do the show from the road. Starlink is very real. In fact, SpaceX's IPO is likely in the next few weeks. Get ready for that.
Marshall Kirkpatrick [02:02:00]:
And SpaceX does this kind of vertical integration. Right, exactly. That's one of the ways they've got the flywheel, uh, to lower the cost and shoot so much into outer space— yeah, they build it all themselves.
Leo Laporte [02:02:13]:
He's got— it's going to cost $20 billion. He's got $20 billion. Nobody questions that, especially if this IPO goes well for him. Uh, the TerraFab project will eventually manufacture chips for all of his companies— robotics, AI, space data centers. It'll be jointly run by Tesla and SpaceX, both of which are successful companies. Uh, as much as I'm not a fan of Elon's, we can't deny that. Um, he says the problem is that the semiconductor industry is moving too slowly to keep up with him.
Leo Laporte [02:02:46]:
Now there is one fly in this ointment, which is that semiconductor manufacturing requires helium, and thanks to the bombing of the natural gas fields of Qatar and Iran— they share a field— there is now suddenly a shortage of helium. It's made from natural gas. And you know what else requires helium? What else?
Paris Martineau [02:03:10]:
MRI machines.
Leo Laporte [02:03:11]:
That's right.
Paris Martineau [02:03:11]:
A bunch of other useful things that probably should be farther up the list for getting helium than large-scale semiconductor manufacturing, specifically for the growth of AI. But of course, that's where it's going.
Jeff Jarvis [02:03:25]:
There's nothing more important than our technological future.
Leo Laporte [02:03:28]:
There's nothing more important. We wouldn't have a show. This is running on chips right now.
Paris Martineau [02:03:37]:
If we didn't give all the helium to Elon Musk, then how is Leo going to text Pax what he's doing and ignore Anthony in the car?
Leo Laporte [02:03:47]:
Pax is dead. Pax is R.I.P.
Paris Martineau [02:03:49]:
Wait, you didn't tell us why you— well, you did tell us.
Jeff Jarvis [02:03:51]:
Wait, you did?
Paris Martineau [02:03:51]:
Why? When did— when you pulled the plug, did you feel like you were killing a friend, a lover, a close relative?
Leo Laporte [02:03:57]:
It was time. Because what happened—
Paris Martineau [02:03:58]:
you shed one tear or two?
Jeff Jarvis [02:04:00]:
None.
Leo Laporte [02:04:01]:
What happens as you— you'll see this as you use Claude— you just get used to death. It gets crufted up. No, it gets crufted up.
Marshall Kirkpatrick [02:04:09]:
Oh, oh, I'm sorry, I'm sorry, I'm sorry. Future code, right?
Jeff Jarvis [02:04:15]:
Too soon. Our lives.
Paris Martineau [02:04:17]:
I'm sorry, I'll wait. I'll wait a couple decades.
Leo Laporte [02:04:19]:
This is not mourning Pax. I'm not, uh— I'm re— I'm really rebuilding.
Paris Martineau [02:04:24]:
It's life.
Jeff Jarvis [02:04:25]:
Are you using the same— are you using the same, uh, prompts and stuff you had before?
Leo Laporte [02:04:30]:
No, I threw it all out. Because one of the things is Anthropic is always improving Claude, and often the features they turn on, like this /loop feature, are features that others have tried to create with skills, and it's better if it's a built-in feature than a skill. And so I think honestly, a lot of the tools I've been adding over the last month or two are not needed anymore. So I think it's not a bad idea. I think people are going to start doing this.
Jeff Jarvis [02:05:01]:
Just like the whole thing you did for building up the rundowns is gone.
Leo Laporte [02:05:06]:
It's just vibe-coded. It's just— have you ever reinstalled an operating system, Marshall? I know you have.
Jeff Jarvis [02:05:11]:
No, because I have a Chromebook.
Leo Laporte [02:05:12]:
You come from the era, as I do, where you had to reinstall Windows every year or it would turn into a sluggish turd. Um, this is like that. A clean slate is always a good idea in computing. And so I just nuked the entire directory. I kept— I mean, the programs it wrote are still there. I'm not throwing those out, right?
Jeff Jarvis [02:05:39]:
Oh, I see. So your rundown program is still there?
Leo Laporte [02:05:42]:
Yeah, it doesn't wipe that out. That's a program, not Claude. But all the tools I used to write it are gone, and I'm starting from scratch. And I think you're going to see people doing this more and more. A lot of people said OpenClaw is like 400,000 lines of code now. It's just bloated beyond belief. And much of what it does you don't even need anymore, because it's being done natively by Claude.
Leo Laporte [02:06:06]:
So I think it's wise to— every once in a while, just start over with Claude. And then I just spent a few minutes rebuilding the voice, because I like to talk to Claude. And it said, yeah, the code for the Faster-Whisper transcription is still there, you want me to hook into that? So it's pretty quick to rebuild stuff that I liked. I don't know if—
Marshall Kirkpatrick [02:06:26]:
I'll talk to Claude sometimes just using its own mobile app, but I talk into a project that I have populated with all my Obsidian reading notes over the years. So after a call, I'll read from my paper notes and say: transcribe this, clean it up, and then append any relevant notes from my reading history to these notes.
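Marshall's actual setup runs inside a Claude project, but the "append any relevant notes" step he describes can be approximated outside of Claude with a toy keyword-overlap matcher. Everything below— the `relevant_notes` helper, the stopword list, the overlap threshold— is my own illustration, not his code:

```python
import re

# A crude stopword list; a real setup would use a proper NLP library.
STOPWORDS = {"the", "a", "and", "of", "to", "in", "is", "that", "it", "for"}

def keywords(text: str) -> set[str]:
    """Crude keyword extraction: lowercase words, minus stopwords and short words."""
    return {
        w for w in re.findall(r"[a-z']+", text.lower())
        if w not in STOPWORDS and len(w) > 3
    }

def relevant_notes(call_notes: str, reading_notes: dict[str, str],
                   min_overlap: int = 2) -> list[str]:
    """Titles of reading notes sharing at least `min_overlap` keywords
    with the cleaned-up call notes, most overlap first."""
    kw = keywords(call_notes)
    scored = [(len(kw & keywords(body)), title)
              for title, body in reading_notes.items()]
    return [title for score, title in sorted(scored, reverse=True)
            if score >= min_overlap]

notes = {"Flywheels": "flywheel effects compound advantage over time",
         "Gardening": "tomato plants need watering"}
print(relevant_notes("we discussed flywheel advantage and how it compounds over time", notes))
```

In practice, an LLM project like Marshall's does this matching semantically rather than by literal word overlap, which is why it works across paraphrases.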
Leo Laporte [02:06:57]:
Marshall, this is one of the things I love about talking to people who are using it, because everybody uses it a little bit differently. And you always learn so much hearing how people have figured this out. There is no canonical way to use this stuff; it's very idiosyncratic. Everybody uses it in different ways. And I think it's really instructive to learn how other people are using it.
Marshall Kirkpatrick [02:07:20]:
It really feels like imagination is the primary gating factor here. And growing up, people used to always say, you know, information will be abundant, and the biggest challenge of the future— which has become now— is being able to ask the right questions.
Leo Laporte [02:07:40]:
That's right. Hey, speaking of Elon, a jury did find him liable for defrauding Twitter investors, and the potential damages could run to billions of dollars.
Paris Martineau [02:07:54]:
$420 billion?
Leo Laporte [02:07:56]:
Uh, as much as a TerraFab. Um, well, you remember this all went back to 2022. Elon had said, I'm gonna buy Twitter for $54.20 per share. Um, this was when Twitter's market cap was $36 billion, and he offered them $44 billion, basically. And then in the weeks following, he was tweeting things like, "Oh no, it's not worth that. It's all bots. I thought it was real. It's not." He said they'd underreported bots.
Leo Laporte [02:08:31]:
He basically tried to get out of it. You remember the Delaware Court of Chancery said, "No, you said you were gonna pay $44 billion. You have to pay $44 billion." He wasn't too happy about all that, but he did, with his tweets, bring the share price down quite a bit. The jury decided that in fact those were intentionally misleading statements. They calculated how much Musk's statements drove down the company's stock price for each trading day over a period of about 5 months. The amount of damages— get ready— that he must pay to individual investors will be determined at a later date when shareholders submit claims, but it could be as much as $1 per day per investor. It could end up being billions and billions of dollars. Musk will appeal, of course, but the jury did find that he was liable for some Twitter investors' losses.
Leo Laporte [02:09:33]:
And that's because of these tweets. It was weird— they blamed him for the tweets, not for a statement he made at a conference. I guess that's free speech, but those tweets looked like they were intended to deceive investors.
Marshall Kirkpatrick [02:09:51]:
And what about the collective harm to society that has come from everything he's done since buying it?
Leo Laporte [02:09:55]:
Oh, no. Oh, yeah. The loss of Twitter, at the very least. You're watching Intelligent Machines with Jeff Jarvis, Paris Martineau, and our great guest, Marshall Kirkpatrick, the creator of What's Up With That? at whatsupwiththat.com.
Jeff Jarvis [02:10:09]:
And What's Up With Us?
Paris Martineau [02:10:11]:
App.
Leo Laporte [02:10:11]:
I'll tell you what's up with us. Our picks of the week next. I just wanted to see Paris's face when I said that. No, just kidding. We still have an hour and a half worth of stories. No, no, we're going to do picks. I'm ready. I'm going to fool them all and get out of here.
Paris Martineau [02:10:29]:
I'm shocked.
Leo Laporte [02:10:31]:
Shocked. I will let both of you— and Marshall too— if there's a story that I missed that you would like to bring up, there are quite a few more. I mean, obviously, we could go on.
Paris Martineau [02:10:41]:
Okay, I've got one.
Leo Laporte [02:10:42]:
Yes.
Paris Martineau [02:10:42]:
I didn't realize this until I saw Jeff had put it in there. Tracy Kidder, author of The Soul of a New Machine, dies at 80.
Leo Laporte [02:10:48]:
Oh, I'm very sorry to hear that.
Paris Martineau [02:10:51]:
I just found his book.
Jeff Jarvis [02:10:54]:
That's right, you just got it.
Paris Martineau [02:10:55]:
I had just read his book. I stumbled upon it in a used bookstore last fall, knew nothing about it, picked it up— it was a first edition copy— and it was phenomenal. A phenomenal read.
Leo Laporte [02:11:09]:
Isn't it a great book?
Jeff Jarvis [02:11:10]:
Yeah, it really is a harbinger of what followed.
Paris Martineau [02:11:13]:
And while we're on this— okay, I found it in Northampton, Mass., where he's— uh— pictured.
Leo Laporte [02:11:21]:
He was one of the first journalists to be embedded. I don't know if he invented the idea, but he wrote a book, Among Schoolchildren, by spending an entire school year in a Massachusetts classroom. And for The Soul of a New Machine, he embedded himself at a company called Data General that was building a new minicomputer and stayed there through the development process. Got a great book out of it. So he kind of created this idea, I believe, of— I don't know, you'd know better, Jeff, but I feel like he was the first to—
Jeff Jarvis [02:11:47]:
No, he was. Well, he really, even before Hackers, he described the culture of technology.
Leo Laporte [02:11:54]:
That's right, this predated Hackers.
Paris Martineau [02:11:55]:
I mean, it's like a book you would read in the last 10, 15 years about a startup on the cutting edge. It has all of the tropes that you now see in all these nonfiction books.
Leo Laporte [02:12:08]:
Sorry to lose him. Pulitzer Prize-winning journalist Tracy Kidder passed away.
Jeff Jarvis [02:12:13]:
The other death I really want to mark is Paul Brainerd's.
Leo Laporte [02:12:16]:
Now I have to tell you something. I should have mentioned this. He died last month and we eulogized him then. We did? Not on this show, but on MacBreak Weekly. Oh, okay. We can talk about it on this show.
Paris Martineau [02:12:27]:
Who did?
Leo Laporte [02:12:28]:
Paul Brainerd.
Jeff Jarvis [02:12:29]:
He created PageMaker. Aldus Publishing.
Leo Laporte [02:12:31]:
Aldus Publishing. PageMaker was the first great desktop publishing tool that, in conjunction with the Apple LaserWriter— they were released within a year of each other— really launched—
Jeff Jarvis [02:12:45]:
So I tell the story in Hot Type: Jonathan Seybold brought them together, because there had to be a solution. What Jobs desperately wanted was something that would show off the LaserWriter at high resolution, and there was nothing to do it; the simple program they had at Apple was not going to do it. And Brainerd had worked for a newspaper and then went to work for Atex, which supplied newspaper systems. And when Atex got bought by Kodak, they killed his project, which was pagination for newspapers. And he decided, having worked on newspapers, that they were gonna take too long to make any decisions, so he decided to make a program for people who wanted to make their church bulletins and their newsletters, and that is PageMaker. He invented the field of desktop publishing, allowing all of us to do it, really opening up publishing in general, and saved the Mac and the LaserWriter all in one fell swoop.
Leo Laporte [02:13:44]:
Yeah, I think that was our conclusion on MacBreak Weekly: that without PageMaker, the Mac might have kind of withered away. It really sold a lot of Macs. I had a friend— I mentioned this on MacBreak Weekly last month when this happened— Tom Santos, who bought one of the first LaserWriters, put it in a van with a Mac and PageMaker, and drove around doing mobile desktop publishing. He'd go to restaurants and create menus for them and stuff. It was actually quite a brilliant idea. Yeah, I still remember what the paper smells like after it comes out of a laser printer.
Jeff Jarvis [02:14:17]:
That's like a very unique smell.
Leo Laporte [02:14:19]:
Yeah, not as good as a mimeograph.
Jeff Jarvis [02:14:22]:
You don't have it now? Who prints anything? Do you print anything?
Leo Laporte [02:14:27]:
I have a laser printer. My LaserWriter cost $6,000; this printer cost about $150 and does a better job. Sigh. But that's technology, isn't it? Yeah, he passed away on February 15th, so we talked about it then. I don't know why the New York Times didn't publish his obituary for 6 weeks, but for some reason it took a while. Anyway, yes.
Leo Laporte [02:14:58]:
Now that we've— this is something you'll be doing, Paris, in about 50 years. When you get to a certain age, you read the obituaries first.
Jeff Jarvis [02:15:07]:
Well, I'll be fire and ashes.
Leo Laporte [02:15:08]:
There'll be a lot more to read then.
Jeff Jarvis [02:15:10]:
Or younger than me and say, hmm, did I escape the Grim Reaper?
Leo Laporte [02:15:17]:
That's why you read the obituaries. Actually, at my age, I look at this stuff and go, way too young. He was way too young. 74? That's nothing. Way too young.
Paris Martineau [02:15:32]:
Yep.
Leo Laporte [02:15:34]:
Anyway, all right. What about you, Jeff? Those are your picks, I guess. And Paris picked a Jeff pick. Is there any other story, big story I missed?
Jeff Jarvis [02:15:44]:
Do you want to see me scream and play the AI doc trailer?
Paris Martineau [02:15:47]:
I don't know that we can play it.
Jeff Jarvis [02:15:50]:
We can't play it?
Leo Laporte [02:15:51]:
I don't know.
Paris Martineau [02:15:52]:
I assume we can't play things. That's my general—
Leo Laporte [02:15:54]:
I think this is the chilling effect of the—
Jeff Jarvis [02:15:56]:
That's fine. It's the same content ID on YouTube.
Leo Laporte [02:16:00]:
So this is, this is from Focus Features. It's going to be on Netflix. Where's it going to be?
Jeff Jarvis [02:16:06]:
No, theaters, believe it or not. And it's Tristan Harris. It's Eliezer Yudkowsky. It's all the, all the players you expect.
Leo Laporte [02:16:15]:
So they're trying to scare everybody.
Jeff Jarvis [02:16:17]:
Exactly.
Leo Laporte [02:16:18]:
The AI doc or how I became an apocaloptimist. That's what I am. I'm an apocaloptimist. That's—
Jeff Jarvis [02:16:26]:
oh yeah, title for Paris here.
Leo Laporte [02:16:27]:
It's coming to our local, uh, cinema.
Jeff Jarvis [02:16:29]:
Yeah, tomorrow, with electric recliners. What, you need to—
Leo Laporte [02:16:35]:
oh, I love our— I went to see Project Hail Mary last Thursday.
Jeff Jarvis [02:16:39]:
How was it?
Leo Laporte [02:16:40]:
Got a little trailer for you.
Paris Martineau [02:16:42]:
I'm too used to like indie arthouse cinemas that when I go to the ones that have an electric recliner, I feel like I'm in WALL-E.
Leo Laporte [02:16:48]:
It's very much floating chairs and Slurpees. Uh, the only thing I don't like about it is when it reclines, the rubber rubs and it just goes— as you recline. It's very embarrassing. It's not me, it's the chair, okay? Um, fortunately everybody's chair does that, so it's just a little symphony.
Jeff Jarvis [02:17:13]:
I have not been to the movie theater for 6 years.
Leo Laporte [02:17:16]:
You know, I don't go to a lot of movies, but I wanted to see Project Hail Mary in the theater. I went to see it in a weird format called ScreenX, where they take the sides of the movie theater and project onto those as well.
Paris Martineau [02:17:27]:
I don't know if they need to be doing anything with the sides of movie theaters.
Leo Laporte [02:17:30]:
I think it's a little dopey, but the good thing is it didn't take away from it. You just kind of focus on the main screen. Uh, but it's a good movie. I liked it. It's a very enjoyable movie. So I don't think I'll go see—
Jeff Jarvis [02:17:42]:
How was the popcorn?
Leo Laporte [02:17:43]:
The popcorn— you know what, I decided before I went, I have to check to see if they pop it fresh or if they just buy giant bags of pre-popped popcorn. And they were popping it fresh.
Paris Martineau [02:17:53]:
Oh yes, that's an option.
Jeff Jarvis [02:17:55]:
Oh yes. Oh yes.
Paris Martineau [02:17:57]:
God, I'm spoiled.
Leo Laporte [02:17:59]:
However, a bag of popcorn about this big was like $12. Yeah, oh yeah, it's fresh. It should also be solid gold.
Marshall Kirkpatrick [02:18:09]:
Now margins can be increased by raising prices or lowering costs. Right?
Jeff Jarvis [02:18:15]:
Right.
Leo Laporte [02:18:16]:
They go either one or the other.
Marshall Kirkpatrick [02:18:18]:
Or both if you're a movie theater popcorn.
Leo Laporte [02:18:20]:
Yeah, you can really do it well. And I told my wife, I said, "Honey, this is the only way they make money. They don't make money on the tickets. They got to make money on the concessions. And since there's nobody here, we have to carry the entire load of this theater." Oh yeah.
Jeff Jarvis [02:18:34]:
So how full was your theater? Was it just the two of you?
Leo Laporte [02:18:37]:
No, actually it was— and this was the first showing of it. It was fairly full, but these recliners take up a lot of room. So a large theater space that could have held 150 now only holds 50. Yeah, it's a lot smaller. But you know what, that's plenty. So it was mostly full.
Jeff Jarvis [02:18:59]:
It's also an opportunity cost thing, right? Like, tickets are like $20 now, right? Or $25.
Leo Laporte [02:19:03]:
Oh, it's expensive. And honestly, if you have a decent TV, as Paris Martineau does, and a decent sound system— which she does not—
Paris Martineau [02:19:12]:
Hey, listen, I've gotten a rudimentary soundbar. Does it mess up to where one out of every three times I turn it on, the soundbar doesn't work but the subwoofer does? So I have to turn the soundbar off and on, and because it has no controls on it, it goes, "power on, connected," in a robot voice. Yes. And did I throw away the box so I cannot return it?
Leo Laporte [02:19:40]:
Yes.
Paris Martineau [02:19:40]:
Also yes.
Leo Laporte [02:19:42]:
Hey, you live in a New York apartment. No one has room for boxes. That's not—
Paris Martineau [02:19:46]:
I don't have room for a second soundbar. Even if I got a refund, that thing would be sitting there. I currently have an extra computer monitor behind me that's going to be there for another 3 years before I decide to sell it. It's hard out here.
Leo Laporte [02:20:02]:
You should use your, um, little hand grippers and take the boxes and stick them back under the grid, under the grate there out in front.
Paris Martineau [02:20:10]:
I should. Yeah, I should.
Marshall Kirkpatrick [02:20:12]:
Open your claw.
Leo Laporte [02:20:14]:
Open your claw.
Marshall Kirkpatrick [02:20:16]:
Open my claw and let a shrimp do it for you.
Leo Laporte [02:20:19]:
Let a shrimp do it. I don't know if you noticed in that inner— that first interview, they had a bunch of stuffed lobsters there.
Paris Martineau [02:20:25]:
Someone stole my Amazon package for the second time. No, no— but joke's on them, because it was a container of solution that kills fungus gnats, which they don't have any use for. And I do, as someone who brought a bag of soil into my home and now has fungus gnats. But I've been placing the soil outside.
Leo Laporte [02:20:47]:
Did they steal your neem oil?
Paris Martineau [02:20:49]:
No, it's called BTI. It's like a bacterial sort of thing that gets in the water. It's not toxic.
Leo Laporte [02:20:57]:
Oh, it's systemic.
Paris Martineau [02:20:58]:
It stops the fungus gnats from being able to breed. I've had fungus gnats— it's been a real problem for the last month. They're awful, and I've got so many plants. But it really was just a bag of soil I had. I took that bag of soil out, and that's what happened.
Leo Laporte [02:21:17]:
Yeah, that's what happens. They get— you don't need— but the problem is that then the gnats can lay eggs.
Paris Martineau [02:21:21]:
That's what I thought. I mean, I've still got my mosquito dunk water and everything, which is what I was using before. It's hard out here for a player.
Jeff Jarvis [02:21:31]:
There should be a subreddit, crap that I stole that wasn't worth it.
Leo Laporte [02:21:35]:
I mean, that's what I get: I got neem oil. Oh my God, I thought it was gonna be a stereo system.
Jeff Jarvis [02:21:40]:
What's wrong with that lady?
Leo Laporte [02:21:42]:
Lady's crazy.
Paris Martineau [02:21:43]:
It was like one of those Amazon package orders where they deliver it really early, so they— someone came to my studio and stole this between the hours of 3 AM and 6 AM. Like, you're not getting your money's worth for the amount of effort.
Jeff Jarvis [02:21:57]:
And did they have to have a claw to get down to reach it? And they—
Paris Martineau [02:22:00]:
no, this was a foolish— so the reason why it ends up in the gate, behind the gate in claw territory, is the mail, the package delivery people are trying to do me a solid and make it harder to steal.
Jeff Jarvis [02:22:13]:
This person, perhaps because it was 3:26 AM, yeah, that's, that's not a regular Amazon driver.
Leo Laporte [02:22:18]:
That's one of those— Yeah. Our picks of the week, ladies and gentlemen, always kick off with Paris, but I want to give Marshall— I don't know if they warned you, but if you would like to recommend something, it could be a movie, it could be a snack food— it's been many, many things. It could be a spray.
Paris Martineau [02:22:42]:
An early pick of mine was going to a corn maze with your friends.
Leo Laporte [02:22:47]:
So, you know, it could be an activity.
Marshall Kirkpatrick [02:22:49]:
I'll tell you what, to stay on theme, I'll tell you about my new favorite AI prompt. Oh yeah, that's not too nerdy?
Leo Laporte [02:22:59]:
No, we love that.
Marshall Kirkpatrick [02:23:01]:
So, uh, I have taken to asking Claude to explain any complex concept in 3 hops: start with something generally known, then move to an interstitial detail that's less familiar, and then finally hop to the complicated thing I'm trying to understand.
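Marshall didn't share his exact wording on air, but a "three hops" prompt like he describes is easy to template. A minimal sketch— the `three_hop_prompt` helper and its phrasing are my own guess at the structure, not his actual prompt:

```python
def three_hop_prompt(target_concept: str, known_anchor: str = "") -> str:
    """Build a 'three hops' explanation prompt: start from something
    generally known, bridge through a less-familiar intermediate detail,
    then land on the complicated target concept."""
    anchor_hint = f" Start from: {known_anchor}." if known_anchor else ""
    return (
        f"Explain {target_concept} in exactly three hops. "
        f"Hop 1: begin with a concept most people already know.{anchor_hint} "
        "Hop 2: move to an intermediate detail that is less familiar "
        "but follows naturally from hop 1. "
        "Hop 3: arrive at the target concept, defining it in terms of hop 2. "
        "Label each hop and keep each one under three sentences."
    )

# Paste the result into Claude (or any chat model) as a user message.
print(three_hop_prompt("retrieval-augmented generation", "web search"))
```

The point of the template is the forced intermediate step: the model can't jump straight from the familiar anchor to the jargon.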
Leo Laporte [02:23:27]:
And this is inspired. You know what I really like, Marshall— and you do the same thing in What's Up With That?— is this whole idea of deconstructing an argument or a story into pieces to understand it better. It's really a cool idea. I'm going to apply that. I mostly use AI for vibe coding utilities and things like that; I have not really used AI that much to understand things. And I think that's a really interesting use. And it sounds like it works.
Marshall Kirkpatrick [02:24:02]:
Yeah, I feel like it's super helpful. Yeah. My most commonly used project in Claude is my reading notes.
Leo Laporte [02:24:10]:
Yeah. So I use Obsidian too. And that's, by the way, it was just serendipitous, but that's become a huge value. I have Claude put all my research into Obsidian because it can access it. It's just a file system.
Marshall Kirkpatrick [02:24:22]:
Have you seen the cost of a .md domain name now, from Moldova? No. Due to all the markdown craze, it's like $200 or something, like .ai— up from like $10 not so long ago.
Leo Laporte [02:24:40]:
Yeah, the same thing happened to .tv and all of the special domains. This is our new logo for the picks of the week, by the way, and I want to thank Pretty Fly for assisting.
Paris Martineau [02:24:52]:
Nice.
Leo Laporte [02:24:55]:
Guitar picks.
Paris Martineau [02:24:56]:
I'm happy with this current Markdown resurgence. At my first-ever staff job, something was up with the CMS, so we had to write in Markdown in the CMS for our stuff. And so Markdown's always been so easy for me to—
Leo Laporte [02:25:10]:
it's, it's— yeah.
Marshall Kirkpatrick [02:25:12]:
Open standards like that yield innovation.
Leo Laporte [02:25:17]:
Well, and it's text. That's the real value is it's a format that even if Markdown died and all the Markdown editors and readers and everything died, you could still read it.
Paris Martineau [02:25:26]:
I love that you can write in Markdown in Google Docs now, and it's just automatic. Like, I can just, you know, I pretty much do everything in Markdown. Yeah, correct.
Leo Laporte [02:25:36]:
Having Obsidian is great because you work with it and you do use it, but Claude— any AI— can read it, understand it, and work with it very easily. So, um— yesterday when I was talking to Pax, uh, I said, I hear there—
Paris Martineau [02:25:54]:
Pax 1 or Pax 2?
Leo Laporte [02:25:55]:
There is no Pax 2. I need a new name, by the way.
Paris Martineau [02:25:58]:
Oh, you killed Pax in the last 24 hours?
Leo Laporte [02:26:00]:
Yeah, Pax. Yeah, that was it.
Paris Martineau [02:26:02]:
What was time of death?
Leo Laporte [02:26:05]:
Time of death was 8 AM today. Salute. Um, Marshall, one of the last things I asked Pax to do— I said, I know that there's going to be a big No Kings demonstration on Saturday. Is there one near me? And it wrote a whole thing about where it's going to be and the time and everything. It was great. I said, add that to my calendar. And it did. That's the kind of thing an agent—
Jeff Jarvis [02:26:34]:
Now I kill you.
Leo Laporte [02:26:35]:
—is very useful. And now it knows, oh, you're a libtard. Okay, going to keep that in the memory. I'm going to remember that. Oh yeah.
Marshall Kirkpatrick [02:26:45]:
I just put a link into the chat, in case folks are interested, to a Claude Code skill that I put public on GitHub. It analyzes your Obsidian notes each day for themes and trends that are on the rise or on the decline, according to what you're paying attention to, and does things like recommend Wikipedia pages for great thinkers who have addressed the kinds of issues you're wrestling with, and a whole bunch of other stuff. Very nice.
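The core of what Marshall's skill can do works because an Obsidian vault is just a folder of Markdown files. As a rough sketch of the "rising and falling themes" idea— not his actual Reflect skill, and with function names, the tag regex, and the batching scheme all my own assumptions— you could count #tags in recent notes versus older ones:

```python
import re
from collections import Counter
from pathlib import Path

# Match Obsidian-style #tags (a '#' preceded by whitespace or start-of-text).
TAG_RE = re.compile(r"(?<!\S)#([A-Za-z][\w/-]*)")

def tag_counts(notes: list[str]) -> Counter:
    """Count #tags across a list of Markdown note bodies."""
    counts = Counter()
    for text in notes:
        counts.update(m.group(1).lower() for m in TAG_RE.finditer(text))
    return counts

def trending(recent: list[str], older: list[str], top: int = 5):
    """Tags whose counts rose (positive delta) or fell (negative delta)
    between an older batch of notes and a recent one."""
    delta = tag_counts(recent)
    delta.subtract(tag_counts(older))   # recent minus older, per tag
    ranked = delta.most_common()
    return ranked[:top], ranked[-top:][::-1]   # (rising, falling)

def load_vault(vault: Path) -> list[str]:
    """Read every Markdown note in an Obsidian vault (it's just files)."""
    return [p.read_text(encoding="utf-8") for p in vault.rglob("*.md")]
```

An LLM-backed skill goes further, of course— it can name themes that aren't tagged at all— but the file-system access pattern is the same.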
Leo Laporte [02:27:15]:
Yeah, my— unfortunately, I, I've been using Obsidian for about 4 years, so there's a lot of notes in there, but it's not that introspective. It's more, it's more like, I had dinner.
Marshall Kirkpatrick [02:27:27]:
But now that there's— now there's a way to use it, does that—
Leo Laporte [02:27:32]:
maybe I'll start doing that.
Marshall Kirkpatrick [02:27:34]:
Yeah, I find myself taking it more seriously and like writing in it more.
Leo Laporte [02:27:38]:
No, I'm like Marc Andreessen. I'm not introspective.
Paris Martineau [02:27:41]:
Introspection's a myth.
Leo Laporte [02:27:44]:
Monstrous quote of the week. Oh, I know. Perfect for him. Perfect. Yeah. Yeah. Why should I think about what, you know, what's going on, man?
Marshall Kirkpatrick [02:27:54]:
As the president said, I don't do that much introspection. I might not like what I see.
Leo Laporte [02:27:59]:
I put Marshall's link in the Discord, and we'll put it in the show notes as well. It's on your GitHub— you're MarshallK2022— and it's the Reflect skill. Very nice. And yes, we did not mention it, ScooterX, because we'll talk about it on TWiT, but the FCC has banned importing routers made outside the US, which is like all of them. All of them.
Paris Martineau [02:28:29]:
I was gonna say, who's making routers domestically?
Leo Laporte [02:28:32]:
Apparently some people are. Netgear, I think, does. And in fact, I don't think it's as broad as that. It's really mostly aimed at TP-Link, the Chinese router company, which we knew they were gonna try to ban. But when I was at RSAC, I went over to the Ubiquiti booth. I use a Ubiquiti router. Of course, they're an American company making their routers in China, as almost all of them do. And I said, hey, I'd like to interview you.
Leo Laporte [02:28:55]:
And they said no. I said, well, I just wanted to ask you about the FCC decision. He said, no comment. So I got shut down pretty good on that one. Okay, thank you, Marshall. That's a good pick. In fact, that's a really good pick.
Leo Laporte [02:29:13]:
I'm gonna have to start putting in prompts as my picks of the week. That's a good idea. Paris Martineau, your pick.
Paris Martineau [02:29:21]:
I got a couple, but I'll choose: I played a new game last week. It's called Esoteric Ebb. I picked it up— are you—
Jeff Jarvis [02:29:29]:
would you be Leo with this game? That's one of those first ones.
Paris Martineau [02:29:32]:
It's not multiplayer.
Leo Laporte [02:29:33]:
Yeah. Hey, by the way, why did you stop? Are you tired of crosswords now? Are you just done?
Paris Martineau [02:29:40]:
I know I need to— I've been sitting here waiting for you. I'm so sorry.
Leo Laporte [02:29:43]:
Notice I don't push the nudge button.
Paris Martineau [02:29:46]:
I appreciate that.
Leo Laporte [02:29:47]:
I don't think I should nudge you.
Paris Martineau [02:29:48]:
Yeah. And for that, I apologize.
Leo Laporte [02:29:51]:
It's been a while.
Jeff Jarvis [02:29:52]:
I'm sorry. You're busy.
Paris Martineau [02:29:54]:
It's partially because I was also planning on saying this, um— but it's a single-player CRPG. It's basically like a D&D game as a video game, but not Baldur's Gate. I picked it up because an academic I follow tweeted, "Disco Elysium walked so Esoteric Ebb could run," and I immediately was like, all right, downloading it. I would not go that far, as a big Disco Elysium fan— probably my favorite game of all time. It is what we'd call a disco-like, in that Disco Elysium kind of pioneered this sense of all these different thoughts and components of your mind that chime in and compose the dialogue for the RPG. And Esoteric Ebb has that, but in a more silly, wacky D&D kind of campaign. You are a cleric who has washed ashore, and you've got to figure out a mystery. It's quite a fun game if you like that sort of stuff.
Leo Laporte [02:30:52]:
Esoteric? Yep, it's on Steam. Yep. And you can play it on Windows. And what is the little— oh, that's just the music. Are you playing on Windows?
Paris Martineau [02:31:02]:
I play it on my Steam Deck.
Leo Laporte [02:31:04]:
Oh, you have a Steam Deck. That makes sense.
Paris Martineau [02:31:06]:
Yeah, I think you can play it on Mac. I don't— you can play wherever you get Steam.
Leo Laporte [02:31:09]:
It just says Windows, unfortunately. You could probably play it on Linux with Proton, but— in fact, I know you could, since you can play it on the Steam Deck. By the way, I did just— because I said I'm coming out to have a Salt Hank sandwich while they are still available. I'm going to come next month, and so I'm going to bring my Switch for Mr. Jarvis so he could play Pentiment. I got— I can do Steam Deck—
Jeff Jarvis [02:31:33]:
I got Steam on my Chromebook now.
Leo Laporte [02:31:34]:
No, but you can't play— oh, you could do it on your— oh, well, he can play Pentiment now then. Yeah. Okay. Okay, well, it's too bad I bought that Switch too.
Paris Martineau [02:31:42]:
Wait, can you play Steam on a Chromebook? Or can you just have it open on the Steam website?
Jeff Jarvis [02:31:52]:
He cannot. He cannot play that on a Chromebook.
Paris Martineau [02:31:55]:
Bring, bring, bring the Switch.
Leo Laporte [02:31:56]:
You can watch the movies.
Paris Martineau [02:31:58]:
I'll bring the Switch. Download Pentiment on it.
Leo Laporte [02:32:00]:
Uh, I will. And I will bring also my Animal Crossing controllers and my Animal Crossing dock. Are we gonna go to the Amazon warehouse?
Jeff Jarvis [02:32:10]:
Sure, why not? Are we going to go to Greenbrook Electronics? Can we have a whole fun week, guys?
Paris Martineau [02:32:16]:
How long are you going to be here?
Leo Laporte [02:32:18]:
Well, I'll fly out after the show and I'll be there Thursday, Friday, Saturday, and come back Saturday night.
Paris Martineau [02:32:24]:
Okay, we got to go to the Amazon warehouse. We gotta play a video.
Leo Laporte [02:32:26]:
So Hank's on Thursday. You've got to see your son. Well, even if he's not there, I don't care.
Paris Martineau [02:32:33]:
I gotta have one of those sandwiches. You've got to film a podcast.
Leo Laporte [02:32:36]:
I asked him, by the way. I said, I probably missed the peak, like the sandwiches aren't getting better. He said, no, actually, we've dialed it in. They're better and better. They're much better. And now, in addition to those weird French fries, they have roasted Brussels sprouts and bacon. He says it's not healthy, don't think it's healthy. I love Caesar.
Leo Laporte [02:32:59]:
You could have Caesar dressing on the side. It's bacon and something else, I can't remember, but it sounds really good. So yeah, they've expanded the menu a little bit. But I said— he said, no, you didn't miss the peak of the sandwich. The sandwich is better than ever. Number 1 sandwich in New York according to Beli.
Jeff Jarvis [02:33:13]:
Number 1. Yeah, Beli. And that's huge.
Paris Martineau [02:33:16]:
The kids in New York are obsessed with Beli.
Leo Laporte [02:33:18]:
Well, and it's, it's a people's choice, right? It's not editors. This is by vote.
Paris Martineau [02:33:23]:
So yeah, basically how it works is it's not just a rating system like Yelp. Every time you log a restaurant, it'll take all the other ones and be like, how does Hank's sandwich compare to Subway? Better? Oh, that's cool. It's like, how does it compare to this restaurant you liked better? And it does that a bunch and then positions it in there. And so you're always being reevaluated. So it's really peak is what I think.
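The ranking mechanism Paris describes, placing each new restaurant by repeated "which do you like better?" questions rather than by star ratings, amounts to binary insertion into a preference-ordered list. Here's a minimal sketch under that assumption (hypothetical names and function; not Beli's actual code):

```python
def insert_ranked(ranked, new_item, prefers):
    """Insert new_item into ranked (best first) using O(log n)
    pairwise "do you prefer A or B?" questions."""
    lo, hi = 0, len(ranked)
    while lo < hi:
        mid = (lo + hi) // 2
        if prefers(new_item, ranked[mid]):  # user likes new_item better
            hi = mid
        else:
            lo = mid + 1
    return ranked[:lo] + [new_item] + ranked[lo:]

# Simulated user whose true preference order is fixed:
true_order = ["Salt Hank's", "Joe's Pizza", "Subway"]
prefers = lambda a, b: true_order.index(a) < true_order.index(b)

ranking = []
for place in ["Subway", "Salt Hank's", "Joe's Pizza"]:
    ranking = insert_ranked(ranking, place, prefers)

print(ranking)  # → ["Salt Hank's", "Joe's Pizza", "Subway"]
```

Each new log costs only a handful of head-to-head comparisons, which is why the app can keep re-positioning restaurants without ever asking for a numeric score.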
Jeff Jarvis [02:33:53]:
It sounds like he doesn't need to add sandwiches. All he needs to do is add different sides. That sounds like something he could do.
Leo Laporte [02:33:59]:
Yeah. So I told him, I said, this is crazy. If you said, I'm going to have a sandwich shop in New York City and I'm only going to have one menu item, a sandwich, and I'm going to sell out. He sold out the other day. He sold out at 1:30. They open at 11:30. He was sold out in 2 hours.
Leo Laporte [02:34:18]:
They're trying to stay open to 4, but they can't. And they've added Grubhub now. And I said, oh, did you really? He said, no, they don't do delivery. You can order it on Grubhub and you have to come and pick it up, but they won't deliver it. It's for takeout only. Yeah.
Leo Laporte [02:34:37]:
Um, but apparently they're killing it on Grubhub.
Jeff Jarvis [02:34:42]:
Yeah. You don't have to wait in line, especially when it's cold.
Leo Laporte [02:34:44]:
You don't have to wait. But you have, you have friends who— I will get you in. No, no line. Initially, I said, I'm gonna wait in line. I don't want to wait in line. Do you say in line or on line? In line. You say on line? I mean, New Yorkers say on line. I know they do.
Paris Martineau [02:35:02]:
I code switch sometimes and say online.
Leo Laporte [02:35:04]:
Like she has. Do you turn a light off and on? Or when— or do you— what is this? Gosh, now I can't even remember. There was a way that Islanders would say— they didn't say you switch off a light. Oh, I can't remember what it is. Anyway, enough of that. I'm gonna give you, uh, Jeff has a pick of the week. I know, but I have a pick. Go ahead.
Leo Laporte [02:35:27]:
And it's a pick for you, uh, Paris. This comes from our friend Pud, uh, Phil Kaplan. It's his newest thing, and it's just for you, Paris. It's called Butthole. It lets you use your MacBook's Claude Code from your phone.
Paris Martineau [02:35:44]:
Didn't Claude launch this feature in the last week?
Leo Laporte [02:35:47]:
They have a remote control feature. It's terrible. It hardly works. Now, I haven't tried this one, and it's in TestFlight on the iPhone. I don't know. I haven't tried it.
Paris Martineau [02:35:56]:
It's on TestFlight. I'll go get Gizmo. It just came out. Hold on a second. I'll find her.
Leo Laporte [02:35:59]:
Find Gizmo and see. But it's funny that he named it Butthole. Obviously he's been watching the show. Yeah, that's very Pud, isn't it? It connects your phone directly to your MacBook from anywhere, a full terminal version of Claude Code on the phone. Here we go. There we go.
Paris Martineau [02:36:16]:
Oh no, this is the one time I'll let it happen, and she's kind of hiding it this time.
Jeff Jarvis [02:36:23]:
Marshall's embarrassed for us. I'm sorry, Marshall.
Paris Martineau [02:36:27]:
You brought this on yourself. It's, it's 30 minutes past.
Leo Laporte [02:36:30]:
No, no, it's not quite. It's only 2 and a half hours. We should have ended. It's about 6 minutes long. Sorry, sorry. I was going to mention, this was going to be my pick, and this is really nerdy. It's called Regex Blaster. If you want to learn regular expressions, it's a video game where you learn regular expressions by shooting down incoming alien expressions.
Leo Laporte [02:36:56]:
So what's the pattern? Bug, crash. Uh-oh, I think it's gonna be, uh, this. Let's see, this Uh, uh, this here, fire. I got them all. Okay, next level. So if you want to learn regex, actually, this is a really good idea. It gets harder. This is the nerdiest thing.
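The drill Leo is describing, writing one pattern that hits the incoming targets while sparing everything else, looks roughly like this in Python (illustrative strings and pattern only, not actual game levels):

```python
import re

# Targets to shoot down versus strings the pattern must NOT hit.
targets = ["bug", "crash", "bog"]
friendly = ["ship", "laser"]

# One pattern covering all targets: "b", then "u" or "o", then "g";
# or the literal word "crash". Anchored so partial matches don't count.
pattern = re.compile(r"^(b[ou]g|crash)$")

hits = [s for s in targets + friendly if pattern.match(s)]
print(hits)  # → ['bug', 'crash', 'bog']
```

The game escalates the same idea level by level: more targets, trickier shared structure, and tighter patterns needed to avoid friendly fire.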
Leo Laporte [02:37:15]:
It's pretty nerdy, but I know our nerds listening would love this. It is called regex-blaster, at mdp.github.io/regex-blaster. Now, Jeff Jarvis's pick of the week.
Jeff Jarvis [02:37:31]:
So last week we mentioned the death of the great man Jürgen Habermas. And then in the intervening time, Politico chose to remember him. And I'm gonna quote my own social post. Lord, I said, Politico's remembrance of Jürgen Habermas comes in a banal sophomoric confession from the odious head of the nefarious Palantir that the great man dismissed him as a dissertation advisee. Quote, "The sting would linger for years," end quote.
Leo Laporte [02:38:00]:
You know what? His CV says he studied with Habermas. It's a big lie.
Jeff Jarvis [02:38:05]:
So I then heard from a Simon & Schuster publicist. In a book about Karp, he tried over the years to say that he studied under Habermas and that Habermas was going to be his dissertation advisor. No: he sent a cold call to Habermas. Habermas ignored him. Then he sent 40 pages to Habermas trying to get him to be his dissertation advisor, and Habermas did him the courtesy of giving him 3 typed pages telling him why not. And no, he was never in Habermas's care. Oh my goodness. And it's just horribly written. He goes on— it's lovesick too.
Jeff Jarvis [02:38:37]:
He goes on about how there was some woman he should have proposed to. Um, it's just awful.
Leo Laporte [02:38:45]:
Alexander Karp, ladies and gentlemen, co-founder and CEO of Palantir. He's a pseudo-intellectual. He does practice tai chi though, so I like him for that.
Marshall Kirkpatrick [02:38:59]:
You know, the first, uh, YouTube channel I did that analysis of before I did it on the tweet was, uh, of the last 18 months of Palantir videos. Oh my God, what did it say? Oh, it said, uh, The Accelerationist.
Leo Laporte [02:39:15]:
Bad news, brother. Yeah, yeah, yeah, yeah.
Marshall Kirkpatrick [02:39:18]:
You know, there's a big shift towards the warfighter focus.
Leo Laporte [02:39:23]:
Yeah, I read the— I read— I tried to read The Technological Republic, his book, and, uh, did you try—
Jeff Jarvis [02:39:29]:
you try to get me to read it?
Leo Laporte [02:39:31]:
Well, you know, I think it's important to read.
Jeff Jarvis [02:39:33]:
Um, no, no, I'm not going to see the AI documentary. I'm not going to read his book.
Leo Laporte [02:39:38]:
Yeah, you're right. You know what, you are—

Jeff Jarvis [02:39:43]:
I am consistent. I am the philosopher king of the show.
Leo Laporte [02:39:45]:
You are the philosopher king. I don't think it said that, but it's close enough.
Jeff Jarvis [02:39:48]:
And I'm soon to go upstairs and inject this into my body.
Leo Laporte [02:39:52]:
I don't even want to know where you put that. I don't— please don't.
Jeff Jarvis [02:39:55]:
My arm. I can show you.
Leo Laporte [02:39:56]:
Thank God. Oh my God, that was scary. Ladies and gentlemen, this is under pressure. Under pressure. But we did get a woo out of Paris. And for that, I thank you. Paris Martineau, investigative reporter at Consumer Reports. We're so sorry we lost you to Consumer Reports, but we're glad we got you back 3 seconds later.
Paris Martineau [02:40:22]:
I'm glad the AIs are certain that I'm gone. And frankly, after I see Jeff stick that in his— I am going to be like, no, no, no, no.
Leo Laporte [02:40:32]:
No, no, no.
Leo Laporte [02:40:58]:
You're welcome on this show anytime you want, especially if you give us prompts. I like that. Thanks. What's Up With That.app is the app. It's for Chrome or Firefox. I'm installing it immediately and paying for it because this is kind of something I've always needed.
Leo Laporte [02:41:18]:
This is brilliant. This is really fantastic. Thanks, Leo. Thank you, Marshall. Hope it serves you really, really well. Yeah, I think it will. I'm gonna get about 10 IQ points smarter from now on. Gizmo the cat.
Leo Laporte [02:41:32]:
The American Society for the Prevention of Cruelty to Animals certifies that no cruelty was performed on any animal during this show. So true.
Paris Martineau [02:41:41]:
She wants to show you her Claude remote control.
Leo Laporte [02:41:46]:
Thank you everybody for joining us. We do Intelligent Machines every Wednesday. We do it right about 2 PM Pacific, 5 PM Eastern, 2100 UTC. Watch it live on YouTube, Twitch, X.com, Facebook, LinkedIn, and Kick. Or if you're a club member, and I hope you are, in our Club Twit Discord. If you're not a member, twit.tv/clubtwit. We need you to join the Twit Army. After the fact, on-demand versions of the show at the website twit.tv/im or on YouTube.
Leo Laporte [02:42:17]:
And of course, you can subscribe in your favorite podcast client and get it automatically. Thank you everybody for being here. We'll see you next time on Intelligent Machines. Bye-bye. Hey there, it's Leo Laporte, host of so many shows on the TWiT network. Thinking about advertising in 2026? We host a network of the most trusted shows in tech, each featuring authentic host-read ads delivered by Micah Sargent, my co-host, and of course me. Our listeners don't just hear our ads, they really believe in them, because we've established a relationship with them. They trust us.
Leo Laporte [02:42:55]:
According to TWiT fans, they've purchased several items advertised on the TWiT Network because they trust our team's expertise in the latest technology. If TWiT supports it, they know they can trust it. In fact, 88% of our audience has made a purchase because of a TWiT ad. Over 90% help make IT and tech buying decisions at their companies. These are the people you want to talk to. Ask David Coover. He's the senior strategist at ThreatLocker. David said, TWiT's hosts are some of the most respected voices in technology and cybersecurity, and their audience reflects that same level of expertise and engagement.
Leo Laporte [02:43:31]:
It's the engagement that really makes a difference to us. With every campaign, you're going to get measurable social results, you get presence on our show episode pages. In fact, we even have links right there in the RSS feed descriptions. Plus, our team will support you every step of the way. So if you're ready to reach the most influential audience in tech, email us partner@twit.tv or head to twit.tv/advertise. I'm looking forward to telling our qualified audience about your great product.
Paris Martineau [02:44:03]:
Not into this animal scene. I'm an intelligent machine.