Transcripts

FLOSS Weekly 712 Transcript

Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.

 

Doc Searls (00:00:00):
This is FLOSS Weekly. I'm Doc Searls. This week, Katherine Druckman and I talked with Dan Lorenc, who is the co-creator of Sigstore, which is a new standard for signing, verifying, and protecting software, and the president and CEO of Chainguard, which is a company that puts this to work, protecting the supply chain of open source software, which is massive, complicated, and really important. And that is coming up next.

Announcer (00:00:30):
Podcasts you love, From people you trust. This is TWiT.

Doc Searls (00:00:37):
This is FLOSS Weekly, episode 712, recorded Wednesday, December 21st, 2022: Software Supply Chain Security. This episode of FLOSS Weekly is brought to you by Code Comments, an original podcast from Red Hat that lets you listen in on two experienced technologists as they describe their building process and what they've learned from their experiences. Search for Code Comments in your podcast player. Hello again, everybody everywhere. I am Doc Searls, and this is FLOSS Weekly. I'm joined this week by Katherine Druckman, who will appear in a second. There she is. And hello, in Houston, Texas. And I'm in New York, where there is a deceptively positioned air conditioner, for those of you who can see me, behind my head, which has been idled for a couple months. There's some Christmas lights <laugh> next to it. This is not my usual place, but I don't have one right now. So how are you doing, Katherine?

Katherine Druckman (00:01:45):
I am doing pretty well. Sorry, I was just looking at my dead plant. <Laugh>. I am woefully under prepared for holidays, so that's a thing. But I'm really excited about this because I'm talking a lot about this topic lately, so this will be a good I think I'll learn a lot, which is nice. It's always nice to come here and learn new things.

Doc Searls (00:02:07):
So, so, and the topic being <laugh>. Oh,

Katherine Druckman (00:02:13):
Are we teasing it? Sorry. So, because you

Doc Searls (00:02:15):
Are, I think we're gonna get you, you are more prepared than I am on this, so,

Katherine Druckman (00:02:18):
Okay. The topic being software supply chain security, which is I think on a lot of people's minds right now.

Doc Searls (00:02:25):
Excellent. So then, let's get into it. Our guest today is Dan Lorenc. He's the CEO and president of Chainguard, among other things, and an expert on software supply chain <laugh> security. I think we don't often think about a supply chain. Anyway, welcome, welcome to the show, Dan. How are you doing there? (Dan: Oh, good, thanks. Thanks for having me.)

Katherine Druckman (00:02:53):
You're in for an adventure today,

Doc Searls (00:02:55):
<Laugh>. Yeah. Yeah. And where are you on earth at this point?

Dan Lorenc (00:02:59):
<Laugh>? Good question. I'm just outside of Providence, Rhode Island.

Doc Searls (00:03:03):
Oh, good. Good. So we're all in the east. We're preparing for a, a bad cold snap, supposedly. Oh, it's

Katherine Druckman (00:03:14):
Gonna hit bad here.

Doc Searls (00:03:15):
Yeah. In Texas it's gonna get down to like 40 <laugh>.

Katherine Druckman (00:03:19):
No, no, it's actually gonna be something like 16, which Houston is just not prepared

Doc Searls (00:03:23):
For. Really?

Katherine Druckman (00:03:24):
Oh, yeah, yeah. Houston and our pipes are not. We're gonna have to turn the water off for a couple days in our house. Like, we're just gonna camp. I'm hoping that my jasmine on the fence doesn't die completely <laugh>. It's

Doc Searls (00:03:37):
Just, oh, yeah. It's terrible. So, Dan, tell us how you got to where you are in this adventure, and kind of frame it up for us to start off the show.

Dan Lorenc (00:03:52):
Sure, yeah. Thanks for having me today. So yeah, I started this company Chainguard with a few others a little over a year ago. We started back in October of last year, but before that, I was an engineer at Google for about nine years, where I was doing a bunch of stuff from open source infrastructure, security, developer tooling, and then kind of toward the end, the intersection of all of that stuff with open source and supply chain security. So I've been at this for a while. The topic's recently gotten very exciting with things like the attack on SolarWinds, the massive Log4Shell vulnerability last year, and tons and tons of typosquatting attacks that we see on popular open source package managers each week. So it's a fun journey, and we're doing a bunch of open source ourselves, and products, to help combat this across the industry.

Katherine Druckman (00:04:40):
Could you, since you brought it up, I wondered if you could Sure. Tell us what typo squatting attacks are.

Dan Lorenc (00:04:47):
Sure. There's actually just one yesterday that was all over the news today, on SentinelOne, if you've heard of that company. A typosquatting attack is kind of a social engineering style trick where an attacker, so there's a bunch of different ways to do typosquatting and a bunch of different places you can do it in. But in the context of open source and supply chain security, typically what will happen is somebody will take a popular company or a popular open source project, and then change the name slightly, whether they put in a typo or something slightly more nefarious, like changing an abbreviation or something like that, and then upload another version of that same package to something like PyPI or npm or RubyGems. And that other version is pretty close to the original one, except it has some malicious code inside of it. So that's what happened yesterday with SentinelOne. Somebody uploaded a package to PyPI pretending to be some kind of SDK or client library for SentinelOne, but it actually had nothing to do with the company. And the end goal then is to trick people into downloading it, thinking it's the real version. And then at that point, your malicious code gets to do whatever it wants, whether it's steal passwords, mine bitcoins, or something even worse. Those happen pretty often, and there's very little you can do about it.
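The near-miss names Dan describes can be illustrated with a small, hypothetical check: before installing, compare a requested package name against a list of known popular names and flag suspiciously close matches. This is only a sketch of the idea, not a feature of any real package manager; the popular-package list and the similarity threshold are assumptions made up for the example.

```python
from difflib import SequenceMatcher

# A tiny, hypothetical allowlist of popular package names to compare against.
POPULAR_PACKAGES = ["requests", "numpy", "pandas", "urllib3", "setuptools"]

def looks_like_typosquat(name: str, threshold: float = 0.85) -> list[str]:
    """Return popular packages the requested name suspiciously resembles.

    An exact match is fine (it's the real package); a near-miss like
    'requestss' is flagged as a possible typosquat.
    """
    suspects = []
    for popular in POPULAR_PACKAGES:
        if name == popular:
            return []  # exact match: the real package, not a squat
        ratio = SequenceMatcher(None, name, popular).ratio()
        if ratio >= threshold:
            suspects.append(popular)
    return suspects

print(looks_like_typosquat("requestss"))  # ['requests']
print(looks_like_typosquat("requests"))   # []
```

Real defenses are fancier (registries reserve similar names, scanners inspect package contents), but the name-distance idea is the same.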

Katherine Druckman (00:05:59):
Yeah. The "pretty often" thing is the scary part these days. So, you know, I have so many questions for you, actually <laugh>, but one of my first ones: how has the recent government attention, let's say, changed the landscape of the work that you do? Right, there was the executive order on cybersecurity last year. There's the open source security legislation that is sort of, you know, in progress, or no, maybe not in progress, but out there <laugh>. A lot of it, I think, focuses on open source security. So that's, I guess, kind of part two of the question. There seems to be maybe a hyperfocus, and I wonder, is that because people realize that basically all software is open source software now? I mean, not all, but a considerable chunk of it. I mean, is there software that doesn't use any open source components anymore? So I just wonder, the scrutiny over open source software, is that fair? Shouldn't it be, instead of the problem with open source software, maybe it's the problem of software? So those are two questions in there, in a very <laugh>

Dan Lorenc (00:07:14):
Yeah, there's a lot to those two questions. So I think you started off by asking, you know, how is the government attention and legislation affecting the landscape? Yeah, I think typically if you ask a software engineer, especially an open source maintainer, if they want the government paying attention, you know, that's a terrifying thought for most folks. But, you know, really what these attacks over the last couple of years have proven is that governments themselves are also at risk. They also use open source, and they have to start paying attention to this area. So it's a reality that folks are starting to learn to live with. You referred to the executive order from the Biden administration, I think it was last May, something like that, on, you know, securing America's cybersecurity posture, digital infrastructure, all of that fun stuff, of which open source and supply chain is a huge component.

(00:08:04):
Folks get confused a lot when looking at these things. Like, you can't have an executive order that says, write secure code and do things securely. I wish it were that simple. But instead, the way these regulations work is by kind of instructing agencies to instruct other agencies to look into things and produce recommendations, and then other folks start, you know, adopting and following those recommendations. So a lot of the fallout in the last year or so that we've seen from this was NIST, the standards-making body inside of the US government, put together a huge framework for how to develop software securely. So this is kind of that shift: instead of the software itself having to be secure at the end, the actual development process itself should be secure, because that's kind of how supply chains work. And if you're not doing that securely, and you're building on unsecured laptops and using unsecured CI servers, then attackers can find those. So that's called the SSDF, or Secure Software Development Framework. And again, folks like NIST can't make people follow any of these; they just publish these specifications. So we're kinda the

Katherine Druckman (00:09:09):
Very long <laugh>. Yes,

Dan Lorenc (00:09:12):
Yes.

Katherine Druckman (00:09:13):
Incredibly long.

Dan Lorenc (00:09:13):
Massive. They're massive. Yeah, so we're kind of at the end game now, I think, of these regulations rolling out, where the final step in a lot of this process is that the government can exercise their power as, you know, the largest consumer of software in the world, or one of the largest buyers of software, to start requiring people that sell them software to follow these practices. That's typically how it works. And so we're right at that stage now with things like the NDAA, or National Defense Authorization Act, that just got passed last week, which includes a bunch of regulations around cybersecurity. Then there's the new omnibus bill, which is, I think, only out today, actually. Folks are trying to scramble through 4,500 pages of legislation, which is tied to funding. So it's been fast in terms of government time, I would say, you know, going from an executive order to this in like a year and a half, but it, you know, feels pretty slow on the outside as folks are waiting for all of this to trickle down. The second half of that question...

(00:10:12):
Actually, do you wanna jump in there before going to the second half of your question around open source itself?

Katherine Druckman (00:10:15):
Yeah, no, no. I mean, I did just wanna say one thing about the length and the massiveness of these documents. I wonder if that's a bit of a deterrent, or if it makes them less useful. I've wondered that. I read some interesting criticism, you know, online about that, and I would actually plug here the work of the OpenSSF in releasing some concise guides. But yeah, I wonder if you've thought about that, before you go into the open source part.

Dan Lorenc (00:10:43):
Yeah, yeah. The length is always fun. Nobody reads them word for word, you know, that's my concern, and tries to follow them. But yeah, it ends up getting distilled down into scores and rubrics and frameworks and prescriptive guides. You mentioned the OpenSSF a couple times. That's the Open Source Security Foundation, which is part of the Linux Foundation. And they've kind of been at the forefront of a lot of this work, even, you know, writing recommendations and participating in the process that came up with these recommendations. And so they have a couple different efforts going to help condense that into smaller prescriptive steps. One of those is the SLSA framework, S-L-S-A, which kind of boils it down to four levels, levels one through four, and it's very easy to see what you have to do to get your supply chain to level one, to level two, to level three, to level four, hopefully saving you time in not having to read hundreds and hundreds of pages of specifications. The open source angle, though, that you mentioned is a really interesting one. You asked, is there software out there that's not open source or doesn't use open source? All the stats I've seen say somewhere between like 90 and 98% of the code in most modern applications is open source. Yeah, it's something like 98% of organizations surveyed say they're using it, and I think the other 2% got confused or clicked the wrong button or something like that with

Katherine Druckman (00:12:02):
Filling out.

Dan Lorenc (00:12:03):
Exactly. It's impossible to not be using open source. And I look at it as kind of the tip of the iceberg analogy. The little proprietary part most people write themselves and don't release is just that, the tip of the iceberg. And you have this massive iceberg underneath of all of the open source code to get you up and running. And all code has bugs; open source code is no different. The more code you're using, the more bugs you're gonna find. And some of those bugs have security implications. So that's where the kind of open source security angle comes from. You see vulnerabilities, you know, constantly. Stuff like Log4Shell last year was one of the biggest ones in recent memory for most folks. But not knowing what is under the water, not knowing what's beyond the tip of the iceberg, is where folks are struggling today.

(00:12:52):
You said, you know, it's not really an open source security problem, maybe it's a software problem. I like to take that a little bit further, right? There's nothing bad about open source. Open source security is actually better than proprietary code security by pretty much every measure. The problem is in how organizations consume open source, though. So if you look at Log4Shell again as an example: it was fully patched by those maintainers within a week or two. You can talk about underfunding and understaffing and all of that, and we should do better, but they did their jobs within a week or two. And then you see stats last week, on the anniversary of Log4Shell, where 70% of organizations were still vulnerable to it. You know, that to me isn't an open source problem. That's a problem with the way corporations are using open source.

Katherine Druckman (00:13:36):
That's a great point. Yeah. If you're not responding, if you're not taking these gifts that are thrown out into the public, if you're not accepting the fixes, then yeah, you're gonna leave yourself open. That's a good point. So, something that you mentioned in your intro, actually, I think you did, anyway: I wanted to talk about identity, and the concept of identity as it relates to signing software and trust. So could you talk about that a little bit? Why is identity verification so important to supply chain security?

Dan Lorenc (00:14:13):
Yeah. Identity. That is a really deep topic. I know you both have a lot of background in identity <laugh>. Yeah, I think identity is really tricky in open source in particular, but it's tricky, you know, for everything. But in open source, a lot of times folks don't know the identities of maintainers. They don't pay attention to any of this. You're just grabbing code off of the internet and picking it up and using it. There have been a lot of efforts to help improve that and help maintainers, you know, sign their code, to make sure that when people grab it, they know they're getting it from the right source. This happens in companies quite often. You know, companies have corporate code signing certificates. Open source is a little bit behind there because the whole identity world is trickier.

(00:14:56):
It's a really tough privacy question in open source. Kind of the way I look at it, you might not necessarily care who the maintainer of a particular package is; you just wanna know that it's that same person over time, or, if it changes, that it changed on purpose and it wasn't one of these typosquatting or package repository or GitHub account takeover style attacks. And so there's a lot going on in that space to help folks, you know, maintain these kind of persistent identities, but without necessarily, you know, taking photocopies of their passports and uploading them to GitHub along with every single release of software.

Katherine Druckman (00:15:31):
Oh, I think Doc may have a question.

Doc Searls (00:15:35):
Oh, I was on mute, actually. I do have a question. Sorry, I was on mute there for a second. I missed chat there for a second while I'm sitting here. Okay. So Dan, you know, I know there's a role that Sigstore plays in this. It's a standard for doing the work that you do. Tell us what your involvement with that is and how it relates to what you're doing with Chainguard.

Dan Lorenc (00:15:57):
Sure. Yeah. So the Sigstore project is a set of open source projects and also some infrastructure, which makes it a little unique, but I can get into that a little bit. But the overall goal of all of this is to make code signing really, really easy and free for open source maintainers. If you're familiar with what Let's Encrypt did for TLS across the internet, that's kind of the same approach we're trying to take with Sigstore, except for code signing rather than TLS and HTTPS certificates. The infrastructure allows you to request these short-lived certificates that are only good for a couple minutes to use to sign your code. You authenticate yourself by logging into most common identity systems, so say Gmail or GitHub or GitLab or any of these other common identity providers, and you sign your code with that.

(00:16:45):
So it might be an email address or GitHub account or something else that is known to the users and contributors of that project. And by doing this, it's all free. You don't have to worry about keeping keys around like you might have had to before with PGP. You don't have to worry about losing them or anything. And anybody can get up and running with it in a couple minutes. So this project got started a few years ago, before we started the company, back when I was at Google. And it's part of the OpenSSF today. So we're working on integrating it with most popular language package managers and ecosystems, and tools in the container space all across the CNCF, all that kind of fun stuff. At our company, we're using a lot of that tooling as well. So we have products and systems that help you start doing that inside of corporate environments. That same kind of open source technology that's free for PyPI and npm needs some tweaks to work behind firewalls and inside of enterprises. So we have a bunch of stuff in that space. So that kind of ties into the identity question we were talking about before.
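The short-lived certificate flow Dan describes can be sketched conceptually. This toy model stands in an HMAC for real certificates and asymmetric signatures, and every name in it (the CA secret, the identity, the TTL) is invented for illustration; it only shows the shape of the flow (identity, then a short-lived credential, then a signature checked against the issuance window), not how Sigstore is actually implemented.

```python
import hashlib
import hmac
import time
from dataclasses import dataclass

CA_SECRET = b"toy-certificate-authority-key"  # stand-in for a real CA key

@dataclass
class ShortLivedCert:
    identity: str       # e.g. the email address the signer logged in with
    issued_at: float
    ttl_seconds: int = 600  # good for ~10 minutes, then useless

    def token(self) -> bytes:
        # The "certificate": a CA-issued binding of identity to issue time.
        msg = f"{self.identity}|{self.issued_at}".encode()
        return hmac.new(CA_SECRET, msg, hashlib.sha256).digest()

def sign(cert: ShortLivedCert, artifact: bytes) -> bytes:
    # Bind the artifact digest to the certificate (toy construction).
    digest = hashlib.sha256(artifact).digest()
    return hmac.new(cert.token(), digest, hashlib.sha256).digest()

def verify(cert: ShortLivedCert, artifact: bytes, sig: bytes, signed_at: float) -> bool:
    # The signature must have been made while the certificate was valid.
    if not (cert.issued_at <= signed_at <= cert.issued_at + cert.ttl_seconds):
        return False
    return hmac.compare_digest(sign(cert, artifact), sig)

cert = ShortLivedCert(identity="maintainer@example.com", issued_at=time.time())
sig = sign(cert, b"release-v1.0.tar.gz contents")
print(verify(cert, b"release-v1.0.tar.gz contents", sig, signed_at=time.time()))  # True
print(verify(cert, b"tampered contents", sig, signed_at=time.time()))             # False
```

The point of the expiry check is the one Dan makes: because the credential dies in minutes, there is no long-lived key for the maintainer to lose or for an attacker to steal.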

Doc Searls (00:17:52):
Sorry about that. Boy, I'm riding mute. At home I just have a button. I hold it down and it's muted, I pick it up and it's not muted. Here I'm looking at the Zoom window. Sorry about that, folks. There's a term that I've heard more and more over the recent years in the identity world, which is about provenance. Mm-hmm <affirmative>. Two questions about that. Is it provenance or providence <laugh>? And the other is, with the way open source works, with code coming from so many places that gets put in containers and other things like this, how important is it to keep on top of what the provenance of something is when really you're trying to guard what happens along the way? And maybe your work actually takes that off the table, so customers aren't worried about it. So I just wanted to see if you can address that issue.

Dan Lorenc (00:18:49):
Sure, yeah. You mentioned the magic provenance word. I've heard it pronounced a bunch of different ways, including the common misconception there: people say "providence" a lot, which is, you know, a city in Rhode Island near where I live. It has nothing to do with all of this. But yeah, provenance is kind of where, say, that piece of code came from. An example I like to use is, most folks hopefully know that, here, lemme grab one from my desk. Most folks know if you find a, you know, USB thumb drive in the parking lot or sidewalk outside your work, you are not supposed to pick that up, bring it inside, and plug it into your computer. Hopefully most security folks have scared people away from doing that. But when you look at it, running pip install on some random package or npm install on some random package isn't really that much different.

(00:19:39):
You're still just kind of taking code from somebody you've never met, that you have no real reason to trust, downloading it, and then running it on your computer. Or, you know, even worse, sticking it into a container and sending that into your production data center. So provenance and a lot of this tooling is kind of taking that concept and saying, well, why can't we plug in that USB thumb stick, look at the code in there, and then actually trace that back: from that, you know, compiled binary, back to the build system that it was built on, back to the commit and the git repo that it was built from, back to the maintainer that wrote that commit. We don't really have those breadcrumbs today in a lot of these open source ecosystems, and cryptographic signatures and stuff like Sigstore are a way to start building up that trail of breadcrumbs in a way that you can verify later. We're not all the way there, but we're getting close, I think.

Doc Searls (00:20:30):
I actually have a question about Sigstore, but first I have to, sure, let everybody know that this episode of FLOSS Weekly is brought to you by Code Comments, an original podcast from Red Hat. You know, when you're working on a project and you leave behind a small reminder in the code, a code comment, to help others learn from your work? This podcast takes that idea by letting you listen in on two experienced technologists as they describe their building process. There's a lot of work required to bring a project from whiteboard to development, and none of us can do it alone. The host, Burr Sutter, is a Red Hatter and a lifelong developer advocate and community organizer. In each episode, Burr sits down with experienced technologists from across the industry to trade stories and talk about what they've learned from their experiences. I subscribe to it, by the way.

(00:21:18):
I really like the Deep Learning episode. It goes into how companies like Intel actually use deep learning to help their process internally. It's good stuff. Episodes are available anywhere you listen to podcasts, and at redhat.com/codecommentspodcast. It's all one word, codecommentspodcast. Search for Code Comments in your podcast player. We'll also include a link in the show notes. Big thanks to Code Comments for their support. So I'm looking at some notes here about, you know, along with provenance, what's the difference between the way you're doing it with your work and the way it's done in the all or mostly proprietary world, where you just leave it up to an Oracle or one of those big guys to take care of the whole thing? Or are learnings moving back and forth between these two worlds? Or are the worlds no longer entirely separate the way they once were?

Dan Lorenc (00:22:17):
Yeah. You know, if you look at the history of code signing, stuff like PGP has been around in open source for decades, and some large projects have used it, but it hasn't really seen widespread adoption. But if you look at, say, the proprietary worlds, you do kind of see these small circles or walled gardens that do have pretty good code signing set up. Some examples there: say, drivers for Windows; Microsoft requires those to be signed before they can be executed. Similar things for the Apple App Store for phones, or the app store for Android. But none of this was ever really set up general purpose. A lot of those concepts there, around like, is this signed? Yes, then it's safe to run. They're great for establishing some type of trust or barriers, but they leave a lot of problems unsolved.

(00:23:07):
One great example there was, you know, the attack on SolarWinds that we started by talking about. SolarWinds knew those certificates had been compromised as part of the attack, but the PKI, the infrastructure there for signing and the certificates, made it so hard for them to revoke that certificate that it took them over a year to do so, because it would've kind of bricked or broken a bunch of installations that weren't compromised. And so there's a lot of improvements to be made across both open source and closed source, kind of these proprietary app store worlds. Provenance takes it a little bit further, too, right? The typical code signing model that gets criticized a lot is: if it's signed, it's good; if it's not signed, it's bad. That whole concept of provenance you were talking about is more about attaching semantics. So you can sign something as it gets built, and you're not necessarily saying it's safe or not safe. You're just saying, hey, I built it on GitHub using this GitHub action. You're kind of stating and putting a bunch of facts on the record that somebody else can evaluate later. The only thing you're saying is that this statement is true. You're not saying it's good or bad or safe or unsafe. Does that make any sense?
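The "facts on the record" idea can be made concrete: a provenance attestation is structurally just a statement that names an artifact by digest and describes how it was built, which a consumer evaluates later. Below is a minimal sketch loosely modeled on the in-toto/SLSA provenance format; the builder URI and commit value are made-up illustrative placeholders, not real build metadata.

```python
import hashlib
import json

def make_provenance(artifact: bytes, name: str) -> dict:
    """Build a minimal provenance statement for an artifact.

    The statement asserts facts ("built by X from commit Y"), not
    safety; whoever consumes it decides whether those facts are
    acceptable for their environment.
    """
    return {
        "_type": "https://in-toto.io/Statement/v0.1",
        "subject": [{
            "name": name,
            "digest": {"sha256": hashlib.sha256(artifact).hexdigest()},
        }],
        "predicateType": "https://slsa.dev/provenance/v0.2",
        "predicate": {
            # Illustrative values; a real build system fills these in.
            "builder": {"id": "https://github.com/actions/runner"},
            "invocation": {"commit": "deadbeef"},
        },
    }

stmt = make_provenance(b"compiled binary bytes", "myapp-linux-amd64")
print(json.dumps(stmt, indent=2))
```

Signing this JSON (rather than the bare artifact) is what turns "it's signed, so it's good" into "these specific claims about the build are attested."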

Katherine Druckman (00:24:13):
Yeah, absolutely. I have a question that maybe others would be interested in. So, you know, we've talked a bit about Sigstore, but can you kind of lay out for us, what does it actually help prevent, and what are its limitations?

Dan Lorenc (00:24:31):
Sure. Yeah, Sigstore really is targeted just at that kind of authenticated-metadata-about-software use case. So it lets you make those claims. It lets you say, you know, with my email address, I sign off on this saying it's a real release, or I built this using GitHub Actions from this commit. It lets you put all these on the record, it lets people find them, and it's really great for that use case. I mentioned before that there's a bunch of open source projects within Sigstore and also a bunch of infrastructure, which is a little bit unique. So a lot of projects just come with something you can download and run. Sigstore has that, but it also comes with access to this free, community-run, public-good infrastructure instance of Sigstore that you can interact with, that contains a certificate authority.

(00:25:19):
You can get certificates from it, as well as a bunch of transparency logs where this stuff gets stored and can be looked up from. These transparency logs are great because once you sign something, you can put it on the record, anyone can find it, and you can't really change it later or take it back. But the downside there, you know, the big limitation, is that everything is on this public record. Mm-hmm <affirmative>. So if you're trying to run some of this infrastructure internally or inside of your company, and you're building code you might not be shipping, then you probably don't wanna be putting every single build of your internal backend system on this public log where anybody can read it. Or you might not wanna take a dependency on that service for something running inside of your data center. So there are different ways you could run parts of it inside or parts of it outside, but that's kind of the biggest limitation today. It's both a benefit and a drawback.

Katherine Druckman (00:26:07):
Okay. I wonder, could you also talk about the underlying software that powers Sigstore? Are there other pieces that fit into it, let's say Fulcio, Rekor, that kind of thing, and how do all of these things fit together to tighten up the software lifecycle?

Dan Lorenc (00:26:23):
<Laugh> Sure. Yeah. So I mentioned all those different components before; those are some of their names. So the typical interaction model is somebody would download a client for Sigstore. Depending on what type of stuff you're signing: if you're doing containers, you would download the cosign tool, it's called; if you're uploading a Python package, there's a special version of that for Python; or if you're doing npm, some of that stuff is getting baked into npm directly. So once you have that tool for signing and verifying, you just run the commands. Depending on how you're trying to sign your code, a browser window might pop open. You might click log in with Google or log in with Microsoft or something like that to get your identity. And then at that point you get a certificate. So that certificate comes from the certificate authority, which is how this works for most other ecosystems too.

(00:27:07):
And that's called Fulcio, so F-U-L-C-I-O, that's the name of the certificate authority piece. After you do all of that, the signatures, your SBOMs, the provenance, all that stuff gets signed and then stuck into the transparency log. So a transparency log is a kind of append-only database where you can add records, and other folks can verify that records are in there, but you can't tamper with them later. You can't delete them, you can't change them after they're in that log. And that piece is called Rekor, R-E-K-O-R. So there's a network of folks out there verifying that stuff only gets added to the log, and nothing gets removed or modified. And then you as a user can check that log too and see a record of everything that you signed.

(00:27:51):
So if all of a sudden something shows up in there that shouldn't be in there, you don't remember, say, doing a release of that package that day, then it's a pretty good sign somebody stole your identity somewhere, and you can use that as a hint, or the start of a process, to go and recover. So those are kind of the three components: a client, something like cosign, and then the certificate authority and the transparency log. But it's all set up in a way that's supposed to be developer friendly, and you don't even necessarily have to know about it beyond clicking that login button every once in a while when you release a package.
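The append-only property Dan describes can be sketched with a simple hash chain: each entry commits to the previous one, so modifying or deleting an earlier record invalidates every later hash. Real transparency logs like Rekor use Merkle trees with inclusion proofs and independent monitors; this is only a simplified stand-in for the tamper-evidence idea.

```python
import hashlib

class ToyTransparencyLog:
    """A hash-chained, append-only log (a simplification of a Merkle-tree log)."""

    def __init__(self):
        self.entries: list[tuple[str, str]] = []  # (record, chained hash)

    def append(self, record: str) -> str:
        prev = self.entries[-1][1] if self.entries else ""
        h = hashlib.sha256((prev + record).encode()).hexdigest()
        self.entries.append((record, h))
        return h

    def verify_chain(self) -> bool:
        # Recompute every hash from the start; any edit breaks the chain.
        prev = ""
        for record, h in self.entries:
            if hashlib.sha256((prev + record).encode()).hexdigest() != h:
                return False
            prev = h
        return True

log = ToyTransparencyLog()
log.append("signed myapp v1.0 as maintainer@example.com")
log.append("signed myapp v1.1 as maintainer@example.com")
print(log.verify_chain())  # True

# Tampering with an earlier record is detectable by anyone replaying the chain.
log.entries[0] = ("signed evil v1.0 as attacker@example.com", log.entries[0][1])
print(log.verify_chain())  # False
```

This is also why the log doubles as the monitoring tool Dan mentions: a maintainer can scan it for entries under their identity that they never created.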

Katherine Druckman (00:28:22):
Developer friendly is an excellent phrase that we like to hear <laugh>, actually. So I'm wondering, if I put myself in the shoes of a DevOps engineer, say, and let's assume I've spent a lot of time getting my pipeline to some near-perfect level of efficiency, in a perfect world <laugh>, how do I add this to my setup in the absolute easiest way possible, without completely moving my cheese?

Dan Lorenc (00:28:48):
Yeah, it depends kind of where that pipeline is, and everybody's got different systems for it. But that's really the magic, the secret sauce here. We're trying to get it to where you shouldn't have to do anything other than click a button saying you want to use this. And if you're using GitHub Actions, or GitLab runners, or CircleCI, or some of these popular build systems, then you're almost at that world today, which is great. These systems should be transparent to you as a user, so it should be pretty easy to get set up, and you shouldn't have to think about it after that. I think we're sort of seeing a shift from everybody running their own Jenkins build systems on old Mac minis in closets to <laugh> people coalescing on a smaller number of more highly secured systems like these. It's definitely a trade-off. There's a little bit of centralization happening, but overall I think it's a net win for the industry for folks to be relying on stuff that's more actively looked after <laugh>.

Katherine Druckman (00:29:45):
That's, yeah, that's good. Yeah, I mean, easy to implement is the magic word there. But what are the barriers to adoption

Dan Lorenc (00:29:54):
Or there <laugh>? Yeah, just kind of the breadth of the space and the number of package managers and the number of build systems to integrate with. It's easier to bring on, say, every user of a language ecosystem at once by integrating directly into that tooling than having everyone do it themselves. So that's the approach that we're taking. Just time, though. The project's only been around a couple years, and the adoption's kind of

Katherine Druckman (00:30:21):
Yeah, I was about to say,

Dan Lorenc (00:30:22):
Impressed everyone.

Katherine Druckman (00:30:23):
It seems early, right? It's early to talk about adoption quite yet, maybe, although it seems to have some traction. Yeah, I dunno. There's a landscape page, right, where you can... we love those landscape pages, don't we? <laugh>. But yeah, you can kind of get a picture of who's using it and where, and it does have some nice traction, right?

Dan Lorenc (00:30:46):
Yeah. Yeah. There's a landscape, I think it's a tab on the OpenSSF landscape, that shows projects that integrate with it or use it to sign their releases, or let you verify stuff on the other end. So yeah, there's a bunch of different categories for it and tons of projects. And I think most notably stuff like the Kubernetes project itself, which releases dozens and dozens of containers and binaries and manifests for every single platform under the sun, uses it now. They're one of the heaviest lifts that we've had to do today, just because of the size of their releases and complexity.

Doc Searls (00:31:23):
So I was looking through some of your white papers and projects and things, and you have these two products, Images and Enforce. Are they simultaneous for customers or sequential? How does what you sell work as people become more deeply involved as customers?

Dan Lorenc (00:31:48):
Sure, yeah. I'll explain the products a little bit. The Images product is a set of container images that we build. And you can use those to either put your applications on top of, to use to build your applications in your development pipeline, or even some just kind of off-the-shelf apps that you can run in your infrastructure, for web servers or frameworks or build systems, that kind of thing. Those images, we rebuild them constantly. We do all that SLSA stuff with SBOMs and provenance and signatures. So when you use our stuff, you can trace it back to the GitHub commits and all of the code reviews and testing that was done for each of the builds. And then powering those, it's actually a really interesting new Linux distribution that we built called Wolfi, W-O-L-F-I.

(00:32:37):
And so we built this Linux distribution to be optimized for these container environments. So there's stuff like no kernel in there, and the packages are all built declaratively inside of containers themselves, so that when you get one of these, it has the bare minimum number of dependencies, and we can keep the vulnerability scan results incredibly clean. So yeah, here's the little website. You can look up the SBOM for each of the images, and if you scan them with your, you know, vulnerability scanner of choice, we should be down close to zero vulnerabilities on most days, which is a refreshing result compared to a lot of the other images out there, where they're a lot more batteries-included. They have a lot more software, and it's not updated as frequently as ours. So that's kinda the start of the journey for a lot of folks.

Doc Searls (00:33:27):
<Laugh>, there's a horn honking outside.

Dan Lorenc (00:33:31):
You've got a delivery right outside.

Doc Searls (00:33:35):
I know, I'm waiting for the buzzer to go off in the kitchen that says there's an Amazon guy at the front door and somebody has to let him in, and it's not gonna be me <laugh>. I have some other business questions, but first I need to let everybody know about Club TWiT. Club TWiT is our own thing here at the TWiT Network. Joining Club TWiT is another great way to support our network. As a member, you get access to ad-free versions of all the shows on TWiT, as well as other great benefits. There's a bonus TWiT Plus feed, which includes footage and discussions that didn't make the final show edit, as well as bonus shows we've started, such as Hands-On Mac, Hands-On Windows, and Ask Me Anything, and fireside chats with some of your favorite TWiT <laugh> guests and co-hosts.

(00:34:25):
As FLOSS Weekly listeners, you may also be interested in checking out another Club TWiT exclusive show, the Untitled Linux Show, with our own Jonathan Bennett. It's a great show. You should go check that out. It's on weekends. So sign up to join Club TWiT for just $7 a month. Head over to twit.tv/clubtwit and join today. We thank you for your support. I was just thinking about "provenance" and "Providence." I wrote a book called The Intention Economy that was against the attention economy, and everybody calls it the attention economy, because that's what they hear. So it's hard <laugh>. Anyway, English is a tough language. I wanna ask about the business model, because you always have to ask that. Do you have a free version? Do you start people out at... I mean, it's a subscription of some sort, but you know, you have big companies that you're serving, you have little companies. How's that all work?

Dan Lorenc (00:35:26):
Yeah, so the images themselves, all the code, all of that is open source, right? You know, by building our own Linux distribution, we are rebuilding all of the packages, stuff like curl and OpenSSL and everything that you're familiar with. So all of that is open source. All of our distribution is open source. And we have a free tier of the images as well. So for a lot of the images, we keep a free version that anyone can download. You don't have to sign up, you can just scan it or pull it. You can find the whole list of them on our site. The paid version of all that comes with multiple versions and older support and LTS and other stuff that folks need for compliance. So we support a bunch of open source projects that use these things for free, and then charge companies for enterprise features and support for older stuff that they typically need. Pretty simple overall.

Katherine Druckman (00:36:25):
So I actually have another question, and it's one of my annoying multi-part questions <laugh>. I read a post you did, a retrospective, so to speak, on all the great things that have happened in open source security in 2022. And there are many, and please feel free to speak about those, but I also wondered what you hope to see next year, now that we're at the end of the year. It's a nice time to talk about these things. So that's part one. Part two: what role do you expect to play in those things that you hope to see? And the third part is, how can the open source community, how can people like me or anybody listening, get involved in making those things happen?

Dan Lorenc (00:37:08):
Sure. All right. So part one was what do,

Katherine Druckman (00:37:12):
What do you hope to see?

Dan Lorenc (00:37:13):
I hope to see, yeah. What do I hope to see? And then how am I gonna be a part, and then how can you be a part? Okay, so what do I hope to see? You know, this year, 2022, this is definitely the year of the SBOM, I would say. The SBOM, or software bill of materials, was kind of the centerpiece of most of the legislation, most of the government regulation that we've seen going around. And the whole idea with SBOMs is that when you get a piece of software from someone, they should give you this standard-format, machine-readable document explaining all of the stuff in there, including all of the open source, so that way you can look for stuff like Log4j, check the versions, scan all of the stuff inside for vulnerabilities, and stop treating software

(00:37:52):
like a black box. Mm-Hmm. <affirmative>. Unfortunately, it's a really hard topic. It's really complex. There's tons of different ways software is built and bundled together, and engineers love to bikeshed about all of those little details and how to describe those relationships and how to name those components and all of that fun stuff. And so a lot of the conversation has just been around, you know, what is an SBOM? How do we do them? Are they good? Do you have one? Do I have one? Do we need one? Does it secure me? That kind of stuff. So I hope next year is really the year that we can put them into action rather than just talking about them. It's way more than just binary, too. Even if you have one, we just released some data today about a lot of the existing ones out there: they're incomplete, they're inconsistent, they're missing things, they're not even, you know, in the correct standard format.

(00:38:37):
So I think hopefully next year we can actually get SBOMs in action and then start talking about the quality of them and making sure that they have enough data to be useful. We're doing a bunch of work there. We're participating in the standards groups, updating the formats, and releasing those with all of our software for our customers as well. It's pretty easy to get involved. If you're not doing SBOMs, then try, and then complain and file bugs and send fixes in for all the tooling to make it easy. Because, as you said before, developer friendliness is gonna be key here. And especially in open source, if people have to think about it or it's an extra step, then they're just not gonna do it.
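The core SBOM use case Dan alludes to, answering "do we ship an affected component?" from the document itself instead of emailing vendors, can be illustrated with a toy example. The document shape below is a simplified stand-in, not an exact SPDX or CycloneDX schema:

```python
# A toy illustration of the Log4Shell use case: given an SBOM-like
# document listing every component, answer "do we ship an affected
# log4j?" without asking the vendor. The schema is invented for
# illustration.

sbom = {
    "name": "example-app",
    "components": [
        {"name": "spring-core", "version": "5.3.13"},
        {"name": "log4j-core", "version": "2.14.1"},
        {"name": "jackson-databind", "version": "2.13.0"},
    ],
}

def affected_log4j(doc):
    """Flag log4j-core versions below 2.15.0 (the initial CVE-2021-44228 fix)."""
    hits = []
    for c in doc["components"]:
        if c["name"] == "log4j-core":
            major, minor, *_ = (int(p) for p in c["version"].split("."))
            if (major, minor) < (2, 15):
                hits.append(c)
    return hits

for hit in affected_log4j(sbom):
    print(f"{sbom['name']} ships vulnerable {hit['name']} {hit['version']}")
```

In practice you would run the same kind of query across every SBOM your vendors hand you, which is exactly the step that replaced the holiday email threads.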

Katherine Druckman (00:39:13):
Yeah, I think you were spot on there. I think developers, I used to be one <laugh>, tend to sort of hyperfocus. You're hyperfocused all the time, right, on your problem, whatever the problem of the day or the month or the quarter or the year is, and you're so focused on that, it's so hard to get a big picture of what you're working on. And I think, you know, I'm very sympathetic with people presented with having to get on board with yet another thing that you're responsible for and have to comply with. But, you know, I also think it's important, because who among us has not been in dependency hell? And I think the visibility that this will give you is going to be a lifesaver to many, eventually. But maybe we could talk actually a little, since you brought up SBOMs, I think that's something

Dan Lorenc (00:40:07):
We haven't? I said the word, yeah.

Katherine Druckman (00:40:09):
<Laugh>, I know you said the word. So yeah, I wondered if you could talk just a little bit about, you know, let's say the current state, and where we need to be with SBOMs, especially when you talk about things like the US government or other governments requiring SBOMs of certain vendors.

Dan Lorenc (00:40:30):
Yeah. Yeah. So that's kind of one of the big areas. We talked about some of the regulations at first and how the government's gonna start requiring this as it purchases software from vendors. This is one of the things they're gonna start asking for this year. So when they buy your software, they might ask you to provide an SBOM alongside of it. And so that's, you know, a list of all of the sub-components inside of that software, down to all the transitive dependencies, the dependencies of those transitive dependencies, which is really hard to do for folks in the beginning. They serve a couple use cases. I think the biggest two, or the easiest two to reason about, are, number one, vulnerability management. So zoom back a year, when Log4Shell happened: everyone had to spend their holidays emailing every single one of their vendors, asking 'em if their software used Log4Shell, and then how to remediate it

(00:41:19):
if it did use that affected version of Log4j. SBOMs would let you skip that step, hypothetically, cuz they'd give you a list of everything inside, and you wouldn't have to email them; you could just check those SBOMs to see if it's in there. Oh, we're jumping ahead to VEX. Yeah, VEX is another awesome topic. So VEX is kind of step two, as I see it, to the SBOM movement. Once SBOMs do get widespread, if they do get widespread, folks are gonna start seeing a lot more, right? It's a transparency conundrum. When you can start seeing how the sausage is made and you see all of the stuff in these software packages that you're using, the obvious next step is to start scanning all of that for vulnerabilities. And the whole software world's gonna look much worse and much more insecure than it did, you know, before this transparency was introduced. VEX is kind of the complement there, where VEX is a way for vendors to say that, yeah, we looked at all of those vulnerabilities in the scanner and we determined they're not applicable in this case, because we compiled with different flags, or we don't call the function that's vulnerable, or some reason like that. So they're kind of a way to automatically quiet the scanners, or automatically annotate those results, to help reduce the noise that's gonna come from all this increased transparency. Is that enough? Did I answer it all?
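The VEX flow Dan outlines can be sketched as a simple filter: scanner findings in, vendor statements applied, noise removed. The field names below are loosely modeled on OpenVEX but are illustrative, not an exact schema:

```python
# A minimal sketch of how a VEX document quiets scanner noise: the
# scanner reports everything it matches in the SBOM, and the vendor's
# VEX statements mark the findings that don't actually apply. Field
# names are illustrative, not an exact OpenVEX schema.

scanner_findings = [
    {"component": "openssl", "cve": "CVE-2022-0001"},
    {"component": "curl", "cve": "CVE-2022-0002"},
]

vex_statements = [
    {
        "cve": "CVE-2022-0001",
        "component": "openssl",
        "status": "not_affected",
        "justification": "vulnerable_code_not_in_execute_path",
    },
]

def actionable(findings, statements):
    """Drop findings that a VEX statement marks as not affected."""
    suppressed = {
        (s["component"], s["cve"])
        for s in statements
        if s["status"] == "not_affected"
    }
    return [f for f in findings if (f["component"], f["cve"]) not in suppressed]

remaining = actionable(scanner_findings, vex_statements)
# Only the curl finding is left for a human to triage.
```

The design point is that suppression is driven by the vendor's signed statement, not by a local ignore-list, so the justification travels with the software.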

Doc Searls (00:42:39):
Yeah, it did. And I was muting myself again. Sorry, I said I would never do it again, and I already did it. We have an active IRC back channel here, and one question from there, there it is, is: how do you see customers inventorying and managing the lifecycle of their certificates, beyond signing certificates? What others are in a blind spot for customers? I probably didn't read that right, but there you go.

Dan Lorenc (00:43:11):
Yeah, so the question is how do we see folks managing the lifecycle of certificates. Yeah, the kinda shift I think that we're seeing, and this ties into a lot of the other federal buzzwords around zero trust, but I think the shift we're seeing is kinda a movement away from these really long-lived secrets or long-lived private keys or long-lived certificates that you have to keep private and keep access to for a really long period of time, and if they leak, then the damages are long-lasting as well. We're seeing a shift from that to much shorter-lived, automatically rotating tokens and credentials and certificates that you can get on demand, similar to most of the zero trust infrastructure movement. So instead of doing everything with a password or some long access key that can get leaked, you're doing it with stuff that's automatically retrieved on demand. That's the general trend, not just for code signing, but for everything like that dealing with cryptographic material.
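The short-lived credential model Dan describes comes down to a different verification question: not "is this key still trusted?" but "was the signature made while the certificate was valid?" A minimal sketch, with made-up times and a made-up function name:

```python
# A sketch of the short-lived credential idea: instead of trusting a
# key forever, the verifier checks that the signature was logged while
# the certificate was valid, so a cert that leaks after its few-minute
# window is useless. Times and the data shape are invented for
# illustration.

from datetime import datetime, timedelta

def signed_within_validity(cert_not_before, cert_not_after, logged_at):
    """A signature only counts if the log saw it inside the cert window."""
    return cert_not_before <= logged_at <= cert_not_after

issued = datetime(2022, 12, 21, 12, 0, 0)
expires = issued + timedelta(minutes=10)   # short-lived, e.g. ten minutes

ok = signed_within_validity(issued, expires, datetime(2022, 12, 21, 12, 5, 0))
late = signed_within_validity(issued, expires, datetime(2022, 12, 21, 15, 0, 0))
print(ok, late)  # the second signature falls outside the window
```

This is why the transparency log matters in that design: it supplies a trustworthy record of when the signature happened, independent of the signer.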

Doc Searls (00:44:10):
Yes, I <laugh>, I was fishing for questions in the back channel, which is busy talking to itself about everything else at the same time. So let me ask a different question, which is: you spent a lot of time at Google, and I suspect it was at a fairly transitional time. And this may relate in this interesting way. I've been around Silicon Valley for a long time, and I always thought there were like three stages for every company, which is new, hot, and then big, and they're almost completely different. But Google's been big for a long time, and I think there are probably a lot of learnings you got inside of there before you punched out to do your own thing. And I'm just wondering if you could give us some insights about what you learned there. And, a separate question, I'll be multi-part like Katherine: what is the role of big platforms like Google and Amazon, and Microsoft for that matter with Azure, going forward? We're sort of in an era of maximum big tech, and they're starting to move in another direction now. At least from the outside it looks like that, but the changes that you're concerned with are a little bit outside that concern. I'm just wondering, what's happening to Big Tech?

Dan Lorenc (00:45:28):
Oh sure, yeah. So transition, you said, what did you say? New, hot, and then big.

Doc Searls (00:45:33):
Yeah, new, hot, and big. You get a startup and, if it lives, it gets hot, and then you plateau. You're big now, and it's like a whole other thing. Very few companies manage those stages the same way or with the same leaders or the same people <laugh>. They're sort of state changes that occur, like the stages of a rocket getting into orbit.

Dan Lorenc (00:45:55):
Yeah, so I think, you know, I was there for about nine years. When I started there, it was 2012, 2013, and it felt big at that time, but it was, you know, I think 10 times bigger by the time I left. So I don't know what phase it was in at either of those times <laugh>. But I think this kind of ties into the rest of your question on what role those big platforms play. Yeah, I think the biggest thing that I saw happening there, that I think we're still seeing happen throughout the rest of the industry, was what motivated a lot of the zero trust and what motivated a lot of this supply chain security work. If you look back at what was happening in kind of that 2012, 2013 timeframe at Google and most of these other big companies, it was Operation Aurora, the rise, the beginning of these nation-state attacks, where advanced persistent threats and really scary security folks were spending lots and lots of time trying to break into companies like Google.

(00:46:53):
And that was a threat model that most companies weren't really considering at the time, like an actual nation trying to attack you. So that led to huge amounts of innovation and reinventing and rethinking of the way they secured systems internally at these companies. Then, kind of the rest of that decade, the rest of the 2010s, I think every big, massive, kind of hyperscaler company had to deal with that. But unless you were one of them, you probably didn't think about it. At a 50 or a hundred or, you know, even 500 or a thousand person company, you didn't model that in your threat models. But now what we're starting to see, with the SolarWinds attack and with these other insider compromises happening at companies across the industry, is that you don't have to be a hyperscaler to be in those crosshairs. If you're somebody that sells software to somebody that sells software to someone that is one of those hyperscalers, or, say, is the federal government, now you're just a couple hops in the supply chain away, and everyone is having to start to reckon with that in their threat models.

Doc Searls (00:47:56):
It's interesting. I'm kind of going back to what you said earlier about the SSDF, which is provided by NIST, which is a federal government institution, and how big a customer they are. And I wonder, on the outside, I tend to think of the government as a slow mover just by nature. I worked for a company once that was one of those that supplied $25,000 PC XTs to the federal government. When they have a big buy, they're gonna buy, you know, 2 million PCs <laugh>, and they're gonna be costing more than anybody's paying in the open market for these things. But I suspect that the federal government is much more sophisticated now. And so I'm wondering what kind of pull they have and how that, in a way, drives the market from the demand side. I'm always interested in what the demand sides of markets do to pull better stuff out of suppliers, back up a chain, which isn't always a chain, cuz you're getting things, you know, containerized from five or six different locations at once. So it's more like a web, I guess <laugh>, than a chain.

Dan Lorenc (00:49:14):
Yeah, yeah, it's complex. You know, I'm sure the federal government's a lot more advanced now than it was back then. There's definitely pockets: there's really modern areas, and then there's, you know, areas that are still modernizing. They do have a lot of power, though, on that purchasing side, like you said. As one of the largest buyers, when they buy something, they buy a lot of it <laugh>, and they buy a lot of things across the board. Because of the way a lot of this stuff like the SSDF is written, it applies recursively, too. So if you're trying to comply with a lot of that stuff, you also have to require the same guarantees from your vendors. So it's kind of impossible to get started with, actually, cuz somebody has to do it first, and they can't be compliant until everyone else is. But it will spread pretty fast across the web, just given the size and number of folks that sell software to the government.
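The recursive property Dan mentions can be modeled as a small graph check: under rules written this way, a vendor can only attest compliance if every one of its own suppliers can, all the way down. The vendors and graph below are invented for illustration:

```python
# A toy model of the recursive requirement: a vendor is compliant only
# if it attests to the practices itself AND every one of its suppliers
# is compliant. The graph and names are invented for illustration.

def compliant(vendor, suppliers, self_attests, seen=None):
    """A vendor is compliant iff it attests and every supplier is too."""
    seen = seen or set()
    if vendor in seen:          # guard against cycles in the supply web
        return False
    seen = seen | {vendor}
    return self_attests.get(vendor, False) and all(
        compliant(s, suppliers, self_attests, seen)
        for s in suppliers.get(vendor, [])
    )

suppliers = {
    "app-vendor": ["lib-vendor", "build-tool-vendor"],
    "lib-vendor": [],
    "build-tool-vendor": ["lib-vendor"],
}
self_attests = {"app-vendor": True, "lib-vendor": True, "build-tool-vendor": False}

print(compliant("app-vendor", suppliers, self_attests))  # one gap breaks the chain
```

This also shows the bootstrapping problem Dan points out: flip any single supplier to non-attesting and everything upstream of it stops being compliant.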

Doc Searls (00:49:59):
Does anybody in the government write software? I'm just wondering, if I go into GitHub and I'm looking around for developers that are doing things... I mean, I know there are lots of them working for a lot of the big companies. We had Greg Kroah-Hartman on here earlier, who's one of the alpha maintainers of Linux, who talked about all the big companies that Linux maintainers work for. Do any work for the government? Or maybe the government just doesn't pay enough. I don't know. I have no idea. <Laugh>,

Dan Lorenc (00:50:30):
Yeah. Yeah. The government employs a lot of full-time software developers. They also do a lot through contracting. So there's a lot of custom software that gets written for the government, whether it's, you know, as a government employee or somebody contracted directly by the government.

Doc Searls (00:50:45):
Yeah. So Katherine, you had something there.

Katherine Druckman (00:50:50):
<Laugh>. Yeah. You know, I go back to putting myself in the shoes of a developer who doesn't know nearly enough about security, which I think is maybe everybody, because security is a complicated topic and we're always learning, right? So I'm wondering, to kind of recap some things you said earlier, what would be the next steps you would recommend for a developer who just needs to dig in and understand the risks in the software supply chain? Where would you recommend those people turn for resources right now?

Dan Lorenc (00:51:33):
Good question. Yeah, there's a really good intro course from the OpenSSF, actually. I'm sure you can find the link. But yeah, they have a bunch of education materials and courses on how to get started, and not just the supply chain security aspects, but just things to think about and things to watch out for as you write your software, to help reduce the introduction of security bugs. So yeah, it's a great course. I was able to review it before it was published, and I recommend it to lots of folks.

Katherine Druckman (00:52:02):
Yeah, that's actually a great one. I was gonna mention <laugh>, the OpenSSF has done a lot of really great work, as you mentioned in your post, you know, over the last year especially. As an organization, they seem to have really kind of built up a lot of steam. And maybe that's something you could talk about too. So how did Sigstore go from a project that you co-created to being part of the OpenSSF? What was that process?

Dan Lorenc (00:52:32):
Sure, yeah. It's kind of cyclical, actually, cuz I was also working on getting the OpenSSF started at kind of the same time. The OpenSSF itself is pretty new as well. Was it the end of 2019, beginning of 2020? Mm-Hmm. <affirmative>, something like that, when it actually got started. So, you know, Sigstore was kind of going on in parallel at the same time. And once the OpenSSF got going, and got through the pandemic and all of those delays and everything like that, it was a natural home. So we were able to move it into the OpenSSF maybe about a year ago now. I can't remember exactly when that got done.

Katherine Druckman (00:53:05):
Yeah, that's great. It's a great reminder of, I don't know, the nature of organizations like that, where you pitch in your solutions, and the best things kind of bubble up. And it's, I think, a good reminder to go to them not only for training, but to get involved where possible. I think, Doc,

Doc Searls (00:53:26):
<Laugh>, no, I'm just mindful of the time, cause we're getting down toward the end of the show here. So let me ask you, before we get to our final two kind of controlled questions, our experimentally controlled questions: are there any questions we haven't asked you yet that you'd like to address?

Dan Lorenc (00:53:43):
I think we covered everything.

Doc Searls (00:53:45):
<Laugh>. I don't think that ever happens <laugh>. We always think of things after the show's over. Well, lemme get to the final two then, which are: what are your favorite text editor and scripting language?

Dan Lorenc (00:53:58):
Ooh. I don't really get to write much code anymore <laugh>. But yeah, I'm a VS Code user when I do write code. And then scripting language, gotta say Bash <laugh>.

Katherine Druckman (00:54:12):
Ah, good answer, Bash.

Doc Searls (00:54:14):
Seems to be.

Katherine Druckman (00:54:15):
Love it when people say Bash.

Doc Searls (00:54:16):
Yeah. A lot of people say, "I don't wanna say it, but it's Bash," and I don't know what's wrong with that. We had Brian Fox in here earlier; it was his favorite too, but he wrote it <laugh>. So this has been great, Dan. We really loved having you on the show, and I apologize for being less prepared and situated than I would like to be, but I think it went well <laugh>. Katherine's strong, and she lives this stuff, so that always helps. I love it. So we'll have to have you back to talk more about this stuff after the world changes again in, sure, five minutes <laugh>. So Katherine, how was that for you? <laugh>,

Katherine Druckman (00:55:04):
That was great. You know, I enjoy... people like Dan are people that I honestly look to. I look to his work and his peers, and those are the people that I like to learn from, and, you know, honestly, also spread the word about, because they're doing a lot of really important work. So I was very excited to be part of the conversation to begin with, I still am, and I hope we can continue it. And I look forward to what he does next year.

Doc Searls (00:55:34):
Yeah. And, you know, when I saw the topic, I thought, oh my gosh, this is an important topic, and I'd thought nothing about it before. Yeah,

Katherine Druckman (00:55:40):
It's good. It's important stuff. Yeah. It really is.

Doc Searls (00:55:44):
Yeah. I mean, what, wait,

Katherine Druckman (00:55:45):
None of us want our software to be compromised. I mean, I think <laugh> we learned some hard lessons over the last few years, and that's why this conversation has really bubbled up and it's kind of at the forefront, or it is in my mind. I'm a little bit biased because I talk about this stuff a lot, but yeah. But people are out there

Doc Searls (00:56:05):
Doing the work. I'm living it now, because, you know, my suppliers... Oh,

Katherine Druckman (00:56:13):
Your email,

Doc Searls (00:56:13):
My email hosting, yeah, with Rackspace, and Rackspace is still down as far as... I don't even check anymore. They're down <laugh>. The part of it that's... I shouldn't say that. I don't know if they're actually down. I haven't checked in several days, because I've just given it up, you know. But I mean,

Katherine Druckman (00:56:34):
You may never get it back. <Laugh>

Doc Searls (00:56:36):
I may never get that mail back. Whatever's in there, maybe, it being IMAP, I may have it here, and I haven't... I've been so busy with other things I haven't looked. I kind of don't wanna look to see if it's here. But yeah, that was an attack. They got attacked, and yeah, it

Katherine Druckman (00:56:51):
It's a great example.

Doc Searls (00:56:52):
Something in their supply chain wasn't right, right?

Katherine Druckman (00:56:57):
There are so many. I mean, you know, the more complicated software gets, the greater the attack surface, or, you know, the number of vulnerable spots. And that's why so many really smart people are scrambling right now to come up with solutions and, again, putting in a lot of work and coming up with some good stuff. So the more we talk about it, the more visible this work is, all these things.

Doc Searls (00:57:21):
Yeah. I mean, the more variables, the more dependencies, the more the attack surface turns into a kind of four-dimensional polyhedron, <laugh>, you know, surfaces.

Katherine Druckman (00:57:31):
We talked about this in an episode, I think, about the number of dependencies, you know, that the average piece of software has. It's just, you know, increased tremendously over time, and all of those are opportunities for vulnerabilities, I suppose. And, I don't know, in the way that we write software, everything is sort of discrete and plugged in, like a little tower of Legos, but you just yank a little Lego out and the whole thing comes down. Right. It's

Doc Searls (00:58:01):
<Laugh>,

Katherine Druckman (00:58:01):
It's complicated.

Doc Searls (00:58:02):
It's an xkcd cartoon in all dimensions. Exactly.

Katherine Druckman (00:58:06):
We love to cite that one. Although I'm a little bit cautious, because I worry that it gives open source a bad name, but it is a good one. I think there's a misconception still that open source is something, you know, volunteers and hobbyists, which isn't necessarily true. But it is a good xkcd, so,

Doc Searls (00:58:24):
And I think a point Dan made early on is that 80, 90, close to a hundred percent...

Katherine Druckman (00:58:30):
No, it's basically,

Doc Searls (00:58:31):
That's kind of amazing.

Katherine Druckman (00:58:33):
of software is open source software. That's just the way it's made. Yeah. That's the way we make software. Yeah. So I get a little testy, you and I having been around open source for so long, and in the role that we were, talking about it in ways, you know, evangelizing it, promoting it. And I hear "the problem with open source software," and I'm like, what? It's not open source that's the problem. <Laugh> <laugh>,

Doc Searls (00:58:54):
But yeah, right. Yeah. Exactly. Exactly. Wow. It's like the problem with lumber is trees, you know? Well, not really <laugh>. Yeah. The

Katherine Druckman (00:59:02):
Problem with air is that we have to breathe it

Doc Searls (00:59:05):
<Laugh>. Yeah. So, so, and, and let's get a, a plug in before we go.

Katherine Druckman (00:59:10):
Ah, plugs. Yes. Always. What can we plug? We can plug the other podcast that we do. Actually,

Doc Searls (00:59:16):
We came up really see what I,

Katherine Druckman (00:59:18):
Yeah, that was a good one. We had a really good one this week about ChatGPT, and I'm looking forward to a holiday break though. Yeah. And I may have some other things to plug in the new year. I'll just be cryptic about that. You can follow me on Twitter, and, but actually don't follow me on Twitter. I hate Twitter today. Follow me on, yeah, it's like, it comes like muscle memory. I say Twitter and then I don't mean it. I know.

Doc Searls (00:59:42):
Mastodon,

Katherine Druckman (00:59:43):
I'm thinking, follow me on Mastodon <laugh>, katherined at Librem One. But

Doc Searls (00:59:49):
Yeah, I have a TWiT one and I have a journal one, and I've had a number of them. I haven't fully figured out Mastodon yet, but it's starting to become more useful. Oh,

Katherine Druckman (01:00:01):
It's so much more fun. Yeah. All the open source nerds are there and it's a good time. And yeah, if you're curious about what I'm up to, and what I might have to plug in the future, that's where to find me.

Doc Searls (01:00:11):
Well, that's great. And next week I think we're off, but then we have a round table coming up after that, and a lot of good guests.

Katherine Druckman (01:00:22):
Oh yeah, I think I'm on it.

Doc Searls (01:00:24):
There are more and more of them, so yeah, you're on it. <Laugh>,

Katherine Druckman (01:00:28):
<Laugh>. I guess I'd

Doc Searls (01:00:28):
Better show up that day. Yeah. <Laugh>. That's great. So until then, everybody, this has been FLOSS Weekly. I'm Doc Searls, with Katherine, and we'll see you next week.

Jonathan Bennett (01:00:37):
Hey, we should talk Linux. It's the operating system that runs the internet, a bunch of game consoles, cell phones, and maybe even the machine on your desk. But you already knew all that. What you may not know is that TWiT now has a show dedicated to it, the Untitled Linux Show. Whether you're a Linux pro, a burgeoning sysadmin, or just curious what the big deal is, you should join us on the Club TWiT Discord every Saturday afternoon for news analysis and tips to sharpen your Linux skills. And then make sure you subscribe to the Club TWiT exclusive Untitled Linux Show. Wait, you're not a Club TWiT member yet? Well, go to twit.tv/clubtwit and sign up. Hope to see you there.
