Transcripts

This Week in Enterprise Tech 535 Transcript

Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.

 

Louis Maresca (00:00:00):
On This Week in Enterprise Tech we have Mr. Brian Chee and Mr. Curtis Franklin back on the show today. Now, generative AI is the new craze and the publishing world is being flooded with it, but the Copyright Office is not gonna take it anymore. We'll talk about what that means. Now, realtime data processing could literally change your business. Today we have the CEO of Materialize, Arjun Narayan, and he's gonna talk about the technology behind realtime processing and its ability to transform your business as we know it. You definitely shouldn't miss it. Quiet on the set!

Announcer (00:00:33):
Podcasts you love from people you trust. This is TWiT.

Louis Maresca (00:00:46):
This is TWiT, This Week in Enterprise Tech, episode 535, recorded March 17th, 2023: Clouds Materialize. This episode of This Week in Enterprise Tech is brought to you by Kolide. Kolide is a device trust solution that ensures that if a device isn't secure, it can't access your apps. It's zero trust for Okta. Visit kolide.com/twit and book a demo today. And by Cisco, orchestrated by the experts at CDW. When you need to get more out of your technology, Cisco makes hybrid work possible. CDW makes it powerful. Learn more at cdw.com/cisco. And by Thinkst Canary, detect attackers on your network while avoiding irritating false alarms. Get the alerts that matter. For 10% off and a 60-day money back guarantee, go to canary.tools/twit and enter the code TWIT in the "How did you hear about us" box.

(00:01:48):
Welcome to TWiT, This Week in Enterprise Tech, the show that is dedicated to you, the enterprise professional, the IT pro, and that geek who just wants to know how this world's connected. I'm your host, Louis Maresca, your guide through the big world of the enterprise. But I can't guide you by myself. I need to bring in the professionals and the experts, starting with our very own principal analyst at Omdia. He's the man who eats and sleeps the enterprise and security. He's our very own Mr. Curtis Franklin. Curtis, how's your week, my friend? And you've got a busy week coming up, right?

Curtis Franklin (00:02:16):
I've had a busy week and I've got more busy weeks on tap. The spring conference season is starting, ranging from Enterprise Connect next week here in Orlando. We've got MegaCon coming up, which is not so much an IT conference, but full disclosure, it is produced by the same company that owns Omdia. After that we've got some vendor conferences, and then the first really big security conference of the season, RSA Conference in San Francisco, coming up towards the end of April. So I'm getting ready for some of those, got publications coming out, just all sorts of entertaining things happening. And once in a while I try to do things like sleep. So, you know, it's a full dance card.

Louis Maresca (00:03:21):
Fantastic. Well, welcome back for sure. We'll also have to welcome back our favorite network guy and all-around techie. He's our very own Mr. Brian Chee. Cheebert, how are you doing, my friend? How's the Orlando fairgrounds?

Brian Chee (00:03:35):
Right now I gotta go and try and force an expenditure of a little money. I really and truly want to go and start on the project of getting fiber optic strung. Now, interestingly enough, a byproduct of this mad rush for consumer fiber optics, you know, fiber to the curb, is that it's really doing a good job of driving down the cost of fiber optic cable. There's a class of fiber called FTTH. Basically it's a figure-eight cable where one part of the figure eight is a steel messenger wire, then the fiber embedded in the plastic, and then the last is an aramid yarn strength member. And the cool thing is you can actually just kind of split it apart very easily with a knife and have a clamp on a pole. So as the fiber goes by, you just stick it into a clamp, crank it down, and just keep going. Perfect. And that'll make it a lot easier for me to go and string fiber to some of the buildings around the fairgrounds.

Louis Maresca (00:04:44):
Amazing, amazing. That sounds very cool. Very cool. Yeah.

Brian Chee (00:04:47):
Well, good luck

Louis Maresca (00:04:48):
With that.

Brian Chee (00:04:48):
Yeah, it's funny, the fiber's actually now cheaper than Cat 6.

Louis Maresca (00:04:53):
<Laugh>. That's the truth. I can tell you, I bought some just recently, even just at a consumer price, and I'll tell you, it was much more expensive. I was really kind of blown away. It's all the shielding they do now.

Brian Chee (00:05:04):
Yeah.

Louis Maresca (00:05:05):
So, well, thank you, Cheebert. Well, welcome back. You know, we should definitely get started, lots to talk about here in the enterprise this week. Generative AI is the new craze, and the publishing and copyright world has been flooded with a ton of content. You've probably seen some of it. Now the US Copyright Office is not gonna take it anymore. We'll discuss exactly what's going on there. Realtime data stream processing could literally change your business. Today we have the CEO of Materialize, Arjun Narayan, and we're gonna talk about the tech behind realtime processing and its ability to potentially even transform your business as we know it. So definitely stick around, lots to talk about there. But before we do, let's go ahead and jump into this week's news blips. Organizations and users have gotten used to being notified when a breach has happened and having to be okay with their data hitting the wind.

(00:05:54):
The question you should be asking yourself, in fact, is should you be desensitized to that? Well, some people haven't been, because according to this Legal Scoops article, a class action suit has been filed over PayPal's data breach. Now, back in January, PayPal admitted that a data breach in December compromised personal and financial information of over 35,000 users. It did send out notices to users around January 19th, but the data loss was pretty extensive. In fact, it leaked names, addresses, phone numbers, emails, dates of birth, Social Security numbers, bank information, and account balances. Yikes. The missing information here is the fact that the login credentials used by the hackers were obtained not from PayPal's network, but from some undisclosed source. Hey, but they did provide free credit monitoring for a short term to the affected users. I guess that's the glass-half-full part.

(00:06:44):
Now, most of the users, however, have the glass half empty here. There's now a class action suit against PayPal for breach of contract, breach of an implied warranty, negligence, invasion of privacy, and unfair business practices. It also demands that PayPal implement better security measures to prevent future data breaches. Now guys, am I allowed to clap during these blips? Like I wish I could just sit here and clap, because, you know, the fact is, will this make an impact? I don't know, but I hope it does, because companies need more incentive to properly disclose their data breaches. Now, some of the plaintiffs claimed their credit history and money were actually impacted before all the disclosures and the heartfelt apologies by PayPal. Now, if anything can erode consumer trust, it's how corporations handle breaches. Now, companies must take data security seriously, and they need to invest in more robust technologies and practices to protect their customers' data. They also need to be transparent and accountable when data breaches occur and provide timely notification and remediation to the affected users. But unfortunately, so far only a select few have really done this well. Let's hope suits like these are a signal for others to do better.

Curtis Franklin (00:08:02):
So I have a story today that helps dispel two common myths about malware. The first myth is that all malware today is the result of huge budgets and large professional programming teams on the part of the malware authors. The second is that all the action from malware takes place within a few seconds of infection and persistence. So what exactly is this myth-busting story? Winter Vivern, also known as UAC-0114, is a politically motivated cyber threat group that you don't hear about very much. And it's made something of a comeback in recent months with campaigns against government agencies and individuals in Italy, India, Poland, and Ukraine. The organization has been active since at least December 2020, with initial activity in 2021, but the group has remained out of the public eye in the years since. In an article at Dark Reading, we read that Tom Hegel, senior threat researcher at SentinelOne, reported on the group and emphasized its close alignment with global objectives that support the interests of Belarus's and Russia's governments.

(00:09:16):
In his opinion, it should be classified as an advanced persistent threat, or APT, even though its resources aren't on par with those of its other Russian-speaking peers. In his report, Hegel says that Winter Vivern, whose name is a derivative of the wyvern, a type of bipedal dragon with a poisonous pointed tail, is in a category of, I love this, scrappy threat actors. The group's most defining characteristic is its phishing lures, usually documents mimicking legitimate and publicly available government literature, which drop a malicious payload upon being opened. The group's most tongue-in-cheek tactic, though, is to disguise its malware as antivirus software. Oh, those scrappy malware folks with their hints of humor. Like many of their other campaigns, the fake scanners are pitched through email to targets as government notices. The most common payload in recent months has been APERETIF, a Trojan that collects details about victims, establishes persistence on a target machine, and beacons out to an attacker-controlled C2 server located offshore. Taken as a whole, Winter Vivern straddles the line separating low-end APT groups from hacktivists. Whichever they are, though, their story is a good reminder not to assume that all APT groups are large and in charge. Sometimes those scrappy little troublemakers can cause us some serious problems.

Brian Chee (00:10:54):
This Ars Technica article might actually sound familiar; that's because it's not that original, anyway. So in this case, they're calling it Moongate, and the Samsung fans are kind of mad about how the AI processing is happening to photos of the moon. So here's what's happening. If you take a photo of the moon on a Samsung device, it will return a detailed photo of the moon, and some people are mad about this. The issue is that Samsung software fakes some details the camera can't really see, leading, in this case, a Reddit user called ibreakphotos to accuse the company of faking moon photos. The user's post claims to be able to trick Samsung's moon detection, and it went viral enough that Samsung's press site had to respond. Good article. So what this is, the article really reminds me of when a certain PC manufacturer added a feature into their PC firmware specifically to take shortcuts when it detected the PC Magazine performance test, another case of fudging a test result.

(00:12:11):
This adding of additional detail only for moon pictures feels similar. Hmm. The reality is a smartphone has a teeny tiny camera sensor and can't possibly be expected to produce superb images, and Samsung is certainly not the only phone manufacturer to fudge their results. The article goes on to say that Huawei did something similar, and in the test case it added craters onto the image of a light bulb. The article does speculate that one must wonder if it's the AI system being too much of a black box. My spin on this is perhaps these systems should offer to display better images that they have found as alternatives to the actual image taken by the phone. I'd compare it to how web search engines will often correct spellings of your search terms, but will display on the results that here's what you're getting with the corrected spelling instead, giving you the option to redo the search according to your original spelling. Hmm.

Louis Maresca (00:13:23):
Continuing with the trend of AI, now deepfakes are concerns for lots of people, and the concern is growing even more so for brands and organizations out there. Now, according to this New York Times article, AI might just make this problem more of a reality for some. Now, new altered videos are so far mostly the work of meme makers or marketers. Some have gone viral on social media sites like TikTok and Twitter, but most work by cloning celebrity voices, altering mouth movements to match alternate audio, and writing persuasive dialogue. However, this month there have been some serious situations that are highlighting the onset of a bigger issue. In fact, the question has to be asked, how is this even legal? Now, one example is potentially a company using deepfakes to pretend that celebrities are using and talking about their products; others are showing even the president calling out policy and law changes that are coming soon.

(00:14:12):
Many of the video clips feature synthesized voices that appear to be using the technology from ElevenLabs, an American startup co-founded by a former Google engineer. Now, in November the company debuted a speech cloning tool that can be trained to replicate voices in seconds. Now videos using cloned voices created by ElevenLabs or similar technology have gone viral in recent weeks. Now, many social media companies, including Meta and Twitch, have banned deepfakes and manipulated videos that deceive users. However, the damage could already be done by the time they actually remove the videos. Now, many social media companies will hopefully change their policies in the near future to help this a little bit more. But the question is, how much damage will need to happen before real change takes effect? Well, folks, that does it for the blips. Next up we have the bites.

(00:15:03):
But before we get to the bites, we do have to thank a really great sponsor of This Week in Enterprise Tech, and that's Kolide. Kolide is a device trust solution that really ensures unsecured devices can't access your apps. Now Kolide has some big news. If you're an Okta user, Kolide can get your entire fleet to 100 percent compliance. Kolide patches one of the major holes in zero trust architecture: device compliance. Now think about it, your identity provider only lets known devices log into apps. But just because a device is known doesn't mean it's actually in a secure state. In fact, plenty of devices in your fleet probably shouldn't be trusted. Maybe they're running an out-of-date OS version, or maybe they've got unencrypted credentials lying around. But if a device isn't compliant, or it isn't running the Kolide agent, it can't access the organization's SaaS apps or other resources. The device user can't log in to your company's cloud apps until they fix the problem on their end.

(00:16:03):
It's really that simple. Now, for example, a device will be blocked if an employee doesn't have an up-to-date browser. Using end user remediation helps really drive your fleet to that 100 percent compliance without overwhelming your IT team. Now, without Kolide, IT teams really have no way to solve these compliance issues or stop insecure devices from logging in. Now, with Kolide you can set and enforce compliance across your entire fleet, including Mac, Windows, and even Linux. Now, Kolide is unique in that it makes device compliance part of the authentication process. So when a user logs in with Okta, Kolide actually alerts them to compliance issues and prevents unsecured devices from logging in. Now it's security you can really feel good about, because Kolide puts transparency and respect for its users at the center of their product.

(00:16:53):
Now, to sum it up, Kolide's method means fewer support tickets, a lot less frustration, and most importantly, 100 percent fleet compliance. Visit kolide.com/twit to learn more or book a demo. That's K-O-L-I-D-E.com/twit, and we thank Kolide for their support of This Week in Enterprise Tech. Well, folks, it's time for the bites. Now, AI is the topic of the week these days, but the question is, how will it impact industries, especially ones like the copyright sector? It's really unclear. Well, this Ars Technica article talks about how the Copyright Office might be trying to crack down on generated content. Now, as generative AI technologies like GPT-4 and Midjourney have rapidly become more sophisticated, their creative use has really exploded in popularity. The US Copyright Office has issued guidance this week to clarify when AI-generated material can actually be copyrighted.

(00:17:57):
Now listen to some of this guidance. The guidance comes after the Copyright Office decided that an author could not copyright individual AI images used to illustrate a comic book, because each image was generated by the AI generator Midjourney, not a human artist. Now, in making this decision, the Copyright Office was committed to upholding long-standing legal definitions that authors of creative works must be human to actually register their stuff. And because of this, officials confirmed that AI technologies can never be considered real authors. Now here's some of the guidance. The guidance offers some specifics on what isn't copyright eligible when it comes to AI. In fact, anything that's solely generated by prompts with no modifications made, which the Copyright Office likens to instructions given to a commissioned artist, these are things that lack human authorship and therefore cannot and won't be registered.

(00:18:50):
When, quote, an AI technology receives solely a prompt from a human and produces complex written, visual, or musical works in response, the traditional elements of authorship are determined and executed by the technology, not the human user. And the guidance also explains that, based on the office's understanding of generative AI technologies currently available, users do not exercise ultimate creative control over how such systems interpret prompts and generate material. Now here's the interesting part. As in the case of Midjourney, an author who arranges generative AI output into a specific sequence, like designing the layout of a comic book, can potentially copyright the sequence of images if the arrangement is sufficiently creative. Now, similar logic applies if an author or artist modifies the AI-generated material and, quote, the modifications meet the standards for copyright protection. Here's an example. It could be modifying an AI image in Adobe Photoshop or altering AI-generated audio by using guitar pedals, among other cases. All right, guys, first question goes out to you, <laugh>, maybe it's obvious. How's the Copyright Office gonna know?

Curtis Franklin (00:20:00):
<Laugh> will do.

(00:20:03):
I was gonna say, it's gonna depend on a couple of things. In many of the areas where this would be a big issue, the human involved will feel the need to disclose this. And so I think that's one where we're gonna depend on a certain level of ethics. But there are some tools out there, oddly enough AI-powered tools, that work to discover text that has been generated by AI. I think that we will see a lot more of that coming up, in the same way that for most college students today and many high school students, if they submit an essay or long-form homework, the teacher or professor will run it through a program that looks for plagiarism. I suspect that running it through one of these packages that does at least a rough cut on whether it is AI generated is going to become standard operating procedure as well.

Brian Chee (00:21:25):
So I don't think this is quite as cut and dried as the Copyright Office would like it to be. This is not quite the tool that they think it is. So, like for instance, at last year's Maker Faire in Orlando, I had a very confidential conversation with some artists, shall we say. These artists said it has become a very normal thing for them to take line drawings and such, you know, logos and so forth, and run them through something like DALL-E, whatever, to try and find variations on those logos. It might be things like trying to get it to fit on a prop, or trying to get it to look a little different, like a company logo that has changed over the years, and they're saying it has become a very, very normal thing for them to do.

(00:22:31):
Especially when you start making movies. So I got a sneaky hunch that when the Copyright Office starts asking or allowing comment on this, Hollywood is going to be jumping in with both feet. And it's gonna be a really interesting conversation if you ask me. And boy, I can see it happening. You know, I definitely love the fact that I can use DALL-E to generate variations, try to get things to fit, try to get things maybe a little different, try to make someone happier with maybe a different version. And I'm wondering, does that mean I don't own the copyright on that creativity anymore? I think that's gonna be the big bottom line. And I don't see the Copyright Office having the tools, the money, or the energy to be able to go and prevent it right at the beginning. I think it's going to be used more when people come in and say, hey, he stole my design, right? I think these tools are gonna be used in court cases when people say a copyright has been violated.

Louis Maresca (00:23:52):
Right, man. I mean, the really interesting thing here, and I'm curious what you guys think, is there are a lot of cases where if they even change a pixel, the question is, is it now copyright eligible, right? In fact, I just had a conversation with my mother-in-law today about a bestselling author she knows of called Colleen Hoover, and they were talking about how there's an author out there that's actually copying her books and the content in her books by just changing the names in the books. And I think obviously in that case, just changing the names doesn't really make the book different, except, you know, in some cases. And so that's why I'm curious about this interpretation by the Copyright Office saying, hey, if you change something, if you import it into Photoshop and change some things, the question is how much can you change it, or how much do you need to change it, in order for it to really be copyrightable? And like you were saying, Brian, the language is kind of, I dunno, muddy in that case.

Brian Chee (00:24:44):
Oh yeah. And my wife likes the so-called bodice rippers, you know, the romance novels. And there have been lots and lots of lawsuits being slung around in that industry over people that have wholesale copied plot lines, you know, all kinds of things. Because if you are a big fan, and I'm a big fan of science fiction and so forth, you start seeing, wow, someone's reusing that, wait, he's not the original author. Where'd that come from? I would've thought the Copyright Office would've had a little more practice at this by now, cause I know this has been happening. I've heard of all kinds of things happening way back into the 1500s. So yeah, what's going on?

Louis Maresca (00:25:40):
Right, <laugh>. Now Curtis, if we talk about this from an enterprise perspective, I'm curious about that. Obviously there are businesses that make their dime on making sure that they can copyright things and license them out. Obviously we talked a little bit about the movie industry. You know, does this make it a much harder problem for the Copyright Office at this point to go and help with some of these cases?

Curtis Franklin (00:26:03):
Well, I think we could call this the IP Attorney Continuous Employment Act of 2023 <laugh>, because it's going to keep IP, you know, copyright, trademark, various IP attorneys busy for years and years to come. There have already been questions because, you know, let's face it, most of the machine intelligence systems that are out there right now do less generation of new content than they do compilation. And the question is, if they compile from a bunch of different sources, is it original? Which gets back to the old saying in academic research that if you steal from one paper, it's plagiarism; if you steal from a hundred, it's research. And so I think we're gonna see that. I have seen formulas in the past, I don't recall what they are, that say there are percentages that must be changed in order for a work to be considered a new work.

(00:27:25):
And it varies depending on what we're talking about. Obviously just changing the names but leaving everything else the same is not a new work; that's been shown to not be a new work. On the other hand, we've already had a case, I know that there was one and possibly two of the science fiction magazines that accept over-the-transom submissions that had to shut them down earlier this year, because they were getting so many pieces that were obviously generated by AI, and as they said, they were uniformly bad. It wasn't the volume so much as the volume of bad content. So this is gonna keep a lot of folks busy, many of them attorneys, going forward. From a corporate standpoint, I suspect that in many cases with larger organizations, sending in something that was purely AI generated and claiming it as your own is going to be a fireable offense if it's discovered, because it's, you know, at least lazy and at most fraudulent when it comes to your employment. Now here's the thing that I'm going to ask you. I have heard people say that one of the places where AI is actually pretty good is in certain types of code generation. What do you think about AI-generated code? I mean, if you were on a team, would you want to have the ChatGPT bot as part of your team?

Louis Maresca (00:29:20):
<Laugh>. I would say there's an answer to that, and I'm gonna give it in two parts. One, would I want a ChatGPT coder or developer on my team? No. But what I would want is the help of things like, in fact, I even use Copilot, which is, you know, a GitHub tool that essentially provides you code samples. And in fact, I've looked to try to train GPT around other languages and other frameworks to help provide, you know, code to be able to solve some of the tough challenges that maybe I've not done before in code, or to use particular algorithms or design patterns in code, that kind of bridge the gap between going and searching for it and trying to find an example versus actually just giving me the code.

(00:30:06):
And so I say, yeah, I think the productivity aspect of this is pretty impressive. And so I would say I would definitely encourage people to use it more, because it gets you closer to where you need to be. A perfect example is authentication code. It's usually a blocker to any type of productivity. So if I want to go use a graph API, or an API by a specific organization like Stripe or something, I need to first get the authentication right to be able to pass tokens back and forth and access the endpoint. And the problem is that code is sometimes complex. It's a handshake of this and a handshake of that, and I need to get the right type of handshake to be able to get the token that I need for access.

(00:30:47):
And so I haven't been productive yet; I've been just waiting on trying to essentially get the code right. And so this bridges that gap by saying, hey, here's the code, just use this, and you can essentially get going with actually being productive and coding things. And I think that's where this will really be helpful. Will it replace an actual developer? I don't think so, not at least in the short term. Maybe if these get infinitely better, that will be true. But right now it's just augmentation of a real engineer who knows the difference between bad and good code and the right design patterns and the right tested capabilities and that kind of thing. And even as a good engineer, you have to feed information to it, like, hey, I want unit-testable code that uses the visitor pattern, that allows me to pass references. You know, you still need to describe what you want. So I think there's a key aspect to all of this, but I do think at the very baseline of it, it's productivity.
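To make the kind of token handshake Lou is describing concrete, here's a minimal sketch of an OAuth2 client-credentials flow in Python. The endpoint URLs, client ID, secret, and scope are hypothetical placeholders, not Stripe's or Microsoft Graph's actual configuration; the general shape, post credentials, get a bearer token, send it on each API call, is the part that matters.

```python
# Minimal OAuth2 client-credentials sketch: exchange an app's credentials for a
# bearer token, then call an API with it. All URLs and credentials below are
# hypothetical placeholders.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"   # hypothetical
API_URL = "https://api.example.com/v1/reports"        # hypothetical

def get_token(client_id: str, client_secret: str, scope: str) -> str:
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": scope,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def call_api(token: str) -> dict:
    # Pass the bearer token on every request; refresh it when it expires.
    resp = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    token = get_token("my-client-id", "my-client-secret", "reports.read")
    print(call_api(token))
```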

Brian Chee (00:31:44):
But I think this is gonna be really interesting, you know, especially if we start getting something like, do me a favor, design me a bridge, and you set it up so it can do this, this, and this and span this river and all that. And then forget to tell 'em it needs to be made outta steel <laugh>.

Louis Maresca (00:32:03):
All,

Brian Chee (00:32:04):
You know, that's actually been a plot in some science fiction books. So I'm gonna keep a good watch on this, cuz this is going to be a great show. Let's get some popcorn.

Louis Maresca (00:32:18):
<Laugh>. That's right. Indeed it will. All right, well I think we put that one to bed. There's gonna be a lot more for us to talk about in AI in the coming weeks, I am sure. But we wanna get to our guest, cuz there's lots of information and lots of interesting topics to talk about there. So we should definitely get to that. But before we do, we do have to thank another great sponsor of This Week in Enterprise Tech, and that's Cisco, orchestrated by the experts at CDW. Now the helpful people at CDW understand that hybrid work continues to evolve and that your organization really must evolve with it to succeed. Now with so many options to collaborate remotely, you need a strong and consistent network to empower your workforce and keep 'em together. Consider a Cisco hybrid work solution designed and managed by CDW experts to deliver the same quality network experience to all your offices, even your satellite ones, connecting your team from pretty much anywhere.

(00:33:13):
Because a Cisco network keeps things flowing smoothly and securely with embedded security, compliance, and multifactor authentication that protects collaboration among your spread-out team. Now with real-time visibility into distributed application security, user and service performance, you get a better line of sight into how your network is operating and how better to grow your organization. And Cisco networking levels the playing field, providing access to flexible, high-end collaborative experiences that create an inclusive work environment. When you need to get more out of your technology, Cisco makes hybrid work possible. CDW makes it powerful. Learn more at cdw.com/cisco, and we thank CDW for their support of This Week in Enterprise Tech. Well, folks, it's my favorite part of the show where we actually get to bring in a guest to drop some knowledge on the TWiT riot. Today we have Arjun Narayan, he's CEO of Materialize. Welcome to the show, Arjun.

Arjun Narayan (00:34:15):
Thank you very much for having me.

Louis Maresca (00:34:17):
Now we have lots to talk about here, lots of really interesting stuff, so I'm looking forward to getting there. But our audience comes from all walks of life, all different chapters of their career, and they love to hear people's origin stories. Can you take us through your journey through tech and what brought you to Materialize?

Arjun Narayan (00:34:33):
Absolutely. So I was doing a PhD in computer science at the University of Pennsylvania in distributed systems, and had this recurring experience where I was finding many of the key fundamental work streams of research going on in distributed systems were repeating work that had been done decades ago in databases. And that's how I got sucked into databases, and I haven't turned back since.

Louis Maresca (00:35:00):
That's amazing. Yeah, I love to hear those types of stories where you're doing research and it generates an idea and you just have to run with it. That's amazing. Now, this is an interesting topic that we're gonna get into, because a lot of organizations understand and see the value in being able to manipulate and analyze data in real time. Now maybe we could start with: what is stream processing and why is it important to the business world?

Arjun Narayan (00:35:25):
So stream processing is the ability to work on data that is changing within seconds, or subsecond, milliseconds. And increasingly more and more of the experiences that you and I use in our day-to-day are powered by stream processing. Just think of pulling out your phone and calling an Uber, right? There is some system over there that's keeping track of the real-time locations of all of the available-for-hire vehicles. And these vehicles are popping on and dropping off as drivers, you know, decide that they want to be available or they end their shift. All of this is happening dynamically. The prices are changing, right? This is feeding into a surge pricing algorithm that is deciding in real time. It's matching up supply and demand within your location and within the neighboring locations. And then you hit the button and a car gets assigned to you.

(00:36:16):
That happens within seconds. That car gets routed to you, you get a little real-time image of where that car is. All of this is stream processing, right? And increasingly, more and more of the experiences that we have on the internet are powered by stream processing. And I think it's inevitable that the customer expectation, the user expectation, is that everything is always up to date. Batch processing, in contrast to stream processing, typically happens overnight, and that means acting on yesterday's data, which is acceptable for some use cases. But you know, as we start to buy seats for a concert and those seats are gone within less than five minutes, as we are clicking on an advertisement for a sneaker that is not gonna be there a couple minutes from now, live inventory tracking, live pricing, and live pretty much everything is the new expectation.

Louis Maresca (00:37:10):
Right, right. Well, let's talk tech, because I'm really interested in this part of it, and there are actually a lot of stream processors out there today. We talked about Apache Spark; in fact, I know Splunk uses Spark in their backend, I've talked to a lot of engineers at Splunk. There's Storm, there's Tegan, there's others. What's the technology you're using, and what's the differentiator there?

Arjun Narayan (00:37:30):
So underneath, Materialize is a powerful stream processor, in order to do live, up-to-date queries. And the technology here is called Timely Dataflow, invented by my co-founder Frank McSherry, who began this research back when he was at Microsoft Research, and then later on as an independent academic, before co-founding Materialize. So stream processing is, as you rightly pointed out, a rich field with a lot of complex history and technological evolution. And our take at Materialize is that users should be able to get the power of stream processing without having to manually program these beasts. In batch processing you have all of these amazing pieces of technology where you can write a SQL query and you can get the full power of a scale-out, elastic, cloud, multi-core experience harnessing all of that data and giving you an answer. That's not how streaming is today. You have to write a lot of Java code, a lot of microservices, in order to get the power, even if, morally speaking, what you're doing is just saying, hey, I have this SQL query, I want it to always be up to date. That democratizes access to harnessing the power of streaming for all of the experiences that we wanna build.
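For a rough sense of what "just write the SQL query" looks like in practice, here's a hedged Python sketch. Materialize speaks the Postgres wire protocol, so a standard Postgres driver can talk to it; the connection string, table, and view names below are invented for illustration, and the exact DDL should be checked against Materialize's documentation rather than taken as gospel.

```python
# Sketch: declare a continuously maintained view instead of hand-writing a
# streaming microservice. Connection details and schema names are hypothetical.
import psycopg2

conn = psycopg2.connect("postgresql://user:password@materialize.example.com:6875/materialize")
conn.autocommit = True

with conn.cursor() as cur:
    # The engine keeps this aggregate up to date as ride_events stream in.
    cur.execute("""
        CREATE MATERIALIZED VIEW active_rides_per_city AS
        SELECT city, count(*) AS active_rides
        FROM ride_events
        WHERE status = 'in_progress'
        GROUP BY city
    """)

    # Reads return the already-maintained answer; nothing is recomputed from scratch.
    cur.execute("SELECT city, active_rides FROM active_rides_per_city ORDER BY active_rides DESC")
    for city, active_rides in cur.fetchall():
        print(city, active_rides)

conn.close()
```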

Louis Maresca (00:38:48):
So I still have my ACM access, I'm still an ACM member, and I read, is it called Naiad? That's Project Naiad. So I read a little bit of the research paper there, really interesting stuff. And one thing I noticed is that obviously this requires distributed computing to be able to produce this. And so what kind of compute is required to be able to process this type of data in real time? I mean, obviously if there's a large data stream coming in, it sounds like there's gonna be a lot of compute power required. Just like some of the AI models that are out there today require a lot of compute power, so does stream processing, right?

Arjun Narayan (00:39:27):
Yes. If you have large data sets, you are going to have to put a lot of cores to work to have a low latency experience. At the same time, there are a lot of data sets where stream processing is working on a smaller volume of very high-value data. So you could think of fraud detection on a payments network, right? The number of payments that are being processed per second, while it's a large number, it's a number that's eminently handleable by a moderately sized computer. It's not like the number of clicks, right? There's orders of magnitude fewer payments than there are clicks. And so what's needed is an elastically scalable stream processor. And of course there's a lot of research that has gone into building elastically scalable stream processors, just as there has been into elastically scalable batch processors. And again, the user experience that I want to bring is for developers to be able to write SQL queries and get the full power of this. Just as today in a cloud data warehouse you can write a SQL query and behind the scenes there are complex microservices elastically allocating compute and putting a thousand cores to work, right? But most of the users writing SQL queries in the cloud data warehouse don't really need to know, and that to me is successful infrastructure.

Louis Maresca (00:40:43):
I love that. I love that it's kind of bridging the gap between previous knowledge and being able to utilize more technology using that knowledge. I think it's interesting, cuz obviously with Apache Spark, or even in the case of graph databases, like Microsoft's, you have to learn new languages and structures and semantics and syntax to be able to produce the right outcome in these large data sets, in these data lakes. And I think that knowledge gap is tough to grok, cause I'd have to go and learn more about the technology and how to do specific things. Sometimes I translate what I already know in T-SQL over to what I need to know in these other languages, and it's a hard challenge. In fact, that's why I think AI comes in sometimes to help me out with it. But if I could just use the existing knowledge I already have in SQL and SQL stored procedures to be able to query these data sources, I think that's super powerful. Are you seeing a lot of organizations having their IT staff and their engineers being able to just write queries really quickly and produce better reporting and that kind of thing, because of the fact you kind of have the ability to write normal SQL?

Arjun Narayan (00:41:49):
Absolutely. In fact, I would go even beyond just saying SQL. We take a lot of care at Materialize; the SQL that people write is the Postgres dialect of SQL, right? So there are many flavors of SQL. We go out of our way to present as close as possible to Postgres, which is in my opinion, or at least in my biased opinion, one of the cleanest dialects of SQL out there, and one that is extremely popular. The problem with a lot of existing frameworks is, exactly as you pointed out, they require you to first learn the language of that framework, and, you know, there's only so many frameworks that people can hold in their head. And part of the reason we chose Postgres SQL was a belief that people should not have to learn one more framework. Is it the most polished, most evolved SQL, with no improvements that I can think of? Of course not. But at the same time, there's a tax that someone has to pay in order to learn a new thing, and you have to earn the right to propose improvements over time, after you've already demonstrated the value that your framework brings, without requiring your users to learn something new just as the price of entry.

Louis Maresca (00:43:06):
Now, I saw a really cool architecture diagram recently about Materialize, and I noticed that you essentially did some re-architecting in the last few years. I don't know how far back you did the re-architecture, but it sounds like your storage backend is now using S3. Is that correct? So you're using S3 to store the data once you actually get the data into the storage layer. Is that right?

Arjun Narayan (00:43:31):
Yes, we have a separated compute and storage architecture, which is all the rage these days. It's very valuable because S3 is extremely cheap, nearly infinitely scalable, and extremely reliable. And this gives users the best of both worlds, which is they get the absolute lowest cost for storing all of their data, while they get to elastically scale out the compute resources based off of what they actually need to use. This was pioneered, I don't wanna say pioneered, but at least popularized, by cloud data warehouses that excel at allowing you to dump all your data into S3 and not really worry about the cost of that, which is wonderful for folks' enterprise cloud bills.

Louis Maresca (00:44:22):
Right, right. I think the one interesting thing I'm curious to dig into now, I normally leave the compliance and security questions to Curtis, cuz he's got better questions than I do, but I want to ask the one compliance question, cause I deal with this a lot. I deal with the fact that I gotta work in government clouds. I gotta be able to have my service and my storage in places where these organizations need to access it in a compliant way. How does that work for your organization and Materialize? Are you having to have the compute clusters and the data planes and the storage layers all in different clouds all over the world to be able to kind of hit all these different compliance regulations?

Arjun Narayan (00:45:00):
So today we are on AWS only, and over time we will be on other large clouds as well. We are in multiple AWS regions and spinning up new AWS regions all the time. But yes, this is one of the pain points of the cloud: where is your data domiciled, right? And that requires us, and every other cloud data management solution, to really bring our A-game when it comes to security and compliance, and that's very important.

Louis Maresca (00:45:34):
Yeah, yeah. I think it's always a challenge. Obviously you have to spin up new places, new storage locations, and, you know, sometimes you can't even get the data from them, so you can't even see what's going on. I mean, it's just a very complex environment when it comes to security and compliance, so I totally get that. One other curious thing before I bring my co-hosts back in again, I have 1,001 questions, this is cool stuff, is the fact that, you know, obviously we talked a little bit about Apache Spark, and the one thing I think is interesting about them is they try to lower the barrier for people to start using them. You can host it locally, or you can have the cloud-hosted version of it. What about Materialize? What's the barrier to entry there? How do you get data on there? How do you start stream processing? Is it good for small, medium, large enterprises? What's kind of the audience for that?

Arjun Narayan (00:46:28):
Absolutely. I think one of the things we're proudest of is the fact that people can be productive within minutes of connecting their data to our cloud platform, because the SQL is so familiar that they can just take queries that they maybe already have running on a batch system or on their Postgres database. A lot of people start with just a single Postgres instance; it's how I would start a new project anyway. And they can take that existing set of queries that they have and be doing stream processing within a half-hour onboarding. That's one of, I think, the most powerful things about Materialize: not asking you to learn something. And we onboard folks onto our cloud platform every single day with a single onboarding call. There's self-service as well, for those who don't want to be on a call, of course.

Louis Maresca (00:47:16):
Right, right. Okay, so I gotta shift a little bit, because I think the interesting part here is the fact that, you know, you have a good, easy barrier to entry. I'm curious again about a little bit more of the technology. You know, you talked a little bit about some scenarios, like, I think it was the Uber scenario, and there's also the payment scenario. Now I have seen architectures out there where they use kind of an event-based architecture, where they essentially throw an event up, a microservice catches it, it goes and does some work, and eventually the data will make it to a backend and be stored somewhere. So they kind of hand off these events to each other, and they kind of have a chain of responsibility, you know, do some things with them, and then maybe eventually it gets persisted or cached somewhere. How is this different? Like how should people use this instead?

Arjun Narayan (00:48:05):
I would say that most microservices, or many microservices, a large fraction of microservices, don't need to be full-fledged microservices. This isn't an attack on all microservices, right? So what we wanna do with Materialize is make it easy for users to build, maintain, and evolve the bottom 80% of microservices that are, morally speaking, just materialized views over some other set of input data that's constantly streaming in, right? So take a fraud detection microservice, right? You're looking at the stream of stuff that's happening and then coming up with some fraud categorizations. This is, morally speaking, a materialized view. Take a bill. A bill is a materialized view over the audit log of resources that were spun up and spun down. We wanna take care of the low-hanging fruit to leave users to specialize on the top 20% where it's truly differentiated, right? And that depends on your application. But it's similar to databases, right? Databases don't make your backend server obsolete. You still have a backend server. What you're doing is you're no longer worrying about, do I have to keep sorting my data so I can efficiently access this? That's what the database does. It does the heavy lifting for you, takes care of the 80% of data movement and data storage, so that you can focus on that set of logic that truly needs to be bespoke, custom logic that doesn't quite fit SQL.
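Following Arjun's fraud example, here's a hedged sketch of the "bottom 80% microservice as a materialized view" idea, in the same style as the earlier snippet. The payments table, its columns, and the thresholds are all invented for illustration; a real version would also add a time window, which is left out here to keep the SQL plain.

```python
# Sketch: a fraud-flagging "microservice" expressed as a continuously maintained
# view that downstream code simply reads. Table, columns, and thresholds are
# hypothetical; windowing by time is omitted to keep the SQL generic.
import psycopg2

conn = psycopg2.connect("postgresql://user:password@materialize.example.com:6875/materialize")
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute("""
        CREATE MATERIALIZED VIEW suspected_fraud AS
        SELECT account_id,
               count(*)    AS payment_count,
               sum(amount) AS total_amount
        FROM payments
        GROUP BY account_id
        HAVING count(*) > 100 OR sum(amount) > 100000
    """)

    # Poll the view (or subscribe to its changes) instead of running a bespoke service.
    cur.execute("SELECT account_id, payment_count, total_amount FROM suspected_fraud")
    for account_id, payment_count, total_amount in cur.fetchall():
        print("flag for review:", account_id, payment_count, total_amount)

conn.close()
```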

Louis Maresca (00:49:36):
Got it. Well, I do wanna bring my co-hosts back in, but before we do, we do have to thank another great sponsor of This Week in Enterprise Tech, and that's Thinkst Canary. Now everyone knows about honeypots and how good of an idea they are. I've used them before; they are fantastic. So why don't all of your internal networks run them? That's a good question, right? Well, simple: because with all of our network problems, nobody needs one more machine to administer and one more to worry about. Well, we know the benefits that honeypots can bring, but the cost and effort of deployment always drop honeypots to the bottom of the list of things to do. Well, Thinkst Canary changes this. Canaries can be deployed in minutes, even on complex network architectures, giving you all the benefits without the admin downsides. And the Canary triggers are simple.

(00:50:23):
If someone is accessing your lure files or brute-forcing your fake internal SSH server, you have a problem, right? Well, Canary uses deceptively uncomplicated, high-quality markers of trouble on your network. And here's how it works. Simply choose a profile for the Canary device, such as a Linux box, a Windows box, or a brand-name router. And if you want, you can actually further tweak the services your Canary runs. You may even be able to use a specific IIS server version or OpenSSH, or a Windows file share with actual files constructed according to your naming scheme. Now lastly, you register your Canary with the hosted console for monitoring and notifications. Then you sit back and wait. Attackers who have breached your network, malicious insiders, and other adversaries make themselves known by accessing your Canary. Now there's little room for doubt: if someone browsed a file share and opened a sensitive-looking document on your Canary, you'll immediately be alerted to the problem.

(00:51:25):
View testimonials at canary.tools/love and see why customers on all seven continents love their Thinkst Canaries. Deploy your birds and forget about them; they remain silent until needed. Get one alert via email, text, Slack, webhook, or syslog, only when it matters. Visit canary.tools/twit, and for just $7,500 per year, you'll get five Canaries, your own hosted console, upgrades, support, and maintenance. And if you use code TWIT in the "How did you hear about us" box, you'll get 10% off the price for life. Thinkst Canary is incomparable value, but if you're unhappy, you can always return your Canaries with their two-month money back guarantee for a full refund. However, during all the years TWiT has partnered with Thinkst Canary, their refund guarantee has never needed to be claimed. Visit canary.tools/twit and enter the code TWIT in the "How did you hear about us" box, and we thank Thinkst Canary for their support of This Week in Enterprise Tech. Well folks, we've been talking with Arjun Narayan, he's the CEO of Materialize. We've been talking about realtime data stream processing and the tech behind it, a lot of interesting stuff here. But I do wanna bring my co-hosts back in too, because they have a lot of interesting topics and questions to cover. I wanna start with Mr. Curtis Franklin. Curtis?

Curtis Franklin (00:52:51):
Thank you very much, Arjun. I've got a question, because sort of the holy grail for database work has forever been unlimited high-speed response to ad hoc queries. You know, anyone can ask any question at any time. Is that where you're going? Is that one of the use cases, that if the CEO asks a truly off-the-wall question involving complex data, you can go ahead and, using Materialize, give them an answer in 4.3 seconds?

Arjun Narayan (00:53:34):
No, and I'm glad you asked this question, cuz there's always a trade-off, right? Like there is a speed of light at which you can process data. If you come up with a completely new question, one for which we've never built any of the relevant indexes that we would want to have built to answer that question quickly, we're gonna have to process that data, and if we're talking large amounts of data, there is a certain lower bound on the amount of time that will be needed. Where Materialize shines is when you ask a question the second time. And if you think of most data processing, the questions are not changing every second. The questions follow some programmatic form. The questions are oftentimes repeated; it's just that there's new data, the next second's worth of data. Think of a fraud alert or a risk limit.

(00:54:22):
It's the underlying transactions that are going on at high volume, and we want to know as soon as some threshold is triggered. That's the same query, different data. So I'm glad you asked that question. Cloud data warehouses, OLAP warehouses, shine at answering novel questions on a static dataset. You don't update, you don't change the data more than once a day, but they really are designed to process a novel question as fast as possible. Their weakness is the flip side of our strength, which is if you ask the same question a second time and only 0.1% of the data has changed, they have no choice but to grind over all of that data again, spending your time and your money spinning up resources. And that's where we really excel, cuz we do work proportional to the amount of data that's changed, not the entire dataset, which really comes in handy for data sets that are changing very quickly.
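To make "work proportional to the amount of data that's changed" concrete, here's a toy Python sketch of incremental view maintenance. It is an illustration of the idea, not Materialize's engine: the "view" is a running total per account, and each batch of changes touches only the affected keys instead of rescanning the whole history.

```python
# Toy incremental view maintenance: keep a per-account payment total up to date
# by applying only the deltas, rather than re-aggregating the whole history.
from collections import defaultdict

totals = defaultdict(float)   # the "materialized view": account -> running total

def apply_changes(changes):
    """Each change is (account, amount, diff), where diff is +1 for an insert
    and -1 for a retraction; the work done is proportional to len(changes)."""
    for account, amount, diff in changes:
        totals[account] += diff * amount

# First second of data.
apply_changes([("acct-1", 20.0, +1), ("acct-2", 5.0, +1)])

# Next second: one new payment and one correction; only these rows are touched.
apply_changes([("acct-1", 7.5, +1), ("acct-2", 5.0, -1), ("acct-2", 6.0, +1)])

print(dict(totals))   # {'acct-1': 27.5, 'acct-2': 6.0}
```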

Curtis Franklin (00:55:24):
Okay, so you mentioned a great little code phrase in there and I want to dig into that. Traditionally we've had OLAP, online analytical processing, databases, and we've had OLTP, online transaction processing. OLAP is making queries of data that doesn't change very often. Transaction processing is stuff that's changing constantly. Is Materialize, by being an intermediary, something that can help bridge that? Is it one so that people don't have to have as many backend databases to still get good use of that data?

Arjun Narayan (00:56:16):
I think yes, it absolutely bridges the OLTP and OLAP worlds. It allows you to have OLAP views kept up to date at OLTP timescales. Now this only works if your query is fixed, right? So for a fixed set of queries or views, technically it can keep them up to date at OLTP timescales as the underlying data changes. In fact, a dominant source of data coming to Materialize today from our users is change data capture coming out of OLTP databases, be that Postgres or Oracle or SQL Server. And then within milliseconds, or subsecond, we have the views up to date, which is very valuable for these realtime experiences. As we've talked about, this doesn't eliminate the need for an OLAP warehouse, because as you pointed out earlier, you have certain exploratory workflows where you're slicing and dicing data. In fact, when you're slicing and dicing data, you actually don't want the data to change, cuz you're trying to hold as many dimensions of this thing fixed as you can while you're just trying to figure out what's going on. And so a batch OLAP warehouse is not something that's gonna go away. It's just a tool that is today being used to run repeated pipelines in a way that's very inefficient, because when all you have is a cloud-native hammer, everything looks like a nail.
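Since change data capture comes up here, a toy Python sketch of what a CDC feed looks like and how a consumer applies it may help. The event shape (op, before, after) is a generic illustration in the spirit of common CDC tooling, not the exact format any particular connector emits.

```python
# Toy change-data-capture consumer: apply insert/update/delete events from an
# OLTP database's change feed to keep a downstream copy in sync. The event
# shape here is a generic illustration, not any specific connector's format.

events = [
    {"op": "insert", "after": {"id": 1, "email": "a@example.com"}},
    {"op": "update", "before": {"id": 1, "email": "a@example.com"},
                     "after":  {"id": 1, "email": "a2@example.com"}},
    {"op": "delete", "before": {"id": 1, "email": "a2@example.com"}},
]

replica = {}   # downstream copy keyed by primary key

def apply_event(event):
    if event["op"] in ("insert", "update"):
        row = event["after"]
        replica[row["id"]] = row
    elif event["op"] == "delete":
        replica.pop(event["before"]["id"], None)

for e in events:
    apply_event(e)

print(replica)   # {} -- the row was inserted, updated, then deleted
```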

Curtis Franklin (00:57:43):
And those cloud hammers are going to be the death of us yet, I'm sure. But you know, one last question before I hand things over to my colleague Brian. Over the past decade or so, we've seen a number of different attempted solutions to the data problem. One that I'm very familiar with is, for example, SAP's HANA database. I have had the pleasure of sitting through a number of hours of explaining what that's all about, and it's a combination of re-architecting the backend and writing an eight-figure check for RAM on your server. You've talked about some of the reasons you went with an intermediary rather than a basic re-architecting of the database itself. You know, do you think that ultimately Materialize will lead to a re-architecting of the database, something on the backend that optimizes the fit with Materialize? Or is your mission to not require that, no matter what happens?

Arjun Narayan (00:59:11):
I would say that our mission is to work with realtime data no matter what the source is. And our recommendation, or my personal recommendation, is that you almost certainly want to land new pieces of data in an OLTP database somewhere. What they really shine at is data integrity, making sure that all of your rows or your transactions process in some order so that your rows are not violating certain data constraints. That is where your data should land. It's a question of the use cases that happen from that moment onwards, which are these views that power all these other services, that Materialize really is gonna help you orchestrate.

Brian Chee (00:59:59):
Oh, my turn, I think. Yeah. Okay. Arjun, I've got a background in physical oceanography, and one of our biggest problems is how to deal with time series data. I don't know if it's even appropriate in this case, but anytime some database person sticks the word time in there, I go and haul out this question: can I say there might be a better solution than flat files for my physical oceanographer friends? And to illustrate just the kind of data that I'm looking at, I'm gonna ask Anthony to switch over to a feed from the underwater observatory that I work on. If you scroll down and click here for the realtime integrated audio and then press play, that's actually a visualization. It's called a waterfall display. And it's data, lots and lots of it, but it's very consistent. We have the tools to search for occurrences, but we don't have tools to optimize searching the data. Should I shut up and go to my next question? Or is this something that might work nicely for Materialize to work on?

Arjun Narayan (01:01:26):
So this is a great question. There are, in my view, two kinds of time series use cases: one that is appropriate for us and one that is not. The one that is appropriate is when you want some realtime view over this fast-changing data, some kind of aggregation, some kind of concise representation that's always up to date. There's a second type of query that we are not purpose-built for, but there are other databases that are. I almost got to this point in response to Kurt, but then I held it back as I was droning on for too long. One of the nice things about the cloud is you can use the database that's purpose-built for the task that you have, and you are not restricted to a single monolithic do-everything data architecture.

(01:02:16):
 You can stream this data using a message bus like Apache Kafka, land it in an OLTP database, move it to a time series database, also move it to an OLAP warehouse, and things like that. The kind of use case for which I would recommend a purpose-built time series database is when you have a ton of historical data and you want to keep all of it at various fidelities without blowing out your storage budget, because you only want lossy representations of some data that happened two years ago, but you need high fidelity for, say, the last 24 hours. Time series databases excel at this sort of very compact, efficient representation of vast quantities of lossy data. And that's not what we do.
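
For the "appropriate" case, the shape of the computation is a rolling aggregate that stays current as new readings arrive. The sketch below is plain Python with a made-up hydrophone feed, just to show the kind of concise, always-up-to-date representation being described; in a system like Materialize this would instead be expressed as a SQL view over the incoming stream.

```python
# Toy sketch of the "appropriate" time-series use case: a concise,
# always-current aggregate over recent readings (here, a 60-second
# rolling average), rather than long-term downsampled history.
from collections import deque
import time

class RollingAverage:
    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.samples = deque()          # (timestamp, value) pairs

    def add(self, value, now=None):
        now = now if now is not None else time.time()
        self.samples.append((now, value))
        self._evict(now)

    def value(self, now=None):
        now = now if now is not None else time.time()
        self._evict(now)
        if not self.samples:
            return None
        return sum(v for _, v in self.samples) / len(self.samples)

    def _evict(self, now):
        # Drop readings older than the window; only recent data is kept.
        while self.samples and self.samples[0][0] < now - self.window:
            self.samples.popleft()

hydrophone = RollingAverage(window_seconds=60)
hydrophone.add(0.42)
hydrophone.add(0.47)
print(hydrophone.value())   # average of the readings from the last minute
```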

Brian Chee (01:03:07):
Okay. Thank you. So let's shift to something more in your wheelhouse. We've got a lot of people for whom maybe this isn't something they've even considered for their business. Maybe they're starting with something small, say e-commerce, and they'd like to be able to make it more efficient, make it more secure, make it more of everything. What kinds of things should they collect about their problem? Let's call it homework. What kind of homework should they be doing before they call you at Materialize, so that you don't spend a ton of time just trying to collect that homework?

Arjun Narayan (01:03:53):
Our most successful users, the ones that get to wow as quickly as possible, are those that say, I have this SQL query, I just need it to always be up to date. So if you have an application running with a database today and you're saying, oh gee, this one query, I only get to compute it on a pipeline once a day, I wish it was just always up to date, we can do that for you with minimal re-architecting. The other way to think about this, for those who have more mature architectures, is: would you like to simplify and get rid of some of your more low-hanging-fruit microservices? Not every microservice needs to be a microservice. For the next microservice that you have on your roadmap, could that just be replaced by a single materialized view? That will allow you to focus your limited development efforts and budget on the few that can't be, right? So you probably have ten that you wanna build. Can we model seven or eight of them as SQL queries and build those in a very rapid fashion, allowing you to focus on the two that really require a bunch of bespoke application logic?
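
Here is a hedged sketch of that "does this need to be a microservice?" decision. The cart and order tables, the connection string, and the business rule are all invented for illustration; the point is only that what would otherwise be a small polling service collapses into one view definition plus an ordinary SELECT.

```python
# Hypothetical before/after for replacing a small microservice with a view.
# All table names, the rule, and connection details are illustrative only.
import psycopg2

conn = psycopg2.connect(
    "postgresql://user:password@my-materialize-host:6875/materialize"
)
conn.autocommit = True
cur = conn.cursor()

# Before: a bespoke service would poll the source tables on a schedule,
# recompute "carts that never became orders," and cache the result.

# After: the same business logic lives in one view definition...
cur.execute("""
    CREATE MATERIALIZED VIEW open_carts AS
    SELECT c.cart_id, c.user_id, COUNT(i.item_id) AS item_count
    FROM carts c
    JOIN cart_items i ON i.cart_id = c.cart_id
    LEFT JOIN orders o ON o.cart_id = c.cart_id
    WHERE o.cart_id IS NULL
    GROUP BY c.cart_id, c.user_id
""")

# ...and the "service" becomes a query whose answer is always current.
cur.execute("SELECT count(*) FROM open_carts")
print("open carts right now:", cur.fetchone()[0])
```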

Brian Chee (01:05:11):
Cool. Well, let's dig into that a little bit. Can you share some examples, some success stories of how Materialize has made a difference? You can change the names to protect the innocent if you'd like.

Arjun Narayan (01:05:28):
<Laugh>. Yeah. So we have this amazing payments processor, and they have many payments that they process, and they have fraud models that were running in batch. These fraud models were updating on a continuous loop over all of the input signals that they have, in terms of account logins, in terms of transactions. They had built a very sophisticated fraud model that took about 30 minutes end to end to load into a batch processor, update the signals, and then feed those signals back into shutting down accounts that were suspected of being taken over or hacked. And 30 minutes is a very long time for a fraudster to get away with a lot of mischief. With Materialize, they're able to move that down to sub-second or single-digit seconds until they get these fraud signals. So it's really about taking those sophisticated models that they've already built, because a lot of these models do get built based off of data exploration, data analytics in a cloud data warehouse. That's when you're saying, hmm, it's funny that that data point is correlated with bad activity, and if you had this data point and this data point together, that's when we really should know. That ends up a lot of the time being modeled as a SQL query. And we really wanna take that very same SQL query that you came up with in the exploration phase and just put it into production, without having to unpeel it and then rewrite it as a Java microservice.
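
As a sketch of what "promote the exploration query to production" can look like: the fraud heuristic, table names, and connection details below are invented, but the pattern is to reuse the analyst's SQL as a materialized view and have a small worker act on it within seconds instead of waiting for a 30-minute batch run.

```python
# Hedged sketch: the heuristic, tables, and connection string are made up;
# the point is that the warehouse SQL is reused verbatim as a view.
import time
import psycopg2

conn = psycopg2.connect(
    "postgresql://user:password@my-materialize-host:6875/materialize"
)
conn.autocommit = True
cur = conn.cursor()

# The same query an analyst might have written while exploring the data.
cur.execute("""
    CREATE MATERIALIZED VIEW suspicious_accounts AS
    SELECT l.account_id, COUNT(*) AS failed_logins, SUM(t.amount) AS spend
    FROM logins l
    JOIN transactions t ON t.account_id = l.account_id
    WHERE l.succeeded = FALSE
    GROUP BY l.account_id
    HAVING COUNT(*) > 5 AND SUM(t.amount) > 1000
""")

# A long-running worker that acts on fresh fraud signals within seconds.
# (Materialize also offers a push-based SUBSCRIBE statement, which avoids
# polling entirely; a simple polling loop keeps this sketch self-contained.)
while True:
    cur.execute("SELECT account_id FROM suspicious_accounts")
    for (account_id,) in cur.fetchall():
        print(f"flagging account {account_id} for review")  # e.g. call a lockout API
    time.sleep(1)
```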

Brian Chee (01:07:05):
Super cool. Well, hey, we've got viewers all over the world. Are there any kinds of things people need to keep in mind if they're not in America, or say they have GDPR issues because they're in Europe? What kinds of things should people be asking themselves as they start throwing an email or a chat session at Materialize, just so that they're asking the right questions?

Arjun Narayan (01:07:42):
That's an excellent question. We have cloud regions in Europe, where GDPR came from, and customers in Europe. If there's an AWS region in your location, then we can likely work with you. If you can use a cloud data warehouse, we can work with you. There are of course some use cases, national security ones, for instance, where they're very strict about data domicile, that we're not yet ready for. We hope to be one day, just as the cloud data warehouses have gotten there. But we would love to work with any users who have any SQL queries that they wanna keep always up to date.

Louis Maresca (01:08:27):
Fantastic. Well, thank you, Arjun. Unfortunately, time flies when you're having fun. A lot of interesting stuff going on here. Thank you so much for being here. I, you know, since we're running low on time, I did wanna give you a chance to maybe tell the folks at home where they can learn more about Materialize, how they can get started, maybe interested in moving some data over.

Arjun Narayan (01:08:46):
Thank you very much. I really appreciate the opportunity to be here. You can find us at materialize.com. Head on over and you can sign up and try it, or you can read our documentation and take a look at some of our case studies or our architectures.

Louis Maresca (01:09:03):
Fantastic. Now, you said try it. Is there like a try-before-you-buy kind of thing?

Arjun Narayan (01:09:10):
Yes, there is an online trial. Today we're still in early access, but we are onboarding folks every single day. And we hope to one day have a, you know, completely seamless login, try it, get some free usage.

Louis Maresca (01:09:28):
Fantastic. All right. Well, thank you so much for being here. Well, folks, you've done it again. You sat through another hour of the best darn enterprise and IT podcast in the universe, so definitely tune your podcatcher to TWiET. I wanna thank everyone who makes this show possible, especially my wonderful co-hosts, starting with our very own Mr. Brian Chee. Cheebert, what's going on for you in the coming weeks, and where can people find you?

Brian Chee (01:09:50):
Well, apparently I'm helping on MegaCon. That's gonna be a lot of fun. I always enjoy people watching. There's all kinds of really, really interesting costumes walking by. And more than likely I'm gonna be helping with the Learn to Solder section, so it oughta be fun. In fact, I just bought myself a new Ryobi 18-volt cordless soldering iron. That is the best. Wow. It's so cool. I can't wait to not have to plug in a soldering iron. <Laugh>. Anyway, I've got all kinds of things happening and I'd love to share it with you and all our viewers. I still use Twitter, I'm kind of a dinosaur, but my Twitter handle is A D V N E T L A B, advanced net lab. And I share all kinds of weird and wacky things that I'm doing.

(01:10:51):
 You're also welcome to reach out, because we would love to hear your opinion on shows. I try to book by threads, you know, different major subjects, but we want to hear your opinions. So feel free to throw an email at me. I am cheebert, spelled C H E E B E R T, at twit.tv. And you're also welcome to throw an email at twiet@twit.tv, and that'll hit all the hosts. We want to hear your opinions. I've got pretty thick skin. If you didn't like what I did, fine. I'm more than interested in hearing your criticisms too. Take care and be safe, everybody.

Louis Maresca (01:11:32):
Thanks, Cheebert. Well, folks, we also have to thank our very own Mr. Curtis Franklin. Curtis, what about you? What's coming up in the coming weeks, and where will people find you?

Curtis Franklin (01:11:41):
Well, I'm going to be writing. I've got several things that I'm working on, and I'm gonna be talking to a lot of people around the industry. Moving on from cybersecurity awareness training, I'm gonna be working a lot on risk quantification over the next few weeks. Got a lot to do there, and I have a number of other things planned for the remainder of the year, including but not limited to professional cybersecurity training and cyber ranges. Nothing like a good cyber range to make you feel all warm and fuzzy inside. So lots to do. You can follow me on Twitter at KG4GWA, on Mastodon at kg4gwa@mastodon.sdf.org, or on LinkedIn. I put plenty of stuff up there and would love to have a conversation with any of the TWiT riot on any of the social networks.

Louis Maresca (01:12:44):
Thank you, Kurt, for being here and all that you do. Well, folks, we also have to thank you as well. You're the person who drops in each and every week to get your enterprise goodness. We wanna make it easy for you to watch and listen and catch up on enterprise and IT news. So go to our show page right now, twit.tv/twiet. You'll find all the amazing back episodes, the show notes, the co-host information, the guest information, and of course the links to the stories that we cover during the show. But more importantly, right there next to those videos you'll find those helpful subscribe and download links. Support the show by getting your audio version or video version of your choice, and listen on any one of your devices, any one of your podcast applications or podcatchers, Apple Podcasts, YouTube, you name it.

(01:13:25):
Subscribe and support the show, cuz it's really the best way to support the show. Of course, you may have also heard we have Club TWiT as another way to support the show. It's a members-only, ad-free podcast service with a bonus TWiT+ feed, and that TWiT+ feed you can't get anywhere else, you can't hear anywhere else. And guess what? It's only $7 a month. There are a lot of great things that come with Club TWiT; that feed is one of them. But there's also exclusive access to a members-only Discord server. In fact, I'm on it right now. We're talking about a lot of fun topics and a lot of fun stuff on there. So definitely join that Discord server and be part of the fun. You can chat with hosts and producers, you can chat with Leo, there are separate discussion channels, special events go on there, lots of fun stuff.

(01:14:09):
So definitely join Club TWiT and be part of that movement. Join it at twit.tv/clubtwit. Now, Club TWiT also offers corporate group plans. That's right. It's a great way to give your team access to our ad-free tech podcasts, and the plans start with five members at a discounted rate of $6 each per month. You can add as many seats as you like after the fact. This is a great way for your IT department, your developers, your tech teams to stay up to date with access to all of our podcasts. And just like the regular memberships, they can join that Discord server and get the TWiT+ bonus feed as well. So that's twit.tv/clubtwit. Now, after you've subscribed, you can impress your family members, your friends, your coworkers with the gift of TWiT, cuz we talk about some really fun tech topics, and I guarantee it, they will find it fun and interesting as well.

(01:15:01):
And if you've already subscribed, definitely have them subscribe as well and support the show. Now, if you're available on Fridays at 1:30 PM Pacific Time, we do the show live. You can watch the live stream at live.twit.tv. You can come see how the pizza's made, all the behind the scenes, all the fun stuff, all the banter before and after the show. So come be part of that fun. And of course you can always jump into our infamous IRC channel as well. That's right, we have an IRC chat channel. It's been there for a long time and there are a lot of amazing characters in there. That's irc.twit.tv. You can use it in your web browser or you can use whatever IRC client you have. And of course join the #twitlive channel and you can be part of that conversation as well.

(01:15:43):
So definitely join them. Now, I also want you to contact me, just like Brian. Hit me up on Twitter at twitter.com/LouMM. I'm there, I post all my enterprise tidbits, and I have lots of great conversations. In fact, I also have my Mastodon account on twit.social, and of course I'm Louis Maresca on LinkedIn as well. I have a lot of great conversations there, plus some good topics about what I do at Microsoft. In fact, Michael Damron reached out to me just recently. Michael, I promise I will respond soon. Lots of questions coming up, but lots of good topics you brought up, and I want to give you some information about that. So thank you for reaching out. You can always reach out to me; I love hearing from you. And if you wanna know what I do in my normal work week at Microsoft, check out developers.microsoft.com/office.

(01:16:28):
There we post all the amazing ways for you to make Office more productive. And if you are on Microsoft 365, check out your Excel instance. There's an Automate tab. That's right, that's my tab. I love that tab, check it out. It's a lot of fun to use, and a lot of cool things can happen if you do use it. So definitely check it out. I wanna thank everyone who makes this show possible, especially Leo and Lisa. They continue to support This Week in Enterprise Tech each and every week, and we couldn't do the show without them. So thank you for all their support over the years. Of course, thank you to all the engineers and staff at TWiT, cuz they make it easy for us. Thank you, guys. And of course, thank you to Mr. Brian Chee one more time, cuz he's not only our tireless co-host, he's also our tireless producer as well.

(01:17:08):
He does all the show bookings and the planning for the show, and we couldn't do the show without him. So thank you, Cheebert, again for all your support. And before we sign out, I wanna thank our editor today, because you know what? They make us look good after the fact; they cut out all my mistakes. So thank you very much. And of course, thank you to our TD for today, Mr. Anthony, because he has done a fantastic, seamless job and makes our lives a lot easier. Thank you, Anthony, for all your support. Well, folks, until next time, I'm Louis Maresca, just reminding you: if you wanna know what's going on in the enterprise, just keep TWiET.

Ant Pruitt  (01:17:42):
Hey, what's going on, everybody? I am Ant Pruitt and I am the host of Hands-On Photography here on twit.tv. I know you got yourself a fancy smartphone, you got yourself a fancy camera, but your pictures are still lacking. Can't quite figure out what the heck shutter speed means? Watch my show, I got you covered. Wanna know more about ISO and the exposure triangle in general? Yeah, I got you covered. Or if you got all of that down and you want to get into lighting, you know, making things look better by changing the lights around you, I got you covered on that too. So check us out each and every Thursday here on the network. Go to twit.tv/hop and subscribe today.
