Transcripts

Tech News Weekly 307 Transcript

Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.

 

Mikah Sargent (00:00:00):
Coming up on Tech News Weekly, I kick things off with an interview with our very own Ant Pruitt, all about Adobe Max, about Firefly and everything the company announced at its conference.

Jason Howell (00:00:14):
I gave you a double dose of Pixel 8 Pro, starting with a full review of this new device that I got a week ago. I've had a lot of fun with it, and I talk a lot about AI. And then I welcome my friend Mishaal Rahman to talk a little bit about an article he wrote on Android Police, all about Ultra HDR, why it's special and what you can expect in the Pixel 8 Pro.

Mikah Sargent (00:00:37):
Very, very cool. I round things out then with my story of the week. It's all about OpenAI, what the company is rumored to be announcing in November at its developer conference in San Francisco. And then I show you ChatGPT's new voice feature. It's pretty doggone compelling. All of that is coming up on Tech News Weekly, podcasts you love from people you trust.

Jason Howell (00:01:07):
This is TWiT. This is Tech News Weekly episode 307, recorded Thursday, October 12th, 2023: Pixel 8 Pro Review. This episode of Tech News Weekly is brought to you by Brooklinen. It's no trick, Brooklinen's best-selling linens are sure to curb those seasonal scaries this fall. Visit in-store or online at brooklinen.com and use code TNW for $20 off your online purchase of $100 or more. Plus

Mikah Sargent (00:01:39):
Free shipping. And by our friends ITProTV, now ACI Learning. IT skills are outdated in about 18 months. Launch or advance your career today with quality, affordable, entertaining training. Individuals, use code TWIT30 for 30% off a standard or premium individual ITPro membership at go.acilearning.com/twit. Hello and welcome to Tech News Weekly, the show where every week we talk to and about the people making and breaking the

Jason Howell (00:02:10):
Tech news.

Mikah Sargent (00:02:11):
I am one of your hosts, Mikah Sargent.

Jason Howell (00:02:14):
I'm the other guy, Jason Howell. And to prove that we break the news, I'm going to take this Pixel 8 Pro and I'm going to snap it into two pieces right now. Just kidding, I'm not going to do that. I have to review it first and then I will break it. Google's like

Mikah Sargent (00:02:29):
Get him, send it back.

Jason Howell (00:02:31):
They're like, that's our review unit. We did not clear this.

Mikah Sargent (00:02:33):
I wonder what they do. What would they do? Yeah, I don't know what they would do. What would they do?

Jason Howell (00:02:38):
Let's find out. They probably

Mikah Sargent (00:02:39):
Wouldn't send me another one. That's what it would be. You just wouldn't get another one.

Jason Howell (00:02:42):
I

Mikah Sargent (00:02:42):
Don't think they would charge you for it. Hopefully. Yeah, I don't think so. They would take a phone that you had and break it.

Jason Howell (00:02:48):
Oh, they'd break in and break

Mikah Sargent (00:02:50):
Your phone. Yeah, that's true. Well, while we ponder that, I think let us get underway with our first interview today. Some of you may know the person we're bringing to the show today. I am super excited to be talking about Adobe Max with a person who knows a thing or two about Adobe Max, having attended conferences in the past and also being a certified creative professional. Joining us here today, it's Ant Pruitt! What? Hi, Ant.

Ant Pruitt (00:03:25):
Hey, hold on. Yes, Google. Send it to Ant Pruitt, not Jason Howell.

Mikah Sargent (00:03:31):
Yes. Oh, you got yours,

Ant Pruitt (00:03:33):
Right? Ant Pruitt. Got it. Appreciate it.

Mikah Sargent (00:03:38):
Hey gents. How

Ant Pruitt (00:03:38):
Y'all

Mikah Sargent (00:03:39):
Doing? Doing well, doing well. Yeah. So I think we've got to start off by kind of laying the groundwork here, because some people might not know about Adobe Max. So before we dig into what was announced, I was hoping you could start by telling us: just what is Adobe Max, what's the purpose of this event, and who goes to these Adobe Max conferences?

Ant Pruitt (00:04:00):
Sure. Well, this is not quite like the other events that we cover, whether it's WWDC or Build or anything like that. It's not a developer conference, but it is a creators conference. Adobe gets together every year, typically in Los Angeles, to not only talk about some of the products that they're getting ready to release to the masses, but they also allow the community to get together and have sessions and classes and learn different tips and tricks using their tools, the Adobe tools, as well as just a bit of networking, because different companies come there that are looking for content creators, and you could give them your business card and say, hey, look me up, let's work together on a contract and things like that. It is a really big conference, happens over about three days typically, and it's just a great, great environment and energy there. I love going there. I haven't been in about two or three years now, and I absolutely love it and I look forward to trying to be there next year. Couldn't make it this time; I've got some other stuff going on this month. But yeah, I'm definitely trying to be there next year.

Mikah Sargent (00:05:10):
Nice. Now, at this conference, of course, there's a main keynote that takes place where the company announces new stuff, and we always see press releases afterward with some of the stuff that the company has or plans to announce. And I noticed that the term Firefly might as well have been plastered in big, bright, bold letters over everything, with maybe a little, I don't know, superscript next to it, because it was just all over the place. So can you tell us: what is Firefly? It's been around for a while, but what are the new updates that Adobe announced for Firefly?

Ant Pruitt (00:05:52):
Well, saying Firefly is Adobe's way of saying, you know what? We know you're tired of hearing AI every other word and every other sentence, so we're just going to say Firefly. No, actually, Firefly is Adobe's generative AI tool. It was announced in beta last year, I believe 2022 it was in beta, and it's been sitting around for people to play around with. You just go to the browser and you can do text-to-image prompts, similar to how you would use Midjourney or Stable Diffusion, but it's really, really intuitive. It had a lot of different tools right there on the screen and didn't have a bunch of nerdy sliders and things like that where you're trying to figure out what happens if I move this. It was pretty straightforward and really well done. But the biggest thing with Firefly was that the models were built based off of the data that Adobe received from contributors to Adobe Stock. So everything that was uploaded was from people that essentially Adobe already knows and had a sign-off that says, you know what? We can use these particular images, vectors, or what have you to help train our AI. So when you go in and do a prompt, it's basically using good, clean data.

Mikah Sargent (00:07:13):
I was able to, recently, I did a charity live stream for a Dungeons & Dragons campaign and

Ant Pruitt (00:07:21):
I saw that. Congratulations to you all.

Mikah Sargent (00:07:23):
Thank you, thank you. One of the cool things: I ended up using Firefly in Adobe Express to help me out with something. There's this mechanism, this little item set in D&D called the Deck of Many Things, and it's kind of a wild, just completely adventure-changing set of cards that, when the players draw from it, can change the outcome of what's happening. And I wanted to create some new cards to add to the deck that were just completely made up, that I had completely made up, to add some more fun and kind of wild happenstance.

Ant Pruitt (00:08:04):
Oh, you're that kind of dungeon

Mikah Sargent (00:08:05):
Master. Indeed, indeed. The chaotic kind, chaotic good. And so I used Adobe Firefly in Adobe Express to help me create the tarot-style artwork for

Ant Pruitt (00:08:19):
The cards,

Mikah Sargent (00:08:20):
And I was really, really impressed with what it did. And at the end of the day, what I really enjoyed about it, I think, was that I didn't feel as kind of gross about it, because I knew that this was all artistic information, that the licenses and the agreements and everything were all already in place, and that you

Ant Pruitt (00:08:48):
Didn't feel like you were

Mikah Sargent (00:08:48):
Stealing. Exactly, that is what it boils down to. And so, yeah, I mean, Adobe does seem to be super conscious of the impact that AI-generated content could have on the industry, or is

Ant Pruitt (00:08:58):
Already

Mikah Sargent (00:08:59):
Having on the industry. So outside of having those deals in place with Adobe Firefly, what all is the company doing to address this issue and kind of make it easier for people to know what's going on, and that the content that they're using is good to go or may be the result of some sort of copyright theft?

Ant Pruitt (00:09:21):
Well, you've got to give Adobe credit, because they have latched onto the idea that AI-generated content is not going anywhere anytime soon and decided, you know what? We're going to participate in this, but we're going to put some guardrails up as best we can to not only protect the people that are consuming this content that's coming out there, but also protect the creators of generated content and the models. So they have a lot of guardrails in place, and they've now been working for about a year and a half with the C2PA, I believe is what it's called, which is an acronym that's dealing with content authenticity.

(00:10:05):
When generative AI creates a piece of art or what have you, no one really knows, other than someone that can pixel-peep and look at it and say, oh yeah, I can see that AI did this, nobody really drew this or created this. Well, with the Content Authenticity Initiative, there are now going to be some badges embedded into the actual files that are created, and it allows you, the consumer as well as the creator, to see where this file came from, how it came about, and if it got edited and modified any additional times. It's all encrypted, and it's really, really neat, and it's something that's been worked on, like I said, for about a year and a half, including it being open source, so everybody can get into this and see the ins and outs and how it's going to work for the creators and for the masses out there, to keep everybody safe in this interesting time of voting season coming up here in the US. So there's the fear of misinformation and disinformation being put out there, but if we have some tools out there to say, hey, this is authentic, this was created by so-and-so on such-and-such date, and The New York Times got a hold of it and decided to do some exposure adjustments on this image and they published it, I mean, you're going to be able to see that record line for line.

Mikah Sargent (00:11:26):
Nice. Now, you have an example of this, I think, on your website. Is that the case? I saw you shared a link with us.

Ant Pruitt (00:11:35):
And the thing is, I wanted to share this because this is definitely a work in progress, the Content Authenticity Initiative. Adobe has partnered up with a couple different brands and manufacturers out there, including Microsoft, Nikon, I want to say Nike, and a couple other brands, but not everybody is on board just yet. So I just threw some images up on my website at random, just to sort of prove that right now it's not necessarily working for my website, because, well, maybe Squarespace doesn't have something in place just yet. So if you go to the images there, they look like the same image. That's image two. Image one has a person in it, and I basically just used Generative Fill to

Jason Howell (00:12:24):
Remove the person.

Ant Pruitt (00:12:26):
So that's AI that's in place, and then the person got removed, but yet there's no type of badge or anything on there that says, hey, this was created by me and I used Generative Fill to fix this and get rid of this person, so understand that this has been photoshopped, if you will. I started to think about that a little more, because we have people that are taking pictures or what have you and putting them on social media, which is where a lot of this fear about misinformation is going to be stirred up. And right now, if you put this on Instagram or what have you, you're not going to be able to see a badge. But hopefully Meta and the others out there are going to be able to work with Adobe and that whole coalition to figure out a way to put some type of badge out there to let us know, hey, this is a legit image from this particular poster. What

Jason Howell (00:13:22):
About stripping out that metadata? I mean, when I think of photos on my phone and sharing location, there are ways to turn off the location capture before the photo, which I actually do,

Ant Pruitt (00:13:35):
But there are also ways

Jason Howell (00:13:36):
To strip it from a photo that's already there, correct? I imagine that's going to be pretty darn easy to do here as well. Really, at the end of the day, it just comes down to

Mikah Sargent (00:13:43):
The scruples of the person who wants to share a photo, right?

Ant Pruitt (00:13:47):
C2PA has addressed that and put something in place where, if the data gets pulled out, there will be a flag that shows

Jason Howell (00:13:55):
Up on the file that says,

Ant Pruitt (00:13:57):
This has been

Jason Howell (00:13:59):
Modified.

Ant Pruitt (00:14:00):
But again, it depends on where you look at it. You're not going to see it right now. If you use Microsoft's Bing Images, I believe they're now displaying the badges for content authenticity. That's just one source, but we need everybody to get on board to be able to see this stuff.

Mikah Sargent (00:14:19):
So that's a tough one.

Jason Howell (00:14:24):
Sorry, I didn't mean to throw a curve ball. Oh, no, no,

Mikah Sargent (00:14:26):
No, no. It's the idea in general. I absolutely 100% want there to be a way for AI generated imagery to be something that anyone can verify, but

(00:14:40):
I don't want it to be something that gets in the way of what it is that I'm making. An example of this, going back to what I was just talking about before: at first, I made the images for those cards in, I think it was in Photo... no, no, no, I made them on the demo page for Adobe Firefly, the latest edition. And when I did that, it had a little watermark in the corner, and it's good that the watermark is there, but what's annoying about the watermark being there is that it gets in the way of the artwork, and I don't want

Ant Pruitt (00:15:14):
That to be, it gets in the way.

Mikah Sargent (00:15:15):
And so when I made it in Express, as a person who has an Adobe account, I was able to make it without the watermark on there, which is how I wanted it to be. So I almost wished there was some sort of machine-readable watermark, maybe, that got added as

Ant Pruitt (00:15:32):
Opposed to

Mikah Sargent (00:15:33):
Something that's so obvious.

Ant Pruitt (00:15:34):
That's part of the badge that the Content Authenticity Initiative has put out there. It is not necessarily a watermark, but if you were to hover your mouse, it would show up on your screen that, hey, this is legit, and it works well, again, with the partners that have actually implemented this. And then you have camera manufacturers that are trying to figure out a way to put it into their camera systems, because right now you can put your copyright information into your camera, so every time you snap a photo, it writes it into the metadata of the file. But again, that could very easily be stripped out. But with the C2PA, supposedly there is encryption in place, and it'll allow us to be able to see, all right, someone tried to tamper with this, and it'll put a flag out there. You're not going to see it right on your image, but if you hover, then the badge pops up. So it's not disrupting the actual art that you're trying to view.

Mikah Sargent (00:16:35):
That's good. Now, let's break away a little bit from AI to talk about the general updates that people can expect in the Creative Cloud suite, because I know Adobe did not just spend the whole time talking about Firefly. Will we see updates to Illustrator, InDesign, Dreamweaver? You know what I mean? There are so many other applications; let's not forget about them.

Ant Pruitt (00:16:54):
There's a couple of different updates, but I hate to break it to you: most of it is geared around AI for their packages. But as I said on our TWiT blog, twit.tv/blog, AI is not here to take jobs from content creators, okay? It's here to help us out. So inside of Adobe, yes, you have the Generative Fill options for the likes of Photoshop, but you also get Generative Fill in Illustrator. Even some additional AI is being used in Premiere Pro to help speed up video editing. Everybody there at TWiT is a content creator, and I know pretty much everybody in there can edit video one way or another. Well, they decided to figure out a way to make that even easier with text-based editing. So as you're listening to your video, it'll transcribe it for you right there, and if you hear ums and filler words and things like that, the AI can remove those filler words in the video and do the cuts for you seamlessly. It's just little tools like that throughout. But then there's also the access of having Photoshop on the web. So if you want to work on a particular project away from your workstation, and not with all of your super-duper powerful Macs and Windows machines, you can pull it up in a browser and work on those files, and it's pretty good. They're going to continue to work on it and make it better. But I like the idea of being able to open up Photoshop on a Chromebook,

(00:18:27):
Because everybody's not able to do that. And they also announced having Adobe Illustrator available on the web. It's definitely beta.

Mikah Sargent (00:18:36):
It's not quite ready,

Ant Pruitt (00:18:38):
But they're working on it. So again, Adobe is trying to do a lot of things to make creation a little bit more accessible for everybody.

Mikah Sargent (00:18:48):
Nice. Alright, let's break through this real quick. The conference always includes something that Adobe calls Sneaks, so I was hoping you could tell us a bit about Sneaks and what the company has announced this year. Oh, of course, explain Sneaks and then tell us what they announced this year.

Ant Pruitt (00:19:06):
Sure. Well, Sneaks is Adobe's way to sort of end its conference and say, hey, this is some of the stuff that we've been working on, and it's called Project So-and-So. Let's have some of our developers come up here on the stage in front of thousands of people and present what they've been working on. And so a lot of the things that we see today in our Adobe products have been shown off in Sneaks. Granted, Sneaks can be quite innovative and exciting, but yet we still have to deal with some of the controversy that comes out of there. I can remember, I think it was like 2018, something like that, they demoed an audio file, a WAV file was being played, and they wanted to say, have so-and-so actually say this phrase. And when they put it up there, they basically said, have Barack Obama say so-and-so. And they were able to manipulate the file to where it sounded like Barack Obama said something totally off the wall

Mikah Sargent (00:20:10):
On stage. They showed that they could make Barack Obama say something,

Ant Pruitt (00:20:17):
Right, just in a what? It was pretty scary.

Mikah Sargent (00:20:22):
And that's the thing. Most of the time it is pretty

Ant Pruitt (00:20:25):
Exciting. It's pretty exciting most of the time. But at that moment I can remember being in the venue and everybody got quiet. Everybody got quiet.

Mikah Sargent (00:20:36):
It's like when Adobe made the woman smile that time, stuff like

Ant Pruitt (00:20:39):
That. So again, Adobe has been trying to

Mikah Sargent (00:20:42):
Learn

Ant Pruitt (00:20:43):
From missteps like that and understand we need to make sure there is a level of authenticity out there, or a warning that says, hey, this is something that's AI-generated, so on and so forth. But yeah, that's what Sneaks is all about: some of the tools that they're trying to put in place to help make things easier for content creators. And they announced a couple different projects that I thought were very, very exciting, especially when it comes to video. I can't remember the name of it right off the top of my head, but you're working on a video file and someone is in the background; you could literally do a generative fill to remove that person from the background of your video. So if I'm sitting here just doing a talking head and I accidentally had a light on behind me, and I didn't want that light there because the light was distracting: fix it in post. Just circle it with a selection

Mikah Sargent (00:21:38):
That is, fill it

Ant Pruitt (00:21:39):
In, and it works. And not only does it work for just me sitting here as a talking head; if I were moving around and the camera was moving around, it was able to go frame for frame and motion-track to fill in the spot that was consistently

Jason Howell (00:21:54):
Fill it similarly,

Ant Pruitt (00:21:55):
Consistently.

Mikah Sargent (00:21:57):
Project

Jason Howell (00:21:58):
Stuff, Project Fast

Mikah Sargent (00:22:00):
Fill

Jason Howell (00:22:00):
Frame to frame to frame things look

Ant Pruitt (00:22:01):
Slightly

Mikah Sargent (00:22:02):
Different and stuff. But this is,

Ant Pruitt (00:22:03):
Oh gosh, it was quite fascinating. There's that, and then there are tools in Illustrator, because people are trying to figure out how to create characters and things of that nature, and not everybody can draw. Well, if you want to just doodle out something that looks like a puppy, you can

Jason Howell (00:22:22):
Sort of doodle it out

Mikah Sargent (00:22:23):
Everybody doodles like trash bags. It's like, no, I was trying to make a puppy, not a glop.

Ant Pruitt (00:22:30):
And you could doodle

Mikah Sargent (00:22:31):
That out and say,

Ant Pruitt (00:22:32):
Hey, this is a puppy

Mikah Sargent (00:22:34):
And I'm sure Adobe's Sensei AI is going

Ant Pruitt (00:22:37):
To look at you sideways and whatever. Let me help

Mikah Sargent (00:22:42):
Sensei.

Ant Pruitt (00:22:43):
Yeah, it'll jump in and generate that form into an actual puppy.

Mikah Sargent (00:22:48):
That's cool.

Ant Pruitt (00:22:49):
And create vector files. So if you want to bring this into Illustrator and use all of the different tools, such as puppetry and things like that, you can. It was super daggum cool. And they put it up on YouTube now; I just checked this morning, it's now been published to YouTube. I'll get you guys a link to put in the show notes. It is so fascinating.

Mikah Sargent (00:23:08):
Cool. Alright, we do need to let you go, but just one quick last question: is there one thing that stood out for you that you just felt, whoa, that's a thing Ant Pruitt is very excited about?

Ant Pruitt (00:23:21):
Well, besides that with the video, there was one more thing with fashion; I believe it was called Project Primrose. And this is on my radar because of my son, and I'm so proud of him and what's going on with him, and him being into fashion at such an early age; he has a real passion for it. But they're working on a way of pretty much digitizing some of your textiles and allowing you to have prints and designs that work with you and change on the red carpet and all, just

Jason Howell (00:23:54):
Just with

Ant Pruitt (00:23:54):
With the click of a button, or it has sensors in it that work with how you move. So the lady, she had a dress on, and as she flowed and twisted and turned a certain way, the pattern,

Mikah Sargent (00:24:05):
Oh, that's neat. Worked with her

Ant Pruitt (00:24:06):
And it was really daggum cool.

Mikah Sargent (00:24:09):
That sounds really cool. Of course, folks can head to, I think it's max.adobe.com, to watch some of that. If folks want to keep up with your great work, both in the club and outside of it, where should they go to do that?

Ant Pruitt (00:24:25):
Make sure you're following me all over on the social medias. I am ant_pruitt on pretty much all of them, even Twitter,

Jason Howell (00:24:34):
Whatever it's called

Ant Pruitt (00:24:35):
These days, x.com. Make sure you're following me on all of those platforms, but most importantly, make sure you're checking me out over in Club TWiT. Plug, plug, plug: twit.tv/clubtwit.

Mikah Sargent (00:24:47):
Alright, and thank you so much for your insights and understanding of Adobe Max. I appreciate you joining us today. Thank you, Ant, it was a pleasure, and we'll talk to you soon.

Ant Pruitt (00:24:56):
Take care, gents. Thank you.

Jason Howell (00:24:58):
See you in the office soon, sitting right behind me in my office. Okay, coming up next: I have in my hands, and have had for the last week, the Pixel 8 Pro, Google's latest premium device. I'm going to give you a review, and then after that, after the break, we're going to talk with Mishaal Rahman about some of the cool technology that makes Android 14 and this phone special. So that's the next, I'd say, 30 minutes of the show coming at you here in a second. But first, this episode of Tech News Weekly is brought to you by Brooklinen. With nighttime starting earlier (I can feel it, it's in the air right now, the spooks are coming out), let Brooklinen help with those seasonal scaries so you can curb those fears with a cozy Brooklinen setup to settle into each night. I can speak from experience.

(00:25:48):
We've got Brooklinen sheets on our bed at home and we love them. It's actually kind of funny: we have the Luxe Sateen sheets, and then we have a few other sets of sheets, and we have the Brooklinen set, and we kind of have this carousel between them. And when the Brooklinen comes into the carousel, it's always my favorite moment; I'm so looking forward to getting into the bed with the Luxe Sateen sheets that we have from Brooklinen. They're amazing. You don't want any monsters hiding under your bed, of course, except maybe your dog, and the top of your bed deserves the best sheets. Since 2014, Brooklinen has offered premium, well-made linens that are tried and tested, so you only have to worry about what's on your bed, not what's under your bed. Sweaty bad dreams after watching a scary Halloween movie? You might experience that, right?

(00:26:39):
Luckily, Brooklinen offers a whole fleet of sheet options, flannel included, to accommodate all sleepers. So if you're a cool sleeper, a hot sleeper, or anything in between, they've got you covered. With limited-edition colors dropping regularly, Brooklinen keeps you well rested and fresh, so you can use those old sheets for your ghost costumes. Maybe those sheets that I'm using when I'm not using Brooklinen, we'll turn those into ghost costumes and then upgrade to Brooklinen's seasonal picks for linens, as well as top-of-bed, bath, and more. With more than 100,000 five-star reviews, Brooklinen's internet-famous sheets have won multiple awards from industry experts and are made with long-staple cotton for longevity and softness. It's really no trick at all: Brooklinen's best-selling linens are sure to curb those seasonal scaries this fall. Visit in-store or online. Just go to brooklinen.com. That's B-R-O-O-K-L-I-N-E-N dot com.

(00:27:41):
And when you're there, make sure and use that code TNW. You'll get $20 off your order of $100 or more, plus you also get free shipping. B-R-O-O-K-L-I-N-E-N dot com. Brooklinen.com, use promo code TNW for $20 off plus free shipping. And we thank Brooklinen for their support of Tech News Weekly. Alright, so Google has their next round of Pixel devices. I have received them from Google for review, and I'm going to kind of focus my time this week on the Pixel 8 Pro, because it's certainly the device that I would call my next device, and I kind of knew that going in, and we can talk a little bit about what makes this special. And then a little bit later, we're going to have Mishaal Rahman join me to talk a little bit about some of these features in more detail. But this is the Pixel 8 Pro, and, I should have brought the 7 Pro, why did I not think to bring the 7 Pro? If you have the 7 Pro, you'll probably notice that the standard design is still there, right? This is really Google's signature Pixel design. I think at first it was a strange-looking color bar, but I think my eyes have definitely grown really accustomed to it. I actually really like it. I think it's a striking and very unique and original kind of design.

(00:29:11):
They've got a nice little matte finish on the back. I'd say the corners around the edges and everything are maybe a little more rounded than what we saw last year. What is not rounded this time around is the display. The display is flat, and I'm a big fan of that aspect, versus the rounded displays on the sides. And I think we have here, Burke, what is this? This is the 7a, the 6, or the 6a? Oh, okay. Well, the 6a is going to have the flat display as well. So actually you can get kind of a comparison between the 6a and the 8 Pro, but I'm happy to say that they're bringing the flat display back to the premium device. I just like it a lot better than having that kind

Mikah Sargent (00:29:50):
Of waterfall display off the

Jason Howell (00:29:52):
Side. It has, you may remember, the temperature sensor. That's what that sensor

Mikah Sargent (00:29:58):
Is right

Jason Howell (00:29:59):
There, which is one of those features where I don't know why I'm ever going to use this.

Mikah Sargent (00:30:07):
It's like an ambient

Jason Howell (00:30:09):
Temperature sensor. Yeah, so they say it's not cleared for humans yet. They're working on that, although you can... so you could put it in front of my,

Mikah Sargent (00:30:18):
Oh, that kind of sensor.

Jason Howell (00:30:19):
87.2. Yes. And I'm not pushing it on my skin, right? I'm just kind of putting it in front. And I don't even know if that's accurate. Am I 86.7? Hopefully I'm a little bit warmer than that, but maybe, I don't

Mikah Sargent (00:30:33):
Know. On the outside, you can't quite tell

Jason Howell (00:30:34):
On the inside. I'm probably a little warmer than that; on the outside, a little cooler. Anyways, this really does feel, again, like one of those features. It's like in the Pixel 4 series, there was the Soli radar,

Mikah Sargent (00:30:46):
And

Jason Howell (00:30:46):
It was like, well that's neat, but why?

Mikah Sargent (00:30:49):
Yeah,

Jason Howell (00:30:50):
And I get the same feeling off the temperature sensor. To be honest, I guess it's neat that it's there, but it's probably going to be one of those features that I'm just not going to use. What was it they showed? You could check the temperature of your coffee. And I think where everyone kind of landed with this is, when you think about when this was probably being iterated and worked on, it was probably during COVID. Temperature was a really big deal back then, and if this phone had existed back then, it probably would've been a big selling point. Like, oh, you can check your temperature before you leave the house, or something like that. That

Mikah Sargent (00:31:24):
Is... yeah, that's still kind of nice, because during the pandemic we purchased so many different types of thermometers, body thermometers of some sort, and it's hard to find one that works well, is accurate, and is cost-effective. So yeah, the idea that it's built in is kind of nice. I could also imagine, if you are running a Windows machine and you want to measure the temperature of it after

Jason Howell (00:31:54):
It's been running. Oh yeah, right. There could be these things really heating up

Mikah Sargent (00:31:58):
Where, yeah, what are we looking at here?

Jason Howell (00:31:59):
Doing thermal checks or something like that on your machine? Yeah, I mean you can come up with ways that you could use it, but again, it just comes down to a solution

Mikah Sargent (00:32:09):
That's kind of without a problem,

Jason Howell (00:32:10):
Right? It's just interesting that that would be a feature that would be included on a mainstream device. You know what I mean?

Mikah Sargent (00:32:17):
Instead of specialty. But with Soli, we saw that technology end up in Google's other products as a means of interaction. So maybe they're once again testing something that down the line will go into more devices. Because I could imagine having all of your Google Nest devices with sensors built in, so that when you're checking in in the morning, it's going, Ooh, you kind of look like you might have a fever, and then you kind of check in with yourself and you're like, oh, I shouldn't go into the office

Jason Howell (00:32:46):
Today. Oh wait a minute. Oh wow, that would be crazy. If your technology starts telling you without you asking, you need to stay home from work today,

Mikah Sargent (00:32:54):
How are you feeling?

Jason Howell (00:32:56):
Who knows? Maybe the phone will do that somewhere in the future. That's really interesting. This has a 6.7-inch OLED, what they call an Actua display. And so this is Google's own kind of display marketing slash technology. It ramps from one to 120 hertz refresh depending on screen content. We've seen that before, but what it really refers to is brightness: 1,600 nits HDR brightness, 2,400 nits peak brightness. If you're outside with this thing, I've tested it a lot outside, no problem seeing the content of this

Mikah Sargent (00:33:37):
Screen. 2,400 nits.

Jason Howell (00:33:39):
It is super bright, and not just in the blinding sense. The HDR content, this phone is really built around Ultra HDR, and that is one of the topics that I'm going to talk about later with Mishaal. He's really going to break that apart. But you really end up seeing it in certain ways when you're interacting with the device, certain ways where suddenly those brights aren't necessarily maxing or peaking. There's still definition and detail there. Same with the darks. And so it's really nice. You get a really wonderful display experience, nice rich colors, all that kind of stuff. I think the real big thing here though, with the eight and the eight pro especially, is the chip inside. This is Google's Tensor G3 chip. This device, by the way, has 12 gigs of RAM on the eight pro, which is really all about AI performance on the device.

(00:34:34):
Yes, it's a fast performing processor. It's probably not going to beat Samsung's latest phone as far as processing everyday tasks, but the AI capabilities are pretty stunning. Let's see here. I'm going to go in and create a doc here. Give me one second. If I go in here and just fire off the voice transcription and just start talking about how cool it is. I mean this is all happening on device. It's happening in real time and it's really just kind of keeping up with me as I'm talking and you can read everything as I'm talking and I'm just talking just to give it more things to transcribe. It's really impressive how fast it is. That was very fast. And when you read over it, I'd say it's like 90% accurate. It's so close to there that it's very usable. That's just one thing. There's a lot of camera features.

(00:35:30):
There's a, so let's see here. So if I go into this photo of my dog, Bronson at the dog park, I'll try and not get reflections, edit. This will bring up the edit mode. There's this little button down here which is essentially kind of like an auto edit mode. And I can go in there and I can circle, you used to be able to do this, but now things are a lot quicker. I can actually move him if I'm not mistaken. Yeah, I can go ahead and move him over there. And then when I do this, it will process, I think this is happening on device. I can't remember if this particular thing is happening in the cloud or on device, but it does take some time, which gives me some suspicion that it might be happening in the cloud and it's going to give me three different options of this image. Okay. I am just trying to, this is

Mikah Sargent (00:36:16):
Very ai, huh?

Jason Howell (00:36:17):
Yeah, it's definitely yes. But I did this earlier and you couldn't even really tell looking in the grass that it had moved. It must be in the cloud because this is taking a little bit of extra time. Nonetheless, I've done this feature on other shots and it did generative. Okay, so now he's moved. I didn't capture his shadow. So his shadow's behind. That's a little bit of a difference, but nonetheless, you can see he was here.

Mikah Sargent (00:36:41):
Yeah,

Jason Howell (00:36:42):
That's

Mikah Sargent (00:36:43):
Very good.

Jason Howell (00:36:44):
Things are looking pretty good now. Bronson's over there, it's just a shadow that's

Mikah Sargent (00:36:47):
Left behind. That's so funny. It would take someone a long time to do this manually. If I were to pull it up in Photoshop and do all of the work to make that happen, it would take me a long time. Absolutely. I love that these tools are becoming more available to folks. That's just so nice.

Jason Howell (00:37:04):
And then there's another one which I have to play around with a little bit more, Best Take. So if you and I were in a shot and we took a number of shots together and none of them are quite perfect, we're not quite smiling at the same time, it will know that there's multiple shots lined up, and for every face I can tap the face and it'll say, here are three face options for you. And so I can say, I want the smile for this person and the smile for this person, and it'll take those versions.
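Conceptually, the selection step Jason describes can be sketched in a few lines. This is only a toy illustration of the idea, with made-up frames and expression scores; Google's actual Best Take pipeline (face detection, alignment, blending) is far more involved and is not shown here.

```python
# Toy sketch of a "Best Take"-style selection step: for each face in a group
# shot, pick the frame from a burst where that person's face best matches the
# expression you want. All names and scores here are invented for illustration.

# burst[frame_index][person] = score for how well that person's face matches
# the desired expression (say, "smiling"), on a 0..1 scale.
burst = [
    {"Jason": 0.9, "Mikah": 0.2},   # frame 0: Jason smiling, Mikah blinking
    {"Jason": 0.4, "Mikah": 0.95},  # frame 1: Mikah smiling, Jason mid-word
    {"Jason": 0.7, "Mikah": 0.6},   # frame 2: both just okay
]

def best_take(burst):
    """Return, for each person, the index of the frame with their best face."""
    people = burst[0].keys()
    return {
        person: max(range(len(burst)), key=lambda i: burst[i][person])
        for person in people
    }

choices = best_take(burst)
print(choices)  # {'Jason': 0, 'Mikah': 1}
```

The final image would then be composited from the chosen face crops, which is where the "imperfections on the edges" Jason notices would come from.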

Mikah Sargent (00:37:31):
Do you want to try it? I can come stand over there.

Jason Howell (00:37:33):
I think maybe we have to now. Okay, so I'm going to take a, sorry, audio listeners. Okay. Alright, so now let's

Mikah Sargent (00:37:48):
See if this works

Jason Howell (00:37:49):
Time. I see, I want to see what happens here. So let's see here. So we go to edit, and it's doing a little bit of on-device stuff. Tools, Best Take. Okay, so now it's finding similar shots to improve the photo, and hopefully this happens

Mikah Sargent (00:38:07):
Faster.

Jason Howell (00:38:07):
Okay, so we'll go ahead and zoom in here Micah. We've got that. We've got that. Wow, got that. Okay, let's go serious then.

Mikah Sargent (00:38:19):
Yes.

Jason Howell (00:38:21):
Okay, now

Mikah Sargent (00:38:23):
That is something

Jason Howell (00:38:24):
To your ear. Wild. There we go. Or we can go Smiley. And there we go.

Mikah Sargent (00:38:31):
That is

Jason Howell (00:38:32):
Wild. And I'm looking on the edges to see. And on the other one it moved. It is zero. It must have been how

Mikah Sargent (00:38:40):
My head was

Jason Howell (00:38:40):
Moved. There's some imperfections. You can kind of see a little bit of a blue thing on the ear.

Mikah Sargent (00:38:45):
Yeah, but the

Jason Howell (00:38:46):
Fact that you can even do

Mikah Sargent (00:38:47):
That is

Jason Howell (00:38:47):
Really,

Mikah Sargent (00:38:49):
Yes, really crazy. Of course, if you pixel peep, as Ant was talking about earlier,

Jason Howell (00:38:54):
You'll be able to find the imperfections for

Mikah Sargent (00:38:56):
People just posting quickly to their Instagram or whatever. That is a really cool

Jason Howell (00:39:01):
Feature for

Mikah Sargent (00:39:01):
Sure. I love that idea. And I think just the impact that it has on giving people more ability to get the photo that they want. Because Jason Snell was talking about this a little bit on MacBreak Weekly, the difference between what Apple does and what the iPhone does and what Google does, and how, if we think of a photograph as a captured moment in time, what we were doing, whether it was the smiling or the frowning, that was all the same moment. And so does it really matter if the photo that we took is exact? No. What we were trying to capture was that moment in time. And so we want the photo that looks great and that is capturing the feeling. So if that's serious, then we can switch it to serious, and if it's happy, then we switch it to happy, and that's fine. And even if we look back on the photograph of Jeff Jarvis that's over there, that is just as fake in the sense that we don't know the context of what's going on outside of the frame, what may have happened in that moment, how long it took to take that photo, because maybe he sneezed right beforehand and so they had to do it again.

Jason Howell (00:40:19):
But all we're left with is the perfect moment is

Mikah Sargent (00:40:21):
That perfect moment. And so if you want to make that perfect moment happen now or in post

Jason Howell (00:40:29):
Is the photograph, this is the photo of course

Mikah Sargent (00:40:32):
Of Jeff Jarvis that we're talking about, then who cares? I think that what matters is that you get the photo that you're after.

Jason Howell (00:40:37):
So

Mikah Sargent (00:40:37):
I love that.

Jason Howell (00:40:38):
That's interesting. I like that way of looking at it because I've felt a little creeped out by this because what actually is a picture

Mikah Sargent (00:40:45):
Anymore

Jason Howell (00:40:47):
When we can manipulate it? Having said that, we've been able to manipulate photos for a very long time, long, long time. We've just assumed that anything taken on our phone and posted to the internet from our phone is probably representative of the actual photo. And now what we're seeing is, oh wait a minute, processing power capabilities, tools, they're all at a point to where we can do the things that we used to need a desktop to do.

Mikah Sargent (00:41:11):
Think about all those photos of the Leaning Tower of Pisa where people are holding it up. They're not actually holding it up folks.

Jason Howell (00:41:16):
Yeah, right. That's true. You

Mikah Sargent (00:41:17):
Know what I mean? No photograph is real

Jason Howell (00:41:21):
Because

Mikah Sargent (00:41:22):
It's always a captured moment. So yeah, I think if we can get past that and go look at the moment, the memory that we, that's what the photograph is in the first place, it's our ability to re-access that moment. And if we want to remember it as the two serious people or the two smiley people, whatever the situation is, I think that's great.

Jason Howell (00:41:42):
Yeah, interesting. And so the Tensor G3 is enabling a lot here, and I haven't even touched on, there's a bunch of features that are coming down the line. They're going to be doing some video improvements to improve the darkness quality, which will require some sort of cloud interplay in order to do some of this. So it's not all entirely on the device. But the G3, I think what really makes it special is it's a chip that is designed to work with Google's AI and kind of the brainy things that they're coming up with on bringing AI into your device. So it's cool. Just real quick here, we've already gone pretty long on this. The camera itself, updated camera hardware, I'd say pretty meaningful updates as well. 50-megapixel main, which is similar to last year's, a higher 48-megapixel ultrawide, which I think is great.

(00:42:38):
The ultrawide sometimes got a little grainy and smeary on some of the previous models. Now things are a little bit sharper. 48-megapixel telephoto. I just love the telephoto and continue to love the telephoto on the eight Pro. You also have some pro controls that appear in the camera setup as well. So if you want, you can get in there and you can change some more of the settings, kind of go a little bit deeper depending on how you have it set up, things like ISO, and really get in there and tweak it. Same for video: HDR10 and 4K 30 frames per second. You can record at 60 frames per second, but if you want that HDR quality, which lemme tell you, you actually want it, we're going to talk about that in a second, you want to be at 4K 30 frames per second.

(00:43:24):
And I think it's a long time coming, because the Pixel's video recording hasn't always been the best, but I feel like they've made some real improvements here. And I'm excited to see, because they're going to continue to push out some of these new features that are really going to take video recorded on the Pixel eight and the eight Pro to the next level. So I'm really looking forward to that. I guess I've touched on these things. Oh, battery life. Excellent, by the way. I've had 30%, 35% at the end of the day, every single day. I think one of the things that I really want to point out, though, is it has seven years of updates. Now, to my knowledge, this is the first Android phone that's promising seven years of updates, security updates. And for the Pixel devices, they do regular feature drops. Google's promising seven years. It used to be five. And I think that brings it closer to kind of what you experience with Apple and iOS devices.

(00:44:22):
Apple has done an amazing job of, one, making their hardware last that long, and two, dedicating themselves to keeping it updated and up to date for that long. And Google's doing that here. Obviously seven years is a long time. We'll see how this device is doing in seven years. But I really give them a ton of praise for making that commitment, because again, I think that moves the industry forward, absolutely, for other Android manufacturers. So having said that, I think this is a really special Pixel phone. I've loved it so far, and I know I'm not alone. A lot of other people who have played around with this are like, okay, this feels like Google once again, but even more than in the past, taking itself seriously when it comes to hardware. I really hope they don't give up on it. I hate that I have that little suspicion in myself at this point where I don't trust Google as much as I used to, but they're doing really wonderful things with this phone, and at a really important time where, with AI and the potential of generative AI and all these advancements around AI, they're just in a really great place to capitalize on that on their phone.

(00:45:29):
And I think this is a really great example

Mikah Sargent (00:45:31):
Of

Jason Howell (00:45:31):
That. So Pixel eight Pro, I'm a super big fan. I'm going to be spending time with the Pixel eight and then also the Pixel Watch, the new Pixel Watch as well, the Watch 2. So you can stay tuned in future episodes of TNW for those reviews. But that's where I'm at right now, and we are not done talking about the Pixel eight Pro, even though we do have to take a break. Mishaal Rahman will be with us after the break to talk a little bit about HDR on this device and what it means to be Ultra HDR.

Mikah Sargent (00:46:00):
Awesome. Yeah, we'll hear about that in a moment. But first, this episode of

Jason Howell (00:46:02):
Tech News

Mikah Sargent (00:46:02):
Weekly is brought to you by our friends at IT Pro TV, now known as ACI Learning. Our listeners, you out there, know the name IT Pro TV as one of our trusted sponsors for the last decade. As part of ACI Learning, IT Pro has elevated their highly entertaining, bingeable, short-format content with more than 7,200 hours to choose from, and new episodes are added daily. ACI Learning's personal account managers will be with you every step of the way. You can fortify your expertise with access to self-paced IT training videos, interactive practice labs, and certification practice tests. One user shared this review: excellent resource, not just for theory, but labs incorporated within the subscription. It's fantastic. Highly recommend the resource and top-class instructors. Don't miss ACI Learning's Practice Labs, where you can test and experiment before deploying new apps or updates without compromising your live system. MSPs

Jason Howell (00:47:05):
Love it.

Mikah Sargent (00:47:06):
You can retake practice IT certification tests so you're confident when you sit for the actual exam. ACI Learning brings you IT practice exam questions from Microsoft, CompTIA, EC-Council, PMI, and many more. You can access every vendor and skill you need to advance your IT career in one place. ACI Learning is the only official video training for CompTIA. Or you can check out their Microsoft IT, Cisco training, Linux training, Apple training, security, cloud, and more. Learn IT, pass your certs, and get your dream job. Or if you're ready to bring your group along, head over to our special link and fill out the form for your team. TWiT listeners receive at least 20% off an IT Pro enterprise solution and can reach up to 65% for volume discounts, depending on the number of seats you need. Learn more about ACI Learning's premium training options across audit, IT, and cybersecurity readiness at go.acilearning.com/twit. For individuals, use code TWIT30 for 30% off a standard or premium individual IT Pro membership. That's go.acilearning.com/twit. And of course, we thank ACI Learning for sponsoring this week's episode of Tech News Weekly. All righty, Jason Howell, back to you.

Jason Howell (00:48:22):
All right, so I read an article by my good friend Mishaal Rahman on Android Police about Ultra HDR. And not to make this show super pixely, but I thought, hey, let's talk with Mishaal about this, because he really broke down a feature that, in my time using the eight Pro, I've been really impressed with. This HDR implementation is really

Mikah Sargent (00:48:45):
Something to see.

Jason Howell (00:48:46):
So welcome, Mishaal. Or maybe done. Oh, there we

Mishaal Rahman (00:48:51):
Go. There we go. Thanks for inviting me, Jason.

Jason Howell (00:48:55):
Yeah, it's great to see you, and sorry for a little bit of a late start here. It's great to have you here, Mishaal, of course, writing for Android Police, the Android Faithful podcast that you're doing with Ron Richards and Huyen Tue Dao and everything. You are a busy, busy guy. So thank you for hopping on with me today

Mikah Sargent (00:49:12):
To

Jason Howell (00:49:12):
Talk

Mikah Sargent (00:49:13):
All about

Jason Howell (00:49:13):
This.

Mishaal Rahman (00:49:14):
No problem.

Jason Howell (00:49:15):
So I guess at the start of this, you wrote about this before the reviews had come out, and Ultra HDR isn't necessarily a new thing, but it feels new in my use of the Pixel eight Pro. I really see, I can show you an example, and I have no idea how this comes out on the stream, but if you show my phone, John, I can show you this picture of my dog. Or I don't know if you still have that. Yeah, so you're going to see it as super bright on our studio cameras. If I go up here, it's going to kind of ramp things down and you might actually see it kind of increase. There you go. You see it kind of pop into place. On this stream that looks pretty underwhelming. It's like, oh, it's just blown out. But when I look at it in person, the amount of detail that I see in the whites

Mikah Sargent (00:50:07):
And everything,

Jason Howell (00:50:07):
That's very rich. It's very rich, it's very kind of defined. But this is

Mikah Sargent (00:50:12):
What is

Jason Howell (00:50:13):
The new thing that's happened here when it comes to Ultra HDR? Have we not seen this in any of the previous Pixels?

Mishaal Rahman (00:50:21):
So not in the previous Pixels, no. But something similar has existed on the Apple side of the world for a few years now. It's just that on the Apple side, it was also limited to their camera app and their gallery app. But now on the Google side of things, they're trying to integrate support for it into the Chrome browser, and also we have some third-party apps supporting it, such as Adobe Lightroom. But basically what's happening, and why, you did mention you heard about it before, because Google talked about it at Google I/O, but this is one of those things that you just have to see in person to really get. You need to have a device that supports it, that has an HDR display, to actually be able to enjoy what you're actually getting with Ultra HDR.

(00:51:02):
And as you saw, you can kind of see it brighten on stream, but you don't get the same experience unless you're actually holding the phone in person or you're looking at it on your monitor. So what's really happening with Ultra HDR is that it's a new format that's building on top of JPEG. JPEG, I'm sure everyone's familiar, has seen that file format before. It's the ubiquitous image format. Every device, operating system, and app supports showing JPEG files. So what Ultra HDR does is, it's a JPEG file, but it also has some extra HDR metadata embedded in it, so that if you take that JPEG file and you display it on any old app, device, PC, game console, whatever, it'll display like normal, because it's a JPEG file and every device supports JPEG. But if you take that same JPEG file and you show it on a device that's capable of supporting Ultra HDR, such as the Pixel seven or later on Android 14, with a compatible app like the Google Photos app, as Jason's shown right now, then that app will be able to take that base standard dynamic range image, it'll be able to add that HDR metadata on top of it, and it'll be able to create this beautifully vibrant, crisp new HDR version of the image.

(00:52:18):
And this is something that will change how we take and view images. Because a lot of times right now, if you're familiar with HDR, I'm sure you're thinking, oh, haven't we already had HDR in photography before? What about that HDR button in the Google camera app that used to exist, right? What's the difference between that HDR and this HDR? So the HDR that you're familiar with in photography is basically you're taking multiple different images at different exposure levels and then merging them together to create an image that captures a broader exposure range. Versus this HDR is basically expanding the available color that you're capturing and being able to show the contrast between the brightest parts of the image and the darkest parts of the image. You're not actually just emulating it through capturing multiple photos.

(00:53:11):
This is preserving the original image from the image sensor and showing it on displays that are able to actually show that image in its original exposure range. So previously, we had cameras that are capable of capturing a much greater dynamic range of colors and brightness levels and things like that, but you would always have to squish that to be able to show it on displays that were only standard dynamic range capable. But now, with Android 14 and this Ultra HDR format, you're able to take an image that's capable of being shown on the same SDR displays, but also expand that range of brightness information, so that you can show it as the image was originally intended to be shown. And the benefit of that is, say you have a person wearing a blue shirt and you're out in a park and the sun is shining really brightly. Right now, you take a photo, and what happens is that you want the person in the foreground to be exposed properly. You want to see the person, your friend, sitting on the park bench or something. But you also want to be able to see the sunny, bright sky. What a lot of cameras do right now is they kind of try to make the exposure level of the background and the foreground really match, which doesn't actually make sense, because the sun is going to be way brighter than what's on the ground. So right now it kind of squishes the two and makes it feel overprocessed and saturated.

Jason Howell (00:54:40):
But

Mishaal Rahman (00:54:40):
With the Ultra HDR, you're able to get both the exposure of the foreground subject looking good, but you're also able to really show off the brightness of anything that's actually highlighted, like it's supposed to be.
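The backward compatibility Mishaal describes comes from the fact that the HDR "gain map" travels inside ordinary JPEG metadata segments, which legacy viewers simply skip. As a minimal sketch of that idea, the snippet below walks a JPEG's APP1 segments looking for the gain-map XMP namespace. The namespace URI is the published Adobe/Google gain-map one, but treat this parser's simplifications (and the hand-built fake file) as illustrative only, not a full Ultra HDR reader.

```python
# Sketch: why an Ultra HDR file still opens everywhere. The gain map lives in
# JPEG metadata (APP1/XMP) that unaware apps ignore, so they just render the
# base SDR image. This toy scanner only checks whether the gain-map namespace
# appears in any APP1 segment before the image data starts.

GAIN_MAP_NS = b"http://ns.adobe.com/hdr-gain-map/1.0/"

def has_gain_map(jpeg_bytes: bytes) -> bool:
    """Return True if any APP1 (0xFFE1) segment mentions the gain-map namespace."""
    if not jpeg_bytes.startswith(b"\xff\xd8"):  # SOI marker: not a JPEG
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break  # malformed stream
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # SOS: entropy-coded image data follows
            break
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        payload = jpeg_bytes[i + 4:i + 2 + length]
        if marker == 0xE1 and GAIN_MAP_NS in payload:
            return True
        i += 2 + length
    return False

# Hand-built two-segment "JPEG" for demonstration: SOI + one APP1 carrying
# an XMP blob that mentions the namespace, then EOI.
xmp = b"http://ns.adobe.com/xap/1.0/\x00<x:xmpmeta>" + GAIN_MAP_NS + b"</x:xmpmeta>"
fake = b"\xff\xd8" + b"\xff\xe1" + (len(xmp) + 2).to_bytes(2, "big") + xmp + b"\xff\xd9"

print(has_gain_map(fake))                 # True
print(has_gain_map(b"\xff\xd8\xff\xd9"))  # False: plain JPEG, no gain map
```

An app that finds the gain map can reconstruct the HDR rendition; one that doesn't just shows the base JPEG, which is exactly the "display like normal" behavior described above.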

Jason Howell (00:54:53):
Yeah, this was one photo that really, again, I feel bad for anyone watching the live stream. You're not seeing what I see in real time. These things end up feeling kind of squished and a little bit, not distorted, but clipped on the live stream, because the cameras aren't seeing what my eyes are seeing. But when I took this photo, this is of my two dogs, Bronson and Sugar, they're in the car ready to go to the dog park, very excited, as you can see. Bronson's in direct sunlight, Sugar's in the shadows. She's in an unlit car on the inside. And when I took this photo and looked at it afterwards, I was like, that's exactly how my eyes saw the image. And so often when I take photos of something like this, Sugar would be kind of hidden in the darkness and the detail would be lost, the brightness of Bronson would be very muted, and everything like that. But when you look at it on the proper display, it's not that it just gets brighter, it's that everything gets more defined and it really just looks more natural. It's really impressive. Now, iPhone has had this for a while. Mikah, have you gone through the same wow moment that I've apparently gone through with the Pixel? I think I did.

Mikah Sargent (00:56:10):
Yeah. Originally when the iPhone first started adding this feature.

Jason Howell (00:56:16):
What's

Mikah Sargent (00:56:16):
Interesting though is that

(00:56:19):
There are a number of people who don't like this feature on the iPhone, so much so that if you start to type in HDR iPhone or some version of that, people are wanting to turn it off, how to turn it off. And I think that part of that is they're used to using capturing applications that do a little bit of filtering when you take the photo, and the iPhone's just giving you the stark, cold truth of what you're snapping. And so they see the photo and they're like, oh, that's not, but it's more true to the eye, right? Like what you're saying there. And so it was fascinating learning that people wanted to know how to turn that off. Viewing it, it's still capturing it, but you can basically say, don't display these photos in that full HDR range whenever I'm looking at them, because I can't handle the truth.

Jason Howell (00:57:09):
Yeah,

Mikah Sargent (00:57:09):
Right.

Jason Howell (00:57:09):
My eyes don't want to see that. Can you do that on this? I didn't actually look to see if there was a way to turn off the HDR preview, the Ultra HDR preview.

Mishaal Rahman (00:57:19):
So right now, by default, the Pixel eight automatically captures photos in Ultra HDR. There's no way to turn it off. But I mean, if you edit the photo in the Google Photos app, some of the edits that you can do to it, like using the Dynamic preset, will disable the Ultra HDR metadata in it. So there are some ways to get rid of it. But I mean, one of the reasons that people didn't really like HDR, and one that people commonly cite as a reason for not liking HDR, is that it requires turning the brightness way up, and people complain, this is too bright, it's like burning my eyes. So one of the reasons why support for Ultra HDR is so limited right now on the Android side of things is because Google is basically saying that your device has to support a feature called SDR dimming, and that feature basically makes it so you don't have to pump your screen brightness way up just to be able to show HDR images in their full glory.

(00:58:14):
What this does is basically it allows for the standard dynamic range parts of the screen, so you have a status bar, you have your navigation bar, you have other parts of the webpage in a Chrome browser, for example, those will stay dimmer than the HDR image that's being shown on screen. So you can kind of have the best of both worlds. You can show the HDR image as it was meant to be, but you can have the rest of the background not just jacked up to a thousand nits and burning your eyes at night. So that's one reason why this feature is currently only available on the Pixel seven series and later on Android 14, because those devices support SDR dimming.
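The adaptation Mishaal describes hinges on "HDR headroom": how much brighter the display can go than its SDR white level. The renderer applies the gain map only in proportion to that headroom, so an SDR screen shows the plain base image and a bright screen gets the full boost. Here's a toy, single-channel version of that math; the log-domain weighting follows the general shape of the published gain-map approach, but the specific numbers and simplifications are made up for illustration.

```python
import math

# Toy illustration of gain-map rendering: the same base SDR pixel is boosted
# by the gain map, weighted by how much HDR headroom the display has.
# Single-channel and unclamped for simplicity; real renderers work per-pixel
# on a full gain-map image with min/max boost metadata.

def render(sdr_linear: float, gain: float, display_headroom: float,
           max_gain: float = 4.0) -> float:
    """Scale one linear SDR pixel value by the gain map, weighted by headroom.

    display_headroom: display peak / SDR white (1.0 = pure SDR display).
    """
    # Weight in [0, 1]: 0 on an SDR screen, 1 once the display can show the
    # image's full intended boost (capped here at max_gain).
    w = min(1.0, math.log2(display_headroom) / math.log2(max_gain))
    w = max(0.0, w)
    return sdr_linear * gain ** w

pixel, gain = 0.8, 4.0            # a bright SDR pixel the gain map wants 4x brighter
print(render(pixel, gain, 1.0))   # SDR display: 0.8, unchanged base image
print(render(pixel, gain, 4.0))   # display with 4x headroom: 3.2, full boost
print(render(pixel, gain, 2.0))   # 2x headroom: 1.6, halfway in log space
```

SDR dimming is what makes the headroom exist in the first place: the UI chrome is held at the dimmer SDR white while only the HDR image gets to use the range above it.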

Jason Howell (00:58:53):
Wow. Well, it's really something to see if people have the opportunity. And I mean, it's not just obviously iPhones and this particular series of Android phones that can view this. Certain monitors are capable of viewing it. But I had a fun experience with Ant Pruitt in the office the other day, where I was showing him the eight Pro and then started showing him some photos, and we started going through the before and after on the HDR, and he was just getting super excited. He was like, oh man, I can't wait to get my hands on this. It really is the sort of thing that, check it out for yourself. I could see, though, where some people have the complaints around it being too bright. It is bright, it's just it more accurately reflects the brightness of life, I think, is my understanding, at least from my experience anyways. And I dunno, I feel like it's worth it, but maybe I guess I can understand why people don't. Do you have the eight Pro yet? Are you kind of playing around with these features on your own? I know you were waiting for one.

Mishaal Rahman (00:59:56):
No, I don't have a Pixel eight yet, but I was able to play around with it just by sideloading the Google camera app onto my Pixel six Pro. And speaking of which, if you want to actually try out and see what Ultra HDR is like, I mean, I made a GitHub repository that's linked in the Android Police article that Jason mentioned, and that actually has a bunch of sample images there. And if you have a PC that has an HDR display, like an M2 MacBook Pro, and you go on that webpage, you should be able to actually see the effect. You should be able to see these HDR images in their full glory. Not all browsers and not a lot of users will have an HDR display, because it's still just not a very common thing. But a lot of modern high-end Windows and Mac PCs come equipped with HDR displays, so you should be able to see what these images look like.

Jason Howell (01:00:45):
To a certain degree, it reminds me of 3D or something. I mean, you've got to have the 3D glasses in

Mishaal Rahman (01:00:51):
Order,

Jason Howell (01:00:52):
But at the same time, it also reminds me of back many years ago, when, I can't remember what the Android phone was, where the first HD display that I saw on Android, and I remember opening up the Twitter app and seeing the profile images. They were very small, but the profile images kind of looked like little windows that I was looking through. My eyes weren't accustomed to the high-definition kind of view on a screen yet. Now it's just the way it is, but back then it was a big step up from standard resolution, standard definition. I kind of get the same feeling with this HDR, where it's like, oh man, now I want everything to look like that. So it's going to be hard to take my foot off the gas when it comes to Ultra HDR content. But anyways, Mishaal, thank you so much for breaking this down for us and hopefully getting some more people excited about the technology. What do you want to leave people with as far as where they can find you and support you in the work you're doing?

Mishaal Rahman (01:01:55):
You can follow me on many different platforms if you want to learn about what's new on Android. My handle is usually @MishaalRahman, except on Threads, where someone else had already taken that username. I'm surprised, I don't know, there's not many people in the world with my name. But I'm also on patreon.com/MishaalRahman if you want to support the work that I do, because it takes a lot of work to dig through Android. It's a never-ending struggle, a never-ending battle against whatever the Android team's releasing, and

Jason Howell (01:02:24):
Keeping up with them to try and be one step ahead. Well, you do a great job with it and I can't thank you enough for coming on today. It's great to hang out with you again. Thank you,

Mikah Sargent (01:02:32):
Mishaal.

Jason Howell (01:02:34):
Yeah,

Mishaal Rahman (01:02:35):
Thanks, Jason. Thanks, Mikah.

Mikah Sargent (01:02:36):
Thank you. All right, we'll

Jason Howell (01:02:37):
Talk to you soon.

Mikah Sargent (01:02:38):
All

Jason Howell (01:02:38):
Right, and we've got even

Mikah Sargent (01:02:40):
More coming up shortly.

Advertisement (01:02:42):
Why don't textbooks and research papers come with audio versions? If you're a student, you probably spend many hours each week reading on a tiny screen. Wouldn't it be amazing if you could listen to it like an audiobook? Now you can. Listening.io is an app that turns any PDF, whether it's a research paper or even a full book, into audio. It can pronounce difficult technical words, read math equations, and even knows to skip the citations and footnotes. You can jump straight to the section you want to listen to because it creates a table of contents. It even has a one-click note-taking button, which puts the last 10 seconds you were listening to into a notepad, so you don't have to type notes when you hear something interesting. If you go to listening.io/podcast, you'll be able to get your first two weeks for free. Go to listening.io/podcast.

Mikah Sargent (01:03:33):
Yes, that's right. It's time for my story of the week. It starts with a Reuters report. The, I guess, news publication has an exclusive about OpenAI's upcoming November conference. OpenAI, of course, is the company behind ChatGPT and is in a sort of partnership and financial agreement with Microsoft, where ChatGPT is kind of the base of a lot of the generative AI stuff that Microsoft is doing. On November 6th,

(01:04:16):
The company is having a developer conference in San Francisco, and it is rumored, according to this Reuters report, to be rolling out some new features. One thing it's looking at is the addition of memory storage for its developer tools. Essentially, what that does is give developers more of an opportunity to keep more of the work that they're doing within the OpenAI API and OpenAI stack. Right now, if a developer is trying to tie ChatGPT or some other generative model that OpenAI is working on into their apps or services, they kind of have to split things up among different services. So by offering a memory storage option within the OpenAI tool, it will give developers, again, a little more ease of use in creating the technology that they want to create. The company is also expected to be announcing updates and the inclusion of its vision technology

(01:05:41):
So that developers can create or augment their applications with tools that can analyze an image and describe it. One very simple example of this is, say, my calendar application. I could hit the plus button in my calendar application, which is how I add an event, and I could then snap a photo of a flyer that I see when I'm out walking around, and it would get sent off to OpenAI for processing. It would be able to see the text within that flyer, parse it out, and understand when the event is and what it is, and then pass that information to my calendar and pop it in as a calendar event. That's just one simple example. Another could be: you're looking at a magazine and you see a recipe that you want, and so in my recipe app, suddenly I can hit the plus sign, snap a photo of that recipe, it gets uploaded and processed, and then it gets turned into a recipe card within the app.
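
To make the flyer-to-calendar idea concrete, here is a rough developer-side sketch of what such a request could look like. This is an illustration of the pattern, not OpenAI's documented API surface: the model name is a placeholder, and `build_vision_request` is a hypothetical helper, assuming a chat-style message format where the image rides along with the text prompt as a base64 data URL.

```python
import base64


def build_vision_request(image_bytes: bytes, instruction: str,
                         model: str = "vision-capable-model") -> dict:
    """Bundle a snapped photo and an instruction into one chat-style
    request body. The model name is a placeholder; a real app would use
    whichever vision-capable model the provider currently documents."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": instruction},
                # The image travels alongside the text as a data URL.
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{encoded}"}},
            ],
        }],
    }


# The flyer example from above: ask for the fields a calendar entry needs.
request = build_vision_request(
    b"\xff\xd8\xff",  # stands in for the JPEG bytes from the camera
    "Extract the event name, date, time, and location from this flyer.",
)
```

The app would then post this body to the API and map the structured reply onto a new calendar event.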

(01:06:50):
Oh, that sounds nice. Pretty cool stuff. And those are just use cases that I'm coming up with off the top of my head. The Reuters piece does not go into detail about what exactly, but when you think about the vision capabilities of analyzing and describing, that is exactly how a developer would be able to augment their own tools to add that. The whole point of this is to make OpenAI the better tool for developers to use to add this functionality to their apps. Developers are already making use of this in small ways with the OpenAI API, with the ChatGPT API specifically, but they want to make it more robust. They want to make it more powerful and get more developers using it, such that that is where they go first. The developers start to think: that's the tool I want to use, that's the API I want to use, those are the plugins I want to use. Early on, ChatGPT started to add some AI functionality by way of plugins, and we've talked about plugins on Windows Weekly before with Microsoft's own Copilot. This started over on the OpenAI side, where there was, excuse me, a Browse with Bing plugin, where if you used ChatGPT to send a question, for example, it would search the internet depending on the question that you asked, read some articles, process those articles, and then give you a response based on the search that it did, but also the knowledge that it had. But then there were other plugins that got added. An example would be a plugin called WebPilot, which is just a plugin where you can paste in a link to a page, and the plugin will help ChatGPT process and parse the link that you send it. It can pull all of the text from the article or page that you're sending and feed that into ChatGPT, so that ChatGPT knows what you're talking about, and then you can ask it questions about that article. There are other tools now that have been added directly to ChatGPT that allow you to upload a PDF, for example, and pull information from it. Wikipedia now has a plugin. There is a plugin called Prompt Perfect, which basically uses its own set of generative AI rules to kind of make the prompt that you're sending a better prompt. There's so much, it's

Jason Howell (01:09:42):
Getting so meta, isn't

Mikah Sargent (01:09:43):
It? Yeah, exactly. That is truly meta.
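
A link-reading plugin like the one described is doing something conceptually simple: fetch the page, boil it down to plain text, and splice that text into the prompt so the model has the article in front of it. Here is a minimal sketch of that middle step using only the Python standard library; it's an illustration of the pattern, not WebPilot's actual implementation.

```python
from html.parser import HTMLParser


class _TextExtractor(HTMLParser):
    """Collect visible text from a page, skipping script and style blocks."""

    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())


def page_to_prompt(html: str, question: str, limit: int = 4000) -> str:
    """Reduce fetched HTML to plain text and combine it with the user's
    question, truncating so the whole thing fits in a context window."""
    parser = _TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks)[:limit]
    return (
        "Using only the article below, answer the question.\n\n"
        f"Article: {text}\n\nQuestion: {question}"
    )
```

The plugin's job ends there: the assembled prompt goes to the model as an ordinary chat message, which is why you can then ask follow-up questions about the article.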

(01:09:46):
So these are some of the things that the company is rumored to be announcing, because it, again, is doing its best to make it possible for the company to, frankly, make more money. The company, of course, first started out as a not-for-profit, or rather a nonprofit organization, that was looking at AI and making sure that the AI that was released, the generative AI that got out into the world, was for the betterment of humanity and wasn't going to harm humanity. But over time it has morphed into a company that does seek to make a profit and is aimed at creating a tool that can do so. Now, one of the features that OpenAI is currently experimenting with is a new means of communicating with ChatGPT, and it's all voice-based. I have a ChatGPT subscription, so I have access to this tool, and it lets you kind of have a conversation with ChatGPT.

(01:11:10):
So I want to show this. When I tried to capture the audio of ChatGPT in the past, it didn't pass it through to AirPlay, so I might have to hold it up to the microphone here, but we can still show on screen kind of what it looks like. If you have access to this, or if you gain access to this, you'll see a little headphone icon in the top right corner of your ChatGPT app. Right now it is exclusive to the app. Those of you with a subscription who don't have access to this yet, you might try force quitting the app; I did that once and then it popped up. But if I tap on this, and I'm going to actually undo so that I can explain it, if I tap on the headphones, it will connect to ChatGPT and then await my prompt. So I'm going to tap it and then I'm going to give it some ideas. So here we go. Could you please tell me a short story, about two paragraphs, about two guys, Jason Howell and Mikah Sargent, who host a tech podcast but are suddenly interrupted by an asteroid made of small orange octopi?

Jason Howell (01:12:33):
Wasn't sure where we were going there at the end. Neither

Mikah Sargent (01:12:35):
Was I. And so then it processes

Jason Howell (01:12:39):
A cloud bubble thing.

Mikah Sargent (01:12:40):
Yeah, there's a little kind of a chat bubble as it starts to think about what I said, and this is actually one of the longer times that it's taken. I imagine it's because it's noon Pacific, so it's probably got a lot going on on the server side, but we'll

Jason Howell (01:12:54):
See if its here. My cloud stuff was laggy earlier today too, so it could be something

Mikah Sargent (01:13:01):
Happening

Jason Howell (01:13:02):
In here in the studio.

Mikah Sargent (01:13:03):
I might try it one more time. That worked for me before when it wasn't working the first time. So I'm going to tap to cancel and we'll try it again, and we'll see if it works this time. Could you tell me a short story, about two paragraphs, about two guys, Jason Howell and Mikah Sargent, who host a tech podcast, but while they're recording the show, they're suddenly interrupted by an asteroid crashing into their podcast space? The asteroid is a small collection of purple ostriches. And we'll see if it goes this time.

Jason Howell (01:13:45):
There we go. Something different this time.

Mikah Sargent (01:13:48):
So here we go.

AI (01:13:49):
Jason Howell and Mikah Sargent, the dynamic duo behind the popular tech podcast, were in the midst of recording their latest episode when chaos descended from the cosmos. As they bantered about the latest gadgets and software updates, a deafening crash shook their recording studio. Startled, they exchanged wide-eyed glances, their microphones still capturing the moment. To their bewilderment, an asteroid had smashed through the roof, and what tumbled out were not orange octopi but a small collection of purple ostriches. The giant birds squawked and flapped their wings, turning the podcast studio into a surreal aviary. Jason and Mikah, stunned by the unexpected interruption, could only laugh at the absurdity of the situation. It was a podcast episode they'd never forget, blending tech talk with the bizarre world of purple ostriches from outer space.

Mikah Sargent (01:14:43):
Ine.

Jason Howell (01:14:44):
Wow,

Mikah Sargent (01:14:45):
That's a solid voice too. And this is the thing. So what they're doing here is, you may have noticed, there were little breaths.

Jason Howell (01:14:51):
Yeah, no, that was really well done. This

Mikah Sargent (01:14:52):
Machine does not breathe, but they put little breaths in there, and they've been improving upon it. I'll tell you, when I first used it... Wow.

(01:14:59):
Yeah, that's

Jason Howell (01:14:59):
Really impressive.

Mikah Sargent (01:15:00):
It's very impressive. It shows me the potential future of this technology. The one thing early on, and they may be improving upon it even more, making adjustments: when I first used it, it had too many ums and uhs that they were putting in to try to make it seem more natural, and the pauses around the ums and uhs kind of added to it. So it would say, once upon a time, there was a small chicken and it was a really

Jason Howell (01:15:27):
Uncertain narrator.

Mikah Sargent (01:15:28):
Yes, exactly. Exactly. But what I like about this is I can suddenly imagine a world where I'm listening to an audiobook and then I trigger my virtual assistant, and instead of Siri, I'm using something like this, and it feels more natural. What I wasn't able to show you here, because it would require too much back and forth, is that as soon as it's done with its response, it's waiting for the next thing, and so you can ask it about whatever you need next. So you really can have a back and forth, and it'll just wait on you too. It's not as if you have to think, oh gosh, I better get this prompt in or it's going to stop listening to what I'm saying. No, it'll wait, and then you can ask it more. And yeah, I've been pretty impressed. There are multiple voices as well, so you don't necessarily have to go with that voice. It's got some settings in there that make it pretty cool. But that is already something that OpenAI is testing, not just something it's rumored to be testing. This stuff is pretty doggone nifty, I think.
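
That back-and-forth behavior, where the assistant waits for you and remembers earlier turns, comes down to keeping a running message history. Here is a stripped-down sketch of that loop; the speech-to-text and text-to-speech stages aren't public, so plain strings and a pluggable `ask_model` callback stand in for them, and `echo_model` is a toy stand-in rather than a real model call.

```python
def new_conversation(system_prompt: str = "You are a friendly voice assistant.") -> list:
    """Start a fresh history; the system prompt sets the assistant's tone."""
    return [{"role": "system", "content": system_prompt}]


def take_turn(history: list, user_text: str, ask_model) -> str:
    """One voice turn: append the transcribed speech, get a reply, remember it.

    Because the whole history is sent each time, the model can notice
    things like the octopi being swapped for ostriches in an earlier prompt.
    """
    history.append({"role": "user", "content": user_text})
    reply = ask_model(history)
    history.append({"role": "assistant", "content": reply})
    return reply


def echo_model(history: list) -> str:
    """Toy stand-in model: reports how many user turns it has seen."""
    turns = sum(1 for m in history if m["role"] == "user")
    return f"That was your message number {turns}."


chat = new_conversation()
take_turn(chat, "Tell me a story about orange octopi.", echo_model)
take_turn(chat, "Make them purple ostriches instead.", echo_model)
```

Swapping `echo_model` for a real chat-completions call, with a transcription step in front and a speech step behind, gives you the shape of the voice feature.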

Jason Howell (01:16:36):
I was super impressed by the quality of the story was his

Mikah Sargent (01:16:41):
Standard AI-generated story fare. It did reference that I had changed the story to be the purple ostriches instead of the original story that had orange

Jason Howell (01:16:52):
Octopi

Mikah Sargent (01:16:54):
Or octos. So that's kind of fun that it knew

(01:16:59):
That I had changed it, and it will keep that kind of transcript going. The other day, I asked it an actual legitimate question. I couldn't remember sort of the food safety rule, and I checked afterward. I just wanted to see how ChatGPT handles this versus if I go and do the search. I had made some spaghetti with ground turkey, so I said, hey, I have some spaghetti in the fridge that I made with ground turkey and peas and some broccoli rice, and it's been in the fridge for a couple of days. Is it still safe to eat? And it was saying ground meat that's refrigerated typically lasts about four days, but then it went on to say something along the lines of, but open it, smell it, any strange smells, any weird textures, any strange looks, trust your judgment on that. And the better option, of course, if you're worried about it, is just to not eat it. And that's where it kind of gave me that moment of, yeah, I'm jogging along or I'm doing something else, and suddenly I just have a question about something, and to have that very natural response is pretty cool. And it reminded me of that movie. Is it Her? Is that what it was called?

Jason Howell (01:18:15):
Yeah,

Mikah Sargent (01:18:15):
She or Her,

Jason Howell (01:18:16):
Which strangely I have not seen, but I believe that that is the movie that you're talking about with Joaquin Phoenix, right?

Mikah Sargent (01:18:22):
Yeah. Yeah. And Scarlett Johansson, I believe, is who plays her. Yeah, I really need to move that up on my list, apparently. Really? Yeah, you definitely should check it out. But just the natural interaction that took place, I thought, was pretty cool.

Jason Howell (01:18:37):
And that's where this stuff is heading. Yep, for sure.

Mikah Sargent (01:18:39):
Faster than I thought it was going to be.

Jason Howell (01:18:41):
Yeah. Yeah. I mean we've been talking to our devices for a while, but there's always been this perceptible kind of

Mikah Sargent (01:18:47):
Disconnect, like,

Jason Howell (01:18:48):
Okay, well that's a machine that's doing its best to try and blah, blah. The gap is closing. I don't know. It's closing.

Mikah Sargent (01:18:56):
Yeah. It's not to

Jason Howell (01:18:57):
The point where it's like, okay, this is the future, but I do know the quality of that voice and the cadence and everything.

Mikah Sargent (01:19:06):
I want that voice that was

Jason Howell (01:19:07):
Super convincing. I could envision a person there talking in that voice,

Mikah Sargent (01:19:11):
You know

Jason Howell (01:19:11):
What I mean? I could see a person actually like, oh, we got another one of those weird stories for you to read, Bob. All right: so this guy, Mikah Sargent.

Mikah Sargent (01:19:22):
Yeah, exactly. And this to me is like now those books that I've wanted to read that I haven't gotten around to reading, I could have it read to me as I'm doing other things. There's just so much potential

Jason Howell (01:19:36):
With all of

Mikah Sargent (01:19:36):
This stuff. It's kind of magnificent.

Jason Howell (01:19:38):
Yeah, it really is. No question about it. Interesting stuff. Thank you for that. And thank you. That's right, you and you, for watching and listening to this episode of Tech News Weekly. We do the show every Thursday. Just go to twit.tv/tnw and subscribe. We say it each and every week because we hope that you'll do it if you haven't already. Subscribe, it's super important: twit.tv/tnw.

Mikah Sargent (01:20:03):
Yes. And if you'd like to get all of our shows ad free, yes, ad free, well, check out Club TWiT at twit.tv/clubtwit. For $7 a month or $84 a year, you out there can join the club. When you do, you'll get every single TWiT show with no ads, just the content, because you are the one that's in effect sponsoring the show. You are the supporter of the show, so you get all of the shows ad free. You'll also gain access to the special TWiT+ bonus feed that has extra content you won't find anywhere else: behind the scenes, before the show, after the show. Special Club TWiT events get published there as well. And that's a really fun thing, where when you join the club, you're going to get access to this feed that has a bunch of stuff that you've never seen before.

(01:20:42):
So joining now, joining later, whenever you join, there's a lot of stuff there. And I know coming up, I'm very excited about the escape room we're going to be doing as part of the Club TWiT experience. We also have the Club TWiT Discord, which you can be a part of when you join Club TWiT. It's a fun place to go to chat with fellow Club TWiT members and also many of us here at TWiT. Again, twit.tv/clubtwit, $7 a month, $84 a year. You'll also gain access to our exclusive Club TWiT shows: The Untitled Linux Show, which is a show all about Linux; Hands-On Windows, which is Paul Thurrott's short-format show that covers Windows tips and tricks; Hands-On Mac, which is my short-format show that covers Apple tips and tricks; Home Theater Geeks from Scott Wilkinson, which is all about the home theater: interviews, reviews, questions, answers, everything in between, every home theater geekery you could imagine; and Jason Howell's AI Inside, which covers stuff like what we were just showing here

(01:21:50):
On the show. And I'm really looking forward to today's episode, a new tool for AI, so looking forward to hearing about that. So head to twit.tv/clubtwit to check it out. Again, $7 a month, $84 a year. Join the club, we would love that. If you'd like to follow me online or check out my work, I'm @mikahsargent on many a social media network. Or you can head to Chihuahua.Coffee, that's C H I H U A H U A dot coffee, where I've got links to the places I'm most active online. Check out Hands-On Mac later today if you are a Club TWiT member. Check out Ask the Tech Guys with Leo Laporte and yours truly, where we take your questions live on air and do our best to answer them. It's all tech questions all the time. We have a lot of fun on that show. And you can also check out iOS Today, the show I co-host with Rosemary Orchard, where we talk iOS, tvOS, watchOS, HomePod OS, iPadOS. It's all the

Jason Howell (01:22:48):
Operating

Mikah Sargent (01:22:48):
Systems. All the

Jason Howell (01:22:49):
OSes.

Mikah Sargent (01:22:49):
Yes, all of the Apple OSes. There we go. Jason Howell, what about you?

Jason Howell (01:22:53):
Well, I talk about all the AIs, as you mentioned, on AI Inside, still here in the club. We're going to be doing that in about half an hour, so Club TWiT, twit.tv/clubtwit. You can just find me on any social media platform, just do a search for Jason Howell. You'll probably find me with different varied names, but just make sure that it is me. I have had some people out there trying to imitate me, but look for that Jason Howell and you'll find me. Big thanks to John Slanina here in studio for doing the show this week behind the board. Thanks to John Ashley as well behind the scenes. He is rocking it right now; he's doing a lot, a lot, a lot. So we appreciate you, and we appreciate you at home or on the commute or wherever you happen to be. Thank you for watching and listening. We'll see you next time on Tech News Weekly. Bye-bye, buddy. Bye.

Leo Laporte (01:23:44):
Listeners of this program get an ad-free version if they're members of Club TWiT. $7 a month gives you ad-free versions of all of our shows, plus membership in the Club TWiT Discord, a great clubhouse for TWiT listeners. And finally, the TWiT+ feed with shows like Stacey's Book Club, The Untitled Linux Show, the Giz Fizz, and more. Go to twit.tv/clubtwit, and thanks for your support.

 
