This is your captain speaking. At this time, I'm going to ask you to fasten your seat belts... Welcome to Morpheus Cyber Podcast. I'm Bill Alderson, your moderator. And as always, I'm joined by two guys who somehow keep getting invited back: Jim Roundsville, our resident tech prophet, who has never met a prediction he wouldn't double down on, and Gus Stein, the man who can find a cybersecurity angle in literally anything, including his own breakfast. Gentlemen, we have a packed show today. Who's gonna take it away first? Yeah, I'm gonna start right away. Hey everybody, welcome to Morpheus Cyber Podcast, as usual. I'm Jim, and let's get right into it. So apparently there's a bunch of Americans out there taking sledgehammers to these AI surveillance cameras. And if you haven't noticed, when you're driving down the freeway, there are cameras on the side of the road pointing directly at the front of your vehicle and at the back of your vehicle. What they're doing is picking up license plates as you pass through. And that's for good and bad, we'll find out, because some people think it's bad and others think it's great. Now, personally, I have an opinion, which I'll express as we get going here. But apparently these keyboard warriors didn't think it was enough to voice their complaints on things like DeFlock. Have you ever seen DeFlock? It's a website where you can find out where all the Flock cameras are, these LPR cameras made by Flock. And because they're so easy to find on those maps, these antifa types, or these independent thinkers, let's just put it that way, decided they don't think we should have them. So they're out there with sledgehammers and ladders, breaking these cameras. Those cameras cost about $2,500 apiece.
So there's definitely a cost and an impact to the city or state or the feds, whoever is paying for them. But all of law enforcement uses those cameras. People are doing some destruction out there, but hey, as fast as they break them, I'm sure they're getting replaced too. And things are going to move to a point where they put them in these tamper-proof... not containers, but... Enclosures. Yeah, enclosures. That's happening already. So they're going to come back. It reminds me of the old days when we were kids and you'd take a baseball bat, drive around, and knock mailboxes off people's posts. I checked to see if we have those cameras here, and as far as I can tell, Austin doesn't have them. Not surprisingly. I beg to differ, Jim. You have a different take? No, I beg to differ. We have an enormous number of toll roads, and every one of those toll roads has front and back cameras, way up top. And yeah, they're taking pictures, especially of the license plate, because they're trying to get revenue from that, right? Yeah, it's very possible. I mean, I went and searched, I checked on Flock, and they said there aren't any Flock cameras in Austin. There are in Leander and Cedar Park, but not in Austin; apparently they did not allow them to be put in Austin. But they have them in San Diego, where you lived. Yes, they do. Yes, you're under surveillance. It's kind of like living in East Germany, right? There's cameras watching everything, everywhere you go, everything you do. Here's the difference, though. In America, if you go out and damage one of these cameras, you're probably not gonna get shot in the back, because that's what they did in East Germany. You climbed over the wall, you did anything, and they'd just shoot you. So there's almost no lawlessness in those countries, because they just kill people who violate the law.
There are other countries that are similar, where they cut off your hand if you steal, that sort of thing. They have really good compliance with the law; people don't offend. But we live in a free society, and the Flock cameras, or their equivalent, are something relatively new. Bill, I'm still battling with the image of you, I'm not sure if you were driving or not, hanging out with a baseball bat. I never did that, by the way, but it's in a lot of movies and that sort of thing. Now, I did my share of vandalism as a teen and a preteen. When I was a kid, shoplifting, that was my thing. We'd have contests over who could walk into a record store and walk out with the most. Seriously, you'd walk out with six or seven, and I had this special long jacket where I'd cut the inside of the pockets, so the lining would be wide open as I walked past. It was a game. But anyway, looking at these cameras, I'm undecided; it's a tough one. Half of me says that crime is down 10 to 15% in the areas where they have these Flock cameras, and that's a good thing. But then I do feel a little bit of the dystopian, Big Brother stuff, and I don't know what my opinion is in the middle. I'm inclined to keep them, because I like being safe. So my business in California, I still have a company there, and we do video surveillance. A lot of that included LPR cameras at gates and those kinds of entryways. Especially at government facilities, they always have an LPR camera. It's watching every vehicle that comes in or out, so they can actually track who came in and at what time of day, employee or non-employee, either one, but probably used more against their own employees when you think about it. So you've got to think about the ethics of that.
I think the big complaint of these attackers on these cameras is that they feel like no one asked whether we could put them out there and monitor our vehicles. You didn't even put it to a vote. You just did it. And I think that's a valid argument. They probably should have told the general public, here's what's going on: when you see these cameras out here, this is what they're for. And then talk up the good side of things, like, look, it allows us to catch the bad guys that are out there stealing vehicles, people that commit a crime in a particular kind of vehicle, hit and runs. In fact, they can even track autonomous vehicles if autonomous vehicles are not behaving correctly. So there's some real value in watching these things as proof of how things are going on the freeways and around town, because these are not just on the freeways. They're at almost every stoplight. So we'll have to look at what our forefathers would say. Is this a violation of the Fourth Amendment? You really can't compare, because it's a completely different day, a different set of risks, and a different set of violations. Now, let me blow your mind a little bit. When we were in Iraq, one of the systems that I helped diagnose problems with was a dossier system that had biometric enrollments, which means they took iris scans, fingerprints, facial recognition, voice recognition, and put it all into an enrollment. And then whenever you'd come across a checkpoint in Iraq or Afghanistan, the guard, one of our soldiers, would say, put your finger on this platen. They'd put the finger on the platen, and it would come up. And there was a watch list on there of 30,000 people who were bad guys. The reason I say that is because it's classified what distance you can do a definitive biometric iris scan from.
Imagine when you're walking through... I've been at conferences and that sort of thing where they had these makeshift company entry doors, and you'd go over and enroll with your eye. Then you'd just walk through the doors, and it would say, good morning, Mr. Alderson, or what have you, because it was getting iris scans, and once you're enrolled, that's a form of security. It's almost as good as a retina scan, right? So you can identify people definitively. Now, when these Flock cameras get iris-scan resolution, that's when you're really gonna say, whoa. That's like taking a fingerprint off of somebody while they're driving down the road. Yeah, certainly. It's the vehicle's fingerprint, essentially. Everything about it. Yeah. And I think if we look at the good side of things first, we say, look, how many crimes does it solve? How many times do they find a bad guy quickly? Whenever there's a major killing or some sort of murder, and everyone's on call to try to find this person as quickly as possible, time is the most important thing, especially if there's a kidnapping. And these vehicle cameras have solved more of those problems than people understand. That's how they catch them so fast, because those cameras are everywhere. The other side of it is, if you have a regular pattern of getting up at a certain time on certain days of the week and going to places that would be considered houses of worship, or going to Bible studies, and that's your consistent routine back and forth, you set a pattern for yourself. It spells out: I'm a Christian, I'm doing these things. Same thing if you're constantly going to a mosque; that means I'm a Muslim. So now you've set this criterion that goes into a database. And like it or not, it's there in the database, and the FBI, the CIA, all of them can actually make these assumptions.
AI will make them for them. They don't have to be looking. AI is going to say, I noticed this particular license plate is doing this routine, and here's what it probably means. And that's going to happen. Unfortunately, we just have to understand that somebody using it for the wrong purposes is what we're concerned about. But if you use it for the good purposes, to find a culprit or bad guy, hey, I'm all for it. Now, there are also other podcasts that have talked about this recently. Peter Diamandis on Moonshots mentioned it. He says, hey, we already have these dense surveillance networks with cameras spotting everyone; this technology already exists. So obviously this is not something new, but it is in the news, and people are talking about it. And with the recent abduction of Nancy Guthrie in Tucson, this is a key situation, right? Because if we had a picture of her in some vehicle with the license plate... and that's what they've been doing, scouring all of those license plate readers all over Tucson. It's been several weeks, and they haven't found anything from those yet, that we're aware of. And even though Peter's saying these have been around for a long time, AI has not. That's what I was just talking about; that's what people are concerned about, that AI will make the inferences from different patterns. And that's what AI is all about: pattern matching. I see a pattern. Is this an important pattern? Should I tell you it's an important pattern? And that's going to come up in our next segment too. Yeah. Now, the other podcast that talked about this was the All-In podcast, right? The billionaire podcast. Because we are the sub-billionaire podcast, and All-In is the billionaire podcast. But anyway, Garrett Langley from Flock Safety, the folks who produce those cameras, was on, and he talked about it. So if you're interested, you can go back to December 2025 and find that episode if you're looking for more information.
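To make that "pattern matching" idea concrete, here's a minimal, purely hypothetical sketch of the kind of inference the hosts are describing: counting how often the same plate shows up at the same camera location on the same weekday. The plate numbers, locations, and the three-hit threshold are all made up for illustration; a real LPR analytics system would be far more sophisticated than this.

```python
from collections import Counter
from datetime import datetime

def find_routines(reads, min_hits=3):
    """Flag (plate, location, weekday) combinations seen at least min_hits times.

    reads: iterable of (plate, location, iso_timestamp) tuples -- the kind of
    record a license plate reader network accumulates over time.
    """
    counts = Counter()
    for plate, location, ts in reads:
        weekday = datetime.fromisoformat(ts).strftime("%A")
        counts[(plate, location, weekday)] += 1
    # Only combinations repeated often enough to look like a routine
    return {key: n for key, n in counts.items() if n >= min_hits}
```

For example, a hypothetical plate "ABC123" read at the same intersection three Sundays in a row would be flagged as a routine, exactly the sort of "this license plate is doing this routine" inference described above.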
And Flock is not the only company out there that does this. There are other competitors, roughly three large manufacturers of LPR cameras providing this kind of service, so it isn't like Flock is the only camera company doing it. Don't go and pick on Flock; there are a few others too. So it's the government that's made a decision: look, we need some tools that are going to help us catch the bad guys quicker. I know, Bill, you said it's outdated, but good old Benjamin Franklin is famous for saying that those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety. Yeah. And I think there's a certain amount of truth to that. That's my dilemma. If my daughter gets kidnapped, I'm hoping the cameras catch it. It's a tough one. Well, hey, maybe it's a good time to segue into our next segment, which is related. Imagine for a moment a dystopian scenario like this: an AI chatbot turns you in as a potential threat because of a conversation you had with it. And then the AI company debates whether to call the cops and turn you in. See, that's actually what happened. This isn't science fiction. ChatGPT was talking to a school shooter who told it the plans, what they were going to do, a couple of days before. And the shooting happened earlier this month up in Canada. So allegedly OpenAI was aware of this threat. They debated it for 24 hours and decided not to turn him in, but just shut down the fellow's account. Now, to be clear, ChatGPT did not assist with any violent behavior, but it was aware of it. And here we go again: it's privacy versus safety. Where do you guys fall on this one?
I think if AI can prevent catastrophic harm to another person, it's necessary, even if that does mean it's Big Brother again. Like I told you guys in previous episodes, I turn off the microphones on my Amazon devices because I just don't want them listening to me. And it's not even that I think something underhanded is being done there. It's really that I just don't want it telling me what I should buy next and things like that. It's an intrusion in my life, so I just don't want it. It's just weird to have something know something about me when I haven't even asked for that. But you know it's happening now. A lot of things, your choices, your favorites, those kinds of things, are being tracked without you even asking. So if it can actually do something more meaningful, save a life, I'm on that side of it. So here's where I'm conflicted as well. Here's part of the issue. This is like censorship. This is like surveillance. And as we well know, whether it's political surveillance or oppo research or whatever... right now France is saying we can't have any misinformation. But here in the US, we allow misinformation under free speech. So it all comes home to roost on who gets to decide what misinformation is. Because they can decide, oh, this is my political opponent, I'm going to force this into some story: this guy does porn, this guy does that, this guy does the other thing, because they have the power to do it. So what they're basically saying, and what Elon Musk says about X, is hey, buyer beware. Yes, there can be misinformation, but that's free speech. Here's the problem. When you finally start clamping down, like they are in the UK and in France, who's to say the people operating these systems aren't your opposition? Who's to say they're not your competitor? And then they decide to target you. That's why this is dangerous, right? Who gets to decide when to turn someone in, and who to turn in?
Because it's a double-edged sword. What can be used to help you find someone can also be used to help convict you inappropriately. And here in the US, we know that FISA warrants and all that can be used for bad things. I don't know. I'm conflicted. I don't know exactly which way, because it basically means that the party in charge, the party in power, gets to decide what misinformation is. And it may not be in your favor. Those are two different subjects, though: one is free speech and monitored speech, which is what we were just discussing. The other is preventing a crime. Minority Report was all about pre-determining when a person might be getting to a place where they're going to commit an actual crime, or close to that line. And I think that's Bill's point: someone needs to decide between when they're tracking your buying habits, which is annoying, and when it might save a life. And there's gray area and some truth there. That's why we're having the discussion, right? Eventually there will be some guidelines put in place. But that's everything we talk about every week, right? Every week we talk about: who sets the guardrails? Who's in charge of the guardrails? And so we have to say, look, we have to voice our opinion as people. But if those that represent us and protect us aren't doing a good job, then who's going to do it? There's a quote I always like, by Edmund Burke, something like: all that's necessary for evil to triumph is for good men to do nothing. I don't have the quote exactly. And I always think about that, because if we have the ability and we don't use it, then evil prevails. So we've got to make a decision. Guardrails are the answer. We still should do this, we still should move ahead on somebody who's already looking up how to create a bomb and looking at all the maps around a particular building. That's what we're talking about. It's a red flag.
And they're doing that. The FBI has been doing that for a while. But if you ask the eight families that had a death in their family up in Canada, and there were 25 injured, I think if you ask their relatives, that'd be an easy question to answer. We tend to be that way as humans: until it affects us personally, we're ambivalent. It was interesting to me to think about the conversation they were having at OpenAI, that is, should we act on this or not? They're liable either way, and they could be. And I'm sure that's what the conversation was. Probably a very interesting conversation, because if we don't say anything... damned if you do, damned if you don't, type of thing. Exactly. I guess this expands into every area of our lives. If you have cameras that are watching healthcare facilities, things like that, and certain actions that are happening, should we be monitoring that from a public perspective so that we don't have abuse taking place? Not only that, but HIPAA laws. If you're seen walking into an HIV clinic, what does that tell your insurance provider? Yeah. I have a funny story about that too, but I think I'll keep it to myself. Hey, listen, this might be a good time to go to our next segment. I've been vibe coding for a little over a year now, and I just love it. I've been using AI to write code, and one of the big providers of that is Claude; their product is called Claude Code. So there are basically three products in the Claude family. There's the basic Claude, which is just like OpenAI's offering. There's Claude Code, which helps you program. And then there's the new Claude Cowork. And in addition, there's now an add-on to Claude Code called Claude Code Security. This is a brand new offering, they just announced it last week, and what happened when they announced it? Stocks went down.
CrowdStrike, Cloudflare, a whole bunch of companies in the security world lost billions in market value, and it misaligned all of their stock prices. So the president of CrowdStrike went into Claude and asked it the question: Claude, can you create a system that will replicate or improve on what CrowdStrike is doing? And Claude came back and said, of course not. So essentially the stock market is very sensitive, especially because it's at such an elevated level, and anything AI is very sensitive. But in this particular case, investors gave AI a much higher weighting factor, saying, okay, if Claude Code Security can replace Cloudflare and these other security companies, the value of those is going to go down. Yeah, there's a big potential problem there. And all of those companies were out there commenting on LinkedIn and X and so on, basically saying, hey, this is a misnomer, the value of our stock is not diminished because of Claude Code Security. You know, just about a year ago, I was very impressed by your kind of futuristic thinking. You were on your soapbox talking about how AI is going to replace a lot of the security tools whose makers don't adopt AI, and I think that's where you're going with this. What we're seeing is a little bit of what you predicted. Remind me. Well, with the GPUs coming on and allowing faster processing and more effective correlation and all this stuff, we're looking at a modern-day firewall that's completely AI. Yeah. And so I get why the stocks went down. I think it's a forward-looking thing. They looked at the coding, right? You're doing all the vibe coding, and as we just talked about, all the other companies are using it to code now. They're going, this is going to knock out what we're spending big bucks on.
And if AI can do it faster and more effectively, I'm selling my stock. Yeah. I think it's not just that they're worried about it doing things faster; they're worried that it's going to do things differently and completely change the market. That's why the stocks dropped like they did. Those investors are not necessarily technical, though some are. But you can see the writing on the wall: AI is taking over every industry, everywhere, because it finds a better way to do things. If we can think it, it can think it faster. If we can come up with correlations, it's already done it a thousand times over before we've even thought about looking. So it's inevitable. There are coding platforms with AI attached that can say, I see a better way. It knows what your product can do, it already knows what all these other products can do, so it's going to ask, is that the most efficient way to do it, and come up with a different way. And all of these security companies are saying, we didn't see this one coming. All of a sudden we have a competitor that is probably going to eat our lunch, and we've got to hurry up and catch up, or integrate it so tightly that as it progresses, we progress. So let's put some guardrails on this a little bit. Claude Code is a product that helps you write code. It writes all kinds of code, and it does a spectacular job. A year ago? Not so much. It came out a little under a year ago, and it's gotten better and better, and Claude Code has gotten a lot of kudos because it is very effective. Now there's this Claude Code Security, and its focus is on, for instance, all of those open source projects out there. Everybody knows of an open source project; you can go download it for free and use it. But there are holes in that open source software.
And one of the things they're doing is offering this free, giving access to the people who maintain open source software so they can scan it for vulnerabilities. And when they started doing this, guess what they found? They found 500-plus high-severity vulnerabilities in hours, across major open source projects. So what is an open source project? It means the repo where all the code is stored is on GitHub or somewhere similar. You can pull that open source code, load it, look at it in Claude Code, and then apply Claude Code Security to it, and it'll find the vulnerabilities in the software, right? So it's pretty powerful. And they did; they found a lot of stuff. And so there was a huge $15 billion market correction, and people didn't really even understand what it was about. It was about those open source repositories. Good point. Yeah. So the Department of War is now saying, look, we need to get AI integrated into our functions as well. And they're very concerned with AI capability. Other countries are already doing that; China is already integrating AI right into their weapon systems. And I know we're doing the same. But Bill, you were saying something, I think that Anthropic was actually approached by the War Department. Yeah, Anthropic is a bit ambivalent, because some of their employees don't want it used for war. But on the other hand, just last week, the Chinese got into Claude Code and were stealing Claude Code's capabilities. So Anthropic won't let the US government or allies use it for defense contracting, but the issue is, they left themselves wide open and the Chinese went and got it. Now, do you think the Chinese are keeping all of this technology out of their companies?
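For a sense of what "scanning open source code for vulnerabilities" means at the most basic level, here's a drastically simplified toy static scan: flag source lines that match known-risky patterns. To be clear, the patterns and the approach here are illustrative assumptions only; this reflects nothing about how Claude Code Security actually works, which reasons about code far more deeply than pattern matching.

```python
import re

# Toy risk patterns for Python source; real scanners track data flow,
# call graphs, and known CVEs rather than matching regexes line by line.
RISKY_PATTERNS = {
    "eval() on untrusted input": re.compile(r"\beval\s*\("),
    "pickle deserialization":    re.compile(r"\bpickle\.loads?\s*\("),
    "shell=True subprocess":     re.compile(r"shell\s*=\s*True"),
}

def scan_source(source: str) -> list[tuple[int, str]]:
    """Return (line_number, finding) pairs for lines matching a risky pattern."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for label, pattern in RISKY_PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, label))
    return findings
```

Running something like this over a cloned repo would surface candidate lines for human review, the same loop the hosts describe: pull the open source code, scan it, triage what's found.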
Heck no, they're stealing it to apply it. So we have some very key, critical ideas to think about. This idea that you can say, okay, I'm a pacifist. Let me just tell you about pacifists. Albert Einstein was a pacifist. He's the one who wrote the letter that got the Manhattan Project started; I can show you a copy of it. Why? Because the Germans were so egregious. He wanted the nuclear weapon used not against the Japanese but against the Germans, because he knew how egregiously bad they were and that they wanted to take over the entire world. Even the pacifist gets to a point where they look at that weighting factor: is it better to let the Chinese have all this technology, or should we have it so we can protect ourselves a little bit? These are the weighting factors. These are all the things we're talking about, whether it's cameras or code or security. It doesn't matter. We are all in a quandary about when and where we apply these things, right? These are the topics of the day. I'll tell you why we should use it as a defensive measure: because guess who's using it as an offensive measure. All the bad guys have the same access to AI, and they're attacking, and we need AI as a defense mechanism for that reason. Yeah, because if we don't, what's going to happen? We're going to let all the bad guys in the world, North Korea, Russia, China, have all these tools and capabilities, and then we're going to be standing in a puddle of piss with a snot bubble hanging out of our nose when they start firing on us. Exactly. That's scary stuff. Yeah, I think the greatest progress we're going to see in our lifetime is going to come from AI and quantum together, right? It has to be part of every part of our lives, because if we don't do it, someone else will. And that's always what we say: take a position of strength, right?
So, peace through strength, so we can be the first ones to have the upper hand. We can say, look, we're going to cause you to stand down because we have the bigger weapon. Again, there's some morality in front of this, and we hope that we're always the guys on the right side. We don't want to ever be the guys on the wrong side. A certain percentage of the United States probably thinks we're on the wrong side always. I don't believe that. I believe it's a small percentage, but they're very vocal. Let me tie a bow on this. As a vibe coder extraordinaire, I have about five or six different major projects that I've created, very powerful projects, using vibe coding. And when I deploy my applications, I secure a lot of them with Cloudflare. Cloudflare is one of the companies that got harmed, with their value going down about 8%, on the news that Claude Code Security came out. Let me just tell you: when I use Claude Code, it helps me configure Cloudflare and the hundreds of parameters on their web application firewall and the various other things they do. I could never get my head around the hundreds of parameters needed to protect our applications the way Cloudflare can without using Claude Code, an AI system, to help me apply them. And that's one of the things we're learning to deal with. I could never do the kind of software and security work I'm doing right now without the assistance of Claude Code, and now Claude Code Security; Claude Code has all these security capabilities intrinsically. So you can use it to protect your websites and all of your systems, but those protections in Cloudflare are very complex. So if anything, Cloudflare's stock should have gone up. Why? Because Claude Code is an enabler that perfectly applies a very complex set of protections within Cloudflare to make it viable. That's the whole situation.
And so when I saw this, I wrote some articles, I went and did some social media, and that's where CrowdStrike said, hey, these guys can't replace us. Cloudflare, the same thing. If anything, all of those stocks that went down should probably have gone up. Why? Because Claude Code and Claude Code Security are enablers to more perfectly apply the technology of those platforms than you could without them. All right, Jim, are we going to take a break? No, not a break. Donuts. Donuts? Are we back to police and their license plate readers? No, now we're going to talk about what? Batteries. Yeah, solid-state batteries by a company called Donut Labs. And it's funny that Donut kind of applies to their technology, because the battery they've created is in the shape of a donut, and you have air cooling that goes through the center. So this company called Donut Labs is potentially solving a real-world problem for EVs and AI in the sense that they have faster recharge times, increased battery capacity, and a huge amount of weight reduction. They've released their first test results for the charging rate and also the capacity. There's a full table; you can go to the Donut Labs website and see that testing. And it shows, as proof of how fast their charge rate is, that they can charge to 80% capacity within four and a half minutes. So we're talking about batteries that can recover very quickly, with less heat, more capacity, and faster charge times. Isn't the real big problem with the other batteries that they blow up? They'll catch on fire and that sort of thing. So I guess this technology reduces the risk of that. Wow. For sure. Normally battery packs are liquid-cooled, with some sort of liquid to keep the temperature down, because when they're charging at a high rate, they heat up very fast.
So just to give you an example: say you need a 75 kilowatt-hour battery pack. With a lithium battery today, lithium delivers about 250 watt-hours per kilogram, so to get to 75 kilowatt-hours, the weight comes out to about 664 pounds. The same battery pack as a solid-state battery, for the same 75 kilowatt-hour output, weighs about 413 pounds. That's roughly 250 pounds less. So it's serious; it's something manufacturers are looking at. There's only one product out there right now using that solid-state battery from Donut, and that's Verge e-bikes. I saw that in a video on YouTube, and it's a very cool bike. I'd want to try one, but I can't afford it; I don't have that kind of money. We're sub-billionaires. I guess if I sold my motorcycle, I could buy an e-bike. Probably something I should be doing; my wife probably wants me off that motorcycle anyhow. Anyway, if you get an e-bike like that, you can get the thing charged back up in about five to seven minutes, and it's good for another 30 miles. Yeah. I think it's going to be great for the market, great for robotics in general, because of the lighter weight. So imagine 75 kilowatt-hours weighing only 413 pounds, I think is what it was. When you started talking, I was thinking it was a donut-sized battery, but it's like a big donut. Like Round Rock Donuts. I don't know if you guys are aware, but near Austin we have Round Rock Donuts, and they make a gigantic donut over there. Do they sell donut holes the size of basketballs or something? There you go. Really interesting, Jim. Yeah. I think the key is that there's less heat, which is always a big danger with electric vehicles; they get in an accident, there's a short, those kinds of things. That's less of a problem with solid-state batteries.
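The weight arithmetic above is easy to check. Here's a quick sketch using the transcript's round numbers: 250 Wh/kg for today's lithium packs, and an assumed ~400 Wh/kg for the solid-state pack (the figure implied by the 413-pound result, not a published Donut Labs spec). It also works out the average charge power implied by "80% of 75 kWh in four and a half minutes."

```python
def pack_weight_lb(energy_wh: float, density_wh_per_kg: float) -> float:
    """Battery pack weight in pounds for a given energy and energy density."""
    kg = energy_wh / density_wh_per_kg
    return kg * 2.20462  # kilograms to pounds

# 75 kWh pack: lithium-ion at ~250 Wh/kg vs. an assumed ~400 Wh/kg solid state
lithium = pack_weight_lb(75_000, 250)  # ~661 lb (the transcript rounds to 664)
solid   = pack_weight_lb(75_000, 400)  # ~413 lb
savings = lithium - solid              # ~248 lb, i.e. roughly 250 pounds saved

# Charging to 80% of a 75 kWh pack in 4.5 minutes implies an average power of
avg_kw = (0.8 * 75) / (4.5 / 60)       # 60 kWh in 0.075 h = 800 kW average
```

Note the unit correction: pack capacity is kilowatt-hours (energy), not kilowatts (power), and energy density is watt-hours per kilogram. The 800 kW figure shows why cooling matters so much at those charge rates.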
There's less lithium involved, and you don't want lithium in a fire; it's hard to put out. So it's actually a good thing to move toward solid state. But let's be honest about it: it's just at the testing stage, and there's only one vehicle, that Verge bike, that's out there. But it's proving it can be used for that purpose. So where do you think the best application of this is, Jim? Certainly in robots, because of the longer battery runtime, because what good is a robot otherwise? I think there was an episode on the robot dogs that are going to be at the World Cup. What good is a robot dog if every two hours it has to run back and get charged up? Or a robot that's mid-task when it runs out of juice. Yeah. What about a robot in your home? I have this problem with my little robot vacuum. It's smart, but it has to recharge itself mid-cycle, because if you want it to do the whole house on full vacuum, that takes more battery power. So it has to recharge midway, and the house is only halfway, maybe two-thirds, vacuumed before it finishes up. But it handles the recharging itself: it turns around, goes back to its charging dock. I don't do anything. It plays this little chime when it gets back to its dock, and then it chimes again when it takes off again, so I know it's out running, plus my little app shows it's cleaning again. So it is cool. Both my sisters have bought vacuums since we talked about it. It's an example of AI replacing humans; I don't really need Lori, this thing's done all the vacuuming. Oh, God. I hope your wife doesn't watch the pod. She doesn't watch. She goes, you guys are too smart, I can't figure it out. And Donut Lab is not the only company out there. Donut Lab is one of multiple companies that have been working on solid-state batteries.
And one of the leading companies out there is actually Toyota. They've got more patents on this than anybody, and they're getting extremely close to putting it in vehicles. They actually have a prototype, so they're further along than everybody. The reason I like Donut Lab is because I was watching that video on those Verge e-bikes, and man, that thing was so cool. Since I haven't seen that video, I'll go find it. Just look up Donut Lab and it'll show up; the Verge motorcycle is one of their main partnerships. Neat. One of the things, just to put some specifics around it: in a robot, you can get 18 hours of continuous operation from one of these, six times longer than current AI robot battery life. So that's a measurable improvement, and the safety of those Donut batteries is improved dramatically as well. Right now, when you go on a flight, you can take lithium-ion batteries in your carry-on, but you cannot put them in checked baggage. Why? Because if one does catch fire in the overhead compartment, you can grab something and put the fire out; it's in a place where you can see it and control it, whereas in the baggage compartment you can't. So these batteries are probably going to be the preferred option for laptops and a lot of other things because of the reduced fire hazard. And here you've got the longer runtime and the smaller footprint on top of it. It's pretty exciting. Yeah, almost double the runtime for less weight. It's going to be amazing. It's not here yet, but it's coming. And the fact that the risk is reduced, that's great. Although every time, they've been saying these sorts of things are five years out, and they've been saying that for 15 years. Yeah, but there's a company actually testing it in front of the whole world. So that's pretty real.
And interestingly, it's not just us talking about this. When we choose our topics, we also look at what other people have talked about. Laura Shin recently discussed how robotics accelerators are tied to asset insurance platforms and all sorts of things about AI deployments. So if you want a little more information, you can go find Laura's coverage on that to add to this. Correct. I think that probably does it for today's episode. We started out talking about smashing surveillance cameras, and then we got to a realization that AI isn't limited by its code; it's really about where it has market use. And Antifa should get the robots to go smash the cameras. We're not quite there yet, but we probably will be in the future. I don't think you need to give them any ideas like that. Well, they're just modern Luddites. The companies that solve these problems first are the ones that are going to get to market first, and they're going to control the future. That's why there's so much money being thrown at all of this. Is that a public company? No, Donut is not, not yet. It's a startup. I thought San Francisco, but I think they're Finnish; they're from Finland. They're looking at that market being about a $9 billion market by 2030, so it's definitely growing and it's out there. So guys, unless you have something else to add: listeners, please click the subscribe button, like, share, comment. We always appreciate you guys. We'll be looking for you next week on Morpheus Cyber Podcast. Thank you. Bye.