00:00 – Mishaal Rahman: Google Maps will soon let you ask Gemini for coffee shop recommendations.
00:04 – C. Scott Brown: And Android will finally let you fully automate apps, just like the Rabbit R1.
00:09 – Mishaal Rahman: I’m Mishaal Rahman.
00:11 – C. Scott Brown: And I'm C. Scott Brown, and this is the Authority Insights Podcast, where we break down the latest news and leaks surrounding the Android operating system.
00:19 – Mishaal Rahman: So, I don't know about you, Scott, but I have quite a bit of trouble when it comes to scouring Google Maps for, like, a restaurant I should go try out with some friends. We just have so many dang good options here in Houston. It's just such a melting pot of different ethnic cultures and different restaurant options. Most of the time I've been relying on word of mouth. Like, I've been asking my friends, my family, extended family members, or just going to Reddit, like the Houston subreddit, for recommendations. But maybe soon in the future, I'll ask Gemini instead, through Google Maps.
00:48 – C. Scott Brown: So, you might trust AI to make something like a food recommendation for you, but would you trust AI to actually act on your behalf and buy things for you? Tech companies are now hoping that you will trust AI to perform those kinds of tasks on your behalf. But so far, all these AI agents have been either based in a browser or in some other situation like that. Google is now working on an AI agent that can control your Android apps for you, right on your phone, but we're not so sure that's a good idea.
And another change that has us scratching our heads is Google's plan to bring its new Nano Banana image creator, which everyone loves, to Google Lens and Circle to Search. So, I mean, Nano Banana's great, but why Google Lens and Circle to Search? That's kind of confusing. Those things are made for searching.
01:43 – Mishaal Rahman: Riding that gravy train a little bit too hard there, aren't you, Google?
01:46 – C. Scott Brown: Yeah. Yeah.
01:48 – Mishaal Rahman: I'm not really sure how I feel about that particular use of the integration right there in Google Lens and Circle to Search. But I'm really excited to see this new Ask Maps feature that Google's working on for Google Maps. To be clear, Ask Maps itself isn't actually new, because this is something Google rolled out late last year: the ability to ask Gemini about a certain place in Google Maps. But the way it works right now is that you have to actually highlight a place. You tap on a location, this card shows up, the usual detail card, and then you might see this "Ask Maps about this place" kind of at the very top in the overview tab. And you can ask just about anything you want to know about a place. You also have some suggestions, like if you select a bar, you can ask: do they have a full bar? Is it quiet? What's the dress code like? And so on. But this is only once you've selected a location and you've actually found the place you're already looking for and you're curious to learn more about it. The new version of Ask Maps that Google's experimenting with is different. According to our APK Insights team, or our APK teardown team, whatever you want to call our team that does the teardowns, AssembleDebug took a look through the Google Maps app and discovered that Google is working on a new interface for Ask Maps. This interface is accessed through a chip right below the search bar. You can see here in this screenshot, well, if it's not obscured by the watermark we have. Below the search bar, there's this Ask Maps chip. You tap that, and it pulls up this full-screen, almost full-screen interface that kind of resembles the main Gemini app, actually.
You have this hello indicator, you have these chips at the bottom with, like, recommended things you can ask about, a search bar with a microphone icon. And basically, it's just like a full Gemini interface, but inside the Google Maps app. And I think this is a big deal compared to, you know, what we had before, which was just location dependent. Now, this is kind of a generalized entry point to Gemini within Google Maps. You can ask it about pretty much anything you want. So, I mean, I personally think this is going to be a game changer for how we search Maps for locations, for restaurants, for coffee shops, and so on. But what do you think, Scott?
04:12 – C. Scott Brown: Yeah, no, totally agree. Game changer. In my life, if I'm traveling alone, I would definitely use something like this. But when I'm traveling with my partner, she usually does all this stuff. She spends hours trawling through Google Maps and Instagram and Reddit, figuring it out if we're traveling to a new place we've never been to before, and she'll save all these different things. And then she's kind of like my own personal Gemini, because we get to the location and I say, we should get, you know, burgers tonight or whatever, and she'll pull up her maps and figure out where she's saved the best location in the area, and then we'll go there. But being able to just talk to, you know, what seems to be just Gemini, like a Gemini overlay over Google Maps, being able to do that would definitely be great in those rare situations where we're in a place where she hasn't already done that, and we can just, you know, ask Maps and figure it out from there. I also see this being really advantageous if you're with a group of people, and I don't know about you, but that's always really difficult for me, because you're trying to accommodate all these different... you know, if you're a group of, let's say, five people: one's a vegetarian, one's a vegan, one's gluten-free, one would prefer if we went somewhere where they could get something that was, like, not super rich, maybe low calorie, one person doesn't drink, one person wants to have a beer. You know, it's like, figuring all of that out, and just being able to say to Gemini: these are the things we want, please give me three options within a, you know, 10-block radius of where I am right now. Game changer.
Like, that would be so much easier than what we have to do now, which is, you know, people throwing things back in a group chat and links being sent and, you know, reading menus, and it's just chaos. So no, this is definitely... you know, we spoke last week about implementations of AI that are genuinely useful and not just, like, we're throwing AI at this thing because why not? You know, we need AI because that's what we do now. This is one of those situations where I'm like, no, this is actually a really good idea. We're going to talk later in this episode about things that might not be such a great idea, but this one, this one I think is really good.
06:24 – Mishaal Rahman: Yeah, I mean, like, whenever I'm researching, like whenever I'm booking a hotel for an event like Google I/O, for example, these days, since I've been there a couple of times now, I've kind of settled on a couple of locations that I return to. But whenever I'm looking at a new event in an area I haven't been to, I use a lot of the filters that they give you. The hotel brand filter, because I prefer to stay with, like, Hilton brands just because of, you know, membership and stuff like that. Price range. I also like actually having multiple tabs open, because I want to make sure: oh, does this hotel have free parking? Does it not have free parking? Does it have this amenity? So, like, being able to just open this Ask Maps chip within Google Maps and say, okay, I'm looking for hotels from these brands in this price range with these amenities, would just be an absolute game changer. Like, right now, I could technically do that by opening the main Gemini website or the Gemini app. But the problem is it's not really built into Maps the way, of course, Maps itself would be. Like, it would just give me a list, a text list of, oh, here are these locations. Then I would have to manually open them up in Google Maps, because within the Gemini app, I can't visualize how far that is from the venue that I'm actually going to, right? I would have to separately open that in Google Maps to see: okay, this is this location, these are the transit options I could take, or this is how long it would take for me to ride in an Uber. But having this built into Google Maps, and potentially, like, it creating a list and displaying everything on a map, would just be a huge game changer for planning.
07:53 – C. Scott Brown: For sure, but the issue that we kind of have to keep in mind is the trustworthiness of this information. You know, like, how much of this is Gemini going to be delivering accurately? And so far, from what we've seen, it seems that there are problems. So, you know, it's not like Gemini is going to be flawless. Sometimes you're going to say, oh, I'm looking for a place that has a gluten-free menu. Gemini is going to check the Maps listing, find a menu from five years ago that had a completely gluten-free thing, and then you get to the restaurant and they say, oh, we haven't had a gluten-free menu in years. And, you know, that's where I think a lot of the issues are going to come through.
08:37 – Mishaal Rahman: Yeah, like, if you're asking Gemini questions like, oh, tell me about this city, or what's the tallest mountain, like Shiv did here in these examples for our article, it's going to handle those queries fine. But as you mentioned, if you're asking specific questions about things that require the restaurant information or the location information to be up to date, then you might not get accurate results, because, of course, it requires people to actually go in and update that information. And there are a lot of places, like smaller locations, where people don't provide up-to-date information. You might have years-old things. You might have a location that's completely closed, and Google Maps isn't even aware that it's been closed for a while, because no one's reported it as closed. Or the pricing on a menu is years out of date, and it doesn't match your budget anymore. So, yeah, I'm a little concerned about that.
09:25 – C. Scott Brown: To be fair to this feature and Gemini and AI search in general, these would be problems that you would face without Gemini. Like, you know, if you just go into the Google Maps listing, you look at the menu and you say, menu looks great, let's go, and then you get there and they're like, yeah, the menu is wildly different now. That's something that you would face anyway. But yeah, there will be other situations where maybe a human would be able to check things. They would be able to look at the menu and say, wait a minute, this menu is from 2015. Like, things are probably different now, whereas Gemini might not be able to make that distinction. So, I think it's going to be a your-mileage-may-vary thing, but the concept, I think, is awesome.
10:08 – Mishaal Rahman: So speaking of the concept, I'm kind of curious to hear your thoughts on how this might impact the way a lot of people go about searching for hidden gems in their city. You know, a lot of people just drive around without Google Maps, or they just rely on word of mouth. They talk to their friends, family, co-workers; they go on forums like Reddit, like, recommend me some hidden gem Indian spots in the city, you know, or they ask, what are the best Mexican food places, or what's this underrated Italian place, you know, and they get recommendations from real people. But now we have the ability to ask Gemini, which, of course, is trained on responses that were given by real people. So, like, people who contributed information to Google Maps, like the Local Guides, Redditors, etc. But do we think this is going to fully replace asking people for recommendations? Or do we think we're going to see a resurgence, maybe, of people who are eschewing AI and just driving around and randomly stumbling upon something that they saw with their own eyes?
11:08 – C. Scott Brown: I think universally the concept of getting your information directly from a human is going to get more and more important as the years go by. Yeah, I mean, nothing's going to change that. You know, one of my favorite dishes is pork ragu. Like, you know, hot fresh pasta, hot meat sauce on top, tons of cheese. Love it. Could eat it all day. And yeah, so I could talk to a friend or a family member or something, and they could say, oh, we went to this restaurant and we had this dish and it was exactly what you want. Like, it was the perfect pork ragu, you've got to go. That's not something that you can really get from AI, you know; that personal connection isn't going to be there unless I literally told Gemini... like, I log into Gemini and I say, here is the exact description, here's a tome on what I love about this particular dish, and then Gemini saved that and was able to deduce that through imagery and comments. But even then, I feel like it wouldn't be trustworthy. That human element of somebody being like, "I've tasted this. I know what you like. This is the restaurant you need to go to." That's never going to change, and that's going to become more and more important as time goes on.
As a guy with a YouTube channel, I'm very much anticipating that being an issue going forward as well, now that we're seeing a lot of AI slop appear on YouTube. You know, you don't really know: is the voiceover for this channel a real person, or did someone just ask ChatGPT to make something for them and then feed it out as audio that sounds human, and we just stole B-roll from, you know, Android Authority and from Marques and from all these people and tossed it into a video, and now we're a tech creator? Like, that kind of thing is going to become even more of a problem. So yeah, that human element is something that's always going to be needed. So no, this isn't going to be a replacement for that. I think this is going to be a solution for speeding up a process that could take, you know, an hour plus. You know, I've definitely had group chats with people trying to figure out where we're going to stay or where we're going to go eat, and it takes a long time when you have a big group of people. So this is just going to pare that down and make it faster, and that's fine. That, once again, is something that AI should be doing, that's going to make life easier. But yeah, replacing that human element, restaurant reviewers, the Michelin Star system: those things are fine, like, they're not going anywhere.
13:40 – Mishaal Rahman: Yeah. But I'm curious to think about what some of the potential unintended consequences of this change might be. Like, we kind of see right now, because of the rise of Google Maps, there are a lot of restaurants that have optimized themselves using SEO tactics, you know, the stuff you'd only expect websites like Android Authority and, you know, our competitors to do. Real-life restaurants are using SEO to make themselves appear higher in Google Maps search results. Like, you have restaurants that are just named "Chinese food" or "Italian food near me." You know, I think there was a viral pizza shop in New York City that was called, like, "best pizza place near you." And the only reason to name yourself that is so that you show up when people search for pizza places in New York, right? But with the advent of generative AI, you know, the fact that it's able to pull together so much information, do we think that this is going to make that kind of situation worse? Are we going to see even more fake information, or, like, AI-targeted information injected into Google Maps listings?
14:41 – C. Scott Brown: Yeah, I mean, it probably will make it worse, but, I don't know. Like, yeah, if I searched for best pizza near me, and a restaurant came up that was literally called "best pizza near me," I wouldn't go. Like, I'd be like, that's ridiculous. That's clearly a sham. That's clearly somebody who's just trying to game the system. I'm not interested.
15:03 – Mishaal Rahman: I mean, hey, if it's 3:00 a.m. and you're really craving pizza, you know, like, you're too...
15:08 – C. Scott Brown: Not in New York City. Oh my god.
15:10 – Mishaal Rahman: You're too wasted to think about it too hard.
15:12 – C. Scott Brown: You can't throw a Starbucks cup in New York City without landing on a pizza place, so I don't think that would be a problem. But yeah, the common-sense element of understanding, like, oh, wait, this restaurant has, you know, a 4.4 out of 5 star rating. That means it's got to be good. And then you notice, wait a minute, there are only three reviews. And then, you know, this other restaurant has a 4.2, but a thousand reviews. So you can figure things out by deducing, you know, using your brain. And I think that's not going to change. I think, yeah, there may be different ways of trying to game the AI system that restaurants kind of implement, but, you know, if you see a bunch of photos and it's all influencer-esque women standing in front of angel wings at the restaurant, you know that the food is going to be terrible, because it's only trying to appeal to people trying to get their Instagram on, which is okay. You want to go to a restaurant and do your Instagramming? That's great. I'm glad you have that. But I'm more interested in the place that has the actual good food, and they're not going to have angel wings on their restaurant. So you're still going to have to use your brain a little bit, and Gemini's hopefully just going to make it easier to do the, you know, the deduction that you need to do.
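[Editor's note: Scott's review-count reasoning, a 4.2 with a thousand reviews beating a 4.4 with three, is exactly what a Bayesian (prior-weighted) average captures. The sketch below is purely illustrative Python; nothing suggests Google Maps or Gemini uses this formula, and the prior mean and weight are made-up values.]

```python
def bayesian_rating(avg, count, prior_mean=3.5, prior_weight=50):
    """Pull a listing's average toward a prior mean.

    With few reviews the prior dominates (little evidence);
    with many reviews the listing's own average dominates.
    prior_mean/prior_weight are arbitrary illustrative choices.
    """
    return (avg * count + prior_mean * prior_weight) / (count + prior_weight)

few = bayesian_rating(4.4, 3)      # high average, almost no evidence -> ~3.55
many = bayesian_rating(4.2, 1000)  # slightly lower average, lots of it -> ~4.17
```

Under this weighting, the thousand-review 4.2 ranks well above the three-review 4.4, which matches the intuition Scott describes.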
16:25 – Mishaal Rahman: I mean, Scott, I think you're giving a lot of people too much credit. Like, if people are unable to discern AI slop, they might be the kind of people who fall for it and go have actual real-life slop from these restaurants, because they still exist, you know. Like, they're clearly doing well enough to keep operating.
16:47 – C. Scott Brown: I'm an optimist, Mishaal, that's all I can say. You know, I have great faith in humanity.
16:53 – Mishaal Rahman: You've got a lot of faith in people.
16:56 – Mishaal Rahman: All right, so moving on to our next story. This is a really interesting piece, because behind the scenes, for over a year and a half, Google's been working on this project called Project Astra, which is their next-generation universal AI assistant. Basically, an even more intelligent form of Gemini that can not only do stuff on your phone, like all the usual stuff that Google Assistant and Gemini can currently do, but potentially interact with the real world, and, you know, remember things that you've seen and actually do things in the real world. And as part of this Project Astra research project, at Google I/O, they showed off a version of Project Astra that can control your Android phone. It can control apps on your Android phone. They had this demo running on a Pixel device where they had this guy who wanted help repairing his bike. And what he did was he set his phone aside on the desk, and then as he was working on his bike, he would ask it questions. For example, he would ask it to pull up information from a manual. And then Project Astra would open that manual. It would find the manual online, and then it would scroll to the exact page with the information that he was asking for. And then, based on that information, the person would ask, okay, can you find me some related YouTube videos that explain how to fix this part? And then it would open YouTube, it would search YouTube, and scroll and find the relevant video for the person to watch, so he could fix his bike.
So I think this was a really big, interesting project that Google showed off, because, you know, right now, assistants are hands-free in the sense that you can ask one question, but if it involves anything like looking up information and actually scrolling through documents, filling out forms, switching apps, things like that, they can't do any of that right now. They're relegated to anything they can pull from the web. So, I think, you know, it would be a big deal if there was some way to actually automate the apps on your Android phone.
Lo and behold, it seems Google is finally working on something that allows for that. Because I found evidence that Google is working on a new feature called Computer Control, and what this feature allows you to do is, as I just mentioned, agentic AI control of your Android apps. So why is Google working on this feature? What's the significance of it? Well, the problem with Google's Project Astra demo was the way it worked in the background. As you can see here in this demo, this little indicator in the top left corner means this is a screen recording. Project Astra, the demo that Google made, is literally recording your screen, and then it's scrolling through and injecting inputs, probably using the Accessibility API. Now, the problem with that is, since it's recording your screen, it's actively using your screen, which means you can't do anything with your phone at the same time the AI agent is doing whatever task you asked of it. So if you ask it to find a manual or search for a YouTube video, you can't touch your phone at all while it's doing that, because otherwise you'd interrupt its flow. It wouldn't be able to find where it needs to go next. That's obviously a problem, because in Google's demo, in a footnote, they mentioned that it's running at 2x speed. The video was sped up twice as fast, which means it's running significantly slower than the video actually implies. And you'd only pick up on that if you actually pay attention to the footnote. So, what Google is working on, their solution to this problem, is Computer Control.
The way this feature works is it creates a virtual display in the background, and the app you want automated is launched onto that virtual display. Then the app that's using the Computer Control API is able to send tap, swipe, and scroll inputs to the application running on the virtual display in the background. And all of this is happening while you're still able to use your phone. So you can do whatever you want on the main display, and you have a virtual display in the background where this agentic AI, or Gemini, is able to control that application. And yeah, I mean, there are a lot of interesting aspects of this, like how Google built some kind of privileged mechanism to control it. Only pre-installed applications with a highly privileged permission can access it, and apps that use this framework can only control the specific app they were granted permission for by the user, so they can't just open other applications whenever they want and start seeing and controlling them. Another cool aspect of this framework is that they developed a mechanism to allow the trusted virtual display to be mirrored onto a separate, more interactive one. And I think that will let you stream that interactive virtual display to a computer, so that you can basically remote into what the AI agent is doing. So, potentially, if it makes a mistake, you can then step in and change things yourself, and then it can continue doing its work in the background. So, I mean, this is really interesting to me, just because AI gadgets like the Rabbit R1, when they launched, were almost universally derided for being useless.
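[Editor's note: the Computer Control details above come from an APK teardown, so the real API surface isn't public. The toy Python model below only sketches the described shape: an agent session is bound to one user-approved app on a background virtual display, it injects inputs there, and the main display stays free. Every class and method name here is hypothetical, not an actual Android API.]

```python
from dataclasses import dataclass, field

@dataclass
class Display:
    """Toy model of a display surface that receives input events."""
    name: str
    events: list = field(default_factory=list)

    def inject(self, event: str):
        self.events.append(event)

@dataclass
class ComputerControlSession:
    """Hypothetical agent session bound to ONE app on a background virtual display."""
    granted_app: str
    virtual_display: Display = field(default_factory=lambda: Display("virtual"))

    def send_input(self, target_app: str, event: str):
        # Per the teardown, a session may only drive the app the user approved.
        if target_app != self.granted_app:
            raise PermissionError(f"session not granted for {target_app}")
        self.virtual_display.inject(event)

main_display = Display("main")
session = ComputerControlSession(granted_app="com.example.youtube")

# The agent drives the granted app on the background virtual display...
session.send_input("com.example.youtube", "tap(search)")
session.send_input("com.example.youtube", "scroll(down)")
# ...while the user keeps using the main display, uninterrupted.
main_display.inject("tap(home)")
```

The point of the model is the separation: the agent's events land only on the virtual display, and trying to drive an app the user never approved is rejected outright.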
Like, Scott, I think you mentioned earlier, right before we started this call, that you actually briefly used the Rabbit R1. What did you think of Google's take on this feature? Do you think it's a good idea? Like, do you think the Rabbit R1 maybe was just ahead of its time, and that the problem with that device was just the fact that you needed separate hardware, and not that it was just a bad idea in itself?
22:58 – C. Scott Brown: Well, there were a number of problems with the Rabbit R1, as anyone in the tech world can tell you. But the core idea of the Rabbit R1 was sound. It didn't need extra physical hardware. There's no reason to have an extra item with you to do this stuff. It should all happen on your phone, which is one of the big reasons why everyone immediately said, when the Rabbit R1 was announced, why isn't this just an app? Why do we need separate orange hardware for this? So that was problem number one. But problem number two was that it didn't have access to your life. You know, it only had access to whatever Rabbit was able to get access to. And what Google is trying to do here, with what you've discovered, sounds more like what we actually want. We actually want an agentic AI that's already able to act on our behalf using the thing that's most personal to us, which is the smartphone. So, I think that what Google is doing here is not only a step ahead of the Rabbit R1, but also just fundamentally more in touch with what it needs to be to be successful. Now, obviously, I haven't used this yet. I don't know how well it's going to work. You know, there are a lot of security and safety concerns with this, which I'm sure we're going to talk about in a few minutes. So yeah, I'm cautiously optimistic about this. I do like the concept of being able to talk to Gemini and have it do something mundane on my behalf. Just as a quick example, my Pixel 10 Pro is a 128GB model, and, you know, if I take a bunch of videos and photos, that storage fills up quickly.
So one thing I have to do every so often is go through and make sure that everything that's been uploaded to my Google Photos account stays in my Google Photos account, but everything that's on the actual phone hardware itself gets deleted, because I don't need it on my phone anymore. It's backed up to the cloud. So I have to go through and delete those things manually, using a few button taps and swipes and things like that. Mundane, stupid, don't want to do it anymore. I'd just have the agentic AI go through and do it for me. That seems really useful. Simple things: making a restaurant reservation, just being able to say, please make a restaurant reservation, please call this restaurant for me and set this up, whatever. Those all seem like really great ideas, and you don't need an orange block to do it. You can just do it with your phone. I like this concept. But yeah, the issue that concerns me, the thing that makes me very, you know, worried, is that I wouldn't trust a stranger to act on my behalf. I'm not going to hand a stranger my credit card information and say, go buy me these concert tickets. I don't know if I trust the AI to do that either. So that's where I'm kind of getting nervous. Simple things, quick little actions that are mundane and take, you know, 30 seconds of my day that I don't want to do anymore? Absolutely. Please, take this away from me, AI. But Google obviously doesn't want to stop there. That's not sexy. Google wants the sexiness of, oh, I want to stay at this hotel. Book this hotel room for me. Give it your credit card information, tell it your personal information, tell it all this stuff. That's where I just get nervous.
26:27 – Mishaal Rahman: I imply I’m in the identical boat. Like that’s the entire purpose I haven’t tried, I haven’t completed something with these agentic AI browsers like Comet, you already know, like I don’t belief generative AI with really, you already know, finishing purchases or reserving accommodations or any of that kind with me. Like, I need to make certain I’ve the ultimate say earlier than the small print, fee particulars undergo as a result of I don’t need to give it that stage of management over my, you already know, buying energy. However I imply, to be honest, there are a whole lot of customers who would possibly actually profit from this, particularly customers who’ve accessibility points utilizing a pc or aged of us, you already know, who wrestle to navigate web sites or work out what button to press or, you already know, how you can connect paperwork and pictures and stuff like that. Like, this may very well be actually helpful and actually assist them out, assuming it will get issues appropriate, after all, which is the large, which is the large if, as a result of we’ve seen how issues can go unsuitable and when it screws up. And in case you’re doing one thing like filling out a visa software earlier than you’re touring to, you already know, one other nation, a single screw up may imply your software is rejected and also you don’t need that taking place. You don’t need AI to get blamed for that mishap, or at the very least like Google doesn’t need its AI to be blamed for that mishap. So yeah, I like the concept that this opens the door for type of generalized, something you may think about, you may have AI management it as a result of it could possibly actually learn your display screen, it could possibly actually simulate faucets and inputs. 
But I much prefer the idea of kind of having it structured and having it be kind of a back and forth between an application and the AI, where, like, the application tells the AI, here's what I can do, here's what I'm exposing to you, and then the AI can actually just execute those functions, which is kind of similar to the Intents that we have, or like App Shortcuts on iOS, you know. And that's actually exactly what Google is already working on simultaneously, side by side with this Computer Control feature, because with the release of Android 16, I actually noticed that Google is working on an API called App Functions, and this API is basically exactly what I said. Like, it opens the door for Gemini to perform tasks in third-party apps, tasks that were specifically exposed to Gemini by the app developer. So, for example, if a restaurant application wants to create a function to allow AI chatbots to order food on the user's behalf, it can create a specific function called "order food" and then define the parameters that the AI chatbot would have to pass in before it can order food. I would be much more comfortable with that approach. What about you, Scott?
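To make the contrast concrete, here is a minimal sketch of that structured pattern: the app registers only the functions it chooses to expose, with declared parameters, and the agent can invoke nothing else. The names here (`AppFunctionRegistry`, `order_food`) are illustrative stand-ins, not the actual Android App Functions API surface.

```python
# A conceptual sketch of "app exposes functions to an AI agent", as opposed
# to an agent reading the screen and simulating taps. Names are hypothetical.

class AppFunctionRegistry:
    """Holds the functions an app has explicitly chosen to expose."""

    def __init__(self):
        self._functions = {}

    def register(self, name, handler, params):
        # `params` declares the parameter names the agent must supply,
        # so calls can be validated before anything is executed.
        self._functions[name] = (handler, params)

    def invoke(self, name, **kwargs):
        # The agent can only call what the app registered -- unlike a
        # screen-reading agent, nothing else in the app is reachable.
        if name not in self._functions:
            raise KeyError(f"function {name!r} is not exposed by this app")
        handler, params = self._functions[name]
        missing = [p for p in params if p not in kwargs]
        if missing:
            raise ValueError(f"missing required parameters: {missing}")
        return handler(**kwargs)


def order_food(item, quantity):
    # The app keeps final control: it validates before acting.
    if quantity <= 0:
        raise ValueError("quantity must be positive")
    return f"Ordered {quantity} x {item}"


registry = AppFunctionRegistry()
registry.register("order_food", order_food, params=["item", "quantity"])

# An AI chatbot would translate "order two tacos" into this structured call:
print(registry.invoke("order_food", item="tacos", quantity=2))
```

The design point is the one made above: the app, not the agent, defines the boundary of what can happen, which is much easier to trust than an agent with free rein over the screen.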
29:02 – C. Scott Brown: Yeah, I mean, that makes sense to me, but now we're talking about fundamentally changing the entire app ecosystem, the entire concept of using an application. And I know that a number of organizations, I think including Google, are saying, like, the future is no apps. The future is that you just have an AI and the AI does everything. But it's going to be, I don't know what, a decade before, I mean, like, it's maybe not a decade, but it's going to be a long, many years before we can even get close to that kind of difference between how people do things today and how Google and other companies are envisioning we'll do them in the future. In the meantime, it's going to be difficult to get people to kind of accept that. I think a lot of people would be nervous if you told them, like, oh, this AI is going to, you know, use your credit card to buy something, or, yeah, like you said, like, get you a visa for, you know, travel. Like, there are so many things that a lot of people would just be like, nope, don't trust it. You know, medical records, like making a doctor's appointment. Like, you know, that kind of thing is just very sensitive, and it's something that, you know, has to be done slowly and methodically. And I know this doesn't exactly have to do with what we're talking about, but the Google Home team made it clear to me in a conversation I had with them semi-recently that the reason the Home team moves slowly compared to the rest of the Google AI team is because Google knows that it can't mess up in the home. If it messes up in the home, people don't trust it, and then they don't want Google in the home anymore.
So the Google Home team has to move relatively slowly compared to, say, the Chrome team, for example. I think this agentic AI situation is another place where Google needs to move slowly and methodically and deliberately. Is Google going to do that, though? Or is it going to go as fast as it can because it has to beat OpenAI, it has to beat all these other browsers and all these other things? I don't know. So I really hope that Google understands the gravity of the situation. If it messes up and it has some kind of situation where, you know, you find out that a bunch of, you know, senior citizens in a nursing home somewhere all had their life savings taken away by AI because it did the stupid thing with a generic system, that's horrible, and that's something that's going to ruin it for the long term, because people are going to forever remember that situation and they're never going to want to adopt this. So Google's got to move slowly and deliberately here, and I just, I don't know if it's going to do that, but, you know, I'll give it the benefit of the doubt. I'll wait patiently and see what happens. But yeah, Google's move fast, break things kind of situation is a little nerve-wracking when it comes to this kind of thing.
31:44 – Mishaal Rahman: I'm a little skeptical, personally, of, you know, all the AI companies' claims that the future is going to be AI chatbots interacting with apps and we're not going to be using apps at all. Like, I just don't see that happening, because I don't see app companies allowing that to happen. Like, I don't know if you saw OpenAI's recent announcement, they launched their apps feature and they have partners like Spotify. You can ask it to play certain songs from Spotify. And I think Spotify's probably okay with that because it still, you know, if you want to like certain songs, if you want to skip ads, you have to subscribe to Spotify, right? You're still paying Spotify in the end. But, like, a lot of applications just wouldn't be. How would they make money if they were fully integrated into AI chatbots? Where would they get revenue from? How would they get the data that these AI chatbots are getting? Like, I think most companies want apps. They want you to use their apps because they want the control, they want the data, they want to directly funnel users into their subscriptions. And kind of having everything integrated into a singular AI chatbot, I just don't see that fulfilling their needs.
32:49 – C. Scott Brown: Yeah, this is a situation where Google, OpenAI, these companies that are big, big data companies, are basically trying to shoehorn their way into these much more traditional businesses like Spotify. Spotify, you know, is a tech company and very advanced and big, whatever, but it's fundamentally a subscription service. Like, that's all it is. You pay money to them, they give you music. Like, it's a very simple transaction. And yeah, the data from their applications is valuable too. There are tons of applications out there that aren't subscription services, and they make money specifically because you're using the product. You know, Instagram, for example. Like, you're not paying to use Instagram, you're just using it for free. But the reason that, you know, you're able to use it for free is because you are the product. Your data is valuable to Meta, and they use that to make money. So if you're using an agentic AI to post something to Instagram, that's time that you're not spending in the Instagram app. That's time that Meta doesn't make money off of you. So yeah, it's going to take a lot for all these user habits and for all these business structures to change. You know, granted, things changed pretty quickly when the smartphone first came around. You know, 2007, when the iPhone came out, it was only a few years before we were neck deep in the app ecosystem that we're still in today. So yeah, it could happen quickly, but at the same time, like, it's not going to happen just because Google says, we did this thing. Like, that's just not going to be enough, because there are going to be, you know, thousands and thousands of mega corporations that are going to be negatively impacted by that.
So yeah, this is a huge can of worms that Google is opening. Or, it's not just Google. Like, we're talking about Google because of this particular feature, but this could apply to any company that's trying to do this. Even Rabbit. It's going to take a lot more than just, we did this thing, isn't this cool? Like, you've got to change, like, you know, decades of culture and the way people are just used to doing things. So yeah, I'm skeptical. I'm excited because I do like the concept, but I'm also skeptical that it's actually going to do anything.
35:00 – Mishaal Rahman: Yeah. I think we're just still firmly in the time period where AI companies are just throwing things at the wall and seeing what sticks. So, like, they're experimenting. Like, the next big thing they all think is going to happen is agentic AI. So Google, Microsoft, you know, Amazon, OpenAI, they don't want to be behind in this race. So they're all rushing to, you know, launch their own versions of Computer Control features to make sure that, you know, if this is indeed the next big thing, and if indeed the future is apps that you only interact with through AI chatbots, they're ready for it, right? They just want to be the first ones there. And, you know, that's why we're kind of seeing companies throwing things out. They're not even really sure if this is the right place to include this feature, or if this is the right way the feature should be implemented. They just have to have it out there, because they want people to use their version of the feature and not their competitors'. And that's kind of why I think we're seeing Google potentially integrate Nano Banana into Google Lens and Google Circle to Search. You know, as you mentioned at the top of the show, Google is experimenting with putting Nano Banana in more and more places, and Google Lens, Circle to Search doesn't really quite make sense, because they're both tools for searching, right? Like, you don't make videos or make images through Google Lens or Circle to Search. It's literally in the name, Circle to Search. You're circling something to search for it, right? Like, what part of that implies image creation? But apparently Google thinks, like, okay, we've got this massively popular surface, Circle to Search, as well as Google Lens.
Nano Banana is going absolutely bananas in terms of popularity. I see ads for it everywhere. Like, the newsletter I read every morning, Morning Brew, and also Techmeme, like, they have ads every single day for Nano Banana, and it's clearly one of Google's most successful products. So they're like, okay, we're going to bring this to as many, we want as many people as possible using and switching to Gemini. So we're going to bring Nano Banana creation capabilities to Google Lens as well as Circle to Search. And I think that's probably why they're doing this. That's why they're basically just dumping this capability into an interface that wasn't really built for it. But because Google Lens is incredibly popular, and Circle to Search is incredibly popular, this is another avenue for them to promote Gemini.
37:17 – C. Scott Brown: Yeah, I mean, this is definitely a classic case of, we need to have AI in everything and just shoehorn it into whatever it needs to be shoehorned into. You know, you and I have previously discussed how far behind Apple is when it comes to these kinds of AI features. And, you know, Apple has a huge problem on its hands. Like, that's a huge problem. Siri is terrible. Apple's lack of innovation in the AI space is holding it back. There are a lot of problems with that. But at the same time, the fact that Apple is not messing itself up by throwing all these things at the wall and hoping something sticks is laudable as well. You know, their strategy is wildly different from what Google is doing. Apple wouldn't just, like, develop this and be like, we're going to put this new thing that Siri does into literally everything we have. Like, Apple wouldn't do that, because of what you're saying, because it doesn't make sense. Like, why is Nano Banana in Circle to Search? This makes zero sense at all. But yeah, Google just wants to show it's the leader. When it comes to AI, we're the leader. We're going to put AI in our AI, and that's what we're going to do. Please continue to invest in us, please. That's what Google is trying to do. And yeah, so it's weird. But there are situations I can think of in which injecting AI, or AI generation, I should say, into a place where it maybe wouldn't seem to make sense could actually make sense. For example, there could be a situation where maybe your child, you know, has your phone and is looking at a piece of candy, and they use Circle to Search and they scan that little piece of candy, and the Google thing says, you know, this is a Mike and Ike.
Mike and Ike is a candy that tastes, like, fruity, you know, whatever. And then the child could ask something like, how are Mike and Ikes made? And then, boom, AI comes together and creates a little short clip for that child that says, like, this is how a Mike and Ike is made. I can imagine that, and I can imagine that being fun. I can imagine that being something that would be popular and cool. So that's a situation where injecting AI generation into a search function would make a degree of sense. But that's not where we are, you know, that's, like, steps and steps and steps ahead of where we are right now. So it's like, maybe Google is laying the framework for things that it wants to do. I can respect that. But yeah, like, I don't know, it will get very tiring to just have AI injected into everything even when it doesn't make sense for it to be there.
40:07 – Mishaal Rahman: I mean, maybe it's just another avenue for them. Like, maybe the thought process behind putting Nano Banana in these two services is just, they're hoping that this will be another way for it to go viral. Like, for example, people use Google Lens and Circle to Search a lot. Maybe they're sent a meme and they want to, like, search up, like, find more information about that meme or that image or something, and then they have this create option just sitting right there. You know, they've never used Nano Banana before, because you have to actively go to the Gemini app to use it, or in Google Photos you use the Ask Photos creator, right? You have to actively go and use it. But, like, if you're just searching for images, looking up information on something, and you just have that create option there, maybe you might think, okay, why don't I just click that button and turn the sky purple or something, or just add a hat to this dog, or something silly like that, right? And then you have this image creation, and you might be surprised, genuinely surprised, by how good the image creation is, because I know I've been really surprised by how good Nano Banana is. I've been using it to generate, like, the hero image in the thumbnails for this very podcast, and it's just crazy how good it is. So I think, like, when people give it a shot and they're like, wow, this is shockingly good, then they might share it on social media, it might go viral. That's more people spreading the word about Nano Banana, and thus more people signing up and using Gemini. I think that might be the goal behind Google sticking Nano Banana in these two surfaces.
41:26 – C. Scott Brown: I wonder if Google is disappointed in the name that they chose. Nano Banana.

41:33 – Mishaal Rahman: It doesn't tell you anything about it, right? Nano Banana, like, that's clearly a code name. Why did they stick with that?
I think, I think the reason for that, actually, is because in the lead-up to the announcement, like, a lot of influencers were vague-posting about, oh, this new image creation thing is amazing, because they were all using the code name for it. Google didn't have an official name for it. I think the official name actually is Gemini 2.5 Flash Image editing, or image creation. Like, it doesn't have an actual name besides Nano Banana.
42:02 – C. Scott Brown: That's worse than Nano Banana, but it's still, like, I don't know, Nano Banana, like…
42:08 – Mishaal Rahman: I mean, I think you're expecting too much. Like, Google having good naming practices? This is the same company that couldn't decide on a name for Google Wallet. Actually, like, we went full circle. We had, like, what? We had Google Wallet, Android Pay, GPay, Google Pay, Google Wallet. Like, yeah, we're expecting a little too much from Google right there.
42:29 – C. Scott Brown: Google and Qualcomm, they need some help when it comes to naming their products. But yeah, no, I definitely think that, you know, we're talking about AI throughout this whole podcast, and we're talking about, like, the ramifications that it could potentially have, not only for these different products, but, like, for our day-to-day lives. You know, I think that Google and all these companies need to think about where they're going. Like, right now I feel like it's just, you know, there's, like, a cartoon that I can imagine where the train is coming down the tracks, and, like, the cartoon characters are frantically trying to put the track down so that the train doesn't roll off the tracks. So they're just, you know, building the track as the train is actually moving on it, and that's where I feel like we are with AI right now. It's like the train is barreling ahead and we're just trying to lay down track, and it's like no one actually has the time or the forethought to think, where the hell is this train actually going? Like, where are we laying the track to? And I think that's what Google needs to kind of take a step back and figure out. And to be completely fair to Google, there might be a grand overarching plan to this that they're just not telling us, because they don't want, you know, everyone else, OpenAI and all these other companies, to know what they're doing. But Google has a long history of not planning ahead. You know, just look at all the products that they've launched that they've eventually moved to the Google graveyard. Just think about the fact that there is a Google graveyard.
Like, that's all an illustration that Google doesn't usually do very well when it comes to long-term thinking. So, I don't know. Like, where are we going, and when are we going to get there, and what's it going to be like when we get there? Those are the big questions I always have in my mind whenever we talk about these kinds of new innovations. So, yeah, we're just going to have to wait and see.
44:24 – Mishaal Rahman: I mean, I think personally, as long as you ignore the hype-building by people like Sam Altman and you just look at the actual things you can do with the tools that are on offer today, it's pretty incredible. Like, there are people who are kind of naysayers about the advancements that AI has brought, and then you look at the things you can do with Nano Banana. Like, I do not know how to use Photoshop at all. I'm terrible at image creation and image editing. But with Nano Banana, I can create thumbnails that I would never have been able to make years ago. Or with Gemini, how it helps me proofread my own work and helps me understand code and things like that. Like, it's amazing what you can do with it. As long as you look past the hype and actually just play around with the tools and see what works best for your own particular workflow. Like, maybe it might not be, what are they calling it? AGI, artificial general intelligence, right? They're all hyping that up, or that we're going to have robots that can completely think on their own. Like, I just ignore the hype, the hype people on Twitter and stuff like that. But look at the things you can actually do with the tools today, and it's incredible how far we've come.
45:34 – C. Scott Brown: No, I agree. There are a lot of things that have, you know, a lot of AI-based tools that have made a genuine difference in my life. Gemini being a major one. Gemini, I use Gemini all the time for brainstorming video ideas, like you said, like proofreading and fact-checking and helping me understand code or complicated concepts. Gemini Live. You know, when I'm traveling, I talk to Gemini Live, you know, basically like a travel guide. Like, I show Gemini Live a monument or something, I say, what is this? And Gemini tells me, and it's like, great, I didn't have to, like, spend time reading Wikipedia or ask some random person. I can just talk to Gemini. Like, there are all kinds of things that AI is being helpful with today. And let's not forget that AI has been in our phones for a decade, you know. Like, most of the cool features that your Pixel can do are AI-based, and they always have been AI-based. It's just that they haven't been, you know, promoted as such, because AI wasn't a hot, sexy topic. So, you know,
46:33 – Mishaal Rahman: They used to call everything machine learning back then, back in the good old days.
46:36 – C. Scott Brown: Yeah, because they probably felt that AI, the term, artificial intelligence, they probably felt that it didn't market well. They probably felt that when you use the term artificial intelligence, people think Terminator, you know, they think of negative things. So they were like, okay, we're not going to use that term, we're going to use machine learning instead. And at some point in the past, like, two years, machine learning went out the door. No one says machine learning anymore. And we're back to AI and artificial intelligence. So it's like it's all been there, it's all been happening. It's just that now we're seeing advancements a little bit faster, and we're seeing Wall Street only care about AI, and so now every company is just AI, AI, AI, AI. Eventually, the bubble's going to burst, you know, maybe not this year, but 2026, 2027, the bubble's going to burst, and all these companies that are popping up that are like, all we do is AI stuff, they're going to drown. But what we're left with is going to be similar to when the dot-com bubble burst, you know, however long ago that was. I'm really old, so, you know, maybe 50 years ago, I don't even remember. But when the bubble burst, you know, we still had the Internet, we still have dot-coms, we still have all the basic things that the dot-com, you know, boom had. It's just that the fluff, the junk, is gone, and we just have the Internet. So I think that we're going to see a similar thing happen with AI, but in the meantime, we're just going to have to suffer through, you know, some company coming along and being like, we've cured cancer with AI, and it's like, oh, okay.
48:06 – Mishaal Rahman: Well, I mean, fortunately, Google hasn't made such bold claims yet with AI, but what they're doing is, you know, you get Gemini, you get Gemini, everybody gets Gemini. That's Google's approach to AI so far. Okay, and that's everything we've got for you this week. You can find links to all the stories mentioned in this episode in the show notes, and you can find more great stories to read over on androidauthority.com.
48:29 – C. Scott Brown: Thanks for listening to the Authority Insights Podcast. We publish every week on YouTube, Spotify, and other podcast platforms. You can follow us everywhere on social media at Android Authority, and you can follow me personally on Instagram, Bluesky, and my own YouTube channel at C. Scott Brown.
48:47 – Mishaal Rahman: As for me, I'm on most social media platforms posting day in and day out about Android. If you want to keep up with the latest on Android, go to my Linktree and follow me on the social media platform that you like best. Thanks for listening.