Introduction

Why does Google insist on making its assistant situation so bad?

In theory, Assistant should be the best it’s ever been. It’s better at “understanding” what I ask for, and yet it’s less capable than ever of actually doing it.

This post is a rant about my experience using modern assistants on Android, and why, despite using these features actively in the mid-to-late 2010s, I now don’t even bother with them.

The task

Back in the late 2010s, I could hold the home button and ask the Google Assistant to create an event based on this email. It would grab the context from my screen and do exactly that. As far as I can tell, this has been impossible for years now.

Trying to find the “right” assistant

At some point, my phone stopped responding to “OK Google”. I still don’t know why it won’t work.

Holding down the Home bar (the home button went the way of the dodo) brings up an assistant-style UI, but it’s dumb as bricks and only Googles the web. Useless.

Home Bar Assistant

So, I installed Gemini. I asked it to perform a basic task. It responded “in live mode, I cannot do that”. When I asked how I could get it to create a calendar event for me, it couldn’t answer the question either, telling me instead to open my calendar app and create a new event. I know how to use a calendar. I want it to justify its existence by providing more value than a Google search, and it was ultimately unable to.

Gemini Live

Searching the internet, I learned that apparently both of the ways I had been invoking assistant features were wrong. You have to hold down the power button; that’s how you launch the proper one. My internal response was:

No, that’s for the power menu. I don’t want to dedicate it to Assistant.

Well, apparently, that’s the only way to do it now, so there I went, sacrificing another convenience to turn it on.

Pulling teeth with Gemini

So I asked this power-menu version of Gemini to do the same simple task. I tried four separate times.

First, it created a random event “Meeting with a client” on a completely different day (what?).

The second time, it just crashed with an error.

Gemini crashes

The third time, it asked me which email to use, giving me a list, but that list did not contain the email I was interested in. I asked it to find the Royal Mail one. No success.

So, quite clearly, it wasn’t using screen content.

I rephrased the question: “Please create an event from the content on my screen”. It replied “Sure, when’s this for?”

Sure, when's it for

I shouldn’t have to tell you. That’s the point. It’s right there.

Conclusion

There are too many damn assistant versions, and they are all bad. I can’t even imagine what it’s like to also have Bixby in the mix as a Samsung user. (Feel free to let me know below.)

It seems like none of them are able to pull context from what you are doing anymore, and you’ll spend more time fiddling and googling how to make them work than it would take for you to do the task yourself.

In some ways, assistants have gotten worse than they were almost 10 years ago, despite billions in investment.

As a little bonus, the internet is filled with AI slop that makes finding real facts and real studies from real people harder than ever.

I write this all mostly to blow off steam, as this stuff has been frustrating me for years now. Let me know what your experience has been like below; I could use some camaraderie.

  • Dasus@lemmy.world · 1 hour ago

    Bixby is better at basic tasks on the phone, like creating a calendar meeting.

    Gemini is better at controlling my lights and at talking about something not too objective, like cooking recipes. Or it’s not bad if you need to calculate solution strengths while mixing drinks or smth. But aside from maths and other fixed constants, it’s really shit at any sort of facts.

    You cannot get a reliable answer as to when a show/movie is airing, because the announced times usually shift a bit and the datasets are old. Last December, before Kraven the Hunter premiered, I asked it when that would be. It told me, in December 2024, that “Kraven the Hunter will premiere in May 2024” or something. As in, the future tense, with a date six months in the past.

    So yeah.

    They’re useful for certain things but they’re still far from anything akin to Jarvis, lol. I’ve been meaning to give the one free month of Gemini Pro a go just out of curiosity at some point to see whether it’s equally shit.

    I can definitely see potential in these. It’s pretty nice being able to control my lights just by telling my phone what colour where, and how much light. I also use it with a smart plug on my screen, so when I’m watching things in bed I can just hit pause and tell Google to shut my screen off for me instead of having to get out of bed or put it on standby (because then the annoying blue LED will stay on).

  • prole@lemmy.blahaj.zone · 6 hours ago

    It just straight up stopped sending me reminders that I’ve scheduled. Literally the one thing I used it for.

  • Petter1@lemm.ee · 9 hours ago

    This is because hardcoded human algorithms are still better at doing stuff on your phone than AI-generated actions.

    It seems like they didn’t even test the chatbot in a real-life scenario, or train it specifically to be an assistant on a phone.

    They should give it options to trigger stuff, like Siri with its workflows, and take the time and resources to train it properly. They should give app developers a way to feed the AI worker structured data, and the AI should be trained to find the correct context using that API and plug the correct data into the correct workflow.

    I bet they just skipped that individual training of Gemini to work as a phone assistant.

    Apple seems to plan exactly that, and that is why it will be released so much later than the other LLM phone assistants. I’m looking forward to seeing whether Apple manages to achieve their goal with AI (I will not use it, since I will not buy a new phone for that and I don’t use macOS).

    • buddascrayon@lemmy.world · 3 hours ago

      I’ve developed a theory. I think the person who put the “don’t be evil” line in Google’s mission statement put it there for a clever reason. Not simply to tell the company “don’t be evil”, but so that we the consumers would know that, on the day they removed the line, they had become truly evil and we should abandon the company and its products with all due haste.

  • Threeme2189@sh.itjust.works · 10 hours ago

    I generally use Google assistant to do about 4 things.
    Check the weather, set a timer for X minutes, remind me to do something later today (e.g. get the mail at 4), and set an alarm.

    After ‘upgrading’ to Gemini I tried to get it to set a timer or something and it wanted access to all kinds of information that is irrelevant to setting a goddamn timer.

    I promptly disabled it and went back to Google assistant, which does these 4 basic things without prying into my everyday life.

    • BackwardsUntoDawn@lemm.ee · 9 hours ago

      On top of that, Gemini burns through a bunch of electricity in a data center somewhere to get the basic things wrong.

  • esc27@lemmy.world · 14 hours ago

    Pure conjecture on my part but I think…

    When these first came out, Google approached them in full venture capital mode with the idea of building a market first, then monetizing it. So they threw money and people at it, and it worked fairly well.

    They tried making it part of a home automation platform, but after squandering the goodwill and market position of acquisitions like Nest and Dropcam, they failed to integrate these products into a coherent platform and needed another approach.

    So they turned to media and entertainment, only to lose the Sonos lawsuit.

    After that, the product appears to have moved into maintenance mode, where people and server resources are constantly being cut, forcing the remaining team to downsize and simplify the tech.

    Now they are trying to plug it into their AI platform, but in an effort to compete with OpenAI and Microsoft, they are likely rushing that platform to market far before it is ready.

  • vithigar@lemmy.ca · 3 hours ago (edited)

    My most common use for Google assistant was an extremely simple command. “Ok Google, set a timer for ten minutes.” I used this frequently and flawlessly for a long time.

    Much like in your situation it just stopped working at some point. Either asking for more info it doesn’t need, or reporting success while not actually doing it. I just gave up trying and haven’t used any voice assistant in a couple of years now.

  • BlushedPotatoPlayers@sopuli.xyz · 13 hours ago

    Now try adding an extra layer of crap with English not being your native language. At this point only Zoom gives fairly correct transcripts; both Apple and Google are at the “you know what, I’d rather type this shit” level.

  • Monstrosity@lemm.ee · 16 hours ago

    I have a theory that this is intentional design.

    When products perform smoothly, you don’t interact with them, they become invisible, & Uncle Googs needs you interacting with their products as much as possible.

    Thanks to telemetry, Good Ol’ Uncle Googs has millions of hours of behavioral patterns to sift through, provided by all those free Chromebooks for school districts across the US.

    So Google knows exactly what gets folks engaging with their phones &, I believe, intentionally causes frustration over seemingly simple tasks b/c many many people will forget what they were doing, fix the issue, then get distracted by entertainment apps, doom scrolling, etc.

    Dark patterns & behavioral manipulation.

  • AnAmericanPotato@programming.dev · 17 hours ago

    Google as an organization is simply dysfunctional. Everything they make is either some cowboy bullshit with no direction, or else it’s death by committee à la Microsoft.

    Google has always had a problem with incentives internally, where the only way to get promoted or get any recognition was to make something new. So their most talented devs would make some cool new thing, and then it would immediately stagnate and eventually die of neglect as they either got their promotion or moved on to another flashy new thing. If you’ve ever wondered why Google kills so many products (even well-loved ones), this is why. There’s no glory in maintaining someone else’s work.

    But now I think Google has entered a new phase, and they are simply the new Microsoft – too successful for their own good, and bloated as a result, with too many levels of management trying to justify their existence. I keep thinking of this article by a Microsoft engineer around the time Vista came out, about how something like 40 people were involved in redesigning the power options in the start menu, how it took over a year, and how it was an absolute shitshow. It’s an eye-opening read: https://moishelettvin.blogspot.com/2006/11/windows-shutdown-crapfest.html

    • 032 Mendicant Bias@feddit.uk · 5 hours ago

      The linked article was certainly interesting/alarming, but the original article that seems to have prompted it was a bit questionable to me. It seemed to be a contrived argument from someone knowledgeable enough to know what all the options meant, complaining that someone who didn’t would get confused - yet in reality, that “normal” user isn’t going to go looking in the extended menu; they would just click the icon, or press the physical button on the laptop.

      • AnAmericanPotato@programming.dev · 4 hours ago

        I agree. Of all the UI crimes committed by Microsoft, this one wouldn’t crack the top 100. But I sure wouldn’t call it great.

        I can’t remember the last time I used the start menu to put my laptop to sleep. However, Windows Vista was released almost 20 years ago. At that time, most Windows users were not on laptops. Windows laptops were pretty much garbage until the Intel Core series, which launched a year later. In my offices, laptops were still the exception until the 2010s.

  • Arghblarg@lemmy.ca · 19 hours ago

    The Assistant/Nest devices have gone to shit as well. When I first bought one about 5 years ago, it could:

    • play many games (voice-controlled text adventures, multi-player gameshow-style trivia and party games, etc.);
    • play music from my Google Music library with no commercials;
    • play podcasts from 3rd party podcast providers;
    • play almost any radio station that also had livestream feeds

    Now all the games are gone, and ‘podcasts’ are just the small subset of podcasters who also upload their episodes to YouTube (whatever feed they used to use had almost every podcaster there was, even obscure ones); asking for specific episodes by number or date is totally broken. Playing radio stations is hit or miss; the voice recognition often picks a stream completely unrelated to the one I ask for, and it is waaay worse at matching the callsign/city I ask for.

    Playing music is intolerable. There’s a stupid commercial for YouTube Music Premium after nearly every track. NO, I will not upgrade to your freaking YouTube Premium.

    Enshittification, thy name is Google.

    • badlotus@discuss.online · 19 hours ago

      Have you heard of Home Assistant? It’s a self-hosted smart home solution that fills a lot of the gaps left by most smart home tech. They’ve recently added and refined support for various voice assistants, some of which run completely on your own hardware. I have found the project has great community support, and you can also buy their hardware if you don’t feel like tinkering on a Raspberry Pi or VM. The best thing (IMHO) about Home Assistant is that it is FOSS.

      Home Assistant Voice Control

      • foggenbooty@lemmy.world · 4 hours ago

        Voice control of devices you have in Home Assistant is cool, but I don’t think I would recommend it to an average person who uses Google Assistant. Sure, it can turn the lights on and off if it’s aware of those entities, but this user is describing playing games and asking for media streams and podcasts, all things Home Assistant voice does not support (certainly not out of the box).

      • Arghblarg@lemmy.ca · 15 hours ago

        I have heard of it, yeah! I definitely want to try it out… just haven’t gotten around to it yet.

        Do you find the voice recognition is decent?

  • HiTekRedNek@lemm.ee · 9 hours ago

    Weird. My Google Pixel 9 Pro XL has 3 navigation buttons.

    That’s something that’s customizable using the stock firmware.

    Settings -> Display & touch -> Navigation mode.

    My Google Assistant still responds to “Hey Google” or “ok Google” just fine as well on my Google phone.

  • DJKJuicy@sh.itjust.works · 16 hours ago

    Google Assistant on Android and Home devices gets markedly worse as time goes on. My Google Home can barely even figure out how to turn on lights anymore. Things that used to be amazing have devolved into “I don’t understand”.

    It’s really quite shocking and hurts a little bit because it feels like everything in the world is degrading and decomposing like in some dystopian novel or something.

    How can such amazing technology get worse? Is it really just “we can’t make money off this”?

  • Telorand@reddthat.com · 19 hours ago

    This is by design. They’re trying to frustrate you, because they’ll then upsell you on the “premium” subscription later on that returns the “old ways,” all while mining your data. And if you continue to use the free stuff, they’ll just mine you harder with AI.

    They’re angling for a rent-based economy where they’re the landowners and we’re the sharecroppers paying to use their stuff. The only way out is to de-google and start taking your privacy seriously.

    • sanpo@sopuli.xyz · 19 hours ago

      That’s an optimistic view.

      But really, that’s just Google being Google.
      Even years before the “AI” hype their Assistant kept suddenly losing features that worked perfectly fine before.

      • jimmux@programming.dev · 16 hours ago

        Just like Android loses features on every major version, and Maps is a skeleton of its former self.

        In a company where nobody is incentivised to maintain anything, cutting features is the easier option.

    • moe90@feddit.nl · 11 hours ago

      The thing about de-googled Android is SafetyNet support. So, if you rely on digital banks, then it is a serious issue.

      • Telorand@reddthat.com · 6 hours ago

        Hey thanks. That’s something I didn’t even know existed, and it’s something I’ll have to look into before I make the switch.