Wait, Google's AI is Trying to Talk to Dolphins Now? Cool!

Okay, so you know how dolphins make all those cool clicking and whistling sounds? Turns out, it’s super complex, like their own secret language. Scientists have been trying to crack the code for ages, but it’s tough because the ocean is noisy, and dolphin chatter is complicated. Enter Google AI with something called DolphinGemma. It’s basically a smart AI model they’ve built specifically to listen in and try to understand what dolphins might be saying.

So, how does it actually work? Instead of scientists needing tons of custom, expensive gear, the article mentions they actually used Google Pixel smartphones as part of the setup! This is pretty clever because using phones makes the whole system easier to maintain, uses less power, and shrinks the cost and size – huge wins when you’re doing research out in the middle of the ocean. Then, the DolphinGemma AI does the heavy lifting on the sound analysis. Think of it like giving the AI a massive playlist of dolphin sounds recorded underwater. The AI listens to everything – all the clicks, whistles, and background noise. It uses clever machine learning tricks (based on Google’s Gemma AI tech) to start picking out patterns all by itself. It learns to spot the difference between random ocean noise and actual dolphin calls, and even starts figuring out which sounds might be important.

Auto-generated description: A smartphone is placed inside an electronic enclosure with various connectors and a coiled wire.

The really neat part is that DolphinGemma learns directly from the raw sound waves. It doesn’t need humans to tell it “this is a whistle” or “that’s a click” beforehand. It just listens and learns, kind of like how a baby learns language by hearing it all the time. This means it might catch subtle things in their calls that humans could easily miss. The goal is to get good enough to identify different types of calls and maybe even tell individual dolphins apart just by their voice! The AI’s predictive power can also help researchers react faster during interactions, making the study more fluid.
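Google hasn't published DolphinGemma's internals, so purely as an illustration of the idea above (telling tonal, whistle-like sounds apart from broadband noise without any labels), here's a toy sketch. The "tonal score" below is my own made-up heuristic, not anything from the actual system: it averages magnitude spectra over short frames and checks whether energy piles up in a few frequency bins (a whistle) or spreads out evenly (ocean hiss).

```python
import cmath
import math
import random

FRAME = 128  # samples per analysis frame

def tonal_score(signal):
    """Peak-to-mean ratio of the averaged magnitude spectrum.
    Whistle-like sounds concentrate energy in a few frequency bins,
    so their ratio is large; broadband noise spreads energy evenly,
    so its ratio stays close to 1."""
    spectra = []
    for start in range(0, len(signal) - FRAME + 1, FRAME):
        frame = signal[start:start + FRAME]
        # Direct DFT magnitudes (slow but dependency-free); skip the DC bin.
        mags = [abs(sum(x * cmath.exp(-2j * math.pi * k * i / FRAME)
                        for i, x in enumerate(frame)))
                for k in range(1, FRAME // 2)]
        spectra.append(mags)
    avg = [sum(col) / len(spectra) for col in zip(*spectra)]
    return max(avg) / (sum(avg) / len(avg))

random.seed(0)
sr = 4000  # samples of synthetic "audio"
noise = [random.gauss(0, 1) for _ in range(sr)]            # ocean-like hiss
whistle = [math.sin(2 * math.pi * 500 * i / sr) + random.gauss(0, 0.3)
           for i in range(sr)]                              # 500 Hz tone in noise

assert tonal_score(whistle) > tonal_score(noise)  # the tone stands out clearly
```

A real system would of course learn its features from data rather than use a hand-built ratio, but even this crude score separates the two synthetic clips, which is the intuition behind letting a model find call-like structure on its own.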

Beyond just dolphins, think about where else this kind of smart listening tech could be useful. The basic idea of teaching AI to pick out meaningful patterns in complex sounds could definitely be applied elsewhere in nature. Imagine, for instance, using something similar to better understand all the different animal sounds happening in a busy rainforest environment. It might help track hard-to-see animal populations or get a better sense of the ecosystem’s health just by listening in.

Now, are we going to have full conversations with Flipper tomorrow? Probably not! Getting a complete “dolphin dictionary” is still way off. But, this DolphinGemma thing is a huge step. Understanding dolphins better could tell us so much about how they live, hang out, and how they’re doing in the ocean. Plus, the tech they built for this could help us understand other chatty animals too. It’s pretty amazing to see AI being used to connect us more with the awesome creatures we share the planet with!

Initial Pixel 9a Photo Impressions

The Pixel 9a’s night mode on the front camera delivers a sharp image, pulling out good detail even when it’s dim. The lighting looks pretty balanced across the picture, which is a nice touch for low-light selfies.

A group of people is rehearsing on a dimly lit stage in an auditorium with rows of empty seats.
Two people are posing for a selfie indoors, each holding up a peace sign.
A person is holding a blue tea bag label with the word Peppermint on it.
A glass cup of tea and a couple of crackers on a napkin sit on a wooden table.

Here we go!

Pixel 9a is here and I’m putting my main SIM in it. Packing up my P9Pro and putting it on the shelf. I’m going to use the P9a for a few months to see how I like using a low-mid range Android device.

Comment anything you want me to focus on or “test”.

Pixel's Paradox: AI Excellence, Innovation Apathy

I’ll admit, even I raised an eyebrow at the results of this Pixel poll. I mean, 4,700 people? That’s a decent sample size, I suppose. Still, the reasons they cited… well, let’s just say they weren’t exactly groundbreaking. Especially when you consider the iPhone juggernaut – you know, the one that’s seemingly glued to the hands of 50-60% of the US population as of Q4 2024, according to Counterpoint Research (and growing, I’m sure). But, I digress. Android Authority asked a simple question: ‘Why do you use a Pixel phone?’ And the answers… were revealing, to say the least.

Back in my day – and yes, I’m going to start this that way – Google’s Pixel line was the underdog. The scrappy kid on the block, fighting against the Samsungs and Apples of the world. They weren’t just selling a phone; they were selling the Android experience. Pure, unadulterated, straight from the source. Remember the Nexus days? (Those were the real Pixels, if you ask me.) Google promised us innovation, a camera that could rival the best, and software that was always one step ahead. And for a while, they delivered. But somewhere along the line – and I’m looking at you, Pixel 6 – things started to… stagnate. The competition caught up, and Google’s ‘exclusive’ features started feeling less ‘exclusive’ and more ‘finally catching up.’

So, what’s the big takeaway from this survey? Apparently, people buy Pixels for a laundry list of reasons: camera, exclusive features, safety, stock Android – the ‘all of the above’ crowd. And look, I get it. There’s a certain appeal to a phone that tries to do everything well. But here’s the thing: that’s not a strength yet, but it has the potential to be. Right now, it feels a bit like a jack-of-all-trades situation. The camera? Solid, reliable – but can it truly claim to be the best anymore? ‘Exclusive’ features? Some are genuinely useful, others… less so. Safety? Absolutely a plus, no argument there. And stock Android? A clean, bloat-free experience is always appreciated. But the question remains: where’s the spark? Where’s the innovation that made the early Pixels so exciting? They’re good, even very good – but they could be so much more.

Let’s be real, the AI features are where Google’s Pixel is still king – or at least, a solid prince. Magic Eraser? Magic Editor? Game changers. I’ve seen Apple’s attempt at a cleanup tool, and frankly, it’s like comparing a Picasso to a finger painting – impressive for a toddler, less so for a trillion-dollar company. And don’t even get me started on call screening and spam detection. Life before these features was like navigating a minefield of robocalls and sketchy texts. Now? It’s like having a personal assistant who actually knows what they’re doing. Then there’s the little things – the ‘Now Playing’ feature that identifies songs without you lifting a finger, the real-time translation that actually works in noisy environments, the way the camera just gets the right shot, even in tricky lighting. These aren’t just gimmicks; they’re genuinely useful AI-powered tools that make your life easier. And while the competition is scrambling to catch up, Google’s Pixel is still setting the pace. (Though, let’s be honest, they could still push the envelope a bit more. Imagine what they could do if they truly leaned into ‘Pixel Sense’ and integrated all those sensors and AI into a seamless, contextual experience…)

So, where does this leave us? Pixel, in its current state, is a paradox. It’s a phone that excels at the mundane – the everyday tasks, the little annoyances that AI can smooth over. But it’s also a phone that’s lost its edge, its sense of daring. It’s become… safe. Too safe. And in a market that’s constantly evolving, constantly pushing boundaries, ‘safe’ is a death sentence. Google needs to remember what made Pixel special in the first place – the innovation, the willingness to take risks. They need to stop playing it safe and start pushing the envelope again. Because if they don’t, they’ll find themselves not just trailing behind the competition, but forgotten altogether. And that – that would be a tragedy.

I’ve been on the hunt for a new computer, and I’ve compared an Intel Core Ultra 7 HP x360, an M1 Pro MacBook Pro, and the HP Elite Dragonfly Chromebook. I’m writing my full review, but here’s a quick snippet of my setup.

Pixel 9a, anyone?

Pixel 9a gets an official launch date.

“The Pixel 9a will be available on-shelf on the following dates:

April 10th: US, Canada and UK

April 14th: Germany, Spain, Italy, Ireland, France, Norway, Denmark, Sweden, Netherlands, Belgium, Austria, Portugal, Switzerland, Poland, Czechia, Romania, Hungary, Slovenia, Slovakia, Lithuania, Estonia, Latvia, Finland

April 16th: Australia, India, Singapore, Taiwan, Malaysia

Coming soon to Japan”

It sounds like Google has fixed the internal issues they were having initially. Those dual embargoes came at the right time. Now we wait for the reviews.

Dear Google: Let’s Talk About the Pixel (Comments from Reddit)

A 13-year iPhone customer finally made the switch from iPhone to the Google Pixel 9 Pro, and he shared some great feedback about his experience using Pixel. Some of it is just the difference between being in a closed ecosystem controlled by one company and moving to a more open ecosystem semi-controlled by one company, but with the philosophy of an open ecosystem. If that makes sense. I replied to the person’s post on Reddit, and I’ve shared my comments below. The original redditor’s comments are in green. Mine are in between. Enjoy.

I love all of these suggestions and would welcome all of these, but I’d like to share my thoughts on these. I’ll skip the first one since that was positive feedback.

2. Improve the Tensor Chip

Make the Tensor chip as fast as the top-tier Snapdragon or the latest iPhone’s chip. I’ve noticed some lag, especially when gaming. Sure, 99% of people don’t game heavily on their phones, but when you have an iPhone, you know it can handle it if you want. That’s a feeling of security.

I think this is coming with time. Both Qualcomm and Apple have been making their own chips for more than a decade now. Google partnering with Samsung on the Tensor “Whitechapel” and newer chips was to jumpstart Google’s experience in making its own chips, while still being able to customize and tweak the AI experience on its own devices. Similar to how Apple learned from Intel before switching to its own silicon. Overall, I think this is coming, but with a caveat. Google will more than likely rely on the combination and collaboration of cloud computing, so raw local power won’t always be the focus. Especially since it requires a whole lot more R&D, will eat into margins, and they’ll more than likely raise the price of devices even more than the usual economic increases.

3. Enhance Video Recording

You guys take incredible photos, so keep it up. But please work on getting the video quality closer to iPhone levels—especially when switching lenses mid-recording.

I think you’re right on with this, and I think Google is surely advancing here. They’re even adding their own custom ISP and replacing “BigWave” (Google’s in-house video codec) with WAVE677DV on the new Tensor G5 chip coming this fall, suggesting that the latter may offer advantages in areas such as performance, power efficiency, or a combination of factors, along with the added benefit of multi-format support.

4. Optimize Your Own Apps

Some Google apps are smoother on iPhone than on Pixel. Why is Google Maps smoother on iPhone, for example? Please optimize your apps to perform best on your own phone.

I’ve actually found Google Maps works better on Pixel than it has on iPhone. At least for me in the past. This is obviously anecdotal.

5. Get Third-Party Apps Onboard

Make sure popular social media and messaging apps are optimized for Pixel. This is crucial for everyday use.

I want this too, and Google has worked with IG, Snap, and the YouTube team for a few years and has added some exclusives that aren’t available on iOS, like Night Sight in Instagram, live video switching between Pixels, and Live Transcribe. Still, there is a lot more work that needs to be done, and the CameraX API is the key. Fragmentation is the issue here, and Google having to balance prioritizing Pixel or Android comes into play.

6. Merge Google Meet & Messages

Consider blending Google Meet and Google Messages into one seamless experience (similar to FaceTime, iMessages). It would simplify communication across Android ecosystem.

From my understanding, it’s the same. Messages and FaceTime are two completely different apps. You can uninstall FaceTime and still have Messages, and I assume that if you uninstall Messages on iOS, you can still use FaceTime. I might not fully understand what you mean here, but I think they are pretty similar. I think WhatsApp, Telegram, and Signal are the only ones that truly have it blended. But AFAIK, Google Meet and Google Messages are to Android what Apple FaceTime and Apple Messages are to iOS today in terms of a seamless experience.

7. Add Built-In Magnets

Implement magnets inside the phone (like MagSafe on iPhone). There are tons of accessories that rely on this feature.

I’m not opposed to this, but is it absolutely needed? Most people have a case on their phone. Does the Pixel need magnets if more cases have magnets? I think that’s up for debate. I know a lot of people like to go caseless, but anecdotally, for every caseless courageous person I see in public, I see 10 cas(r)eful cautious people.

8. Strengthen the Ecosystem

One major reason people stick with iPhone is because of Apple’s ecosystem. Web apps are fine, but if possible, create native desktop apps (Messages, video calling, Notes, Reminders, Calendar, Photos, etc.) for both Mac and Windows. That would help lure more people in.

Would native desktop apps do that? Why? To what metric? What benefit do customers get from having a native Messages app as opposed to a Web one? Web apps don’t take up space on the local machine and web apps can be wherever you are. If the features are consistent between the two, I don’t see the want and need for that.

As for the ecosystem, I agree, and that definitely takes a lot more time because Google can’t and probably won’t force any OEM, Developer, etc… to do things that solely benefit Google. Apple can and does do that at the expense of third-party accessory makers and developers.

9. Bolster Core Android Features

Keep improving Android at its core. Make it even more robust, smooth, and user-friendly.

No notes. This is good. Progression is good.

10. Elevate the Watch Experience

The Pixel Watch needs more attention. Aim to match (or beat) the Apple Watch’s functionality, integration, and polish.

I hear this a lot, but I don’t think people are ready to spend $700 to $800 on a Pixel Watch. It’s still new, and getting as many Pixel Watch experiences out the door as possible is more important, balanced against providing premium fitness experiences. I think for Google only having made 3 watches, they’ve done great. Remember, Samsung and Apple have had watch accessory businesses for at least 10 years. That comes with time, but Pixel is advancing at a fast pace. With all that said, I do think the Pixel Watch needs smoother animations and better loading performance. The UX journey in and out of apps needs to be a lot more polished, with animations coming from a specific point on the screen instead of just appearing.

Lastly, a Big One:

Google is known for killing projects quickly, which might be great for engineers, but from a customer’s standpoint, it shows a lack of long-term commitment. When you launch something, it’s hard for us to fully adopt it because we suspect it’ll be discontinued. That attitude does more harm to your reputation than anything else. Please rethink it.

I LOVE this one and totally agree. I think because of the massive backlash over killing extra-weight products and messaging apps, Google is at a lean place and seems to be holding on to things as they are today. Google Assistant going away isn’t a killing; it’s an evolution. However, there are times when it makes sense to kill something that just isn’t working or hasn’t been working for years.

This is great feedback from a long-time iPhone customer. Some things just don’t apply to Android and even Google. I will say, though, that if you expect Google to completely control and force their hand on things like Apple does, you will be disappointed.

Pixie never was, but is Screenshots?

‘Pixie’ was a canceled Pixel assistant that became Pixel Screenshots

This report comes from a combination of The Information and 9to5Google.

“…after a leadership review, employees working on the product were told Pichai had ordered a change of direction to ensure the new product didn’t compete with the Gemini assistant, which operates across all of Google’s Android devices, according to a person with direct knowledge of the instructions.”

From a consistency standpoint, this makes sense. Google’s history is stained with a lot of half-baked duplicate messaging platforms, and this would’ve been another one of those. I’m glad they went with the larger-scale Gemini plan instead. I do hope that Google provides a way to make Gemini more contextual with what’s on-device.

Made by Google Podcast Highlights First Pixel Feature Drop of 2025

In a recent episode of the Made by Google podcast, Aisha Sherif, a Pixel software product manager, detailed the first Pixel feature drop of 2025. The update focuses on enhancing security, safety, and AI-driven productivity.

Here are the key features discussed:

  • Scam Detection: This feature, exclusive to Pixel 9 and 9 Pro phones in the US, uses AI to identify potential scams in phone calls and texts, alerting users with haptic and visual cues. The AI operates on-device to protect user privacy.
  • Satellite SOS: Expanding its reach, Satellite SOS now includes Hawaii, Alaska, Europe, and Canada, ensuring access to emergency services in areas without traditional connectivity.
  • Screenshot Organization: The screenshot app now automatically groups screenshots into collections and is available in German and Japanese.
  • Recorder App: Pixel users switching devices can now have old recordings automatically transcribed. The recorder app also supports transcription and summarization in Japanese.
  • GoPro Connection: Pixel phones can now connect to GoPros for simultaneous recording from multiple perspectives.
  • Pixel Fold and 9 Pro Fold Enhancements: These devices gain dual-screen preview capabilities for video and media.
  • Additional Updates: The update includes improvements to Gboard voice typing, simplified mode navigation, and the ability for Gemini to analyze attached screenshots, pictures, and YouTube videos.
  • Pixel Watch Updates: Pixel Watch 2 users can expect enhanced step tracking and an automatic bedtime mode.
  • Pixel Studio: The update introduces the ability to generate human images.
  • Find My Device: Users can now see friends and family that they share a location with via Google Maps straight from the Find My Device app.

I’ve always wanted to put together a buying guide for the Google Pixel. After 9 generations of Pixel phones, it’s pretty easy to know which Pixel is designed for which customer.

Check out the Ultimate Pixel 9 series Buying Guide.

Auto-generated description: A lineup of five Google Pixel phones is displayed, each with a light-colored back and the Google logo.