Happenstance with Tap to Cast on Pixel

The other day, while I was listening to the “Android Developers Backstage” podcast, my Pixel vibrated and asked me to “Bring Pixel closer to Tablet”. Of course, I followed my pocket Android’s instructions and voilà! “Tap to Cast” automagically cast the podcast to my Pixel Tablet. You can see below:

View on Threads

You might’ve seen this already posted on Threads and BlueSky (which took forever to upload, btw), but I thought it was so cool! I heard about the feature when it was announced, but I never knew when it would reach my devices. It was a nice surprise, especially since it was happenstance instead of me spamming the update button to get the latest features like I used to. I need to take on that practice more: see a feature announced and go about my day, rather than angrily spamming the “update” button and shaking my fist at the clouds like a madman, trying to force the feature onto my devices.

I digress. Abner at 9to5Google explained it best back in January 2024:

When listening with YouTube Music or Spotify, bringing your Pixel 6 Pro, 7 Pro, 8 Pro, or Fold – which all have UWB (Ultra-Wideband) – near a docked Pixel Tablet will seamlessly move what’s playing between devices. This is bidirectional with the ability to transfer a song from the Pixel Tablet to your Pixel phone also available.

Officially, Google is using a “range of technologies to determine the proximity and presence of devices as part of multi-device experiences, which includes UWB.” Its Cross device SDK was announced in 2022 and consists of three layers:

  • “The first layer recognizes what devices are physically close by, with wireless technologies like Bluetooth Low Energy, Wi-Fi, and ultra-wideband (UWB).”
  • “The second layer is nearby device discovery and context-aware capabilities that identify which device you may want to use based on your current activity.”
  • “Finally, the third layer understands and adjusts actions based on how you interact with your devices with cross-device intelligence.”

Google went so far as to explain that “Tap to Cast” is available without any additional work if developers integrate Output Switcher (2.0) on Android 14. Right now, it’s only available with YouTube Music and Spotify on UWB-enabled Pixel devices like my Pixel 9 Pro and Pixel Tablet. It’s nice to know that this isn’t an exclusive Pixel feature, so more hardware and apps could be supported in the future!

Apple is REALLY trying to "save" themselves in the DOJ v. Google case

Apple, a key witness in the US government’s case against Google, is seeking a more active role in the remedies phase of the trial where solutions to Google’s search monopoly will be decided. They are concerned that the government’s proposed remedies, which include ending lucrative deals between Apple and Google, will negatively impact them. Despite the judge denying their request to participate directly, Apple is appealing the decision and asking for a stay in the proceedings to ensure their interests are adequately represented.

Although Apple understands that this Court intends to adhere to the “strict confines of the [C]ourt’s Scheduling Order,” Order Denying Limited Intervention at 16, Jan. 27, 2025, ECF No. 1153, Apple has no choice but to move for a stay of these proceedings to protect its rights pending appeal. Absent a stay, Apple will suffer irreparable harm: the deprivation of its right to participate as a party in the remedial phase of this case moving forward, including possibly at the trial itself, while its undisputed property rights are adjudicated. These harms are magnified by a position Plaintiffs revealed in a recent meet-and-confer with Apple. Specifically, Plaintiffs represented that they intend to challenge this Court’s decision permitting Apple to submit up to two affidavits during the trial. Courts have commonly granted stays pending appeal of orders denying intervention, and this Court should do the same here. In the alternative, the Court should at minimum afford Apple full access to the record as a nonparty until the D.C. Circuit rules.

Apple continues, explaining how much the desired outcome will hurt them, citing “irreparable harm”:

Second, Apple will suffer clear and substantial irreparable harm if it is unable to participate in the remedies phase moving forward. Apple will be unable to participate in discovery and develop evidence in the targeted fashion it has proposed as this litigation progresses toward a final judgment. If Apple’s appeal is not resolved until during or after the remedies trial, Apple may well be forced to stand mute at trial, as a mere spectator, while the government pursues an extreme remedy that targets Apple by name and would prohibit any commercial arrangement between Apple and Google for a decade. This would leave Apple without the ability to defend its right to reach other arrangements with Google that could benefit millions of users and Apple’s entitlement to compensation for distributing Google search to its users.

I totally understand Apple’s argument here, but allow me to break it down with a fun analogy I came up with to explain it to my wife.

It’s like a playground where kids (users) love to use the swings (search engines). Google owns most of the swings and decides who gets to use them. Apple owns the playground and makes a deal with Google: Google pays Apple a lot of money to make sure their swings are the only ones easily available to the kids. Now, the playground teachers (the government) say Google isn’t playing fair and has too much control over the swings. They want to change the rules to give other swing makers a chance. Apple is worried because if the kids can use any swings they want, Google might not pay them anymore.

So, Apple wants to talk to the teachers and explain why they think their deal with Google is good for the playground. They also want to make sure the new rules don’t make the playground worse for everyone.

Analogies are hard, but she understood exactly what’s happening here, and you want to know what her answer was? “Why doesn’t Apple just make their own swings? They are a trillion-dollar company, aren’t they?” 😏

Android Studio's Origin Story: Insights from the Android Developers Backstage Podcast

As an #Android fan, I’m always eager to learn more about the ecosystem, and the “Android Developers Backstage” podcast is a fantastic resource.

They’ve been doing it for a while now, and in this episode, they take us back to the “before Android Studio” days. It’s fascinating to hear from members of the actual Android development team: Chet, Romain, and Tor brought on special guests Xav and Jamal to talk about their experiences and the journey of Android Studio, reaching all the way back to when Eclipse was the IDE of choice and the excitement of the transition to Android Studio.

It’s pretty cool to get this kind of insider knowledge from the people who actually build Android Studio. If you’re an Android enthusiast like me, you’ll definitely want to check out this episode of “Android Developers Backstage”!

Google Invites Pixel, Android Staff to Leave Voluntarily

Google offering ‘voluntary exit’ for employees working on Pixel, Android

This sounds like Google letting the extra bulk roll off voluntarily. To me, it’s the equivalent of when Steve Jobs asked Jony Ive his “why” for being on the Mac team. Google provided a full statement to 9to5Google below:

The Platforms & Devices team is offering a voluntary exit program that provides US-based Googlers working on this team the ability to voluntarily leave the company with a severance package. This comes after we brought two large organizations together last year. There’s tremendous momentum on this team and with so much important work ahead, we want everyone to be deeply committed to our mission and focused on building great products, with speed and efficiency.

I’m assuming this was a conversation that happened when the Platforms & Devices team restructured to a more functional organization model a year ago, something Apple and other hardware makers do to bring more focus to their products. Now that Google is truly in its Gemini era, Pixel, Nest, and Fitbit truly need to be the vehicles for Google’s vision, especially with Android XR getting a lot of positive reception from the public. I’m hoping Google lives up to the hype in 2025, as I know quite a few individuals on that team who have done some amazing work already.

I'm sorry, but these Apple Intelligence, "Notification Summaries" mistakes will never not be funny 🤣

Rise of the Machines

OpenAI launches ChatGPT Gov for U.S. government agencies

OpenAI on Tuesday announced its biggest product launch since its enterprise rollout. It’s called ChatGPT Gov and was built specifically for U.S. government use.

The Microsoft-backed company bills the new platform as a step beyond ChatGPT Enterprise as far as security. It allows government agencies, as customers, to feed “non-public, sensitive information” into OpenAI’s models while operating within their own secure hosting environments, OpenAI CPO Kevin Weil told reporters during a briefing Monday.

I swear I’ve seen this before.

Where are the Android apps for AT Protocol and ActivityPub?

This post is inspired by Parker Ortolani’s thought-provoking question, “Where are the BlueSky Clients?” Like Ortolani, I’m puzzled by developers’ apparent focus on Mastodon clients over BlueSky clients.

“…what I can’t wrap my head around is the business decision of developers to lean into Mastodon clients versus Bluesky ones.”

Ortolani’s observation about developers leaning towards ActivityPub despite BlueSky’s larger user base (30 million vs. Mastodon’s 9 million) raises a valid point. While the extensive tooling available for ActivityPub development might be a factor, I suspect there’s more to the story. Perhaps I’m wrong, but I believe a stronger connection between the Android, ActivityPub, and AT Protocol communities is crucial, given their shared focus on decentralized infrastructure.

My own perspective on the Fediverse and social media ownership has shifted recently. I’m striving to recapture the joy of social platforms as spaces for community and collaboration, rather than succumbing to the pressures of capitalism and centralized control. This very shift prompted me to migrate my blog from Substack to Micro.blog. I appreciate that Micro.blog offers an Android app, and while it’s currently basic, I hope to see more active development on the Android front. A friend of mine has a blog that breaks down the different ActivityPub blogging platforms, which I think is incredibly useful, but it’s only available as a web app.

This brings me to a key point. I’ve even considered exploring Android app development myself, alongside my full-stack web development work. I understand the time and effort required to build clients, and I’m genuinely grateful for platforms like Micro.blog that offer apps in the Play Store (or even sideloadable on Android). However, I believe a stronger synergy with Android, the world’s most popular mobile operating system, is essential for the growth of decentralized social media.

For a decentralized network, prioritizing web development followed by Android development seems logical. Focusing primarily on the web and then closed ecosystems like iPadOS and iOS feels counterintuitive. Perhaps I’m missing a critical piece of the puzzle, but it strikes me as a backwards approach.

Why aren’t we seeing more development for the most accessible and widespread mobile platform?

I'm moving from Substack to Micro.blog

Good news is, it's as easy as following blog.michaelbtech.com. That's it, there's only good news. See ya there. 😁

The Battle Isn't About Operating Systems, It's About Ecosystems

The smartphone "dogfight" between Apple and Google, as masterfully chronicled by Fred Vogelstein in his book of the same name [1], was often framed as a war over operating systems. Android versus iOS. But as we look towards the next computing frontier – mixed reality (MR) – it's becoming increasingly clear that the real battle isn't just about the OS;…

Read more

Android XR: The Dawn of a New Era in Headset Innovation?

After watching Marques Brownlee's exclusive look at the Samsung prototype running Android XR, I'm buzzing with a level of excitement I haven't felt for a new tech platform in years. While Apple's visionOS and Meta's HorizonOS are intriguing, Android XR feels different. It carries the same vibrant energy as the early days of Android, promising a wave of innovation and diverse hardware experiences that could truly revolutionize the XR space.

Think back to the launch of Android. Every time Google's mobile OS enters a new form factor – smartphones, tablets, smartwatches – it ignites a spark. Suddenly, a flurry of manufacturers, both established giants and nimble startups, emerge with unique hardware interpretations. We see a burst of creativity, pushing the boundaries of design and functionality. This is precisely what Android XR promises for headsets.

The open nature of Android fosters a healthy ecosystem. It's not just one company dictating the entire experience, limiting choices and stifling innovation. Instead, Android XR empowers hardware makers to focus on what they do best: crafting amazing devices. Companies like Sony, XReal, and countless others we haven't even heard of yet will have access to a robust software foundation, allowing them to differentiate themselves through innovative hardware designs and specialized features. This echoes the dynamism we've seen in the Android smartphone market, and it's precisely what the XR space needs.

But the excitement doesn't stop at the open ecosystem. What truly blew me away was the sheer polish and sophistication of the Android XR software, even in its prototype stage. Remember those early "Nexus" devices, the developer previews that often left a lot to be desired? Think back to the barebones experience of the first Nexus phones or the initial Android Wear watches like the LG G Watch. Android XR is a completely different beast. It's clear that Google and Samsung have poured significant effort into crafting a premium software experience from the get-go.

The integration of AI, exemplified by the Gemini assistant, is truly mind-blowing. Seeing how effortlessly Gemini can be used for navigation, app interaction, and even real-time environmental information feels like stepping into the realm of science fiction – in the best possible way. The "circle to search" feature, similar to Google Lens, adds another layer of intuitive interaction. It feels like we're witnessing the birth of a truly intelligent and immersive computing platform.

This whole experience resonates deeply with me. As I've been reading Chet Haase's "Androids: The Team that Built the Android Operating System," the collaboration between Samsung and Google on Android XR feels like a modern-day parallel to the "mixing bowl" of engineering talent that birthed the original Android. Just like the engineers from Danger, Be/PalmSource, and WebTV/Microsoft came together to create something magical, the partnership between Samsung and Google has the potential to shape the future of XR.

I'm keeping my expectations in check for now, but the potential of Android XR is undeniable. Rather than one company selling its ecosystem and controlling every aspect of the experience, Android XR represents a fundamental shift: a platform that fosters open innovation, empowers hardware makers, and promises a diverse range of devices and experiences. It's the dawn of a new era in headset technology, and I, for one, am incredibly excited to see what the future holds.